Artificial intelligence will never be able to solve the issue of content moderation online

Onkar Bhanarkar
Onkar holds a Master's in Cyber Security from Ireland. He is a cybersecurity professional with extensive knowledge of digital forensic investigations and risk assessment. Onkar Bhanarkar is a specialist RegTech analyst who contributes articles on money laundering enforcement actions in India, GDPR, risk assessment, and cyber attacks.

Facebook CEO Mark Zuckerberg has promised this outcome many times, but the technology looks too futuristic to be true at the moment. Social networks try to keep unwanted content off their platforms by using a combination of human moderation and automated filtering.

Human moderators work in stressful conditions. They have to sift through hundreds of reported posts every single day and make tough decisions about whether each one violates the platform's policies. Training is inadequate and support is minimal. Artificial intelligence might seem like the obvious solution to fake and inciting posts that lead to real-world harm, but in truth it merely shifts responsibility onto automation and solves no problems in the long run.

Artificial intelligence can run as a triage system and help weed out a fraction of the unwanted content. But the system is rudimentary. It might latch on to cues such as nudity or guns and recognize broad categories, but humans are required to create those cues in the first place. The technology has broad uses and is fairly reliable, but it can still lead to problems.
Content that depends heavily on context is often incomprehensible to an AI system. Most content carries an ethnicity, race, personal background, or mood behind it, which can be difficult for humans to interpret and nearly impossible for automation. Artificial intelligence cannot capture human culture and emotion.
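To make the cue-matching idea above concrete, here is a minimal sketch of what such a triage pass might look like. The cue list, thresholds, and function names are hypothetical illustrations, not any platform's actual pipeline; the point is that humans still have to supply the cues and review everything the filter is unsure about.

```python
# Hypothetical cue-based triage filter (illustrative only, not a real moderation pipeline).
# Humans write the cue list, and anything ambiguous is routed back to a human queue.

FLAGGED_CUES = {"gun", "nudity", "attack"}  # assumption: a hand-curated keyword list


def triage(post: str) -> str:
    """Return 'remove', 'review', or 'allow' for a single post."""
    words = set(post.lower().split())
    hits = words & FLAGGED_CUES
    if len(hits) >= 2:   # several cues matched: treat as a strong signal and auto-remove
        return "remove"
    if hits:             # a single cue is ambiguous: send to a human moderator
        return "review"
    return "allow"       # no cues matched: leave the post up


if __name__ == "__main__":
    for post in [
        "buy this gun now, attack planned",
        "my dog chased a squirrel",
        "gun control debate tonight",
    ]:
        print(triage(post), "->", post)
```

Even in this toy version, the last example shows the weakness: a post about a gun control debate trips the same cue as a threat, so a human still has to judge the context.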

AI is maturing rapidly, and future algorithms may be able to sift through content far more reliably. Advances in deep learning have improved the speed and competence of these systems. But as a replacement for human moderators, AI still seems immature.
What can be done is to improve the lot of human moderators.
