
Facebook Content Moderation Inadequate and Ignores Mental Health of Moderators, Says Report


A new report by the New York University Stern Center for Business and Human Rights explores the consequences of social media platforms outsourcing content moderation to third parties, and how that content affects the mental health of the moderators. The report also indicts the platforms for failing to adequately review hate speech and other material that can spark animosity and violence in developing countries.

The study focuses mainly on Facebook, although it also looks at Instagram, Twitter, and YouTube. According to the study, the sheer volume of content generated on these platforms is unmanageable, and Facebook does not employ enough moderators to perform the task adequately.


At present, about 15,000 workers police Facebook’s main platform and its Instagram subsidiary, the overwhelming majority of them employed by third-party vendors. About 10,000 people scrutinize YouTube and other Google products, while Twitter, a much smaller company, has about 1,500 moderators. Facebook has also partnered with 60 journalism organizations to fact-check content, but the number of items sent to these groups far exceeds their capacity to verify claims.

Every day, billions of posts and uploads appear on Facebook, Twitter, and YouTube. On Facebook alone, more than three million items are reported on a daily basis by users and artificial intelligence screening systems as potentially warranting removal.

“Content moderators are the people literally holding this platform together,” a Facebook design engineer reportedly said on an internal company message board during a discussion of moderator grievances in early 2019. “They are the ones keeping the platform safe. They are the people Zuck [founder and CEO Mark Zuckerberg] keeps mentioning publicly when we talk about hiring thousands of people to protect the platform.”

The coronavirus pandemic has made the task doubly difficult. With moderators sent home, privacy and security concerns have pushed the companies to lean more heavily on artificial intelligence as the gatekeeper for objectionable content. But AI is effective mainly against nudity and similarly easy-to-identify material; nuanced hate speech and libelous content still require human reviewers to police.

The report says that moderators, spread across some 20 sites, review the roughly 3 million items flagged each day by users and by the company’s artificial intelligence. With an error rate of about 10%, that works out to roughly 300,000 content moderation mistakes per day.

The sidelining of content moderators during the pandemic has allowed some harmful content inciting religious and ethnic strife to escape moderation and appear on these sites. Another gap the study’s authors point out is that third-party contractors rely on underpaid staff who are barely equipped to perform the job.

The study’s principal author, Paul Barrett, interviewed a number of former content moderators and found that the hiring companies ignored mental health issues among their recruits. The report calls for improved working conditions, including better physical and mental health care for moderators who are subjected to disturbing content throughout the workday.

“While the third-party vendors that oversee this activity on paper provide a fair amount of benefits related to mental health, this offering was consistently described as being not particularly serious in practice, given how potentially traumatic this activity is,” Barrett said.

The content moderation model is built on making the most money at the least cost. Moderators keep harmful content off the site, while Facebook’s business rests on offering large, captive, targeted audiences to advertisers. Those advertisers would quickly disappear if the platforms allowed objectionable and unacceptable content to proliferate.

“One of the revelations for me was realizing just how central the function [of moderation] is to the companies, and therefore how anomalous it is that they hold it at arm’s length,” said Barrett, who is deputy director of the Stern Center. “The second surprise was the connection between the outsourcing issue and the problems they have experienced in what they call ‘at-risk countries.’”

The report proposes eight steps to remedy the situation:

  • Hire in-house content moderators with substantial pay and benefits.
  • Double the number of content moderators to improve the quality of reviews.
  • Appoint a content overseer or other high-level executive to ensure moderation is carried out appropriately and meets the company’s standards.
  • Invest in content moderation for “at-risk countries” in Asia and Africa and employ local moderators who understand the culture, languages, and nuances of the region.
  • Provide on-site medical care.
  • Sponsor academic research into the health risks of content moderation.
  • Encourage governments to come up with more regulations to prevent harmful content from appearing so regularly and widely.
  • Expand the scale of fact-checking to attack disinformation and fake news.

Demands for changes in how social platforms disseminate information have grown. Internally, Facebook also faces pressure from employees for a more stringent stand on consequential events. Mark Zuckerberg has acknowledged that, for the time being, moderation of the most sensitive content has to be done in-house, and that AI cannot be relied upon to do a thorough job of gatekeeping; a human element remains necessary to keep content acceptable to the vast majority of users.

“It is a very ambitious ask,” Barrett said of the proposal to scrap outsourcing. “But my attitude is if the current arrangement is inadequate, why not just go for it and urge [the company] to remedy the problem in a big way. I don’t think Mark Zuckerberg is going to [smack himself on the head] and say, ‘Oh my god, I never thought of that!’ But I do think it’s possible the company is ready to move in that direction.”
