Social media giant Facebook has agreed to pay $52 million to third-party content moderators who developed post-traumatic stress disorder (PTSD) and other mental health issues on the job. The moderators were required to review disturbing images of rape, murder and suicide in order to keep such content off the platform. As per reports, in a preliminary settlement in San Mateo Superior Court, the social networking platform agreed to pay damages to 11,250 U.S.-based moderators and to provide them with additional counseling facilities. Under the settlement, each content moderator will receive a minimum of $1,000 and will be eligible for further compensation if diagnosed with post-traumatic stress disorder or a related condition.
Facebook had earlier hired several firms, including Accenture, Cognizant, Genpact and ProUnlimited, to help it moderate and remove harmful content in the wake of the 2016 U.S. presidential election and the Cambridge Analytica data scandal.
Any moderator diagnosed with a mental health condition will be eligible for an additional $1,500, and those diagnosed with both PTSD and depression will be eligible for an additional $6,000. ‘We are so pleased that Facebook worked with us to create an unprecedented programme to help people performing work that was unimaginable even a few years ago,’ Steve Williams, a lawyer for the plaintiffs, said in a statement.
In 2018, Selena Scola, a former Facebook content moderator, sued the company for allegedly ‘ignoring its duty’ to adequately protect moderators who suffer mental trauma and PTSD after reviewing distressing imagery on the platform. As part of their work, these moderators are regularly “bombarded” with thousands of videos, images and live-streamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder.
In 2019, several content moderators stated that they had been diagnosed with PTSD after working for Facebook. At the time, Facebook had more than 30,000 employees working on safety and security, about half of whom were content moderators.
IT giant Cognizant announced that it would exit the content moderation business with Facebook and shut down the relevant sites earlier this year. The settlement also calls for “the next generation of wellness practices” to be developed for affected content moderators. “In addition to payment for treatment, moderators with a qualifying diagnosis will be eligible to submit evidence of other injuries they suffered for their time at Facebook and could receive up to $50,000 in damages,” a report stated.
Whether you want to stay up-to-date on the latest business news, read in-depth CEO interviews, or find new ideas on leadership, management and innovation, Industry Leaders Magazine is here to suit your needs and help you stay more informed.