Meta is changing who handles content moderation in East Africa

On Tuesday, Facebook confirmed that it would be ending its contract with Sama, the outsourcing firm responsible for moderating graphic content in East Africa.

The news, first reported by Time Magazine, comes on the back of a lawsuit by a former content moderator employed by Sama in Nairobi, Kenya, alleging severe mental health trauma as a result of their work, alongside other labour violations.

According to Foxglove Legal, a legal nonprofit that investigates Big Tech companies, the $2.2 billion European outsourcing firm Majorel will be taking over the contract. But Foxglove said Majorel is no better than Sama in its treatment of moderators — something also shown by Insider’s reporting this past summer, which investigated the abhorrent treatment of Majorel’s content moderators working for TikTok’s parent company ByteDance in Morocco.

It’s a sign of how troubled the content moderation industry remains. Despite several such lawsuits against social media companies like Facebook, YouTube, TikTok, and Reddit from around the world, workers are still often forced to work hours-long shifts moderating some of the most graphic content on the internet, from child abuse material to videos of gruesome accidents and beheadings, often with very few protections in place for their mental wellbeing.

In August, Insider investigated the labour conditions of TikTok’s content moderators working for Majorel in Morocco — the hub of ByteDance’s Middle East and North Africa content moderation operation. Workers told us they often work shifts of more than 12 hours, flagging videos of animal abuse, sexual violence, and other gruesome content. They had fewer breaks than their US counterparts and said the company’s “wellness counsellors” were of little help.

Social media companies claim to use sophisticated algorithms that help clean up people’s feeds, but this covers up the grim reality of how nearly every social media company works. Behind the scenes, a global workforce of tens of thousands filters out repugnant content so it doesn’t end up in front of your eyes.

In recent years, Facebook has settled lawsuits with moderators who reported PTSD as a result of their work for the company and promised to make changes to its labour conditions. But as Vittoria Elliott and I reported in 2020, the relatively meagre concessions rarely make it to workers in India, the Philippines, or Kenya.

Experts like Foxglove Legal have been calling on social media companies like Meta to bring their global content moderation workforce in-house. It may be the only way to make sure handling the worst elements of social media is shouldered by those who are truly accountable to the company and its users. Until then, contractors like those at Sama or Majorel or dozens of other outsourcing firms will pay the price.
