Meta, the parent company of Facebook, Instagram and WhatsApp, is facing legal action alongside its main subcontractor for content moderation in Africa – Sama. A demand letter served on the companies over alleged unfair and unsafe working conditions for employees sets out twelve demands on workplace conditions. The law firm handling the case gave Meta and Sama 21 days, starting from the 29th of March, to meet these demands or face a lawsuit.
Nzili and Sumbi Advocates, the law firm representing Daniel Motaung – a former Sama employee fired for organizing a strike over poor working conditions and pay – accuses Sama of violating various rights of its employees, both Kenyan and non-Kenyan, including their health and privacy. In the demand letter, Meta and Sama have been required to comply with Kenya’s labour, privacy and health laws. They were also instructed to recruit qualified and experienced health professionals, and to provide the moderators with sufficient compensation and mental health insurance.
According to lawyers from Nzili and Sumbi Advocates, “Facebook subcontracts most of this work to companies like Sama – a practice that keeps Facebook’s profit margins high but at the cost of thousands of moderators’ health – and the safety of Facebook worldwide. Sama moderators report ongoing violations, including conditions which are unsafe, degrading, and pose a risk of post-traumatic stress disorder (PTSD).”
According to reports and claims by former workers, Sama recruited moderators from across Africa with job descriptions presenting the roles as call centre work; they only learned the true nature of their jobs after relocating to its hub in Nairobi and signing binding employment contracts. Rather than the call centre work they had been promised, their job was to remove content from Facebook users promoting misinformation, hate and violence.
These moderators end up underpaid, yet they are expected not to disclose the nature of their work to outsiders. Sama, which presents itself as an ethical AI company, only recently increased its employees’ pay after several critical reports against it.
“Sama and Meta failed to prepare our client for the kind of job he was to do and its effects. The first video he remembers moderating was of a beheading. Up to that point, no psychological support had been offered to him in advance,” lawyers from Nzili and Sumbi Advocates added.
“I use Facebook, like many Kenyans, and it’s an important place to discuss the news. But that is why this case is so important. The very safety and integrity of our democratic process in Kenya depend on a Facebook that is properly staffed, and where content moderators, the front-line workers against hate and misinformation, have the support they need to protect us all. This isn’t an ordinary labour case – the working conditions for Facebook moderators affect all Kenyans,” said Mercy Mutemi, who is leading the legal action.