Under the threat of an expensive lawsuit, Meta is distancing itself from claims that one of its main subcontractors for content moderation violated a number of employee rights at its Kenyan hub.
Content moderators are essential to the social media ecosystem: they remove illegal or banned content before any of it is seen by the average user. They spend hours a day navigating the dark side of social networks, performing the brutal task of viewing posts perpetrating and perpetuating hate, misinformation and violence. They are bombarded with thousands of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder. Social media platforms subcontract most of their content moderation, a practice that keeps their profit margins high but at the cost of thousands of moderators' health. Over the past few years, stories of content moderators experiencing severe anxiety and post-traumatic stress disorder have repeatedly made the headlines.
The latest involves the social-media giant Meta – formerly known as Facebook – and its main subcontractor for content moderation in Africa, Sama. The law firm ‘Nzili and Sumbi Advocates’ alleged that Sama had violated various employee rights of its Kenyan and international staff, including the rights to health and privacy. ‘Sama moderators report ongoing violations, including conditions which are unsafe, degrading, and pose a risk of post-traumatic stress disorder’, the law firm stated. It is also claimed that the productivity of Sama’s employees was tracked using Meta’s software, which measured employee screen time and movement during work hours. Despite their importance to Facebook, Sama workers are among the lowest-paid workers for the platform anywhere in the world, with some earning as little as $1.50 per hour. The law firm demanded that Meta and Sama adhere to Kenya’s labor, privacy and health laws, including by providing moderators with adequate mental health insurance and better compensation. Additionally, it demanded that the two firms recruit qualified and experienced health professionals to support the content moderators.
Trouble for Sama started in February 2022, when Time described how the company recruited its moderators under the false pretext that they were taking up call centre jobs. The content moderators only learned about the nature of their work after signing employment contracts and relocating to the company’s hub in Kenya’s capital, Nairobi. The article also revealed how Sama, which calls itself an ‘ethical AI’ company, suppressed workers’ efforts to secure better working conditions. The company allegedly fired its former employee Daniel Motaung for leading a strike in 2019 over poor pay and working conditions. According to Motaung’s lawyer, Sama failed to provide adequate psychosocial support and mental health measures, including ‘unplanned breaks as needed particularly after exposure to graphic content’. ‘The first video he remembers moderating was of a beheading. Up to that point, no psychological support had been offered to him in advance’, the law firm said.
The law firm representing Motaung has now threatened to proceed with plans to file a lawsuit. Meta has distanced itself from the claims, saying it was not privy to the arrangement its subcontractor had with Motaung. Sama has also denied any wrongdoing, claiming that Motaung’s contract was terminated because of ‘unacceptable actions taken against fellow employees that jeopardized their safety’. The company also states that it is transparent during its hiring process and has a culture that ‘prioritizes employee health and wellness’. ‘We understand that content moderation is a difficult but essential job to ensure the safety of the internet for everyone, and it's why we invest heavily in training, personal development, and wellness programs’, it said.