Content moderators sue Facebook, Sama over work trauma
NAIROBI, KENYA — Nearly 200 former content moderators from Kenya are suing tech giant Facebook and their local employer Sama over challenging working conditions.
The suit, the first of its kind outside the United States (U.S.), highlights the difficult reality faced by content moderators worldwide.
The moderators were employed in Nairobi at Facebook’s content moderation hub, tasked with vetting and removing content from African users that violated Facebook’s guidelines. This often meant exposure to disturbing and violent videos, including child abuse and murder.
Despite the distressing nature of their work, the workers claim they received inadequate mental health support, were underpaid, and worked under an enforced culture of secrecy.
One plaintiff, Nathan Nkunzimana, described moderators as soldiers taking bullets for users, absorbing traumatic content so that others would not have to see it.
“If you feel comfortable browsing and going through the Facebook page, it is because there’s someone like me who has been there on that screen, checking, ‘Is this okay to be here?’” Nkunzimana stated.
He added that the work takes an emotional toll, particularly on those already carrying personal trauma from violent incidents in their home countries.
The group is now seeking a €1.46 billion (US$1.59 billion) compensation fund for former workers who suffered trauma while working for the company.
However, both Facebook and Sama defended their practices. Facebook’s parent company Meta said its contractors are contractually obliged to pay their employees above the industry standard in their markets and provide on-site support by trained practitioners.
Sama, for its part, said its Kenyan workers received salaries four times the local minimum wage, and that all employees had unlimited access to one-on-one counseling “without fear of repercussions.”
The lawsuit marks a significant step in addressing the mental health impact of content moderation work. Just last March, a Kenyan court blocked a partnership between Meta and Majorel following an illegal termination and blacklisting case, and also barred Sama from effecting any form of redundancy.
Sarah Roberts, an expert in content moderation at the University of California, said the Kenyan moderators’ move to organize and push back against their working conditions could drive changes across the industry.
She added that settling the case — Facebook’s usual approach to disputes like this — could become more complicated “if cases are brought in other places.”
In 2020, Facebook paid US$52 million in a settlement with content moderators in the U.S. who had developed post-traumatic stress disorder (PTSD).