African rights group urges Meta to boost harmful content moderation

NAIROBI, KENYA — Following an announcement by Meta’s African outsourcing partner that it will no longer screen harmful posts, an African rights group called on the social media giant to “seize the opportunity to improve its content moderation” in the region.
Kenya-based outsourcing firm Sama said on Jan. 10 it would no longer provide content moderation services for Meta’s social media and messaging platforms Facebook, WhatsApp and Instagram come March, as it moves to concentrate on data labelling work.
Sama also said it is letting go of 3% of its staff, about 200 employees, as part of a streamlining of its operations to boost efficiency.
Sama and Meta both face a lawsuit in Kenya over alleged labor abuses and allegedly preventing workers from unionizing. Another lawsuit alleges that Meta has allowed violent posts on Facebook, inflaming civil conflict in neighboring Ethiopia.