16 August 2023
East African firm says Facebook moderation work harmful, regrets contract
A firm contracted to moderate Facebook posts in East Africa has said, with hindsight, it should not have taken on the job.
Former Kenya-based employees of Sama – an outsourcing company for Facebook – have said they were traumatised by exposure to graphic posts.
Some are now taking legal cases against Meta (Facebook) through the Kenyan courts.
Chief executive Wendy Gonzalez said Sama would no longer take work involving moderating harmful content for Facebook.
Some former employees have described being traumatised after viewing videos of beheadings, suicide and other graphic material at the moderation hub, which the firm ran from 2019.
Former moderator Daniel Motaung previously told the BBC the first graphic video he saw was “a live video of someone being beheaded”.
Mr Motaung is suing Sama and Facebook’s owner Meta. Meta says it requires all companies it works with to provide round-the-clock support. Sama says certified wellness counsellors were always on hand.
Ms Gonzalez told the BBC that the work – which never represented more than 4% of the firm’s business – was a contract she would not take again. Sama announced in January that it would end the contract.
“You ask the question: ‘Do I regret it?’ Well, I would probably put it this way. If I knew what I know now, which included all of the opportunity, energy it would take away from the core business, I would have not entered the agreement.”
She said there were “lessons learned”, and the firm now had a policy not to take on work that included moderating harmful content. The company would also not do artificial intelligence (AI) work “that supports weapons of mass destruction or police surveillance”.