Former Content Moderator Sues Meta
“Because of the content I regularly filtered, I now have a heightened dread of dying. As a result, my quality of life has significantly deteriorated,” he said during a Tuesday virtual talk. “Going outside does not excite me. Going into public places is something I despise.”

The panel, titled “Facebook Content Moderation, Human Rights: Democracy and Dignity at Risk,” took place on the same day the ex-content moderator’s attorneys filed a lawsuit against Meta and Sama, the outsourcing firm that handles content moderation for the tech giant in Africa. According to the 52-page petition, the corporations are accused of forced labour, human trafficking, treating workers in a “degrading manner,” and union-busting. The lawsuit states that Motaung was sacked from his job in 2019 after attempting to form a trade union.

The action, filed in Nairobi’s employment and labour relations court, is the latest in a string of complaints about the working conditions of Meta’s content moderators. After content moderators in the United States sued Facebook for allegedly failing to provide a safe workplace, the company settled for $52 million in 2020. The social media platform, which employs over 15,000 moderators, has struggled to police harmful content in numerous languages worldwide.
Meta Spokesperson Declines to Comment on the Complaint
Grant Klinzman, a spokesperson for Meta, declined to comment on the complaint. The company has previously stated that it takes its obligations to content reviewers seriously, requires partner companies to offer fair compensation, benefits, and assistance, and audits them regularly.

The claims against Sama are “both incorrect and frustrating,” according to Suzin Wold, a spokesperson for the company. She says the company has helped lift more than 59,000 people out of poverty, pays them a fair salary, and is a “longstanding, respected employer in East Africa.”

According to the lawsuit, Sama recruits disadvantaged and vulnerable adolescents for content moderation work and pressures them into signing employment contracts before they fully comprehend the role. Motaung, who comes from a poor household, was searching for a job after college to support his family. He had no idea that content moderation could impair his mental health. From moderating graphic content, he developed post-traumatic stress disorder, severe depression, anxiety, a relapse of his epilepsy, and intense flashbacks and dreams.