African content moderators have voted to unionize in a landmark move to protect workers’ rights. Over 150 moderators, who review content for major tech platforms such as ChatGPT, TikTok, and Facebook, marked Labor Day in Nairobi by forming the Content Moderators Union.
This is the first time moderators have come together to demand fair treatment. It follows years of controversy surrounding the poor working conditions of tech moderators, especially those working for third-party outsourcing companies in Africa.
The Emotional Toll of Content Moderation
Moderating content can have a detrimental effect on workers’ health. In the US, Facebook has paid millions of dollars to moderators suffering with Post Traumatic Stress Disorder (PTSD).
Moderators are the front line of defense when it comes to internet safety. They spend hours watching graphic videos containing violence, child abuse, rape, murder, and suicide, so we don’t have to.
Despite the emotional and psychological strain, moderators are among the lowest-paid workers in the industry.
Content reviewers are also under tremendous pressure to process massive amounts of material, often against strict targets. Although employers offer psychological support, it isn’t always easy to access, and it is rarely available to workers once their employment ends.
Moderators, particularly those in Africa, get a rough deal, especially compared with the working conditions of regular staff at Facebook’s offices.
The Content Moderators Union
According to Time, the idea for the Content Moderators Union dates back to 2019, when Daniel Motaung was fired from Sama, an outsourcing company, after he tried to form a union.
Major tech platforms such as TikTok, Facebook, and OpenAI outsource moderation work to third-party companies. Many believe that outsourcing distances these platforms from their responsibilities regarding workers’ rights.
According to NTV Kenya, Daniel claims moderators “face hazardous work conditions without hazard pay. Mental health support is severely lacking, job security is scarce, and some moderators feel silenced by strict non-disclosure agreements.”
A damning report by Time highlighted Daniel’s case and the abysmal working conditions of African content moderators. It revealed that, despite handling the most disturbing content, some workers were paid as little as $1.50 per hour.
Sama is the leading outsourcing company in Africa that, until recently, provided moderation for Facebook’s parent company, Meta.
Many employees feel exploited and suffer from severe anxiety and depression. Despite the low pay and lack of support, they remain in their positions because they have no other option.
Since the report was published, several lawsuits have been filed against the company, including one from Daniel Motaung. According to Reuters, he is suing Sama and Meta for unlawful redundancy and violations of the Kenyan constitution.
In April this year, a Kenyan court ruled that Meta can face legal action over an unfair dismissal case in Kenya. TechCrunch reported that Meta had tried to have the case thrown out, arguing that, as a foreign company, it should not fall under Kenyan jurisdiction.
Meta is also being sued for $2bn by human rights activist groups and the families of the victims of ethnic violence in Ethiopia.
According to Amnesty International, Facebook used harmful algorithms to spread misinformation and hateful content, which incited violence during the Tigray War and led to civilian deaths.