“Without content moderators there would be no Facebook. I assure you that if the content moderators were not there, you would not spend a minute on those platforms, because you cannot imagine the amount of toxic, filthy, unbearable content that gets published.” From Nairobi, the Kenyan capital, Nathan Nkunzimana describes the hidden side of social networks: the mass of posts that get caught in the content moderation filter and that make the visible face of these platforms at least humanly tolerable.
For the past two years, Nkunzimana has been one of the anonymous foot soldiers in that line of defense, moderating content for much of sub-Saharan Africa. Throughout the workday, virtually without interruption, people like him have to view and filter all kinds of horrors, including multiple forms of extreme violence on video. The conditions under which they have carried out this work have led them to overcome every obstacle to form a union, African Content Moderators, one of the first created in this sector anywhere in the world, set up with the support of the Kenyan trade union organization COWU (Communication Workers Union). Meager salaries, unfair dismissals, exploitation and the psychological harm caused by their work justify their fight.
On May 1, more than 150 content moderators and data labelers working for Facebook, ChatGPT and TikTok gathered in a Nairobi hotel to share their experiences and concerns, and to take a final step: agreeing on the constitution of a union. Benson Okwaro, a veteran trade unionist and general secretary of COWU, notes that “many global companies don’t like unions in their offices,” but points out that local laws do recognize workers’ right to organize freely. He also highlights the difficulty that these large companies have no formal headquarters in the countries where they employ workers and try to evade national legislation. “That’s why we need to be united and look for joint solutions now,” says Okwaro.
In February 2019, Meta announced the opening in Nairobi of the “first Facebook content review center in sub-Saharan Africa” as part of its “continued investment” in that part of the continent and its “commitment to security” on the platform, as stated on its own Facebook profile. In the same post, it said it would do so “in collaboration with Samasource.” In reality, it was the American company Sama that formally hired the moderators. Okwaro explains that Sama was a subcontractor, and Nkunzimana and the rest of the moderators allege that their real employer was Meta, which is why they hold the parent company of Facebook, Instagram and WhatsApp responsible for their working conditions.
“Sexual harassment, child abuse, sexual activities… That happens on social networks, and it happens live. I would come home with the feeling that I didn’t feel anything”
Nathan Nkunzimana, content moderator
Daniel Motaung, a South African employee of Sama, had already tried to organize his colleagues during the project’s first summer, with an embryonic union that called itself The Alliance and never came to be: the attempt to mobilize was stifled. Disgruntled workers were called to order, Motaung was immediately suspended, and a few weeks later he was fired.
This former Facebook moderator blew the lid off the affair when, in February 2022, he told his story to Time magazine and revealed the working conditions in the Nairobi offices where the social network’s content for East Africa was reviewed: “The work we do is a kind of mental torture.” After his revelation, various legal proceedings were opened: first, Motaung sued Meta and Sama for labor exploitation and union busting. That unleashed a cascade of lawsuits and media scandals, which brought to light the low wages paid to data labelers hired by the same company to correct the toxicity of ChatGPT.
Following these problems, Sama withdrew from its contract with Meta (it was replaced by Majorel) in January this year and announced the termination to its employees. That was the trigger for a new, more discreet mobilization. The workers challenged the dismissals in court while quietly preparing to establish themselves as a union. If the working conditions were hard, the road in recent months has been no less so, as Nathan Nkunzimana relates.
In April 2021, Nkunzimana started working for Sama as a content moderator. During his workday, he would review between 1,500 and 2,000 posts, deciding whether each should be deleted, sent to another instance for review, or allowed through. This Burundian citizen, who arrived in Kenya 12 years ago to complete his studies, explains what those contents included: “Sexual harassment, child abuse, sexual activities… That happens on social networks, and it happens live. There are terrorist groups that kill people in broad daylight and try to distribute it publicly on those platforms.” He acknowledges that this exposure has caused him social, psychological and personal problems. “There were days when I came home with the feeling that I didn’t feel anything,” he says, having had to carry that weight alone, at home with his wife and three children.
On top of the crudeness of the content, the moderators have had to face working conditions that aggravate the situation. “You couldn’t talk to anyone about it, because there was a confidentiality clause. You couldn’t even share with your partner what you were going through, the nature of the job that was destroying your personal life. It was frustrating,” he laments. Added to this is the pressure of productivity: “If one week you did not reach the required metrics, the next you would receive an email warning you that you were not meeting the objectives. The programs tracked how much time you spent on each piece of content. You couldn’t take your eyes off the screen all day. Two or three seconds passed from the moment you clicked on a post until the machine served you another one to review. It didn’t give you a moment of calm; even a minute to go to the bathroom was a problem with your supervisor.”
To top it all off, when Sama announced the cessation of its activity, the moderators stopped receiving their salaries, despite having denounced the irregularity of the procedure. “90% of content moderators are foreigners, and what we have experienced during the process is very hard: spending three months without receiving a salary, in a country that is not yours; you cannot pay the rent, you cannot buy food…,” explains Nkunzimana. Cori Crider, co-director of Foxglove, a British organization that is accompanying them in this process, adds that this situation “forces them to continue accepting insecure jobs to remain in the country, despite the serious risk to their mental health.” The moderators have resorted to crowdfunding to sustain their resistance fund.
This Burundian moderator says that “the text content was forwarded to other offices, but the system sent the images and videos to our offices in Africa.” As a result of complaints, other content moderation centers have secured psychological support for their employees, but those conditions have not become widespread.
“The situation in this content moderation center is especially bad, with an appalling workload”
Cori Crider, co-director of Foxglove
“The situation in this content moderation center is especially bad because the pay tends to be extremely low, around 2 or 3 dollars per hour. Only 260 moderators worked in the Nairobi hub, responsible for reviewing content for an eastern and southern African region of some 500 million people. The result is an appalling workload,” explains Cori Crider. “We have the legitimacy to convince these big technology companies that they have a responsibility to regularize our working conditions. A content moderator in Africa earns $500 or $600 gross. That covers the rent and the bare minimum you need to live, because life here [in Nairobi] is very expensive. The same goes for psychological support. When we have demanded it, we have received messages meant to frighten us: ‘If you continue like this, you will end up going home,’” warns Nkunzimana. Sama’s management declined to answer the questions posed for this report.
Siasa Place is a Kenyan youth organization that has also supported the mobilization of content moderators. Its director, Nerima Wako-Ojiwa, adds: “Some big tech companies take advantage everywhere, but especially in the Global South. There are a number of gaps when it comes to policies, such as data protection or the working conditions of these employees.”
“An unequal fight” with the tech giants
For his part, Nkunzimana makes it clear: “Our request is that our human, constitutional and labor rights be respected; we ask only for that.” And he demands that the platforms take responsibility for the people who moderate their content. “We are more than moderators; we are the soldiers who sacrifice themselves to make communities safe. But the companies that manage those communities do not take care of the people who protect them,” he says.
Meanwhile, the courts have made encouraging decisions for the employees in their various lawsuits. Meta tried to dodge one complaint on the grounds that it has no legal presence in Kenya, but the court rejected its arguments. Likewise, a judge ordered Meta to suspend its contract with Majorel, Sama’s replacement, until the fate of the employees and the nature of that relationship are decided. Veteran unionist Benson Okwaro says that “Kenyan laws are very worker-friendly.” And for Foxglove’s Cori Crider, “companies like Facebook, Google and TikTok are some of the most powerful in the world, with almost unlimited resources. It takes incredible bravery to face them armed only with collective power.”
Nerima Wako-Ojiwa frames this situation as a model for the future: “There are many questions about unions and labor rights, especially regarding virtual work and national labor law. These are questions that countries will have to begin to answer. As tech companies continue to grow, the way people work and interact with them will require legal responses. Many large companies evade responsibilities or taxes by resorting to third-party companies.” This activist insists on the particularities of the continent: “Definitely, that of the content moderators is an unequal fight, but that is the future of work and it is here to stay. We have to have decent work for people. These are things that the ministries and private companies will have to sit down at the table to negotiate.”
You can follow EL PAÍS Technology on Facebook and Twitter or sign up here to receive our weekly newsletter.