In Nairobi, Kenya, where unemployment among youth reaches a staggering 67%, AI jobs initially seemed like a golden opportunity. Workers like Naftali Wambalo, a college graduate with a mathematics degree, believed they had secured a foothold in the tech-driven future. Hired to label and sort data for global tech giants like OpenAI and Meta, Wambalo and his colleagues were tasked with training AI systems to recognise everything from traffic patterns to medical anomalies.
But a grim reality soon set in: while tech companies paid outsourcing firms up to USD 12.00 per hour for this labour, workers like Wambalo received just USD 2.00 per hour. CBS News' 60 Minutes recently uncovered the gap, citing documents detailing industry leader OpenAI's agreement with Sama, an outsourcing firm that says it pays a fair wage for the region, a claim workers like Wambalo strongly dispute.
“If the big tech companies are going to keep doing this business, they have to do it the right way,” Wambalo told 60 Minutes. “It’s not because you realise Kenya’s a third-world country, you say, ‘This job I would normally pay USD 30.99 in U.S., but because you are Kenya, USD 2.00 is enough for you.'”
This glaring disparity highlights a broader pattern of exploitation in AI’s global supply chain, where workers in developing countries endure low wages and precarious contracts to fuel Silicon Valley’s ambitions.
Kenya: A Hub for “Humans in the Loop”
Kenya has actively marketed itself as a tech-friendly “Silicon Savannah,” offering financial incentives and less stringent labour laws to attract giants like Google, Microsoft, and OpenAI. Every year, a million young Kenyans enter the job market, desperate for work. For many, roles in the emerging AI sector seemed like a lifeline.
In practice, these jobs — dubbed “humans in the loop” — involved labelling images, videos, and text to train AI models. Workers spent hours reviewing harmful and often graphic content, including images of violence and abuse. The outsourcing firms employed by U.S. companies pitched these jobs as a pathway to a brighter future, but the conditions on the ground tell a different story.
“The workforce is so large and desperate that they could pay whatever and have whatever working conditions, and someone will pick up that job,” said Kenyan civil rights activist Nerima Wako-Ojiwa.
Workers claim Sama pressured them to finish tasks faster than their contracts stipulated, often completing six-month contracts in just three months and going unpaid for the remaining period. While Sama denies the allegations, workers said the only reward for speeding up was a token gesture: "They'd thank us with a soda and two pieces of KFC chicken," Wambalo said.
Another firm, Remotasks, run by U.S.-based Scale AI, faced similar accusations. Workers, who were paid per task, said they were sometimes denied wages, with accounts abruptly closed over alleged policy violations just before payday. "There's no recourse or way to complain," said Ephantus Kanyugi, one of the affected workers.
In March, after public outcry, Remotasks shut down its operations in Kenya, locking workers out of their accounts. The company insisted that all completed work adhering to its guidelines had been paid.
Sweatshops of the Digital Age
The outsourcing model used by companies like Sama, which contracted with OpenAI, is central to the problem. Sama, based in the San Francisco Bay Area, employed over 3,000 workers in Kenya. But the company, which branded itself as an "ethical AI" provider, allegedly pocketed the lion's share of the payments, leaving workers with minimal pay for emotionally taxing work.
These “AI sweatshops,” as Wako-Ojiwa describes them, are characterised by temporary contracts, with some workers hired for only days or weeks at a time. Workers like Wambalo often lack job security, benefits, or adequate mental health support, despite the traumatic nature of their tasks.
Sama, which also worked with Meta until early 2023, faced lawsuits from its moderators for poor working conditions and inadequate wages. The claims extend to allegations that Sama blacklisted former employees when the company’s contract with Meta ended, preventing them from finding similar work with its replacement contractor, Majorel.
One worker, Kauna Malgwi, recounted the toll of reviewing thousands of graphic posts daily. “You’d sift through murders, rapes, and suicides. It sticks with you,” she said.
A Broader Problem in Tech Outsourcing
Kenya is not alone. Similar outsourcing hubs exist in India, the Philippines, and Venezuela, where low wages and high unemployment allow tech giants to cut costs. These countries provide educated workforces willing to do the painstaking labour AI systems need to function.
But the ethical implications are glaring. While tech companies boast about AI’s potential to revolutionise industries, the human labour powering these systems often goes unacknowledged.
Cori Crider, co-founder of Foxglove, a legal nonprofit advocating for better conditions for tech workers, noted: “After years of bullying and intimidation from big tech firms, moderators are saying, ‘Our work matters.’”
The Fight for Justice
Kenyan workers are beginning to push back. A group of 184 Sama moderators has filed a lawsuit against the company, alleging unfair termination and poor working conditions. The Kenyan Employment and Labour Relations Court ruled that Meta could be held liable alongside Sama, marking a significant step in the battle for accountability.
Meta and Sama are also facing a separate lawsuit over their role in moderating harmful content during Ethiopia’s civil war, where critics argue that insufficient moderation fueled violence.
Despite these challenges, the workers’ fight for justice continues. Union organisers like Daniel Motaung, himself a former moderator, have emerged as key voices in the movement.
The revelations about OpenAI’s and Sama’s practices highlight a systemic issue in tech’s supply chain: while innovation garners billions in profits, the human labour behind it remains undervalued and under-protected.
Kenyan activist Wako-Ojiwa sums up the growing frustration: “It’s terrible to see just how many American companies are doing wrong here. And it’s something they wouldn’t do at home, so why do it here?”
As the AI industry expands, companies face mounting pressure to ensure their practices match the ethical standards they claim to uphold. For workers like Wambalo, however, meaningful change remains a distant hope.
Featured Image Credits: Daniel Irungu/EPA