It has been over a year since my last day as a content moderator for Facebook and Instagram in Nairobi, Kenya, but I still cannot sleep without nightmares.
I started every day at 7 a.m., looking at 500 to 1,000 Instagram and Facebook posts that had been reported for violating the platforms' rules. About 80 percent of what I saw was graphic abuse, hate and violence. My job was to watch each video, determine which of Facebook's community standards it violated and try to forget it.
It is incredible what people can do to one another. I thought of my children and my parents, and of how I was shielding them from what I saw.
My family were refugees. I crossed the border from Ethiopia to Kenya when I was three years old to escape conflict. I took this job because Facebook and Instagram needed Oromo speakers. When I sat down at my workstation on the first day, I was confronted with the violence my family had fled.
Several times I watched videos from Oromia, my homeland, in which people begged for mercy as Amharic-speaking men slaughtered them. Similar scenes from all over Ethiopia kept playing across my screen.
In my home country, our peoples were killing one another. But in that big glass building, Oromos, Amharas and Tigrayans sat side by side, erasing the worst of the war from the internet; as the violence escalated, it came to fill most of our workdays. Other employees in the office suffered similar trauma from content coming out of other conflict zones.
Our offices, computers, desks and chairs were all provided by Sama, an outsourcing company based in San Francisco. But all the work we did was for Meta, the owner of Facebook and Instagram. The work damaged our mental and emotional health. We tried to support one another, constantly reminding each other that we were protecting our community from the harm this content would cause. But to protect our community, we had to absorb that harm ourselves.
We learned to support each other and continued to do so even after the day we were all “laid off” last January.
But our jobs had not disappeared. They had simply been transferred to another contractor, Majorel. The work was grim, but we needed jobs, so we applied. We were all rejected.
It became clear that we had been blacklisted, most likely because we had been organizing for better mental health care. So hundreds of us sued Sama and Meta in Kenya, alleging that both companies had violated Kenyan labor law when they fired us. And we won!
The judge ruled that Meta was our "true employer" and ordered the company to pay our wages and provide us with medical and psychological care. At first I felt relieved: the law was on my side. But I have since realized that there is something more powerful than the law.
More than a year has passed since the first court order. More legal victories followed, but Meta has simply ignored them. The Kenyan government has not supported us either. Kenyan President William Ruto and Ethiopian Prime Minister Abiy Ahmed recently attended the ribbon-cutting ceremony for Sama's new Nairobi headquarters. The war rages on, and my colleagues and I are scattered across the continent, waiting to be paid.
On May 23, Mr. Ruto will become the first African leader in nearly 20 years to make an official state visit to the White House. As the two governments discuss the future of trade between Kenya and the United States, I would like them to consider the workers, and what is at stake if American companies are allowed to exploit African workers and violate local labor laws without consequence.
Meta broke the law. I am frustrated and anxious, and I still cannot sleep because of the trauma I endured working for them. Many of my former colleagues and I have formed a union with content moderators who work on platforms like Meta, TikTok and ChatGPT, so that these powerful companies can no longer use us up and spit us out. The law is on our side, but it is the law of Kenya, and Meta seems to believe it stands somewhere above that. Who will tell them otherwise?