The 404 Media Podcast (Premium Feed)

from 404 Media

What It’s Like to Be a Data Labeler Training AI


Episode Notes


Transcript

I recently traveled to Kenya for a journalism and AI conference. While I was there, I really wanted to meet with Michael Geoffrey Asia, the secretary general of the Data Labelers Association. Data labeling is a huge job in Kenya. Data labelers are the people who train AI, and who also work on ensuring the outputs are accurate. In some cases, data labelers are themselves pretending to be AI in order to train AI. Often, data labelers don’t know exactly what they’re working on, because the work usually goes through a platform, a subcontractor, or a combination of both. So basically they can be presented with a backend where they’re asked to perform tasks or answer questions; in some cases their answers may be presented in real time as AI.

Data labeling is notoriously brutal and underpaid work. Workers sometimes earn as little as a few dollars a day, work under algorithmic management, and, because they’re sometimes trying to train AI what not to do or show, they are often shown graphic, violent, or sexual content for hours at a time. It’s kind of similar to content moderation jobs, and lots of people do both data labeling and content moderation, or switch back and forth between the industries. It’s such a big thing in Kenya that I mentioned it to the driver who took me to meet Michael for this interview, and she told me that she too was a data labeler, as are many of her friends.

Michael has since become a critical part of the Data Labelers Association, a group that is fighting to organize people who do data labeling work and that is advocating for better working conditions, higher pay, and more protections for data labelers. I met Michael at a coworking space in Nairobi in a very tiny room, so I’m not on camera after this, but here’s my conversation with Michael.

The Emotional Labor Behind AI Intimacy by Michael Geoffrey Asia

YouTube Version: https://youtu.be/2geffTpjSc4

0:00 - Intro
5:39 - What Is Data Labeling?
7:11 - The Growth of the Data Labeling Industry in Kenya
8:29 - Michael’s Introduction to Data Labeling
10:15 - Pressures of Data Labeling
11:23 - Inside the Work of Sama
12:12 - Annotating Graphic Images
15:25 - Training AI Companions
17:38 - Cultural Differences Between Kenyans and Chatbot Users
19:10 - Explicit Chatbot Conversations
19:25 - The Mental and Physical Toll of Data Labeling
21:41 - Mental Health Support for Data Labelers
23:09 - The Exploitative Nature of Data Labeling
25:07 - The Data Labelers Association
26:22 - Reforming Data Labeling
28:12 - Can Data Labeling Be Reformed?
30:18 - Is There Such Thing as Ethical Data Labeling?
33:05 - Sama’s Client Companies
35:56 - How Common Is Data Labeling in Kenya?
36:54 - Witnessing Violent Imagery As A Data Labeler
37:55 - NDAs
38:58 - Training AI to Replace You
42:23 - Data Labelers Association Outreach
Jason:

Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds both online and IRL. 404 Media is a journalist-founded and owned company and needs your support. To find our work and subscribe, go to 404media.co. Subscribers get access to bonus podcast segments and early access to interview episodes like this one. I'm Jason Koebler, and this week we have a special episode and something a little bit different.

Jason:

I recently went to Kenya for a journalism and AI conference that I talked about briefly on this pod before. And while I was there, I really wanted to meet with Michael Geoffrey Asia, who is the secretary general of the Data Labelers Association. Data labeling is a huge job in Kenya. A lot of people there are talking about it all the time. It's seen as tech work because it is tech work.

Jason:

And for people who don't know, data labelers are the people who train AI and who also work on ensuring the outputs are accurate. In some cases, data labelers are themselves pretending to be AI in order to eventually train AI tools. A lot of times data labelers don't know exactly what they're working on or who they're working for, because the work usually goes through a platform or a subcontractor or a combination of both. Although, as you'll see in this conversation, a lot of data labelers are able to figure out who exactly they're working for. But, basically, they can be presented with a backend where they're asked to perform tasks or correct outputs or answer questions or label videos, photos, things like that.

Jason:

And in some cases, as you'll see, their answers are actually presented in real time as AI to the end user. And so there have been all these stories where a quote-unquote "AI" messenger is actually just some data labeler responding and pretending to be AI, even though the data labelers themselves might not know that that's happening. Data labeling is notoriously brutal and underpaid work. Workers sometimes earn as little as a few dollars per day. They work under algorithmic management where they have these really strict quotas.

Jason:

And then, because they're sometimes trying to train AI about what not to do or what not to show, they're often shown graphic, violent, or sexual content for hours at a time. This has led to a lot of cases of PTSD. There are actually a couple of lawsuits in Kenya going on right now about this. You know, Meta has been sued. There's a company called Sama, which Michael worked for, that has had a lot of complaints against it.

Jason:

And data labeling is really similar to content moderation jobs. And a lot of people who work in data labeling also do content moderation, or they switch back and forth between the industries. It's such a big thing in Kenya at the moment that, driving down the highway going into Nairobi, you see all of these huge office complexes where people do data labeling, big Sama offices and things like that. And I actually mentioned data labeling to the driver who took me to meet Michael for this interview, and she told me that she was also a data labeler and so are a lot of her friends. I wanted to talk to Michael because he's the author of a report called The Emotional Labor Behind AI Intimacy, which was put out a few months ago by the Data Workers Inquiry Project.

Jason:

In this report, Michael explains working endless hours pretending to be an AI sex bot or a bunch of different AI sex bots, more or less, and the toll that that took on his mental health and his marriage. He's gonna talk about that in the interview, but I wanna read a passage from it that was, like, really affecting for me. It goes, quote, I had to assume fabricated identities, memorizing false backstories, and reading through previous chats. Sometimes I would be assigned a conversation that had been ongoing for several days and had to continue it smoothly so the user wouldn't realize the person responding had changed. I played the part, stepping into carefully crafted personas designed to connect with unsuspecting customers on a quote, unquote personal level, often through sexual or intimate conversations.

Jason:

When I logged into my work dashboard, I had access to multiple fake profiles of varying genders. Typically, three to five different personas I could operate simultaneously. Sometimes, I had to operate male and female personas on the same day, depending on what the platform's users were seeking. One day, I might be Jessica, a 24 year old lesbian college student from California, and Joe, a 30 year old gay man from Florida. Another day, it could be Maria, a 28 year old heterosexual nurse or a nameless woman artist.

Jason:

I felt like I was losing myself in the role. It started as any other job, responding with empathy and willfully pretending to care. But over time, it became harder to separate the act from reality. The lines blurred. I began questioning if I was acting or if I was truly becoming the persona I was forced to embody.

Jason:

I was losing touch with who I really was, a feeling that has never left me. Michael has since become a critical part of the Data Labelers Association, a group that is fighting to organize people who do data labeling work in Kenya, and internationally actually, and that is advocating for better working conditions, higher pay, and more protections for data labelers. I met Michael at a coworking space in Nairobi in a very tiny room, so I'm not on camera after this. But here's my conversation with Michael.

Jason:

I guess first, can you introduce yourself?

Michael:

My name is Michael. Currently, I'm the General Secretary of the Data Labelers Association, having been in the space for over five years now. So I'll basically take you through what data labeling is. I'll use a practical example of a self-driving car. If you have to have this car on the road working without causing accidents, this vehicle has to be taught a lot of things. It has to be fed a lot of information.

Michael:

For example, if this vehicle has to identify, let me say, a truck, an SUV, a person, a dog, any other animal, the curbs, the fields beside the roads, the traffic signs, the traffic lights, and what have you, this vehicle has to be taught. It has to be given this information like a small child. When you have to help it understand what a human being is, you don't take a picture of just one person; you have to take millions and millions of pictures of people of different heights, different sizes, so that by the time you're working on this, you're giving it variety. It knows what people are, because if you gave it, let me say, pictures of grown-ups only, and it finds a kid on the road, it's going to hit the kid. So all of these pictures are taken on the roads randomly.

Michael:

Then you're given this information. So you're told to label these vehicles correctly. And that is why, anytime you're doing all this labeling, you have to be very accurate. You have to identify this as a bus, this as an SUV, for example, this as a truck, as a trailer, so that it doesn't hit any of them and doesn't cause an accident.

Michael:

So that is why, when this information is brought to us, we have to label it correctly. Once all this is done, the information is put together, and then this machine is taught, through an algorithm, how to identify all these things.

Jason:

Yeah. So I know that there are data labeling businesses kind of all over the world, but I know that there's a lot of it in Kenya. Do you know why it's such a popular job here?

Michael:

First of all, Kenya has, I think, one of the best internet penetration rates worldwide, if not within the continent of Africa. I think almost every homestead in Kenya has an internet connection at the moment. Second, Kenyans are learned. Most Kenyans have gone through a good education system, and most of them are aggressive.

Michael:

I think Kenya, I call it a challenger nation, where all Kenyans want to challenge every system, all Kenyans want to build every system, all Kenyans want to be part of every system. That is why Kenyans can sit down and decide, this is what we need. I'll give an example of mobile money transfer, which was invented here in Kenya and first used here in Kenya, and I know most people have now trusted Kenyans because of all that. You go to Uganda, they have a problem, and then there's a solution, you know. And that was a student, not even a graduate, just a student, doing such a thing. Kenyans are tech savvy.

Michael:

We love challenges as Kenyans. We have the best infrastructure in terms of internet connectivity, and, you know, guys are educated. Yeah, that is why everyone feels so confident bringing their jobs to Kenya.

Jason:

How did you personally get involved in data labeling?

Michael:

So after working as a storekeeper for quite some time, I later on joined Betin, which was a betting firm in Kenya, and two months later it closed down. But there's a lady I was with in college, when I was doing my diploma in air cargo, who introduced me to Sama. By the time we were together at Betin, she had left and gone to Sama. Then after Betin shut down, I was like, tell me, how are you surviving today? She sent me a link to apply, and then I went to the Sama school.

Michael:

I trained, then a year later I was absorbed. That was in 2020. But prior to that, I had my own challenges as a person financially, and my kid had been diagnosed with cancer of the lymphatic system. This is COVID we are talking about, and I'm like, I don't have a job, so I'm supposed to dig back into my pocket, because flights were not in operation globally. I couldn't afford it by then.

Michael:

I went and talked to someone who lent me 170,000 Kenyan shillings. It was the equivalent of around 1,700 US dollars. So that is what I used for medication. So when Sama came in, it wasn't offering a good pay, to be honest. It was around 240 US dollars.

Michael:

That is the gross.

Jason:

240 US dollars per month. Okay. Yeah.

Michael:

So I couldn't quit the job, because I didn't have an option. I stayed at the hospital for four months, so it was me at the hospital and me at Sama at the same time, until my child was actually discharged. Then now this is the burden I have, and I need to figure out how to find solutions to the financial crisis at my place, because I need to pay house rent, I need to provide for them, and basically I need to take care of the sick child, and so on and so forth. So that is how I found myself at Sama.

Jason:

Mhmm. Yeah. What types of, like, pressures are you under when you're doing data labeling? Like, I assume you have to be really highly accurate. You have to be very fast as well? Like, are you paid per task, or are you paid per month, per hour?

Michael:

So at Sama, we were paid per month, and that is why I talked of the 240 US dollars. Now, if you're given a target, for example, you're supposed to calculate and know, how many tasks do I need to submit in an hour to hit the target? Then there was what you call occupancy, how busy you are with the tool, because it was a Meta tool called Workplace, and Workplace could deactivate your account or put you on an unavailable mode if you went eight minutes without touching the mouse. So basically you're supposed to be glued to your machine for the working period. So we had KPIs, and meeting the billable hours that the company had signed with Meta was a must. So you had to work within a certain time frame.

Michael:

KPIs were set and you had to meet the KPIs. Mhmm.

Jason:

So what types of work were you doing at Sama?

Michael:

Sama was basically labeling, or annotation as they call it, mostly images and videos. Yeah, you're given a video and you're told to describe the video, for example, or you're given an image and you're told to work on it in accordance with specific guidelines. For example, if you're given, let's say, pictures of people where the picture only contains faces, and you're told to identify faces, you're supposed to draw bounding boxes around the faces and label them. So that is basically what I used to do at Sama. After I joined Sama, that is when, through friends and colleagues, they were like, you guys, there are some other companies, some other opportunities online, and so on and so forth.

Michael:

So the only advantage we had was we transitioned to work from home. But it was during that phase that I also came across another gig, which was not a good one for that matter, where we were expected to annotate pornography. And pornography in this case was like, you're presented with a video and you're supposed to put yourself in the minds of the eight billion people on Earth, and you have to put tags on every frame, and by frame I mean every second of that video. You have eight billion people in mind; you know someone is searching for this pornography in Cuba, these are the tags they can use, like if you're searching "doggy," such kinds of things. So you have to have all that in mind; every time you're watching a video, you put tags, like 12 to 15 tags, on every frame. So you're supposed to work on pornography for eight hours a day. I did the project for eight months.

Michael:

So you understand all that. It's something I never want to talk about sometimes, because it wasn't an easy thing. Watching pornography for eight hours a day, and for eight months. I went for therapy for six months.

Jason:

It must be very sort of destabilizing to do that.

Michael:

So much, because you get to a point where your body can't function. You are here, even if someone is there naked, you don't even feel it. And you're here, you have a wife, yeah, who expects a lot from you, being a young family, you know, a young woman. She also has blood flowing in her veins, and she expects a lot from you intimately.

Michael:

But then you can't, like, do that. Yeah. It fractured a lot of things at that time, and that is how most of those things were lost in the process.

Jason:

Were you specifically labeling, like, what was happening in the video? And do you know what the purpose was? Was it for searching, or was it, like, this pornography is actually illegal so we can't include it, or what?

Michael:

I think for them it was basically to improve viewership on their websites, for example, because if you're on the internet and you want to search for a certain pornographic video, there are tags you're likely to use to be able to access that kind of video. So basically it was to improve their viewership, from my own assessment. Because why would I do 12 to 15 tags on one frame? Mhmm. It's to ensure that even if someone was thinking outside the box, they would still come back to accessing this video.

Michael:

Even if someone was searching differently, they would still come back to this very video. That was the main purpose of having all these tags on all those frames.

Jason:

Do you know what websites these were ultimately for or you didn't have that level of information?

Michael:

Most of the communication was done via a no-reply email. So you receive a no-reply email, but it has a link. Then it moves you to a certain site. And from what I noticed, they had like three working sites. So today you're working on this one, the next time you're working on that one, so you couldn't tell who specifically was responsible, or rather who specifically was giving this work.

Michael:

Mhmm. Then once the work is done, your account is suspended and you do not have access.

Jason:

So in your report, you talk about how you feel like you were training AI companions at some point. Can you talk to me about how you got into that? Was that the same job or a different one?

Michael:

That was a different one, because this now was basically chat moderation and not data labeling or data annotation, let me say. And when you talk about trying to train these chatbots, you sit down and ask yourself, how do these companies resolve disputes? Because part of the things you're told from the beginning is that you're not supposed to share your personal information, for example. So here you are, you share your personal information, and it's flagged by the system. Then that is how they narrow down to you and suspend your account.

Michael:

So that means, because again you still have your messages, there's a message count in your dashboard, and that is what they used to pay you. How do you identify the message count without having the messages? How do you improve the message count? How do you resolve conflicts? So it means these messages are being stored for future use.

Michael:

Right. Right. So we are like, we are here. And for these messages, there are certain strict guidelines you're given: you must have this number of characters before a message can be sent.

Jason:

Mhmm.

Michael:

So that means you are not allowed to send very short messages. You're supposed to meet a threshold of characters on every text message you're sending, and that is why they require someone with a high typing speed, of at least 43 to 50 words per minute. And that is where you're like, the speed they need is to ensure that they get the very many characters they need within the one minute you're supposed to submit that text message in.

Michael:

So the response had to be within one or two minutes, depending on the site you're actually on. That is where they needed the typing speed.

Jason:

Yeah. Do you think that you were talking to real people?

Michael:

To be honest, yes. Because you could feel the human aspect of their conversation. And there are times, when someone's looking for love, you can tell if this is a real human thing or not. And most of the people who were on those sites were lonely people. And I really doubt that even in Kenya we can have people who are lonely to that extent, of, you know, paying for such services. And to me, I would say I think we are raised totally differently as a people, and generally as Africans, and there are certain things you rarely find in Africa.

Michael:

For example, people paying a subscription fee to have such services offered to them, like someone feels lonely and they can pay to get company. I haven't seen it anywhere in Africa, unless, you know, it's there. But I think they target the vulnerable, and mostly the aged people. Let me say the people who have gone through a lot, like someone who suffered an accident, for example, lost their partner and what have you; they have had mental issues maybe, and they need support, they need someone to talk to; those guys are looking for love. And now all these things are all integrated on one site. Mhmm.

Michael:

And that is what you're supposed to be responding to. So the target group, I'm sure, is not in Africa, to be honest.

Jason:

Yeah. It must be the US and Europe. Yeah. Yeah. So you log in to this job where you're, like, chatting with people. I mean, what sorts of things are you, like, asked to do?

Michael:

This job requires a lot of creativity and fast thinking, because if I'm talking to a man, I'm supposed to act the woman. If I'm talking to a woman, I act the man. If I'm talking to a girl, I'll act like one. If I'm talking to a lesbian, I'll act like one.

Michael:

So it required a lot of creativity in switching between all these conversations.

Jason:

Did you have to be, like, explicit with these people? Like, were you texting them, essentially?

Michael:

That was basically what we were doing. Yeah. The guys who come here, they strictly need someone to sext with them. And the site even provided pornographic stickers.

Jason:

So you said that, you know, having to watch pornography for eight hours a day desensitized you. It was not a good job. This seems like almost another level of that, because you're, like, having to participate back and forth. I mean, was this difficult for you?

Michael:

The pornographic one was a different one.

Jason:

Right. It was a different one, I'm saying, but you did that and then you did the sex chat one.

Michael:

I was doing these at the same time.

Jason:

You were doing them at the same time.

Michael:

At the same time. Because I had a shift between four and nine, for the chat moderator job, then I had the other one after nine to around three or 4AM in the morning. So that is how I used to work. So I was experiencing these two jobs, having these two jobs with different experiences, but all of them at the same time.

Michael:

Yeah. And I had to do the Sama shift during the day. So basically that is how we ended up, some of us never sleeping. We used to work for at least eighteen hours a day.

Michael:

And that is why some of us are still suffering from insomnia to date. And when I told you I have gone for, like, three days without sleep, I basically mean it, because sometimes you're, like, there, you switch off the lights, but you can't sleep. Mhmm. And you realize it's already 5AM in the morning, and you can't go to bed at 5AM. So you're, like, supposed to pick up something else and do it.

Michael:

So most of the guys who've been in this space have a problem with sleep, most of them. I happened to go to, there is an institution called the Faraja Cancer Support Center in Parklands, where I engaged one of the therapists, who really helped me big time. And I have always said the Faraja Cancer Support Center has been of immense help to me as a person. Because some of the things they took me through to help me, I don't think everyone would have done that. But it was one of the best services I ever got that year, because they stood with me, they were there for me, and I think I got to access them because I was a caregiver to a child with cancer.

Michael:

That is how I actually got to interact with them, and that's how I was like, no, I need a solution to this. Yeah, so I take my child, I go there for therapy as the caregiver of a cancer patient, and at the same time I have to explain to them, this is what I'm going through at a personal level now. So that is how I got help. Yeah.

Jason:

Did any place that you worked for ever offer you any sort of, like, mental health support at all? Did they even talk about it, ever?

Michael:

No. At some point we had that. I wouldn't say it wasn't there, but sometimes, you see, you go through problems where even the therapist doesn't know what to tell you. They don't even know what to tell you. Not to look like I'm looking down upon them, but, like, okay, the issues you go through, when you go there explaining them, they are also shocked. So you're like the one doing counselling for them, not them doing it for you.

Michael:

So sometimes there was that gap, because they did it from a general point of view, not from a worker's point of view, because what I was experiencing is not what they have experienced. You see, it's like I'm trying to advise you on how to curb drugs, for example, and I have never used one. Or how the effects of cigarettes affect someone, and I have never smoked. I can't tell you how it tastes unless I taste it. So for us, I felt like those guys needed to have a taste of it, because they used to treat it from a general point of view, but if they labeled data, for example, and not just any data but graphic data for that matter, they would understand what we talk about.

Michael:

So I would personally have suggested they go through data labeling first, as part of equipping them with the knowledge of what is in the pipeline, what is happening to these people, for them to be able to advise people accordingly.

Jason:

Do you regret doing this job or do you feel like it's something that you just needed to do in order to, you know, provide for your family and your child?

Michael:

I didn't have an option. If I had, I wouldn't have done that, but I didn't have an option. I really had a financial burden that I needed to sort out at that time. So the best I could have done was just accept the job. So I think that is the vulnerability that most people are going through, and that is why most of these companies are taking advantage of, you know, some people.

Michael:

Because under normal circumstances, I don't think anyone would take that job. Today I wouldn't take it. I wouldn't take it whatsoever, because I know the damage that the job can actually bring to any human. And also, that is the reason I helped start the Data Labelers Association. We need to address some of these problems.

Michael:

And I would remind them that we're here, and they will feel our impact. Yeah. Unless they do things the right way, I'm going after them. And the good thing is, I'm doing it from a point of experience, not assumption. I have been through this, I know what I'm talking about, and I know how to approach it.

Michael:

So I know today they might try to ignore who we are, but at one point they will call us, and at these tables we will sit down with them, because we will ensure that we have the right protections in place, the right policies in place, and we must be part of the policymaking process. We are not going to be left behind, because we understand this space better than any other person. Because I went through that mess, I understand what it is, and I understand what can be done to find a solution to this problem.

Jason:

So let's talk about the Data Labelers Association. How long has it been running for? When did you start it?

Michael:

It was through a discussion, you know, a conversation with friends and colleagues. We started the discussion in around December 2023. So it was after one of us did research on AI harm that she came across all those challenges, and we were like, now we need to address this. Because some of us never knew that we have so many companies working here in Kenya; like, through the research she met someone who works for Netflix as a captioner. You're like, Netflix has employees in Kenya? How? Mhmm.

Michael:

We're doing captioning. I'm like, how? And we're like, something must be very special about this space. And we're like, we need to find out. Because every other complaint, or every other issue people raised, was basically a violation.

Michael:

And we're like, no, we can't continue like this. One thing that we believed in was that most of this work is done by young people. And the destiny of this nation depends entirely on the opinion and the contribution of its young people. So if these young people are ignored, the future is lost. And we were like, we won't lose our future.

Michael:

Mhmm.

Jason:

So what, what sort of like changes and reforms are you fighting for?

Michael:

Number one, we basically need protections around this space, because our labor laws were amended in 2007 and digital jobs were not really in the market at that time, so there are no protections for them. And that is why I feel like the tech companies are taking advantage of that loophole. The process of coming up with such policies and laws takes time. So we came up with a model contract, for example, and a code of conduct that we will be launching officially, should be next month, but that will be communicated in advance. And we really need to find immediate solutions to long-term problems, because if we stayed here waiting for such laws to be put in place before they even take effect, it would take quite some time. So we are like, we need to find working measures to ensure that we stop the violations and people are working in a better environment.

Jason:

So are you focused on signing up the people who are actually doing these jobs? Like, are you trying to get people to join your association?

Michael:

We already have almost 870 members as of now, and we are like, this is the time we need to interact with our members, this is the time we need to let them know who we are, what we do, and why we have decided to do this. And that is why we have everything arranged strategically for that day. And that is why we're not talking about it, because we do not want interference from external factors. Yeah.

Jason:

So do you think that this is a type of job that can be reformed? Like, if you fight these companies, is this a job that, if there was better mental health support, if they paid more, if these sorts of changes were made, is it a job you think people should be doing?

Michael:

Basically, we need technology. There's no doubt about that. We need technology, but it shouldn't come at a human cost. We can have this done the best way. What is so hard about offering mental health support to the people working on graphic content? You know clearly they need it. Is it so hard to provide?

Michael:

Question: if this job was done in the US, for example, would they still do what they are doing here in Kenya? Would they still give the pay they're giving here? Because we have friends, guys we've been working with on several projects, in the US and the Philippines and so on and so forth. There, someone is paid during training. In Kenya we self-train and we are never paid.

Michael:

But in the Philippines and the US, for example, someone is being paid $30 per hour for training tasks. In Kenya we are paid zero for training tasks. Someone is paid $50 per hour; here we are paid $0.01 per task. It doesn't make sense. Why this discrimination?

Michael:

If they can pay people in the US and in the Philippines well, that means they can pay people in Kenya well. So it's not a hard thing. It's not that complicated. It's just a matter of: let's do things the right way. And I have a problem, because I don't know if doing things the right way is a crime in some countries.

Michael:

Is it a crime, for example, in the US to do the right thing? No. Then why are we doing this here in Kenya? Mhmm. Why in Africa?

Michael:

Why? What went wrong?

Jason:

Right. I mean, it's taking advantage of the country. It's very extractive. Which is not right.

Jason:

Right.

Michael:

It doesn't have to be a win-lose situation. We can have a win-win situation. It's not a crime, having a win-win situation. The business can still thrive, the business can still take care of itself, even if they share a win-win.

Jason:

Are there any companies in Kenya that you know of that are doing things the right way?

Michael:

I haven't interacted with any, to be honest. Mhmm. For the following reasons. One, we have Sama in Kenya and Sama in San Francisco. Are you being paid the same way?

Michael:

Are people being compensated the same way? Of course not. And we want this nonsense, the issue of minimum pay, to end, because we can't keep doing things this way. They're using the words "minimum pay" as a tool to enslave these people and mistreat them. It's not right. Secondly, we keep asking questions.

Michael:

Why would someone there be paid $50 per hour? And what we are saying here is: fine, we do not want the $50, but can we at least move it to between $5 and $8 per hour? That is what we're asking for, at least on the minimum side, on the lower side. We're not asking for $50. Can we actually start from between $5 and $8 per hour? Let's start from there. Mhmm.

Jason:

Did you have bosses, human bosses?

Michael:

Let me not call them bosses. They're puppets, if I might use the right word. Call them puppets, because every time there is an issue, if they want to implement, let me say, draconian changes, you see someone telling you it's the client, it's the client, it's the client. So who is this client? You can't talk because it's the client; the client has to kill your people just because it's the client? No, we can't work that way. We must have the right approach to issues if there are changes that are supposed to be effected, because there are times when today someone comes in and you're supposed to be working for six hours, the next minute you're supposed to work for ten hours, the next minute you're adding fifteen minutes to your time because so-and-so is absent. Okay, how is that my business?

Michael:

I never signed a contract with that person. So if someone is absent, it's up to the company to find a way to compensate for the billable hours, not to use me to fill the billable hours while I'm not being compensated for the same. I really had that issue before I left Sama. You're told that today everyone will be adding fifteen minutes, fifteen minutes on every day shift, for the same pay. And I was like, we are 200 people. Fifteen minutes for 200 people, how many minutes is that? For a whole month? So the company has met the billable hours for the client, but why are they not compensating people?
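Michael's back-of-envelope math adds up quickly. A minimal sketch of the arithmetic, assuming 200 workers, 15 extra minutes per day shift, and roughly 22 working days in a month (the last figure is an assumption for illustration, not from the interview):

```python
# Back-of-envelope arithmetic for the unpaid extra minutes Michael describes.
# workdays = 22 is an assumed figure, not stated in the interview.
workers = 200         # people on the shift
extra_minutes = 15    # minutes added to each person's day
workdays = 22         # assumed working days per month

daily_minutes = workers * extra_minutes         # unpaid minutes per day
monthly_hours = daily_minutes * workdays / 60   # unpaid hours per month

print(daily_minutes)   # 3000 minutes, i.e. 50 person-hours per day
print(monthly_hours)   # 1100.0 person-hours per month
```

On those assumptions, the fifteen-minute add-on amounts to over a thousand uncompensated person-hours a month across the shift.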

Michael:

And then you see people, you know, changing cars day in and day out. Today somebody's in a BMW, tomorrow somebody's in a Benz, and you're like, wow.

Jason:

Well, can we talk about that, about the client? Because you worked for Sama, and there's a bunch of companies like that. People in the United States have never heard of these companies, except the people who've done, like, a lot of research. But you're not really working for Sama. You're not really working for these companies.

Jason:

You're working for the big tech companies ultimately.

Michael:

Yeah.

Jason:

How do you think about that? Because it's almost like this level of abstraction between the two, where you are doing work for some of the richest companies in the world, but you're contracting with these random companies that spin up in Kenya and then take your labor. They probably make a lot of money from these companies. I think the whole system is one where big tech is at the top and then you have all of these weird contractors in the middle that are taking advantage of you. It's not really a question, but sort of, how do you think about that?

Michael:

First of all, as far as I'm concerned, for the whole period I worked at Sama, over three years, I was working for Meta, and I can say this more than a hundred times. I can prove this more than a hundred times. I can even prove the projects I worked on, and whoever was in charge of that project from the Meta side, not from this side. I have that evidence with me, and I will not be afraid to share that kind of information. So I know I worked for Meta; under Sama, I worked for Meta, and I can prove that. Meta works on a platform called Workplace. There is no other platform called Workplace anywhere that is not Meta's. Mhmm.

Michael:

So I worked on Workplace for the three years. I worked on a project called GPL6. GPL6 was purely Meta. There was nothing else. We need to understand, it was so strict that we got to a point where not everyone would attend client meetings.

Michael:

But at least we had access to certain information that they didn't know we accessed. Sometimes there is a leakage, where someone sends the wrong document to you, and you're like, ah, is this what we are doing? Is this what we are interacting with? So this is the client. So I worked for Meta personally for three years, under Sama.

Michael:

So I can prove that, anyway. For anyone who is working under a BPO and is working on Workplace: please understand, you're not just working for Sama, you're working for Meta. It's as simple as that.

Jason:

Yeah. Yeah. Meta doesn't hire you directly because Meta doesn't want to

Michael:

get its hands dirty. It's because they want to run away from the legal responsibilities, exactly. And, you know, they will never run away.

Jason:

Yeah. So in the Uber ride over here, I was talking to the driver, and she's also a data labeler. How many people do you think are doing this right now? Is it a pretty common job here?

Michael:

I'd say thousands, because, let's start from our group of trainers, for example. We have guys who had a database of 30,000 trainees. That is one person, dealing with 30,000, training 30,000 people, and we have like eight of them. So even on a ratio of, let me say, one to 10,000, that is 80,000 already.

Michael:

So we have a database of close to 200,000 people who were trained specifically for remote tasks, not even counting other places. I'm not even talking about BPOs; this is specifically remote tasks. Those are the ones we can access on our end, as the ten of us. What of the other trainers we have not even reached out to?

Jason:

What are BPOs?

Michael:

You know, BPOs are a physical setup here, a physical office setup here in the country, like Sama. Okay. That is a BPO.

Jason:

Business process outsourcing. Did you ever have to look at violent content?

Michael:

Pornography was one of it, because I viewed a 13-year-old and I had a conversation, and it has never left my mind. A 13-year-old, and I had a conversation.

Jason:

It's really bad.

Michael:

Yeah.

Jason:

And so you have to report that and say...

Michael:

You can't report that. You're supposed to just put that on the platform. Yeah.

Jason:

So you don't know who to report to?

Michael:

You don't know them. You don't know the channel you're supposed to use. And you have another slave tool, some monster called the NDA. The NDA is a slave tool used to keep people from speaking about what they are going through. And going forward, we have a feeling that these NDAs should be drafted with everyone present, not by the companies alone.

Michael:

You can't tell us that we're not supposed to speak about the violations. We'll speak about them, unless we sit down together and agree: this one is not right, this one is right. I have a sample of an NDA.

Jason:

When did you decide that you would talk, even though you were subject to these NDAs?

Michael:

This time, I'm very much ready for any legal battle anyway, but we are like, no, we're not going to keep quiet. Mhmm. This is us suffering, and we can't suffer in silence.

Michael:

This is not the colonial period, where you're supposed to just keep quiet. No. No. No.

Michael:

I have a right to speak against any violation, anywhere. And that is what I'm doing. And I say, I don't care if they suspend all my accounts on these platforms. I don't care. After all, I had survived for all those years before I came across these jobs. I would rather speak; let them suspend all the accounts.

Jason:

What did it feel like? Like, you are kind of training your replacement in some way; that's what the job is. You are training this robot to be more human, and you're training this robot to be a companion for other people, in the case of the AI companion stuff. It must be weird.

Michael:

No, it's kind of funny, because I really suspected this when I noticed that some of these messages are being stored, or most of these messages are being stored. Why? The reason, as I said earlier on, is: how do you handle a conflict? These messages have codes, so they use the codes to access the messages. If today I shared a screen where I'm chatting with the user on the other side, they can use the very code next to the message to know who sent the message, because the system generates the codes.

Michael:

So now these messages are being stored, because how do they resolve conflicts? How do they pay you? Right. If they pay you after two weeks, that means they can refer back to the number of messages you sent. Yeah, yeah.

Michael:

And they can certify that you sent these messages. And if you share your personal information, they can still know you shared your personal information. How do they know? Yeah. Then there are issues like, there are times when you log in to any platform and start chatting, and there are those immediate responses that just come from the bots. Question: where did they get that?

Jason:

From you.

Michael:

So that is how it started. AI can never be AI without humans, and for me, I've already said this and I will say it repeatedly: it's not artificial intelligence, it's Africa intelligence. Most of these dirty jobs have been done here in Africa. Guys in Africa have been there. I would say I've been a software developer without even going through school, for the following reasons. One, when you're doing piloting, for example, you're given a dummy website that you need to work on. The client brings the job posted on that site, and you're asked: what do you think should be added to this? What do you think should be removed?

Michael:

What do you think should be included? So you're given the tool, you're going through it, and you're like, please add a skip button here, please add this here, please add this there. And once the tool is functional, that is when communication stops. Question: what happened?

Jason:

So I read an article, also written by a Kenyan, that was titled "I don't write like ChatGPT. ChatGPT writes like me." Did you see that article?

Michael:

Yeah, I came across it, but I never went through it fully.

Jason:

Yeah. I mean, basically, the point was that now when he writes online, when he writes on LinkedIn and places like that, he's accused of using AI, but that's because ChatGPT is trained on the way that Kenyans write, on Kenyan English.

Michael:

So I think it's a problem most people are facing currently. And you see, that's when we say we are training our own death. There is someone training his own death. So we trained ChatGPT, fine, but now it's killing us slowly, because everything you're going to write will be regenerated. We can never run away from that.

Michael:

And the reason for this we don't know.

Jason:

Mhmm. Is there anything else that we haven't talked about that you think people should know?

Michael:

The most important thing is about the Data Labelers Association. It is the first organization of its kind in the country, and we are not just planning to remain here in Kenya; we are going global, that is our target. I believe we are the first to come up with such an organization, one that protects the welfare of data labelers worldwide. We've seen so many people try to copy that, and we're like, we must go global, because this is a global problem. We can't address it only locally.

Michael:

We have to address it globally because it's a global problem. We don't want our friends and colleagues in Brazil, for example, to go through the same. If this change has to be effected, it has to be across the globe, everywhere. And that is why we are here as the DLA; that is our main objective as a people and as an organization. So the DLA is global. Feel free to register as a member anywhere in the world.

Michael:

We would love to meet our global members any day. We would love to go to Brazil and meet our members in Brazil, in Ecuador and Venezuela, so to speak.

Jason:

Thank you so much. Thank you for your time.

Michael:

Thank you too for your time.

Jason:

Thank you for listening to the 404 Media Podcast. If you enjoyed this episode, please like and subscribe, leave a comment, tell your friends about us, et cetera. We'll be back in a few days. This episode was mixed and edited by Alyssa Midcalf from Kaleidoscope.