The 404 Media Podcast (Premium Feed)

from 404 Media

The Tea Breach Just Keeps Getting Worse

Episode Notes


Transcript

We start this week with Emanuel's and Joseph's coverage of Tea, a women's dating safety app that was breached multiple times. After the break, Sam and Emanuel talk about how a new UK law about age verification is impacting people's ability to see footage about current events. In the subscribers-only section, Jason explains that LeBron James is not in fact pregnant.

YouTube version: https://youtu.be/COKdgdHuBV8
Joseph:

Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds both online and IRL. 404 Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co. As well as bonus content every single week, subscribers also get access to additional episodes where we respond to their best comments. Gain access to that content at 404media.co.

Joseph:

I'm your host, Joseph. And with me are 404 Media cofounders, Sam Cole (Yep), Emanuel Maiberg (Hey), and Jason Koebler.

Jason:

Hello. Hello.

Joseph:

Okay. Final reminder.

Jason:

Last chance.

Joseph:

Literally. Because if I mean, if you're a subscriber, you get this early. I think you get this on Tuesdays. If you're a free listener, you get this the day after. That's another benefit of subscribing.

Joseph:

You actually get it early and everybody else gets it the day after. July, 6PM at RipSpace in Los Angeles: we are having a live event. It's going to be me, Sam, Jason, and then we're also gonna be joined, I believe, by Dexter Thomas. Right, Jason?

Joseph:

Do you just wanna explain what we're gonna be talking about and sort of what we're doing briefly?

Jason:

Yeah. So we're gonna be talking about our reporting on the technology that powers ICE. We felt that was appropriate to do in Los Angeles because a lot of this technology is being deployed in LA. And so our friend Dexter Thomas, used to work with us at Vice and now has a podcast called Killswitch, but also is an independent journalist like us, will be with us and talking about his reporting on the ground because he went to a lot of the protests and filmed at a lot of the protests in LA. So we're gonna do that for, like, an hour ish, and then the rest is gonna be a party with the DJ and beer and wine and just hanging out, and it should be fun.

Jason:

It should be a good time. So if you're on the fence, please come. It will be fun. You can find tickets at bit.ly/404ripspace or on our website. And if, again, if you're a subscriber, you get free tickets.

Jason:

So on our website, there's a code. Just scroll back to where we posted about the event. You can find the code.

Joseph:

Yes. Alright. That all sounds good. Looking forward to seeing everybody there. Let's get straight into this week's stories.

Joseph:

The first one, Emanuel's the first byline and I helped out on it as well. Emanuel, the headline is: women dating service app Tea breached, users' IDs posted to 4chan. I guess, first of all, Emanuel, how did you get this tip? This was on Friday, I believe, and it was pretty fast moving. Can you walk us through when and how you got that tip and what happened next?

Emanuel:

Yeah. Sorry. I have to correct you. You call it a dating service app, I think. It's a dating safety app, which is an important distinction.

Emanuel:

It's like an app where women are invited to, they thought, safely exchange information about men that they want to date or are dating. And the way this happened, I believe this was Friday morning, and I just get a call on my phone and I can see that it's to my Google Voice number, which is kind of the number that I've shared previously for people who wanna send us tips. And I don't always pick that up, but I picked it up for whatever reason that morning. And it was like a good Samaritan, I would say, somebody who is his day job is, like, IT adjacent, and he sounded pretty frazzled and panicked. And he was like, hey, there's something going on with 04:10.

Emanuel:

You have to see it. I'm sending you some links. And I was like, I couldn't really understand, but then I went to 4chan. And by the time I got there, it was obvious that this app called Tea had a major breach that people could get users' images, which the app asks people to upload selfies or previously photos of their ID in order to prove that they're women because it's an app for women, and people could get their hands on thousands and thousands of those images, some messages, and some other data. We can get into all of that.

Emanuel:

And not only was it clear that it was available, people were already making it available off their app. So it's like, when I got there, anybody could go there and get into the cloud computing service, a Google service that the app used to deploy the app. People could, like, rifle through that, because it was exposed.

Emanuel:

It was exposed. Fully exposed. Yeah. But people were already, like, archiving it and using the images and mocking these women. It was like the vibe was like, oh my god.

Emanuel:

This is really bad, and it's already, like, way too late. It's like as you know, Joe, it's like you can discover a hack at several stages. You know? It's like it could be a researcher, and they disclose it responsibly, and then the company closes it. Or you can find it as a journalist because of a tip, and then you tell the company.

Emanuel:

Or it's like it could be like a little known vulnerability or one that only certain hackers are exploiting. This was like open season. Everybody was in there and kind of gleefully taking advantage of it and making fun of the users.
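The kind of exposure Emanuel is describing, cloud storage that answers requests with no authentication at all, can be sketched concretely. This is an illustrative Python sketch against Firebase Storage's public listing endpoint; the bucket name and object names below are hypothetical stand-ins, not Tea's actual infrastructure.

```python
# Illustrative sketch of probing a misconfigured (publicly readable) Firebase
# Storage bucket. Bucket and object names here are hypothetical examples.
import json
from urllib.parse import quote

def listing_url(bucket: str, prefix: str = "") -> str:
    """Firebase Storage's REST endpoint for listing a bucket's objects."""
    url = f"https://firebasestorage.googleapis.com/v0/b/{bucket}/o"
    if prefix:
        url += "?prefix=" + quote(prefix)
    return url

def object_names(listing_body: str) -> list:
    """Extract object names from a listing-response body."""
    return [item["name"] for item in json.loads(listing_body).get("items", [])]

# If the bucket's security rules allow unauthenticated reads, a plain GET on
# listing_url(...) (curl, urllib, anything) returns JSON shaped like this:
sample_response = json.dumps({
    "items": [
        {"name": "uploads/selfie_0001.jpg"},
        {"name": "uploads/id_0001.jpg"},
    ]
})

print(listing_url("example-app.appspot.com", prefix="uploads/"))
print(object_names(sample_response))
```

The point is that no token or session appears anywhere in the request; anyone who learns the bucket name (for instance, from inside the app's own code) can enumerate and download everything in it.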

Joseph:

Yeah. I mean, one of the quotes maybe from the original poster, I believe. Yeah. I think it was them. I mean, their direct quote was in all caps saying, driver's licenses and face pics, get the f in here before they shut it down.

Joseph:

It was, as you say, open season, and it wasn't just a quietly posted link. It was people explaining to one another, hey, here's a script to rifle through the metadata of the files, it's just a series of attachments, and that sort of thing. And then people were using scripts to download those images in bulk and then making them available.

Joseph:

And, of course, I ended up downloading some of these, or, I think, all of them. I downloaded the entire dump once it was made available so I could verify. And, I mean, it's a lot of data. It's a lot of images, and Tea later confirmed it was tens of thousands of people's selfies and their identity documents. Just to back up a little bit, Emanuel explained what the app is, but Sam, you've covered these "are we dating the same man" Facebook groups, and this essentially is the appified version of this, or it came from this.

Joseph:

Is that fair? And can you tell us sort of what those groups are a little bit more broadly? Because it's kind of the same thing. Right?

Sam:

Yeah. I mean, I didn't get to use Tea before it went down. So I don't know if it's exactly one to one, the same thing, but it's the same idea. It's definitely, like, so the are we dating the same guy groups, women would post a picture of a guy usually, or maybe a description of a guy, but usually a picture, and say, hey. I'm going on a date.

Sam:

For example, I'm going on a date with this guy tonight. Does anybody have any red flags? And it's like red flags are, like, the code for, like, don't go or, like, we have information for you. On Facebook, this was happening with full names. Obviously, people attached to it.

Sam:

So more risky, but the groups were closed. So, ideally, you wouldn't be able to get in if you were a bad actor. But, yeah, it's like a vetting thing where, like, you know, you could say, oh, yeah. This this guy I've been on four dates with. Does anybody else know him?

Sam:

And someone's like, that's my husband. You know? It's like, that's an extreme case, but, like, that does happen plenty. And that's the idea with the are we dating the same guy ones. And then men get super, super pissed about the existence of those groups themselves and just are enraged that they're on them.

Sam:

And then, obviously, I think a lot of that rage is what we saw happen with the Tea hack.

Joseph:

Yeah. I mean, can can you just briefly elaborate on that? And I know I'm not really asking you to put yourself in the mind of a 4chan user necessarily, but you see I mean, it's an obvious question, but I'm gonna ask it anyway. You see a correlation there between some men getting very, very angry when, say, their face is posted into one of these Facebook groups and 4chan users rifling through this database. Is that sort of like one and the same thing almost?

Joseph:

Obviously, one's a lot more extreme than the other ones, but is it sort of the same sort of behavior? What do you think?

Sam:

Yeah. And I don't, I mean, it's just different. It's not really more extreme. I mean, what's extreme is, like, guys suing these groups and, like, the administrators of these groups on Facebook and saying, you know, I'm suing you for libel or whatever, like, defamation of

Joseph:

Which they've done.

Sam:

Which they've done, which is obviously a pretty serious reaction to having your picture put on Facebook by someone else. But, yeah, I mean, we saw it, like, with Tea in general. Tea went a little bit viral earlier in the week, last week, and people were talking about it even though it's been around for a while. And then a lot of people were, like, posting, like, these, like, satire or, like, maybe they were real.

Sam:

I don't know. Like, fake apps that were, like, the flip side, like, really being really derogatory toward women and, like, saying, like, we're gonna put, you know, your faces in here. And after the Tea hack, we saw, like, Tea Spill, which is, like, a hot-or-not game based on the hacked images. It's all, like, proving the point a little bit. It's like, yeah.

Sam:

Like, you know, women were warning each other about you, and this is your response, to be, you know, horrible back at them. It's kind of like, okay. No wonder. But I do feel like it's all kind of connected, and it's all coming from, like, this online dating world where, like, you know, guys are dangerous to go on dates with very often. Women are, you know, the victims of violence on online dates quite often.

Sam:

So it's understandable that this exists, but the hack is definitely, I feel, a visceral response to that.

Joseph:

Yeah. And that website you mentioned: somebody had taken those exposed tens of thousands of images, and you're shown two images, I think, and you have to select one based on their perceived attractiveness, like, as you sort of hinted at, almost like the very early Facebook site from Zuckerberg. Right? And that's got tens of thousands of rankings or something like that. So already, the information is being abused in various ways.

Joseph:

So that all happens. There's the media firestorm. We reveal and first report this data exposure. Then it gets worse. And the headline of this one is: a second Tea breach reveals users' DMs about abortions and cheating.

Joseph:

Jason, it was actually you that got this tip. How did that come about?

Jason:

Yeah. So a researcher who we had done a story with back at Motherboard, like, all the way back in 2016, reached back out to me and was like, it's not just this initial hack or this initial, like, exposed database. The actual, like, main database of Tea was exposed as well, which included, you know, like, presumably or seemingly all of the DMs, including things like usernames. You know, it was searchable to some extent, so they were able to show us. There was, like, women talking about abortions that they had sought or that they had had, talking about, you know, cheating situations, like, really intense situations that they had with their partners.

Jason:

Like, it it was very, very, very sensitive information. And so, you know, since you and Emmanuel had been reporting on this for a while, I passed it off to you to to actually do the reporting and confirm the story because you're already very deep in it. But, basically, this person is really good at decompiling information and sort of found that there was an exposed database, like a a further exposed database that made the hack, like, way worse than we initially thought.

Joseph:

Yeah. Whereas the first one was a Firebase instance. Again, as Emanuel was saying, Firebase is this, like, app development platform by Google, and it seems that you didn't need any sort of real authentication to go in and get those selfies and ID photos, that sort of thing. The way it was described to us by this researcher for this second data exposure was that any user's API key, you know, you make an account with your username and password or whatever, and you're given an API key. That's just how apps generally work.

Joseph:

Right? The way it was described was that any user's API key could query the entire database. So, basically, you had almost like admin level access even though you're a random person who just downloaded the app, and that's not good, obviously. That's really, really not ideal when anybody can access all of that sort of information. So the researchers sent us over all these screenshots, and they were very interesting.
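As an illustrative sketch of the flaw Joseph describes (not Tea's actual code), the problem is an API that authenticates a key but never scopes queries to that key's owner, often called broken object level authorization. All names and data below are hypothetical:

```python
# Hypothetical sketch of the access-control flaw described above: the server
# checks that the API key is valid, but never filters results by its owner.

MESSAGES = [
    {"owner": "alice", "text": "hi"},
    {"owner": "bob", "text": "secret"},
]
API_KEYS = {"key-alice": "alice", "key-bob": "bob"}

def get_messages_vulnerable(api_key: str):
    """Any valid key returns EVERY user's messages -- the flaw described."""
    if api_key not in API_KEYS:
        raise PermissionError("bad key")
    return MESSAGES  # no per-user filter: effectively admin-level read access

def get_messages_fixed(api_key: str):
    """A valid key returns only that key's own user's messages."""
    user = API_KEYS.get(api_key)
    if user is None:
        raise PermissionError("bad key")
    return [m for m in MESSAGES if m["owner"] == user]

print(len(get_messages_vulnerable("key-alice")))  # alice sees bob's DMs too
print(len(get_messages_fixed("key-alice")))       # only her own
```

The fix is one line of filtering, which is why exposures like this tend to come from the query never being scoped in the first place rather than from any sophisticated break-in.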

Joseph:

But we need the data, you know, to verify what is going on here. Of their own accord, they had already downloaded this information and sent it over to us. And kind of selfishly, I very much enjoy these, like, reporting puzzles of we have this data, we have to figure out and verify, prove it came from a certain service. For the first one with the driver's license photos, I downloaded the APK. I decompiled the Android app, I found the, oh, yeah,

Joseph:

that exposed database in there, the same URL is in the app. So that was pretty good verification. For this one, with the more than 1,000,000 messages, we went through, did get some phone numbers, texted some alleged users in there. One eventually got back to me and confirmed, yes, I am a user of Tea, but that was actually after we published. How we verified this one was that we took the usernames from the million messages, not all of them, just a random snapshot, and then I tried to make accounts on Tea with those usernames.

Joseph:

And in every single case, that was not possible because that username was already in use, indicating that, yes, these million messages have actually come from the Tea app. So whenever we get data like that and whenever we can verify it like that, I'm always supremely confident in the veracity of what we've got. Jason touched on this, but Emanuel, you also went through some of the messages, and we didn't quote really any directly. Can you just explain a little bit more about why we didn't do that and maybe just a bit more on the sensitivity of these messages that you saw when you were scrolling through?
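The username check Joseph describes can be sketched as a small script: sample usernames from the leaked messages, try each one against the app's registration flow, and see what fraction are already taken. Everything here is a stand-in; the registration check is stubbed with a set, where in reality it would hit the app's signup endpoint.

```python
# Sketch of the verification approach described above. The is_taken callable
# stands in for a real "is this username already registered?" check.
import random

def fraction_taken(usernames, is_taken, sample_size=20, seed=0):
    """Sample usernames and report what fraction are already registered."""
    rng = random.Random(seed)  # seeded so the sample is reproducible
    sample = rng.sample(list(usernames), min(sample_size, len(usernames)))
    taken = sum(1 for name in sample if is_taken(name))
    return taken / len(sample)

# Hypothetical data: every sampled leaked username is already registered,
# which is the result that indicates the dump really came from the service.
registered = {"tea_user1", "tea_user2", "tea_user3"}
leaked = ["tea_user1", "tea_user2", "tea_user3"]
print(fraction_taken(leaked, lambda u: u in registered, sample_size=3))  # 1.0
```

A fraction near 1.0 on a random sample is strong evidence of provenance, since unrelated username lists would mostly come back available.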

Emanuel:

So we didn't quote anything because it is possible to word search the data that we got. And I was trying to explain this to my wife because I was talking to her while we were reporting the story, and she was like, who hacked Tea? And that's an understandable question, and I guess the answer is, like, nominally, I don't know, some bad people who have 4chan accounts, but that is not the question or, like, that is not the problem. The story is not that somebody broke into Tea. It's that, I don't know.

Emanuel:

If Tea was a bank, they just, like, left the vault door wide open. And that's the real offense here. And because of that, we don't know who else got their hands on this information, and we don't want to give specifics because we don't want to make that stuff easy to find. And we don't want to have that stuff easy to find because, in my poking around the messages, it took me like two minutes to identify someone, because people are DMing each other. Like, that's the kind of conversation that's in the data.

Emanuel:

And they're being very real, and they're sharing real names and phone numbers and social media handles. So somebody is talking about the person they're dating being someone else's husband and them cheating, and some people were talking about abortions. And it was just incredibly easy to find those people in the real world, given the context of the conversation.

Joseph:

Yeah. And so we do that. I contact Tea for comment about the exposed direct messages on Saturday. I tell them explicitly, hey. This researcher has found this.

Joseph:

Also found, apparently, the ability to send push notifications to more than a million people, which is kinda crazy. We barely even mentioned that, but that's also wild. And I contact Tea, as I said, on Saturday. They don't comment specifically. It's like, we're continuing to investigate and, like, we're not gonna share more information at this time.

Joseph:

We then publish on Monday, and then very soon after, they make a post on Instagram saying, we've just learned actually, the direct messages were exposed. We're turning off DMs now, and then send that statement to CBS News or various other people as well. To be clear, they did know since at least Saturday. And also to clarify, the researcher said their access to that database was cut off sometime late last week. If the access was still live, at least to our understanding, you know, we may not have published at that time.

Joseph:

We don't just, like, find a vulnerability or get told about a vulnerability and then go, okay. Cool. And then just publish an article, because that's gonna potentially actually lead to more data exposure. To the best of our knowledge, it was closed, and I think Tea just turned off DMs as an extra precaution as well.

Joseph:

Very briefly, just last thing, Emanuel: just before we were recording, I think we just published about a class action lawsuit. Do you just wanna mention what that is briefly? I feel like that just happens now. Right? That's just normal.

Emanuel:

Yeah. Unsurprising, I think, but a law firm that specializes in data breaches has reached out and told us that they filed a class action against Tea. And, yeah, that doesn't guarantee that it will go anywhere or that they'll be successful, but I'm not at all surprised that the complaint has been filed. They're expecting other complaints to be filed, and they hope to kind of take the lead on that and have all those other lawsuits join them in the class action.

Joseph:

Yeah. Very, very standard. Alright. We'll leave that there. I'm sure we'll continue to cover Tea even though none of us had heard of this app until last week.

Joseph:

It's a really significant data breach, so we'll definitely be following that. We'll leave that there, and we'll be right back after this to talk about I mean, it's complicated. You're just gonna see. Okay. We'll be right back after this.

Joseph:

Alright. And we're back. Honestly, okay. I'm gonna read out the first headline, then I'm gonna go to Sam and ask her about The UK age verification law. And then I think Emanuel has some sort of weird diagram that he's gonna, like, describe or something, and maybe we'll upload it on the show.

Joseph:

So the headline is: UK users need to post selfie or photo ID to view Reddit's r/IsraelCrimes, r/UkraineWarFootage. This is about The UK's new age verification law and some unintended but maybe foreseen consequences of that. Sam, what is this law that just passed in The UK about age verification?

Sam:

Yeah. So this passed or it went into effect last week, which is why we're talking about it, and everyone's talking about it this week and last week. It's called the Online Safety Act. It's really similar to a lot of the age verification laws that we've talked about on this podcast a thousand times and written about quite a bit in The US. Basically, it's like a protect the children type law.

Sam:

It requires, well, it does a handful of things, including, like, adjusting algorithms so that kids can't see, you know, things organically in their feed that would be harmful or, like, considered harmful, and a bunch of different, like, regulation requirements for platforms. But the big one is that it's requiring platforms, to keep operating in The UK, to implement age verification to check whether users are 18 and over. So far, we've seen that mostly look like selfies or IDs, which is very coincidental

Joseph:

Very timely.

Sam:

I guess, considering what we just talked about, that all these platforms will be holding IDs, or, you know, I think in most cases, third parties will be handling the verification. So on Reddit, it's, like, something called Persona, I think. And there are a bunch of different third party verification platforms that will do this. But, like, at the end of the day, if you're in The UK and not using a VPN, you currently have to show that you're 18 using, like, a valid driver's license or some kind of, like, government issued ID, or, like, biometric data. So it would, like, scan your face and determine whether or not you're 18, or, like, use these things in conjunction with one another.

Sam:

And not complying is, like, an £18,000,000 fine or something. It's huge to not comply with this. It's not a risk that platforms are gonna take. So, already, we're seeing, like, lots of different subreddits, which we'll get into, going behind, like, an age verification wall. Porn sites that want to comply with the law are doing it this way.

Sam:

Just certain Discord communities are requiring age verification. I think Xbox just announced today that they're gonna start doing this too. It's just like, like, old guy. Yeah. The gamers.

Sam:

Yeah. Yeah. Which actually is, I would assume, probably quite a bit of harmful content. Roblox and Minecraft,

Joseph:

not the best places all the time.

Sam:

Yeah. Exactly. So, yeah, that's that's in a nutshell what it is. And, obviously, it has these, like, massive repercussions that I'm sure we'll get into.

Joseph:

Yeah. So it's mostly I mean, in my eyes, it's mostly about porn. It's mostly about online pornography, sites like Pornhub, that sort of thing. But then as Emmanuel's story gets into, it is impacting all these other websites. Emmanuel, how do you wanna do this?

Joseph:

Do you wanna talk about this Reddit one first, then get into the payment processors? Is that what you wanna do?

Emanuel:

Here's what we'll do.

Joseph:

Good. Because I don't

Emanuel:

Yeah. No. I don't either. It's really

Joseph:

It's really complicated.

Emanuel:

I don't usually, like, feel very strongly about, like, what we talk about on the podcast, but I really wanted to talk about this because, I don't know, I wanted everyone to check in, and I wanna see how everyone else is thinking about this. I find this to be one of the most complicated subjects that we cover, and I kind of switch how I feel all the time. And, like, surprise, surprise, censorship and platform governance is a very complicated subject. We all know that. But it is changing and evolving now in a way that we talked about for years, but is now actually happening.

Emanuel:

And it's just a mess. It's a huge mess. So Sam, I don't know if you saw in the podcast room, I just like posted this word cloud of like all the different entities that are involved in this.

Sam:

There's, like, word cloud craziness happening

Emanuel:

right here. So, to back up, just to, like, run through a few things that have happened in the past, like, month or so: We talked about Civitai, this AI model sharing platform that was used to create non consensual pornography. They got pressured by credit card companies to change their policy in a way that completely changed the nature of the platform and remove a bunch of those models that we found were really harmful. Then a few weeks later, Steam, which is like the, you know, default way of buying PC games online, they changed their policy. They said explicitly it was to come in line with what credit card companies want, and removed a bunch of sex games on Steam.

Emanuel:

Steam, in case you didn't know, for years has allowed sexually themed games, and there's been a lot of spam and low quality games flooding the platform every day since they've done that. And they didn't remove all of it, but they removed a bunch of incest related games and very violent, very graphic games. And that happened. And then I think later that week or the week after, itch.io, which is this huge platform mainly for independent game developers and students to share their work because it's easier to upload your games there and you can also be more flexible about how you charge for the game. So you can charge nothing, you can decide what the split is between your game and what itch.io makes as a platform.

Emanuel:

And they just, like, took this really radical action, probably because the credit card companies were threatening to shut them down any minute, and just, like, de-indexed all their not safe for work games, all their sex games, made a few of them unavailable in a way that people found really shocking. Like, if you're in this indie dev game community, a lot of your favorite games, award winning games, fall under those not safe for work categories, and they were, like, disappeared from the platform. And that really rocked people. And now that this law came into effect, Reddit is also forced to use age verification because of this law in The UK. And as you said, Joe, mostly people think of it in terms of pornography and, like, not letting kids access that type of content, and the way they do it is, it's like, because of this law, it's Reddit's responsibility as a platform to verify that every user who accesses that type of content is an adult.

Emanuel:

And in order to do that, they have, like, an age check, where, much like the other story we just talked about, you upload a picture of yourself or a picture of your ID. And this company called Persona verifies that you're of age. But it doesn't only do that for subreddits with sexual content, but, like, anything that is mature, which can be any subreddit that is about the news, but in a very graphic and, like, immediate way. So as you said, Israel Crimes, which is a community that mostly focuses on, you know, war crimes that Israel is committing in Gaza, that has an age check, and that subreddit is filled with, like, very graphic, horrible content and, like, videos and images of real people dying, but it also has, like, normal discussion about the politics of this and why it's wrong and people organizing and just, like, sharing their opinions about why they think this is wrong. And also, like, normal news articles. Right? It's just a community that has this perspective, that is willing to show that type of content, and now, in order to see it in The UK, you have to jump through all these hoops and potentially jeopardize your privacy in order to participate.

Emanuel:

That's kind of like a few things that have happened recently, and as you can see, like, I oscillate. You know, we're in the media; our company is very focused on getting impact. So when we exposed that Civitai is enabling this, like, really bad behavior, and credit card companies respond by saying, hey, you have to change your policies or we're going to not work with you in a way that will completely end your business, I would consider that, like, a positive impact or, like, a good result. But when the same exact mechanism is used to nuke, you know, thousands of games that are people's personal art and expression of who they are and, you know, things that I enjoy that don't intend or, I think you could even reasonably argue, cause harm to anyone, right, those are nuked by the exact same mechanism and sometimes by the same interest groups. I think that is awful. And it's just, like, all these platforms are forced to make all these decisions right now, and I think some of them are probably positive.

Emanuel:

A lot of them are horrible, and it's just like it's it's just a very complicated landscape, I think.

Jason:

I think it's a very complicated landscape, but I think that this, like, legislation in The UK and the age verification laws that we've seen in states in The United States about porn are pretty, like, definitively censorship and are not the type of intervention that you want to see from the government. I think, you know, Sam has written a lot about this. We, you know, we've all touched it in some way. I think Mike Masnick at Techdirt has done, like, really good work on this, but it's like using a hammer to, you know, fix something that you would prefer a finer-tooth comb for, to mix my metaphors there. But it's just, like, a super messy thing, and it fundamentally, like, undermines the idea of having a free and open Internet.

Jason:

And then, as you said, you have, like, payment processor and credit card companies putting pressure and stress on the entire situation. And you also have a lot of these, like, anti porn nonprofit types that are putting pressure on the credit card companies and that are lobbying for a lot of these laws. And especially, like, in The US, some of these states are, like, pretty captured by one political party, and, therefore, it's, like, pretty easy to push through some of these laws. And it's like you have states that are essentially adding a censorship layer to the entire Internet without, like, understanding what they are doing. Or maybe they do understand what they're doing, but they don't care.

Jason:

And then on top of it, it's like you have tons of, like, VPN companies. It's like VPN downloads in The UK are through the roof. So people do find ways around this, but it's very similar to what you'd expect from, like, authoritarian regimes. It is very similar. Like, I went to Indonesia last year, and there were many, many, many websites that I could not access without a VPN. And there was almost, like, no rhyme or reason to which websites were blocked and which were not.

Jason:

And it's like, I believe it was an anti porn law, but Reddit was blocked. 404 Media was blocked. Like, random things were blocked, and it was very hard to tell what was blocked and for what reasons. And then we haven't talked about this yet, but it's like the sites that are complying with this are adding a layer, like an ID verification layer, that, you know, Sam has written about, I've written about. There's, like, a bunch of different companies that are offering these services where you have to upload your ID to access these different websites.

Jason:

And it's like there's a variety of different ways that this is being implemented. And so, like, many of the services that are offering age verification say that they are deleting your IDs after, like, a certain amount of time, or they say that they're encrypted, or they say that they maybe delete them immediately after verifying who you are and your age and that sort of thing. But it's like, we don't know. Like, there's so many different services that do this, and different websites are picking which services they're going to use. And it's like we just spent a half hour talking about Tea and people's IDs being leaked on the Internet, and it's very easy to imagine a future where one of these ID verification companies gets hacked or where their security isn't perfect and, you know, pretty sensitive data, like, ends up on the Internet.

Jason:

So I think that the problem that they're trying to solve is a very difficult one, and one of the reasons that it's gone unsolved for so long is because the, like, quote, unquote, solutions to it are often worse than the problem itself, or, like, create more complicated situations that undermine the idea of, like, having a free Internet.

Emanuel:

I think, first of all, Persona, which does this for Reddit, says they keep your images for seven days, which I guess is better than keeping them forever, but it's not as if nothing can happen to that data in seven days, and it would presumably be millions of images. So another way in which, like, I think this is very complicated is I wrote this story about Reddit, and I was like, hey, I don't even vouch for these subreddits. I don't think that they are necessarily, like, the most productive places in the world or anything, or at least I can't say that they are. It just did not seem positive or good to me that now, in order to see that stuff, whether you're a minor or an adult, you have to show your face to Reddit or show your ID to Reddit. That seems like a hurdle that overall has the effect of making the news more cleaned up than it is in reality.

Emanuel:

And people were responding to me on Bluesky being like, what, do you think kids have to watch footage of dead kids? And I was like, obviously, no. Like, obviously, I don't think you have to force kids to watch, like, frontline footage from the war in Ukraine. That's crazy, and I'm not saying that in the article.

Emanuel:

But it's like, that stuff is gonna be harder for anyone to see. And sometimes that is the stuff that, like, radicalizes people or makes them change their position. Like, historically, you know, how Americans felt about Vietnam, how Americans felt about the Iraq war, how Americans felt about the Holocaust, a lot of that had to do with what kind of images were in their heads. And, like, policymakers in The UK just decided for their citizens that, like, that stuff is gonna be harder for them to access. I don't know how you police that.

Emanuel:

I don't know how Reddit should manage that, but it seems clear to me that while the problem is real, this kind of policy is not the solution. As Jason said, we're trying to solve a problem with the wrong tools. Also, the whole thing reminds me of our journey with Traffickinghub and Exodus Cry. Sam, I don't know if you want to talk about that, where it's like we were reporting about Pornhub for so long that I think they and organizations like NCOSE thought that we were, like, allies, or that we had the same interests, but then our reporting shifted from reporting on Pornhub to reporting on those organizations. Did you wanna talk about, like

Joseph:

Well, maybe explain who those organizations are, because I don't think everybody's

Emanuel:

Which are involved in the Steam thing, by the way. It's like Exodus Cry is, like, behind some of the activism that led Steam and itch.io to change their policies.

Sam:

Yeah. Yeah. I mean, these are religiously affiliated, current or past, conservative, I would say, groups that hold up anti trafficking as their, like, mission, and that's how they get nonprofit status: they want to, you know, save trafficking victims. But the problem is they define trafficking as porn, all porn, all sex work. Anyone in the adult industry is, like, a victim of self trafficking, and they're the ones behind a ton of this pressure that gets put on payment processors.

Sam:

They're the ones behind a lot of the pressure that gets put on politicians, and the pressure needed there is, like, a pinky finger push. All you have to do is slide a bill in front of a Republican politician that says save the children, and they'll sign it immediately and not read it. That's something that we know for a fact. A lot of these bills don't get read before they're voted on. So, yeah, they're the ones that kind of are a big force for a lot of the changes that you see.

Sam:

And now at this point, it's like the administration is very actively welcoming this type of rhetoric in The US. I guess in The UK too. I don't know a ton about UK politics, but it's very much a one to one comparison there. What happens there and what happens here are kind of in tandem. So consider all that: these are, like, you know, extremely well funded lobbying groups whose full time job is to moralize what we do on the Internet. I don't particularly know if, like, the CEO of Mastercard, like, really gives that much of a fuck, but, like, the lobbying groups do, and they have a lot of money and a lot of pressure and people behind these campaigns.

Sam:

I guess it's funny that everybody's talking about this right now, because it's something that sex workers have been talking about for seven years, or thirty if you take the long range. I think the way you know that this isn't necessarily like, we talk about unintended consequences, but whether they intended these consequences or not is beside the point. I think we know that the actual, like, meaning and, like, purpose of a lot of these bills is not necessarily to protect kids, because we know for a fact it doesn't work. Like, there are studies on this.

Sam:

There's research behind this. It doesn't work. We see it happen every time. People just use VPNs. People go to worse and worse, more dangerous sites.

Sam:

What we know works, governments refuse to actually put any power behind. We know that device based parental controls and verification work. We know that kids probably shouldn't be handed an iPhone completely without restrictions, you know, at age three or whatever. That's probably a recipe for disaster, and yet we do it every day. We know that, like, sex education and, like, age appropriate discussions about consent work, as far as media literacy and also just, like, understanding what you're consuming on the Internet, what you're seeing.

Sam:

But there's no push for any of that at any level of government. It's just these conversations about platform regulation, which I've always thought is the wrong way to go about it. Platforms are motivated by profit and engagement. It's not their duty to parent your kids. But now the government's involved, and now they're gonna crack down on content that, like, legal adults should be allowed to access without a problem.

Sam:

But because there are a lot of kids in these countries without any supervision online, this is what we have to deal with, which I think sucks. I think the stuff going on with Steam and itch, and that you have to show your ID to be on Bluesky in The UK, is probably not good.

Jason:

Yeah. I think to expand on that a little bit, how these things play out in practice is, as Sam said, if you are in a state that has an age verification law right now, Pornhub is blocked, because Pornhub has decided to block itself rather than comply with these laws. And so you can't access Pornhub in a lot of states in The US. And so people in these states either use a VPN or they use other websites that simply don't comply with the age verification law. And, like, a lot of those sites are based in places that have, like, very poor laws around things like copyright.

Jason:

Like, a lot of it ends up being, like, pirated content, like, quasi legal content. Like, who knows what's going on? And we know that this is happening because, like, I've seen conversations on Reddit and in other places like, hey, can you share this without a Pornhub link for those of us in Texas? Like, can you give us a link to XVideos or, like, a different website that, you know, is not complying with the law?

Jason:

And so that's gonna happen in The UK if it hasn't happened already. And then even in places that have, like, really authoritarian governments like China. China has a lot, or had a lot. I don't know the current state of it, and, like, the specifics here might be slightly wrong. But, basically, they were trying to limit how much children were gaming. And so I think they had, like, a one hour a week gaming limit for kids.

Jason:

And what ended up happening was kids were taking their grandparents' IDs, and they were just using them to log in to, you know, the game server or whatever. And so you had these, like, 80, 90 year old people playing, like, dozens of hours a week of different video games. It's like, that is probably gonna happen. It kind of rocks. It's like people are gonna find ways around it, first of all.

Jason:

Second of all, you know, I just wanna stress again: it's not just blocking, like, distasteful, violent, whatever, like, news, because the world is a bad place and there's bad news, but it's also blocking, like, consensual porn that adults want to access. And, like, there are reasons why an adult who wants to look at porn, like, might not want to upload their ID to a random company so that random company can tell Reddit or Pornhub or whatever that this person is an adult. Like, there are many, many, many people who just, like, don't want to do that, and it's a similar problem to the ones with, like, Facebook's real name policy and things like this, where it's just like, anonymity is important on the Internet. It's been important on the Internet since the beginning of the Internet, and we are, like, kind of just throwing that away because people have gotten better at lobbying, essentially.

Emanuel:

I think to crystallize what makes me feel, like, yucky about the whole situation is, like, our journey

Jason:

It's just as a dad, comma, as a father.

Emanuel:

No. Not even close. Okay. Okay. We spent so long reporting on how bad Pornhub is, and at some point around 2020, I guess, a bunch of politicians and interest groups were like, you're right, Pornhub is bad, let's ban porn, right?

Emanuel:

And we're like, no. Like, that wasn't the point at all. And now that we spend a bunch of time talking about, like, non consensual AI content on the Internet, you know, The UK is like, so let's age verify the whole Internet. Or we spend a bunch of time talking about people live streaming mass shootings on Twitch, and it's like, oh, age verify that, or don't allow violent content on Facebook. It's like, no.

Emanuel:

That's not what we're saying. It's the way that our reporting is being leveraged to justify these, like, puritanical, censorious politicians and interest groups that really, like, rubs me the wrong way. And I guess all we can do is just, like, keep reporting what is actually happening. I don't know. And, like, we never have, but, like, never advocate for these types of solutions, like, these overbearing, terrible solutions.

Emanuel:

I think a lot of the time people assume you know, we hear this a lot about, like, our AI reporting. People assume that that is what you want. You know? It's like, oh, you wanna, like, censor Twitter or you wanna censor, like, social media. It's like, no.

Emanuel:

Not at all.

Joseph:

Yeah. I think that's a really, really good clarification. You're right. Okay. We will leave that there.

Joseph:

If you're listening to the free version of the podcast, I'll now play us out. But if you are a paying four zero four media subscriber, we are going to talk about LeBron James and how he is not pregnant, as far as I know. You can subscribe and gain access to that content at 404media.co. We'll be right back after this. Alright. And we are back in the subscribers only section. I actually wrote the wrong story in the transition script, Jason.

Joseph:

I wrote, we're gonna talk about Meta allowing some job candidates to use AI in their interview. And I initially pitched that to talk about on the podcast. You're like, no, I wanna talk about LeBron James.

Jason:

Yeah. Yeah. You're gonna have to find that content elsewhere. I went on the radio. You can listen to me talk about the Meta interview thing there. So a few months ago, I think maybe back in February or March, I wrote an article about how Brain Rot AI was taking over Instagram, and this was sort of immediately after the launch of some Chinese AI video generators.

Jason:

And it was pre Veo, which is, you know, Google's AI video generator that is, like, pretty good. I found some Instagram accounts that were posting, like, really fucked up AI, as we've discussed many times, but it was, like, AI of real people doing, like, really bizarre things. And so one of the videos was of LeBron James, very clearly LeBron James, and Steph Curry and Diddy in prison. And Diddy, like, I mean, he sexually assaults Steph Curry. There's, like, no way to put it any other way.

Jason:

And, like, LeBron in this video, like, starts laughing about it. And this video had, like, 10,000,000 views, and the rapper 50 Cent liked it, for example. Like, it had hundreds of thousands of likes, maybe millions of likes, and it was just mega, mega viral. And the account that posted it was posting, like, dozens and dozens and dozens of these videos, and then they were also linking to a Discord where you could, like, learn how to make this type of AI video. And this Discord, which is called Interlink AI, had realized that AI generated content of basketball players was going viral on Instagram.

Jason:

And so they created a specific tool for people to make AI videos of LeBron James, of Steph Curry, of Michael Jordan, of Diddy, of, you know, any number of other basketball stars, Andrew Tate, like, just different people. And people were doing really messed up things with this. They were teaching each other how to make videos of LeBron pregnant. I saw many, many, many, many videos of Steph Curry getting his head bitten off by polar bears. They were like super interested in like feeding Steph Curry to different wild animals.

Jason:

Yeah. You know, things like normal stuff. Normal stuff like this. And it was going really, really viral. And then I checked back in on this Discord a few weeks later, and they were like, we are not gonna allow you to make videos of any real people anymore because a really popular basketball player got mad at us.

Joseph:

And, I mean, I'll just read out the headline, because I guess that leads to it, and that is: LeBron James's lawyers send cease and desist to AI company making pregnant videos of him. What was that like? So you're in the Discord, you check in on it. Have they, like, posted a photo of this legal demand, or, like, what information do you have at that moment?

Jason:

At that moment, I only had this, like, post that the moderators made where they said, quote, this change comes after we ran into legal issues involving a highly valued basketball player, and to avoid any further complications, we've chosen to take a proactive approach and fully remove all real life likenesses from the site. And so, basically, all I had at that moment was the fact that they said that they got some sort of complaint and the fact that they, like, took down this tool, which they built so that it could do this. The tool is now essentially useless, and the Discord existed primarily to do this as well. And the Discord is basically inactive at this point.

Jason:

Like, it pretty much, like, killed what they were doing. But, anyways, I messaged the mods, and I was like, can you tell me more about what's going on here? And they told me that it was LeBron James, and they told me the specific law firm that sent the cease and desist. And then I found his Instagram, and on his Instagram, he was, like, waving around a paper that I took freeze frames of, and I saw the law firm's name on it. I saw that it was, like, a cease and desist.

Jason:

He wouldn't send me the whole thing, which is super annoying. I really wanted to see the whole thing. But then I did some more reporting, and I learned that all of the earlier videos that were really viral on Instagram had not only been deleted, but that their creators had been banned, which is pretty intense. Like, I have not seen Meta do a lot of banning of AI generated content, even when it's pretty bad.

Joseph:

The Instagram accounts were banned.

Jason:

The Instagram accounts were banned by Meta. And so, I mean, my thinking is that these same lawyers likely approached Meta as well, and I asked Meta that directly, and they refused to comment. It wasn't like they were like, hey, we have no comment on this. They were like, we refuse to comment. And I was like, okay.

Jason:

It was kind of

Joseph:

That was a weird way of putting it.

Jason:

Kind of a weird way. Yeah. And so it was a weird way to sort of frame that.

Jason:

But then, you know, I called the law firm, and it's this law firm I had never heard of, but that does represent sports stars and Hollywood people. And they have a website, but there's nothing on the website other than a phone number. There's, like, almost no information about it. And I called them up, and I had a very odd conversation with the woman who answered reception, where I was like, hello, I'm a journalist, I'm looking to reach, you know, and I named some of the people that I knew had sent the cease and desist, and she was like, I cannot connect you with them. You cannot talk to them.

Jason:

I was like, can I have an email address for them? And she was like, no. You cannot have an email address for them. And I was like, can I leave a message? And she was like, no.

Jason:

You cannot leave a message. And I was like, I'm going to tell you what I'm doing as a journalist, and I'm gonna leave my information, and, like, call me. Call me. So I told them what I was working on, and I was like, I wanna know more about what happened here. And she was like, okay.

Jason:

And I never heard back from them, and I called them one other time, and it was, like, a similar situation. So then I ended up finding, like, the email addresses for several people at this law firm. I emailed them. I never heard back. But, you know, I felt confident enough to run this story because I was able to, like, freeze frame different parts of the video and see it.

Jason:

And it would be a pretty wild thing to invent: to create a document that looks like a legal document from this real law firm that we know represents sports stars, and then have Meta delete everything and the tool get shut down. It's just like, all the circumstantial evidence suggests that, like, LeBron James' lawyers sent this letter, and for good reason. Like, people were doing messed up things to him.

Joseph:

Is this, like, the first time that we've seen a legal demand like this from a celebrity? Like, I suppose the Scarlett Johansson one is similar, but that's also different in some ways. Like, is it rare, or is it the first time? Like, what is it?

Jason:

Yeah. I mean, that's part of the reason that I really wanted to do this story, is that we have time and time again seen celebrities' likenesses be stolen and used for nonconsensual AI imagery. You know, most often this is happening to women celebrities, and it's being turned into porn. You know, in this case, it was, like, pretty messed up content. Like, definitely, sometimes it was sexual content of LeBron James.

Jason:

And I've been wondering, are any of the celebrities or any of these lawyers, like, doing anything about this? And I suspect that this world is quite secretive, and I suspect that this is not the first time that it has happened, but I'm not aware of any other reported instances of it. Scarlett Johansson has spoken out about it. There was discussion about, sort of, like, Taylor Swift possibly doing something after, you know, sexual imagery of her went viral on X during the Super Bowl, which Sam and Emanuel wrote about. But, you know, it shouldn't be on these celebrities to have to do this, especially because they're victims of this.

Jason:

And yet, I feel like if they do stuff like this a little bit more often, it would go a long way, and it would go a long way not just for the celebrities, but for the, like, normal people who are not celebrities who are regularly turned into nonconsensual pornography using AI. And so I thought it was, like, pretty interesting that LeBron James did this. And I've also wondered a lot about whether, like, a company like Disney would do something like this, because a lot of the shitposting AI is, like, Mickey Mouse. You know, I've seen Goofy, Donald Duck, I've seen McDonald's, I've seen, like, these massive, massive companies have their trademarks, have their likenesses, their intellectual property utilized by AI to do, like, really messed up things. And, you know, it's like, fan art is allowed, but it remains unclear to me whether, like, if you turn Mickey Mouse into, like, I don't know, really fucked up AI, that is, like, protected by trademark law.

Jason:

And

Sam:

Disney's suing Midjourney, so I guess we'll find out Yeah. Soon enough.

Jason:

Yeah. Yeah. I need to learn more about that lawsuit, because I wonder if it's, like, based on the training or the output or both. Probably both. But, yeah, that's pretty much the story.

Jason:

It's like, LeBron got mad that people were making him pregnant and did something about it, and now you can't use these tools.

Emanuel:

I think it was interesting because when you did the Brain Rot story, at some point, we were slacking about, like, what's the limit? You know what I mean? Like, when is LeBron going to get so upset that they actually do something? And apparently, like, pregnant LeBron is the limit. But I think it is important, because, you know, it is possible to create an environment where it's difficult for people to do this. Right?

Emanuel:

Like, on YouTube, you can't use, I don't know, like, somebody's copyrighted music without Content ID tagging you and removing or demonetizing your video. And I think, yeah, it's like, if Disney got mad at Instagram a few times about AI Mickey Mouse and various Marvel characters, and, like, more of these cease and desists were floating around, then you would just, like, see less of this trash. It's the same thing with Nintendo. Like, Nintendo has been out there enough times going after fan games just by sending cease and desists and being like, hey, knock it off.

Emanuel:

Like, you can't use Mario. Don't use Mario. It's not like they sue everyone or they're filing a million lawsuits a day. They just made it clear that if you fuck with their IP, they're going to come after you. And Disney has done this many times to various forms of media.

Emanuel:

It's just that so far, no celebrity or big media company has really gone after, like, the AI slop really hard. So I think that's why it's notable. Like, if LeBron does this, if a few more people do this, it could really change the kind of stuff you see on Instagram.

Jason:

Yeah. And the person who made this tool is a 20 year old, like, startup hustle bro. You know, he was like, here's how I make a lot of money on Instagram doing this. And, you know, I don't have a lot of, like, sympathy for him, because he's, like, flooding the Internet with shit and teaching other people how to do it. But at the same time, it's like, one cease and desist letter, like, destroyed this community that was getting, like, millions and millions and millions of views on this, like, very, like, messed up stuff.

Jason:

And it's not like it went through an entire legal process where, like, this guy was fighting a lawsuit and blah blah blah. It's like, LeBron's expensive lawyer sent one letter, and it's over. And you think, presumably, a few more of those happen, and these companies start being like, yeah, you can make AI using our tools, but you can't, like, make it of your neighbor. You can't make it of popular celebrities.

Jason:

You can't make it of Mickey Mouse. And maybe people will, like, find ways around that, but it does seem like that is a way of, sort of, like, making this a lot harder to do. And I wonder, if that same thing happened to, like, nudify apps, for example, if we'd see something similar. I mean, a lot of those are based in, like, really random countries, but they are being, like, platformed on, you know, Instagram and TikTok. Like, we've reported a lot on it.

Jason:

It's just, like, I don't know. All of these accounts, I emailed Facebook about them. I was like, are these accounts okay? Like, are you cool with this? And they're like, yeah.

Jason:

Like, no problem. Seemingly, they, like, perhaps heard from LeBron's lawyer and, like, they nuked everything.

Emanuel:

They're okay with it when you ask, not when LeBron's lawyer asks.

Jason:

I, like, went and looked, and I can't find more, and that doesn't mean it doesn't exist, but, like, I've gotten pretty good at finding this, like, really weird slop on Instagram. And I'm like, well, they pretty much, like, destroyed the LeBron in prison pregnant genre of AI generated content. Yeah. Alright. Joseph is like, oh my god.

Jason:

Please end this podcast.

Joseph:

No. No. No. There's some stuff going on outside my window, which I'm not gonna repeat on the podcast. No.

Joseph:

I think that's really good. Okay. I will play us out. As a reminder, four zero four media is journalist founded and supported by subscribers. If you do wish to subscribe to four zero four media and directly support our work, please go to 404media.co.

Joseph:

You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week. This podcast is made in partnership with Kaleidoscope. Another way to support us is by leaving a five star rating and review for the podcast. That stuff does really genuinely help us out if you could do that.

Joseph:

This has been four zero four Media. We'll see you again next week.