The 404 Media Podcast (Premium Feed)

from 404 Media

Pokémon Go to The Military Industrial Complex

Episode Notes

This week we start with Emanuel's couple of stories about Niantic, the company that makes Pokémon Go, and its plan to build an AI model based on data collected by its users. After the break, Jason and Emanuel talk about their big investigation into the rise of "AI pimping." In the subscribers-only section, Joseph explains why he doesn't use a mobile phone and how he uses an iPad Mini instead.

YouTube version: https://youtu.be/fZ1Eu1DAK3c

Transcript
Joseph:

Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds, both online and IRL. 404 Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co, where you'll get access to our articles as well as bonus content every single week. Subscribers also get access to additional episodes where we respond to their best comments. Gain access to that content at 404media.co.

Joseph:

I'm your host, Joseph. And with me are two of the 404 Media cofounders. The first being Emanuel Maiberg.

Emanuel:

Hey.

Joseph:

And Jason Koebler.

Jason:

Hello. Hello.

Joseph:

Yes. Sam is not here this week. I presume she'll be here next week. But we have, basically, the Emanuel show this week, or at least in the free section. I mean, Jason worked heavily on the second story as well.

Joseph:

But for the first section, Emanuel, you've published a couple of stories about Pokemon Go and data collection and AI. The first one is "Pokemon Go players have unwittingly trained AI to navigate the world." So the company behind Pokemon Go is Niantic. Is that correct? So what did Niantic announce exactly?

Joseph:

And then we'll get into more of the specifics.

Emanuel:

Niantic announced something called a large geospatial model. This is a term that they coined. They explain in their blog post announcing this that that name is in direct reference to a large language model, which is the type of AI model that we've talked about endlessly here over the past, more than a year now. And what is the reference they're making there? A large language model.

Emanuel:

How does that work? You scrape tons and tons of text from the Internet, usually without permission. And based on that data, you train an AI to kind of statistically assume what is the most likely next word in a sentence, and with enough data, that sounds like coherent English that can answer you on a whole range of subjects. A large geospatial model is basically trying to do the same thing for the physical world. Joe, sorry to put you on the spot, but there's maybe, like, a short paragraph in the story that you can read where, like, they give an analogy about a church.

Joseph:

Yeah. I've just found it. And I agree. I think this would be useful because I'm almost having trouble visualizing even what that would actually do. But I'll read out the section here.

Joseph:

And it says, imagine yourself standing behind a church. Let us assume the closest local model has seen only the front entrance of that church, and thus, it will not be able to tell you where you are. The model has never seen the back of that building. But on a global scale, we have a lot of churches, thousands of them, all captured by their respective local models at other places worldwide. No church is the same, but many share common characteristics.

Joseph:

An LGM is a way to access that distributed knowledge. So, basically, it's like predicting what an area is gonna be. How would you characterize it?

Emanuel:

Yeah. Exactly that. Like, there is enough data about physical spaces in the model that the model would be able to predict how to navigate the space. Right? It's like the thing that it made me think about is, like, when you're at a restaurant that you've never been to before and you need to go to the bathroom.

Emanuel:

Right? It's like, you don't know where the bathroom is, but you've been to enough restaurants to know that the bathroom is, like, not in the middle of the restaurant. It's somewhere off in the corner, at the edge of the restaurant. Right? And it's like you're kind of training an AI model to do the same thing.

Emanuel:

So it hasn't seen this specific church, but it has seen enough churches to know that this is where the entrance would be or something like this.

Joseph:

It's funny, the restaurant example, because I always ask the staff. Like, I always ask, but

Emanuel:

you know where it's not going to be. You know what I mean?
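(A rough way to picture the "statistically assume the most likely next word" idea Emanuel describes above is the toy sketch below. It is purely illustrative and has nothing to do with Niantic's or anyone's actual models; the LGM analogy is just that you swap "words seen so far" for "parts of a place already scanned" and predict the most likely unseen part.)

```python
from collections import Counter, defaultdict

# Toy illustration only: a bigram "language model" that predicts the most
# likely next word from word-pair counts, the statistical idea described above.
corpus = "the church has a front door the church has a back door the park has a fountain".split()

next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequently observed word following `word`."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(predict_next("church"))  # -> "has", learned purely from co-occurrence counts
# A large geospatial model applies the same statistical idea to places instead of
# words: having seen thousands of churches, predict what the unseen back of this
# one probably looks like.
```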

Joseph:

Yes. That's fair. That's fair. So for those who have lived under a rock for, what, 5, 6 something years, let's back up a little bit. And can you just describe what Pokemon Go is, specifically?

Emanuel:

It's funny because while researching these stories, I learned something that I thought was apocryphal, but is apparently true, which is, on an April Fools' Day, Google put out this video where they kind of announced a fake integration of Google Street View and Google Maps with Pokemon. And it was just a joke about, like, oh, you'll be able to see Pokemon in Google Maps. And it was such a viral video that the Pokemon Company and Nintendo got in touch with Google and were like, hey. Can we make something like this? And they did.

Emanuel:

And that game is Pokemon GO. It's a mixed reality or augmented reality game that really blew up in 2016 where you kinda go around the real world with your phone and, like, real locations in the real world have Pokemon or gyms there, that you can interact with and have kind of, like, a real world Pokemon game where you're capturing them, doing battle with other Pokemons, and stuff like this. That's all based on, like, Google Maps data.

Joseph:

Yeah. I mean, it was obviously a phenomenon. I imagine all of us played it. I didn't play it all that much, just a little bit. But I have a very vivid memory of it: it was the first weekend it came out.

Joseph:

And it was a Friday night in London, and everybody in the street was playing it. And, you know, like, lads who were out at the pub were screaming in the road because they were trying to capture Pokemon. It was insane.

Jason:

I, like, played it in Central Park in New York, and just being there, actually, I don't think I was playing it, but I could tell the groups of people that were playing it. Like, you could tell by their behavior that they were just, like, groups of people walking into the woods and stuff like that. I feel like there were also some stories soon after it launched about, like, children wandering off into the woods and things like that. Like, that was a bit of a panic for a while.

Joseph:

Walking into the road and stuff. I seem to remember that. Yeah. And, like, putting themselves in danger or something. So you're playing this game, and the way you do that is you point your phone's camera somewhere in the real world.

Joseph:

Like, maybe there's a Pikachu or whatever outside a landmark, and it's on the pavement, and you have to film it and point your phone at it like that. I mean, what data is being collected there? And I appreciate we probably didn't really know this at the time we were playing Pokemon Go. It's sort of we kind of learned this a little bit later through reporting and through this announcement as well. But what is happening there?

Joseph:

Is the phone collecting information about surroundings or something?

Emanuel:

So Niantic collects a ton of data from Pokemon Go, from Ingress, from a bunch of other games it has launched since the success of Pokemon Go. And that data is used and collected in a bunch of ways that I don't wanna get into here, because Niantic did talk to me, and they were very careful about saying what kind of data goes into this LGM product that they're working on. And this data comes from when one of those games asks you to take a picture of or scan a real-world location. So, for example, they recently introduced this feature in Pokemon GO called Pokemon Playgrounds, and that is where you can go to a physical location and pin your Pokemon to that location. You're like, on this bench, in this park, I'm putting my Pikachu.

Emanuel:

And then somebody else can come there, and that location is saved in the game so they can take out their phone and see it. And, like, that data feeds into this product. They also had some other features that asked you to, like, scan real locations and monuments and stuff like this. It is unclear to me whether that stuff feeds into the LGM or not. It's a little vague, but it's that kind of stuff that is informing the LGM, like, pictures of real places and things in the world that are also attached to location data, so Niantic knows where that thing is.

Jason:

Yeah. And so I edited this piece, and I was, like, looking into this. And it's not super clear which specific things, you know, Niantic is using for this specific product. But I will say that over the years, they have... I guess, I just wanna highlight that they, like, incentivize players to do this. It's like you get little rewards if you do it.

Jason:

Right.

Joseph:

It's like gamifying that type of collection.

Jason:

Yeah. And, I mean, okay. This is not a real example, but in my mind, I'm just like, please go scan the nearest, like, nuclear silo. We put a Bulbasaur there so that we can, like, get images of this. And to be clear, that's not a real example, but, like, that's kind of the vibe where it's like, please go to your nearest monument or church and, like, hold your phone up to it and walk around it, get it from all angles, and, like, we will give you some Pokeballs for doing so.

Joseph:

Yeah. It reminds me of another story from a couple years ago in The Wall Street Journal. Byron Tau wrote it. And the headline was "Gig App Gathering Data for US Military, Others Prompts Safety Concerns." And it was this app called Premise, where users were told, hey, go here and take some photographs, and we will give you money or whatever.

Joseph:

Obviously, oversimplifying the piece. But basically, a gig platform for going and doing OSINT for people. I'm not saying it's one and the same here. They're building an AI model here. But it is still the outsourcing of gathering data about certain landmarks.

Joseph:

I don't know. I find that pretty interesting.

Jason:

Was this in the aftermath of Russia's invasion of Ukraine? Because I remember there were a couple stories immediately after Russia invaded Ukraine where there was, like, a panic over gig workers scanning specific things and then concern that it was being used for targeting in some way. And it was, like, in the immediate days after the invasion, and I don't remember. Like, there was definitely some overpanic there. Like, I don't know how it actually shook out, but I do remember that there were definitely concerns that it was, like, not Uber drivers, but, you know, like, gig workers, as you say, like, taking pictures of, plotting, like, different things that people were worried were being used for targeting.

Joseph:

Yeah. Definitely around that time and definitely those concerns as well. Byron found that, as well as that, this was being used by the US military, which is almost even crazier to me in a way. I was gonna ask what Niantic says it's gonna do with the data, but I feel like we touched on that. And actually, that'll be better for the second story.

Joseph:

So before we get to that, in the headline, we put that Pokemon Go players have unwittingly trained AI. We got some pushback against that. I think Jason or Emanuel elaborated in Behind the Blog, you know, sort of our behind-the-scenes paying subscribers article that we publish every Friday. But just briefly, I mean, a very simple question. Do we think that the English lads who were playing Pokemon Go on Friday night when it launched and getting all rowdy, do we think they were aware they were contributing to, you know, a mass data collection model being used to power, you know, the generation of a new piece of AI technology?

Emanuel:

Can I give my version? And then, Jason, you can give the spicier version. So I think there are two types of people who played the game in regards to this question. And one category of people, they think it's just a game. They had no idea at all that any of this is happening, that they're collecting data of any kind.

Emanuel:

These are children that are playing this game. It's like it's really ridiculous to assume that they would know that this is, that they're generating data for this company to use. And then there's this other crowd that, maybe, are people who read our website and are tech savvy and are concerned about data collection and privacy and security and all this stuff. And they rightfully assumed that the players are generating very valuable data and that Niantic will leverage it in some way. And I think that's a fair assumption, obviously.

Emanuel:

But even in that case, no one could have predicted, I think, that they were doing this to create something called an LGM, a concept that didn't exist back when the game first blew up. So some people may have had suspicions about Niantic using the data, but I don't think we could have assumed that this is what it would be used for. And surely they will continue to leverage the data in new ways as time goes on.

Jason:

Yeah. I mean, this was my Behind the Blog, and it was probably one of the spicier ones that I've done. My post about it, like, went relatively viral, so a lot of people complaining about that unwitting part were, like, in my mentions. But Emanuel pretty much nailed it. I think all I would add is that there have been studies that show that no one reads the terms of service for things.

Jason:

Like, we know this, and it's not like you can negotiate terms of service. But studies have shown that, like, 99.8% of people do not read terms of service. And then of the people that read terms of service, more than 99% of people do not understand what they mean. So there's that. There's the fact that, like, Niantic was part of Google, and then I believe it was sort of spun off.

Jason:

I don't know if there's, like, any relation anymore, but it has always been something of a mapping company. And so there are people who said, like, oh, well, that's, like, what Niantic does. But Ingress, their first game, was released in, like, 2014, I believe, and then Pokemon Go came out in, like, 2016. And to Emanuel's point, like, a lot of the people playing this are children. It's like these kids and their parents don't know the history of Niantic as a company or even what Niantic is.

Jason:

It's like, I don't think that a lot of people are researching sort of, like, the background and funding of the mobile game that they play. Like, it's just I play video games all the time, and I couldn't tell you who developed half the games that I've played for, like, hundreds of hours, even if their name flashes every time I start it up, just because it's, like, not something that I pay attention to. And then the other thing is that you could assume... well, so I just went through the sign-up process recently on Pokemon Go, before we published this, and it asks for all these permissions. It's not like it's saying, hey. We're building an AI model based on this stuff.

Jason:

It's like, hey. Scan this thing where we put a Bulbasaur, and we'll give you some Pokeballs. And if you don't wanna do that, you don't have to do it. But it incentivizes people to do this, and anyone who's actually playing the game and wants to be successful at it probably is doing these quests to some degree, unless you're highly informed.

Emanuel:

I think even if you're highly informed, like, what is happening with AI now is just new and, I think, unexpected. So, like, when you use Facebook, when you use YouTube, you grant the company all these rights, and we all kind of know that because we know how these companies work. But I don't think that people who post their drone videography to YouTube could have assumed at any point that YouTube or some other company would then, like, process all those videos to create an AI video generator that would eventually put them out of a job, because video is much cheaper to generate this way than to pay them to make it. It's the way that the data is being leveraged that is totally new and unexpected, I think.

Jason:

Right. And the very last point I was gonna make is people said, oh, like, what did you think their business model was? And it's like, well, if you're even thinking about that, first of all, it's a free-to-play game, but it has in-app purchases, and they make billions of dollars from those in-app purchases. Like, that's a pretty common one. You also might assume, because, as Emanuel said, it's like you sort of know that you're giving them information, and you might think like, hey.

Jason:

There's a business model that I'm familiar with, and one of those business models is targeted advertising based on your location. And Pokemon Go has that also. And so, you know, that's not something that I like. Joseph, I know that's not something that you like either, but that is, like, a type of thing that an informed person might think. Okay.

Jason:

They're using my location to target ads at me. And you can make that calculation, like, I'm okay with that or I'm not okay with that. But then what they've done is recontextualize that same information that they're using to deliver targeted advertising to you to then build, like, this big mapping platform that can be used for all sorts of other things, which probably leads us into the second story.

Joseph:

Yeah. And I would just add that even if you did assume it and, you know, you thought that, oh, my data is gonna be used in some sort of way. Okay. But now they admit it. It's news.

Joseph:

They are coming out and saying they're building a model. So I don't know. That's new information so people can make informed decisions. Right? But you're right.

Joseph:

It does go into the next story, which is Pokemon Go data, quote, adding amplitude to war is obviously an issue, end quote, Niantic exec says. I kinda butchered that headline. But, basically, there was a conference run by Bellingcat, the OSINT organization. The person on stage was Brian McClendon, Niantic's senior vice president of engineering. And as you alluded to, Jason, formerly the co-creator of Google Earth, Street View, and Google Maps.

Joseph:

Emanuel, what did Brian say on stage that made you think, I should pull this out and, you know, do a second article on these comments?

Emanuel:

So I should say that the reason I blogged this in the first place was that Garbage Day had a short item about Niantic's blog post announcing this, and it was referencing a tweet by an OSINT researcher who was saying, isn't it funny that all our Pokemon Go gaming is gonna fuel killer robots? And that's a very provocative idea, but I didn't say anything like that in the first story because there wasn't any information indicating that it would be used that way. And I asked Niantic, and they didn't address the question.

Joseph:

Well, what did you ask them?

Emanuel:

Like, I asked them, will you have any restrictions on who you sell this data to? Specifically, will you sell it to militaries and governments? Will you have any limitations about it being used in some kind of, like, lethal force? And they just did not address the question either way, and I didn't think it was, like, fair to include, because that would be pure speculation. All the original blog post said is that they wanted to use the LGM as a critical component for AR, which is augmented reality, and robotics, content creation, and autonomous systems.

Emanuel:

And that could be, like, anything. Right? Jason has covered these food delivery robots in LA. So you can imagine it informing something like that. It would be very useful in that scenario.

Emanuel:

But then I found out about this Bellingcat conference where he was speaking, and it's a fascinating talk, which I recommend everyone watch. And Bellingcat asked the obvious question, to which he responded that, basically, if a military or government uses it in any way that a regular consumer would use it, that's okay. But if they use it in a way that adds amplitude to war, that's definitely an issue. Him saying that's definitely an issue is not a no, by the way. And I asked Niantic as well, does that mean you won't do it?

Emanuel:

And they did not say that. They said, this is very early on. We just announced this. This is months or years away from being deployed, and we're exploring all options, which is definitively not a

Joseph:

no. It's a it's a very long no.

Emanuel:

Yeah. It's a very long no.

Joseph:

Or or it's not. But yeah.

Jason:

It's a very long, like, probably.

Joseph:

Right. Right. And I mean, those comments are already incredibly interesting, and sort of that dancing around the issue, and him saying that, you know, adding sort of force to a war could be an issue for them. We can't really go beyond that at the moment because, as you say, it's being developed. But, like, what do we make of this?

Joseph:

Is there a future in which this AI model could be used for military things? Could it be used for more innocuous things? Is it, like, basically a wait-and-see, you know? Because I think the one thing we do know is this data is probably quite valuable. So...

Emanuel:

I think during his talk, he shows this clip where they take all the scans of this courtyard with a fountain in the middle. And with the photos of that location, plus the, like, geolocation, right, just, like, knowing where the phone is physically in the world, they were able to create, like, a 3D model of that space, like a video game. Right?

Emanuel:

So they take a bunch of 2D images. They take a bunch of location data, and they create what looks like a 3D level from a video game. Now with a game as popular as Pokemon GO, if you could do that to, like, major cities all over the world, and then one of those cities becomes, like, a war zone, right, and you have, like, what they call centimeter-level precision in mapping of those locations, then you don't have the problem that Jason's delivery robots deal with. Right? Because you're navigating those spaces like it is a 3D level of a video game.

Emanuel:

Like, you know what the space looks like. So if you have, like, you know, a Spot robot mounted with a machine gun, it's much easier to navigate that space if you have that kind of data, as opposed to doing things like we currently do, which is, like, you have a robot with LIDAR cameras, and they just go down the street and they try to figure out what is happening around them as it's happening and parse it out. And that's how you get them, like, falling off the curb, and you get, like, self-driving accidents and stuff like that. You're not trying to parse out the environment as it's coming at you. It's like you literally know every inch of

Joseph:

the environment. Or the Boston Dynamics, whatever sort of dog robot, can predict that in this sort of European-style city or whatever, or this American-style city, there are certain characteristics. And while I turn around this corner, there is a high probability that the curb is gonna be this sort of height. You know, the curb in London is gonna be of a certain texture, a certain angle, or whatever, and it could maybe make, you know, predictions or movements based on that. So, yeah, you can see the benefit of it.
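(To make the "pre-mapped space versus parsing the world as it comes at you" trade-off concrete, here is a minimal, hypothetical sketch: a robot that already holds a precomputed map answers terrain questions with a lookup, while one without a map has to interpret a fresh sensor reading at every step. It only illustrates the distinction being discussed, not anything Niantic or Boston Dynamics actually ships.)

```python
# Illustrative sketch of the trade-off discussed above: navigating from a
# precomputed map versus sensing the environment on the fly. Entirely hypothetical.
PRECOMPUTED_MAP = {
    (0, 0): "pavement",
    (0, 1): "pavement",
    (1, 1): "curb",   # known in advance, so a robot can plan for the step down
    (2, 1): "road",
}

def passable_from_map(cell: tuple[int, int]) -> bool:
    """With a prior map, terrain is a dictionary lookup: cheap and known ahead of time."""
    return PRECOMPUTED_MAP.get(cell, "unknown") != "unknown"

def passable_from_sensors(cell: tuple[int, int], lidar_scan) -> bool:
    """Without a map, every step means interpreting a fresh sensor reading,
    which is where misreadings (and curbs you fall off) creep in."""
    return lidar_scan(cell) == "clear"

# Map-based planning can reason about the whole route up front:
route = [(0, 0), (0, 1), (1, 1), (2, 1)]
print(all(passable_from_map(cell) for cell in route))

# Sensor-based navigation has to query the world one step at a time:
print(passable_from_sensors((1, 1), lidar_scan=lambda cell: "clear"))
```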

Emanuel:

Yeah. And I should also say, and Niantic says this as well, most of this type of data that exists at the moment does come from, like, dash cameras and self-driving cars. And the killer app aspect that Pokemon GO has is that it's a bunch of data from where only pedestrians go.

Joseph:

And different levels. Like, if it's a dash cam, it's just one stationary sort of thing. And if you're moving a phone around, it's gonna be all over the place.

Emanuel:

That's right.

Joseph:

Yeah. Jason?

Jason:

I was just gonna say that they're not my delivery robots, but I should start a delivery robot corporation, open source it.

Joseph:

Yeah. I mean, why not? 404... well, I shouldn't tell you what. You can take that branding yourself. Maybe that can be a side project for you.

Joseph:

Alright. We'll take a break there. When we come back, we're gonna talk about something completely different. We're gonna talk about the rise of the AI pimping industry. That is a word or a series of words I didn't think I would ever say.

Joseph:

We'll be right back after this. Alright. And we are back. This is one that both Emanuel and Jason wrote, Inside the Booming AI Pimping Industry. So how does this story start?

Joseph:

And I think it might be good to give people a concrete example of what someone might see when they sort of scroll through social media and they come across one of these posts that you're talking about in the piece. I don't mean the educational ones and how people learn to do AI pimping, but sort of what does the end product look like?

Jason:

Yeah. So this is actually a follow-up. So we did this story in partnership with Wired. It was the second story we published with Wired, and Emanuel and I worked on it for several months, sort of, like, in the background for a while. And it grew out of reporting that I did and, I believe, published back in February about the rise of AI influencers on Instagram.

Jason:

And, essentially, if you are scrolling through Instagram, let's say that you generally like a lot of Instagram influencers or models. I mean, these are, like, usually, like, almost always women, although there are sometimes men. Like, we saw some AI-generated men as well, but it's mostly women, and it's mostly them in bikinis, like, at the beach or by the pool or in front of a mirror, that sort of thing. And it's like modeling images, more or less. And, you know, it's just, like, lifestyle content, for lack of a better term, like aspirational travel content.

Jason:

And, you know, you'll either see a grid post where it's just like a photo, or you might see a reel. And that's a critical distinction that we can talk about, but, basically, we saw both types of things.

Joseph:

So, yes, you have that lifestyle content, the sort of stuff you see on Instagram all of the time. Emanuel, what does it look like? Is it photo? Is it video? And crucially, what is different about it? Because we're not writing about normal Instagram influencers.

Joseph:

What is different about this content?

Emanuel:

Yeah. So first, I would say, it's very possible that you've seen this content and you don't realize it, because it looks real. It looks convincingly real. You would not be able to tell that there's anything not right about it. But, yeah, it's either still images, models, influencers, beautiful people that you can aspire to be, or reels, which are really being promoted by Instagram right now.

Emanuel:

It's trying to put them in front of you in many ways, and those look like real video.

Joseph:

And then what's different? Is it that the face has been swapped? Like, what differentiates this from a normal piece of Instagram content?

Emanuel:

What all this content that we highlight has in common is that it is originally content from a real human creator who posted it to their social media, and someone else just took that video and used AI to make a deepfake video. But, usually, when we think about deepfakes, someone takes a porn video and puts someone else's face into it. And in this case, they take a PG-13 rated video and put an AI-generated face onto that video in order to make it seem like original content. And the face is consistent across the account. So it seems like you're following a real person, but you're not.

Emanuel:

You're following an AI-generated person who is stealing videos from all these other, usually, women in order to create, like, a viral, popular account.

Joseph:

Yeah. I mean, really, really crazy stuff, and it's especially wild that people may have seen this and not even realized it, because they are super convincing. You scroll through them and the faces match. And it's the continuity, as you say, which has always been a big problem for AI. Right?

Joseph:

But it's like a consistent fake character across the social media posts. That's the craziest thing to me. So you have all of this, but you didn't just talk about that. You actually went sort of a layer lower than that. Because it turns out there's an entire industry about monetizing these fake creators now.

Joseph:

So who is making these deepfake social media pieces of content, and, like, you know, what's happening there?

Emanuel:

So when Jason covered this back in February, he was just covering the fact that it was happening, and it wasn't clear if it was just a few people doing it for fun or to fuck with people, or if they had found a way to monetize it. The reason we came back to it is that the entire space has exploded and has become totally professionalized. So I couldn't tell you who most of the people who run the accounts are, but there are many of them. And, also, there's this other class of people now who are kind of like many other Instagram hustlers. They say, I manage all these accounts. I make this amount of money.

Emanuel:

Do you wanna make money like this as well? You should follow me, buy my guides on how to do this, watch my videos. There's, like, an entire industry of people teaching you how to do this, and you paying them for that privilege. And, yeah.

Jason:

Yeah. So we found a couple different communities of people doing this. You know, one is called the, quote, unquote, Digital Divas, and it's like, I don't know who runs it. I mean, that's kind of one of the frustrating things, is that the, quote, unquote, people running it are just AI avatars themselves, but it's like 3 AI influencers who are teaching other people how to do this. And it's, like, a $50 guide.

Jason:

They have a Discord, and they have, like, coaching on that Discord. So you can, like, join the Discord, and they will teach you how to generate these models, how to get the face similar, so on and so forth. I think one interesting thing about the Digital Divas is that they claim that they're an anti-deepfake server, and so they're like, don't do the face swaps onto the real bodies of real women, which makes them feel like they are doing something that's, like, a little bit more moral or ethical.

Joseph:

It seems like splitting hairs when it comes to the ethics of it all. So what are they doing? They're generating a whole person?

Jason:

Or... They're generating a person, and then they're... yeah. They're, like, basically not doing body swaps. They're not doing face swaps onto real videos of other women, but they're, like, using those videos as inspiration and, in some cases, like, as training data. Like, they're putting them into different LoRAs, meaning sort of like the instructions for generating these models and these images. And then they're also saying, like, rather than just stealing, like, Ariana Grande, why not take Ariana Grande and Sabrina Carpenter and, like, turn them into a hybrid person?

Jason:

And so, therefore, it's, like, not stealing a copyrighted image. You're, like, remixing it in a way that we find to be more ethical. And so I think it's, like, important to remember that, kind of like all of this AI-generated art, like, regardless of whether it's a one-to-one theft of another person's image or body or likeness, it's like they're all trained on images of real people. They're all trained on videos of real people, and that's, like, what underpins this entire industry. But it's interesting that they took this idea and they're like, oh, we're the ethical ones because we're not stealing content.

Joseph:

Yeah. Because using it as training data is still stealing. It is just obfuscating that fact. You're stealing, but it's, like, in the background, basically. It's effectively the same sort of thing, and you're just less likely to get caught.

Joseph:

So how are these AI pimps... and I should say, that's not our term. That is what some of the people say themselves. Right? We have marketing material that we got and pulled into the article.

Joseph:

How are they monetizing this material they're making?

Jason:

Yeah. So the Digital Divas just stand in contrast to one of the other people that we found, who ran this account called Emily Pellegrini, who is an AI influencer, and it's, like, I'm just gonna say she, but, again, not a real person, was written up by, like, the Daily Mail. Like, all sorts of people covered her as, like, one of the biggest and first AI influencers, and she had, like, hundreds of thousands of followers on Instagram. And now the person behind that account, who again is using, like, an AI persona of a man, is teaching people how to do this. And I would say that that guide is, like, much more nefarious than the Digital Divas.

Jason:

Like, I just wanna be clear that it's a spectrum. You have, like, the people who are like, oh, don't steal content. And then this other one is like, here's how to steal content. And it's like a real, like, wink, wink, nod, nod, be careful what you do. But it really runs the gamut between, like, hey.

Jason:

This is, like, we're creating a new thing, a new industry, and, like, here's how to, like, steal women's bodies and make money off of it. And the way that they are monetizing it is they're basically linking to this website called Fanvue or a few other websites that are OnlyFans competitors and knockoffs, more or less, that explicitly allow people to sell AI-generated nude images or nude videos. And those accounts also use stolen content, often from adult actresses.

Joseph:

Right. Right. There's the stealing from the Instagram influencers to make the, for lack of a better term, safe-for-work content that's on Instagram. And that acts as the funnel to get people to these OnlyFans-like services, which is where they've then got AI-generated stuff, which is presumably ripped off as well, either deepfakes of adult performer content or content used as training sets. You sort of have those two sets.

Joseph:

So what, is Instagram's role in this and sort of their response? I'm not sure which one of you pinged Instagram for comment, but sort of what did they say?

Emanuel:

Well, it's funny because, before we get to what Instagram said, I talked to Elena St. James, who is an adult content creator who is very aware of this problem. This has happened to her content. This has happened to the content of other adult content creators that she knows. And the problem here is not just stealing content from other people.

Emanuel:

It is then also creating a situation on Instagram where she is competing with, like, an infinitely generating number of accounts that can produce content much more easily than her, because she has to take photos and make videos. And these people are just, like, stealing or generating content, and it's just drowning her out. And in case you don't know, obviously, Instagram has a lot of restrictions about posting adult content or nudity of any type, but the way that people who make money on OnlyFans and other similar platforms gain an audience and monetize their content is by essentially advertising on Instagram. And she says that ever since this started, she's been having a harder time picking people up and converting them and so on. And when I was talking to her, she was speculating that Instagram is not only aware, and it's not only that it doesn't care, which I think is something we can assume based on our adjacent past reporting on AI slop on Facebook and stuff like that.

Emanuel:

They might actually be in favor of this because it juices the numbers. Right? It's just like more people posting, more people engaging.

Joseph:

More likes, more... More likes, and it just plays into

Emanuel:

the numbers for Instagram. And I think, Jason, didn't Zuckerberg essentially confirm this in a way on a quarterly earnings call?

Jason:

Yeah. I don't know if we talked about it on the podcast, but on the last quarterly earnings call, Zuckerberg essentially said that AI-generated content is driving more engagement on both Facebook and Instagram, to the point where they are thinking about creating a whole additional feed just for AI-generated content, which, in the past when Facebook has made new feeds, they, like, start as new feeds and then they just, like, integrate them into the main news feed. And so they think that this is good, like, by and large, the idea of AI-generated content. I will say that Instagram did take down a couple of the accounts that we flagged and that they have over time, because they have an anti-impersonation policy. And, I mean, it's a very small number of accounts that they've taken down, and there are many, many, many, many accounts doing this.

Jason:

And then I think Elena St. James raised a really interesting point, which is that when she has reported her own content as being stolen, that often brings scrutiny to her account in general. Like, her account is no longer able to fly under the radar, which is not to say that her account is even doing anything wrong. It's just that Instagram's enforcement around adult content is all over the place, and so you'd probably rather, you know, a moderator not look at your account than look at it. Is that fair to say, Emanuel?

Jason:

Yeah. You spoke to her.

Emanuel:

Yeah. Yeah. So, just to answer Joe's question more directly, we flagged a bunch of accounts to Instagram, and we're like, here's the AI content. Here's the original content that it's stealing from. Is this a problem?

Emanuel:

And they're like, maybe it's a problem. Maybe it's not. We can't say based on you flagging this. The person whose content was stolen has to report this directly, prove it's their original content, and then we maybe do something. So we flagged a bunch of stuff.

Emanuel:

They took some down. Most of it, they did not; they generally, like, don't care. The person who's being stolen from has to report it, and then maybe they will do something. To Jason's point, there's a bunch of stuff in how sex workers and adult content creators have to operate on Instagram that makes it easier to steal from them, for example. So, Elena has her account, and in the bio of her account, she names another account that people should follow in case her primary account is shut down.

Emanuel:

Right? Because Instagram will randomly, without explanation, shut down accounts of adult content creators. And rather than them having to build their following from scratch, they're trying to have, like, an emergency lifeboat that people can follow. And this is the same kind of behavior that you're seeing with the AI accounts. Right?

Emanuel:

So it's like there's two Elena St. James accounts. They're both posting Elena St. James content, and she owns both of them. But now there is a third, a fourth, a fifth that are not hers and are not authorized and are just, like, monetizing her videos without her permission. So that makes it harder for Instagram to, like, tell which is which. Right?

Emanuel:

Yeah.

Joseph:

And maybe this is a naive, silly point from me, because I didn't know this, and you both reported it. But the fact that Instagram might take it down for impersonation is tricky in a way. I mean, you would hope they would take it down because it is lifting her or other people's content. But it almost brings in too complicated or philosophical a discussion, where you can imagine Instagram would, like, start stroking their chin and be like, well, is it really impersonation if they've generated a new thing off this person's body? And it's like, dude, we're not here for the Philosophy 101 class; just, like, stop people getting ripped off.

Joseph:

But, yeah, I guess we'll see. We will leave that there. If you're listening to the free version of the podcast, I will now play us out. But if you are a paying 404 Media subscriber, we're gonna talk about something that's been requested by a lot of people, but I just wanted to pull it, kind of, onto the pod: how and why I haven't owned a phone since around 2017, and how I live my nightmare life due to that.

Joseph:

You can subscribe and gain access to that content at 404media.co. We'll be right back after this. Alright. And we are back for the subscribers-only section. I'll try not to just talk for 15 minutes.

Joseph:

So please interrupt me if you have a question, Emanuel or Jason. But this...

Jason:

So I have plenty to say about the fact that you don't have a phone and the hardships it causes me as your friend and person who knows you.

Joseph:

We can have a group therapy session. This is awesome. Okay. So the headline is, I don't own a cell phone. Can this privacy-focused network change that?

Joseph:

I'll get more into this, but the phone is by a company called Cape. They've been selling it to the US military and doing sort of research projects with them, especially in Guam, and now they're trying to offer it to high-risk members of the public. I initially filed this as quite, as we call it, a straight story. Right? Where it's just like, here's the facts or something.

Joseph:

But Jason looked at it, and I think the correct criticism was that it didn't really feel authentic. Is that right, Jason? Or, rather, I come from a very particular place because I don't own a mobile phone, and that's why I'm covering this, and I sort of didn't discuss that. Is that sort of it?

Jason:

Yeah. I mean, it was an interesting story. It's just that you didn't really explain that you don't have a phone and why you don't have a phone, which is something that you've written about before, so I know it's something that you feel comfortable talking about. And I think that this specific phone company, Cape, which we'll talk about, it didn't make you buy a phone, but you did try out having a phone for the purposes of this article, and that's because they have tried to address some of the reasons that you don't have a phone. So maybe let's start with, like, why don't you have one?

Joseph:

Yep. And I'll run through these pretty quick. But whenever I get close to getting a mobile phone, something happens again, and I'm like, oh, no. I'm right.

Joseph:

I'm still not gonna get one. Telecoms get hacked all of the time. T-Mobile has been breached multiple times. And that data includes names, physical addresses, other personal information as well. That can happen on a grand scale where entire databases are stolen.

Joseph:

Or it can also happen where many of the hackers I cover gain access to an individual account, you know, through a corrupt insider, or they gain access to the network. They look up information like that. So just in the same way you may give your address out to a company, telecoms seem particularly fucking crap at, like, looking after that information. There are fraudulent emergency data requests. We covered a while back that somebody posed as a police officer, contacted Verizon, and said, hey.

Joseph:

I'm a cop. Please give me the data linked to this phone. It was actually a woman that he was stalking; he got the physical address, and then he drove to the associated address armed with a knife. So it's not just hacking. There's sort of almost scamming, and domestic violence, and then abuse there.

Joseph:

Location data: way back, I think in 2018, we revealed that T-Mobile, AT&T, and Sprint at the time were selling precise location data of their users through a very complex supply chain that ended with bounty hunters. And we were able to tap into that and track a phone. That triggered, you know, hundreds of millions of dollars of fines and stopped that practice. There is SS7, where surveillance companies and governments will tap into the backbone of the phone network to intercept text messages, calls, and get location data. I think this was used recently to break into the Linus Tech Tips YouTube account. People keep telling me that, and I need to go check it out and watch it myself.

Joseph:

I've covered how that's being used by financially motivated hackers to break into bank accounts in the UK. I've also been called up by the owner of an SS7 company who, after I covered them, said, yeah, it's me, the guy who sells SS7 to the highest bidder, blah blah blah. I think I'd already stopped using a phone at that point. It was around that time.

Joseph:

And then, I guess, the last one is SIM swapping, which is where, I'm sure people listening to this will know, hackers and scammers and fraudsters will trick a telecom into swapping a mobile phone number over to theirs. They then control your phone service, basically, and they can get password reset tokens, 2FA tokens, do all of this sort of stuff, and break into your accounts. So, I mean, when I list it like that, it's crazy. These networks and these companies all over the world, and I'm not just saying it's a UK thing or a US thing. It's a worldwide thing where telecom infrastructure was built in a time when security was just taken less seriously, and here is where we are.

Joseph:

It's like a hodgepodge of all these different technologies.

Emanuel:

Sorry. Do you remember why you decided to drop the phone when you did? You listed a bunch of stuff, but, like, was there one thing?

Joseph:

It was... I actually can't remember the moment. I genuinely can't remember the moment. I do know it was around the time of that call I got from the SS7 guy, and I can't remember whether it was I got the call and then I was like, oh, I'm getting rid of my phone, or I had already gotten rid of it, but it was around that time. Yeah. I just genuinely can't remember.

Joseph:

You know? Jason, I feel like you were gonna ask something.

Jason:

I was just gonna say it seems like it's a hodgepodge of, like, tech oversights and then, like, active, kind of, this-works-for-a-business-model sort of things.

Joseph:

Yeah. Yeah. It's a mix of all of those. Right? And then regulatory issues as well and, yeah, all of that.

Jason:

Yeah. So you were using an iPod Touch, and now you have an iPad Mini with a little bag. The bag is very cute.

Joseph:

I've been defaulting to jackets, like shirt jackets. I have a couple of different colors, which have the correct pocket size for the iPad Mini.

Jason:

Okay. I like the bag.

Joseph:

Okay.

Jason:

Not making fun of the bag, but it's, like, a cute bag. It's a good bag. Anyway, because you bought it... maybe you wrote about the bag. Anyways, it doesn't matter. You use Wi-Fi on these devices.

Joseph:

Yeah. So, because... again, yeah, you're right. I had the iPod Touch, and then Apple discontinued that. And I think you could still get some security updates on it, but they discontinued it. Right?

Joseph:

And I would churn through those. I would get one. I would drop it, and those things stop working after you drop them very, very quickly. And I would get a new one, like, every 6 months.

Jason:

I was gonna ask, though, because I feel like there's this meme that's been, like, many times disproven at this point, but maybe let's address it, where people are like, don't use public Wi-Fi, or use a VPN on public Wi-Fi. And you do use a VPN sometimes, maybe always. I'm not sure. But it's like you actually use public Wi-Fi for your own security. It's, like, kind of the opposite of what

Jason:

most people maybe assume.

Joseph:

Yeah. That's a really good point. Like, as you say, there's always that myth where, oh, don't use public Wi-Fi because then a hacker is gonna get your credit card details or something like that. And it's like, sure, maybe that was the case, like, before 2013, when the Edward Snowden revelations came out and kind of pushed everybody to, like, implement HTTPS, which, you know, encrypts basically your web traffic when you go to a web page. It is pretty unusual for you to go to a web page now and it's just HTTP, like it's not encrypted.

Joseph:

Like, I do not feel exposed really when I'm on a public Wi Fi network. And as you say, I use those all the time. I'm walking around. Maybe there's one on public transport. I'll jump onto that.

Joseph:

Maybe there's a coffee shop or something and, you know, some of those are public. Some of those are private. And then, of course, you know, also just my own Wi Fi or friend or friends or whatever. But that threat from public Wi Fi is massively overblown now. Like, I'm not worried at all.

Joseph:

And, like, if I see anything suspicious, it's just like, oh, okay. That's strange. But as you say, I use a Wi-Fi... sorry, I use a VPN 90% of the time anyway. And, you know, that actually brings some other risks as well, where, you know, the VPN industry is a fucking cesspool of snake oil, right, with people making all of these claims.

Joseph:

But you have to choose one that works for you. And I'm not gonna say a name because, like, I don't really wanna endorse any particular VPN provider. I don't really wanna say the one I use either. But, you know, there are some out there that you can trust more than others. But, yeah, I don't have the cellular version of the iPad because, for me, that would kind of defeat the point, in that, I mean, yes, it doesn't have a SIM card.

Joseph:

You can just take that out. But it still has a baseband, and theoretically, there can still be some attacks there. And if I'm already going that extreme, I will just go for the Wi-Fi-only iPad, you know, rather than taking the SIM card out.

Jason:

Yeah. So we'll talk about what it's like to try to interact with you at the end. But tell us about Cape and, like, how they try to solve these things, and your experience.

Joseph:

Yeah. So there have been a few companies over the years which offer a data-only service or an eSIM service. And it will be something like, hey, it's an app you install, and we are a mobile virtual network operator. That's an MVNO, which is basically a phone company that sits on top of a T-Mobile or an Orange or an O2 or an AT&T or whatever. And they sort of provide their own service. Like Mint Mobile, you know, the Ryan Reynolds one, that's an MVNO.

Joseph:

Google Fi is an MVNO as well. Cape is one of those. But because they write all of the software themselves, they say they're able to do some more interesting things. One is they're able to rotate the IMSI, which is sort of a unique identifier for the subscriber, I would say, like the SIM card or the eSIM. They can also rotate the IMEI, which is a unique identifier for the phone.

Joseph:

And they're also able to rotate on demand the ad ID, which is that sort of advertising unique identifier which we've touched on a bunch with how the Secret Service was looking up location data without a warrant. That's reliant on the ad ID. What makes it interesting is that when you combine all of those, and the Cape phone rotates through them, to certain attackers on a network, potentially even the proper telecommunications company that Cape is piggybacking off at the time, be that an AT&T or a US Cellular or whatever, the Cape phone could look like a different phone every single time. And you can change that by setting up a geofence. So when I walk to my home, I wanna look like this phone.

Joseph:

When I go to work, I wanna look like this phone. Or you can do it on a timer. That's how I tested it. Every hour, with a little bit of entropy, recycle my identifiers. Or you can just do it on demand. I think that's pretty interesting.

Joseph:

I have no idea how useful it is for actual real-world attacks, and we're not gonna know for years and years. The Navy told me it has been useful at improving information security. It's just a statement, you know, I don't know anything more than that. I'm definitely gonna try and do some more digging into its efficacy. But, you know, now they are trying to get journalists or high-profile activists or domestic abuse survivors on board to then pay.

Joseph:

And I didn't actually get into this in the article, just because it didn't really come up. I was talking more about the tech. But, like, I think it's, like, a $100 a month, something like that. But then the phone is, like, potentially $2,000, $1,500. So, like, it's not cheap.

Joseph:

You know? And I didn't get into that just because I don't fully know yet; they haven't published the price list. But, yeah, that's basically what they're all about.
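(As a rough illustration of the rotation policies Joseph describes, a timer with a bit of entropy, a geofence, or on demand, here is a small hypothetical sketch of how such a scheduler might decide when to cycle identifiers. None of the names, numbers, or behavior come from Cape; it only shows the logic of "rotate every hour, plus or minus some jitter, or whenever a defined area is crossed.")

```python
import random
import time
import uuid
from dataclasses import dataclass, field

@dataclass
class DeviceIdentity:
    """Hypothetical bundle of the identifiers discussed above."""
    imsi: str = field(default_factory=lambda: uuid.uuid4().hex[:15])
    imei: str = field(default_factory=lambda: uuid.uuid4().hex[:15])
    ad_id: str = field(default_factory=lambda: str(uuid.uuid4()))

class RotationPolicy:
    """Rotate on a timer with jitter ("a little bit of entropy"), on a geofence
    crossing, or on demand. Purely illustrative; not Cape's actual behavior."""

    def __init__(self, interval_seconds: int = 3600, jitter_seconds: int = 300):
        self.interval = interval_seconds
        self.jitter = jitter_seconds
        self.identity = DeviceIdentity()
        self._next_rotation = self._schedule_next()

    def _schedule_next(self) -> float:
        # Next rotation roughly one interval away, plus or minus some jitter.
        return time.time() + self.interval + random.uniform(-self.jitter, self.jitter)

    def rotate(self) -> DeviceIdentity:
        # On-demand rotation: issue a fresh set of identifiers and reset the timer.
        self.identity = DeviceIdentity()
        self._next_rotation = self._schedule_next()
        return self.identity

    def tick(self, entered_geofence: bool = False) -> DeviceIdentity:
        """Call periodically; rotates if the timer expired or a geofence was crossed."""
        if entered_geofence or time.time() >= self._next_rotation:
            return self.rotate()
        return self.identity

policy = RotationPolicy()
print(policy.tick())                       # unchanged until the timer fires
print(policy.tick(entered_geofence=True))  # e.g. "when I walk home, look like a different phone"
```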

Jason:

Yeah. I think it's pretty interesting. It's cool that they're doing this. I'm always quite interested in new phone tech. And by that, I mean just, like, alternative operating systems and, like, ways that people tweak Android or iOS, like jailbreaking, stuff like that.

Jason:

This is obviously not that, and I believe it lives on... it's Android. Right?

Joseph:

The custom phone is Android. I think they are gonna launch an iPhone version, but that won't have all of the identity obfuscation because you need a rooted phone to be able to do that.

Jason:

Right. Right. So when I say new phone tech, I don't mean, like, new gadgets. I just mean, like, ways of altering this sort of duopoly that is Android and iOS. That's why I've always been very interested in the encrypted phones that you've written many things about and wrote a book about.

Jason:

But I guess the thing that was interesting to me is that you tried this for the purposes of this article, and you sort of ultimately came to the conclusion that, while interesting, it's just not worth the risk for you personally. And I think that that is probably the takeaway that I would have had also, because, as you said, it's like very few people have actually used this, and they haven't used it for that long. And until it starts getting used, like, much more widely in society and for a longer period of time, we're not gonna know how well this tech actually works.

Joseph:

Yeah. It's like I want to see a search warrant affidavit written by the FBI trying to get Cape data when a criminal uses it. And, actually, that might not happen for a while, because it seems like they're being pretty restrictive on who they sell it to. But it's like, when you get those subpoenas published by Signal, and they show the data they provided to the authorities, what they were able to do, from that, I can make an informed decision and be like, oh, I'm gonna use Signal, and I'm gonna trust it broadly. I basically need that before I could use this phone all the time.

Joseph:

Because, you know, I'm at home. The phone is on. It says it has this IMEI and this IMSI. Deep down, I'm just like, this is still a phone connecting to a network from my home and my place of residence, and I can't shake that feeling yet. And I know my position is extreme.

Joseph:

It's just I'm very, very used to that friction. But, yeah, I'm just curious to see what happens next. I guess just to round it out and finish this conversation, let me just say a couple of things practically, and then maybe, Jason, you can yell at me for being really difficult to get in touch with. But beyond the VPN, people often ask, well, like, how do you deal with SMS and sign-ups and stuff? I use a ton of voice-over-IP numbers.

Joseph:

I have lots and lots of them. I won't say what services I use. But, yeah, there is a problem where you'll sign up for one service, and it says, we can't take that phone number because we know it's a voice-over-IP number, and then you can't sign up for accounts, or you can't set up 2FA or something like that. And I've had to basically move banks sometimes when they're like, this doesn't work anymore, and I'm locked out of my bank account.

Joseph:

It's like, this is a massive pain in the ass. Then I get over it, and I continue my weird obsession with it. I mean, I have the sunk cost fallacy now. Right? Like, if I stop now, I'd look like an idiot.

Joseph:

So I can't. I have to keep going. You know?

Jason:

I'm not gonna yell at you. I think it's admirable that you don't have a phone. I would just say that there are definitely times where we have to do something for you because you can't do it, because you don't have a phone, which often has to do with, like, testing things or, like, receiving an SMS on a specific carrier or whatever. So that's kind of interesting. Not complaining.

Jason:

Happy to do it. Happy to do it.

Emanuel:

Joe was like, I have another story about how bad it is to own a phone. Well, one of you morons give me your phone number so I can show you.

Jason:

It's often like, I'm gonna give your phone number to a hacker, please.

Joseph:

We literally did that this week.

Emanuel:

Yes.

Jason:

It's not safe for me to do, you losers. What are you doing?

Joseph:

Please be my little guinea pig.

Jason:

Yeah. Basically. But then, sort of, like, on a more practical level, I would say that you're very good about it. But it's like, let's say that we are in the same location at some point. It can be difficult to meet up with you at a specific time in a specific place, because you have to pre-organize, as though you're in, like, the year 1999 before cell phones existed, like, what time and place you're gonna meet at, which is fine.

Jason:

And we don't need to talk about it, but you must be very good at maps, like, sense of direction.

Joseph:

I take screenshots of the routes on the maps app or whatever. And I will have 4 screenshots, and then I'm walking around and I'm literally flicking through these big screenshots on an iPad mini. And if anybody sees me, they're like, what the fuck is that guy doing?

Jason:

So, you're pretty good at it, I think. Like, but, like, let's say that you're running late or something. Like, I assume you have to, like, duck into a bar or a restaurant to log on to Wi-Fi to tell the person that you're trying to meet up with, like, hey. I'm running late. Yeah.

Jason:

Something like that. Yeah.

Joseph:

And I also leave early. I'm just super prompt. I usually get there before the allotted designated time for us to get drunk. So I'm gonna get there before, you know. I'm just very, very practical, you know.

Jason:

But that's it. You're good at it. You're good about it. It's just, you know, you have to know these things. Well, if someone's meeting up with you, they need to know these things in advance.

Joseph:

And I will say that as we come to Thanksgiving, at least for those in the US celebrating it, I'm very thankful for you putting up with my no-phone bullshit. So I appreciate that. Alright. I will leave that there. And with that, I will play us out.

Joseph:

As a reminder, 404 Media is journalist-founded and supported by subscribers. If you wish to subscribe to 404 Media and directly support our work, please go to 404media.co. You'll get unlimited access to our articles and an ad-free version of this podcast. You'll also get to listen to the subscribers-only section, where we talk about a bonus story each week. This podcast is made in partnership with Kaleidoscope.

Joseph:

Another way to support us is by leaving a 5-star rating and review for the podcast. That stuff really does help us out. Please do that if you haven't already. This has been 404 Media. We will see you again next week.