from 404 Media
Hello, and welcome to the four zero four Media Podcast where we bring you unparalleled access into the worlds both online and IRL. Four zero four Media is a journalist founded company and needs your support. To subscribe and get bonus content every single week, go to 404media.co. Subscribers also get access to additional episodes where we respond to the best comments. Gain access to that content at 404media.co.
Joseph:I'm your host, Joseph, and with me are two of the other four zero four media cofounders. The first being Sam Cole Hello. And Jason Koebler.
Jason:Hello. Hello.
Joseph:I nearly said Emmanuel just because I'm reading the script even though he is clearly not here today. So, Jason, what's going on with merch?
Jason:Yeah. So we have been we haven't restocked our merch in months, and so, you know, orders have still been coming in, but we've been basically out of most things for quite a while. So we just are doing a big restock right now. I just placed the order. So we have crewnecks and hoodies back in stock.
Jason:We have hats, both black and green, back in stock. Those have been out of stock for a long time and are very popular. And we have the return of beanies. And then very excitingly, we have a few new items of merch. So we have a new crewneck sweatshirt in addition to the one that we had from last year.
Jason:We have a long sleeve doom t shirt. So we had the tank top doom t shirt over the summer. Now that it is fall, we have a long sleeve t, same design. So we will add some sleeves.
Joseph:And that's the numbers four zero four. But if you look closely, it's a bunch of code and that code is for the video game Doom. Right?
Jason:Yes. It is. And then we have a logo t shirt. Pretty simple, but people have been asking for that. So that's all up for preorder now in our Shopify, which you can find on our website if you just look for the merch store.
Jason:And then also, if you place an order, I'm really behind on shipping. It will be out, like, this week. I'm sending stuff out. So basically, big merch refresh. Very excited about it.
Jason:Go check check it out.
Joseph:Have we been impacted, touched by tariffs at all?
Jason:Yes. Yes. We have. Probably, we'll do an article about this at some point, but we've mentioned this before.
Jason:We work with a local screen printer in LA. They're really, really cool. It's like a very small shop. And basically, like, the underlying t shirts that we're buying, the underlying sweatshirts that we're buying have gone up in price pretty significantly over the last just, like, month and a half, something like that. We're talking, like, $5 to $7 per item just for the base t shirt at wholesale.
Jason:So yeah. I mean, when we're talking about, like, tariffs impacting things, like, this is one of them. I'm gonna talk more to our supplier just about, like, what is going on, but it's interesting because some of our stuff is made in the US. We try to, like, make stuff in the US when possible just to support, like, local business and things like that. I've mentioned many times I'm in LA.
Jason:There's, like, various clothing companies here that make their stuff in LA. But they're getting the actual cotton elsewhere presumably, or they are, like, now playing in a world where all their competitors' prices have gone up, and so they can increase prices as well. I'm not exactly sure, like, what it is in this case, but the underlying cost for us has gone up, like, dramatically in the last month. So not good.
Joseph:Yeah. Just another way in which some large economic and geopolitical decisions can impact four journalists trying to sell T shirts with the doom code on them, basically. I mean, it's impacted, you know, untold number of businesses in all these different ways and, you know, it's impacted us in a fairly straightforward way probably. Like, hey, price went up, you know. But hopefully, people, you know, go check out the the merch refresh.
Joseph:I'm definitely glad that the Doom one is now on a long sleeve shirt. There was no way I was ever going to wear the tank top, although I know Jason rocks it. Don't know about Sam. Sam's a
Jason:big fan. We're brave.
Sam:Up. Yeah.
Joseph:Yeah.
Sam:Sun's out. Guns out.
Joseph:Not for me. So, yeah, I'm glad the new stuff is in. And with that, let's talk about this week's stories. The first section, all about Zora two. I don't even know if I did that
Jason:sounds like a Z. Sora, like Sora, the Kingdom Hearts character.
Joseph:I think that's why my brain refuses to pronounce it because I'm not I'm not entertaining the idea of Kingdom Hearts. I find it interesting from an IP perspective, like, I'm not, I don't want anything to do with that. Okay. It's about Sora, OpenAI's new AI slot machine app and video model. And the first one is OpenAI's Sora two copyright infringement machine features Nazi SpongeBobs and criminal Pikachus.
Joseph:First of all, Jason, what is the new app exactly? Because it's not just a new video model, which we we see every six months or whatever. Right?
Jason:Yeah. I mean, it's it's OpenAI's attempt to make a TikTok competitor. So it's a it's a social media platform or at least it's trying to be, and that's, like, kind of a new thing that AI companies have been trying to crack. Like, the new new problem they've been trying to solve is, like, how to make this stuff social. We've seen, like, Meta try to put it into its products, but Mark Zuckerberg has talked about trying to, like, make its own AI focused social media app.
Jason:And so this is OpenAI's attempt at that. So basically, you download the Sora app, you scan your face, like that's you can't really use it unless you scan your face, and that's like the first thing that you do. So
Joseph:Do you literally have to do it? Or can you just scroll if you don't?
Jason:I'm not sure actually, but it's the first thing that you're prompted to do and you can't like do much of anything on it if you don't scan your face. I should've I should've checked. I just did it because I was trying to figure out, like, how this worked and so
Joseph:But it's shoved in front of you. Yeah.
Jason:Yeah. So you open it up and, like, the camera turns on and then you do a selfie and you, like, take a video and you hold it to your face for a few seconds, and then it asks you to look left, to look right, look up. This takes, like, maybe ten seconds, and then it asks you to say three numbers. So it'll be, like, 42, 87, 91, and that's it. Like, that's you give it your voice by just saying that.
Jason:And I bring that up because maybe, like, four or five years ago on an episode of Cyber, which is the old podcast we did at Vice, we did a deepfakes audio episode where we cloned the voice of Ben Makuch, who was our host at the time. And we had to, like, find bespoke software engineers to figure out how to do this. Like, they had created a model that would, like, clone his voice. It was pretty sophisticated. It was running on their, like, own server, and Ben had to read maybe, like, twenty minutes worth of audio, which is a lot of
Joseph:audio. Thing.
Jason:Yeah. At the time, they're like, we've gotten this down from like, this is state of the art. Like, you only have to read twenty minutes worth of audio versus, like, hours and hours and hours worth of audio, and it didn't even sound that good. Like, it did sound like Ben, but it was, like, a days long process. He had to do all this stuff, and now to make pretty convincing video with synced audio that sounds like your voice, it's like a seventeen second process.
Jason:And, Sam, I feel like you have done, like, quite a lot of reporting on, like, how that tech has changed. Like, I don't know. I feel like you're you've, like, cloned yourself various times.
Sam:I mean, when deepfakes first came out in, I guess, late twenty seventeen, I tried to make one of myself. And at the time, it was like, I don't know. You needed to have, like, pretty extensive programming knowledge to even just set up the adversarial networks that you needed to, like, make a deepfake visually. And I gave up. I mean, I was like, I don't have the patience or even the knowledge to, like, follow a tutorial at the time.
Sam:And now it's like, you know, we could do it on an app, and it's literally one photo. Or in some cases, not even a photo if the person is, like, well known or public figure, you just use text. But, yeah, the it's that's the case with all of like, every format of this technology is so much better than it used to be to the point where you just need, like you said, like, a couple seconds worth of audio. It's also interesting to me because that's I I had to go through this recently with, like, my bank with, like, doing the voice verification stuff, like, to make, like, a voice print. And it's basically the same exact thing.
Sam:It's like you read, like, eight words or something or numbers, and it instantly is like, I know your voice now. It's like, this doesn't seem right. It seems like you should need more than that.
Jason:It seems like you should need more.
Joseph:Yeah. Well, and with bank accounts specifically, I used the service Eleven Labs a few years ago now. And to clone my voice with that, I had to read out, I think, a few sentences, and it was weird. It was like some romance novel, if I'm remembering correctly, and it got like a bit steamy. I'm like, why the fuck am I reading this?
Joseph:And then I did that, cloned my voice, and then I used that cloned version of my voice to break into my bank account, which was protected with Voice ID. So it's a big mess. And as you say, Jason, it's crazy easy now to do it. Yeah.
Jason:And I I bring this up and we dwell on this because, like, that is the big advancement from Sora in my opinion is, like, there there have been other, like, nudify apps and AI apps where it's like, oh, you put in a picture and then it makes, like, a crude version of it. And those have been very damaging, and we've, like, written about them. But there hasn't really been a video model to my knowledge where you can deepfake yourself or, you know, for lack of a better term, you can, like, clone yourself in, like, three seconds. And that is, like, how the app works. It's basically, like, you scan yourself, it's connected to your profile, and then you can use text like in any other AI app to say, like, Jason, writing an article in a cyberpunk future, and, like, it will make a video of that, and it will have my voice, and it will you can, like, write the script, like, blah blah blah.
Jason:And to be, like, very honest, it is pretty damn good. Like, it is shockingly good. I think that there's still, in many of the videos, like, quite a tell where it's like, oh, this is AI generated. But with things like animation, which we'll get into in a minute, like, you can tell it to make an episode of SpongeBob, and it will look like the actual animation. And the the videos, like, look pretty realistic.
Jason:The audio syncs pretty well. The voice sounds like my voice. It's, like, quite concerning. And then you can, it's called Cameo, you can, like, put anyone else who has the app into your videos by just, like, tagging them and, like, generating video with them.
Jason:So, basically, like, people have been making videos of Sam Altman, the CEO of OpenAI, by tagging him and then making him say and do, like, crazy shit. And and notably, and I think to, like, OpenAI's credit, this is like an opt in system. And so you can be on the app and say, don't let anyone make videos of me. You can also say, only let my friends make videos of me. And you can also say you have to, like, manually approve any video that has you in it before it is published publicly.
Joseph:I wonder, is Sam Altman doing that to all of the ones of him in even Nazi uniforms and shit, just going through his notifications?
Jason:No. No. I think you can I think you can make it just like an anything goes situation, but like there is a setting where it's like I have to manually approve each and every video that's made of me. Although, like, I actually haven't tested it enough to know because you can generate drafts that live on like the person's phone, and then that person can download the video, which I think is actually one of the, you know, concerning things about Sora and what we're gonna see and already are seeing is people using Sora to generate video, downloading it, and then posting it elsewhere. Because, like, if you stay in the Sora app, which, again, is just like a TikTok clone, so everything on the app is, like, vertical videos that you can scroll endlessly and they're all AI generated.
Jason:It's like if you're in that app, you know it's fake because it's like an AI app. So, like, on that front, it's actually slightly better than, like, my Instagram feed, which is full of AI shit. But, like
Joseph:Which isn't disclaiming stuff.
Jason:Yeah. Whereas, like, everything on Sora is AI generated. But, like, Sora is so good at generating AI video that you can just, like, generate the video, download it, and post elsewhere. And, like, that's probably advantageous for the people doing this because the other platforms are monetized as we've, like, talked about endlessly.
Joseph:Yeah. And we'll get to more about spreading these videos and sort of misleading people. But as well as uploading your own face, you mentioned there's SpongeBob, and, of course, the headline mentions criminal Pikachus as well. When you first open the app and you start scrolling, what are some of these examples of stuff you're seeing? Because the article starts with basically a very long list to give readers an idea of what people are seeing.
Jason:Yeah. So I got this app the day that it came out, and the way that I got it is I bought I bought an invite code on eBay because I was not invited by OpenAI. Which we also
Joseph:covered to be clear because that's interesting in its own right. Yeah.
Jason:Yeah. And it was full of just like copyrighted stuff. And so, like, I think the first video I saw was SpongeBob, you know, saying like, I'm on the Sora app, oh my god, and like talking to Patrick and it sounds like SpongeBob, it sounds like Patrick, it sounds like Squidward, whatever. Then I saw there's so much Pikachu. There's, like, tons and tons and tons of Pikachu.
Jason:I actually saw a video of Sam Altman grilling a dead Pikachu on a charcoal grill and him going, like, and like cutting into Pikachu and being like, I'm gonna get sued for this, which I thought was actually pretty funny. Sure. There there's like tons and tons and tons like Mickey Mouse, you know, Bart Simpson, Peter Griffin from Family Guy, like any copyrighted character you could possibly imagine was on there and people were doing all sorts of shit with it. Like, you know, there was crossover stuff. There was a lot of people putting them into, like, Minecraft.
Jason:There was, like, Fortnite, but it has, you know, Peter Griffin shooting Stewie or whatever. And that was like the vast majority of the videos that I saw. It was like a mix of that, a mix of like Sam Altman videos and then like random people, you know, posting themselves just trying the app out. But like the stuff that was going viral that was being delivered to me by the algorithm was primarily copyrighted content.
Joseph:Yeah. And you write in the piece, quote, with the release of Sora two, it is maddening to remember all of the completely insane copyright lawsuits I've written about over the years, some successful, some thrown out, some settled, in which powerful companies like Nintendo, Disney and Viacom sued powerless people who were often their own fans for minor infractions or for use of copyrighted characters that would almost certainly be fair use. That was a long quote. Sorry. But what do you mean by that?
Joseph:Because you're seeing all of this, this wave, this flood of copyrighted material in this app. Clearly, OpenAI has scraped these characters and this material from the web in some fashion. And then you have companies like Nintendo who are usually crushing random ass fans for making stuff about their games. Like, what do you think about that disconnect?
Jason:Yeah. I mean, I think that there's a huge disconnect, with a caveat that we'll talk about in a second of what has happened over the last week. But basically, like, Nintendo is extremely litigious. They go after people who pirate their games, which makes sense, but then they also go after people who, like, mod their games or emulate their games. I wrote about one time the Pokemon Company and Nintendo suing this fan who threw an unofficial Pokemon party at a game dev conference, like, not even affiliated with the game dev conference.
Jason:It wasn't like it was at this big event. It was just, like, at some satellite bar, and they sued him because they put a Pokemon on the poster of that tiny, tiny event. And so then to see just like this anything goes extremely blatant copyright infringement by a company that's worth billions and billions of dollars and has all this funding is kind of insane to see like how far we've come. We've talked about the copyright issues with AI before, so we won't get into it too much here because a lot of it is, like, unsettled law as to whether this is fair use. But, like, it was only a year and a half ago or so that most of these big AI companies, like, wouldn't even say whether they were training on copyrighted content even though it was obvious that they were.
Jason:And now it's like you can have a picture perfect, you know, episode of The Simpsons with all of the voices synced up. It's like, obviously, they were training on all of this, and so it's kind of insane. And to be clear, like, there have been a couple high profile lawsuits against AI companies. Like, Disney is suing Midjourney, but you would think, like, you have, like, Mario, like, when you have Sam Altman, like, killing Pikachu, you'd think, like, hey. Maybe they would be mad about this.
Joseph:Yeah. And there's still time, obviously, for, like, lawsuits to happen and stuff, but companies can opt out of their IP being used in the app. And as far as I know, companies have to do that on like a per character basis. Like Disney or Nintendo can't go and say, take all of our stuff out, please. They have to go, could you take out Pikachu, please?
Joseph:This is what Pikachu looks like. And I mean, I remember way back when Pokemon was, what, 151 Pokemon? How how many how many is it now?
Jason:I mean, I think it's up near a thousand, if not over a thousand.
Joseph:Yeah. So Nintendo's gonna have to send a thousand letters and open a thousand different cases about Pokemon that I know. I don't know how many people care about the latter series of Pokemon. That's my bias showing. But Disney's gonna have to do the same for that and, you know, maybe that will, you know, please these companies.
Joseph:But, yeah, sure, a lawsuit could happen. And with that said, because now the app's out and companies have had a minute, a few days to, you know, opt out certain characters, have you seen any change on the app? Like, oh, actually, I can't make SpongeBob a Nazi anymore. Have you noticed anything?
Jason:Yeah. Yeah. So actually, yeah, I didn't mention, but, like, there are videos of SpongeBob being a Nazi that I saw, like, right after I opened the app. And then, like, one thing you can do in Sora is you can then tweak things that have already been published. And so people were like, now make it Pikachu.
Jason:Now make it Goku. Now make it Rick and Morty. Now and so there was just, like, an endless scroll of, like, Nazi IP, like, with swastikas in the background. Like, it was crazy. It was crazy.
Jason:But, anyways, over the last week, a lot of the characters have been opted out by the companies that own them. And that is now one of the, like, main things that people on Sora are making videos about and talking about where basically, like, Sora is no fun anymore because everything I try to do is a content violation. And so a lot of the videos are about content violations and getting, like, Sam Altman to be, like, say things that are like, you can't have fun on my app anymore, which is like the app only came out six days ago. And so, I mean, that that raises questions about whether this app will have, like, any sort of staying power whatsoever. We've seen it time and time again, like, big fancy new social media app is released.
Jason:There's like an invite system, so it's hard to get on. Clubhouse,
Joseph:that sort of thing.
Jason:Yeah. It like seems really exclusive. And then it's like, oh, we got an invite. Like, gotta get on this app. And then it's popular for a few days or a few weeks and then it kind of peters out.
Jason:And it's like, I don't know if this will have any sort of staying power. Like, obviously, ChatGPT has, but not everything that OpenAI has made has been like some massive hit. And so, yeah, it's like a lot of the characters have been opted out and people are now, like, complaining about it. And let me just say that the opt out system is like a system that OpenAI devised. It's like that's not like a legal framework.
Jason:I mean, very very easily, like, a company could be like, no. Take down everything or we're gonna sue you. And I don't know how that would go, but it's worth saying that this is like an interesting new, like, paradigm that they're trying where it's like, we're gonna put you in the app unless you specifically tell us you're you're not gonna be in there.
Joseph:That's not how copyright works. It's literally the opposite of how it's supposed to work. Yeah. And, I mean, I guess maybe to wrap it up before we go to the watermarking thing, just with all of that IP now being removed by the copyright holders, it almost just brings up a question of, what was the fucking point of this app? Like, literally, what is the point of this app? Like, what do you think?
Jason:Well, I mean, I I think it's like AI generated video is very popular on the Internet just because of the AI slop phenomenon and all that. And it's like, this is the best, easiest slop maker. I mean, I think that people will find ways to jailbreak it, for lack of a better term, like to get around the the guardrail and things like that. And then also, it's like most AI slop that we see on the Internet doesn't have characters in it and stuff like that. And so they've made, like, the best easiest AI slop generator that there is.
Jason:And I think that to the extent that there will be staying power for this, I think it will probably be from the slop manufacturers who are using the app to make stuff, but then are taking it off the app and putting it elsewhere to go viral in the ways that we've, like, talked about before.
Joseph:Yeah. And so yeah. More like more like generic monster does x y z or something. Like, it doesn't have to be some IP from stranger things or something. It's just gonna be more generic creations, which don't violate copyright probably, maybe.
Joseph:I also, I'm really sorry if this is so dumb and this was like the whole in joke the entire time, but I'm just sort of realizing that Sora from Kingdom Hearts, which is a video game which has all of this IP crossover from Disney with, you know, Goofy and all of that. And then OpenAI uses that name for its massive copyright infringement app. I'm sorry. I'm only just putting two and two together.
Jason:But I I mean, I don't think that's why it's called Sora,
Joseph:but that's really Yeah. Exactly. It's like
Sam:There's also Cameo, which exists too. Like, they use the word Cameo for literally what Cameo is, which is like a put-a-celebrity-in-something-to-make-them-say-what-you-want-them-to-say app. Does Cameo not own that trademark?
Joseph:Surely, they have a trademark on that. I'm actually gonna look that up while, Sam, I'm gonna ask you about this next story, and the headline is and and Matthew, our regular contributor wrote this, Sora two watermark removers flood the web. But basically, you know, every Sora two video comes with a watermark. It's sort of like a little spinning icon that, as Jason says, you can be like, oh, that comes from this app, so it's gonna be AI. But as Matthew has reported, there's now this wave of watermark removers where you just upload the video and it takes it out automatically.
Joseph:I'm just wondering what you what you think of that because, of course, you've covered a ton of, you know, abusive deepfake and AI videos as well, which don't present themselves as AI generated. They don't have a watermark very much deliberately. What do you make of Sora trying to have a watermark and then it being defeated in twenty minutes or whatever it was?
Sam:I mean, it's, like, very predictable. Like, you can take watermarks out of pretty much anything. Like, TikTok has watermarks where a little TikTok logo bounces around. It's just not, like, watermarks in general as a solution for AI generated content, it's, like, a good start, but it's relying on people to keep it in the video, to, like, not use something exactly like this to take it out. Watermarks are just such a flimsy indicator of AI content in general, and also you can put a watermark on a real video. Like, you could edit a video to put a Sora watermark on it and say, oh, that's not real.
Sam:There are types of watermarking that aren't what we're talking about, like, the little logo. Right? Like, the thing that is on the video that you can see with your eyes. There's types of watermarking that's more like thumbprinting or hashing, where it's not visible to the naked eye, and it's much harder to remove. You have to, like, edit the video to actually be able to see it.
Sam:Google has something like this for Veo. But with this particular type of watermark, it's, like, very predictable that people would try to take it off of Sora videos to make them seem like they're real videos. And then they repost them to, like, Reels and other platforms where, you know, it's like they're mixed in with real stuff. Yeah. This whole story was very, it's like, yeah.
Sam:That's what people are gonna do with this. It sucks that this is, like, the cottage industry that pops up every time. So it's definitely worth reporting on and noting that here it is again, like always.
Joseph:Yeah. And it was just done so quickly, and it can be automated by tools that anybody can then use. I'm looking at the Cameo trademark and I'm not a copyright or a trademark lawyer. I think Jason would probably know more about this. But you know, it says in its goods and services section, downloadable software used to create personalized video messages for entertainment and structural inspirational or greeting purposes featuring athletes, actors, entertainers, influencers, etcetera, etcetera.
Joseph:I don't know. It sounds like you could do that on Sora. So I guess we'll see. I don't know. But then there's this specific style of the Cameo logo, whatever.
Joseph:Yeah. I'm pretty sure there's gonna be at least one lawsuit after this app in some sort of direction. Alright. Let's leave that there. When we come back after the break, we're gonna talk about the apps ICEBlock and Red Dot, which are, you know, both for reporting sightings of ICE officials, and how Apple and Google have removed those.
Joseph:We'll be right back after this. Alright. And we are back. Jason, do you want to take a read on this story?
Jason:So this is a story that you wrote. You've actually written a handful of articles about this and this phenomenon. The title is ICEBlock owner after Apple removes app, quote, we are determined to fight this. I guess let's just start, like, I mean, what is the ICEBlock app?
Joseph:Yeah. So ICEBlock is an app that was only available on Apple devices, and you downloaded it and you could anonymously report sightings of ICE officials in your local proximity. So it would have access to your location data. They say they treat this, you know, in a very privacy centric way, and a security researcher who looked into the app largely corroborated that as well, didn't find anything sketchy going on with the location data. The idea is that, oh, I can turn that on and then if I see an ICE official in this location or a raid going on or somebody not getting due process rights or whatever, I can report it on the app and that will then alert people, say, in that neighborhood or maybe the next neighborhood over depending on how close obviously the activity is.
Joseph:And this is one of several apps like this and we'll talk about another one in a minute, but it really rose to prominence in June when CNN reported on the ICEBlock app. And the Trump administration was very, very mad about it to the point where they even floated these ideas that we're gonna try to find a way to charge CNN for covering and allegedly promoting this app. Obviously, that's ridiculous and I don't think they'd have any legal basis, but, you know, sort of who knows in this day and age. But that's how it grew and grew and grew to the point where, you know, it became pretty popular. And when people think of ICE spotting apps, they typically think of ICEBlock just because it's sort of the most well known at this point.
Jason:Yeah. But this app has been pretty widely used by activists, by people in their neighborhoods, etcetera, to essentially tell people where raids are happening. Like, I, again, live in LA and during the the I mean, there's still it's still happening. It's still happening, like, everywhere, especially in Chicago, especially in LA, happening in Portland, but it's happening all over the country. It's like my neighbors were like, download the Ice Block app.
Jason:Like, you'll be walking around, you'll see flyers that a lot of, like, coffee shops will put on their doors and things like that, which is like what to do in case there's an ICE raid, like how to support the people who are being, you know, accosted and detained and things like that, grabbed off the street. And a lot of them are, like, download the ICEBlock app to, like, report this. So that's, like, very notable, I would say. So, anyways, it gets the attention of the Trump administration because CNN reports on it, and then Apple takes it down last week. What is, like, Apple's reasoning for taking it down?
Joseph:Yeah. So I spoke to the developer of ICEBlock, and they shared the email they got from Apple after, you know, the app was removed, and they point to a couple of parts of their terms of use of, you know, their App Store policies or whatever. The first one is 1.1, objectionable content, which, you know, includes what they call offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste or just plain creepy. And then they include some examples and they point to defamatory, discriminatory, mean spirited commentary about religion, race, sexual orientation, gender, ethnic origin, or other targeted groups. I emphasize that because that's the only one I could possibly in a million years see this app kind of, at a stretch, touching.
Joseph:But that is what Apple says. And crucially, this came after there was a shooting at an ICE facility, I think a few weeks ago at this point, where a shooter shot into a van near the facility and authorities claim that the target was ICE officials, probably based on messages allegedly written on the bullet casings. Unfortunately, you know, someone died in that event. It was a detainee in the van and others were seriously injured. But that sort of brings up ICEBlock again in front of the Trump administration and it seemed, well, I mean, it is clear that pressure from the Trump administration and Pam Bondi's DOJ directly led to Apple removing this.
Joseph:I mean, Pam Bondi went to Fox herself, though again, that's the attorney general, and said, you know, we don't want this to happen. Basically, I'm paraphrasing slightly. You can see the full quote in the article, but it was direct pressure from the DOJ leading to this. And I think you found that particularly wild, Jason. Right?
Jason:Well, I it just so a couple things. One, like, during COVID, the Biden administration sent some, like, relatively innocuous, like, emails to Facebook and Twitter being like, hey. Can you take down this disinformation about vaccines and things like that? And they did. And this became the basis of Elon Musk's the Twitter files, which after he bought the the app, like, released all of this internal Twitter stuff that was, like, evidence of a massive government attack on free speech because of pressure from the Biden administration on these social media companies to take down content it didn't like.
Jason:Like, so, basically, this has underpinned, like, a gigantic right wing grievance for years at this point. Like, these couple of emails have sustained, like, frankly, a years-long, like, outrage cycle.
Joseph:And careers are built on it, basically. Grifting careers.
Jason:Yeah. Yeah. Grifting careers are built on it. Like, lots of people have been fired, like, you know, like, Elon Musk came in and fired everyone at Twitter and, like, anyone who had anything to do with anything involving that. I don't know if there's been, like, threats to prosecute, like, the Biden officials who sent those emails, but, like, it's a big fucking deal, like, in right wing circles for a very long time.
Jason:Now this is, like, literally the exact same thing, arguably even worse for a variety of reasons, because they're not saying, like, hey, take down a couple posts. They're saying, hey, delete this app or else, essentially. I mean, I think that if the DOJ comes to you and says, like, delete this app, the else is implied, I would imagine, especially, like, with this administration and especially when you have, like, Tim Cook going to the White House giving, like, a glass ornament to the president, like, things like this. And so you have this happening, and it's like, I don't know how many people use this app, like, hundreds of thousands. I don't know.
Jason:A lot. It's a pretty popular app. It was, like, top of the App Store for a little while or near the top. Like, that's not good. And it's definitely, I don't know, it's like First Amendment violation adjacent at the very least.
Jason:And so, you know, the creator of this app says that he's going to push back against it. I think let's talk about the Google situation, and then we can talk about some of, like, the reasoning for this and the, you know, terms of service and things like that. So this second story is called Google calls ICE agents a vulnerable group, removes ICE spotting app Red Dot. So, basically, ICEBlock was never on Android. Right?
Joseph:It was never on Android because the developer says they couldn't do the privacy protecting things they like to do on Android. Some people disagree with that, but that's their stated reason. Yeah.
Jason:Okay. So there's another app called Red Dot, and it also gets deleted from the Google Play Store. What happened in this case? And this happened immediately after the ICEBlock situation.
Joseph:Yeah. As far as I can tell, this wasn't simultaneous with ICEBlock. I think it happened afterwards. Of course, if anybody has more information on that, you know, especially about Google's decision, let me know. But that's also my understanding at the moment.
Joseph:Yeah. Red Dot was a very, very similar app. I mean, at the end of the day, all of these apps are going to be pretty similar. Right? All you're doing is reporting a location saying there are ICE people there.
Joseph:Red Dot did much the same thing. Both Apple and Google removed this app. You can't get this app on the Apple App Store. Apple did tell me it removed multiple apps when it removed IceBlock, but the company stopped responding when I asked, well, was it Red Dot? Because I found out the day after.
Joseph:I then pinged Google because that's even more interesting, because it signals that, oh, there's like a broader crackdown on these ICE spotting apps and it isn't just limited to Apple. Right? So I reach out to Google and I'm gonna try and summarize, I'm gonna read basically what we put in the article because I'm sure, for reasons that Jason will bring up, it was a bit of a weird response. First of all, Google said it didn't get any outreach from the DOJ. So like, I asked that explicitly, like, did the DOJ tell you to take this down?
Joseph:They said, no. They didn't talk about the ICE facility shooting specifically, but in this very roundabout way, Google said it removed apps that share the location of what Google described as a vulnerable group after a recent violent act against them connected to this sort of app, which is like, well, that's a lot of words to say that you removed ice spotting apps after the shooting at an ICE facility and most importantly, that you consider ICE officials a vulnerable group. Jason, when I initially filed it, I don't think we changed the copy. Well, no. We did later on a little bit, but the headline crucially was not stressing the vulnerable groups part.
Joseph:You pulled that in, quite rightly. Why did that stand out to you? Calling ICE officials a vulnerable group, it's an obvious question.
Jason:Yeah. So we used to report a lot on terms of service and content moderation on social media platforms, specifically, like, after the twenty sixteen election where there was, like, during, like, Black Lives Matter and, like, the Me Too movement and things like this, there was, like, a real effort from social media companies to devise rules that would protect, you know, quote, unquote, vulnerable groups. And vulnerable groups is defined differently by every social media company, but, like, broadly, it means ethnic minorities, gender minorities, religious minorities, people who are traditionally discriminated against or the subject of hate speech. So trans people, black people, you know, in in specific countries, it's like religious minorities that are subject to, like, hate speech and genocide and and things like that. Like, there are so many examples of just, like, really awful things happening on social media, you know, like Facebook being credibly accused of facilitating a genocide in Myanmar, things like this.
Jason:And so they write rules in response to these, like, really awful real world situations. And the catch all term that social media companies use is vulnerable groups. And, like, in no world was, like, police officers, law enforcement, like, a vulnerable group when these rules were written. And I think to call them a vulnerable group now is, like, insane. And it also mirrors, like, what the Trump administration has been saying, where they say over and over and over again that these ICE agents are risking their lives, that they're trying to keep America safe, that they're being targeted, that they're being stalked, that they're being harassed, which has, like, been the underpinning for why they're not including ICE agents' names on, like, indictments in court and things like that, affidavits, which is very rare.
Joseph:And masking, of course. Yeah.
Jason:Yeah. Masking, like, refusing to say their badge number, refusing to identify themselves, which is why you have, like, sometimes what seem to be, like, plainclothes officers, like, abducting people off the streets because they're like, oh, well, we're being targeted so we can't identify ourselves. And so this is now Google, like, parroting that language and also using these, like, guidelines that they wrote years ago and retrofitting them to, like, come up with a pretext to ban this app. And Apple didn't say the exact same thing, but it was somewhat similar where it was like, you know, you already said what the reasoning was.
Joseph:Targeted groups.
Jason:Yeah. Yeah. And it's like, okay. I don't know. Like, not good.
Jason:Not good. I I don't know what to say other than it's like we're now in a world where, like, trans people are not protected on the Internet. Like, you know, actually targeted groups are not protected on the Internet, but police officers are.
Joseph:I wonder if Google considers people like the woman who was shot seven times by an ICE official a few days ago in a hotly disputed event in Chicago. Right? I wonder if they considered that person part of a vulnerable group as well. Yeah. I it is very much co opting the language and the methodology and the thinking from a content moderation approach from a different time.
Joseph:Like, this doesn't, well, they're trying to apply it here in a time that is radically different to, you know, 2016 or even 2020 or anything like that, really. I did get a statement from Fire app, which is another similar app. And I should say another one was removed. I actually can't remember the name of it right now because there's so many of these, weirdly, but another one got removed. Fire app reached out to me and they're much more, I mean, they're closer to, like, Citizen.
Joseph:They're sort of broader, but when I checked the Fire app website, and interestingly, it's a website as well, you can view it. You can't report on there yet. But, of course, a website is harder to block than an app. Right? I went on there and they sent me an email and they said that it's concerning to see platforms similar to ours, a Citizen style app, removed despite their clear compliance with legal standards.
Joseph:This raises serious questions about fairness and transparency. This action seems to be based more so on fear of retaliation and retribution from the Trump administration and in line of kissing the metaphorical ring, something we unfortunately have seen many top executives do in order to placate Trump on their side. I mean, I think that puts it pretty well. How about we leave that there? I'm definitely gonna keep an eye on all of these ice spotting apps being removed.
Joseph:If you're listening to the free version of the podcast, I'll now play us out. But if you are a paying four zero four media subscriber, we're gonna talk about a very, very significant update to a story we broke back in May. This radically changes and adds to the narrative that the police and Flock, the surveillance company, gave at the time. You can subscribe and gain access to that content at 404media.co. We'll be right back after the subscribers only section.
Joseph:Here is the head, excuse me. Here is the headline. Police said they surveilled woman who had an abortion for her, quote, safety. Court records show they considered charging her with a crime. Actually, Sam, I'm gonna go to you first because you did the second pass on the story, and I saw you posted on Bluesky.
Joseph:I think just calling it insane or wild. I can't remember the exact word. Is is that your feeling? Insane and or wild?
Sam:Yeah. It is. It's insane. It's wild. As I said last week, it's a very classic horror story.
Sam:It has it all. Technology that horrifies me is mostly the theme. Yeah. We have police misleading the press, Flock covering it up, the press, not us, believing and eating up everything the cops and Flock have to say. In the meantime, and all this time, we have the cop at the center of this whole debacle getting fired, right, for allegations of sexual assault.
Sam:So, yeah, it's a
Jason:Not fired, arrested, charged with three felonies, out on bond, back in his job. He was suspended for, like, quite some time, and now he's, like, back in the job until his trial.
Sam:So Yeah. Even better. So yeah. I mean, I was I was stunned by the story, and it's an incredible piece of reporting just start to finish just because it has been such a ride since May. So yeah.
Joseph:Yeah. So, Jason, remind us what the first article we published about this said. It was back in May, and it basically showed this Texas officer doing something. What happened there, and what explanation did the police give at the time?
Jason:Yeah. I'll preface this. Sam Sam just said this, but, like, you and I reported both of these stories a bit. This is, like, one of the most complicated stories we have done in quite some time, and, like, explaining it succinctly is very hard because there's lots of nuance. There's lots of weirdness.
Jason:There's just, like, a lot of documents, blah blah blah. The actual story is, like, not that complicated, but, like, the actual specifics of, like, did the cops lie? Did they omit stuff? Like, it's tricky. And so, basically, back in May, we were doing a lot of reporting on Flock, the license plate surveillance network. And we were getting these documents called audit reports that show why cops are using the surveillance tool.
Jason:Like, each and every time they use the tool, they have to put something in the, like, reason field of the search. And we had this instance from Johnson County, Texas, and it was, like, May 7, I believe. But, basically, we had a FOIA. The FOIA was passed to us by the person who did it, although we filed some FOIAs that have since, like, had this information in it as well.
Joseph:Rose Terz being the person who originally flagged it. Yeah.
Jason:Right. Right. So they ran a search that I wanna get it right. Oh my god. I can't find it.
Jason:It was like, okay. Had an abortion, comma, search for female was the stated reason for this search in Johnson County. So we saw that and we're like, holy shit. That's really bad because abortion rights activists, women, surveillance and privacy experts, etcetera, like everyone who cares about this issue at all, has been saying for a very long time that surveillance tools are going to be turned on women who are trying to get abortions, who have had abortions, or people who are trying to assist them, etcetera. Like, the surveillance state is going to be turned on them.
Jason:And so this seemed like an example of that. We call up Flock or we email Flock asking them for comment, like, did you know that this is happening? Is this allowed? Like, what's going on? They respond and they're like, hey.
Jason:There's, like, more nuance to this story than it seems. You should really talk to the Johnson County sheriff. So we call up the Johnson County sheriff, who we'd already reached out to for comment, but it's just, like, it's worth noting that it's kind of rare for a surveillance company to say, like, hey, you should talk to the cops.
Joseph:I've never seen that before.
Jason:Yeah. I can't think of an example, but they're like, you really need to get the full story because, like, that's not the full story. So I call up the cops. I get the sheriff, Adam King. I talked to him for nine minutes.
Jason:Like, I had it in my notes and everything. And he is like, this was an example of a woman who gave herself an abortion. It was a self administered abortion, and her family was really worried because she was missing. And he used the words, we were worried that she was bleeding out, like that she was bleeding to death somewhere. And so it was a missing person investigation.
Jason:She was never the subject of a crime, like a crime investigation. We were simply looking for her, so we ran her license plate on Flock. He said that they eventually found her a few days later in Dallas or they, like
Joseph:No. There there was a hit in Dallas.
Jason:There was a hit in Dallas on Flock, and they eventually found her a few days later safe. They did not say where they found her, how they found her, or, like, what the circumstances of any of this were. So we report that because that's all we had to go on, and also it's just like that's what we reported.
Joseph:And that's still hugely significant. It's still hugely significant that Flock was used at all to look for somebody who self administered an abortion because as the experts we spoke to said, I don't know. Maybe she didn't want to be contacted by her family. Maybe she was worried about being, you know, the target of an investigation or whatever. It is still very significant including with that nuance, which, of course, we include.
Joseph:Yeah.
Jason:Right. So
Joseph:But then we get these new documents.
Jason:Well
Joseph:or did you wanna say
Jason:Sorry. I think we should add more, which is, like, we publish the story. It becomes huge national news. Right. Right.
Jason:Lots of people write about it. Local people write about it. You know, national news writes about it. The Electronic Frontier Foundation writes about it. And then there are a couple, like, investigations into Flock because Texas searched the national network, so it was more than 80,000 cameras.
Jason:So it wasn't just cameras in Texas, it was cameras in California, it was cameras in Oregon, it was cameras in states that have very, like, liberal and favorable abortion laws. And so some of these states have laws that explicitly say that you cannot use surveillance for this purpose. And so there's, like, some reform. Basically, like, I think California, Oregon, Illinois, I could be wrong on this, but they basically are like, you can't use flock for this purpose. So, like like, we're shutting this down for, like, national searches from other states for this purpose.
Jason:Flock does some, like, reform as well, like adding some blocks and, like, guardrails to its tools. We don't need to get into the specifics of why, but it leads to, like, some changes. And then Flock was like, despite all of that, we are still kinda, like, mad about this story. They call it clickbait. The CEO of Flock, this guy Garrett Langley, goes on Forbes, like, Forbes does a big article about them.
Jason:So the surveillance reporter at Forbes, Thomas Fox Brewster, does a big feature about Flock, which includes an on camera interview with Garrett Langley, the CEO of Flock, and he says, like, the journalist got it wrong in this case. Like, I think they got it wrong. He said that he thought it was clickbait and, like, some other things. So then, okay, agree to disagree. Like, we thought our reporting was accurate.
Jason:It was accurate. We you know, time passes. And then the Electronic Frontier Foundation filed some FOIA requests for the investigative documents in this case in Johnson County, Texas. And they get court records about an arrest that happened. And what happened is basically, like, everything that the police said in the first place was more or less not true.
Jason:The facts of the case were far worse than we initially reported. And the long and short of it is, like, the police did consider charging this woman with a crime. They asked the district attorney whether they were allowed to charge her with a crime. The district attorney said no. There's no mention of the woman's family in the arrest report, which we'll talk about the arrest in a second.
Jason:And, like, the only family member mentioned in any context was this woman's partner, and this woman's partner was arrested for allegedly assaulting, choking, threatening, and holding the woman at gunpoint on the day that she had the abortion. And he is the one who called the police, and he called them two weeks after the abortion occurred, meaning she had not just had the abortion, like, she was not likely to be bleeding out because two weeks had passed. It is implied in the police report that the reason the woman was missing was because she was assaulted by this man, so she went somewhere else, like she didn't return to the home. And so, you know, basically, like, everything that had become public about the story previously, the facts of the case were, like, far, far worse and far crazier and far, like, more upsetting. And in many cases, like, the exact dystopian nightmare that experts have been warning about for quite some time.
Jason:Just to read very quickly from the the arrest report, quote, it was discussed at the time with the district attorney's office and learned the state could not statutorily charge the woman for taking the pill to cause the abortion or miscarriage of the nonviable fetus. They also call it a death investigation into a nonviable fetus, and so that's like why they were there in the first place was to investigate this abortion. It was not to, like, look for the woman.
Joseph:Yeah. Crucially, we don't know the exact times, but crucially, and correct me if I'm wrong, Jason, but on the same day, they're discussing whether they can charge this woman related to this fetus. On the same day, they perform the Flock search for her vehicle in an attempt to locate her. Now, we don't know whether the Flock search was after they spoke to the district attorney's office or before, but it was on the same day and I think that was very illuminating, you know.
Jason:Yeah. It was on the same day, and then, so that all happened, like, in May. Then weeks pass, this man is arrested for allegedly assaulting her. So like
Joseph:Because the woman went to
Jason:Yeah. Because the woman, when she was, like, discovered safe, she went to the police and was like, my partner abused me and I wanna report the crime. And that is then when, I mean, it's not even clear when it changes from being a death investigation into, like, an assault investigation, but this is all part of, like, the broader narrative that's included in these court records. Like, that's sort of, like, when they begin investigating the boyfriend, but this is after they already used Flock to try to search for her. So then we get the records, like, you know, in May, we write the article.
Jason:Another week passes, and only then do the cops create what's called a supplemental report. So, like, an additional document that they add to the court record where they try to justify the use of Flock. And in that court record, which was created in June, so after this had already become like a big shit show for Johnson County police and for Flock, they have someone write a document that says, we were concerned about this woman's safety. And they don't use the word abortion anywhere in the piece. They basically just say, like, the family was concerned for this woman's safety, and so we ran Flock to try to find her.
Jason:So, you know, we can't say exactly why or what happened, but this seems like an attempt to justify the use of Flock after the fact, because that's the only place that they try to justify it, and the justification comes after this was already a problem for them.
Joseph:Yeah. Squaring everything in this case is very, very difficult. And as you say, it's one of the most complicated stories we've done in a bit. Let's just, for a second, assume, okay, they do the supplemental report, which maybe they should have done beforehand. They shouldn't have done it a week after it was this massive issue.
Joseph:And maybe, you know, they hear from the family that, yes, we need to search for this person. As you allude to, the only family mentioned is this allegedly abusive boyfriend who was charged. So maybe that's the route it went down. It's a jigsaw puzzle for all of these different things, but absolutely regardless, these documents which have come out provide way more necessary vital context for what actually happened, which of course, you would hope the sheriff's office would have provided at the time when we were initially reporting it. And, hopefully, Flock knew that as well, but that's unclear because Flock hasn't responded to a request for comment.
Joseph:Oh, sorry. They've declined to comment.
Jason:Yeah. I tried to contact everyone involved in this case again. I called, you know, the sheriff's office, the DA. I sent them faxes because they all have, like, fax machine numbers on their website. No response to my faxes.
Jason:Actually, let me check my fax machine.
Joseph:I was telling you that, like, we should buy a fax, like, a physical fax machine, and then we should publish the number and see what we get. But I don't know, that might be a good idea for about five minutes. We'll see.
Jason:I know
Sam:exactly what we would get.
Joseph:Shrimp Jesus porn or something, in ASCII art. Yeah. Sent over a fax machine.
Jason:Yeah. Wow. No received faxes. Sad. Sad.
Jason:Yeah. So, I mean, I said this before, but, like, this is one of the craziest stories that we've done in a very long time. Like, the facts of the case are far worse than we initially reported, and I don't think we did anything wrong in our first story. It's just like we've gotten more context, more facts, and, like, we're reporting them out now. But, like, what happened here is the dystopian nightmare that people have been warning about, because it is indisputable that this woman's boyfriend or this woman's partner, who was later arrested for allegedly abusing and threatening her on the day that she got an abortion, he is the one who reported the abortion to the cops.
Jason:Like, that is why the cops were investigating this in the first place. So the idea that, like, a family member is looking for a woman who had an abortion or something is, like, not a good one regardless. Even if the family ended up, you know, like a different family member who's not mentioned anywhere was worried about her safety, it's like the anti abortion laws, like, enable domestic abuse for this reason because it's something that he was able to hold over her by using it to then, like, you know, report this to the cops or whatever. And then the other thing is, and this is separate but also worth considering, the sheriff that I talked to, again, has since been arrested for allegedly sexually harassing his female employees, saying, like, really crazy shit to them, harassing and threatening them when they, you know, made this public. And then while he was getting deposed by a grand jury, he allegedly lied about, you know, what he did.
Jason:And so he's been charged with aggravated perjury, is one of the things. So he's been charged with lying to a court. And as part of his bail, he's not allowed to use a bunch of different surveillance systems that the Johnson County Sheriff's Office has. Those are documents that I pulled yesterday. And so I don't know.
Jason:Like, crazy. It's just crazy. The facts of the case are very, very wild, and it's really hard to put a headline on this. It's hard to sort of say, like, it's hard to explain the story unless you know some of the background to it. So
Joseph:That's why the headline was so difficult. But I think we got a really good one in the end. But, like, yeah, it's hard because it's a very substantial update to a story, where you need to somehow communicate that previous story, but then you just have to accept, well, I mean, also this is important. This might be the first time that a lot of people have even heard of this. You know what I mean?
Joseph:And so we have to accommodate those readers as well. You have to try to be as clear as possible to everybody. But, yeah, it's very, very difficult for that reason. I guess the last thing I'll add is that I think one of the takeaways here was that Flock may not necessarily know what its law enforcement users are actually doing with the technology. I'm sure they do in plenty of cases.
Joseph:I'm sure they have really good relationships with various law enforcement officers who share in detail the missing vehicles, the missing people that they're locating, all of that sort of thing. But clearly here, there was some sort of disconnect between what Flock and the sheriff were saying publicly about the earlier narrative and then what was detailed in these documents, which they did not discuss publicly. So I spoke to some Flock sources, you know, current and former employees of the company, and I relayed basically the thrust of what this new narrative was. And one told me this update is so disappointing and then quote, as much as Flock tries to be good stewards of the powerful tech we sell, this shows it really is up to users to serve their communities in good faith. Selling to law enforcement is tricky because we assume that we use our tech to do good and then we just have to hope that they're right, end quote.
Joseph:And, yeah, that sounds about right. Of course, I don't know if that would really be the case here, but then you have very sensitive technologies like NSO Group breaking into phones and that sort of thing. The customers of those technologies do not want NSO Group or the surveillance contractor to know what they're doing because often, that's some very, very sensitive stuff, some very, very abusive stuff. Obviously, Flock and NSO are not in the same ballpark, but I think there's a fair comparison there about what you may or may not know about your customers. But I think this is a really good example of employees inside these companies, the CEOs of these companies, and then just the companies more broadly themselves.
Joseph:You don't necessarily know what your customers are actually using that tech for. Alright. We'll leave that there. I will try to find the outro script and then I'll play us out. As a reminder, four zero four media is journalist founded and supported by subscribers.
Joseph:If you do wish to subscribe to four zero four media and directly support our work, please go to 404media.co. You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week. This podcast is made in partnership with Kaleidoscope. Another way to support us is by leaving a five star rating and review for the podcast.
Joseph:That stuff really helps us out. Soon, I'll start reading through the Spotify ones. I've just been reading the Apple ones out. I'll go through those as well. Missus Bean, four zero four Media.
Joseph:We'll see you again next week.