The 404 Media Podcast (Premium Feed)

from 404 Media

ICE's Backdoor Into a Nationwide AI Surveillance Network


Episode Notes

This week is a bumper episode all about Flock, the automatic license plate reading (ALPR) cameras across the U.S. First, Jason explains how we found that ICE essentially has backdoor access to the network through local cops. After the break, Joseph tells us all about Nova, the planned product from Flock that will make the technology even more invasive by using hacked data. In the subscribers-only section, Emanuel details the massive changes AI platform Civitai has made, and why it's partly in response to our reporting.

YouTube version: https://youtu.be/d029G6SI0dA

Transcript
Joseph:

Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access into the worlds both online and IRL. 404 Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co. As well as bonus content every single week, subscribers also get access to additional episodes where we respond to their best comments. Gain access to that content at 404media.co.

Joseph:

I'm your host, Joseph. And with me are all of the 404 Media cofounders. The first being Sam Cole. Hey. Emanuel Maiberg.

Joseph:

Hello. And Jason Koebler.

Jason:

Hello. Hello. We have new Doom... oh, yeah. This is on the document. I should have waited for you.

Jason:

You're about to say it. You're about to bring it up, and I preempted you. We have a tank top up for presale on our Shopify page. This is by huge request. Lots of people actually emailed me asking for a tank top.

Jason:

I think it's really cool. It features the actual code from Doom. It's 404 ASCII art. Sam, you wanna talk about it? Because I feel like you were the sort of originator of this design.

Sam:

It was me and Ronan Wood, who's our designer of most of our merch, or he's the one who puts the cool logos onto the shirts and hats and things. It was, like, weirdly hard for us to find just, like, the raw Doom code, which is probably fully a skill issue. But he and I both were, like, hunting for days being like, where the fuck do they put just, like, the regular-ass code that is in everything, apparently? It's running on, like, refrigerators and toothbrushes and stuff. But, yeah, I think he found it, or one of us found it, without having to run Python or something.

Sam:

It was it was a weird process.

Jason:

It's been open source since, like, 1999 or something like that. But, however, I also went to go fact-check this after we had already placed the order with the printer, and I was like, where is the Doom code? What does it say? And it's on GitHub. There's many files, so I wasn't actually sure, like, which one to grab.

Sam:

It looks good. It looks good on the tank top. It looks fantastic, so check it out.

Jason:

Yeah. They put Doom on a shirt.

Joseph:

That's good. And it's open source, as he said. So, id, please don't sue us. I mean, they're not gonna do that. And I'm just gonna cover our asses now.

Joseph:

Doom: The Dark Ages, really, really good game. See, now they can't be mad at us.

Emanuel:

You know? It's funny. 404 Media endorses The Dark Ages. Yeah.

Joseph:

We don't really do endorsements, but that's, like, the one we're probably gonna do. Alright. Emanuel, do you wanna take this first story?

Emanuel:

Yeah. So our first story is from Jason and Joe. The headline is "ICE Taps Into Nationwide AI-Enabled Camera Network, Data Shows." You guys worked on this for quite a while, and the camera network in question here is Flock. It's a service called Flock, which we've covered several times over the years, but I'm sure not everyone is familiar with it.

Emanuel:

Maybe let's start with what is Flock? How does it work? How common is it?

Jason:

Yeah. So Flock is an automated license plate reader camera. It basically, like, sits at different intersections or just different places on a road, and it scans the license plates of cars as they drive by. And Flock, I believe it's the most popular one. At least it's, like, one that's used by different homeowners associations and neighborhood watch groups.

Jason:

You know, there's also a Motorola one that is extremely widely used, but Flock is sort of like a Silicon Valley startup whose CEO has said that he wants to, like, use these to get rid of crime everywhere. And so, like, that's the pitch. It's like, put these in, and over time, we will completely eliminate crime, which is obviously, like, a very, very, very lofty claim. And so over the last few years, they've kind of gone state by state, city by city, trying to sell these both to, like, neighborhood watch groups and shopping malls and places like that, but then also, of course, to the police, to local police. And so over time, there has become a very large network of Flock cameras.

Jason:

So it's not just, you know, the city of Dallas that has access to Flock cameras. It's like, once you're in it, you can query the cameras of other states and municipalities. You can do either a statewide search or you can do a nationwide search. And so in some cases, like, say you're a local police department, you can say, hey, I'm looking for this license plate of a stolen vehicle, and you're not just searching the cameras that you have personally bought and, like, paid the subscription fee for.

Jason:

You can search all the ones in your state, all the ones in the country if you want to. So it's become this massive network of tens of thousands of devices, and you really, like, can't drive in many major cities, and even some small towns at this point, without driving by one of these things.

Emanuel:

The AI of it, is that the ability to find and identify the license plates? Is that the AI portion of it?

Jason:

I think it's that. I think it's the fact that it's all sort of, like, connected in one of these larger networks. And then also in the second half of this show, spoiler alert, Joseph will talk more about features that are being added to Flock that are going to make them smarter and, I guess, more AI-ified.

Joseph:

Well, it's that. And you can be like, I want to search for all red cars that were in this area, and there's some sort of object recognition there. And, like, yeah, some people may be like, oh, that's not really AI. It's like, well, yeah, it's not ChatGPT, but it is AI or machine learning, I guess. You know?

Emanuel:

So what is this data that we got?

Jason:

Yeah. So very crucially, there is something that cops who have Flock are able to look up themselves, and it's called an audit report. And, basically, this audit report allows the cops of any given city to find out how many times their cameras have been searched. So as I said, you can either opt into a nationwide network or a statewide network. And in this case, the Danville, Illinois Police Department has opted their own cameras into both a statewide network and a nationwide network of Flock.

Jason:

Meaning that any, I don't know if it's any, but many, many, many different police departments from around the country, when they are searching for a license plate or searching for, like, whatever they're doing, like, when they're doing a Flock query, they are pinging the Danville Police Department's Flock cameras. They're pinging their database of, like, whatever cars have driven by them. And so this is a feature that Flock cameras have that allows police to do, like, compliance stuff. Basically, if someone says, like, a city council member says, show me all the times that our Flock cameras were accessed, they can go in and download this data. And so some researchers filed a public records request with the Danville Police Department and got all of the times that their cameras had been queried over the last year, I believe. And it was, like, 6 million times.

Jason:

And so this just means that they were effectively grabbing what we believe to be a pretty comprehensive picture of how police are querying Flock cameras all around the country. And basically, there's a massive CSV spreadsheet file. And, like, for each individual entry, it says the police department that searched Flock's network. It shows, like, how many devices were searched. In this case, it was sometimes, like, 40,000 or 50,000 different devices across the country, which we believe means, like, 40,000 to 50,000 different cameras across the country.

Jason:

And then, crucially, they need to put a reason for doing that search. And for many of these, this just says, like, stolen car, property theft, things like that. But for a lot of them, for between four and five thousand of them, it says immigration, or it will say HSI, which is Homeland Security Investigations, which is a division of Flock, or sorry, it's a division of ICE. Sometimes it just says ICE.

Jason:

Sometimes it says immigration. And then sometimes it says ICE plus ERO, which is Enforcement and Removal Operations, I believe. I don't have it in front of me right now, but it's basically the division of ICE that does deportations. Crucially, there were, like, some searches from the Dallas Police Department that were doing that. So, basically, this shows that local cops around the country were searching Flock cameras to do immigration enforcement.
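
A rough sketch, in Python, of the kind of filtering described above. The file name and column names here are assumptions for illustration, not Flock's actual audit report schema:

    import csv
    import re

    # Hypothetical file and column names; Flock's real audit export may differ.
    PATTERN = re.compile(r"\b(immigration|ice|hsi|ero)\b", re.IGNORECASE)

    with open("flock_audit_report.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Keep any search whose stated reason mentions an immigration-related term.
    hits = [r for r in rows if PATTERN.search(r.get("reason", ""))]

    print(f"{len(hits)} of {len(rows)} searches list an immigration-related reason")
    for r in hits[:10]:
        print(r.get("agency", "?"), "|", r.get("devices_searched", "?"), "devices |", r.get("reason", ""))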

Jason:

And, crucially, they were doing it on behalf of the Department of Homeland Security, HSI, and also ICE.

Emanuel:

So if I'm the FBI and I'm doing an investigation of Sam, Sam has committed some horrible crime, and I want call records, I can go to AT&T with a warrant and be like, show me Sam's call records. That is not exactly what we're seeing here with Flock. Like, how would you describe the process by which ICE or other police departments are getting this license plate data?

Jason:

Yeah. So a few things. One, ICE does not have a contract to use Flock. Like, neither does the Department of Homeland Security, of which ICE is a part. So right now, the feds supposedly do not have the software that allows them to do these queries directly.

Jason:

So they can't, like, open up a Flock dashboard and say, show me all the license plates that drove by this specific intersection or whatever. Like, they don't have that ability. They haven't procured that. They haven't procured Flock, which has, you know, a process where they're, like, taking bids and purchasing the software, and then they're, like, kind of legally allowed to use it because they bought it. What's happening here is local police are doing these searches for ICE and or the Department of Homeland Security.

Jason:

And we talked to, like, I emailed, I think, like, 25 different police departments. Joseph emailed probably, like, 20 of them. And what they're saying is, in some cases, these are, like, formal requests from the Department of Homeland Security. Like, the feds are saying, hey, look this up.

Jason:

We're trying to solve this crime or whatever, and they're doing that. In other cases, they said it was, quote, informal, meaning someone just, like, asking for a favor. And what's notable is that police feel like they don't need to get a warrant to search Flock because it's, like, a service that they're paying for. And we see this with a lot of surveillance technology where, like, if you're buying it from a private company, they feel like they don't need to get a warrant. There are, like, various court cases right now from people arguing that they should need to get a warrant, but currently, cops are not getting warrants.

Jason:

And so what's happening is basically, like, the feds are getting either backdoor or side door access to Flock's system without paying for it. I say backdoor or side door because, like, experts I talked to called it both of those things. There's no real, like, distinction here. But, basically, they're getting, like, unofficial access to this incredibly powerful surveillance network through local police, and there's, like, no oversight of this whatsoever. There's, like, no meaningful oversight.

Jason:

And so, like, sure, I believe that ICE could probably enter a contract with Flock and get direct access to this. Like, there's nothing that I know of that would stop them from doing that, but they haven't done that yet. And this is a really powerful network, and they have access to it through this, like, pretty unofficial mechanism.

Emanuel:

So why do you think it matters that ICE has this access? Is there, like, a legal implication for them being able to do this? Like, doesn't the fact that they're getting this informal access indicate that they're unable to do it officially at the moment? Like, you go, in the story, into quite a bit about what local agencies, not ICE, are able or not able to do in terms of helping ICE with immigration enforcement.

Jason:

Yeah. So a few things. One, I've learned over the last, I mean, a lot of people probably have known this for a while, but I've learned reporting on this and the Massive Blue story that we did, like, a month ago, that generally local and state police do not have jurisdiction over immigration matters. Like, they are not allowed to say, hey, you're in the country illegally.

Jason:

We are going to detain you. Like, that is specifically a federal function. But what Trump has done and what Trump has become very, like, obsessed with is the idea of empowering local police to enforce immigration laws. And they've done that through this program called 287(g), which is just, like, an ICE program that different police departments have to apply to, and then ICE says, okay, we are empowering you.

Jason:

It's like a delegation is what it's called. We're delegating to you the authority to conduct some, like, immigration enforcement type stuff. So the concern here is that, like, this incredibly powerful nationwide network of license plate cameras is going to be used to, you know, pick up people for ICE, but also that this surveillance network has been built. This, like, apparatus has been built largely by going community to community and saying, like, we're gonna use this to find stolen cars, or we're gonna use this to, you know, help prevent or solve really violent crime, things like that. And, like, Flock is in a lot of liberal neighborhoods, largely according to both what we've reported on and the experts that I spoke to.

Jason:

Like, a lot of them were saying it's like rich people trying to protect their homes and stuff like that. And by and large, like, at city council meetings where they discuss, like, should we buy this? Should we not? They're like, oh, we're gonna use this to solve violent crimes. And it seems to me based on talking to different police departments, that a lot of them don't really understand that they have, like, accidentally opted themselves into this large surveillance network that is being used by cops all over the country for all sorts of things.

Jason:

And they don't even know what it's being used for in many cases. Like, I talked to some of the cops about different searches that they ran, and they were like, oh, yeah, someone on our force did that search, but we don't, like, exactly know why.

Emanuel:

So can you talk a bit about, because you point out the legal issue with them helping ICE and federal agencies with immigration stuff. You're pointing out that the reason field in the searches very clearly says that these are immigration or ICE issues, and they say what? Like, what are the responses to that contradiction?

Jason:

Yeah. I mean, it ends up being, like, quite a complicated story in some ways, because this came from Illinois, which is one of the few states that actually explicitly bans their police from working with ICE. They have a law that basically makes it, like, a sanctuary state. And so, like, all the cops there are like, oh, no. No.

Jason:

That wasn't for immigration. We're not allowed to do that. We comply with Illinois law. But then I show them, like, a spreadsheet thing that says, well, it says here that you did this for, quote, immigration violation or something like that. And they'll be like, oh, yeah.

Jason:

But that was, like, a specific criminal case. And the person might have been illegal or undocumented or whatever, they might have had, like, a not legal immigration status, but that's not what it was for. And in many cases, they weren't able to tell me what it was specifically for either, because they didn't know. Like, some of them said straight up, like, we actually don't know why this search was run, and we don't know why it said immigration there. And I believe it's because access to Flock is just, like, so widespread.

Jason:

It's like, it was described to me by some of the experts we spoke to as, like, a Google search engine for cars. So it's like, oh, I'm doing this case, I'm gonna go on Flock and, like, type in what I'm looking for. And because there's, like, very few restrictions on how it's used, cops are just using it for, like, whatever. And it's like, if I ask you why did you Google that thing three weeks ago, you might not be able to tell me specifically what it was for.

Jason:

Or maybe even more to the point, if someone asks me why Sam or Joseph Googled something, I'll be like, maybe it was for this story, but I have no idea what it was actually for. And so it seems like access to this, like, massive surveillance network is now so commonplace that the cops don't even know, like, why they are searching it half of the time. And I just think that that's, like, very notable, actually, because we've built a system that can track the movements of anyone with a car in the United States, and many of these communities are unlivable without a car. Like, you need a car to go anywhere. And at first, this was a technology that was built for local police to, you know, again, find stolen cars, to find, like, robbers and murderers and stuff like that.

Jason:

And now it's being used in this way to help the feds under an administration that has become obsessed with undocumented immigrants. And so we've, like, built this, like, surveillance apparatus, and it's just being used in, like, a really commonplace way in a way that I find to be, like, really concerning.

Emanuel:

I think the...

Jason:

I don't know if that answers your question. I don't remember what your question was, but, like, that's why I think the story is important.

Emanuel:

The comparison to Google, I think, is very good and a good transition into the next story we're gonna talk about. But I guess just before we go to that, back at Motherboard, we did a bunch of stories, most of them from Caroline Haskins, who is now at Wired, about Ring. And I see a lot of parallels with Ring in the sense that it is a device that is sold to groups or individuals. And as the product is getting popular and widespread, people don't realize that they're building out this network that can be accessed by cops. How would you say this compares to Ring in terms of, like, the surveillance dystopian aspect of it?

Emanuel:

Like, worse, comparable?

Jason:

I think it's an apt comparison in terms of, like, the business model and sort of how it works. I think Ring didn't ever have a, like, they haven't sold tons and tons of Ring devices to cops directly. Like, that was something where they were selling it to homeowners, and then homeowners were opting into this system where cops could access it. And

Emanuel:

But cops were like pitching Ring to communities. Yeah.

Jason:

Yeah. Because cops were basically trying to get this, like, ad hoc network of just, like, thousands of Ring cameras in an individual community. And that was, like, sold originally as, like, we will use this to make sure that people don't steal your Amazon packages, and then it became something larger than that. And there's been all sorts of privacy concerns with Ring, security concerns with Ring. We don't need to get into it.

Jason:

But, basically, it started out as something that was, like, for consumers and then was used by cops. What Flock is, it was initially pitched to, again, like, neighborhood watch groups and homeowners associations, and they can share that information with cops. But now I believe, I'm not positive that this is the case, but I believe that the vast majority of Flock cameras are sort of, like, bought and operated by cops. So it's become more of a specific police technology. But the nationwide network thing is really very important and very, very similar.

Jason:

I think Flock is actually maybe more concerning than Ring because all of the systems interconnect. So, like, I don't know. You could track a person across the entire country if they were to drive across the country, and you could track their movements and things like that. And it's funny because a lot of the cops say, well, we're not tracking people. We're just tracking cars.

Joseph:

And it's like, well, who...

Jason:

...do you think is in these cars?

Emanuel:

Yeah. Cars that specific people own. Yeah.

Jason:

Yeah. And that'll go directly into Joseph's next story. But it is very much, like, you cannot move around this country without being tracked passively. And I should have said this at the top, because every time we talk about Flock, I feel like most people understand the issue or, like, what is being built here, but it's like, these are not speed cameras. They have nothing to do with speed cameras.

Jason:

They have nothing to do with red light cameras. They catch every single car that drives by the camera, and they add it to, like, their database with a time stamp and all that sort of thing. It has nothing to do with, like, oh, well, just don't speed or just don't run a red light and you'll be fine. It's like, there's a database of your movements if you drive by one of these. And we've written about it before, but there's an open source project called DeFlock that is trying to track where these are.

Jason:

And if you look at that, you'll see they're in, like, every mid-sized city, tons of suburbs, lots of big cities. They're in many, many, many US cities.

Emanuel:

So we'll leave that there, as Joe says, and we'll take a break, and then we'll come back with the other flock story.

Jason:

And we are back with a story that Joseph did that's extremely related. This is a Flock mega episode. We should have said that up top. But the headline is: license plate reader company Flock is building a massive people lookup tool, leak shows. So this leads on to some of the things we were talking about in the first segment about tracking people versus cars, tracking cars versus people, the AI parts of Flock, and things like that.

Jason:

Joseph, this story is about Nova, which is a new tool from Flock. What is it, and how are they marketing it to cops?

Joseph:

Yeah. So Nova is this new product which I believe they're rolling out, or at least advertising and marketing to cops, at the moment. And it's almost like an add-on to the Flock network. So if you thought that automatic license plate reading technology, as you say, following cars and by extension drivers all across the United States, was invasive enough, or maybe should require a warrant or whatever, this is now going to add additional information to those searches. And I'm looking at the Flock website now, with a page about Nova, and it says you can, quote, see the full story, connect people, vehicles, and locations across agencies.

Joseph:

Nova helps you solve crime and prevent the next one faster, and it's supposed to bring all of your agency's data into one place, you know, so video and 911 calls and, of course, the Flock automatic license plate reader networks as well. But scrolling through this page like I am now, as I said, it's kind of all marketing speak. It doesn't actually really talk about what the Nova tool is or what data it uses, and that is what this story is about, which is based on a leak that we got from Flock.

Jason:

Yeah. So actually, was the Nova page up by the time we did this article?

Joseph:

It was. I'm not sure if we actually quoted it explicitly. Maybe we did in an earlier draft.

Jason:

Well, there's almost nothing about it, as you said. Like, what a next gen public safety data platform actually is wasn't known. And so you, you know, had a source who leaked you information about what it is, actually some information from within Flock and the problems that people had with it at the company. So, I mean, what sort of data is included in Flock, or sorry, in Nova? And so what is this product?

Joseph:

Yeah. So as you say, the marketing material is really sparse, but then when you go through the leak, I mean, there's some, I would say, incredibly interesting and illuminating stuff in there. The first being that Nova plans to at least use hacked data. So that is data from breaches that's been published online, and that could then be married with the automatic license plate reader data as well. And one concrete example they give is ParkMobile, an app that was hacked a while ago.

Joseph:

You actually use ParkMobile. Right, Jason? What is it?

Jason:

ParkMobile is a smartphone app that you use to park mobily off your mobile device. No. So...

Emanuel:

That's the case.

Jason:

A lot of states, a lot of cities have replaced their parking meters with ParkMobile. So, basically, you put in your license plate, and then when you park in a specific spot that would normally have a parking meter, or, like, a publicly owned parking lot, you say, hey, I'm parked here. I'm parked in this spot. Give me an hour.

Jason:

Give me two hours, and you pay with Apple Pay or whatever. Right. Some places also, like, people don't have coins anymore. So a lot of, like, the coin operated parking meters will have an option to use ParkMobile instead of that. It's extremely common.

Jason:

It's very, very, very common. I see it all the time as a driver.

Joseph:

And when you sign up, you presumably have to tell them about your car. Right?

Jason:

Yeah. They need to know what car it is because otherwise, you're gonna get a parking ticket. Like, someone is gonna come by and give you a parking ticket. So you have to put in the make, model, and, usually, the color and the license plate.

Joseph:

Right. Which is all information which is very, very interesting to Flock. Right?

Jason:

So I wanna highlight here, you just said it, but this is hacked data. Hacked data from ParkMobile. It's like, Flock didn't go and buy this from ParkMobile, did they? From what we can tell?

Joseph:

I'm not exactly sure where the ParkMobile breach was posted, but given the date, which was 2021, that was the date of the breach itself, it was probably on BreachForums or one of these other pretty low-level hacking forums where stuff often gets published publicly. So there's this 2021 breach of ParkMobile. That data includes people's email addresses, phone numbers, and mailing addresses. Now Nova, this new add-on from Flock, is planning to marry that data with the automatic license plate reader information.

Joseph:

So let's say you type in a plate number, you look it up, and you're like, okay, that car drove all across Chicago or whatever. Well, maybe the tool can return who that car actually belongs to instantly. And maybe cops can go figure that out themselves, and they do that all the time. Right?

Joseph:

They'll get a plate and they'll go off to the DMV and figure out, well, who's that vehicle registered to, and that sort of thing. But this is allowing Flock Nova users to do it basically instantaneously, at least potentially.
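
To make the idea of marrying these datasets concrete, here is a purely illustrative sketch of what an instant plate-to-person lookup could look like. The records and field names are invented for the example; this is not how Nova is actually built.

    # Purely illustrative: invented records standing in for ALPR hits and breached account data.
    alpr_hits = [
        {"plate": "ABC1234", "seen_at": "2025-05-20T14:03", "camera": "Chicago-Main-St"},
        {"plate": "ABC1234", "seen_at": "2025-05-21T09:47", "camera": "Chicago-Lake-Shore"},
    ]

    # A breach like ParkMobile's ties a plate to a person's contact details.
    breach_records = {
        "ABC1234": {"name": "Jane Doe", "email": "jane@example.com", "phone": "555-0100"},
    }

    def lookup(plate):
        # Join camera sightings with breached account data for one plate.
        person = breach_records.get(plate)
        sightings = [h for h in alpr_hits if h["plate"] == plate]
        return {"person": person, "sightings": sightings}

    print(lookup("ABC1234"))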

Jason:

And it has, like, your email address and things like that as well, because you have to make a ParkMobile account. So, I mean, I don't actually know, because we haven't seen Nova in action, but I could imagine it correlating your car not just with your identity, but also with, like, your contact information, which could be very valuable for police who are investigating you.

Joseph:

Yeah. And, I mean, again, you stressed it, but I will do that as well. It's a trend of companies taking hacked data from, I mean, I hate to say it, the dark web, or, you know, it's just posted online or something. They then maybe clean it up if necessary. They bundle it together with other data, and then they sell that to law enforcement.

Joseph:

We did a story years ago back at Motherboard when we were there. I'm looking at it now, from July 2020, and that was some company, I think called SpyCloud, and they would have breached passwords and all of that sort of thing, and then they were selling access to that tool to law enforcement as well. And that was, I think, more for uncovering people's identities online, you know, maybe in cybercrime or in child abuse investigations, that sort of thing. This is way more physical, because you are marrying somebody's movements, their vehicle, with their identity. And of course, there's lots of other data that I'm sure we'll talk about as well.

Joseph:

It's not just the hacked stuff.

Jason:

Right. So what else is in here, though? Because it's not just ParkMobile. It's other things too.

Joseph:

Yeah. There's two other main types, and one is, quote, commercially available data, end quote. And that comes up a lot, especially when we've written about location data over the past several years, and there's all of these data brokers out there where you can just go and buy information from them, essentially. And the leak we got specifically mentioned Equifax and TransUnion. And I'll say straight away, Equifax didn't respond to a request for comment, and TransUnion denied having any business relationship with Flock.

Joseph:

That being said, this tool is still being developed and worked out. Right? The sort of data you would get from an Equifax or a TransUnion, like a credit bureau, is that in the United States, when somebody opens a credit card or a line of credit, they provide their physical address, their contact information, their name, all of that sort of thing. This is called the credit header. It's like the PII bit of your credit report that then gets sent to TransUnion, Equifax, etcetera.

Joseph:

And sometimes they then rebundle that and sell that or transfer that to other people. One example is TransUnion's tool, TLOxp, which they sell to, like, PIs and stuff like that. So you can definitely see how that sort of information would be very useful to Nova or a Nova-like tool, where, okay, I have the license plate, I figured out the identity, now I can get their mailing address or this other personal information that they handed over to TransUnion. And then the third one, which we didn't really get into much in the piece because it's sort of mentioned in passing in the material we got, is public records. And that's stuff like marriage licenses, property records, even campaign finance records.

Joseph:

So you look at somebody's vehicle and then what? You're seeing who they're married to? And, I mean, that's explicitly the goal: to get not just the driver, but the people they are associated with as well.

Jason:

Yeah. I think that's an important point, because this administration in particular has been really obsessed with, like, figuring out networks of people and who they've associated with and things like that. Okay. So besides, like, figuring out what this is, you also, you know, had some people inside Flock talking about how they felt about it. You know, you had, I believe it was, leaked Slack messages.

Jason:

That's right. Isn't it? Yeah. Like, what were people inside Flock saying about this?

Joseph:

Sure. Yeah. It was leaked Slacks, and there was also audio of a meeting. And this meeting, I think, was very, very telling, because it really crystallized what Flock imagines this product could be. Because it's all fine to say, oh, we're gonna take data from here and marry it to this data, and that's, like, you know, a little bit technical and maybe it's not really on the nose enough for people.

Joseph:

But one of the quotes from this audio is, quote, you're going to be able to access data and jump from LPR, license plate reader, to person. So it's incredibly explicit in its goal, that we're not just tracking vehicles anymore. We're tracking people, and that's the entire thinking behind this tool. And then in some of the Slack chats, I don't think I'll go into them in particular detail, but, you know, there were people worried about what exactly this tool is, what it's gonna be capable of, and especially the hacked data stuff. There were people talking about, like, well, if Flock got hacked, would it be okay then for another company to use our data?

Joseph:

You know? Like, you can't really take hacked data from the Internet, and then if it happened to you, complain about it as well. Because as you said, you know, it's just wild that a company is taking hacked information in the first place at all.

Jason:

Right. And I think that brings us to the last question that I have for now, which is basically, like, we have these cops saying this actually just tracks cars, it's not anything more than that. And yet here you have Flock saying publicly it's a next gen public safety data platform, and it's really making this network much more powerful, I believe. So, I mean, is this more invasive than just having ALPR data on its own?

Joseph:

Yeah. I think one of the hard things about covering automatic license plate reader cameras is that people will often default to, well, it's just my car. And, like, as you said in the previous segment, some people think it's just about speeding or running a red light or something, which is crazy to me. I hadn't actually heard that complaint. So do people actually say that to us, Jason?

Joseph:

Do people write in and say that?

Jason:

I see it a lot on social media.

Joseph:

Okay.

Jason:

Just as in, like, I'll publish a story about Flock and then people will, like, quote tweet it and say, oh, well, just don't run a red light then. Like, people still don't understand what they are, even though this technology has now been around for, like, quite a long time. And we've written about it a lot, but so have a lot of other outlets as well.

Joseph:

Yeah. And I think that's a challenge of covering it, because people might think that, or they'll think, well, it's just tracking my vehicle or other people's vehicles. You know, you could just take a bus or something. I don't know. But I disagree with...

Jason:

The other argument is that, like, well, don't own a car, bike everywhere, and walk everywhere,

Joseph:

which is, like, completely impractical.

Jason:

I mean, it's just like, I'm pretty anti-car, like, in concept, but it's just not possible in many, many, many suburbs and cities. Like, the United States is not designed for that at this moment, and, like, that's one of the reasons, you know, our culture is bad, etcetera. But it's like, people have cars because they, like, need them to get to work and things like that, because they live in places where it's impractical to take public transit or bike.

Joseph:

Yeah. So I already think the license plate reading technology is incredibly invasive. We did a story back at Motherboard where a source had access to a different tool, used, actually, I think, by federal law enforcement and private investigators, and they looked up, with consent, a specific person. And, like, it maps out their entire movements. So it really, really was incredibly invasive.

Joseph:

I already believe that. Now you add on something like Nova, which is taking all of this different data from around the web, and potentially other sources as well, and marrying it with the license plate reader data. I mean, it's absolutely more invasive. And I think it does bring up the question again of, like, should authorities get a warrant for this? Because in the vast majority of cases, they don't.

Joseph:

You know? And as you said, there are lawsuits going on arguing that's unconstitutional. And if that's already unconstitutional with just the license plate reading data, maybe it's more significant with this stuff as well.

Jason:

Yeah. Alright. Should we leave it there? And I'll give it back to you to say the end parts, because I don't remember what they are.

Joseph:

Of course. No, I'm happy to do that. If you are listening to the free version of the podcast, I will now play us out. But if you are a paying 404 Media subscriber, we're gonna talk about Civitai, the AI platform that Emanuel has covered an absolute ton, and some truly radical changes it has made, much of it in response to our reporting and the actions of payment processors as well.

Joseph:

You can subscribe and gain access to that content at 404media.co. We'll be right back after this. Alright. And we are back. Sam, I'm gonna take another break.

Joseph:

Do you wanna take over to ask about Civitai? Yes. Civitai. Right? I will, I will.

Sam:

That's how I say it. Yeah. In my head, that's how I say it. Maybe Emanuel says it differently. Civitae.

Sam:

Civitae. Civitae.

Joseph:

Good.

Sam:

Yeah. So what we're gonna be discussing now is a story by Emanuel. The headline is "Civitai Ban of Real People Content Deals Major Blow to the Nonconsensual AI Porn Ecosystem." There's a lot going on in this headline, especially if you haven't been following the Civitai saga. So, Emanuel, do you wanna just start by defining what Civitai is?

Sam:

Kinda give us a little bit of background of, like, what people do with it, what your obsession is with it. You've written so many stories about Civitai over the last two years, I feel. But, yeah, this has been something that you've been on, you've been on them since the beginning. So give us a walkthrough.

Emanuel:

Yeah. So sorry, I'm gonna monologue a bit. But... Please. Civitai is a website for sharing custom AI models that generate images or, now, video.

Emanuel:

It is mostly based on Stable Diffusion, which is an open weights AI model. People have figured out a way to quickly modify them. So you take the generic Stable Diffusion model, you feed it a couple of images of something you want it to recreate, and it becomes very good at recreating the likeness of a person or a style or a thing. When we launched 404 Media, we each kinda went into our corners and thought about what are the stories that we think are most important.

Emanuel:

And when I was looking at the landscape of nonconsensual content, I thought that Civitai was a very important, if not the most important piece of it. There are AI models. There are AI tools. There are communities, Telegram channels, Discord channels, specific developers, GitHub pages. Like, there's this entire ecosystem that enables this content.

Emanuel:

But if I was to pull out one Jenga piece that I think is the most important and focused on nonconsensual content, it would be Civitai, because it makes it so easy to make and share models that recreate the likeness of real people or specific sex acts. Civitai has always had the policy that you're not allowed to combine those two things on its site. You're not allowed to post images that are nonconsensual, and you're not allowed to share models that do that. But as I've reported since 2023, all people have to do is grab the sex model and the model of a celebrity, put them together, and it makes it really easy. And that's where a lot of this content comes from.

Emanuel:

It comes from the models that are shared on Civitai, and that's kind of why I've been all over it this entire time.

Sam:

Yeah. So what happened in the last couple weeks? Because the user base has been a little shook, as have the admins of Civitai, by some of the changes occurring. So what's going on there?

Emanuel:

Yeah. So this latest story I wrote is, like, the fourth I've written in the past two weeks, because there's been a flurry of activity. The first thing that happened, Jason actually flagged to me, and that was Civitai put out an announcement that it was introducing new policies about what content was allowed on the site, and it banned a bunch of models and images that we've seen other adult websites ban before. So it has to do with, like, blood and guns and diapers, because diapers imply people who are underage. And we've seen, Sam, you've reported on this.

Emanuel:

I think it was Patreon, maybe, that was banning content that showed, like, adult-context diaper use. And I reported on that, and what Civitai said at the time is that they were doing it for the same reason that other adult websites were doing it, which is payment processing service providers. So whatever company you use in order to take people's credit cards, they made the demand that Civitai make those changes. Otherwise, they wouldn't process payments. If they can't take credit cards, that makes it much, much, much harder to run an online business.

Emanuel:

You can use cryptocurrency. You can use other things. But as we've seen with Pornhub, who's been cut off for, like, five years now or something, it just, like, really hurts your bottom line. So Civitai says that. I then also, completely by accident, found out that their on-site video generator was allowing people to make, like, the most horrible type of nonconsensual content that exists online, and made it very easy, and it was making money off of that.

Emanuel:

They shut down the ability to do that, but not before this payment service provider, which Civitai hasn't said who it is, cut them off. So at the moment, there's no way to use a credit card to pay Civitai if you want a membership, if you want to buy their currency, which allows you to generate images. The only way to do it at the moment is with cryptocurrency. And they say that they have another service provider in the wings that is going to come in, I think, in a week. But, apparently, what this most recent story shows is that that provider demanded that Civitai remove all models that allow people to recreate the likeness of real people.

Emanuel:

So before Friday, if you went to Civitai and you typed in Taylor Swift, you would get dozens and dozens of models that are dedicated to making images and videos of Taylor Swift. If you typed in the name of any celebrity, any YouTuber, any Twitch streamer, any politician, just like pages and pages of models dedicated to recreating their likeness, all of those are gone now.

Sam:

You mentioned this in the story, but do you think this is related at all to the Take It Down Act being passed, or is that just a coincidence? Like, why now, I guess, would be my big question about the payment processors deciding to draw a line and take a stand on this.

Emanuel:

It's impossible to say exactly why. Civitai has cited the Take It Down Act. They cited an EU AI act that passed last year, or was enacted last year, but they always mention at the same time that it's because they are talking to these service providers, and that is the real pressure. And I say this because they say so, and also, Sam, mostly from your reporting, we know that that is when websites really have to make changes, because it alters your business entirely to be cut off like that. Why the payment service providers are doing this is also impossible to say for sure.

Emanuel:

I've reached out to all the service providers to ask if, a, they're the service providers, and if they are the ones making this demand, and, understandably, no one has stepped up to say that it is them and this is the reasoning. I will say that all the information about the harm that Civitai causes was exposed by our reporting. There's no other website that has made this clear, and Civitai has done a pretty good job of not making this apparent to a casual visitor to the site. The only way you know how critical of a role it plays in the nonconsensual content ecosystem is either you're in that community already, as I am, in various, like, Telegram and Discord channels where people discuss it, or if you make an account on Civitai, change your settings to view not-safe-for-work content, and then go poking around. As we've said many times before, if you just go to Civitai and look at the front page, it's images of, like, cute cartoon characters and spaceships and stuff like this.

Sam:

Which people used to do all the time as some kind of, like, stupid gotcha to your reporting. They would be like, I'm looking at the site right now, and I don't see anything that you're talking about. And it's like, well, we didn't make it up, first of all.

Emanuel:

Yeah. I went to the Monster Crunch restaurant, and they were serving meatballs. So...

Sam:

Right. Yeah. Exactly. It's like, I don't know, how simple of a workaround is that? Just log in and turn on your not-safe-for-work settings.

Sam:

But yeah. I mean, do you think this is gonna cut off, because you mentioned this is kind of the Jenga piece of, like, the nonconsensual porn ecosystem, but do you think this is gonna cut everyone who's participating in this off from being able to do this anymore? Like, what are they gonna do? Are they gonna migrate? How are they coping, I guess?

Emanuel:

So in the immediate, it definitely has impact. And I know that because, again, I'm watching what the people who make this content say, and they're all like, oh, fuck. They're all like, oh, no. My models, they're already gone. It's too late to grab them.

Emanuel:

I rely on this for this or that service. I make money selling images, like, using these models. I'm screwed in the immediate, like, short term. That doesn't mean that the content will stop, a, because a lot of the models have already been used to create, let's say, a Telegram bot. Right?

Emanuel:

So it's like, you took the model from Civitai. You made it possible to create nonconsensual AI-generated videos of people. That bot is already up and active. You downloaded the model, so you don't need it from Civitai anymore. So that's not going to stop.

Emanuel:

Creating new ones, sharing more models, like, there's been a huge interruption in that, but I think it would be very naive to think that this will stop it. There are already, like, a bunch of sites that positioned themselves as Civitai competitors shortly after Civitai got popular, and they're still hosting a lot of bad models. And I'm sure that, like, the top creators of those models from Civitai are gonna go to the competition and try to monetize their work there. And now we enter some world of whack-a-mole where, I don't know, I guess I'll do stories about those platforms as well and see if they're getting cut off. So it's one of those cases where, again, I know you're very familiar with this.

Emanuel:

It's like you take down the biggest, worst actor in the space, but it's like other people will step in to fill in the void. But it's, like, slightly not as bad as it used to be.

Sam:

Yeah. And, like, I guess part of the context here that is pretty interesting, and maybe answers or at least sheds some light on some of these questions, or makes it interesting if people are fleeing this site en masse, is that it's backed by a16z, which is Andreessen Horowitz, which is, like, this kind of kingmaker startup VC. What's the word? Like, an incubator, I guess. Right?

Sam:

Or they fund, they fund startups. So they just throw money at things like Civitai to... Venture capital fund. Venture capital. Yeah. They let them get their feet, and then it'll be profitable.

Sam:

So Civitai raised $5.1 million in a round in 2023 that was led by a16z. I mean, how do you think that relates? Because I definitely see it being a problem for their investment if they can no longer make back that funding that they poured into this. And, also, it's just like, you know, a major venture capital firm is basically endorsing this nonconsensual AI porn engine and has been willingly for years.

Emanuel:

Yeah. I think that's why I thought it was really interesting news that we broke back in the day that they were backed by a16z. I think a16z doesn't really care about $5.1 million in a round that they led, which means it's not entirely their money, and it's just not a lot of money to them. But I do think the reason I wanted to talk about this again is that all these posts around all these policy changes for Civitai include some sort of statement that is like, this is not the end. We're not shutting down.

Emanuel:

And the reason for that is that despite the CEO of the company, Justin Maier, repeatedly saying that it's not a porn site and that is, like, not the majority of their business and not why the site exists, people know that that is the attraction. And while I was reporting these stories, I also saw that they had, like, a financial transparency report. And from reading that report, which came out in January, the site is operating at a loss. So Civitai is paying so much money for the compute, for, like, the time on the GPUs that they're renting in order to generate all these images, in order to train all these models. They're spending more money on that than they're earning from the money people are paying to use that service, so they're burning through cash.

Emanuel:

Now, I asked Justin Maier, the CEO, about this. He says that since they put out that report, they've been, like, breaking even, and some months they're, like, under budget, and some months they're over budget. Sometimes they're in the black, sometimes they're in the red. But clearly, operating at this scale at a loss, they would not be able to do this without the investment from a16z, which means that this critical Jenga piece that is so important to this ecosystem that is causing all this harm that we've been talking about for two years is very much enabled by this investment by the biggest venture capital firm in the Valley, which is crazy to me. And I think, honestly, if I was to bet on it, I don't think they're gonna re-up on this investment.

Emanuel:

I don't think they wanna put more money into what looks like a pretty big mess. I mean, I'll be delighted if they did, because it would be such a crazy, crazy story. But I don't know. It's just like, they're very culpable in this, like, entire nasty ecosystem that's enabled by Civitai. It's just, somebody paid for that compute, and it was Andreessen Horowitz.

Emanuel:

That's just, like, simply how it is.

Sam:

Yeah. I would be shocked, but, like, the depravity of VC perhaps knows no bounds.

Emanuel:

Yeah. Maybe they'll do it as, like, an "I dare you to cancel me" kind of thing. Yeah. Yeah. It's possible.

Emanuel:

You know?

Joseph:

Yeah. Yeah.

Sam:

Alright. Well, Joe, do you wanna read us out? Because I don't know the script.

Joseph:

Dude, I mean, none of us know.

Sam:

I feel like you say it in your sleep. You wake up and you're saying it.

Joseph:

Oh, no. I read it every single time.

Jason:

I leave my body when you start talking about that.

Jason:

I'm like, oh, we did it. We finished another episode. Thank god.

Joseph:

Dude, this was a chill one. I've been, like, me and Jason have been arguing about the title of the podcast in Slack.

Jason:

We're arguing literally over quotation marks. Yeah. Yeah. We're discussing quote marks.

Joseph:

It's important. Alright. I will play us out. As a reminder, 404 Media is journalist-founded and supported by subscribers. If you do wish to subscribe to 404 Media and directly support our work, please go to 404media.co.

Joseph:

You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week. This podcast is made in partnership with Kaleidoscope. Another way to support us is by leaving a five star rating and review for the podcast. That stuff really helps us out.

Joseph:

Here is one of those, from Booger Maroney: I can't say enough good things about this crew and their reporting. Surprisingly even-handed given the content and the nature of their coverage, a take on things I wouldn't have reached for by default, and a lack of reverence that is very refreshing. Thank you. This has been 404 Media.

Joseph:

We will see you again next week.