The 404 Media Podcast (Premium Feed)

from 404 Media

How Benn Jordan Discovered Flock's Cameras Were Left Streaming to the Internet


Episode Notes


Transcript

This week, Jason is talking to YouTuber Benn Jordan, who has done some of our favorite reporting on Flock, the automated license plate reader surveillance company. A couple months ago, he found vulnerabilities in some of Flock’s license plate reader cameras. I have been following Benn’s work for a while, and soon after that video came out, he reached out to me to tell me he had learned that some of Flock’s Condor cameras were left live-streaming to the open internet. In this episode, we discuss how he discovered the issue and what happened next.

YouTube Version: https://youtu.be/-GQ31n_hR9I
Speaker 1:

Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds, both online and IRL. 404 Media is a journalist-owned company and needs your support. To subscribe, go to 404media.co. Subscribers get bonus episodes and early access to our interview series, which you're listening to right now. This week, I'm talking to YouTuber Benn Jordan, who has done some of my favorite reporting on Flock, the automated license plate reader surveillance company.

Speaker 1:

A couple months ago, he found vulnerabilities in some of Flock's license plate reader cameras. I've been following Benn's work for a while, and soon after that video came out, he reached out to me to tell me that he had learned that some of Flock's Condor cameras were left live streaming to the open Internet. If you don't know, Flock's Condor cameras are pan-tilt-zoom video cameras, so they can, like, go back and forth and zoom in. They're not for license plates. They're actually for tracking people.

Speaker 1:

So we learned that some of those exposed cameras were located in Bakersfield, California, which is about a two hour drive from my house. So I went up there to film myself on them. And before we get to my interview with Ben, here's some footage I took in Bakersfield that demonstrates the problem we're about to talk about. If you're listening on audio only, I think this will still make sense, but you can see the video on our YouTube.

Speaker 2:

Okay. So I've driven up to Bakersfield, California from Los Angeles to check out this Flock camera that Benn Jordan, Gainsec, and myself found streaming directly to the Internet. It is a Flock Condor camera, which is a newer type of Flock camera that is designed to track people as they walk by. This is in addition to its automated license plate reader systems. And so this one is stationed at Hughes Lane and Ming Avenue in Bakersfield, outside of a Big O Tires.

Speaker 2:

Well, here's the Big O Tires, a Carl's Jr., and a Macy's. We have a Macy's mall.

Speaker 2:

And I don't know for sure, but I think that the camera is probably owned by the mall because the mall has automated license plate readers at every entrance. And this is stationed at a traffic light. Hopefully, you can see it up there, on top of this traffic light. We'll zoom in. And basically what we found is that this is streaming unencrypted, totally insecure, no password required directly to the Internet.

Speaker 2:

There's actually a panel that you can go on, and you can see not just the footage, which I believe saves about eighteen minutes at a time, but you can sometimes see logs. You can sometimes see the type of camera that it is. And we found between forty and sixty of these live streaming throughout the United States. We don't know if that's the total number of them, but so far we've seen them stationed at playgrounds, we've seen them stationed at malls, we've seen them stationed on bike paths. I saw one, like, outside of a skate park, a soccer field.

Speaker 2:

And so I'm gonna go out there and walk my dog, who's here with me, and I'm going to record this camera recording me. And then I'm gonna watch myself on the Internet because this is streaming directly to the Internet. Insecure, we can see people walking by. Some of these cameras are so up close that we're able to see people's faces. You can see what they're doing.

Speaker 2:

I saw people walking their dogs. I saw people rollerblading. There was one that was, like, at a Christmas fair. So, yeah, we're gonna go check it out.

Speaker 1:

So after doing that, I published an article called "Flock Exposed Its AI-Powered Cameras to the Internet. We Tracked Ourselves." And Benn Jordan published a YouTube video called "This Flock Camera Leak Is Like Netflix for Stalkers." In a blog post after our reporting, Flock said, quote, the issue involved a troubleshooting-only debug interface that was temporarily accessible on the Internet, and, quote, the only content visible was live or recorded video comparable to what can be observed from a public roadway. It added that, quote, no sensitive or confidential information was accessed or accessible. While recent third party coverage characterized the issue as more extensive, this was an isolated configuration issue and not indicative of a broader or ongoing concern. We'll let you make your own decision about whether this data is sensitive based on our reporting and what you're about to hear. So here's my interview with Benn about how he found this, how he thinks about Flock, and what comes next.

Speaker 1:

For people who don't know you, can you tell me a little bit about yourself, like, how you got into this? You know, you have a big platform on YouTube, but what do you usually do? Because you're not usually looking into Flock cameras.

Speaker 3:

Yeah. So, I mean, my YouTube channel started out with, like, kind of deep audio synthesis stuff, and then it kinda went into general science and techy stuff, oftentimes related to acoustics or things like that. And I live near Atlanta, and we have the highest concentration of surveillance cameras in America, and many, if not most of them, are Flock cameras. Like, I literally can't go to a grocery store without passing a Flock camera and being tracked by it. And it's kinda the story you hear from anybody who's been proactive or, I guess, active with the surveillance scene. It's like, they were asking, what are these black cameras?

Speaker 3:

Like, what are they doing? And then they find out, and then they find out more, and you just end up down the rabbit hole. And you kinda can't believe that this is something that actually exists, and you can't believe how much of a Fourth Amendment violation it is. And so I guess mine was the same story, except that I have a lab here and the capability to actually test something. So I started testing breaking the AI models that are in pretty much every single license plate reader.

Speaker 3:

And then, yeah, and then the security vulnerability stuff, I started warning about that when I was making the first video, and it just kinda brought me down. And I'm probably a little bit more attracted to this type of thing, or maybe a little bit more concerned about it, because I would categorize myself as a left libertarian. Like, I truly do believe that privacy is a human right, and I think that it is there at the very fiber of our ability to live out our own destiny. Like, I think it's very important to have privacy when you need it, and this is the opposite of that. So maybe that's what gives me more energy, you know, or makes me more likely to keep diving into the same thing.

Speaker 3:

And I guess that's how I got here.

Speaker 1:

Did you have the same experience that Joseph and I have had, where you do a big story about Flock or learn something about how these cameras work and what they're doing, and you publish it, and then, like, 500 other leads pop out? Because that's what's happened to us. And, you know, it's a very important story and an important company, and it's been a very important part of our reporting.

Speaker 1:

But, also, it feels like it's almost never-ending, because we've done a couple stories, and then, you know, we'll hear from an activist in Washington. We'll hear from an activist in Texas. Like, so many people are sort of pushing back against this on a city level, but they're also, like, creating so much data with their audit logs and all that sort of thing. And people are pulling it in. They're finding, like, really fucked up things in it.

Speaker 1:

Yeah. And so it's like, oh, you should look into this. You should look into that. And it's like, we are, but also there's not enough hours in the day to do it, because there's so much going on with this company. It's so expansive at this point.

Speaker 3:

Yeah. I mean, Flock employees have reached out to me, even. Like, a lot of people have reached out to me, and people have reached out to me about, like, similar things too, I think, after making these videos. Some of which, you know, I'll probably tell you about. Like, there are actually notable stories that are kinda crazy.

Speaker 3:

Yeah. Not only do I have leads, I think another weird side effect of making the content is it gets repurposed onto TikTok and Instagram, where basically it's like somebody watched my video, and then they put their face on it instead, and they give a two-minute version of the video, which I'm totally okay with. Like, I think any sort of advocacy for this is great. Use my content. I don't care.

Speaker 3:

However, they're getting details a little bit wrong just because they're a little bit lazier about it. Like, and I have a bone to pick with that, of course, but the bigger concern is that this company is extremely litigious and kinda malicious, in my opinion. They're not messing around. And I think, like, in the next content I make about this, I'm definitely gonna mention that in hopes that some of these people watch it and maybe pay a little bit more attention, because if you say the wrong thing, they're gonna take advantage of that and send you a cease and desist or something much worse.

Speaker 1:

Yeah. We've noticed the same, where, you know, we're very happy that people are sort of spreading our reporting on Instagram, on YouTube, on TikTok. And a lot of people are doing their own reporting, but there's also a lot of people who are not, which is fine, because they're talking about it and they're raising awareness about it. But then they are getting some of the details wrong, and that feels bad, because some of those have gone, like, far more viral than our work on it.

Speaker 1:

And then it's like, okay. It's not the worst that people are learning about this, but they are learning sort of a distorted version of it sometimes, and that worries me a little bit.

Speaker 3:

Yeah. And it's frustrating when you're sort of asked to respond to someone else's reframing of your content, which is, you know, another thing where you're just kinda like, well, hold on. I never said that. They said that. But, yeah, I mean, I just hope people are more careful.

Speaker 3:

And, I mean, a lot of people have been kind of in Flock's crosshairs, myself included. It's been a very stressful year, and you would think that the company would just, like, you know, hire a robust security team and fix this stuff rather than attacking every single person who points out the problem. But here we are.

Speaker 1:

Yeah. You know, you found at least 40 of these Flock Condor cameras that were streaming unencrypted to the Internet, some for at least days, maybe weeks. Can you take me through a little bit of how you discovered this?

Speaker 3:

Yeah. Like, I kept telling myself, like, alright, I'm done with Flock. I have so many other things to be doing. I genuinely have Flock fatigue from the previous round of security vulnerabilities, and then the previous round of, like, you know, the video.

Speaker 3:

I've already made two videos, and I never make more than one video on a topic. So, yeah, randomly, in the middle of the night, I'm searching on Shodan, which is kind of an Internet of Things search engine, and I had known about the ports that some of the cameras were using, and they were initially being used by the Falcon cameras, if I remember correctly. And some of them are in saved searches, and I don't even remember the methodology. I think I was just kinda punching in numbers, procrastinating going to sleep, and then I saw some come up, and they kinda matched. The interesting thing is I had definitely seen these models in the past, but you see so many things when you search stuff on Shodan that a lot of times you don't really wanna click everything, and then also you're kinda worried.

Speaker 3:

You don't wanna click something and then somebody has your IP address. You have to start your VPN. You know? So a lot of times you kinda get lazy, and this time, I didn't. And so I found some, and then I realized that just by searching "flock admin," I could find even more, and different setups that were in these cameras.

Speaker 3:

And so I think I found probably 40, maybe more. I didn't really do a count, and at least 10 were active. I kinda stopped counting at ten. I shared some of the information with John, a.k.a. Gainsec, and he found even more. He updated the searches for, you know, a little bit smarter way to search, because he's more of an expert at this than I am. Yeah.

Speaker 3:

And I mean, immediately, without any username, without any password, we were just seeing everything from playgrounds to parking lots with people Christmas shopping and unloading their stuff into cars. I mean, honestly, we probably saw more stuff that didn't have cars in it than stuff that did.

Speaker 1:

Yeah. I mean, that's the thing that really stood out to me, is that, you know, a lot of our reporting, almost all of it, has focused on Flock's ALPR cameras. I know that's what your videos have focused on so far. But this company is building a huge surveillance apparatus that is not just ALPRs. You know, they have a drone program, and now they have these Condor cameras, which are specifically focused on tracking people, more or less.

Speaker 1:

You know, I watched the webinar where they showed this to cops, and, you know, they do talk about tracking cars as well to go along with their ALPRs. But, you know, I was able to see these cameras tracking people walking down a bike path, you know, walking on the corner in front of a mall in Bakersfield. And I don't know. I was just so struck by how high-res it was and how it was clearly tracking people, and then also the kind of banality of the places where they were stationed, like, you know, a bike path, a random parking lot, a playground.

Speaker 1:

I'm curious sort of, like, what you thought when you saw this stuff, because I know you were pretty affected by it.

Speaker 3:

Yeah. I think it was, like, the first time that I actually got, like, immediately scared. You know? Like, you see the next step right in front of you. You know?

Speaker 3:

Like, all this time I've been saying, you know, this is a slippery slope. This might happen. You know? And that was the first time I was like, oh, it's here. This is it.

Speaker 3:

This is, like, the stuff that I've been a conspiracy theorist about for the last couple of years, and now we get to view it in real time. So that bike path that you mentioned, I went there, and I read Flock's statement under a Flock camera with access to it in real time. But going down that bike path is one of the most uncomfortable, dystopian things I've ever experienced. These cameras, they move around and make noise, so you could see them moving around and following you. And so it created this sort of rift with me, where it's like, okay.

Speaker 3:

So I'm really angry that this is here. I kinda hate the city, you know, for allowing it to happen, but, like, why is anybody dealing with this? Like, if I lived nearby, I wouldn't go on the bike path, period. Like, I'd just be like, this is creepy. I don't wanna be here.

Speaker 3:

Again, I don't care who's watching me. It's just very strange for these, you know, things to be following you around while you're, like, walking around. Yeah. I think the one that affected me the most was a playground, which is powerful. Like, you know, seeing unattended kids being watched from public view is, like, you know, I feel like that's one of the most powerful things, and obviously that was something that I wanted to show other people publicly so they could understand how dangerous this is.

Speaker 3:

And there was one, just going through this footage, like, collecting some of it for reporting. I just saw a guy sort of walk out, you know, a guy who looked like he was about 30 years old, walked out and just went on a swing set and, like, looked around, and then just had a blast on it for fifteen minutes, and then, you know, went back to work and left the playground. And weirdly, I think that one affected me the most, because it showed what a person does when they have an expectation of privacy. And, like, that person, had he known that anybody was actively watching him, it's doubtful that he would have enjoyed himself in that moment. Like, it was like this moment of innocence and privacy that I had access to that I didn't think I should have access to.

Speaker 1:

Mhmm. Yeah. I mean, for me, the most affecting things I mean, I thought the playground was very harrowing. Honestly, I felt happy that there were not that many people who went to that playground. Like, you know, when I was looking at it, there just wasn't that much going on there.

Speaker 1:

But for me, it was the fact that, you know, these are PTZ cameras. They're pan-tilt-zoom cameras, and they were following people who were walking around. Like, people walking their dogs, a person rollerblading. And the fact that it is tracking you as though you may be a criminal at some point was just very wild to me, because I have seen how Flock talks about this to cops. And what they tell them is, our AI automatically tracks people so that you can use this for evidence if you need it at some point.

Speaker 1:

And it's just like, okay, but you're tracking literally every single person who walks by these things.

Speaker 3:

Right. Yeah. I mean, I suppose that's the big difference: having everybody's activity in a giant, you know, dataset, in a giant data storage thing that's being organized by AI, versus having the police have a suspect and getting a warrant and then following them, which requires man-hours. And if somebody is suspected of something frivolous that's not worth those man-hours, then they don't investigate them in that way, and they don't track them. So it's easy to see how quickly this gets out of control.

Speaker 3:

A couple of years ago, I went to the University of Chicago to interview some data scientists. It was their sociology department, and I went to interview data scientists and the professor of sociology there who had made this AI crime prediction system that had a 90% accuracy rate. And oddly enough, the neighborhood that this AI model was initially trained on happened to be the neighborhood that I grew up in, which is West Englewood in Chicago. Englewood's one of the most violent neighborhoods in America right now. And so it was really interesting to find out that they had never actually visited the neighborhood that was only fifteen minutes away, yet were spending years studying all of the data from it. But more so, and more to my point, you very quickly realize that if you have, you know, police constantly monitoring one area, you'll find crime there, because crime happens everywhere.

Speaker 3:

And that doesn't necessarily make anybody safer. It just means that more crime is being detected where more police are. So that was kind of the vibe of it: you're able to find all of this crime, but in the neighborhood next to it, or maybe even a neighborhood that didn't have as many socioeconomic issues, you know, you weren't finding crime that maybe you should have found. And I see this in the exact same flavor as that, where it's like, you could put these cameras up, and, yes, you're gonna find more crime, and then you're gonna have more people, you know, being affected by, you know, maybe policies that weren't being enforced by actual police officers.

Speaker 1:

Let's talk about the security aspect of this, just because, I don't know, it's like you walk around, you drive around, you see CCTV cameras, you see cameras out there, and so we are being recorded all the time. I don't like it. But I also don't generally think, like, this is being livestreamed somewhere. And if it is being livestreamed somewhere, like, in a law enforcement context, I imagine that it's going to the police and to no one else.

Speaker 1:

But what was happening here is that literally anyone on the Internet could go and not only look at this stuff, but also, you know, see the settings, see some of the logs, change some of the logs if you wanted to, which we did not do. You know, what does this say about Flock's security?

Speaker 3:

I mean, so before I even had access to look at a stream, I had access to the administration page of the actual camera itself and thirty-one days of data of every single person who had walked in front of that camera. I had the ability to delete that stuff. I had access to the logs. Even more problematic, and you can't really prove it, but when I went through the logs, I was finding, like, parser errors in that player from forty-five minutes before I looked at it. And this is before I shared it with John or anything.

Speaker 3:

Meaning that someone had watched it and had an issue playing the video, because, like, it wouldn't just play a video on its own. Now, it doesn't seem to me like Flock would sit around and watch their cameras all day or something, especially, you know, at a playground or something like that. But it did make me wonder, like, okay, who else? Because these were visible on Shodan. So, like, it's not that far of a stretch to think that there are a lot of people who have already found this and just didn't report it, or, you know, were scared to report it, or were possibly using it for some sort of nefarious purpose.

Speaker 3:

Yeah. I mean, the security with Flock, I don't know how it could really be any worse, to be honest. Like, I can't imagine it being worse. It was already bad enough being able to poke the camera a few times and connect to it as an access point and then root it and, you know, be able to control it outside of the operating system, which is Android, by the way. I mean, so it's like, I honestly can't think of any way where it could be worse.

Speaker 3:

Like, at this point, if I were, you know, working for the company in security or something, I mean, we're at the point where, like, most cameras, in my opinion, need to be replaced. They need to be replaced with something that's not running Android. They need to be replaced with a proprietary operating system or something that's more low-level, because there's no need to load all of the libraries and things in the Android OS just for something that restreams and catalogs footage from a security camera. And this is a $7 billion company.

Speaker 3:

Like, there's no excuse why they couldn't develop something like this. That's a start, but right now, and I would never say this in one of my videos because this is very opinionated, but it really feels to me like they are aware that this is a huge mess, and they're just pushing on until they can have an IPO. And then they'll deal with it once the stock market is invested in it and they'll have to, you know, pay the damages.

Speaker 1:

Yeah. I mean, it struck me that, you know, you and John, Gainsec, found all of these CVEs, like, you know, made a video, got quite a lot of attention on it, because the things that you found were frankly insane. And then here, it's just a few weeks later, and this is, like, an entirely different class of vulnerability and exposure on totally different cameras. And it feels to me like it keeps finding new and interesting ways of being insecure.

Speaker 3:

Yeah. Long before this, when I was making the Flock security video, it was almost like a catchphrase where I'd say, like, I don't see how it could get any worse. And then something would happen where you'd be like, wow, they pulled it off. They made it worse.

Speaker 3:

And, you know, that happened, like, three or four times. And then this was the ultimate one, because, again, yeah, like you said, this is completely unrelated. And it's exactly what we were promised would never happen. I mean, there are so many quotes. I mean, right off the bat, the company says that it doesn't collect data of people, that it only collects license plates. Like, that's right on their fact sheet.

Speaker 3:

It says that all data is encrypted. You can see now that that is a flat-out lie. Like, that's not even a misunderstanding of what's happening. They're just lying at this point.

Speaker 1:

Well, I mean, if it's encrypted but you give people the endpoint, it's functionally not encrypted. Who cares? Like, it doesn't matter. Yeah.

Speaker 3:

Yeah. And so it's just mind-boggling at this point. But, yeah. No. I honestly felt a little bit bad.

Speaker 3:

I hadn't hit up John about anything Flock-related for a few weeks, because two days after the video came out, he lost his job. And it didn't make any sense to me. You know? It wasn't like he hadn't been showing up to work, and it's not like, you know, it was a company that was downsizing. Like, he seemed like a valued, high-up member of a security team that just randomly got canned two days after the video came out.

Speaker 3:

And he probably won't speak about this, just for, you know, his own sake, but, you know, I can have my own opinions. But I think it's pretty easy for the average person to figure out, yeah, that these two things are probably connected. So I actually didn't reach out to him about anything Flock-related, just kinda kept things friendly, because I felt bad. I felt terrible. Like, I was like, wow.

Speaker 3:

If I didn't release this video, he'd still have a job. But when I found this, I was just like, okay, you gotta see this. And he immediately, you know, fell down the rabbit hole too, as did you, and, you know, it is one of the most astonishing security vulnerabilities.

Speaker 1:

Yeah. I mean, the cameras that you went and saw were closer up. Like, the ones that I saw were pretty far back, you know, like, very, very wide shots. But there was something pretty like, I had to drive two hours to go see these cameras, and I was surprised by how much it affected me, even though I knew exactly, like, what I was doing. But I was like, oh, I took my dog.

Speaker 1:

I was walking my dog, and I'm like, I'm recording myself being recorded, and I'm watching myself back in real time as I'm doing this, like, on the open Internet, in a way that it shouldn't be. And I was kind of surprised by how much it affected me. You knew what you were getting into, but was it weird seeing yourself on these cameras, like, that you had been watching on the Internet, you know, days before?

Speaker 3:

Yeah. I've always sort of called that closing the loop. It's like when you know something exists, you have an instinct that it exists, you logically believe it exists, but then you see it. And when you see it, it's like then you actually feel the full weight of the whole thing.

Speaker 3:

It was kinda funny, because I was reading Flock's statements, contradicting exactly what you would be seeing with your own eyes watching me read them, and random people on the trail were just kinda like, hey, what's going on? Because, you know, then I realized how crazy I looked, talking into a security camera, into a microphone that had a little recorder on it, and I'm like, okay, I'm nuts. Yeah.

Speaker 3:

And I just told them. I'm like, oh, no, I have access to this, and I just showed them my phone. And they were like, oh my god. What?

Speaker 3:

Yeah. And, I mean, they were blown away by it. So I guess, more to your point, it's one of those things where it's like, I always tell people things like this, and I always tell people, like, yeah, this is a really bad security vulnerability, but that closes the loop. And once they saw their image on the phone in real time, I think it's probably not likely that they're gonna be going on that trail again.

Speaker 1:

Yeah. For me, it just really shows kind of the, like, holistic nature of the surveillance. It, like, adds a lot to sort of what I thought of Flock as a company, not giving them too much power here, but, like, I had largely thought of them as an ALPR company. And sort of seeing the cameras that I saw, there was a bunch of Falcon cameras, the ALPR cameras, right near them.

Speaker 1:

And it just I was like, no, this is a holistic, like, surveillance-state situation. And just also seeing that the exposures were not just in California. Like, we saw some in New York. We saw some in Louisiana, you know, various ones in Georgia.

Speaker 1:

You know, it really is all over the place. You click one link and you're in, you know, Bakersfield. You click another link and you're in suburban Atlanta. Like, they're all over the place.

Speaker 3:

Yeah. I guess, like, seeing the scope of it, I just keep asking myself, and I mean, I wanna ask the people who watch my platform as well, like, what's it gonna take? Like, what is it gonna take? Now we could see children unattended at playgrounds, and we could see women jogging alone on a forest trail, and we could see people buying, you know, a bunch of expensive Christmas gifts and loading them into their van in a shopping retailer's parking lot.

Speaker 3:

What's it gonna take for somebody to stand up to this? Like, what's it gonna take for Lowe's to say, yeah, you gotta get these out of the parking lot. This is endangering our customers and making us liable. Like, the answer I don't want to hear is that it's gonna take a child getting abducted or a woman being assaulted.

Speaker 3:

Like, I don't wanna hear that, because I don't think that that's necessary at all. I think we could see the risk right here. Like, if I'm finding this stuff I guess an interesting thing to say here is that if you see a security vulnerability in my video, or if you hear about it generally in the press, anything like that, chances are that things are a lot worse, because I can only tell you about things that I can legally access. But if I didn't intend on telling you, then I would practice opsec in my research so I didn't get caught, and I would be using more backdoor things. So, basically, I am limited by the law to only use the search engine and click on things, you know, and then find them and then show them to you. But if I actually went into that log and took that hash information or, you know, did an audit of the system that's running the software on there, like, you could do all those things and you could find a whole lot more, but I just can't report on it, and most people can't, due to the crazy computer crime laws that we have in this country.

Speaker 1:

Yeah. I mean, also, you know, you do really great work, but it's like, Shodan isn't magic. It's like, people know about Shodan. You know? Hackers know about it.

Speaker 1:

Bad actors know about it. A lot of security researchers obviously know about it. It's like if you were able to find this, like, who knows how many people were able to find it?

Speaker 3:

Yeah. And if it's on Shodan, that doesn't mean it's only on Shodan. You know? Like, somebody could have used DuckDuckGo and possibly found the exact same thing. Like, it just depends on whether the web crawler had reached it or not.

Speaker 3:

Shodan mostly just prioritizes things. Like, it just filters out other stuff. But with enough dorking I mean, yeah, we found all that. In the last video I did, Joshua Michael found a bunch of GPS location data of police cars and things like that, just off Google, just from doing Google dorking. So yeah.

Speaker 3:

I mean, it would be unlikely for me to think that nobody else had found this. In fact, you know what? After I had already told the group about this, I did get an email from somebody notifying me that they had found a similar camera from a web search engine. I mean, we're in a world right now where we're, like, banning TikTok and DJI because we're concerned about national security and surveillance of Americans, yet this is allowed, and, you know, I feel like somebody needs to pay attention to that.

Speaker 1:

Thanks so much for listening. You can watch Benn's videos on YouTube at Benn Jordan. Benn is with two n's. And you can find more of our reporting about Flock at 404media.co. As a reminder, 404 Media is a journalist-owned company and needs your support.

Speaker 1:

You can subscribe to us at 404media.co. This episode was created in partnership with Kaleidoscope and was produced and edited by Alyssa Mitcalf. We'll be back with a new episode soon.