from 404 Media
Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds both online and IRL. 404 Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co. As well as bonus content every single week, subscribers also get access to additional episodes where we respond to their best comments. Gain access to that content at 404media.co.
Joseph:I'm your host, Joseph, and with me are two of the 404 Media cofounders. The first being Sam Cole.
Sam:Hello.
Joseph:And the second being Jason Koebler.
Jason:Emanuel retired after last week. We attained top podcast status and he's done.
Sam:You won. You beat him.
Joseph:So I haven't listened to the episode yet, but I get the emails. Actually, I'm not really sure how I saw it, but someone commented on the podcast saying, wow, I love this episode. It's because it was just Jason and Emanuel last week.
Joseph:It was almost like two kids without their teacher being there. Was that the vibe? And am I like
Jason:That was the vibe, first of all. And second of all, it wasn't just one person. It was, like, dozens and hundreds of fans being like, oh my god. You dropped the dead weight. Get him out
Joseph:get him Joe, out of
Sam:should we get out of here?
Jason:I'll do a solo pod this time. No. We did miss you. We did miss you. But there were, I believe, two people who enjoyed the podcast.
Joseph:But now it's in the numbers. Yeah. Am I, like, a tyrannical host? I mean, I keep the pace going. I'm just trying to be, you know
Jason:Well, that was the running bit, that we're, like, gonna have to let this segment go for, like, two hours. But in the end, the podcast actually was about the same length as always, I feel. So, yeah.
Joseph:Yeah. Well, speaking of moving along quickly and being a tyrannical host, here is the first story. It is written by me, but Jason's gonna help me out with the questions. The headline is: this company turned dashcams into virtual CCTV cameras, then hackers got in.
Jason:Yeah. So this is, I guess, a follow-up story to a scoop that you had a few weeks ago, which we'll get into. But I guess to start things off, what is Nexar and what do they sell?
Joseph:Yeah. So I'd never heard of Nexar until that earlier story, which involved Flock, the surveillance company, and we'll get to that. But what Nexar does is that it sells these dashcams to ordinary drivers. So let's say you're driving around and you just wanna, you know, have some record of what happened if you get into an accident. They do that sort of thing.
Joseph:They also market the dashcams specifically to rideshare drivers. So Uber, Lyft, whatever. And those cameras will look outwards, obviously, at the road in case there's a crash, but they'll also look inwards, presumably for driver and passenger safety as well, in case anything bad happens in the vehicle during the ride. So they sell those sorts of things, and I imagine that some listeners might even have a Nexar dashcam in their vehicle. I got lots of comments and emails from people saying, wow, I had no idea this happened.
Joseph:I'm a Nexar user, blah blah blah. That's the normal business. The other thing that Nexar sells is kind of like a data broker business, where they take footage generated, or, you know, streamed from these dashcams. They upload it to a public map that anyone can access. And what Nexar does then is that it basically uses AI or machine learning, something like that.
Joseph:They don't really specify too much. But it identifies things inside that image. So you'll click on this map and it'll bring up an image taken from somebody's dashcam. It'd be like, oh, there's a road sign here. That means there's lots of traffic here. Oh, there's roadworks here, for example.
Joseph:You know, maybe that's going to be useful information to somebody else as well. And Nexar then sells that data to a bunch of other companies. You can see they're basically trying to do both sides of the transaction. They're trying to sell the hardware to ordinary drivers. And then all of this data that's being collected, why don't we monetize that as well?
Jason:I really wonder if these dashcams are a loss leader, as in, I looked up these Nexar dashcams and they seem like they're pretty good cameras. Like, I'm like, wow, these seem like they have a lot of features and they're quite inexpensive. And so, you know, I'm not an expert in dashcam economics, but this strikes me as, like, data is probably their main business. And I feel like when you first, you know, discovered them and wrote about them, and, again, they're, like, a huge company, but I guess it was a company that we hadn't reported on before.
Jason:I was pretty, like, it makes sense. I guess what they're doing makes sense to me. It's almost like distributed, like, Google Street View vibes in terms of, like, what they're creating. And, I don't know. It's pretty alarming.
Jason:But, anyways, tell us about that first story. You wrote about them, I think, maybe, like, a month ago or so, about a partnership that they were doing with Flock, or at least that they were planning with Flock.
Joseph:Yeah. Something like that, a month ago or a few weeks ago. And so regular listeners will know we cover a company called Flock a lot. Now Flock has AI-enabled cameras all over the United States. They continuously scan vehicles that pass by them as part of, you know, a surveillance capability called ALPR, automatic license plate recognition.
Joseph:And when you drive past a Flock camera, it will get the plate, obviously, but also the color of the vehicle, maybe the model or the brand, potentially the condition of the vehicle as well. So it gets all of this data. And then what Flock does is it sells access to that to law enforcement. We've done a ton of coverage now where we found that local cops were doing searches inside Flock on behalf of ICE. Then Jason did something about how Customs and Border Protection actually had direct access to 80,000 of these cameras as well.
Joseph:And then we also found that a Texas official did a nationwide search on the Flock network for a woman who self-administered an abortion. Again, regular listeners will know that. That is what Flock is. What I reported a few weeks ago, as you say, was that Flock is exploring a partnership with Nexar. I got this through sources, confirmed it with the company, and wrote up that article somewhat quickly.
Joseph:And that was, of course, interesting because Nexar has all of these dashcams in all of these vehicles, and it's trying to partner with Flock, whose business is basically selling intelligence to law enforcement. I want to stress, like, we don't know exactly what Flock wants to do with the data, what it plans to do with it. And again, it's only exploring it at this point; that's how it was described to me by sources and then by the company as well. But that's how Nexar originally got onto my radar: they're planning to integrate and partner with this fairly controversial surveillance company, which brings up all of these questions about, well, what are they gonna do with this dashcam footage, you know, which is a totally fair question to ask of them.
Jason:Right. So, anyways, you write about this company's plans to partner with Flock, and then soon after, as seems to be the case often, you get another tip because, you know, we've signaled to people that we're interested in writing about this sort of thing. And so you had a source reach out, and it turns out that this company was hacked. What was the hack and what was compromised?
Joseph:A lot of really, really sensitive stuff. So maybe I'll just step back and walk through how it was done. This hacker who reached out to me, they found that every Nexar dashcam, at least as they described it, contained a key to an AWS bucket, Amazon Web Services. This is, you know, very common Internet infrastructure. You'll be running a company.
Joseph:You need to store a bunch of data. You will put it in an AWS bucket. This hacker found that the key to access that storage, inside all of these Nexar cameras, didn't only allow that individual camera to upload its own footage, which obviously it needs to do. It needs to be able to access Nexar's servers. Right?
Joseph:They found that the key's privileges were too high, and that it allowed a third party, a hacker or somebody else, to actually access everybody's dashcam footage, which is obviously really, really bad. And they found something like 130 terabytes of video inside this AWS bucket. They didn't, as far as I know, download all of it. Obviously, that's an absolute ton of data. They didn't send all of it to me either.
Joseph:They, of their own accord, just downloaded, I think, a dozen, maybe a couple dozen videos and then sent them to me for verification purposes. And, crucially, on that public map I was mentioning, which has sort of the Nexar dashcam footage and shows a road sign and roadworks or whatever, those images are blurred and anonymized somewhat. Nexar blurs the actual dash of the car, in case there's identifying information on there. It blurs faces, and also blurs license plates of other vehicles shot in that vicinity. These videos that the hacker accessed were not blurred at all.
Joseph:This was basically the raw footage taken from these dashcams all over the States. Maybe they're all over the world, but I think the vast majority of the ones I saw were definitely in the United States, based on the locations and license plates, that sort of thing. So, I mean, it's really bad.
Jason:To clarify, the blur happens at some point after being recorded, obviously, but they were getting it, like, pre-blur. The hacker got this, like, pre-blur.
Joseph:So that was one of the reasons I was really interested in the story: you go to Nexar's privacy policy and it says, hey, all of the blurring and the anonymization happens on your device, so even we can't see it. I'm like, well, how does the hacker have all this non-blurred imagery? Then, as you're reporting a story, more information comes to light. And how Nexar described it to me was that, yes, the blurring does happen on the camera. It's uploaded to one bucket.
Joseph:What the hacker had accessed here was the backups of the individual users. It was basically almost the Dropbox of the individual user. So it does appear that, yes, they do blur and anonymize on device as they claim in the privacy policy, but there's this other database which has the footage as well, because these people wanna back up their footage.
Jason:Got it. That's very wild, though. I feel that's very wild. So what types of videos are in there, though? Like, you know, this is sensitive data in that you're getting, you know, people's license plates.
Jason:You might be getting identifying information from their dashcams, but, like, what sorts of things did you see?
Joseph:Yeah. It is very much what you would expect. A lot of cars driving on a lot of roads with a lot of other cars in front of them. That's not really surprising. I would say there was more information in there than you might expect.
Joseph:One example was clearly somebody answering a call on FaceTime. You know, there's that very distinctive sound on Apple devices when you pick up a FaceTime call. I heard that and then somebody having a conversation. In another example, someone was in their car with a crying baby. I didn't have the full context of what was going on, but obviously, you know, that's a private moment where you don't assume a camera or a hacker may be listening in, or a journalist after the fact.
Joseph:And then there was one where, again, I mentioned that Nexar markets these cameras to rideshare drivers as well. The camera was pointing inwards to the vehicle. You could clearly see the Uber or Lyft or whatever driver, and then the passengers get in and they're having a conversation. I can hear everything they're saying very clearly. I can see their faces very clearly.
Joseph:I could probably make out where they were just picked up. So this is all information that, sure, maybe it's okay that your Uber driver knows where you got picked up or whatever. I don't think you anticipate there's a camera in there which is uploading to a server which a hacker has accessed, which has then been sent to a journalist. And, of course, this hacker was acting somewhat in the interests of the company, and I'll reframe that slightly, in that they didn't steal the data and post it on the Internet. They came to me because they were thinking about, well, how do I get this fixed?
Joseph:And often, hackers or information security professionals will do that through a journalist, because then the companies will actually fix the issue. And this wasn't actually in the article; I learned this maybe a day or two later. But somebody who I already know, who's done research that we've covered before, said they also found the same vulnerability. They reported it to Nexar before all of this, and they never received a response and it was not fixed.
Joseph:Clearly, because the hacker got all the videos and sent them to me.
Jason:I mean, I feel like people talk shit in cars all the time. Like, if you're able to hear the conversations that are happening in a car, it's like, I don't know, you go to a party and then you get back in the car with your partner or whatever, and you're like, oh my god, can you believe that person said this at this party? And I don't know. I feel like that's a very, like, universal experience.
Jason:I feel like that's been part of various sitcoms and things like that. Like, it's such a trope. And the fact that if you had one of these cameras set up in this way, you know, it was recording and then uploading, as you said, to a cloud bucket that a hacker was accessing. It's really, really sensitive stuff. It's pretty alarming.
Jason:Like, there's not that many hacks these days that I find to be surprising, but this one is pretty big.
Joseph:Yeah. I feel like we've seen every single sort of hack you could possibly imagine. And then, oh, well, they found a new one. They found a new type of hack.
Jason:Yeah. Yeah. Yeah. We said a lot of top secret stuff when we were in Sam's car
Joseph:A few weeks ago.
Sam:I just recorded all of it. It's all on video and tape. Waiting for the right time to release. But yeah.
Jason:Yeah. Sure. Exactly. So this is bad for the individual users, the people who, you know, may have been saying unpleasant things about their loved ones after a party or whatever, but there are national security concerns here as well. Can you talk more about that?
Joseph:Yeah. So it's like a whole other dimension to the story. Like, any one of these would have already been a story. Like, it already would have been an article. Hey.
Joseph:There's a map, and this company is uploading anonymized footage to it. And I spoke to multiple Nexar users and asked them, hey, did you know this map exists and maybe your footage is on there? They had no idea. That's already a story. Then you have the hack.
Joseph:Obviously, that introduces another dimension. Then, on top of all of that, some of the dashcams are clearly owned by people who are either visiting or work at very sensitive US military and intelligence agency facilities. And I won't go through all of them, but there's some US military ones, air force bases, all of that sort of thing. I'll focus on the CIA one. So the hacker was able to find, through that publicly available map, hey, there's a car which is going to the CIA's headquarters.
Joseph:Okay. That's kind of interesting. The hacker was then able to find the unredacted, unblurred footage from that driver in the hacked AWS bucket. They could do this because I think every video has a unique user ID. Like, it has the user ID and then some other sorts of identifiers.
Joseph:You can figure out, well, that's from this user, that's from this user, etcetera. They did that, and it showed this person taking the turnoff from, I think, the parkway, not quite the highway, towards CIA headquarters. And then they clearly take the dashcam off the dash and put it somewhere hidden in their vehicle. I presume because the policy at the CIA is, hey, if you have a dashcam, please don't drive with it right into our facility.
Joseph:That said, one of the images on the publicly available map is way closer to the facility than the unredacted footage was. So why does that matter? Well, somebody is driving into or near this very sensitive facility and maybe at some point, they screwed up. Maybe they drove in and their dashcam was still going. Again, I only saw a tiny sample of these videos.
Joseph:But if this person is going to CIA every day, every week, like, they're a contractor maybe, they're a worker there, who knows. Right? But if they've screwed up even once or a few times, hey, that could be beneficial. Now, I'm sure Russian and Chinese intelligence operatives don't necessarily need a hack of a dashcam installed in people's cars to get that sort of information. I don't know.
Joseph:Maybe a lower-tier country might need it, like a lower-level intelligence service. But regardless, it's not good that these cameras are in the cars of people who are affiliated with these sensitive facilities, and that the hacker was able to get into them. You know, it's pretty straightforward.
Jason:And then the companies that buy data from Nexar, you know, also potentially have a reason to be concerned.
Joseph:Yeah. So, again, right at the top we were mentioning that Nexar sells these dashcams, but it also sells this data basically generated from them. And the hacker also got this list of companies which at least have expressed interest or have had a dialogue with Nexar. Some of them, when I approached them for comment, said, you know, we've never had a relationship. Some said, we only evaluated it. And some did say that they actually used it. But there's a lot of companies in there.
Joseph:Microsoft, Apple, Google, a bunch of AI companies. I mean, the Apples and Googles, especially the Googles, you would obviously expect, because Google Maps is updated, obviously, with roadworks information and traffic data. So you could see how that could be useful. A bunch of AI companies, as I said, and then one of the more interesting ones to me was Niantic, which is the Pokemon Go creator, which is
Jason:Owned by Saudi Arabia now.
Joseph:I was gonna ask, could you briefly, to close this off, just explain that? And I'm not saying this footage is going to Saudi Arabia, blah blah blah. I just think that's an interesting thing that people may have forgotten about.
Jason:Yeah. I mean, it was a few months ago, and I did write about it at the time, but, basically, a company that is owned by the Saudi sovereign wealth fund bought Pokemon GO from Niantic. And so, yeah, it's now tied to the Saudi government in some way. That's just, I don't know, another layer to this. As you said, it's like lots of companies touch this.
Jason:So, yeah, I mean, I find this whole thing to be, like, incredibly fascinating just because it touches so many different companies, so many different, like, attack surfaces. I find the concept of dashcams to be quite interesting. Like, I understand why someone would want to have one for safety reasons, and also, you know, people drive crazy, and it's often needed to, like, establish fault in crashes and things like that. But, you know, if you have one that's connected to the Internet and that's streaming to these servers, like, being used to create big data, stuff like this can happen.
Joseph:Yeah. Absolutely. And I think that if you're a listener and you use a Nexar dashcam, I mean, let me know, and also let me know if you knew that this publicly available map maybe had your dashcam images in it. But, yeah, I just think maybe drivers wanna keep this in mind when they're putting a constant, ubiquitous surveillance technology into their vehicle. Alright.
Joseph:We'll leave that there. Thank you for running us through that, Jason. When we come back, we're gonna talk about Sam's reporting trip to San Diego, again, only made possible due to 404 Media subscribers, and the sentencing of somebody that she's followed for a very, very long time. A content warning on that one for sexual assault. We'll be right back after this.
Joseph:Alright. And we are back. Sam, as I said, you wrote this one. The headline is Michael Pratt, Girls Do Porn ringleader, sentenced to twenty-seven years in prison. How about I give the quickest summary of Girls Do Porn I can possibly do, so we can get to you immediately talking about the new stuff and being in the San Diego courtroom and all of that?
Joseph:So regular listeners again may know Sam has covered Girls Do Porn for a very, very long time. This was essentially, literally, a sex trafficking ring operating in the open on Pornhub. Women would travel for what they believed would be modeling or similar jobs. They would go into a hotel room where they would be forced or coerced into having sex on camera. They would be told this is just gonna be kept to local VHS or DVD collectors, whatever, in New Zealand and Australia, for example.
Joseph:That was a lie. The footage was then posted on Pornhub, obviously one of the biggest websites on the Internet, and this, of course, ruined many, many of these women's lives. You've written a ton about the hunt for Pratt, and I definitely think people should go check that out, the sort of private-investigator side of it. But after all these years, Pratt goes on the run, is eventually tracked down, and now you get to, I think, be in the same room as him. Right?
Joseph:I think, just first of all, I'll try and do it chronologically. You're flying to San Diego for the sentencing. What was your understanding of how long this would go on for? Like, is it, oh, we're in, he's sentenced, we're out? Like, what was your initial understanding of how long it might be?
Sam:I mean, I had never been in a courtroom for, like, pretty much ever, I don't think. Not in my adult life. And I'd certainly never been to a federal sentencing in person, so I had no idea what to expect. I thought that it was gonna start at nine. The judge would come in and say, this is the information that we have.
Sam:And based on that, here's your sentence. See you never. You know? Like, I thought it was gonna be, like, in and out. Here you go.
Sam:I did know that victims usually give, or can give, statements in these sorts of things, and they did during the sentencing of Andre Garcia, who was the main actor in a lot of the Girls Do Porn videos. He was the main guy doing the actual videos for years. So during that one, a lot of victims got to come forward and say how he affected their lives in that moment and then going forward. So I knew that victims might say something, but I didn't expect to be sitting in the courtroom for five hours, and that's what it ended up being. We were there.
Sam:We took two breaks, maybe three. I think maybe three breaks. I started losing track. We were there for five hours. I really thought it was gonna be like, we'll get this over with and move on.
Sam:But it took so long because of the victims who came. It was 40 women who showed up to say, this is why this guy should go away for the maximum amount of time. And almost all of them, all but one, said, give him the maximum. Give him life. Like, there's no amount of time that you can give him that would be too much. So I was definitely expecting something shorter and more to the point, and this was very cathartic for them, very cathartic for anyone who's ever been following this story.
Sam:And, yeah, he was there in person. They walked him in in the beginning, and he had this, like, really bushy head of hair, which I was not expecting, and, like, looked like he'd been living in the woods. And he's been in custody for years, but just very different than, like, his mug shots look. And he sat there throughout all of this. He gave a little statement in the beginning and, you know, mumbled through a two-sentence apology, and said that he never would have come to the United States to do this if he had known this was how it was gonna shake out.
Sam:It was not a real apology.
Joseph:Wait. Wait. Wait. Wait. I would not have come to the United States if I knew I was gonna be in the US for this. I'm paraphrasing, obviously, but that was the implication.
Joseph:Okay.
Sam:He said, I mean, it was very short, but he said it was never my intention to hurt anybody, and said that he would never have come to this country to make a website if he knew that this was the way that it was gonna go. And he had previously given a written apology, also pretty short, and filed that with the court. And it was the same sentiment. It was like, I was just a businessman trying to make a business in America, and he's from New Zealand. He was like, I just wanted to travel here and start a business and saw an opportunity, and it went sideways.
Sam:And I regret that it basically blew up the way it did. I mean yeah.
Joseph:Yeah. That's not a real apology.
Sam:It was not convincing remorse, and it was not convincing to anybody that was there, I don't think, either.
Joseph:What was it like? And we'll talk about the victims and the impact statements in a second. I'm just curious. What was it like for you, after covering this person and this case for so long, to finally see this person, hear their voice, be in the courtroom with them? I mean, just journalistically, like, do things fall into place, or what was it like for you seeing this person you've followed for so long, in a way?
Sam:I mean, it was definitely surreal. It was kinda strange. I feel like I've been following him through the other people that have been following him. So, like, the lawyers representing the women in the civil case and the investigators who tracked him down and dragged him back. You know?
Sam:Like, they didn't do it themselves, but the Spanish police locally arrested him, and then the FBI showed up and dragged him back to the United States. I feel like I've been seeing him through the eyes of so many different people
Joseph:Right.
Sam:Throughout this many, many year case. I think it was 2018 or 2019 when it actually went to civil trial, and it was 2019 when the FBI charged him and his coconspirators with trafficking. But it's been years long and years coming to actually see him get to this point where he's standing in front of a judge, which was pretty powerful. And it's also not something that I encounter very often, or that happens very often in general: when someone does a crime like this, producing nonconsensual material and distributing abuse material on the Internet, and this is because the scale of this problem is so big and so many people do this, they don't often meet any kind of justice, and they don't often answer for this particular sort of crime.
Sam:That was pretty interesting, just to see someone who had been doing this and thought that he could do it forever, because he had the Internet and was basically behind the camera and not in front of it to shield him from any kind of responsibility, and he misjudged that massively. And I think that's only because of the efforts of the women who were there that day, and of more women who had come forward and said, hey, something very fucked up is happening here, and he needs to answer for it. So I don't know.
Sam:It was very powerful. It was a long day, like I said, but it really actually felt like I was there for half an hour. It was like a snap, because everyone in the room was so focused on what the women were saying and just completely rapt with their stories.
Joseph:And speaking of those stories, as you say, a large amount of this hearing was dedicated to these victim impact statements. I'm just curious, what did you learn from those, especially what was new? Because you were there. You were taking notes. You were sort of updating us in near real time.
Joseph:Of course, you're trying to, you know, get ready to write an article about what's being said in the news. What was new that trickled out for you through those statements?
Sam:So a lot of them were the same thing over and over, which we knew from reporting previously. They all have a very similar story because the business was built on this very specific sequence of events where the women had to
Joseph:A formula, basically.
Sam:It was the manner of the business. It was not, like, a fluke that someone had a bad time. It was set up to produce the same experience over and over and over, where they would be very inexperienced and come to San Diego and basically be sexually abused and assaulted for hours and hours, and then let go, and then threatened for the rest of their lives about whether they could come forward. So a lot of it was not surprising, but I did hear from several of them that as many as 15 women have died since shooting these videos with Girls Do Porn, which, I didn't realize that was the number. I knew that it was maybe one or two, two or three.
Sam:These are all kind of, like, unconfirmed numbers, obviously, but I didn't really realize that that was the scale of it. And listening to them, that doesn't actually surprise me, and that number is probably low, because there were as many as 400-plus women involved in this, you know, victims of this scheme. And so many of the ones who came forward in the sentencing said they had considered suicide. They had done self-harm. They'd gotten very addicted to substances.
Sam:One said she died three times and was brought back to life because she had become addicted to substances. One was in the hospital, and the prosecutor read the statement for her, because when she arrived in San Diego at the airport, she had a PTSD-related seizure and had to go straight to the hospital and didn't even make it to the sentencing. And her testimony was that she was brutally raped for nine hours under Pratt's supervision, basically, because it was his company. So that was surprising to me, that the number was that high, but also not entirely shocking. One woman came forward who was the mother of a woman who died of an overdose after being a victim of Girls Do Porn.
Sam:She that was the whole room was, you know, crying as quietly as they could through her testimony, but she didn't even realize that her daughter was a girls' duporon victim until May, and she had died years ago. She didn't know that that was what had turned her life completely upside down originally. It made her withdraw. It made her abuse substances. So that was that was definitely a big theme of this is that these women were saying, we're here, and we're able to speak about this, and we are the ones who can come forward today.
Sam:And there are so many others who can't, for so many reasons. And for many of them, it's because they are no longer alive. Several of them turned and spoke directly to him, which I thought was, I don't know, it was almost shocking that they did, and not because I'm surprised at their, like, bravery in doing that, but the vitriol with which they would do it was something that I think took a lot of people aback. One woman was targeted when she was very young, and she kinda whipped around and looked right at him and said, hey, pedophile.
Sam:And everyone kind of, like, you know, took a breath. I don't think he looked any of them in the eye. I was in the back. But they would frequently turn around — they would ask the judge, can I have permission to speak to the defendant? And then she would say, yes.
Sam:Of course. And they would turn around and speak directly to him and tell him exactly what he did to them and the way that their lives had been affected by this. Several of them said that they had to change their appearances entirely. They had gotten work done, cosmetic surgeries, gained weight, lost weight, changed their hair to basically go undercover as they're in their own lives to avoid being harassed by people on the street.
Joseph:So what about the moment of the sentencing? Do you just briefly wanna walk us through that? Was that over in a flash? Like, what what did that look like?
Sam:That was quick compared to the four and a half hours that went before. The judge was quiet, obviously, the whole entire time listening, and she has been on this case for a long time and was part of quite a few of the other sentencings for his coconspirators. So she was familiar, and several of the women had seen her before and were like, hello again, which is just so crazy — to have to do this over and over again for the women, for the victims. But the judge was so locked in the whole entire time, just listening, completely rapt, complete eye contact with these women the whole time, like, leaning forward in her seat. The prosecution representing The United States had recommended twenty two years for him based on a bunch of legal math.
Sam:It's like there's a lot of stuff that goes into, like, the sentence that someone gets.
Joseph:Right. It's a big document usually, and it's like Yeah. Well, this is a factor that means we should have five years. This is a factor that means we add ten years.
Sam:Or, like, deduct years for his pleading guilty eventually, even though he pled not guilty originally.
Joseph:So you're saying the prosecution did their math. They added up, and they came to 22, which Yep. Is interesting because that's a bit less than what he ultimately got. So what did the defense want?
Joseph:If you remember.
Sam:I think they asked for 17. I may need to check that, but they asked for not that much less. They were like, yeah. Probably a lot. Probably at least 17.
Sam:So I think everyone was kind of expecting 22 just based on the recommendations from prosecution. And she just kinda said I was I don't know what I was expecting. I was expecting some kind of, like, climactic moment where it's like, your sentence is, and, like, do a gavel bang or something. I only know court stuff from, like, judge Judy, obviously.
Joseph:Yeah. From TV. Yeah.
Sam:Yeah. So she just kinda said it in a sentence. She gave it in months — it was 300 and some months, but it ended up being twenty seven years plus ten years of probation. And she said it, and there was a reporter from NBC sitting next to me, and we were kinda both like, what?
Sam:And because it was a way higher number than we were expecting. You know, we're quibbling over the matter of, like, a couple years between prosecution and defendant, and she gives a a number that's way higher. So I think no one was really expecting that at all. We were also like I was like, say it again, please, because I don't even think I heard you correctly. You said it in months.
Sam:It's like, you need to say that number again. And I was checking it with you guys on Slack, and then I was checking it with her with this woman next to me. She was like, did I do that math right? And I was like, I've done it, like, a bunch of times. I think that's right.
Sam:And it was twenty seven years plus ten years of probation and then a bunch of other clauses as part of the probation. It was like, he's not allowed to go to a porn store. He's not allowed to I don't even think he's allowed to watch porn. He's not allowed to consume adult content content during that time. So it was a bunch of other things on top of the 27.
Sam:So it's really, like, thirty seven years of, like, being surveilled by, you know, your sentence. And then after the fact, it was very clear that the energy in the room was, like, relief and also just, like, again, cathartic, because all of these things had finally been said to this man who was puppeteering this entire operation and was the reason they were all there — which is something the judge said, speaking to Pratt: it was clear that without you, we wouldn't be here at all. So it's obvious that you shouldn't get the same or less than the actor in the films, because you hired him. You know? You knew that he was a rapist, and you kept him employed.
Sam:And that alone deserves more than what he got, and he got 20. So I think everyone after the fact was very relieved. Barry is just like the word isn't, like, happy because Yeah. No one's happy to be there, but, like, it's the best case scenario that anyone could ask for. They were very thankful to prosecution.
Sam:They gave, like, the some of the victims gave statements after the fact and were, like, out recording with local news outlets and stuff like that, so along with, like, the prosecutors. So, yeah, it was it was cool to see. And I keep saying, and I've said this so many times, but, like, I think people hear this story, they think, oh, this is this is so dark and brutal and tragic, and it is. And none of it should have happened at all, but it is a story, start to finish, of these women coming together and saying, we're not going to let this slide. Like, this is something horrible that happens to so many of us, and he needs to pay.
Sam:Someone needs to pay. His whole company does. And they won a civil case back in 2019 or 2020. Sorry. I get those years confused because they were weird, and all this happened at the December.
Sam:It was in that era. And then now I think they a lot of them finally feel like this can be behind them. He's not only in custody, you know, captured after being on the run, after being on the FFA Most Wanted, but also is gonna go away for he's going to to federal prison for a long time. I think that's not it doesn't fix what happened in any way, but it is like, the the best ending that they could have asked for is for him to see some sort of justice. So yeah.
Joseph:Yeah. Well, I think that's a perfect place to leave it with that message of justice. If you are listening to the free version of the podcast, I'll now play us out. But if you are a paying four zero four Media subscriber, we're gonna be talking about the Charlie Kirk assassination and our reporting around that. You know, we've done a couple of pieces, I think.
Joseph:You can subscribe and gain access to that content at 404media.co. We'll be right back after this. Alright. And we are back in the subscribers only section. Sam, this is another one you wrote.
Joseph:And then I think the second one was both you and Jason, if I'm not mistaken. But the first one is Comcast executives warn workers to not say the wrong thing about Charlie Kirk. I feel like I do not need to give the context of what happened with Charlie Kirk, but obviously, for those who don't know, he was assassinated at a college campus, and he was a far right celebrity activist, various different ways to describe. Let's talk about this story though and Comcast and what do they they own NBC, MSNBC, all of that. What did this communication say?
Joseph:What do these executives want people to do or stop doing?
Sam:Yeah. So this email, the subject line was the names of the executives. I was like, well, that's easy enough. We know exactly where this is coming from.
Joseph:But the subject line was the name of the executives?
Sam:The subject line was a message from Brian Roberts, Mike Cavanagh, and Mark Lazarus. And those are the chairmen, CEOs, and presidents of Comcast as well as its new spin-off cable network company. And it was addressed to NBCUniversal, which is the umbrella company for, like you said, MSNBC, NBC, CNBC, a bunch of stuff.
Joseph:Tons of journalists who are obviously gonna be looking into the story and finding things and saying things, probably.
Sam:Yeah. For sure. So the email, you know, started with talking about the tragic loss of Charlie Kirk, called him a 31 year old father, husband, and advocate for open debate. I mean, you can read the whole email in the story, but there's talk about how their hearts are heavy, how his passing leaves behind his grieving family and a country grappling with division.
Sam:There's no place for violence. They do the whole they do the corporate eulogy eulogization of a problematic person, which is like, we're very sad for his family, and, also, he was cool with debate. Whatever. So the next paragraph is you may have seen that MACBC recently ended its association with a contributor who made an unacceptable and insensitive comment about this horrific event. And we should be able to disagree passionately disagree robustly and passionately, but ultimately with respect, we need to do better.
Sam:And what they're referring to is the political analyst Matthew Dowd, who was fired from MSNBC the same day that Charlie Kirk was shot. They he was on a broadcast with an anchor, an m a c NBC anchor, and he was asked about the environment in which this happens. And he answered I'll summarize it as best I can because it's long, but he said he's been one of the most divisive, especially divisive younger figures and is constantly pushing this sort of hate speech aimed at certain groups.
Joseph:Describing Charlie Kirk.
Sam:Describing Charlie Kirk. Accurately describing Charlie Kirk. And he said, and I always go back to: hateful thoughts lead to hateful words, which then lead to hateful actions, and I think that's the environment we are in. And he goes on a little bit, but it's a very thoughtful — as thoughtful as you can be on live TV — balanced answer. It's not saying he deserved it or celebrating it in any way.
Sam:He's saying, well, you asked me about the environment that this happens in. He was a very divisive person. You know, it's hateful actions happen when the environment is full of these hateful words and thoughts.
Joseph:That is the very specific political and sociological context in which this happened, and he's just repeating that. But he was then fired for this.
Sam:He was fired. Right. And the president of MSNBC issued a response and apologized for him. He apologized on Bluesky and said, I apologize for my tone and words. I in no way intended for my comments to blame Kirk for this horrendous act.
Sam:So MSMBC in this email is saying, you remember that? That happened two days ago. Just you know? You remember? And then ask them to embody the values, the m the embassy universal values in your working communities.
Sam:We believe the power of creation brings us together. It's such and this is something that so many people have said already, and we'll talk about in a minute. The whole entire email is an encapsulation of what has been going on all of last week and then this week so far of this double speak of we are celebrate celebrating this man who we think was really good at debate and communication and airing out his views in public. If you air out your views in public, you are gonna get shit canned and don't. And, like, it's just it's but they say it in a way that's so yeah.
Sam:It's doublespeak. I don't know how else to say what it is exactly. But yeah. And we wrote
Jason:I wrote up the story about this also. Yeah. The Comcast story is extreme, but we're hearing story after story after story of people who are getting fired because of what they posted on social media about Charlie Kirk, or what they said, you know, in an email behind the scenes about Charlie Kirk. And Sam just said it, but people that I've talked to, and people who responded to the article that Sam and I wrote that we're about to talk about, said, like, thank you so much for writing this, because I felt like I was going crazy seeing what people are saying about this person. And it is insane, the double standard to which we hold people.
Jason:You know, as we wrote in the article, like, we're not celebrating what happened. What happened is really scary. It's a very awful thing, and it's like — I don't know, I saw the video, like, two seconds after it happened, and it's one of the worst things I've ever seen in my life. And I've seen a lot of really, really awful things on the Internet.
Jason:But, like, that doesn't change who Charlie Kirk was. It doesn't change, like, what he stood for. It doesn't change the fact that, you know, he's being held up as this, like, warrior for free speech when his entire career was built on silencing people that he didn't agree with, both by intimidating them and sort of, you know, pushing people into power who, you know, have preyed on trans people, have preyed on LGBTQ people, who have, like, called for the stoning deaths of gay people. Like, this is, like, really crazy shit and really dehumanizing stuff against all sorts of religious and ethnic and gender minorities. And it is maddening to then hear people say he was a free speech warrior.
Jason:He he, you know, stood for open debate. If you say anything bad about him, we're gonna fire you. It's it doesn't make any sense. And I think, also, if you sort of look at what people are saying about Charlie Kirk, you have right wing politicians and right wing agitators calling for civil war and calling for violence and saying that this is civil war. And it's like there are no consequences to these people.
Jason:These people are not being fired for saying these sorts of things, but for accurately describing what Charlie Kirk said and more importantly, what he did, which we'll get into, but it's not just the things that he said. It's like the actions that that Turning Point USA, which is the group that he ran, did. They had real impacts on people's lives, that silent speech. And so, you know, Charlie Kirk's legacy should be that of someone who fought against free speech, not someone who fought for it. And you have these very big media organizations and very powerful people who say they wanna continue his legacy, and they are continuing his legacy by trying to inflict consequences against anyone who disagrees with them.
Joseph:Yeah. Well, you mentioned this other piece that you both wrote, so let's move to that. The headline of that was Charlie Kirk was not practicing politics the right way. That is a response to a New York Times piece, which I think had the headline, Charlie Kirk was practicing politics the right way. Jason, what was this New York Times piece then?
Jason:Yeah. I mean, it was an Ezra Klein column, and it's interesting because, like, Ezra Klein, you know, wrote Abundance and has become — I don't know, like, this mouthpiece — not mouthpiece, but he's a very influential person, and Democrats listen to him a lot. And I listen to Ezra Klein's podcast. I think Ezra Klein is very smart.
Jason:Like, I like a lot of what he says, but this is this is a really terrible column that he wrote that was called, you know, Charlie Kirk was Paul was practicing politics in the right way, and he said a lot of the things that I just sort of described where he was like, Charlie Kirk went to these college campuses, and he sat there, and he debated everyone who came, and he we need to continue to foster open debate, so on and so forth. And it's like, I don't know if anyone here has, like, listened to Charlie Kirk or watched his stuff. You know? It's a lot of it is pretty entertaining. Yeah.
Jason:It's like I I get a lot of it through Osmosis, a lot of it is, like, entertaining, and I use that word, like, with air quotes, and it's like it's entertaining in that, like, the open debate that he's fostering is like a college student who is a leftist or a liberal or just, like, believes in equality will, like, say something and he will, like, yell at them. You know? It's like there's not a lot there. It's a lot of, like, like, watch me on the libs, like, quite literally. I I wouldn't, like, say that, but he's it's performative.
Jason:That's that's a very good word. And it's just like yeah. That Ezra Klein article was was very silly. And, mean, I understand the impetus to write something like that in the immediate aftermath of someone's death. Like, you you know, you you don't wanna like dunk on people necessarily, but it it's important to think about like the impacts that these people had and to accurate describe what they did while they were alive.
Joseph:You know Especially when you're writing in the New York Times.
Jason:Especially when you're writing in the New York Times. I mean, the New York Times had, like, four different articles that were basically hagiographies of Charlie Kirk. And, like, even in their news coverage, they really glossed over what he said about gay people, what he said about Muslims. Like, twelve or fourteen hours before he was shot, he had a tweet that was like, Islam is the sword at the throat of the American public, or something insane. Like, just really, really crazy stuff, really divisive stuff.
Jason:But, anyways, I mentioned, like, we should also think about his actions. And so Turning Point USA, which is his is his organization, you know, the organization is still around, has something called professor watch list, and this is literally just a list of professors who teach either critical race theory or teach that slavery existed and that racism exists. You know, a lot of them teach, like, gender studies, things that have been under attack from the right in The United States. And it's a huge list of professors that they're like, you know, we need to do something about them. And that that something isn't super well defined, but if your name is on this list, all of them have been subject to threats, to harassment, to, you know, spam, and just, like, trolling.
Jason:And, you know, Sam spoke to a couple people who were on the list who got death threats and said, like, this basically, like, ruined my career. And some of them have been fired because, you know, they were indoctrinating the youth according to people who were mad at them and who, you know, had sort of a lot of universities we've seen, like, when they become targeted by either Trump or by conservatives or by donors or by, like, harassment campaigns, they find it easier to, like, not renew a professor's contract or to cancel their classes or to put them on leave or whatever. And and, like, this is a playbook that we've seen time and time again. And so, you know, Charlie Kirk was the person behind that. And so to say that he was for open debate is just it's not true.
Jason:And, Sam, you talked to more people, but I don't know if you have anything more to add there.
Sam:No. I mean, I think we said it all in the story, basically. The things that these professors and scholars have had to go through are frequently so insane because of his legacy. Like you said, one person I talked to said that he was getting pictures of his own house along with threats. And it's like, that's horrifying.
Sam:That's very scary, and that's real there's a real effect on the way people speak out in public. And not just him, it's like once you hear that that's happened to your colleague, then you start thinking twice about the things that you say and whether or not you want that to happen to you because this mob that Charlie Kirk is encouraging is gonna come after you next. And that was always the that was always the playbook and the legacy that he wanted to leave. So that's what it is.
Joseph:I think maybe we'll leave it there then. I was actually gonna read out a piece from the article itself, but I think Jason put it very well. So with that, I'm just gonna play us out. As a reminder, four zero four media is journalist founded and supported by subscribers. If you do wish to subscribe to four zero four media and directly support our work, please go to 404media.co.
Joseph:You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week. This podcast is made in partnership with Kaleidoscope. Another way to support us is by leaving a five star rating and review for the podcast. That stuff really helps us out.
Joseph:I don't have it in front of me, but when I was reading reviews the other day, there was one that simply said 10 out of 10. So I'll take that over the five stars. This has been four zero four Media. We'll see you again next week.