The 404 Media Podcast (Premium Feed)

from 404 Media

Flock Used Cameras at a Children’s Gymnastics Center for a Sales Pitch

Episode Notes


Transcript

This week we start with Jason's story about Flock accessing cameras in a children's gymnastics room as a sales pitch demo. After the break, Emanuel tells us why Nature retracted a paper about the alleged benefits of ChatGPT in education. In the subscribers-only section, we talk all about the cancellation of RightsCon after pressure from the Chinese government.

Story 1:
01:58 Flock cameras in gymnastics room story
07:09 Flock’s explanation (sales demos)
13:02 City council backlash + contract renewal
17:51 Aftermath + protests

Story 2:
22:57 AI in education segment begins
23:15 Retracted ChatGPT study (Nature)
25:41 Why it was retracted
33:08 Takeaways on AI research + schools

Story 3:
34:52 AI literacy bill (OpenAI, Google, Microsoft)
36:06 What “AI literacy” means
39:38 Concerns about lobbying + NSF
44:24 Do students need AI training?

Subscribers-Only Section:
52:01 RightsCon cancellation
57:08 Past RightsCon issues
59:25 China pressure + Taiwan factor
01:04:30 Bigger implications (global + US policy)


YouTube Version: https://youtu.be/aSTCeVH7N5c
Jason:

I think that this will have reverberations across other cities because it shows how intense and vast the Flock network is. Hello, and welcome to the four zero four Media Podcast, where we bring you unparalleled access to hidden worlds both online and IRL. Four zero four Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co. As well as bonus content every single week, subscribers also get access to additional episodes where we respond to their best comments.

Jason:

Gain access to that content at 404media.co. I am hosting this episode. I'm Jason Koebler. Sam was trying to, I could tell, I could see it in her eyes.

Sam:

He's ready. I was ready.

Jason:

She did it last week. He did it last week. So my turn. So Sam is here obviously, and I also have Emanuel. What's up, Emanuel?

Emanuel:

Hey, what's up?

Jason:

Joseph, I don't know. I don't know where he's at. He will be back at some point in the future, possibly as soon as next week. I think the only housekeeping I have is, my mom bought me this sweatshirt, if you're watching on YouTube, and it has like a big time copyright infringement on it. She got four zero four media embroidered in, like, the most basic font possible, and also there's a crab.

Jason:

So I don't know. Look forward to that lawsuit at

Sam:

some point. And if you want one, let us know.

Jason:

Yeah. My mom will make you one. Okay. Should we get into it?

Emanuel:

Yeah. Let's get into it and start with Jason's story. The headline for this one is City Learns Flock Accessed Cameras in Children's Gymnastics Room as Sales Pitch Demo, Renews Contract Anyway. Back to our favorite company, Flock.

Emanuel:

Jason, this all started with a FOIA request by a guy named Jason Hunyar. What did what did he find?

Jason:

Yeah. So I'm kind of proud of this one because I talked to Jason Hunyar, who is a resident of this city called Dunwoody, Georgia, which is a suburb of Atlanta. It's, like, right outside Atlanta. There are many Flock cameras there. And he told me that he basically learned about Flock from our reporting and also learned about FOIA-ing from us, more or less, which is cool.

Jason:

But he filed a lot of FOIAs with the city of Dunwoody about Flock, their contracts, and then these things called access logs, which basically show who is accessing different Flock cameras and for what purpose. So in addition to what we have written about in the past, which are audit reports, which are basically records of every time that a cop has searched the Flock network, there are also these things called access logs, which show each time that a camera was tapped into. And these sort of show, like, the video cameras that Flock has, so not necessarily automatic license plate readers, but the actual, like, video cameras. And it showed what I thought was, like, a lot of interesting things.

Jason:

First of all, it showed that Flock's network in Dunwoody is not just Flock cameras owned by the city. There are Flock cameras that are owned by the city and the police department. But then there are also all these cameras that are owned by private entities. So there were, like, gas station cameras, there were cameras at gyms, and then there were cameras at this Jewish community center that were all feeding into the police's, what they call a real time data or real time crime fighting database, more or less, which we'll talk more about in a minute. And it shows each and every time that someone looked at these cameras, and what camera they looked at, and what time, and for how long.

Jason:

And what Jason Hunyar found was Flock employees were looking at a lot of cameras that were in the children's gymnastics room at the Jewish community center, as well as cameras that were at the pool, cameras that were at a fitness center. And he basically like dumped a lot of these documents and wrote a Substack post that was like, why are Flock employees monitoring our children? And that set off this, I mean, frankly, shit show, like in Dunwoody, where all the residents were like, why are Flock employees looking at these cameras? Flock employees are not really supposed to be looking at these cameras. Why are these cameras connected to this network in the first place?

Jason:

We didn't know that this was happening. We didn't sign up for this. Like, what's the deal?

Emanuel:

I just want to pause and note that, like, we have this situation with Jason Hunyar, who, because of our reporting, did this very fruitful FOIA request, and then Joe was giving a talk in Berkeley and he was just walking around there and saw that there was, like, an anti-Flock protest, and obviously we've reported about many cities that are pushing back against Flock now, and it's just interesting that there's been a sea change around Flock cameras at this point. It's such a different environment than when we first started talking about it, like, a year ago or something. And yeah, proud mostly of Jason and Joe for the reporting that brought a lot of this to light. But

Jason:

I know, also, I feel like every time we talk about it, I've, like, explained what Flock is. I mean, if you don't know, I'm sorry, but I feel like at this point, people actually know what it is. I, like, have brought it up to people in my life and they're like, oh, yeah, like, I've seen Flock cameras. I know what you're talking about. And that's pretty rare for a surveillance technology, where it's reached this point where it's like, oh, I know what that company is and what they do.

Jason:

And, like, videos and articles about Flock have become somewhat mainstream on, like, Instagram and YouTube. Like, I see a lot of people making videos about it and these different, like, community groups opposing them and that sort of thing. And it seems like it's something where it's becoming this, like, actual mainstream-ish issue, which is cool to see.

Emanuel:

Yeah. Yeah. Very similar to the Ring arc that we saw. So, yeah, back to the specific story. What does Flock say?

Emanuel:

Like, Jason Hunyar reveals all this information. People are not happy about it. What is their explanation for why this is happening?

Jason:

Yeah. So I think that this is important because there's like a city council meeting that we'll talk about in a second. But at the city council meeting, the outrage over Flock employees accessing these cameras is pretty intense. And some of the residents are like, I mean, these are their words, but they're like, why do they have like, little Jeffrey Epsteins running around looking at our children?

Speaker 4:

You guys want little Epsteins to have access to cameras all across your city? I mean, that's that's crazy to me.

Jason:

That is, like, what some of the coverage I've seen on social media has been like. And I think that Flock has been very upset at this framing. To be fair to them, to just sort of explain what they say is happening here, and what, I mean, probably is happening just based on how I know Flock works: I've seen sales demos and things like this before that have either leaked or that I found unlisted on YouTube. Basically, Flock does these sales demos where, if you are a city and you're interested in buying Flock, you go watch and see how the technology works. And Flock sales employees basically, like, pull up cameras in real time that are from, like, cities that already have contracts with Flock.

Jason:

And Dunwoody, Georgia is near Flock's headquarters. Like, Flock is based in Georgia. Flock was founded at Georgia Tech in Atlanta. And they have, like, this really deep integration with the city of Dunwoody. And so what Flock says was happening is basically, like, they were doing these sales demos.

Jason:

There's this real time crime center that the city of Dunwoody has. And it's basically like a desk with a gigantic television screen that has, like, 50 cameras on it all at once. And so what they are claiming, more or less, is that these sales employees were demoing this technology. They were tapping into that real time crime center and they were showing, like, 30 cameras at once. And some of them happened to be in this children's gymnastics center, at the pool, like, at these places that are quite sensitive.

Jason:

And Flock says that, you know, to imply that our employees were pulling up these specific cameras for their own, like, personal creepy uses is wrong and not fair. And that, you know, it's not fair that these employees are being singled out. And they're also claiming that they are one of the most transparent surveillance companies that exists because these access logs exist at all. They're basically saying that with other surveillance technology, this is not data that you would even have; you would never know. And, like, they have this system so that access like this can be held accountable in some way.

Jason:

And Flock was basically like, this is our bad. We shouldn't have been demoing with these cameras. It's a mistake. Like, we're not gonna do it anymore. And when we do sales demos, we're not gonna use Dunwoody anymore.

Jason:

We're gonna use other cities. And in addition, we're only going to use cameras that are, like, in parking lots that are away from populated areas. We're not going to use cameras that are, you know, at businesses that are in sensitive locations, that sort of thing. Like, that is what Flock is saying. And I think we'll get more into it, but I understand that, I think.

Jason:

Like, I think that's probably what happened here, but that answer is not very reassuring in any case. Like, the fact remains that basically, Flock was using this city of real people and real children and real businesses as a sales pitch to other cops, and pulling them up, you know, at random, just whenever they happened to have these, like, sales calls. And I think that if I were someone who lived in that city, I would wonder why we were being used as, like, a test tube, why we were being used as a sales pitch, why these cameras are even on the network at all, that sort of thing. And so I think that there's, like, plenty of outrage to be had here. I think I just wanted to explain why Flock says that, well, no, we're not.

Jason:

I mean, frankly, Flock's blog posts about this are like, our employees are not child abusers. Like, that's the level that this got to. And that's, like, the rhetoric that sort of ended up happening here. And I think that is probably an understandable place to be if you're a parent and you didn't know that this was happening. But that's sort of, like, their general explanation for it.

Jason:

And that is, like, the explanation that the city of Dunwoody, like, its mayor and city council, has sort of taken back to its citizens at a city council meeting where for, like, three hours, people were yelling at the city, saying, like, how could this possibly happen? Why are you working with this company?

Emanuel:

I think it's totally understandable that Flock would put something out there in public saying, we're not creeping on kids. Like, standing up for their employees in that way, I guess, is fine. I do find the response to be contradictory, in the sense that it's like, on one hand, we did nothing wrong; on the other hand, we're very sorry and it will never happen again. It's pretty weird. Do you want to talk more about, like, what did go down at this city council?

Emanuel:

Like what was the vibe?

Jason:

Yeah. So the vibe was basically, like, at any city council meeting, or at many city council meetings, there's, like, time for public comment. And this city council meeting had the Flock contract on the agenda. Basically, the Flock contract was coming up for renewal when this all happened.

Jason:

And literally for almost three straight hours, like citizen after citizen came up, they had three minutes to talk and they said like, we don't want this technology in our town. Why are they spying on our kids? How could you let this happen? They were also saying like, this happened without, you know, us being consulted. Like, we didn't know this was happening.

Jason:

How can we take you seriously as our elected leaders? And, like, how can you basically work with a company like this? And, I mean, I found the city's response to be extremely underwhelming. Basically, the mayor was like, I was surprised to learn that this was happening. Flock has promised that it's never gonna happen again.

Jason:

They're not using Dunwoody as a test demo anymore, but, like, y'all are overreacting. This was done as, like, a sales demo. Flock is an important surveillance technology that's keeping us safe. And we're going to, like, modify the contract slightly and renew it and, like, move forward. It was clear from what the residents were saying that that's not what they were asking for.

Jason:

Like, they did not want these cosmetic changes to the contract. They didn't want these, like, sorry-not-sorry apologies from Flock. I think that at previous city council meetings, Flock employees were there and they were sort of saying, like, you guys are wrong about this. Like, you're mischaracterizing what actually happened. Here's how our technology works, so on and so forth.

Jason:

And I think what was very clear to me was that the residents actually do know quite a lot about the technology, and they know all of the sort of privacy and security screwups that Flock has had. They know that Flock data has been filtered up to ICE. They know that it's been used to track a woman who had an abortion. They know that there have been, like, various security lapses. And pretty much every time something like this happens, Flock will say, like, this is being blown out of proportion.

Jason:

This is being taken out of context. You know, these are, like, theoretical attacks. Like, when we did that article about Flock Condor cameras, which are the video cameras that are at issue here, being exposed to the open internet, they said, oh, well, these were, like, in a testing mode. They were only exposed for a few weeks.

Jason:

We've fixed it, basically. And that's sort of what they've been saying over and over again: like, yeah, this happened, but we fixed it. And, by the way, they basically say there's, like, a concerted woke effort to destroy Flock and you guys are overreacting. And I think that the citizens of Dunwoody really didn't like how it seemed like their concerns were being brushed off in this situation. And then basically, at the end of the city council meeting, the city council voted to renew the contract with Flock. Like, not even a slap on the wrist.

Jason:

Like, I don't even know if they have the ability to do a slap on the wrist, but it's like, yeah, we tweaked the contract slightly. They said it wouldn't happen again. We're going to keep working with them.

Emanuel:

And at the end of this city council meeting, assuming this contract actually changed, like, they hashed all this out, where do you think it lands? Like, what is the reaction to this now? Can you tell on social media how people are taking it? What's the aftermath?

Jason:

Yeah. I mean, I think the interesting thing is, and this is maybe not super heartening for the people of Dunwoody at the moment, but I think that, as I mentioned, Dunwoody has, like, a really intense relationship with Flock. They've had a contract with Flock for a long time. They see Flock as being this, like, local company that they've worked with for a long time.

Jason:

And it seems like the politicians there are very kind of in favor of Flock, both in Dunwoody but also in Georgia more broadly. Like, the Georgia secretary of state or the Georgia attorney general posted on LinkedIn, like, I'm so proud that Dunwoody renewed its contract. Like, basically, today law and order won. I mean, the post was insane. Let's see what it said, actually.

Jason:

It said, mayor, thanks to council and you for supporting the use of Flock technology. Georgia's constitution says that the government has one paramount duty, protection of person and property. I'm proud to say that Dunwoody's leadership lived up to their duty by continuing to partner with Flock. Flock's CEO also, like, emailed the Jewish community center at issue here and basically said, sorry, like, won't happen again. And then said, I look forward to protecting the Jewish community center and the city of Dunwoody for years to come.

Jason:

So they basically are like, we're gonna continue on with this. But there are a lot of cities and towns around the country that don't have as close of a relationship with Flock, that are looking at this, that are looking at other things that are happening, and are deciding to cancel their contracts. And I think that this will have, like, reverberations across other cities because it shows how intense and how vast the Flock network is, just in this one city, where it's not just Flock cameras but all these private cameras as well that can be accessed basically at any time for any reason. And I think that it will have an effect. And I think that in Dunwoody, it has created this movement of people who are against Flock.

Jason:

And some of the people who stood up to speak actually said this, like, we've organized really well to oppose this. Right now, we're opposing something very negative, but we hope in the future we can organize for something more positive. And I do wonder, like, if people will run for office in Dunwoody on an anti-surveillance agenda. It seems like that is ripe; people are very interested in that at the moment. And so I don't know what happens necessarily in Dunwoody, but I think that the tide is kind of turning against these surveillance technologies and that people are pushing back.

Jason:

And it's sort of like, in Dunwoody, they're starting from an extremely pro-surveillance government stance. But in places where there's already skepticism, there have been wins against this technology. And over time, I mean, we might see the same in places like Dunwoody.

Emanuel:

Yeah. I think if people are literally in the street protesting about Flock, it's easy to imagine it becoming an issue in, like, future local elections, and if you see that that's the case, please get in touch. If you're FOIA-ing your local government about Flock and you find something interesting, get in touch. If you're not, please subscribe to four zero four Media and watch our FOIA forums. We teach you how to do this.

Emanuel:

Jason, I believe you guys already did, like, a FOIA forum on Flock specifically, right?

Jason:

We did. And also, like, people are in the streets in Dunwoody. Like, I got some pictures and video of street protests, and it's, like, people are holding up "GTFO, get the Flock out" signs. They're holding up, like, posters that say "Flock out." Like, they're chanting. This is a pretty small city and, I mean, there were dozens of people there.

Jason:

Like, this is not something that I think is going away anytime soon, and I think, like, there is now organizing against it. And so I'm very curious to see sort of where it all goes. And yes, please get in touch if you do know more about Flock. My Signal is Jason dot four zero four.

Emanuel:

Shall we break there?

Jason:

Yeah. Okay. We're back. We're gonna talk about two stories, one by Emanuel and one by Sam. Both are about AI in education, which I feel like is a constant topic, very important.

Jason:

First, we're gonna talk about Nature, the massive journal, retracting a paper on the benefits of ChatGPT in education. The story is by Emanuel. I believe the paper came out in 2024? '25. 2025.

Jason:

Okay. What did the paper originally say? The one at issue here.

Emanuel:

Yeah. So the paper was published in a Nature journal called Humanities and Social Sciences Communications. Nature has a flagship publication called Nature, but it also has, like, a family of journals. This is one of them. And I guess I should say, if you don't know, Nature is, like, the gold standard for science publishing, a very prestigious journal.

Emanuel:

The title of the paper is The Effect of ChatGPT on Students' Learning Performance, Learning Perception, and Higher-Order Thinking: Insights from a Meta-Analysis. So the key term there is at the end: a meta-analysis is something very common in academic journals, a common type of study. Basically, the researchers are not in the field doing research or, like, collecting their own data. They are taking a bunch of studies on a similar topic and combining all the results from all those studies in order to provide kind of an overview, a meta-analysis, of what is happening. And in this case, it is two researchers from a university in China.

Emanuel:

They took 51 research studies that were published between November 2022 and February 2025 on the effectiveness of ChatGPT in education. The bottom line is that they found that ChatGPT had a positive impact, specifically, again, as the title says, on students' learning performance, learning perception, and higher-order thinking.

Jason:

Okay. So why was this retracted? Like, this does happen sometimes, but, you know, it's been a year since this was published and then suddenly, like, out of the blue, it's gone. So what happened?

Emanuel:

So we don't know exactly, but I talked to some experts, I did some of my own digging, and I read some other papers that talked about this paper and the methodology it uses, and about similar papers with similar results. And basically, this paper came out last year. It went very viral. It got a lot of attention because it made, like, a pretty definitive statement that ChatGPT is good for students.

Emanuel:

It's good in an education environment. That goes a little bit against even what we know as non-experts on this subject and what we found in our reporting. I did a story a few months ago about Alpha School, which is an AI-powered private school that's very expensive, and their results are generally good, but the story goes through some of the problems with AI-assisted education: AI makes errors, creates faulty lesson plans that frustrate students, etc. Not surprisingly, people in the education community, and specifically people in the community that is at the intersection of education and technology and education and AI, were very skeptical about this. They pushed back against the results.

Emanuel:

They made arguments about why the research is bad. We don't know exactly what the thinking was at Nature, but at some point a few weeks ago, they put up a retraction note saying that they had doubts about the methodology and the data, that they approached the researchers about this, and the researchers did not respond, and they decided to retract the study. Which means you can still read it, but it's kind of like a vote of no confidence in the findings. And if you download the PDF, I noticed, you can get the full article, but it has, like, a big "retracted" sign across every page, which is kind of interesting.

Jason:

Yeah. Yeah. So you talked to Ben Williamson, who's a senior lecturer in digital education at the University of Edinburgh. Edinburgh? Edinburgh.

Jason:

What did he say about this? And also, I think he's very smart here, but once he said it, I was like, oh, that makes a lot of sense. Like, what the fuck?

Emanuel:

Yeah. So like I said, Nature doesn't make it clear, but I think it's pretty clear what the issue is. There are a few issues, but, like, one that Ben pointed out is that, if you recall, I said it's a meta-analysis of studies that were published as early as 2022. And if you think back to 2022, I think that's, like, ChatGPT. Is that GPT-2, like, starting to make the rounds?

Emanuel:

Or it's like either two or three.

Jason:

I think I think 2022 is when ChatGPT came out.

Emanuel:

It's when it became popular. It's like, it's very early on in the whole generative AI madness.

Jason:

November 2022 is when ChatGPT was released.

Emanuel:

Right. And it's like, you're supposed to believe that someone as early as November 2022 noticed that people were using it in education, or did some testing of using ChatGPT in an education environment, properly collected the data and properly analyzed the data, and then was able to make some sort of credible claim about how it's impacting students, and not only that, but that it's good for them. And in general, that's just not enough time to do that. And that's one bucket of studies that are feeding into this meta-analysis. And then another problem is that the standard that the researchers used essentially seems to be that any study that was peer reviewed was fair game to include in the meta-analysis if it was about these subjects.

Emanuel:

And that sounds good. Like, peer review is the bare minimum for academic publishing, I think we all know that, but I think at this point a lot of us also know that the academic publishing industry is not perfect. It's kind of shady, actually, and there are a lot of paper mills and a lot of not very credible journals where you can pay to get published, or they just don't have a very good peer review process, and it just doesn't mean what you think. And the people who dug into the actual studies found that a lot of these were not reputable studies from reputable journals. And yeah, basically, it's like you have a meta-analysis of peer-reviewed papers, and that seems like a legitimate pool of data to look at, but when you dig in, actually, each individual study is not that credible.

Jason:

I think, just to put a little bit more context here: I did an article a few months ago about FOIAs I got back from different school districts around the country, because when ChatGPT was released, I FOIA-ed school districts for what they were saying about ChatGPT. And this was, like, mid-2023, something like that. And I did an article basically about how, like, no schools actually knew what this was. Like, there was always a teacher or two who was like, oh, ChatGPT, we should look into this.

Jason:

This is scary. We should, like, consider doing something about this. And almost none of them actually knew what was going on. So the idea, again, that there was all this very good science being done at this time about whether ChatGPT could help students defies belief. And so I think that is, like, important to set up.

Jason:

I guess lastly, like, what do you take away from all of this? It's kind of a mess. I feel like there have been a few different cases where there's, like, kind of a highly hyped study that says AI is good, and then it, like, makes the rounds. And then at some point in the future, that study is retracted or, like, wasn't quite right. And then it's unclear whether, like, the coverage of it being retracted, or of it being bunk science, actually has the same level of impact that the original study did.

Emanuel:

Yeah. It's funny because I think, like, at least half the story here is about the state of academic publishing. But, and I think this is probably two years ago at this point, generative AI is also really crushing academic publishing. Like, we wrote some stories about how the peer reviews seemed to be AI generated, you know? So it's like, they're really getting it from all ends.

Emanuel:

But I guess the other thing, and we can get into this a little bit when we talk about Sam's story, is it's like, believe your eyes and ears, and believe what students and teachers are telling us about this, and they're not loving it. And it's like, we didn't do an academic peer-reviewed study, unless you count our editing as a peer review process. But, like, Jason talked to a bunch of teachers about how this is going. It's not going well. Like, I don't think you would have been able to do that story and come to the conclusions you did if ChatGPT was incredibly helpful in an academic environment.

Jason:

Speaking of AI being incredibly helpful in an academic environment, let's talk about an article Sam did yesterday: OpenAI, Google, and Microsoft backed a bill to fund AI literacy in schools. That's exciting. So basically, like, the AI companies are pushing a bill that would do what?

Sam:

Quite the chaser situation, with these two stories. So this bill, which was introduced by Adam Schiff, who's a senator from California, and also Mike Rounds, who is a Republican senator from South Dakota. It's a big bipartisan bill, which is always such a delight to see. But the bill would essentially fund the NSF, the National Science Foundation, to allow schools to change the K-through-12 curriculum and encourage AI literacy. So building in "AI literacy," and this is in major air quotes, because we'll get into why that's bizarre and vague in the bill.

Sam:

But the bill would allow the director of the NSF to grant awards to support educational curricula that would advance AI literacy in schools. Yeah. I mean, it's just a layer cake of what-the-fuck, generally.

Jason:

Can I ask what AI literacy is? Do we know? Is it defined?

Sam:

It's defined in the bill. The bill as it's written right now is five pages long. One page of that is the title. I guess we should say the title of the bill. The title of the bill is the Literacy in Future Technologies Artificial Intelligence Act, the LIFT Act.

Sam:

I'm always in awe of their ability to make an acronym in these bills, but one page is the title, one whole page is definitions that are very vague, and then there's only a couple pages of actual policy here. But they define AI literacy as having the age appropriate knowledge and ability to use artificial intelligence effectively to critically interpret outputs, to solve problems in an AI enabled world, and to mitigate potential risks. So that's both short and stupid. Literacy by definition is not using a thing. That's onboarding.

Sam:

This is just teaching kids how to use ChatGPT by making them use ChatGPT or Gemini or any of these others that helped fund this bill. Which, let's see who backed the bill in total: it was Google, which obviously runs Gemini; OpenAI, which is ChatGPT; Microsoft. And then there were a couple others, like the American Federation of Teachers. I don't know a ton about them, but they did have this big push for a $23 million partnership with Microsoft, OpenAI, and Anthropic last year to build this big, like, AI training hub, which I guess is gonna be a physical space, like the Pokemon arena or something. Very scary to think about this.

Sam:

But they're sloshing money into AI with the help and the support of all these other AI companies, which is just

Jason:

The American Federation of Teachers is a massive, massive teachers union, like the second biggest. Yeah. And the president of it is Randi Weingarten, who has been this, like, figure in teachers unions for a long time. And I mean, I have no idea, like, what her relationship is here, but she's been, like, quite controversial in general. I feel like AI is probably not that popular with teachers based on what I've seen, but they do lobbying in addition to just being a union, and there's a long history of lobbying organizations kind of like selling out their members for whatever purpose.

Jason:

Like, the farmers lobby has sold out its farmers over and over and over again on right to repair and things like this. So it's disappointing, but not, like, terribly surprising that this massive teachers union is, like, pushing this.

Sam:

Yeah. And has been for a while. This is a big part of their agenda at this point. Another part of this bill that I think is important to note, and this is something that I didn't really realize until Matthew Gault, one of our contributors, pointed this out while I was writing the story. But last week, Trump fired every single board member that was responsible for guiding the NSF.

Sam:

And the NSF right now is without a director and has been for a while, for about a year. So right now, it's kinda like, who's gonna lead the NSF next? And the leading name, like the Trump pick at this point, is a guy named Jim O'Neill. And he is not a research guy, not a science guy. He just is a former employee of Peter Thiel and is of that world.

Sam:

And I think there's some speculation among the science community that Jim is nominated and is probably gonna be the NSF director because it'll be kind of like a hat tip or like a consolation to Peter Thiel, who Trump is very invested in staying good with. So that's all very cursed, because the person who is responsible for applying this funding, so even if we could imagine a world where this bill might fund actual literacy and actual learning about what AI is and what it does and what it can't do, the person responsible for handing out that money is this rich guy financier who's just, like, a Thiel buddy. And I can't imagine that going anywhere but worse.

Jason:

I mean, this is like the knock-on effects also of fully gutting the NSF and just, like, science research and medical research and all that sort of thing in The United States in general, because the NSF has, like, a very long history of researching and doing, like, very important things and also funding, you know, critical research and doing standards and all that sort of thing. And now, I mean, it's not without fault, but it's generally been, like, a very important organization. And now it is, like, a rubber stamp for industry, and, you know, it would, like, help these tech companies do what they want to do. On the interview podcast earlier this week with Brian Merchant, he raised the point that AI companies have been lobbying a lot more. I mean, they've been lobbying for a long time, especially because AI companies are, like, Microsoft and Facebook and Google, which have traditionally just been massive lobbyists in general.

Jason:

But OpenAI is increasingly doing lobbying, Anthropic is increasingly doing lobbying, like, different effective altruists are increasingly doing lobbying. And one thing that they've been doing is they have been pushing these bills that are extremely friendly to them, of course, like, that's what lobbyists do. But they've been pushing basically for legislation that would funnel money to and reduce regulations on the AI industry. And then they're also pushing for bills that restrict states from putting restrictions on AI and from, like, hampering them in any way. And so I think just the fact that there is, like, a lobby that is pushing this very controversial idea that AI is going to be good in schools, when, as Emanuel just said, and as our reporting has shown, and as lots of other reporting has shown, it's not clear whether AI is going to help in schools.

Jason:

And in fact, there's, like, a lot of evidence suggesting that it's not good. And here, you know, you specifically have companies pushing to integrate AI more deeply into schools. I guess the last thing, that I am curious what y'all think about, is this idea of AI literacy in general, as in, I think that's something that a lot of people have been kind of beefing about: whether you actually need to teach people how to do this stuff. Because, like, at the beginning of the Internet, there were, like, computing classes and typing classes and, like, how-to-use-the-Internet classes. But if this technology is going to be, like, so deeply integrated into our society and all that, it's like people are figuring out how to use it on their own, just because it's something that you'll encounter and need to know and all of that. And it's not like you need advanced skills to use ChatGPT or Claude or whatever.

Jason:

I feel like we've probably phased those out. Like, I don't think we have, like, a how-to-use-an-iPad class. It's like babies learn how to use an iPad naturally. I'm just curious sort of what you think of this idea of, like, we need classes to show how AI works.

Emanuel:

Did you guys do typing classes?

Jason:

Yeah. I I did. Yes.

Emanuel:

Yeah. Did you think that it helped?

Sam:

No. I can't type for shit. So no. I use these two fingers.

Jason:

It's funny because I also use two fingers and I type super fast, and I'm a very good typer. And I feel like I was cheating, you know, at, like, Mavis Beacon Teaches Typing. I also had Mario Teaches Typing. Shout out to that game. Very fun.

Jason:

And, like, the teacher would sit there and make me use home row and all my fingers, and I, like, couldn't do it so well. And then I would go back to, like, my two-and-a-half-finger, three-finger typing, and I would, like, crush it and be amazing. And so I feel like often when they're like, you have to do these following rules, here's how you do it, it's like, often you do it worse, because it's, like, antiquated, and kids will figure out how to use technology because they're, like, smart and they have good brains.

Emanuel:

I had private typing lessons, which is so funny to think about. My parents were like, you have to learn how to type really fast because it's, like, the future or something. And I went to this lady who, like, taught me typing, and it didn't work at all. Like, this is, like, I don't know, late nineties or something like that, mid nineties, and I just, like, wasn't using a computer a lot, and it didn't take. And then as an adult, when I had to use a computer, it's like I learned to type very fast, but it wasn't because of the class.

Emanuel:

I'm only saying this to, like, agree with Jason, where it's like, I don't know if you need to teach people how to use a technology that is made to be, like, a mass consumer product. Like, the people who make iPads or the people who make AI or the people who make phones make them in a way that assumes that they have to teach you how to use the thing, like, that's part of the whole effort, right? Like, an iPhone is, like, an intuitive device; that's why they make it the way that they do. So I don't think it's necessary. And then I would also note, just as a footnote, it seems like the worst time ever to push something like this, because it is also the time where schools are really warming up to the idea of phone bans and, like, getting computers out of school, and people are starting to question the whole Chromebook-for-every-child policy and all of that. And I think parents are digging it, and it does make sense to me. So I don't think this is going to go down as well as they assume.

Emanuel:

I think just when people are realizing that classrooms need less technology rather than more, trying to get ChatGPT in the hands of every child is like not a good pitch.

Sam:

No. I mean, these companies need kids to use it. The kids don't need it at all. The companies need the schools to do this. It's very classic market capture from a young age.

Sam:

It's like they'll get their favorite chatbot, and that's what they'll rely on until, you know, they're adults. That's the thinking, I assume. It's the same way with the Chromebooks. Apple immediately got into schools. As soon as they had a computer that they could put into schools, they did.

Sam:

And now you have, you know, people who are like, oh, yeah, I've been using an Apple computer since I was a child, and it's always what I've used. It's something that brands do all the time. It's also interesting; I think they're too late. I think, like you said, it's a weird time to be doing this because they're a little too late.

Sam:

I think maybe if there was less skepticism against these things at this point, maybe they would have an easier time of it, but I think people are already realizing just on their own that this is not something that kids should have access to or need in a learning environment. I think they're probably, like, a little bit behind the ball, but they're also desperate at this point. I feel like this is such a desperate move. To go the legislation route is always like, it didn't work in the free market of capitalism, so we're gonna buy a senator and make this happen on our own. It's just last-gasp vibes, but it works sometimes.

Emanuel:

To go back to the previous story, I think, like, the good research is going to come out, and I think it's going to be, like, abysmal for AI companies. Like, I can't think of anything that's more detrimental to, like, the development of a young brain than just having a machine do all your thinking for you. It's not good. And there's already some research that shows that kind of, like, cognitive atrophy and all this stuff that we've talked about previously.

Jason:

It's funny also just because, like, I got a push notification from the New York Times yesterday that was like, teachers are making students write essays in class by hand because of AI and that sort of thing. And it's like, I mean, that's something that we heard from teachers a year and a half ago, which is fine. Welcome, New York Times. I'm sure it's happening more and more now.

Jason:

And there's been a lot of really good articles about AI in schools from, like, New York Mag, the LA Times, the New York Times, etcetera. Like, it's become a very mainstream thing to talk about how AI is changing school and how teachers say students are not doing well, like, all this sort of thing. It's not something that's, like, flying under the radar in any way, shape, or form at this point. It's, like, kitchen table conversation in the biggest media outlets, etcetera. So I think that this is not something that, like, it's not good.

Jason:

It's not good for the AI companies, I don't think. And I think, to Sam's point also, I don't know if this is still the case, but every year for the last few years, there's been a spike in ChatGPT usage and sign-ups in, like, late August as school gets back in session. And then people unsubscribe over the summer. And I wonder if that's, like, holding true, but it's like these companies do need to capture these students, like, early on, I feel, and addict them. Okay.

Jason:

Let's leave that there. If you are listening to the free version of this podcast, that is the end of the podcast. But if you wanna hear us talk about RightsCon and the shit show that has occurred with the world's largest civil rights conference, you can get that in the subscribers-only section, which you can gain access to by going to 404media.co and subscribing. You'll get a top secret RSS link that is ad free and also includes these bonus segments every week.

Jason:

So you can find it there. Okay. We are back, and we're gonna talk about a story that Matt Gault wrote, but that I helped on, about RightsCon. There's two articles. One is called "World's Largest Digital Human Rights Conference Suddenly Canceled," and another is called "China Pressure Canceled World's Largest Digital Human Rights Conference."

Jason:

Are y'all like familiar with RightsCon? Have you ever been to RightsCon?

Emanuel:

I'm not familiar. I hadn't heard of it until this story, but I did learn that it was important and big.

Sam:

Yeah. Define it for us. I mean, I've heard of it. I know of it. I know about it.

Jason:

It's a conference about rights. It's a rights con, if you will. No. But it's thrown by Access Now, which is a civil liberties group that is similar-ish to, like, the Electronic Frontier Foundation, the ACLU, the Center for Democracy and Technology, etcetera. But it has a more international bent.

Jason:

And they deal with things like free speech online, social media censorship, and I use that as a, like, non-pejorative. Like, Turkey is shutting down Twitter, like, access to all of Twitter, for example. Like, things like that. They, like, push against that. They push against, like, dictators shutting down the internet in general. They do a lot of stuff about, like, journalists being surveilled and arrested.

Jason:

They also do a lot on, like, what good content moderation looks like and that sort of thing. And Access Now does, like, really good work, but RightsCon is their big thing. And RightsCon takes place every year. It is a massive, massive conference. Like, thousands and thousands of people go.

Jason:

Largely academics, civil rights experts, journalists, dissidents, like, that sort of thing. And one of the interesting things about RightsCon is it doesn't always, but it usually takes place in a country that would be considered, like, the Global South, or is, like, developing, or is politically important in some way. So, like, it's been in Tunisia. Last year, very notably, it was in Taipei, in Taiwan, which is super notable for this story we're about to talk about. And then this year, it was set to be in Zambia, which they were quite excited about based on their blog posts and that sort of thing, because it was, like, returning to Africa after a few years.

Jason:

They've done it in, like, South America, Central America. I don't know if they've done it, like, in The United States or Europe recently. I feel like they are largely doing it in these countries where, I mean, Internet freedom is under attack everywhere, but where there's, like, quite a big threat of, like, oh, the Internet as a whole is going to be shut down or something like this. And so it was set to be in Zambia. It was set to be taking place, like, right now.

Jason:

And I think it was, like, Wednesday of last week, there was a post by the Zambian government basically saying, like, RightsCon is canceled. Don't come. And this was, like, going out on Facebook, and, like, RightsCon was like, we don't really know what's going on. We haven't canceled it. And so there was this chaos for a few days where you had academics who were, like, actively flying to Zambia.

Jason:

Some of them were already there. Some of them were on the way. Others were gonna go, you know, in a few days. And no one was sure, like, what was happening, whether it was actually canceled. The Zambian government actually said it was postponed, which is kind of crazy, because you can't really postpone a conference four days before it's about to start when thousands of people are flying in.

Jason:

It's, like, a very logistically complicated conference. It's not something you can just, like, move back a week or two weeks or whatever. And so there was chaos for, like, a few days. And then on, I guess it was Wednesday, finally, RightsCon, or Access Now, was like, okay, this is canceled. And then a few days later, Access Now did a longer blog post sort of doing a postmortem of what happened.

Sam:

Cool.

Jason:

Yeah. Cool. No. It's bad.

Sam:

I mean, it's bad. It's really bad. I mean, I've been following this story kind of tangentially. I know a few people who are either regular RightsCon goers or were going this year, and it's a total clusterfuck. Do you wanna get into the past years?

Sam:

I feel like there's always something, and it is because of the places that they choose to be in, and I'm not saying they should choose to be in different places, but it kind of proves the point of why it's needed every time. So, yeah, I mean, what was going on in the previous years? It's never been fully canceled, has it? This is the first time that it's been.

Jason:

I don't think it's ever been fully canceled before. I think maybe they did it virtually over COVID, or maybe it was canceled, I would need to go back and check, but it hasn't been, like, fully canceled to my knowledge. So in 2023, they did it in Costa Rica. And they basically told the people who were coming there that we've worked it out with the Costa Rican government that you can get visa on arrival. And that included from countries where visa on arrival was not normally possible.

Jason:

Like, a lot of people were traveling from Zimbabwe, from Pakistan, like, just from places all over the world that may or may not have, like, an agreement with Costa Rica as to, you know, you can just, like, show up or whatever. And RightsCon told people that they could show up and get visa on arrival for this conference. And that turned out to not be the case. So, like, several hundred people showed up in Costa Rica, and they were turned away at the border. And this became, like, a huge problem, because a lot of them were set to be speakers, a lot of them were set to be panelists.

Jason:

And then it became a question of, like, well, why are we throwing this conference in this place that's not letting a lot of our attendees in? Like, that sort of thing. And it became, like, the talk of the conference. There was this big postmortem about, like, what went wrong and why. And then last year, the conference largely went off without a hitch, but it was in Taipei, in Taiwan.

Jason:

And it happened a few weeks after Trump took office. And some number of the people who, like, go to RightsCon were doing research on US government funded grants. And so a lot of them were either, like, unceremoniously fired by the US government or had their grants revoked or that sort of thing. And so this was not RightsCon's fault or Access Now's fault, but, like, a lot of the talk of that conference, as I understood it, was, like, our entire industry is now under attack by the Trump administration. A lot of the people who were getting funded to do human rights research, especially as regards, like, the harms of social media.

Jason:

Like, a lot of them are researching X and, like, Elon Musk's threats and, like, all that sort of thing. They were, like, suddenly either unable to attend, or, like, their funding was in limbo, like, that sort of thing. And so then that became a big deal last year. And then what came out on Friday, in Access Now's postmortem of what happened at RightsCon in Zimbabwe, or sorry, in Zambia, is that basically the Chinese government pressured Zambia to cancel the conference. And one of the reasons for that is that there were quite a lot of researchers from Taiwan who were set to speak at RightsCon in Zambia.

Jason:

And the Taiwan issue is, like, a major issue for the Chinese government. We don't need to get, like, deep into the ins and outs of it, because I will sound stupid, I'm sure. But basically, it's like, China believes that Taiwan is part of China. Taiwan believes that it is an independent country. Taiwan functions largely as an independent country in practice, but there is this policy of strategic ambiguity from The United States, at least, where it's like, basically, the US government won't answer the question as to whether Taiwan is an independent country or is part of China, and they sort of just, like, ignore the issue as best as possible.

Jason:

Like, while supporting Taiwan, generally, like, giving them military support and, you know, buying tons of chips from them and things like that. But it's like, they don't formally recognize Taiwan as an independent country. And a lot of countries around the world don't recognize Taiwan as an independent country, because China doesn't like when you do that. And China has the power to exert a lot of influence over those countries, especially in Sub-Saharan Africa, where China has been, like, embarking on this campaign of kind of, like, neocolonialism, where they do tons of funding of, like, public projects and things like that. And often in exchange, they will, like, get access to a mine, or they will, like, set up a factory there, or they will do, like, oil drilling there, or, like, that sort of thing.

Jason:

And so China has, like, a tight relationship with the Zambian government, saw that all these Taiwanese researchers were gonna go talk at RightsCon. I think probably the fact that RightsCon was in Taiwan last year pissed off China, one would expect. And so they basically told Zambia, like, you have to cancel this. I think Zambia then went to Access Now and said, like, well, we can perhaps continue with this if you're willing to disinvite all of the Taiwanese researchers, not talk about Taiwan, like, all that sort of thing. Like, try to exert censorship over the conference.

Jason:

And I think, given this, like, very bad option of canceling the conference or having it go on but, like, allowing censorship to occur and disinviting all these researchers who, you know, talk about internet freedom and talk about human rights for Taiwanese people and all that sort of thing, like, that was a bridge too far. And so they said, like, we're just gonna cancel it. And unfortunately, like, it's not proceeding even online. It sounds like it's not gonna happen this year at all. And I think that they're gonna try to do it again next year, one would presume, but it's not clear, like, where they're gonna do it, like, how they're gonna do it, like, what's gonna happen.

Jason:

And this is all happening at a time where that entire industry is, like, under immense threat and attack. Like, the US government has revoked the visas of social media researchers who, like, I mean, the administration's words, not mine, like, engage in censorship of social media. So basically, people who research content moderation and, you know, have published papers about, like, hate speech and things like that. You know, Elon Musk has sued, like, a bunch of people. The Southern Poverty Law Center, which has done a lot of this research, is, like, under attack and is getting defunded through this, like, horrible pressure campaign, where, like, you can no longer donate to them through, like, Fidelity or Vanguard or these massive institutions. So, like, this is happening.

Jason:

This attack on this, like, industry is happening all over the world, not just from, like, well, yes, from, like, dictatorships and places where they, like, shut the Internet off entirely, and from China, but also from The United States, increasingly from, like, corporations and all that sort of thing. So it's pretty scary, I think.

Emanuel:

I think it's also not what happened in this particular case, but I do think you're probably going to see more of this, because it's not just that China is very much invested in Africa and, like, not just building projects that bring in a lot of money, but I think they own a lot of the debt in this case, right? I think it's like they own a significant portion of the Zambian debt and therefore can give them better or worse terms on that, depending on their relationship. But while this is happening, you also have, like, the cuts to USAID, which is one way to, like, counter that influence, or it used to be. Like, this is one thing that people talk about when they talk about soft power. It's like, you have China making its investment and pushing its worldview, and you used to have The United States using organizations like USAID to promote a different view. And now that has been cut down significantly, and it's just like, why wouldn't you go with, like, China's preferences every single time if that's the only option?

Jason:

Yeah. I'm not up to date at all on US-Zambian relations, but one could imagine, under a previous administration and with, like, a functioning USAID, etcetera, that the US government steps in and says, like, no, like, this needs to continue, because it's important for freedom. It's important for, like, the spread of, you know, all that sort of thing, the liberal values that we care about as a democracy. And that's, like, definitely not happening anymore.

Jason:

Like, either there's no one to do that, or, like, the US government at this point doesn't want this sort of thing to proceed either. And so, yeah, I think this is perhaps, like, a knock-on effect of gutting USAID, of canceling all these grants, of this administration siding over and over and over again with social media companies and oligarchs and Silicon Valley, you know, billionaires, etcetera. And this is kind of like where we've ended up. Okay. On that very cheerful note, we will end the podcast.

Jason:

I believe Joe will be back next week to keep us on the rails. As a reminder, four zero four Media is a journalist founded company and is supported by subscribers. If you wish to subscribe to four zero four Media and directly support our work, please go to four zero four media dot c o. You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story every week.

Jason:

This podcast is produced by Alyssa Midcalf. Our theme music is by Alyssa as well. Another way to support us is by leaving a five star rating and review for the podcast. That really helps us out. I like hearing the five star reviews from Joseph.

Jason:

I don't read the negative ones. They hurt my feelings. Please don't leave them. Okay. We will be back next week.

Jason:

Goodbye.