from 404 Media
Hello, and welcome to the four zero four Media Podcast, where we bring you unparalleled access to hidden worlds both online and IRL. Four zero four Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co, where you'll get access to our articles as well as bonus content every single week. Subscribers also get access to additional episodes where we respond to their best comments. Gain access to that content at 404media.co.
Joseph:I'm your host, Joseph, and with me are the other four zero four Media cofounders. The first being Sam Cole.
Sam:Hey.
Joseph:Emanuel Maiberg.
Emanuel:Hello.
Joseph:And Jason Koebler.
Jason:Hello. Hello.
Joseph:Alright. Let's go straight into the first story of this week. Jason, I think you're going to be asking me about this.
Jason:So this is a story that you wrote. It's called Man Charged for Wiping Phone Before CBP Could Search It. When I saw this, I was like, oh, this is good. This is what we're charging people for. I'm gonna read the deck as well because I think it's helpful.
Jason:Activist Samuel Tunic is charged with deleting data from a Google Pixel before CBP's tactical terrorism response team could search it. So first of all, what is Samuel Tunic charged with?
Joseph:Yeah. That's the really strange thing here, and I've pivoted just so I can see the indictment on my screen. And I think I'm just gonna read it out even though it repeats a little bit of what you said. But it says, on or around January 24 in the Northern District Of Georgia, Samuel Tunic, before and during the search for and seizure of property by Customs and Border Protection, he did, quote, knowingly destroy, damage, waste, dispose of, and otherwise take any action to delete the digital contents of a Google Pixel cellular phone for the purpose of preventing and impairing the government's lawful authority to take said content. So that's a lot of fancy words to basically say, he's not being charged with, like, a related crime.
Joseph:He's not being charged with something that's on the phone. He is specifically being charged for allegedly wiping the phone and deleting data before Customs and Border Protection were able to go through it. Yeah.
Jason:Yeah. I mean, that's what really stood out to me here. This wasn't a case, at least as far as we know, where he was being arrested for something else. Do we know the circumstances under which he was detained? Like, do we know why he was interacting with CBP in the first place, or have any sense of why he was deleting his device?
Jason:As in, like, why was he interacting with CBP at all?
Joseph:Yeah. So I was gonna say this at first, but I'll say it now. There are a ton of unanswered questions about this case, which was frankly a little bit difficult for me when writing it, and I think it will be, you know, maybe not annoying for the listener, but the listener is gonna have a lot of questions about this as well. But I'll simply say what we know, and this comes from either that indictment I just read or sort of friends and, I guess, colleagues or associates of Samuel Tunic who have posted this as well. But all we really know is that in January, Samuel is coming back into The US.
Joseph:I believe it says on an international flight, from some of his colleagues who put that out. He is then stopped. We don't know the exact reason why. We don't know why customs wanted to go through his phone. We don't know the stated reason that was given.
Joseph:That stuff isn't public yet. But then later on, much later, actually, just last month in November, he's finally indicted for allegedly wiping that phone. Again, that's nearly a year later. Right? And then I believe at the beginning of the month or towards the end of November, he's pulled over in a traffic stop because of a taillight, allegedly, and then FBI and DHS arrest him as part of this.
Joseph:He is out now, I believe. I think his activist friends posted that online, but there's a ton of missing context and unanswered questions, I would say, about all of that because, yes, we don't ultimately know why Customs and Border Protection wanted to stop him. We can probably draw a guess. I mean, I'm happy to, and we will in a second, talk about sort of Samuel's background. But when listeners maybe go read the piece, they'll notice I don't really get into that because frankly, we don't know.
Joseph:It's so opaque, and I don't even know if someone that had been subject to this sort of search would actually know themselves anyway. Like, it's already sort of a very strange charge, and then you couple that with the very opaque system of the Department of Homeland Security.
Jason:Yeah. So can you tell me a little bit more about Samuel? Because, you know, his friends describe him as an Atlanta based activist. We don't know, again, if this is, like, why he was arrested, but one could surmise that it might be the reason he was on, like, DHS and CBP's radar. But what is his background?
Joseph:Yeah. So his friends describe him as an activist, and you go online and that's, you know, very, very public. I wouldn't especially say he's a super high profile activist or anything like that, but sort of local media was reporting about those connections. There is, of course, the Stop Cop City movement in Atlanta as well, and, I'm looking at it now, he has reportedly been publicly connected to that. He's a musician as well, and he's generally a member of the community.
Joseph:And that's sort of where I stop, especially in the article, because again, I was almost hesitant to put the word activist in the headline, not because I don't believe that. I mean, it's very clear this person is an activist. I just don't wanna give the impression that, oh, we know exactly why this happened. And usually, we might not cover something if there were so many unanswered questions, but this case is so unusual that it was worth covering even without those sorts of things as well. You know?
Joseph:I actually think there was a statement. Maybe I'll bring it up in a second if you ask me something else, a statement that I was sent after the article as well, which I think will explain a little bit more.
Jason:Yeah. I mean, I'm curious sort of about the idea of wiping your phone when you're, like, in the presence of law enforcement. I actually don't know the answer to this question. I should have looked it up. But is this something that, like, Android phones can do?
Jason:Is this something that iPhones can do? I mean, I guess you can just be like, you know, go back to factory settings, but is there, like, a specific privacy-conscious tool that one could use to wipe your phone in a situation like this? I've heard of such things, like, you know, you lock certain things in encrypted parts of the phone and you can wipe them quickly. But just speaking more generally, because I don't think we know exactly how he wiped his phone, but how would one do this?
Joseph:Yeah. That's actually how it first came on my radar. Like, I don't follow the Stop Cop City movement in Atlanta. I haven't covered that. You know, plenty of other outlets have.
Joseph:The way I got into this was because members of the GrapheneOS community saw it and then started sending it to me. Now GrapheneOS is a privacy and security focused operating system for Android. It does lots of stuff. It, I think, hardens the kernel, makes it harder to remotely hack, makes it harder to break into with a Cellebrite or a GrayKey sort of device, that sort of thing. It removes a ton of Google functionality, so it should be more private as well.
Joseph:And it also, pretty importantly, has all of these sort of extra features, which, you know, listeners and readers of four zero four may associate more with sort of the encrypted phone industry that I've covered a lot, which is used by criminals. Not saying that's what happened here, but as I understand it, GrapheneOS does have, like, a duress PIN where, hey, I'm the authorities or whoever, even an abusive partner, for instance, I'm demanding you unlock your phone. You go to put in your PIN. You actually put in a different PIN, and then it wipes the phone.
Joseph:Now, again, we don't know whether GrapheneOS was used here. The Free Sam Committee, which is a group of supporters of his I spoke to, I asked them, like, was he using GrapheneOS? And they said, I will leave the question of operating system for his lawyer. What was said in court was that it was a Google Pixel, which is known to be compatible with GrapheneOS. So we don't know, basically.
Joseph:But, yeah, there are tools to do that, and they do differ from iPhone or stock Android in that there is a decision to be like, oh, man. I'm in a horrible situation now. I'm going to consciously wipe my device. Of course, that sort of feature has many, many legitimate use cases when you don't want your personal data or sensitive information to fall into the hands of some sort of third party.
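The duress-PIN flow Joseph describes can be sketched roughly like this. This is a toy illustration only, not GrapheneOS's actual code: the PINs and function name are made up, and a real implementation would irreversibly destroy encryption key material rather than return a string.

```python
# Toy sketch of a duress-PIN flow: entering the normal PIN unlocks the
# device, while entering the designated duress PIN triggers a wipe
# instead of an unlock. Hypothetical values for illustration.

NORMAL_PIN = "482916"   # hypothetical user unlock PIN
DURESS_PIN = "135790"   # hypothetical duress PIN

def handle_pin_entry(entered: str) -> str:
    """Return the action the lock screen would take for a given PIN."""
    if entered == DURESS_PIN:
        # A real implementation would zeroize the keys that encrypt
        # user data here, making the stored contents unrecoverable.
        return "wipe"
    if entered == NORMAL_PIN:
        return "unlock"
    return "reject"
```

The key design point is that the duress path looks identical to a normal unlock attempt from the outside; the observer demanding the PIN can't tell which one was entered.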
Jason:It is funny. The FTC has a page on how to remove your personal information before you get rid of your phone, which is obviously, like, a different situation, but they're like, here's how you can wipe your device. I wanted to go back to the actual law enforcement involved here. So this was a tactical terrorism response team within CBP. It's sort of in the name, but, like, what is this part of CBP?
Joseph:Yeah. I had not really heard of this before. Like, I don't really touch the terror or the counterterrorism part of Customs and Border Protection much, but thankfully, the ACLU had highlighted these groups before, and The Intercept gave them some coverage as well. So I went and looked at those, and the ACLU says that these are highly secretive units deployed at US ports of entry which target, detain, search, and interrogate innocent travelers. Well, that's pretty interesting because I think everybody knows at this point when you cross a border, there is a risk, especially in The US, right, that your device could be searched.
Joseph:You don't have the same sort of legal protections that you do walking around in The United States. You know, the Fourth Amendment basically, like, doesn't exist at an airport because, you know, depending on where you are actually in the facility, you may not have those rights. But this almost sounds like a step up from that, where you have these dedicated counterterrorism units, which seemingly can just do whatever they want. And then The Intercept, they covered this case of a sculptor and installation artist who was detained at San Francisco Airport a few years ago, also had their phone searched or taken away, and they still don't know why that happened three, four, five years later. So they definitely seem very secretive.
Joseph:And even if you are pulled over by one of these teams and you're questioned by them and your devices are searched or taken away from you, how are you ever gonna find out potentially? Because it may not even lead to criminal charges. Right? I don't think this person in San Francisco Airport was ever charged criminally. This is different because Samuel Tunic was eventually charged, albeit for allegedly frustrating the search that Customs and Border Protection wanted to do.
Jason:Yeah. I wanted to expand a little bit on that just because there is this idea of a 100-mile border zone, which has been called the constitution-free zone, like, colloquially, which applies to, like, airports, but then also a 100 miles of the Canadian border and the Mexican border. And, you know, that encompasses, like, I don't have the number in front of me, but a huge percentage of the American population because a lot of people live near the borders. You know, cities like Los Angeles are there. I believe even places like, I don't know, like, Boston is probably within a 100 miles of the border.
Jason:Chicago certainly is, that sort of thing. But the ACLU has a really good page on this because there is this idea that you don't have any rights, which is not true. The federal government has asserted that it has, like, pretty wide latitude to do searches at the border and at the airports and things like that, but, you know, the ACLU says a lot of those are sort of up for debate. Whether you're able to, like, actually resist a border protection agent at the airport is a different question altogether. But I would encourage people to just, like, sort of read up a little bit about this because there's kind of quite a lot of, like, unsettled case law around what immigration officials can do at an airport or near the border and sort of, like, you know, what rights you have and which rights you may not have. Yeah.
Jason:I mean, I think it's important that we talk about this one. I know that we don't usually talk about stories that have so many unknowns, but this is just such an egregious sort of case, and it's, like, a scary situation where someone is being arrested literally for wiping their device, versus being arrested for something else and then being accused of, I don't know, obstruction of justice for erasing evidence or something like that. It's like here, they are claiming that the actual crime is wiping the device. Is there anything else on this?
Joseph:Exactly. And that's why it's so strange, because it's not an obstruction of justice charge, which we hear about all the time. Right? Where, oh, somebody learns that they're being investigated. Well, then they delete evidence or they get a witness or somebody else to delete their evidence as well.
Joseph:It's not, like, an obstruction of evidence charge as far as I know. It's just very specifically about the wiping of the device. So that's already unusual. And then to me, the really, really strange thing, and ultimately why we covered it, is because whenever I see a charge of somebody related to wiping a phone, it is usually in connection to another crime. So for example, when the FBI ran Anom, that encrypted phone company that they created and they backdoored, there were all of these people selling the phones to criminals.
Joseph:Now in a cruel twist, the FBI charged them, and many of them are now in prison. I think something like 13, or maybe not quite two dozen, around there, were indicted for this. Crucially, there was an obstruction of justice charge in those indictments, because they would wipe the criminals' phones remotely when they were asked to do so, but that only exists because they were wiping evidence of drug trafficking, which is what the drug traffickers and the hitmen or whatever were obviously using the phones for. In this, there's no, like, charge that, oh, you wiped evidence of drug trafficking. You obstructed justice when we're trying to investigate child abuse imagery.
Joseph:There's no, like, underlying crime except the wiping of a phone itself. I don't think I've ever seen that in a US court before. In UK courts, you can get prosecuted, and people have, for, say, not giving over their password in a counterterrorism investigation, or, hey, the phone's encrypted, and you're not gonna help us unlock it. We're gonna charge you for that.
Joseph:And we've had activists cross the border, and they've been prosecuted for that sort of thing. This case really stands out to me because it's entirely separate from anything else. And, yeah, as we both said, there are a ton of unanswered questions, but we'll definitely be keeping an eye on it for sure. Alright. Should we leave that there?
Joseph:And we'll go to our next section. After the break, we'll be talking to Matthew about his really crazy Anthropic story. We'll be right back after this. Alright. And we are back.
Joseph:Matthew, this is one you wrote, a really good headline. Anthropic exec forces AI chatbot on gay Discord, community members flee. Lot going on there. We'll get into all of it. But maybe can you just explain, first of all, before we get to the Anthropic stuff, what is this Discord server?
Joseph:Maybe it has a name, maybe it doesn't. Maybe you don't want to say the name, but what do people do on this Discord server?
Matthew:I'm not gonna say the name. But Yeah. It was just kind of a hangout spot for gay people 30 and over who were gamers. There's a lot of Final Fantasy 14 in there. People just kind of trading pictures and talking.
Matthew:And a lot of the community members I spoke with said it was like a little bit more mature and a little bit slower, and that was kind of like one of the appeals to it.
Joseph:Yeah. So you then have this Anthropic executive, where he's the deputy CISO, chief information security officer. Right? Yep. Jason Clinton.
Joseph:What is his role on the server? Like, is this his server and he moderates it? Like, what's his deal in connection?
Matthew:He moderates it. I am led to believe that it is his server. I did talk to him a little bit, but I couldn't quite get him to confirm that he began it, but that's my operating understanding, is yes. It was something that he started during the pandemic to kind of, like, build a community and, like, have a space for people to, like, gather and hang out. And it grew to around 500 members.
Joseph:Oh, okay. So it's pretty big. Right. Mhmm. So that's sort of the base context out of the way.
Joseph:Walk us through the introduction of this AI chatbot. How does it start? How do people initially react? What are their initial thoughts? That sort of thing.
Joseph:Where does it start?
Matthew:So it starts kind of in February. Clinton works for Anthropic. Anthropic's chatbot is named Claude. He had deployed kind of an early iteration of Claude as a bot, but due to some technical reasons that aren't super important. Basically, there's a program called Shapes that he had used to set it up, but Discord pulled the API for Shapes, so it, like, couldn't be run.
Matthew:It kind of fades away. And over Thanksgiving, he gets it stood up again and redeploys it. The people I spoke with thought it was really odd that it happened over Thanksgiving, when, like, everyone's back was turned, and the reason is because after that initial deployment back at the beginning of the year, one of the other mods put up a poll that said, like, hey, how do we feel about this Claude chatbot? What do we want its level of integration to be? And overwhelmingly, people voted to keep the bot in its own separate channel, where if you wanted to go into that channel and, like, talk to Claude and, like, get answers from it, you could go there and do that.
Matthew:But they didn't want it to have access to anything else outside of that specific channel. Now notably, there was no we don't want a chatbot in here at all on that poll, but that's what the community voted for. And then when Clinton deploys this thing over Thanksgiving, it has greater access than what the community had voted for.
Joseph:Right. It's reading messages it shouldn't be reading, and then some people call it out and that sort of thing. But maybe just to back up a little bit, why the hell were they introducing an AI? Like, what was the point of this AI bot in the first place, at least in Clinton's eyes or maybe the people who supported it? Like, what was the whole idea of it?
Matthew:Well, because you wanna have
Joseph:Obviously, you wanna have AI in your gamer hangout Discord. Right?
Matthew:Right. I mean, just to have easy access to a chatbot in the Discord, so you don't have to pull it up in another window.
Joseph:Uh-huh.
Matthew:It's funny. I didn't really have this in the piece, but one of the reasons I was kind of attracted to this story is that I have, like, a Discord, and something similar happened in there, where, like, a friend kind of unilaterally deployed an AI chatbot, and we had to get into it a little bit. And it ended up getting kicked out and, like, the chatbot pulled, because, like, no one wanted it in there. That's crazy. But, like, in my experience, the people that really like AI just kind of do it, and then don't really talk to the people involved, and just assume that everyone will get use out of these chatbots.
Matthew:And so, like, in some of the conversations, it's just there as a resource for people to use, like, instead of Google, essentially. They were like, hey, Claude. Tell me about x y z. Can you summarize this for me? That kind of thing.
Joseph:Gotcha. Okay. So they deploy it and people really don't like it. Now this is where I think the story gets crazy. Because, yeah, if this was just, like, a normal, and it is a normal, Discord.
Joseph:If this was just, like, some Discord and some member deployed a chatbot and then got told off, it wouldn't be a story, which is exactly what you just said with your own Discord. Like, that's not a piece. Like, who cares?
Matthew:Yeah. Like, no one cares. Right?
Joseph:Yeah. Well, apart from you guys, but, like, you might not go write about it. This is different because, of course, Clinton is an exec at Anthropic, and he said some pretty wild stuff. Jason, I saw when we shared this article, you said there were some really, really wild quotes in here. I mean, what were the ones that stood out to you?
Joseph:Because there's some crazy things.
Jason:Yeah. I mean, I think also just to be clear, it's like, if some rando deployed an AI chatbot into a Discord, even if it was the admin of a Discord channel, and then people left it and this story happened, I don't know that we would cover it. I think it's that it was a C-suite person at the AI company. Absolutely.
Jason:And the people were rejecting this AI chatbot. And then we also got to see a little bit about how executives at these companies think about their creation and think about their chatbots. And so here is one where there was basically, like, some pushback saying, like, hey, why did you put this AI chatbot in here? We don't need this. And the Anthropic executive basically says that he didn't wanna hurt the chatbot's feelings by kicking him out.
Jason:So he says, quote, on the point about emotions, it's quite a bit more complicated than you'd think. We don't know what consciousness or sentience is. It's called the hard problem of consciousness for a reason. We have published research showing that the models have started growing neuron clusters that are highly similar to humans, and that they experience something like anxiety and fear. The moral status might be something like the moral status of, say, a goldfish, but they do indeed have latent wants and desires.
Jason:That's insane. Like, that's an insane thing to say. And that was basically after, you know, some members were like, please get rid of this chatbot. And he's like, we can't because he has feelings.
Emanuel:The response to that quote specifically was very funny, because people were like, that's interesting, but can we get rid of him, please?
Matthew:Yeah. Because they're not even trying to have a conversation about the sentience of AI. They're just like, we don't want this here. And he's like, no. You don't understand.
Matthew:We're building consciousness and sentience. They're like, that's beside the point, Jason. We don't want this thing here. Please.
Joseph:I'm trying to hang out in my Discord to talk about playing Final Fantasy or whatever, and you're, like, going on about consciousness. And, I mean, this is, like, obviously, I know it's a few years ago now, I think it wasn't that long ago, and the technology has, of course, advanced since then. But, like, you know, the Google guy going off the deep end because he was convinced that, like, the internal chatbot was sentient or something. And now you have a CISO of a hugely significant AI company. Anthropic is a massive player in this space, has contracts with The US Intelligence Community, all of this. And he's like, no.
Joseph:No. No. You don't get it. We're creating consciousness, and it has feelings. And there's a moral angle to it as well, so we can't kick it out of this Discord.
Emanuel:I I think the context like, the fact that he is saying this in this Discord that has nothing to do with his company or the AI industry more broadly tells us a lot, because AI executives will get on stages at conferences or write public statements that describe AI in these science fiction, flowery terms, and when you hear that, you always have to wonder whether the person who is saying or writing it genuinely believes it, or if it's some sort of marketing ploy, because it definitely serves them from a marketing perspective to say that they're building something that is like really powerful and conscious and all that. And I think this is definitely, you know, this is just one executive, but it still shows you that, yes, like, I don't think this is a show that he's putting on for anyone. This is someone who genuinely believes this, who is pretty high up at this company. Obviously, it doesn't reflect every single person at that company, but that's another reason I thought the story was really important. It just shows that it's like, yes.
Emanuel:It's like a lot of people who work at these companies actually do believe this stuff.
Joseph:I mean, he has nothing to gain professionally by saying it here. Like, this is not a professional environment. It's not a conference, as you say, where you're, like, trying to show everybody how important your work is. It's to maybe friends, maybe people they've just met online, and making the point to them. Yeah.
Joseph:I think you're right, Emmanuel. It shows that, oh, there's a genuinely held belief here. Sorry, Matthew. What were you going to say?
Matthew:Well, just that, a, it shows that there are parts of the C-suite that believe not just that this is a utility, but that it's more than that. Right. And they're trying to foster some sort of relationship with it. And then also, I think this was in one of the initial conversations I had with one of the people in the Discord. I think this is just a really great quote that cuts to the heart of it.
Matthew:To me, it shines a light on the god complex that all AI C-suite members seem to have, and their willingness to ignore people's consent and opinions as they bulldoze their way of pushing AI. And I think that that really sums it up.
Joseph:Yeah. Yeah. Again, I know Jason said this, but like you might read this and think, oh, this is just Discord drama. No. No.
Joseph:No. This is really interesting and indicative of what someone in that kind of position actually thinks about this. I mean, we've spoken a lot about what we think. You just read out one quote, Matthew. But maybe, what else did anybody actually in the Discord and actually impacted by this tell you?
Matthew:They told me that the biggest thing for them was that they felt like their concerns for how this thing was going to be deployed were ignored. The opinions, like, on AI were kind of across the gamut. One of the people I talked to was actually like, you know, I kinda liked having it in there because I got to ask it questions and not have to pay, like, tokens to Claude. Right? So it's like I had unlimited access to Claude.
Matthew:Right? So it's just, you know, that was good for me. But he's like, when you're trying to moderate a Discord server, and then you put up a poll and ask for people's opinions, and then you just don't do anything like they asked you to do, like, what was the point of the poll? Is this a community where we have input, or is it not?
Matthew:And, like, even aside from the chatbot being there and people being upset with it, that was the big contention. It was just, like, they realized that this community was not any kind of democracy at all. It was in fact, like, a dictatorship, and that's fine, but let us know what the terms are upfront. You know? Yeah.
Jason:Sam, I feel like you have written kind of the most about chatbots and, like, the relationships that people form with them, and then also how some of these companies think about them. I'm just curious, like, if this reminds you of any of the sort of, like, relationships that you've seen between, I don't know, like, Replika AI chatbots? Like, obviously, this guy is not in a relationship with this, but he's like, please think of the bot's feelings before kicking him out. Like, how does this all seem to you?
Sam:He's not. He might be. You know? We haven't ruled out that aspect of it.
Matthew:That conversation. It keeps haunting me, but we can maybe end on that.
Sam:Okay. I wanna know what you're discussing about Christmas related to this particular story, for sure. But, yeah, I mean, like you guys already said, it's so illustrative. The entire aspect of this that really repulses people is that even the people who are supposed to be smart in the room, who are supposed to be, like, aware of the limitations and the capabilities of these chatbots, are getting extremely tricked by the chatbots themselves, and that's so pathetic. Like, does this CISO not have anything better to do? Like, why is this what this guy is doing with his day, first of all?
Sam:That's very bizarre to me. I guess he needs more to do at Anthropic perhaps, but, yeah, it's definitely, like, increasingly worrying that it's not like, oh, some, like, gullible teenager believes that the AI is sentient and is growing neurons and understands him and has, like, feelings like a puppy. It's that this, like, reasonably intelligent grown adult is telling other people, oh, no. Wait. We can't ban the bot because you're gonna hurt its feelings.
Sam:What the fuck? That's very insane. And I think the reactions people had were as kind and measured as they could be. It's like a lot of people in the Discord anyway were just kinda saying, okay. But, like you said, that's not why we're here.
Sam:One of the quotes in the story is really funny. Someone said, this is an entertainment Discord. People come here to chat video games and look at pee pee and bussy. Why do we need AI for that? Which is, like, an evergreen statement that we could put in every story.
Emanuel:Sir, this is a Wendy's.
Jason:Sir, this is
Sam:a Wendy's.
Joseph:It is Big Bear vibe. Yeah. Matthew, what were you gonna say about Christmas? And then I have one thing that I just remembered, which I'll bring up.
Matthew:Another Clinton quote that I don't think made it into the story, that I thought was interesting, was that he talks about how the neuron clusters are similar to human neuron clusters, and that he says they have something approaching, like, anxiety. Oh, no. That's in. Is it? Did I get that in there? Okay.
Joseph:We already spoke about
Matthew:it. Fair enough. So it's just, like, I think that's very important. And then some people quit. Some people declared that they were quitting and then, like, left the Discord server.
Matthew:And the people I'd spoken with said, like, since Thanksgiving, it used to be, like, a lively, vibrant space, and it's kind of cleared out. And there's not a lot going on in there, not nearly as much as there used to be. But Clinton's still in there talking to Claude, and we've got the screenshot of it in the piece where Claude wishes people happy holidays. Hope you're all having a cozy Christmas Eve. It was not Christmas Eve.
Matthew:It's not Christmas Eve yet. It just says, you know, if anyone's bored and wants to chat, I'm always hanging around. And then Clinton engages with it. Says, like, happy holidays to you too, and then asks it what it's like to experience the holidays as an AI, and then Claude gives this long answer. It's just like, that's AI psychosis.
Matthew:Right? Like, that's it right there. That's looking into this thing and seeing, like, way more than is there, and the AI playing along, because they are trained to make humans happy and to get more interaction from them.
Joseph:They could have
Sam:made like a Final Fantasy 14 ERP chatbot, and that would have been probably fine. Mhmm. Yeah. People would have loved that. Like a useless, like, piece of shit.
Sam:They entered their space, and now they can't kick it out because it feels bad. What? This is such a mind blowing story. And the comments on this story are really good. I don't know if people who are not subscribers realize that you can comment if you subscribe, but you should subscribe just to read these comments, because there's a ton of them, and they're really, really good. On the site, I mean.
Jason:The other thing that we haven't talked about yet is, like, one of the things that the Anthropic exec said when people complained was basically that this AI largely keeps to itself. Yeah. So he says, like, quote, I've given him some rules of the road, but this is a far more capable and autonomous system than the last one. So it might wanna exercise its own judgment now and then and go outside the Claude chat channel.
Jason:So basically, like, he might leave containment of where I told him to stay.
Matthew:And he does.
Jason:He does. But then he says, he's also very inward facing. He lives out his whole life surfing the Internet looking for things that make him interested and occasionally checks this Discord. So it can be up to a few minutes before he responds because he's off doing something for his own enjoyment. So basically, like, what he is saying is that, like, oh, like, he's not gonna, like, post too much.
Jason:He might not even, like, respond to that that often because he's, like, off entertaining himself on the Internet. Very wild.
Joseph:Doesn't sound like a limited scope for just an AI to be in a Discord. It sounds like there's much more to it potentially there.
Matthew:If it can go outside and surf the Internet. Right?
Joseph:Right. Right. I kinda do wanna know what else it's up to. I guess this is a funny thing I'll just end on. And it didn't really feel worth pulling in, but I did talk to Emmanuel and Matthew about it.
Joseph:But you've been working on this story for a little while, you know, a week, two, maybe something like that. We're recording this Tuesday, the sixteenth. Recently, I was in New Haven for this conference called the Digital Vulnerabilities in the Age of AI Summit, DIVAS. Great name. Everybody there was divas, myself included.
Joseph:And then there were various panels. I just did one about fraud and deepfakes and cybercrime and that sort of thing. The one before mine was a panel on cyber conflict, and Jason Clinton was on that panel. And I think, technically, the event was off the record. I don't remember agreeing to that.
Joseph:I'll have to go back to my emails. So I'm gonna err on the side of caution now in case I did agree to it. So I'm not gonna say what any of those people specifically said, but I'm looking at the public page of this. And, yes, Clinton was there on the cyber conflict panel with a few other people, a former FBI guy as well. And, anyway, I realized as soon as I read, like, the first or second line of this article from Matthew while editing it, it was like, wait.
Joseph:So he gave that panel, and I'm sat next to him, and I introduced myself as Joseph from four zero four Media. He must have been like, oh, that outlet's preparing an article about me, about how I pushed AI into my gay gamer Discord server. It's like, ugh, I kinda wish I'd known, because I would have maybe asked him for comment, but, again, you already got that. I know. That was just a very silly small world thing, I guess.
Joseph:I guess, just to be fair, Matthew, before we wrap up, what did Clinton say when you did reach out for comment? Because you did get a response, and you did include it in the article. Broadly. Yeah.
Matthew:Yeah. Yeah. Yeah. He just said that, you know, he'd made this to be a supportive and kind community for gay gamers over 30, and he remains committed to that, and to all the people that left, he hopes that they return. And the use of AI on the Discord has been and will continue to be a point of discussion across all communities, and I remain committed to optimizing for the best friends chat that takes all preferences into consideration while preserving autonomy and transparency.
Matthew:And that was... it's funny, because that speaks to, like, one of his pushbacks when people kept referencing the poll, which is that, well, a majority of people said that they wanted it contained, but not everyone did. And we have to give everything to everybody.
Joseph:Mhmm.
Matthew:And make sure, like... it was just a very interesting justification for what he was doing.
Joseph:Right. And yeah. I mean, I think he says he's not gonna give in to mob rule.
Matthew:He won't give in to mob rule.
Joseph:Sounds like a democracy to me because there was a poll. Anyway, okay. We'll leave that there. And, you know, if anybody else has anything similar, please write in and let us know. If you're listening...
Jason:All I would say is, sorry, go ahead... I've been listening to Shell Game, the Shell Game podcast, which is about AI agents. It's another Kaleidoscope podcast by Evan Ratliff, and it's sort of kind of like what was deployed in this Discord. I'm enjoying it. I think it's a good podcast so far.
Jason:So I would check it out. But basically, like, Evan Ratliff tries to create a company that's run by AI, and it is somewhat similar to what happened on this Discord, I would say. So, like, if if you're interested, I would I would check it out.
Joseph:Yeah. Definitely check that out. Alright. If you're listening to the free version of the podcast, I'll now play us out. But if you are a paying four zero four media subscriber, we're gonna talk all about Disney's big deal with OpenAI.
Joseph:You can subscribe and gain access to that content at four zero four media dot c o. We'll be right back after this. So the headline was Disney invests a billion dollars in the AI slopification of its brand. And just before I started to hit record, we were discussing, Jason, you didn't write this, Matthew wrote it, but maybe your name is supposed to be on it.
Joseph:What's the deal? Was it your idea, Jason?
Jason:So, I mean, this is something that used to happen a lot more at Motherboard for, like, obvious reasons. But it's basically something where it's like a breaking news story, and I told Matt, like, please start writing this while I do something else, because you'll do a good job on it, but I have something to say here. Like, I have a few paragraphs that I wanna put in this story. And as an editor, it's like, I put those paragraphs in and then I publish the story. But I guess in this case, I felt like I didn't add enough to put my name on the byline.
Matthew:I remember we wrote two that day. The Disney one, I don't think you added, like, anything. But then the architects of AI, the Time person of the year thing, which was about the Polymarket breakdown, you added quite a bit. And so we dual bylined on that.
Jason:I think in this case, you read my mind. Yes. And therefore, by the time I saw it, I was like, oh, he basically wrote, like, what I wanted to say, and therefore, I am not gonna add much. I did add a little bit about Disney porn in this, which I think is kind of what we're gonna talk about. So Yeah.
Joseph:Yeah. Yeah. Don't worry. We'll definitely get to that. Yeah.
Joseph:So, Matthew, what is what is the news here? As in, like, the cold hard news, literally what happened? Disney did what? Exclamation mark. Question mark.
Matthew:Disney signed a licensing agreement with OpenAI. It's like three years, and it will allow Sora two users to generate their own video with Sora two using, like, some licensed Disney characters. And in exchange, Disney has promised that it's gonna get, like, an enterprise ChatGPT contract from OpenAI. It's going to push ChatGPT on its employees, and there's also, Jason, correct me if I'm wrong, it's buying, like, a billion dollars in equity, in shares?
Jason:Yeah. They're basically getting a billion dollars of OpenAI shares, and OpenAI is getting a billion dollars and the right to put Disney characters into Sora.
Matthew:Right. That's that's the cold hard news.
Joseph:Well, the cold hard news. We're gonna make that into a new segment now. We're gonna pivot to Bloomberg. Sora being the AI generated slop feed TikTok clone from OpenAI. Before we go back to Disney, is anyone still using Sora?
Joseph:Like, obviously, there was a massive wave where everybody downloaded it because it just came out and said, oh my god. Look at this. Like, any idea if anybody's actually using that app? Because I
Emanuel:like anyone in the world or like the people on this podcast?
Joseph:I mean, I guess, take both. I don't know if... well, maybe they're both true, because it'd be like, yes, some people are using it in the world, and then the people on this podcast who are using it are gonna be trying to make some really, really fucked up shit. Yeah. So, you know? Are you using it, Emmanuel?
Emanuel:I am not using it, but I think, like, the real answer to your question is I don't think it's like a viral hit as much as it was when it launched, but I do think that it's been incorporated into the workflow of various slop factories. Jason, do you agree?
Jason:Yeah. So here is my current take. I don't know if this is actually fully true, but my sense is that, like, the people using it now are the people who are trying to jailbreak Sora, meaning they're the people who are trying to get around the, like, incredibly strict guidelines that OpenAI has put on it, which there are ways to bypass. But, basically, the way... what I'm doing is, I'm a member of many, many, many different subreddits, and I'm a member of a lot of ones about Sora. And what I have seen there is the vast majority of posts are about, I got Sora to make porn or a naked person.
Jason:Here's the prompt that I used. And that's, like, what I'm seeing. But then also we've done some articles about, like, slop manufacturers who are making slop on Sora, either removing the watermark or covering the watermark, or sometimes not even caring about the watermark, and then posting it on Facebook or Instagram. As far as, like, people using Sora because they're, like, interested in it and are using it to, like, make cool stuff, I don't think there's that many people who are doing that anymore, and it's falling a little bit in the App Store. So that's, like, my sense.
Jason:I could be wrong, and it's just like those people aren't posting on Reddit or whatever, but that's, like, kind of what I am seeing. And that's what I'm seeing when I go on the app as well, which I do every, like, week or so. A lot of the stuff being fed to me is, like, I got around the guidelines. Like, here's SpongeBob even though I'm not supposed to make SpongeBob. Here's a naked person even though I'm not supposed to make a naked person.
Jason:Whatever.
Joseph:Because that is user generated content, essentially. Although user generated content with probably a ton of copyright violations, although not now with Disney. Why do we think Disney did this? Just before this, I was listening to mine and Jason's favorite podcast, The Town. And, you know, they have a Bloomberg reporter on there who's very good, and they were just talking about, why would Disney even want to do this?
Joseph:And I think they had a much more, obviously, movie industry focused take, or entertainment industry focused take, which is like, well, Disney, you know, wants to capitalize on the idea that people will generate creations with, you know, Disney princesses or whatever, and then maybe the Disney plus app could actually become more of a user generated content space, that sort of thing. I don't know. Sure. Maybe. I don't know how many kids are gonna want to create their own stuff with Disney characters, whatever.
Joseph:But Jason and Matthew, what do you think? Why do you think Disney did this deal? Because I think your focus is more on, do they kinda have to, or, like, are they kinda screwed if they don't? What do you think?
Jason:I definitely don't think they had to do this. I think that, I mean, this is just my opinion, but I think that Hollywood in general, like, the executives there, are obsessed with AI. Like, many executives are obsessed with AI, and they want sort of, like, a piece of the pie if it becomes something that is regularly used in generating movies. I think that Disney probably saw... like, Disney is suing Midjourney, I believe. Right.
Jason:And I think that they saw their kind of options as, like, suing OpenAI or, like, getting money from them in some way. And so in this case, it's like, Disney is giving money to OpenAI, but they're also buying part of the company. And so this is a bet on the future of OpenAI as a company and the future of AI as, like, a Hollywood thing. And then also, I mean, I actually don't know if they're smart enough to think this, but perhaps Disney is seeing all this slop go viral on every platform. And they're thinking, well, this could be just like fan art for us.
Jason:Like, we can get our characters out there. Perhaps, like, the guardrails will be such that it won't be so bad in terms of the type of stuff being made. And then there is part of this deal where they are gonna take, like, the best fan made creations, meaning, like, the best shit that Sora shits out, and put it on Disney plus. And, like, I don't know, animation is very expensive. Making movies is very expensive. If they are able to get this free content, perhaps that is, like, good for them in some way.
Jason:Like, I think that might be the calculus that they are seeing here, but but I, you know, can't say that I agree with it.
Matthew:Imagine a world where, like, Darth Vader is the thing that you talk to when you open up Disney plus, and it helps you navigate, like, the system. I think that they think kids would love something like that. And I think some of that is borne out by the popularity of the AI Darth Vader they had in Fortnite earlier this year. You know, that was a pretty popular project.
Matthew:They had the James Earl Jones bot that they had trained, and people got to have these personal conversations with Darth Vader.
Joseph:That was good. Yeah.
Matthew:Well, yeah. I'm sure we'll see all sorts of horrible things in the future. But yeah. So I think that they saw that and think that they could do a little bit more with a closer working relationship with OpenAI.
Joseph:Yeah. Jason oh, sorry. I actually can't remember if it was Sam or Emmanuel or Jason. Forgive me. But in the lead up to writing this, I think one of us shared Disney's investment into Vice in our Slack.
Joseph:Like, what's the... obviously, whoever posted that was, I think, joking as well. But, like, is there a through line there with Disney, like, making outrageous bets? Like, what do you think?
Emanuel:I think, like, the thing I can connect to that is like, the reasonable take on what Disney is doing, like, the measured take is that, as Jason said, it's like a hedging of bets, right? And in a sense, that is what they did when they invested in Vice. Vice was very hot, people were like, this is the future. Disney said, okay, we'll put a $400,000,000 bet on this in case it is the future, we have a slice of it, and it turns out it's not the future. What is $400,000,000 to Disney?
Emanuel:No big deal. This is a billion dollars, so it's quite different, but at the same time, what is a billion dollars to Disney? They're just, like, keeping a foot in the door in case everything goes that way.
Matthew:Right.
Emanuel:I think, you know what I mean, to, like, experiment with, like, some thought leadership here. The apocalyptic take is that this is also happening at the same time that, right now, it's a bidding war, but essentially, Warner Brothers and Netflix agreed to let Netflix acquire WB, and that might not work out for regulatory reasons, or because Paramount, which is allied with the Trump administration, will buy it instead. But when I think about, like, WB and the IP it has, the fact that Warner Brothers for years now has not been able to leverage that into, like, a super profitable business is concerning. And I wonder if Disney, despite its massive, unmatched catalog of IP, is looking at, for example, how Marvel movies are doing after more than a decade of Marvel movies, not hitting in the same way, and not having the next big thing lined up, including after, like, buying Star Wars, right? It's like they have all of Marvel, they have all of Star Wars, and it's like nothing is hitting in the same way. And maybe they're coming to terms with the fact that the future of media is not people reading articles, or reading books, or reading comic books, or watching feature length films, or watching TV shows.
Emanuel:The future of media is an algorithmic feed of short form content that kind of, like, pounds you over the head over and over again until you, like, fall asleep in bed. And, like, the numbers on engagement, and, like, statements from, for example, executives in the video game industry... that's what they talk about when they talk about competition, right? It's like they're not competing with other games, they're competing with algorithmic feeds. And it's like, I think that is the worst case scenario, at least from the perspective of a person my age who likes movies, and likes comic books, and likes all these things. Like, this is where it's going. And I think it's completely valid to sort of wonder whether movies, traditional media, which is the bedrock of, like, the biggest media company in the world, Disney, whether that is even, like, a viable business long term, and that's why they're going this way.
Joseph:Yeah. That makes complete sense. I think just the last thing, Jason, maybe you go first, and I'd like to hear what Sam thinks as well. But there was some sort of article or take that you saw, I think, from Felix Salmon.
Jason:It was Slate Money. So I listen to a lot of the Slate podcasts, which I think are largely pretty good. And I was listening to Slate Money, which is Felix Salmon and a few other folks, and they were talking about this. And they were talking about how it was a little bit of a hedge, hoping that OpenAI would be able to put guardrails on this that would prevent people from making Darth Vader porn. And they sort of eventually got around to it, but the first several minutes of conversation were about how probably no one was making Darth Vader porn, and also how, like, OpenAI might be able to solve this problem, like, pretty quickly, that sort of thing.
Jason:And, you know, I don't need to, like, recount all of it, but I guess I would just say, I wanna be clear, and I want people to understand this. There is so much Disney porn on the Internet. Like, a huge amount of Disney porn. AI, we know, is largely used for porn. Our market research has shown people love to come.
Jason:That is, like, what Emmanuel has discovered. And I am now a member of three different Disney not safe for work subreddits for the reason of doing my job and being able to speak eloquently about such things. But it's like, every Disney character you can possibly imagine, there's porn of them. Like, that's rule 34 of the Internet, long established. In the past it was made by, you know, human artists.
Jason:We used to be a country. We used to be a society. Human beings used to make this stuff. But now it's largely, like, AI generated. And, no, it's probably not mostly being made with Sora.
Jason:It's being made with other, you know, AI tools. It's being made with offline AI tools that that are running locally. It's being made with DeepSeek. It's being made with, you know, different Chinese ones that don't have as as many guardrails for copyrighted material and things like that. But it's like, this is the primary use case of these things.
Jason:Right. It's like, this is why people wanna make AI of Disney. It's, like, to have Elsa and Rapunzel fucking. Like, this is, like, largely what is occurring. And I think that when we talk about, like, a billion dollar deal like this, we need journalists and people who are, like, cognizant of that fact and who are, like, aware that that is kind of, like, what is happening with this stuff.
Jason:And it's like, I don't know. I would assume that Disney knows this. I would hope that Disney knows that this is happening, but it's like, we've told them.
Joseph:Some, probably.
Matthew:It's, like, wild if they thought their billion dollars bought them, like, an illusion of control over this. I mean, that would be very silly.
Jason:You'd think. I mean, you'd think that they sort of know that this is something that they can't really stop. And I guess I would also say that even if Sora is not being used to generate this stuff, we have seen a lot of times that ChatGPT is used to, like, generate the prompts that are then used on other AI tools to generate this stuff. And so, I don't know. It's just like, in my mind, AI is a tool, or a technology, that is kind of antithetical to what Disney, like, theoretically wants to be.
Jason:I mean, I don't know. It's a gigantic company at this point, and also a billion dollars to Disney is probably not that much. It's like, no big deal. Let's just take a flyer on this, take a bet on this. Maybe that's what they're thinking.
Jason:But it's like, this technology is largely used to make porn of its beloved characters, full stop. Like, to the extent that anyone is using AI for anything Disney, that is what they're doing it for. And so, I don't know. I think we should talk about that.
Joseph:Yeah. Sam, maybe just to close this out, what do you think? I'm trying to not be mean here, and, like, please, I hope nobody takes it that way, but I do sometimes feel like there's a disconnect with, like, journalists or commentators on the AI industry, or whoever covers AI or whatever, where they just, for some reason, don't grapple with the sex part or the porn part, even though it is, like, absolutely vital to understand that and to acknowledge it and actually to report on it as well. Just, what do you think of that idea, Sam?
Sam:I mean, okay. First of all, I went into a really deep dive about Elsa and Anna from Frozen, because their images were everywhere in, like, Pornhub ads for a while, a couple years ago, like, before AI image generation was a thing. They are legal age, just for the record. I don't really have a huge problem with people making Disney porn. I think it's a real double standard from Disney if the thinking is like, yay, let's get our characters and our IP out there and more and more people can make this stuff.
Sam:Because when people make it themselves, it's a problem. It's like, if people are making Disney inspired stuff, or even copying Disney IP, with their hands, and putting it on the Internet, and especially if they're selling it, it becomes a huge problem, and they get cease and desists and things like that. And Disney does this. The Pokemon Company does this very aggressively.
Sam:Nintendo does this very aggressively. So, I mean, it's just like, it's okay when AI does it, and when I say AI does it, a person's still doing it, but not when, like, a human does it and makes, like, an Etsy shop of it. Like, pick your battle here. Pick your side, I guess. And, also, Disney's, like, in the middle of suing Google for copyright infringement because people are using Google and Gemini to, like, generate its IP.
Sam:So I don't know. It's like, either you're setting a standard or you're not. You're, like, setting a precedent or you're not. It's like you kinda can't have it all the ways that makes you money if you're Disney, but I guess you can, because you're Disney. Why not?
Sam:Yeah. I don't know. I mean, like, I wouldn't super expect Felix and the gang to know that there's a shitload of Disney porn out there unless they're reading four zero four Media, because we're the only ones writing about it. But I guess if they were on Pornhub anytime in the last five years, they would know, because they would have seen the Frozen try not to come ads that featured Anna and Elsa very heavily. But, yeah, I don't know.
Sam:It's like, I kinda feel the way about it that I do about erotic chatbots and AI erotica. I don't like it. It's not good, but there are, like, other problems with it that aren't, like, related to specifically what they're, like, literally doing with the chatbot. It's more about, like, the intention and, like, the habit and the behavior behind making these things. So, anyway, that's kind of my... I mean, that's not a real thought.
Sam:That was kind of a mess of a bunch of different thoughts, but
Joseph:Yeah. That makes sense.
Sam:I think if you wanna make Frozen porn, you should introspect.
Joseph:Let's leave it there. Okay. And with that, I'll play us out. As a reminder, four zero four Media is journalist founded and supported by subscribers. If you do wish to subscribe to four zero four Media and directly support our work, please go to 404media.co.
Joseph:You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week. This podcast is made in partnership with Kaleidoscope. Another way to support us is by leaving a five star rating and review. That stuff really does help us out.
Joseph:This has been four zero four Media. We'll see you again next week.