The 404 Media Podcast (Premium Feed)

from 404 Media

The AI Exodus Begins

Episode Notes

/

Transcript

We start this week with a series of articles from Emanuel about a crackdown in the AI industry. After the break, Sam tells us about the ‘Save Our Signs’ campaign which hopes to preserve the history of national parks. In the subscribers-only section, Jason rants about how AI will not save the media industry. 

YouTube version: https://www.youtube.com/watch?v=cP2ur8qVmMU

Joseph:

Hello. Welcome to the four zero four media podcast where we bring you unparalleled access to hidden worlds both online and IRL. Four zero four media is a journalist founded company and needs your support. To subscribe, go to 404media.co, where you'll also get bonus content every single week. Subscribers also get access to additional episodes where we respond to their best comments.

Joseph:

Gain access to that content at 404media.co. I'm your host Joseph and with me are the four zero four media co-founders. Sam Cole. Hey. Emanuel Maiberg.

Emanuel:

Hello.

Joseph:

And Jason Koebler. Hey. So one small bit of housekeeping. We are having a party/gathering/event in Los Angeles in July at 6PM. I'll put a link in the show notes.

Joseph:

But Sam, anything else you wanna shout out now while I'm just doing it? Or should people go to the event page if they wanna see us, hang out, and talk about four

Sam:

zero four? Go to the event page and buy a ticket or email one of us for the code if you're a supporter. If you're a subscriber in the supporter tier, which means you're a paying supporter of four zero four media, you can get in free. And we have the password. So we'll have a post up on the site soon with the actual password for subscribers to get in.

Sam:

But, until then, you can just hit us up. But, yeah, we're excited.

Joseph:

Yeah. Sounds good. Again, that's July at 6PM. Check the event page for all the latest information, but it would be great to see some of our subscribers. Jason, were you gonna add

Jason:

anything to that? It's gonna be at a place called RipSpace, which is like a hacker DIY space. We'll probably do some sort of, like, live podcast, live Q&A, something like that. We're sort of nailing down what it is we're going to do. Probably by the time you hear this, there'll be more information on both the event page and our website.

Jason:

But yeah, we'd love to see you and there'll be a party after.

Joseph:

Yeah, sounds good. All right. We have, as ever, a bunch of stories to go through here. The first section is going to be a series of pieces written by Emmanuel, all about AI, porn, Civitai, Hugging Face, all of these different platforms are hosting it and there's been a ton of developments over the past few weeks and months in this world. So the first piece is the headline of the first piece is a 16 z backed AI site Civitai is mostly porn despite claiming otherwise.

Joseph:

So Emanuel, Civitai is this popular AI platform, we've spoken about it a bunch, you've done a bunch of investigations into it. We've especially covered how it's used to, you know, host models and generate pornography. What is Civitai's stance on that? Do they see themselves as this big provider of pornography or not?

Emanuel:

Well, they definitely didn't used to and we can kind of talk through where they landed these days. But historically, Civitai, which is a platform where people share AI image generation models, they've always allowed adult content, they've always allowed AI models that generate the likeness of real people, they just didn't allow people to post models or images that combine both of these things. Right? So no non consensual content, that policy was always there. A lot of my reporting has been about how they failed to properly enforce that initially on their own platform, but then especially off platform.

Emanuel:

Also, while I was talking to them for all of this reporting, whenever I managed to get their PR person on the phone or talk to anyone there, they said that my reporting is unfair because, yes, there is adult content on the website and they support free speech and they think people should be allowed to have porn models and have models of real people. But that's really a minority of the content that is on the site. And I have said in my stories and I have told them directly that, just as a user experience, as someone who goes to their main page, if you look at the top models, the top images, the latest models, the latest images, it's just a never ending stream of adult content, but I didn't have the data to back that up. So I've always kind of presented what a user sees versus what they claimed. They made this case especially strongly after I published this piece that showed that Civitai's cloud computing provider, called OctoML, had a lot of internal discussions about the kind of content they were generating for Civitai, and they were worried that they were generating child pornography

Joseph:

Right.

Emanuel:

For Civitai. And that really freaked out the company and they then went on a kind of, like, mini PR tour with their CEO, Justin Maier. And one of the things he did, I think, the day of me publishing the story, right, so like, in order to coordinate with the deadline that I gave them, he gave this exclusive interview to VentureBeat and he said, let me pull up the

Joseph:

While while you're pulling that up, this is back in 2023, right? This is an investigation you wrote pretty early on or 2024 maybe, but quite early on in our history. Right.

Emanuel:

Yeah. So this is the end of December 2023. Mhmm. And VentureBeat writes that contrary to those figures showing 60% of content on Civitai as not safe for work, a figure derived from 50,000 images, today users on Civitai generate 3,000,000 images daily, and the company says, quote, less than 20% of the posted content is what we would consider PG-13 and above. So this is Justin Maier, the CEO of Civitai, explaining to VentureBeat that in the kind of audit that OctoML, this cloud computing provider, did on the content they were generating, they thought 60% of the content was explicit.

Emanuel:

They say that it's less than 20%. Cut to recently, I wrote a story about Civitai changing a lot of its policies because of pressure from payment processors. So in May, they stopped or they banned, rather, all models that generate the likeness of real people and also specific types of porn that the payment processors thought was too extreme. I wrote about this. I published a story.

Emanuel:

A researcher from the University of Zurich reached out and said that she has been scraping Civitai and she has, like, really comprehensive data on what exists on the platform. And we can dig into some of the numbers if you want, but basically it shows that their claims that it was only less than 20% of the content are absurd and the platform is primarily used for adult content.

Joseph:

Yeah. This is really, really interesting, because you have your reporting over years at this point where it's just like, there's a lot of porn on here, you know. And I think you did a story a long time ago as well which basically showed that one of the driving forces of capital and innovation in the AI industry is to develop porn, and of course, we all know that from Sam being one of the original reporters who covered deepfakes as well. Like, it's always here all the time. So you have the reporting, and you have, not to undermine it or anything, your experience of the website, which is more anecdotal, right?

Joseph:

As you say, you log onto the website.

Emanuel:

I don't have the data.

Joseph:

Yeah, yeah, yeah, exactly. So I see a lot of, I know there's a lot of porn models on here. You finally have this data and we'll talk about what those numbers are in a second, but you said this researcher was scraping Civitai. I don't know whether they got this granular with you, but sort of how did they do that? Did they look for all images that were marked as not safe for work or did they grab videos and analyze them?

Joseph:

Did they break down how they did the scraping exactly or did they not get into that detail?

Emanuel:

They didn't technically explain how they, like, gathered all the data. I'm not sure what the specifics there are. But the data is composed of more than 40,000,000 images that they took from the site. Each of those images has a bunch of metadata that Civitai itself generates. This isn't the researchers annotating the data or analyzing it or anything.

Emanuel:

It's data that Civitai itself tags using its own rating system of, like, PG, PG-13, R, X, XXX, right? That's kind of its own method for tagging images, and then there's other tags that could include, like, names of celebrities or the type of sexual act that's in an image and things like this. They also did something similar with the models themselves, and they did that for 230,000 models, I think, in the paper. I've been in touch with the researchers and I've looked at some data that is more comprehensive than that. I think they're up to like 400,000 now, but it's not published yet in, like, a journal or in a draft.

Joseph:

Yeah. So it's interesting that it is based on the classifiers where it's Civitai itself saying, this is not safe for work, which, well, it's almost like that meme, that GIF, if they just admit it. It's tasteful. So the researcher scrapes that, has all these images, has all these classifiers very handily attached to them. Well, what's the research showing?

Joseph:

What numbers are we talking about here when it comes to the prevalence of pornographic content on Civitai?

Emanuel:

So December of twenty twenty three is when I published my story about OctoML, and that's when Justin Maier gave this quote to VentureBeat. He says, less than 20%. The researchers say that by October of twenty twenty three, 56% of all the images were tagged not safe for work or higher. Right? So that is already, like, more than double what Maier claimed.

Emanuel:

The actual number is likely higher because as both the researchers noticed and as I have noticed and reported, not everything is properly tagged. Some people self identify their content as not safe for work, others do not and maybe try to get it under the radar, right? If you have like some non consensual content or a non consensual model up on the site, maybe the people who are doing that are not tagging it properly because they're trying to evade moderation. So the actual numbers are higher. Now, I'm looking at a graph of kind of the distribution of adult content and safe for work content from the first quarter of twenty three to the fourth quarter of twenty four.

Emanuel:

And Maier was incorrect or misleading in his statement back when he made it and back when I was talking to him for my reporting. But from that time until now, the number of overall images and models on the site has absolutely exploded, as has the proportion of adult content. It's like, at this point, it looks like 90% of everything that is on the site is adult content.

Joseph:

I mean, that's staggering. Right?

Emanuel:

Yeah. I mean, the reason hearing from this person and reading the paper was very validating is because it shows that, like, my user experience of Civitai and what I thought the site is, despite what it was presenting itself as to the public and investors, it's just porn. It is just a porn site. Like, I feel very comfortable saying that it is a porn platform. And actually, when I reached out to Justin Maier for this story, he said he's standing by his, like, VentureBeat comments, which I think are completely wrong.

Emanuel:

But he does now admit that somewhere around the beginning of 2024, the reason people use the site changed. And I think that is probably because, it's like, you can't deny it anymore. Just by looking at the numbers, by looking at the site, it's just undeniable that it is primarily used for adult content, or at least was until these policy changes.

Joseph:

Yeah. I mean, that's really, really interesting. And then just the last question on this article, then we have a couple more to touch on related to this. But you brought up these policy changes and we'll go into more detail about the exact why of this with the payment processors in a bit. But basically, Civitai banned models related to real people in May, which is obviously a huge seismic change for any sort of AI platform.

Joseph:

What happened when Civitai banned AI models based on real people according to this researcher who's constantly scraping Civitai, like, did it fall off a cliff? Like, what happened?

Emanuel:

First of all, I should say this is another very important number and another very interesting finding of the paper, because they just recently changed the policy. So the researchers have all these models in a spreadsheet with the metadata and also the links to where the models live. So once Civitai announced that it was removing real people content from the site, it was very easy for them to check how many models were removed overnight. And the actual number is more than 50,000, which is another kind of interesting thing. When I report about Civitai, it's like, you know, there's Taylor Swift and there's Natalie Portman and, like, you know, all these big celebrities that have, like, multiple models. And then you see there's a bunch of YouTubers and lesser known Twitch streamers, but, like, you're like, what is the actual number?

Emanuel:

And it's much higher than I thought. It's like, I can't even think, like, obviously, there are multiples but

Joseph:

But like tens of... who knows?

Emanuel:

Tens of thousands of people Yeah. Have bespoke AI models to create their likeness. It's just like really kind of a shocking image Yeah. Shocking figure. But to your question, I don't think they have, like, post-policy-change usage numbers and post-policy-change content distribution yet.

Emanuel:

They can just see what was removed that was posted before the end of twenty twenty four, I suppose.

Joseph:

Yeah. So then, this just leads on to two other stories and the title of this podcast is something like the AI Exodus Starts and that's where this comes in, in that Hugging Face, another AI model hosting platform. You have a headline here. Hugging Face is hosting 5,000 non consensual AI models of real people. So are these some of the models that were on Civitai and then they got removed and now they've moved over here?

Joseph:

Like, what's the deal here?

Emanuel:

Yeah. So basically, credit card companies come to Civitai, they say, hey, you have to remove all this content, all these real people models or we're not gonna work with you. They are not working with them anyway, still, even after they remove that content. Maybe Sam wants to talk about this, but we saw something similar with Pornhub. It's kind of really hard to undo a payment processor cutting you off after they make that decision and that's where Civitai is at the moment.

Emanuel:

But when they initially announced this, they were like, hey, we're making these models invisible. The people who made them can have access to them for an unspecified short time, and then we're just nuking them off the site entirely. And the moment that they announced this, the Civitai community immediately mobilized and started this Civitai... or actually, the Civitai archiving project already existed in the form of a Discord channel, because there were other policy changes that kind of made people feel like something was coming, and they were starting an effort to, like, archive anything because they were afraid that Civitai would remove it. Once this news came out, it really kicked into high gear and they started grabbing everything that they could in order to make it available elsewhere on the Internet. And because of this research that I talked about in the previous story, I had a spreadsheet with all the links of all these real people models where they used to live on Civitai.

Emanuel:

And the archiving project created a website where you take that link, you enter it into the website, and it directs you to a mirror where that model was re uploaded to Hugging Face. So Hugging Face is like an AI tool and resource sharing platform. It's very popular. It's got like a multi billion dollar valuation, a partnership with Amazon. They've been very outspoken about wanting to be like an ethical AI company and what that means to them.

Emanuel:

I talk about that a little bit in the story. They don't have any specific policy against models that generate the likeness of real people, but they do have a lot of kind of vague language about AI needing consent in order to be ethical. And I told Hugging Face that I have this data, that I know of at least 5,000 models that are hosted on their site, and not just random models, models that I know were used to create non consensual porn, models that were removed from a different, as we just discussed, porn site because they were used for this purpose, and I just haven't heard back from them after multiple attempts.

Joseph:

You haven't heard back from the ethical AI platform company about the unethical stuff happening on their platform?

Emanuel:

That's right. Yeah. And I've talked to them before and they've replied to my emails, and I don't know why they're stonewalling me about this one, but they are. I think I can say, unusually, we're recording this podcast on a Monday. By the time this podcast is out, my story will be out and we'll see if they have anything to say then.

Emanuel:

But I tried really hard to get any comment from them, and I think if they asked, then I would give them all the links and then they could decide what to do about them, but they were just not interested in doing that.

Joseph:

And then we'll just touch on your last story very briefly. But yes, I just wanted to ask Sam for a moment. This last one is about payment processors and sort of the power that they have in the porn industry, right? And for those who don't know, well, sorry. I'll ask this to you, Sam.

Joseph:

For those who don't know, what do we mean by a payment processor exactly, and why are they so important to the adult industry?

Sam:

So as Emanuel just alluded to, Pornhub has had this problem, and a lot of porn sites have to abide by the rules of payment processors. So, I mean, when you're talking about, like, the different payment processors and gateways and banks, it, like, kind of ladders up through the institutions that we're talking about. But, so, like, for example, Stripe has really strict rules against porn and sexual content. Visa and Mastercard stopped processing payments for Pornhub because there were allegations of abuse material on Pornhub. You know, it's like Chase has a lot of rules about not safe for work content and adult content.

Sam:

It's because they're considered, quote, unquote, high risk material. So in a similar vein as gambling or guns, they place sexual content in a similar kind of risk category as a bank, and then that trickles down to payment gateways and the processors and things like that. So, you know, if it's working ideally, what you get is, like, these banks not having to be accountable for abusive content like what Emanuel's talking about. What happens a lot of the time is the banks end up, like, acting as, like, moral arbiters for porn. So you have a lot of,

Jason:

like,

Sam:

consensual adult content creators caught up in the rules against abusive content. These payment processors don't wanna take the time to sort out the difference in a lot of cases and just ban it all. And we've been talking about this for years happening to, like, consensual and safe and legal adult content. And it's like these other platforms, like, some of the other platforms that are now having trouble with payment processors, they're fucking around and finding out at this point. It's like, you thought you could kinda just, like, get away with doing whatever, and you were above it because you're cool bros or whatever the logic is.

Sam:

I don't even know. It was making a lot of money, so surely it was too big for them to give a shit. But I think now they're finding out that the same rules apply to them. And I don't know. It's like we can't really put these things in the same bucket as legal, safe adult content, but they do a lot of the time have to abide by similar rules.

Joseph:

Yeah. And it's coming for them now. And I guess just to wrap it up, Emanuel, what impact comes from these payment processors deciding they don't want to work with Civitai or another AI platform? What happens after that sort of action has taken place and what's gonna happen now?

Emanuel:

Very quickly, I would just say, I think last week I talked a lot about how the reason I was covering Civitai is that, to me, it seemed like a critical piece of Internet infrastructure in the entire practice of producing non consensual porn. And, like, what happens if you remove that? Where do people go? And something I saw people say after the Civitai announcement around real people is that they're just gonna go to tensor.art, which is a site I covered, like, I think the first week we launched, but it's a site that's basically identical to Civitai. It's the same services, same UI, you go there, you download models, you can generate images, yada yada.

Emanuel:

A lot of people moved their models there. For a while, Tensor Art had, like, a tool that even helped people import their Civitai models automatically. And on Friday, Tensor Art announced that they are getting the same pressure from payment processors and, they're saying temporarily, I don't know how they resolve this, but they're saying temporarily, no more real people content. They've disabled this Civitai importing tool and they're trying to come up with some sort of solution.

Emanuel:

I'm very skeptical that they can find a solution and still have the payment processors work with them because, as I've said, sometimes even if you fix every problem, they don't want to come back. So I'm just gonna continue to track it. Like, there's a huge vacuum now in how non consensual content is made online because of Civitai's policy changes, and I'm just tracking where all this energy and all these people are going, and that's probably gonna be some more reporting in the future.

Joseph:

Yeah, totally makes sense. Alright, we will leave that there. When we come back after the break, we're going to talk about one of Sam's stories about preserving history, very much in the vein of the Trump administration continuing to wipe things off the internet, but this is now sort of IRL archiving. We'll talk about it in a minute. We'll be right back after this.

Joseph:

Alright. And we are back. Sam, you wrote this one. The headline is 'Save Our Signs' wants to save the real history of national parks before Trump erases it. First off, can you just give us a little bit of context on what the Trump administration is doing to national parks and the history around them?

Joseph:

Like, what's the issue exactly?

Sam:

Yeah. So like many things these days that are just, like, bad news, dark news, it started with an executive order in March. Trump issued this order called Restoring Truth and Sanity to American History. Already off to a really promising start with that title. The order mostly, like, targets the Smithsonian Institution and that network of museums.

Sam:

It makes some really bizarre claims about the museums being part of this, like, revisionist movement, that they're trying to paint historical milestones in a negative light or acknowledging racism in American history, things like that. So the order is mostly about that. But because it also mentions the Department of the Interior, that catches national parks and monuments in that net also. So I can just read part of what the order says, because it's written, like so many of these things, in a very kind of slippery, vague way. Yeah.

Sam:

I'm sure it was lawyered to hell and back, but also very wordy. Well, I don't know. I mean, who knows? You know, nothing was in all caps, so I'm sure it got at least one proofread. But, yeah, it says that the Department of the Interior must take action, whatever that means, as appropriate and consistent with law to ensure public monuments, memorials, statues, markers, or similar properties within the Department of the Interior's jurisdiction do not contain descriptions, depictions, or other content that inappropriately disparage Americans past or living, including persons living in colonial times.

Sam:

Interesting carve out. And instead focus on the greatness of the achievements and progress of the American people or, with respect to natural features, the beauty, abundance, and grandeur of the American landscape. So, I mean, if we can use our contextual thinking skills, take action, I think we can assume, means remove or edit monuments, memorials, and plaques, basically, at national parks or anywhere that's under the Department of the Interior's jurisdiction, that say anything about anything other than how pretty the landscape and the monument is. So... and not disparage people, which would, you know, rule out quite a bit of American history if you can't really tell the truth about what was going on.

Joseph:

Yeah. So purely hypothetical situation. You can have a sign that says, wow. Trees look nice. Mountains look good.

Joseph:

But you can't say anything like, well, the indigenous people of this land are x y z or something like that. Is that basically the thrust of it? Okay.

Sam:

Yeah. I mean, again, the order doesn't really use any very specific examples for the national parks. But given the attack on, like, so-called DEI, the way this administration has been moving up until this point, we can assume that it means, you know, things like talking about indigenous people being slaughtered in this country, things like racism and civil rights, and all of that that actually helped build a lot of the park system. Black and indigenous people played a huge part in building the parks, and, you know, many of them were there first, things like that. It's just like there's a lot of history going on in the parks that we are apparently no longer allowed to acknowledge. Environmental justice, women in leadership, things like that, that the administration has targeted in the past already online.

Sam:

We can assume probably counts in this case too. So

Joseph:

Yeah. I mean, I haven't really been covering any of these sorts of removals of history by the administration, but I don't know. That just drives me crazy, the idea that you're gonna delete history. Jason?

Jason:

I'm curious if this stuff would be FOIA-able. Like, theoretically, it would be. I think the crowdsourced aspect of it is very cool and, like, makes it a group project. But theoretically, like, the national parks offices should have copies of this somehow. Like, they had to go get them printed, etcetera.

Jason:

And theoretically, all of this could be obtained by FOIA, which is Yeah. Yeah. I don't know if they brought that up at all. I think that this is maybe easier than FOIA. Like, FOIA is a roll of the dice at this point.

Jason:

But, you know, a FOIA about this, and maybe we'll file some, would theoretically capture, like, attempts to take it down or, like, alternate versions of, like, what's being changed, etcetera. Like, if it were responded to appropriately.

Joseph:

Yeah. Totally. And I just looked it up and got the email address or the portal for the National Park Service. So we can do that or we'll do it through MuckRock or whatever. But what these people are doing is sort of this other archiving project.

Joseph:

So with all of those problems and that context, enter Save Our Signs, this campaign that's launched. But what's the goal of this campaign, Sam, and who is involved exactly, and how are they going about it? What are they doing?

Sam:

Yeah. So, like, seeing, like, the writing on the wall, so to speak, not to use a corny pun for this particular story.

Joseph:

That was pretty good.

Sam:

The writing on the placards, data preservationists from the Safeguarding Research and Culture project and the Data Rescue Project launched this web portal called Save Our Signs. They're asking people, if you're at a national park, to just take pictures of placards and signs and monuments that they see as they come across them, to preserve them, to archive them, essentially, because they could be under threat. Again, we don't really know what's gonna go on. Like, the take action line is very vague. But, you know, it's something where the government is asking for visitors to fill out this QR code survey to report signs that they think are negative.

Sam:

So it's kinda like a counter protest to that, saying, you know, if the Trump administration is gonna ask people to essentially snitch on signs that address, you know, anything other than the beauty and grandeur of the American landscape and, you know, flag these placards that could fall under whatever the administration's definition of negative is, then they're gonna turn around and say, you know what? We're gonna archive this and make sure it's preserved in case it goes anywhere, in case it is removed. And the, quote, unquote, negative content must be, again, using the words of the order, taken action against by September 17 is what the order says. So they're launching this to basically say, we have to hurry up and preserve this in case something happens to it in the future. But, I mean, the Trump administration has totally gutted the park service at this point.

Sam:

They barely have enough rangers to do basic administrative tasks in a lot of the parks. So this is just another task piled onto rangers and park workers that is silly and frivolous and totally wasteful. So we'll see what happens actually in September, if anyone is around to take action against this negative content. But that's their goal, so it's worth taking seriously.

Joseph:

Yeah. So I totally understand why the response to the executive order is crowdsourced. You need ordinary members of the public to go out and, hey. Take a photo of this placard or signpost or whatever. Take a photo of that so we can archive it.

Joseph:

It's, I don't know, unusual, weird, funny that the government has to crowdsource that bit as well, the destroying bit, because they can't even get the people to go destroy it in the first place either. That's ridiculous.

Sam:

And so far, the responses to the QR code survey that the administration has put up on signs to ask people to snitch on negative content have been leaked to different outlets already. And the responses that people are putting into this survey are really funny. It's like, there was one that was, respectfully, go fuck yourselves. That's what someone put in there. The parks belong to the people, things like that.

Sam:

People are pissed. People do recognize a totally frivolous waste of time when they see it, I think. Rarely do people show up to the parks and say, oh, this isn't, you know, perfectly optimistic about the beauty of nature, so I want my government to drop everything and do something about this. You know, you're there for a learning experience and an educational experience as well as the beauty. So, yeah, people are already, and that would be something funny to FOIA once it's had some more time to marinate, I guess, the QR codes and the responses to them, because I think people are trolling it at this point quite a bit.

Sam:

That's not what the Save Our Signs people are trying to do. They're trying to actually preserve what's on the signs.

Joseph:

Yeah. I mean, we might be able to still get some early ones, FOIA them, if the National Park Service still has a FOIA officer or public information officer at the end of the day. So the deadline to take action to remove all of these signs or whatever is potentially September. And then it looks like the Save Our Signs program is gonna release the photos that they've crowdsourced in October, if I'm reading that correctly. Is there any indication of how many photos of signs they've got or how many people are participating yet, or do we kinda just have to wait and see whether people are trying to help out here when October comes?

Sam:

I think we just have to wait and see. I mean, they launched on July 4. We were famously out, which is why we covered it a little late. But they were hoping to get people on holiday to take pictures of signs while they were out with their families and stuff, but I think it's a little early. It's only been about a week since they launched the project.

Sam:

But it would be cool to see what they get back, to see what they do with the data. They're gonna make it public, like you said. They just are not totally sure how yet, or they're still working on how that would work. But I mean, I think it's cool to get public participation in archiving like this. I think archiving feels kind of difficult or complex or unattainable for a lot of people.

Sam:

It's like, oh, these sites are going down, so there's this big archivist effort to preserve them, or we have to kinda pull down the data and fill up a hard drive with it. And this, you know, big effort that you have to be part of a community already to participate in a lot of the time is how it looks from the outside. But this is a nice, like, entry point into data preservation and archiving. You know, it's like you're outside already. You can just take a picture and put it on this website, and you're helping preserve a piece of history when it's under threat, which I think is cool.

Sam:

I think it's a cool kind of muscle building exercise for people to participate in something like this.

Joseph:

Yeah. As you suggest, absolutely nontechnical as well. Usually, you have to be pretty technical to participate in some sort of archiving solution. You have to download a browser plug-in or scrape something, and this is just something you can do to archive IRL material. Well, I'm sure you'll revisit it in September and October, we'll file those for you.

Joseph:

But we'll leave that there for the moment. If you are listening to the free version of the podcast, I'll now play us out. But if you are a paying four zero four media subscriber, I said we're gonna talk about something. Jason's gonna rant for about fifteen to twenty minutes. So you can subscribe and gain access to that content.

Jason:

Hey, what? Not true. It's not true. We're gonna have a discussion.

Joseph:

Yeah. Of course, we are. Okay. You can subscribe and gain access to that content at 404media.co. Alright.

Joseph:

We are back in the subscribers only section. Jason, this story is yours. It's been in the works for a while. I think I did the main editing on it. I loved it.

Joseph:

The headline is: the media's pivot to AI is not real and not going to work. There's a lot of points in here, and it would be great if you make them. But I think just to start, can you just tell us about this conference you went to and what people were presenting there? So something concrete for people, you know?

Jason:

Yeah. Yeah. So I went to a conference in New Orleans called the MediaPost Publishing Insider Conference, which was a conference that we were invited to participate in. It's very fun, actually. It was basically like an annual conference of media business executives.

Jason:

So a lot of, like, chief marketing officers, a lot of chief revenue officers, a lot of people who had jobs that Vice used to have when we worked at Vice. Like, someone at Vice who I worked with went a couple years ago, and we were invited to go. So the types of publications that were there were like the Daily Mail, the New York Post, National Geographic, like what's left of it, The Boston Globe, Hearst was there, The Daily Beast was there, a few others, like, from publications that either you have heard of but don't know the companies behind them or publications you haven't heard of at all, like this one company called... actually, I don't even know the name of it, but they run 5-Minute Crafts on Instagram, which is like an insane Instagram account where they have, like, weird life hacks, as well as this company called Brighter Side Media, which is just, like, inspirational clickbait, frankly. Also, the company that now owns eBaum's World was there, which was really wild. I found that to be very weird.

Jason:

They also own, like, Know Your Meme and I Can Has Cheezburger and other sites of this type. So anyways, they bring all these people together and then they also have a bunch of, like, journalism software-as-a-service companies. So there are companies that sell, like, paywalls. There's companies that sell programmatic advertising. There's companies that sell autoplaying videos, which is pretty wild.

Jason:

I, like, talked to a few of those. Don't worry, we're not putting autoplaying videos on our website. But basically, it pairs journalism business people with, like, companies that offer them services. And we were asked to go there. We and Aftermath were both asked to go there to talk to them about the rise of independent media companies.

Jason:

And I don't think there were any other journalists there. Like, every single person who was there worked in a business capacity at one of these companies. And it was a really small conference. It was, like, maybe 50 people. And so there was a lot of, like, talking to people about what they're doing, and I found it to be, like, genuinely interesting because I got to see sort of, like, how these much larger media companies are thinking about the rise of AI and how they're thinking about, like, the collapse of traffic and the collapse of the ad market and all this sort of thing.

Jason:

And it was funny because they've been doing this for a really long time. And there were people talking there like, oh, remember Facebook Live? Remember, like, being able to get to the top of Google rankings with SEO hacks and stuff like that? And so a lot of them were now talking about AI in the same breath. It's like the kind of like rise and fall of all these things that have really destroyed the media industry over the last fifteen years.

Joseph:

Like Facebook's pivot to video, that sort of thing.

Jason:

Yeah. That sort of thing. Yes. Exactly. They're like, oh, remember like seven years ago, we were talking about pivot to video.

Jason:

Now we're talking about pivot to AI because a lot of the same people go every year. Yeah.

Joseph:

So you go there, you, I think you did a panel about what it's like running a subscriber-driven outlet. Obviously, that's where we're coming from. We're not doing, like, AI ads or anything like that. So you go there, you listen to all these people talk, and I feel like you were writing this piece maybe already and then you going there informed it, or did you write this piece because you went to this conference?

Jason:

I didn't write it because I went to this conference. I wrote it because of the Business Insider layoffs, to be honest with you, which I'll talk about in a minute. But basically, there was discussion about AI at this conference, and there was one presentation by someone who worked for the Daily Mail who was talking about how they were packaging up the Daily Mail's, like, content so that it could be scraped by AI companies, like how they were gonna sell access to that. And there's been a lot of publishers who have sold access to OpenAI, to Google, to, you know, different AI companies in exchange for a fee. And so she was talking about how they did that, what the business strategy is there, sort of what they're expecting.

Jason:

And then there were some other talks about AI, about how, like, different media outlets are using AI to, like, optimize ads and do things like this, like change how they code, stuff like that. Again, there wasn't really anyone there who was on the editorial side of things for any of these companies. It was all, like, the advertising side. So it was very ad heavy, very, like, IT heavy. But there was a lot of talk about AI, and so that did help inform this piece.

Jason:

Again, the piece, like, I decided I definitely wanted to write this article after Business Insider laid off 21% of its workforce in May. And in the layoff letter to their reporters, every time there's, like, mass layoffs, there's always a CEO letter to the people who are left. That's like, here's what happened and here's what we're gonna do moving forward. And their CEO, who is this woman named Barbara Peng, wrote, quote, there's a huge opportunity for companies who harness AI first. And then she said that the company is going to be, quote, fully embracing AI.

Jason:

We are going all in on AI. Over 70% of Business Insider employees are currently using Enterprise ChatGPT regularly. Our goal is 100%. And then there was, like, just a lot more about AI in that. And then I also got some leaked audio from a Hearst Newspapers all hands.

Jason:

Hearst Magazines, like, owns a bunch of magazines, but then they also have a newspaper division that has 78 newspapers around the country. And in that all hands, they were talking about leaning into AI, using AI for quiz generation tools, like games that subscribers could play, using AI, like, to find stories, to do research for stories, so on and so forth. So I just wanted to write about kind of how we approach this and then also the fact that using AI for journalism is not a business model. Like, it's just not a way of making money. It's not a strategy.

Jason:

It's not a way that is, like, going to sustain any sort of operation.

Joseph:

Well, yeah. And I guess that is the main point of the piece. I mean, everyone should go read the whole thing. It reads very, very easily, but there's, like, a lot of context and material in there. So when you say that using AI is not a business decision, it's not a strategy, it's not going to save all these outlets like Business Insider, where it's like, hey, we're gonna lay off all these people, now we're gonna get a 100% of our employees to use ChatGPT.

Joseph:

Why is that not a business decision or what? Sorry. Why is it not a good business decision? It is a business decision, but it's, you know, it might not be the best one.

Jason:

Well, I don't even know if it's a business decision at all. Like, laying people off is a business decision. Cutting costs is a business decision. Selling your content to OpenAI is a business decision. But I don't think that they're business models.

Jason:

And I think that that is a Right. There's just, like, a slight difference there. And the reason I say that is because, honestly, I kept thinking, like, it's like going into your newsroom and telling a bunch of journalists, like, you need to use computers now, and we're gonna use computers, and that's how we're gonna make money. Or you need to use the Internet, like, you need to use a cell phone now. It's just like AI is a technology, and it's a technology that, I mean, I get into this in the article, but it's like we don't really use generative AI.

Jason:

I mean, we're very, very clear about how we do our work. It's like, when we use generative AI, it's because we're reporting on shortcomings of generative AI or we're reporting on a specific tool. And so we're like, hey, we're using this tool to write about that tool. But we're not, like, using ChatGPT to, like, write entire articles for us. We're not using it to do, like, tons of research for us, etcetera.

Jason:

And I think that when these executives think about, like, when they say use AI, they mean, like, use it to make yourself more efficient, and therefore there will be more output, and therefore, like, we'll make more money or it will cost less or we can have fewer employees or whatever. And I think that with any tool, and it's like we've now written about AI enough that, like, I don't think that AI is fake. I think that it's, like, impacting the Internet in a huge way, in a verifiable way. And then it's also changing society in a big way because a lot of companies are just, like, laying people off and saying, you know, these AI versions are good enough for us or we're at least gonna try it. And so, I guess, one of my many main points is that if this technology is going to be used, an executive who has no idea how to do journalism and doesn't care about journalism is not going to be able to say, hey, journalists, now you have to use AI.

Jason:

What's going to happen and what has already happened is that journalists who are very good at their jobs are going to find ways of using different AI tools, not just generative AI, but different AI tools to do their job slightly better than they already were. And this is not going to, like, show up in a company's bottom line. And I don't think it's gonna show up in a company's bottom line because it's been happening for a really long time. And that's sort of where I'll I'll get into specifics. It's like, we don't use ChatGPT to write our articles.

Jason:

I think that we're, like, extremely clear about what tools we use and how we use them. But we're recording this right now on Riverside. And I've mentioned this before, but Riverside is a podcast recording software, and it has a bunch of AI built into it. It's like, we don't use it because we want to use AI, but, like, you can edit podcasts using text editing. Like, it will transcribe, and then you can cut the video, you can cut the audio by editing off a text transcript.

Jason:

And that's like something that has just started becoming a real thing in the last, like, I don't know, five or six years. There's a tool called Descript as well. And not every podcast uses this, but many, many, many of them do. And many of them use these tools and it saves time. Like, undeniably, it saves time to do it that way.

Joseph:

Well, they can. But we use a human audio producer, but it can do that. Yeah. Yeah.

Jason:

Well, yeah. But it especially saves time if you are doing a narrative podcast. It's like, a lot of podcasts are narrative, which, you know, we haven't really done much of. Joseph, you did a little bit of it. But, like, for big narrative podcasts, you might dump, like, 15 different segments of tape into something like Riverside.

Jason:

It transcribes them all, and then you can, like, basically do what's called a paper edit, where you take the transcript and you cut the clips that you want, and you do, like, a rough cut where you get, like, a sort of rough idea of what you want it to be. And then a human goes in, like an audio engineer, a professional, goes in and makes sure the levels are good. They clean up the audio, like they remove background noise, they do all that. And like, again, there's a difference between machine learning and generative AI and, like, I don't know, regular AI and algorithms, whatever. But, like, things like removing background noise is an algorithm.

Jason:

It's based on machine learning and things like this. Like, these are all technologies that are somewhat related, and these are technologies that didn't exist not that long ago. And they've made journalists so much more efficient, but it hasn't, like, fundamentally changed the underlying business models of media companies. It's like, Vice, when we were there, had a zillion people who were writing captions for video. Like, when they were editing documentaries, they were transcribing the video and putting captions on, and then they were translating them into different languages and so on and so forth.

Jason:

And that took, like, zillions of hours. It's, like, really, really time consuming work. And now there's many, many tools that do this automatically. You know, any good journalist is going to be double checking these, but it saves a lot of time. And, of course, there are jobs that are lost.

Jason:

Like, there are a lot of translators, like, people who do translation and who do transcription, it's getting harder and harder for these people to find work. And I don't really wanna minimize that, but also these tools are really, really widely used in the industry. And they're really widely used in the industry not because an executive said, hey, you need to go do this. They're really widely used because reporters who were trying to just, like, get by and do their job, like, started using them and realized that they were useful.

Joseph:

Yeah. I was going to hammer on a couple of things, but maybe just one of them is that you bring up Kevin Roose in the story. He's a New York Times columnist who, I think it's very fair to say, is more pro AI or closer to sort of the industry part of AI than we are, with, you know, writing about the abuses and how it's being used in the real world, all that sort of thing. You write in your piece: if Kevin is using AI so much in such a new and groundbreaking way, why aren't his articles noticeably different than they ever were before, or why aren't there way more of them than there were before? Where are the journalists who are formerly middling, who are now pumping out incredible articles thanks to the efficiencies granted by AI?

Joseph:

What do you mean by that exactly?

Jason:

Yeah. I mean, I'm really not trying to take shots at anyone in particular, but Kevin's been very public about how he uses AI in his work. And, like, he writes about the same number of articles as he always did. They're sort of, like, of a similar tone and similar, like, level of research. You know, he does the Hard Fork podcast as well.

Jason:

So, like, he's a pretty prolific person. Like, he publishes a fair bit, especially for a place like the New York Times, where people don't publish that often. But he writes a lot about how he uses AI tools and how they make him a lot more efficient. And if that is the case, like, why aren't there just tons and tons and tons of really awesome investigations coming from that?

Jason:

And in the same way, I was just like, why aren't there reporters who were, like, okay reporters doing an okay job, who have started using these AI tools that are supposedly so revolutionary? Like, why are they not publishing incredible investigations constantly or just, like, really groundbreaking work all the time? Because that's what would be necessary for these tools to, like, fundamentally change the economics of a media company. You need, like, an entire newsroom of people who are just, like, constantly churning out, like, amazing work because they're using AI, and I don't see that. Like, I haven't seen that.

Jason:

What we do see is you can shit out tons and tons and tons of, like, really bad articles that have very basic facts in them, and that is, like, a business model that different media companies have tried over and over and over again. And it worked back when you were able to, like, SEO them to the top and get a lot of clicks and then get ad revenue from that, but we've seen that kind of business model go down the drain. And so you can still make money doing that, but the people who are making money doing that are, like, individual people who are not journalists, who are just doing AI slop, and they're collecting a few cents from either Google or from, like, Facebook's different programs, and we've written about them a million times. And so if you own a media company, like, why are you trying to compete with that? Because the Internet is full of just, like, so much sameness.

Jason:

It's just like same same same same bullshit everywhere. Like, that is not something that a skilled journalist or a skilled writer should want to try to compete with, in my opinion.

Joseph:

Yeah. We we

Emanuel:

we do own a media company, like, the owners of this media company are on this call. And I think if there was some super efficient, super powerful generative AI tool that would help us do better stories or more stories, we would at least consider it. And as Jason says, some of the tools are so obvious and, like, woven into how we already work, like, in the form of transcription, that there's, like, no real need to have a discussion about them, it's just how it is. We use transcription tools and then verify that the transcriptions are correct. I think the only thing that we had a brief discussion about is images, because we don't have, like, a full time person to do visuals for us.

Emanuel:

We're like, oh, should we, like, do some of the images with AI image generators? And the decision we came to is no, because they look bad and they look like everything else, and the presentation of a story does matter in terms of, like, making people wanna read it. And if we use generative AI, it just doesn't work well for us. If it did, we would maybe do it, you know, even putting the ethics of it aside, but it doesn't. And similarly, like, there just isn't a magic AI bullet to do this job better.

Emanuel:

And if there was, then we would have, at the very least, talked about it. And BI and all these other companies, it's not like they know something we don't know.

Joseph:

Right. I guess just the last question, Jason. Well, what is the answer for media if it's not pivoting to AI and getting 100% of your staff to use ChatGPT in some form?

Jason:

Yeah. I mean, I think we've talked about that a lot, and I think it's just like you have to be as human as possible. Like, you have to treat your audience like humans. You have to say, like, hello. I am a human.

Jason:

I'm writing for you. I'm not trying to write for an algorithm. I'm not trying to optimize every single thing that I do. Like, I'm gonna try to have some fun in the writing. I think that, like, you can do very serious reporting that has a voice, as we have done.

Jason:

But I think that the media companies that are, like, working at the moment all have leaned into the humanity and personalities of their journalists, or their people. And it's like, that's kinda shitty, because it means that journalists have to be personalities. And I think that a lot of people, when they were going through journalism school or, like, at previous jobs, they were told, like, you're not the story. You're telling someone else's story. Like, you're not allowed to have any sort of opinion or point of view.

Jason:

You try to remove yourself from the story as much as possible. And I'm not saying there's no room for that. Like, there's room for really good investigations and that sort of thing. But it's just kind of the way of the world now that you have to, like, show people who you are. Because if you're not, then you're not able to stand out from all of the, like, quote unquote faceless creators out there.

Jason:

Like, we've written about so many of them, and it's so easy to make so much content that no human being can, like, stand up to it. So you need to tell your audience, like, here's how we did this story. Here's, like, why we know it's real. If you're watching this on YouTube, I see I'm going in and out of focus. We'll fix that in the future.

Jason:

It's a new camera. But, yeah, you just have to, like, lean into your humanity. And it's not like we are the only people who are doing this. We're definitely somewhat late to the game on this. Like, podcasters have been doing it for a really long time.

Jason:

YouTubers have been doing it for a really long time. TikTokers have been doing it for a really long time. And that's what people sort of are attracted to at the moment. And that doesn't mean you need to be like, hey, what's up YouTube? Like, a million percent energy all the time.

Jason:

But I think you do need to show that there's, like, a person behind the keyboard to some extent.

Joseph:

Yeah. Totally. That makes sense. I have a lot of thoughts about that, but maybe I'll save them for another time. But you're right in that it's just something that journalists have to do now.

Joseph:

In part, the one silver lining I'll add, it's in part because journalism has done a pretty shit job of explaining what it is and what it does. And then you have the rise of YouTube, and people are trusting that because there is a face behind it, and those two things have sort of intersected, and now you have us, where we have to go on YouTube. We would've been very uncomfortable about that a few years ago or whatever. Anyway, we'll leave that there and I will play us out. As a reminder, four zero four media is journalist founded and supported by subscribers.

Joseph:

If you do wish to subscribe to four zero four media and directly support our work, please go to 404media.co. You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week. This podcast was produced by Michael Hermes. Another way to support us is by leaving a five star rating and review for the podcast.

Joseph:

That stuff really helps us out. This has been four zero four Media. We will see you again next week.