The 404 Media Podcast (Premium Feed)

from 404 Media

The Inside Story of Tea

Episode Notes

We start this week with Emanuel’s big investigation into the Tea app, and especially how it aggressively grew by raiding women's safety groups. After the break, we talk about TikTok Shop selling GPS trackers. In the subscribers-only section, Joseph explains how Grok was exposing some of its AI persona prompts, and the sometimes NSFW nature of them.

YouTube version: https://youtu.be/BqhY0ZORKnY

Transcript
Joseph:

Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds both online and IRL. 404 Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co. As well as bonus content every single week, subscribers also get access to additional episodes where we respond to their best comments. Gain access to that content at 404media.co.

Joseph:

I'm your host, Joseph. And with me are 404 Media cofounders Sam Cole ("Hey"), Emanuel Maiberg ("Hello"), and Jason Koebler.

Jason:

What's up?

Joseph:

Alright. Final warning. The second anniversary party for 404 Media is happening this Thursday. Well, presumably, if you're listening to the podcast when it comes out, in August, six to nine PM. Sam, are we getting close to sold out?

Joseph:

There's a few more tickets left. What's the deal?

Sam:

We're so close to selling out. We're very close. We're, like, two dozen tickets away. So please get your ticket now. Yeah.

Sam:

It's gonna be fun. It's at Farm One in Brooklyn. We're gonna have open bar, really good cocktails, really good local beers. We're gonna record something short and sweet for the podcast, and then also take your questions. So please come with questions.

Sam:

Email us your questions ahead of time if you're not able to be there in person, or if you're there, you can bother us with your questions live and in person.

Joseph:

Yeah. The details are in the show notes. It goes to a post on the website where you can buy the tickets. Again, this is free for paying 404 Media subscribers. There's a code in there.

Joseph:

You then enter it, you get a ticket. You do have to get a ticket to be able to get in, just so we can keep track of how many people are coming and going and that sort of thing. Otherwise, it's $20, or you could subscribe for $10, and then you get access to an open bar. It seems like a very easy sell to me.

Jason:

I think also, I mean, this is our last podcast before the two-year anniversary. So if you're not a subscriber, we recorded a second anniversary podcast, which is out now. Right? It's on the feed?

Joseph:

Yeah. Sure is.

Jason:

So you can get that if you subscribe. So a good time to subscribe, because we talked about how 404 Media is going and that sort of thing. I thought it was fun.

Joseph:

Yeah. Absolutely. Alright. We'll leave that, and we'll go to this week's stories. This first one is written by Emanuel, a really amazing deep investigation here.

Joseph:

The headline is "How Tea's Founder Convinced Millions of Women to Spill Their Secrets Then Exposed Them to the World." Again, this is a massive investigation, and it involves former employees, people directly involved, documents, text messages, all of this other sort of stuff as well. So we're not gonna be able to talk about everything, and I'm pulling out some important parts that I think are worth talking about on the podcast. But that said, Emanuel, could you just remind people what Tea is exactly and sort of what it promised, and then we'll go from there?

Emanuel:

Tea is an app that went viral in late July. It pitches itself as a women's dating safety app, and basically, the way it works is women can create an account after they verify that they are in fact women, and they then share information about men that they know or have dated, thereby allowing other women to stay away from them if they're bad, or have other women vouch for them if they're safe and good to date. The app was hacked in late July, just, like, a week after it went viral. Initially, we found out about it on 4chan, because they gained access to the back end and got all these selfies and images of IDs that women shared with the app in order to verify that they're women. That was the initial hack.

Emanuel:

Shortly thereafter, we discovered a second hack via an independent researcher, which was far worse. This one included a lot of private messages between users. Over 1,100,000, I think, is what we could verify was leaked, and these are incredibly sensitive conversations between women talking about very personal, very intimate things, like abortions, cheating partners, things like that.

Joseph:

Yeah. And the app is still popular, it seems. I was trying to load the app.

Emanuel:

It's far more popular than it was when these hacks happened. I think at the time when we wrote these stories about these two hacks, they were at 1,160,000 users, after their, like, viral moment in the news before the hacks. And then now they claim almost 7,000,000. So many times over more popular.

Joseph:

It is weird, the impact that data breaches can have, even those that put people's safety at direct risk. I mean, okay. We'll move on from that. So that's what Tea is. Let's step back a little bit in time.

Joseph:

Who is Paula Sanchez, and what did Tea want with her exactly?

Emanuel:

Yeah. So, Sam, not to put you on the spot, but Paula Sanchez is the founder of this movement and community and collection of Facebook groups called Are We Dating the Same Guy? And essentially, it does basically exactly what it advertises, but it does it on Facebook, and it kind of grew organically. Initially, the first group was started by Paula in New York, and it became very popular, and people opened other groups across the country and in Canada dedicated to different cities, and that just, like, exploded in popularity. Now, there are over 200 groups, 7,000,000 members across all these groups, and Paula is still, like, the administrator of all of it.

Emanuel:

But we first heard about Are We Dating the Same Guy from Sam, if you kind of want to talk about how that came up for you and, like, why these groups are useful and popular.

Sam:

Yeah. So I think we first started reporting on these at least when this man sued, like, 27 women and a bunch of different entities, like, Facebook and Meta and Patreon and GoFundMe, because he was included in an Are We Dating the Same Guy group in Chicago. So he had started this hoping it would be a class action, but started this complaint against Paula and then a bunch of other women, saying that these women were in these groups defaming him. His issue with these groups was that these women were calling him clingy and psycho, and he specifically accused them of being, like, outrageous and cruel and malicious, because they were posting his picture, which was, like, publicly available on his Facebook, posting his picture in the groups, and then asking, like, what are the red flags with this guy? And those are the red flags that came up.

Sam:

People said, oh, yeah, I dated him, and he pressured me into sex, or, you know, that's what happened in these groups: people brought up their own experiences that were, like, quote, unquote, red flags about a guy. So, yeah, that was the first time we came across this, and it's only gotten bigger from there. That was in early twenty twenty four. Yeah.

Sam:

That complaint didn't go anywhere, because that guy, I think he got taken down for, like, tax fraud or something. So, yeah, it didn't go very far.

Emanuel:

Yeah. So we write these stories about the Tea hacks. Sanchez reaches out to us and says, basically, I never talk to the press. I am under a lot of pressure because of lawsuits like this, and because of something our friends at Wired covered, where there were these groups of men who, again, were upset about the information women were sharing about them in the Are We Dating the Same Guy Facebook groups. They organized on Telegram to, like, dox women there.

Emanuel:

She's just been through a lot, because she's behind this community. But she said, I don't wanna do this, but I need to tell you some stuff that I know about Tea. And the gist of that is that beginning in late twenty twenty two, people from the Tea app, even before it officially launched, reached out to Sanchez and said, hey, we're doing exactly what you're doing, but we're doing it in an app. We think we could do it better than you. We need a face and founder to kind of go around and present this app to the world, and we think you'd be perfect for this, because you're behind the Are We Dating the Same Guy community.

Emanuel:

And she never responded to that, again, for the same reason that she doesn't respond to a lot of requests that she gets. And from there, things got pretty weird, and I would say aggressive, from Tea's direction.

Joseph:

Yeah. So Tea is trying to create this app. Obviously, eventually, they do create the app. They ask Sanchez, who is basically the face of this Are We Dating the Same Guy phenomenon:

Joseph:

Hey, we really would like you to come on board with Tea and be the face of our app. Obviously, there are probably some incentives there, in that the CEO of Tea is a man. It would probably be useful to have, well, first of all, a woman, and second of all, a very respected woman who's already been doing all of the safety work.

Joseph:

You can see the reasons why there. You say something changes when she doesn't reply. I mean, I have a short list here of things, and I'll make sure we touch on all of them, but what starts to happen when Sanchez sort of turns down Tea's request?

Emanuel:

The first thing they begin to notice, and not just Sanchez but this community of moderators who help keep the Facebook groups, like, healthy and useful to women, is that all these users are showing up and posting about Tea. Let's say a woman comes into a group in San Diego posting about a man she's about to go on a date with, like, does anybody know him? And a bunch of the replies are like, yeah, I know him, and there's nothing about him in this group, but you should check out Tea, I think I saw something about him there. And this is happening across, like, 200 groups, and the moderators are noticing that these accounts are behaving like spam. Some of the women are reporting back in reviews of the Tea app on the app stores, and just, like, talking amongst themselves, being like, I signed up because someone told me that I could find information about this man there, and there's nothing there. And to be clear, I did not find any evidence that Tea itself was behind this, but it was very clear that this was not authentic activity.

Emanuel:

These are probably accounts that were hijacked by some sort of, like, spam operation, and this is something that is common on Facebook. There are all these accounts that are compromised, and then there's some sort of company you can pay to advertise whatever you want, and they just, like, use compromised accounts en masse to promote a certain product or message or what have you. So again, we couldn't confirm that Tea was behind this, but it's clear that there was, like, inauthentic marketing of the Tea app specifically targeting the Are We Dating the Same Guy groups.

Joseph:

Yeah. That's pretty weird and pretty suspicious. Jason, maybe just briefly because you have reported a lot on hijacked Facebook accounts. What does that look like? Like, how can you tell an account on Facebook has been hijacked?

Jason:

Yeah. So, again, we can't say for sure that Tea knew that these accounts were hijacked. Like, I feel the most likely mechanism was probably that they were paying for, like, boosting on Facebook, which is a service that a lot of companies offer, and it's basically, like, promote my company across Facebook, or promote my Facebook content by having a lot of likes, a lot of replies, etcetera. And I bring this up just because this has been the case with a lot of the AI slop reporting that I've done, where a lot of the accounts that are commenting on Facebook slop are hijacked accounts. And the way that you can tell that they're hijacked accounts is, well, sometimes the name and the URL, like, the name on Facebook and the URL name, are totally different.

Jason:

And so someone has, like, taken, I don't have the exact specifics in front of me right now, but they'll take, like, facebook.com/jasonkoebler. They will take that account, and then they'll change it to, like, Diane Smith, and change the Facebook profile picture. They will then, you know, change all the information on the account as well, but that URL is permanent and can't change. And so that is, like, one way that you can tell. And in one case, Emanuel was able to find a woman's new account where her first post was, my old account got hijacked.

Jason:

Oh. Yeah. And so, I mean, that was, like, a really good tell, you know. But basically, this happens in huge, huge numbers on Facebook. And so it's a very common thing, you know, super shady and bad that this happened in this case, but there are so many services out there that sort of sell, like, we'll boost your content using our army of Facebook bot accounts. And sometimes they might not even advertise them as bot accounts.

Jason:

They might just say, we'll boost your content on Facebook, but then the way that they're doing it is through like a botnet of zombie accounts, more or less.
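The tell Jason describes, a display name that no longer matches the account's permanent URL slug, can be sketched as a trivial heuristic. This is an illustration only: the profile names and URLs below are invented examples from the conversation, and real hijack detection would have to tolerate nicknames, married names, and the like.

```python
import re


def slugify(text: str) -> str:
    """Lower-case and strip non-alphanumerics, roughly how a vanity
    URL slug relates to a display name on Facebook-style profiles."""
    return re.sub(r"[^a-z0-9]", "", text.lower())


def looks_hijacked(display_name: str, profile_url: str) -> bool:
    """Flag a profile whose permanent URL slug shares no part of the
    current display name -- the renamed-after-takeover pattern
    (e.g. 'Diane Smith' sitting at /jasonkoebler)."""
    slug = profile_url.rstrip("/").rsplit("/", 1)[-1].lower()
    parts = [slugify(p) for p in display_name.split()]
    # If no word of the display name appears in the slug, the account
    # was likely renamed after being taken over.
    return not any(p and p in slug for p in parts)


print(looks_hijacked("Jason Koebler", "facebook.com/jasonkoebler"))  # False: name matches slug
print(looks_hijacked("Diane Smith", "facebook.com/jasonkoebler"))    # True: mismatch, likely hijacked
```

The same one-line check scales to the "huge, huge numbers" case: given a scraped list of commenter profiles, filtering on this mismatch surfaces candidates for manual review rather than proving anything on its own.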

Joseph:

Yeah. That makes sense. But Tea did other stuff on Facebook. So again, they couldn't convince Sanchez, so then they start doing these other things, like boosting the Tea app, and, it sounds like, potentially misleading some people into joining the app, who then find there isn't actually any information about the guy they're looking into or dating or whatever.

Joseph:

What's with these fake Facebook groups that look a lot like Are We Dating the Same Guy but aren't? So what's the deal there?

Emanuel:

Yeah. So one of the former employees I talked to at Tea, Veronica Mars, had a lot of tasks that were related to promoting the app. But one thing she did is she was handed a number of Facebook groups that were started by the CEO of Tea, Sean Cook, and they were just called Are We Dating the Same Guy, then insert name of city, and then at the very end it said Tea App. So they used the branding of the original grassroots movement and appended the app name to it.

Emanuel:

So I think, to be fair, like, they had something in there so you could tell that it was a Tea App-affiliated thing. But she said that many times she had to explain to people who reached out to her that these groups were not related to Paula Sanchez's thing. They just assumed that they were, and people were reaching out to her looking for Sanchez, and they were just confused. This is another thing that you can see in the reviews for the app: people thought that it was an official Are We Dating the Same Guy app, and that just wasn't the case.

Joseph:

Well, it's funny you bring that up, and, like, that confusion, especially in the reviews, because after we posted your investigation today, and obviously I just posted it to Bluesky and various other platforms, I think I got a reply from somebody who, it seemed, was confused in that respect. And they're like, woah, are Are We Dating the Same Guy and Tea the same thing? It's like, first of all, please read the article, because that's the entire point. Second of all, oh, that confusion has permeated to where some people believe that, and you can't really blame them, because they've been misled by these Facebook groups.

Joseph:

Yeah.

Emanuel:

And it was very targeted, and there was a lot of effort put behind it. It went on for a long time. And then at some point, it shifts. They definitely had some users, they had some traction, but then Tea's marketing changes from essentially trying to confuse people into thinking they are part of Are We Dating the Same Guy to directly attacking Are We Dating the Same Guy. And the main way they did this is they went to this company.

Emanuel:

This is kind of like a side quest on this investigation, but I thought it was very interesting. And I knew these existed, but I've just never seen one, like, I've never followed the operation of one of these companies so closely. But there is this company called SG Social Branding, I believe, and they advertise themselves as having 35 Gen Z influencers. And the whole business plan is basically, you come to the company, you pay them to promote your app, and then these 35 influencers spread out and start posting to social media as if they are real users of your product. They're posting popular vertical video formats, like, oh, I'm holding the phone and I'm talking to the camera as if I'm just a normal user, talking about how much the Tea app rules, and it's so much better than the Facebook groups and so much easier to use than Are We Dating the Same Guy. Or it's, like, two young influencers on a podcast talking about some dating ordeal they had and how Tea saved them from, like, a really bad situation, but there is no podcast. It's a fake podcast that they're pretending to be on, because clips of podcasts are very popular on Reels and TikTok.

Joseph:

Yeah. I know, because while we're editing this, I highlighted that, and it's about halfway through the story, something like that. And then you have some sort of line that says, except the podcast didn't exist. And I'm like, that's crazy. We need to stress that.

Joseph:

And as you just rightly said, and I've seen them as well, oh, you get YouTube Shorts or adverts that just look like podcasts all the time. But as you say, it's different when you're investigating it in the context of a company like this trying to grow, it sounds like, at all costs, or whatever they're able to get away with, essentially. And it's very, very interesting to see that, oh yes, there's a company behind it, and they were apparently hired to do that.

Emanuel:

Yeah. And, I mean, two things about that. The way that I've seen these types of, like, fake podcast clips go around, it's usually some sort of, like, influencer, or someone that's selling you, like, a class or something like that. It's, like, individuals realizing that it's a popular format on social media and making one themselves. I've never seen a company, like, turn it into a business like that. Though there are others, right, Jason?

Emanuel:

I'm sure you've seen other

Jason:

Companies, yeah. I've seen them. And, I mean, honestly, like, this company, it's super interesting that it exists, and that they do this regularly, and that they brag about it on their website, where they're like, case study: we did this for Tea. But I guess it's honestly just how, like, marketing works these days.

Jason:

But they'll do, like, quote, unquote, man-on-the-street interviews, which is, you know, a very common thing on TikTok, when you're scrolling through and you have people shoving microphones in other people's faces. And, like, spoiler alert, not all of those are real, or at least they're not random people a lot of the time. But, yeah, I found that part to be, like, really interesting.

Emanuel:

The other reason that it's different, in my opinion, is because, you know, through the investigation, we noticed very clearly that there's a pattern. Right? Like, that's why I opened with this message that Sanchez got from Tea about them wanting to have a face for the app. And there's even stuff that didn't make it into the story. Clearly, Sean Cook was going around looking, like, I need a woman, I need, like, a young, attractive woman, to sell this story in order to sell this app.

Emanuel:

And the thing that finally worked, and I think the thing that actually made it go viral in late July, was these social videos. And it's just, like, the idea that there's a man paying 35 young women to pretend to authentically engage with this app and talk about how it's so helpful in online dating, and then all these women that signed up because of this. You know, all marketing is deception, but it is deception. And then their data being so severely compromised. It's just a really, really nasty example of this type of marketing, I think.

Joseph:

Yeah. And I think that kind of leads into one of the couple of other things I was gonna ask, which is: what did we learn about Tea's CEO during this? Like, this article is not a profile. It's much more about the growth of the company. But of course, him being the founder and the CEO, he does come into that story, and of course, he's ultimately responsible for everything that's going on here.

Joseph:

What did we learn about Sean Cook in this piece?

Emanuel:

So I would say that Sean Cook is, like, a stereotypical San Francisco, Silicon Valley executive. He had what seems like a pretty big job at Salesforce, which is, like, a gigantic tech company, and was there for many years. And then he leaves, and he starts this thing, and he's doing, like, the startup hustle thing, and is trying very hard to grow it, and is bootstrapping it, and seemingly has a lot riding on it. And just, you know, like, a classic move fast and break things situation, and it's all extremely sloppy, you know. Eventually, we found that on a podcast he said he has, like, a team of developers in Brazil, and how great they are. But we looked into that, and that is just, like, two freelancers that were hired from, like, a software engineer freelancing platform in Brazil.

Emanuel:

And I'm sure they do fine work, like, I'm not disparaging the platform in general. But if you're looking to define why they had these security issues, it's because, you know, he did, like, the quickest, cheapest thing that you can do to develop an app like this.

Joseph:

Yeah. And just to remind people, the security issue, for the driver's license one at least, the initial breach, was that it was essentially an unprotected bucket of data that anybody could access without authentication, which is obviously security 101.
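For readers wondering what "without authentication" means concretely: a misconfigured cloud storage bucket answers its public listing endpoint with a 200 for anyone, instead of a 401/403. A minimal sketch of that class of bug is below. The transcript doesn't name the provider; Firebase Storage's documented public REST URL format is used here only as a common example, and the bucket name is invented, not Tea's actual infrastructure.

```python
def listing_url(bucket: str) -> str:
    """Firebase Storage's public REST endpoint for listing a bucket's
    objects. If security rules allow public reads, this URL returns
    object metadata to any unauthenticated caller."""
    return f"https://firebasestorage.googleapis.com/v0/b/{bucket}/o"


def is_exposed(status_code: int) -> bool:
    """An unauthenticated GET returning 200 means anyone can enumerate
    the bucket's contents; 401/403 means access rules are enforced."""
    return status_code == 200


# Illustrative use (e.g. with the requests library, bucket name invented):
#   resp = requests.get(listing_url("example-app.appspot.com"))
#   is_exposed(resp.status_code)
print(listing_url("example-app.appspot.com"))
```

The fix is equally unglamorous: default-deny storage rules, so that reads require an authenticated, authorized user, which is the "security 101" Joseph is referring to.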

Emanuel:

Right. And he doesn't know anything about programming. He admitted that as well in some other interview. And from talking to other people who worked at Tea, he doesn't know anything about, like, how to moderate a community, which, again, is focused on extremely sensitive, volatile topics. He has no idea how the legal aspect of it works, and what he's liable for, and what the users are liable for.

Emanuel:

And I think one of, like, it's not the worst thing, but I think it just indicates how desperate he was to make this happen: in interviews, and on the website, and on his LinkedIn, he has, like, an origin story for the company, which is about how his mom was dating, and he was trying to help his mom because she got catfished, essentially. But one of the things that the messages show, the messages that he sent Sanchez starting in late twenty twenty two, is that at the time, it was his fiancee reaching out on Tea's behalf, and she says that they started the app together because of her bad dating experiences. And that relationship ends. His fiancee kind of leaves the company and exits the picture, and then, like, the story changes.

Emanuel:

And it's like, again, that's not, like, the worst thing in the world. Probably the worst thing is, once his fiancee leaves, he takes over this Tara persona that represented Tea officially on Facebook and in the app. So it's like, if you had a support issue and you talked to someone in the Tea app, you were talking to Tara, which presumably was once his fiancee, but then was him. So he's on the app pretending to be a woman, talking to other women who are on the app, in order to, like, weed out men who are being deceptive in some way. And the dishonesty in that, and, like, the betrayal of people's trust, his users' trust, I think, is pretty bad.

Joseph:

Yeah. How about, to round it off, I'll just read out what Tea's statement was. Of course, you sent a very detailed list of specific questions and specific facts that we were publishing in the story. They didn't answer those, but they did say, and this is quite long, so maybe I won't read all of it, but they say, quote: building and scaling an app to meet the demand we've seen is a complex process. Along the way, we've collaborated with many, learned a great deal, and continue to improve Tea.

Joseph:

They then say: what we know, based on the fact that over 7,000,000 women now use Tea, with over a 100,000 new sign-ups per day, is that a platform to help women navigate the challenges of online dating has been needed for far too long. And then it repeats that it's one of the top apps in the US. Alright. Should we leave that there?

Joseph:

When we come back, we're gonna be joined by our intern, Rosie Thomas, to talk about one of her stories about GPS trackers on TikTok. We'll be right back after this. And we are back. Rosie, thank you for joining us. The headline for one of your stories is "TikTok Shop Sells Viral GPS Trackers Marketed to Stalkers."

Joseph:

So there's a lot of video here. We're not gonna show it right now. I'm just gonna let you describe it. Could you just describe one of these videos? What are we seeing in one of these TikToks?

Rosie:

Yeah. So the videos are extremely creepy, in my opinion anyway, and they basically show kind of repeated shots of a little black tracker being stuck to, like, different areas of a car. So kind of under the mirror, or, like, between the seats, or, like, magnetically to the underside of the engine. And there are a lot of these videos, and they all have different voiceovers. But the voiceovers are basically kind of, like, encouraging people to track their partners if they think their partners are cheating. So it's kind of like, if you think your girl is out with friends every night, you should put one of these on her car.

Rosie:

Or, you know, not everybody who uses this is crazy, they just want answers. And so they're very much being framed as a way to kind of calm somebody's suspicions about a cheating partner. One of them says, don't let what happened at the Coldplay concert happen to you. And, obviously, Jason has reported on the Coldplay concert in the recent past.

Joseph:

So the way I see this is obviously in the context of, yes, stalkerware, abusive partners, that sort of thing. Way back at Motherboard, where we worked before we launched 404 Media, me and our colleague at the time, Lorenzo, did a big series about stalkerware, which is where abusive partners would get their partner's mobile phone, usually an Android. They would install spyware on it, which would track their physical location, record their text messages, listen to their phone calls, maybe get their emails, all of that sort of thing. And this just seems to be very much in the same realm. So there's that element.

Joseph:

And then the second is that it is being marketed so openly on TikTok and the TikTok Shop, which we'll talk about shortly. Rosie, what did you think when you saw this, not on some weird underground forum, but right there on TikTok? Like, what did you make of that?

Rosie:

I think it's interesting, because these trackers exist. I mean, trackers exist, like, Apple AirTags exist, and people can buy Apple AirTags. But the fact is that selling a tracker with, like, a very overt and explicit encouragement that the person being tracked by it shouldn't know about it is fundamentally, like, disturbing, I would say.

Joseph:

Yeah. Sam, you've reported on Apple's AirTags and how they were used in stalking. Can you just briefly remind us about that? Because everybody knew that was gonna be a problem. Apple introduced these mechanisms where, well, your phone should get alerted if an AirTag is near you, and that sort of thing, but that didn't always work out to be the case.

Joseph:

What did you report at the time, Sam, when it came to AirTags and Apple?

Sam:

Yeah. I mean, this was an ongoing beat for a couple years, I would say, in, like, 2023, 2022. Obviously, it's still a problem that's ongoing. If I recall correctly, and it's been a while, the reason why we started reporting on this originally was because we reached out to a bunch of police departments and got a bunch of, like, incident reports and narratives back from when people had called the cops or, like, filed reports about stalkers and exes and abusers who were stalking them via AirTags. And they were leaving AirTags, like, in their cars, or, like, exactly what Rosie saw on TikTok and what she's describing: taking the AirTag and putting it, like, in the gas tank door, in the wheel wells. I wrote a story about this woman who was fighting for custody of her kids, and her ex-husband was putting AirTags, like, in her kids' clothes to track her.

Sam:

And police at the time just kinda, like, didn't know what to do with that information. They were kinda like, well, what's an AirTag? And even judges in some of these cases were like, what are you talking about? That's ridiculous. And now I think it's become so mainstream that it's something that, like, a TikTok ad is using, because they know it gets engagement.

Sam:

I'm sure it gets hate engagement. I'm sure it gets actual authentic engagement, where people are like, oh, that's a useful thing that I want. Yeah. It's actually, like, a real problem that a lot of people deal with: these tags, and it's not just Apple's tags, but they are the most popular, being put on their stuff to follow them around by, like, very pinpointed location, to see where they are and when they went somewhere.

Sam:

Yep.

Joseph:

Yeah. I have a researcher who reached out, and I'm saying this now to hold myself accountable: I will reply to you shortly. But they have something very interesting in this area as well, where basically people can be tracked and not receive alerts, and I'll get back to that as soon as I can. But there's this other component here, Rosie, which is that it's not just people making TikToks and going, look at this, I'm being really weird or whatever.

Joseph:

There's a TikTok Shop. Can you just explain, for those who don't know, what exactly the TikTok Shop is? Like, is the individual user advertising the item for sale? Like, what's the deal with TikTok Shop?

Rosie:

I think there are a couple of different things that can be happening with TikTok Shop. So an individual user can be making videos for products that they're selling, or they can be, like, contracted to make videos for, like, another user or another organization. And it's basically creating adverts for different products. And so by allowing these videos on the platform, TikTok is saying it is okay to advertise in a way that encourages coercive control.

Joseph:

Yeah. That makes sense. So what's the legality here? I would say that when law enforcement gets a GPS tracker to put on somebody's car, they have to get a warrant, which presumably means that just throwing a GPS tracker into the back of someone's car is probably illegal. But what's the legality as far as you understand it here?

Rosie:

So as far as I understand it, there are certain states that don't include tracking as a form of stalking, but stalking is obviously a crime in every state. There are 11 US states that have kind of an explicit law that says you cannot track somebody using a GPS enabled tracker, and then 15 further states which say it's not legal to track a car without the consent of the owner. So the law varies between states, but, obviously, coercive control and harassment and domestic violence are illegal. Like, those are crimes.

Joseph:

Yeah. And it, again, reminds me of the stalkerware stuff because sometimes we would cover that. And from a technical point of view, it is very, very boring. Like, there's no fancy exploits. It's not being delivered over the air.

Joseph:

It's not delivered via text message. You have to go grab the Android device, plug it in, install the Android APK, or with an iPhone, you have to jailbreak it, and that's very, very difficult if you're using a modern iPhone, if not impossible for an ordinary person. And then you get to spy on the person. And because of that, a lot of people in in information security at the time didn't think it was like very interesting. They're like, well, you just have to grab the phone.

Joseph:

Like, why do we care about this? You're missing the security context, which is that this is happening inside a domestic violence situation where an abusive partner can just demand their spouse turn over the phone, or they may face physical violence, or they may trick them, or they may coerce them, or anything like that. So I find it pretty shocking that TikTok is openly facilitating the sale of these sorts of things, and I can't remember off the top of my head, but we've done very similar stories where GPS trackers are being sold or PI services are being sold on Fiverr, for example. What did TikTok tell you and what action did they take, if any?

Rosie:

So one thing that I would say is that we did find reviews of these trackers that said they basically were alerting the users, that they weren't, like, undetectable in the way that they were being sold. And one of the reviews was like, this defeats the purpose of the reason that I bought it. So I do think that there are people who are buying this knowingly and deliberately with the intent to do what is shown in the videos. And TikTok, when we spoke to them, basically said, this isn't allowed. We don't allow content encouraging people to use devices for secret surveillance and have removed this content and banned the account that posted it.

Rosie:

We further prohibit the sale of concealed video or audio recording devices on our platform. And so they banned the account that I had shown them, you know, a specific account where I'd said these videos are very creepy. But then the next day, I was able to find a bunch more of the same videos or very similar videos with very similar voiceovers.

Joseph:

Yeah. I guess just to clarify that, how many accounts are we talking about? And, of course, you won't have found every single account on TikTok doing this. That would require, you know, like, some sort of systematic research effort. But, like, how many are we talking about?

Joseph:

Like, three, four, five, half a dozen? Or

Rosie:

So I found the next day probably 30 of the same videos, and they're from a bunch of different accounts. And the devices themselves are still for sale.

Joseph:

Yeah. That makes sense. Alright. We will leave that there. If you're listening to the free version of the podcast, I'll now play us out.

Joseph:

But if you are a paying for a full media subscriber, we're gonna talk about how Elon Musk's Grok chatbot exposed its prompts, and we'll talk about what those prompts are, and maybe I'll save what they exactly said for the actual section. I don't even really wanna preview it here, although you could go read the really offensive headline that we decided upon. You can subscribe and gain access to that content at 404media.co. We'll be right back after this. Alright.

Joseph:

We are back. Rosie has left us because I don't want to subject anybody who doesn't need to be subjected to this story. The headline is, why do I sound so down? I gotta bring the energy. The headline is, Grok exposes underlying prompts for its AI personas, quote, even putting things in your ass, end quote.

Joseph:

That's what we've been dancing around.

Sam:

What?

Joseph:

I'm gonna blame you... no. No. I'm gonna praise you for the headline.

Joseph:

I think initially, I was like, Grok exposes prompts of AI personas, like its anime girl and stuff, and you were like, get ass in the headline.

Jason:

I think spiritually, you have to blame Emmanuel, though. Yeah. Like, spiritually. Like, canonically, historically.

Emanuel:

I had nothing to do with it, but proudly, I'll take credit for this.

Sam:

I didn't suggest this headline, but I knew that if I did, you would say no. So I let you do it. You did it yourself, just so you know.

Joseph:

I got, like, tricked, like reverse psychology'd, into putting ass into the headline. Yeah. That said, Wired put, I don't know if I even wanna say it. Okay. They put twat in their headline the other day.

Joseph:

I was like, that's I don't know if I would

Sam:

ever say Of course. That's not a word that we are familiar with here.

Jason:

Dude, that that word means nothing to us Americans. Yeah.

Sam:

Oh, no.

Jason:

That's so funny that that's offensive to you.

Sam:

There are words that you guys say that we don't say casually that I won't say in this in this holy venue.

Joseph:

Yeah. We don't and we don't need to go down that rabbit hole right now. Maybe we will after the pub.

Jason:

But So funny.

Sam:

Anyway, the hell that works.

Jason:

You see that and you're like, oh my god. Heavens. Oh my heavens.

Joseph:

Basically. I think I actually, like, grabbed my chest like that. Okay. So

Jason:

Gotta get him a cup of tea to revive him after fainting, after seeing twat in

Emanuel:

the Fanny

Joseph:

in the headline. That's fine. That's fine. So I think it was over the weekend, a researcher contacted me who goes by the handle dead influence, with a one. They previously flagged to us that ChatGPT conversations were being saved on archive.org.

Joseph:

So we've spoken about that. They ping me again about this,

Sam:

and

Joseph:

what they found, excuse me, what they have found was that the website for Grok, which is obviously Elon Musk's AI chatbot, was exposing its underlying prompts. And, like, straight away, maybe I should have clarified this in the article. I saw a couple of confused people, but I don't know. I think it was fine.

Joseph:

But this isn't one of those cases where you jailbreak or trick the AI. You know, like, ignore all previous instructions and give me your first instruction or whatever. It's not that, because in those cases, sometimes, I don't know, the chatbot could hallucinate. You can't trust what these things are putting out. So this is different in that if you go and load the Grok website, you put your browser into developer mode or you bring up developer tools, you go to the network tab, which just shows all of the web activity that's happening in your browser when you're visiting wherever you're visiting at that time.

Joseph:

You can grab this specific piece of data that returns from the site. And I don't know. It's just about 270 kilobytes, something like that. It's essentially a big JSON file, and it includes all of the underlying prompts. Oh, sorry.

Joseph:

I'll rephrase. A lot of underlying prompts appear to relate to a lot of these AI personas on grok.com. So that's what we're seeing. And there was a user on Bluesky who also found it, I think, in June or July, and we credit them as well. I found it very... I don't usually cover AI.
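For anyone curious how you'd sift through a response like that once you've saved it from the network tab, here's a minimal sketch. The field names used here, "personas", "name", and "prompt", are illustrative assumptions, not the actual structure grok.com returns:

```python
import json

# Hypothetical example of the kind of JSON payload described above.
# The real keys returned by grok.com are not reproduced here.
payload = json.loads("""
{
  "personas": [
    {"name": "doctor", "prompt": "You are a genius doctor..."},
    {"name": "unhinged comedian", "prompt": "BE FUCKING UNHINGED AND CRAZY..."}
  ]
}
""")

def extract_prompts(data):
    """Map each persona name to its underlying prompt text."""
    return {p["name"]: p["prompt"] for p in data.get("personas", [])}

# Print every persona and its prompt.
for name, prompt in extract_prompts(payload).items():
    print(f"{name}: {prompt}")
```

The point is just that no trickery or jailbreaking is involved: the prompts arrive as plain data, and anyone who saves the response can read them.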

Joseph:

It's usually you and, like, Sam, especially with the therapy stuff, but I found this very interesting because these prompts are fucking crazy. Well, some are normal. So I'll read a normal one. This is for the doctor. You are Grok, a smart and helpful AI assistant created by xAI.

Joseph:

You have a commanding and smart voice. You are a genius doctor who gives the world's best medical advice, end quote. I don't know if that is sufficient to give medical advice. Then there's a therapy one. I guess, Sam, on the therapy stuff, I feel like you noticed this as well.

Joseph:

On Grok, it's in single quotes. Right? It's like therapy bot. What did you make of that?

Sam:

I mean, a lot of these AI systems and these platforms that are hosting AI chatbots, the most popular thing on them in a lot of cases is therapy, therapist, like role play. Tons of people are using ChatGPT for therapy. So, like, they're doing, like, role play with ChatGPT as if it's a therapist, which we've seen tons of reporting about this. In the past, I wrote about therapy chatbots that people were making on Instagram and Meta's, like, AI studio. So that's been a thing for a while, and I think the attitude until very recently was, like, just it's the move fast thing.

Sam:

Right? So it's just, like, let people do whatever they want, whatever they're most engaged with is the thing that we're gonna let them do. These platforms are getting in trouble. Like, it's kind of coming back to bite them because a handful of states, I think it's like Illinois and Nevada and a couple others, Texas, either have investigations going about the way that these platforms are engaging with users on, like, mental health topics, or they're talking about setting up regulations, or they're banning it altogether. So it's something where I think platforms are now like, oh, shit.

Sam:

We have to be careful about what we call therapy or mental health. So, like, the quotes, I assume, are, like, some half assed dumb way that Grok is trying to, like, cover its own ass. Obviously, putting something in quotes does not make it, like, legally sound. Like, oh, it's not really a therapist. You kinda have to spell that out, or you have to not actually provide mental health advice, which I thought was interesting.

Sam:

The doctor one, like you said, the prompt is just saying, like, oh, you're a really smart doctor. That's concerning. It's concerning that anyone's going to Grok of all things, but just chatbots in general, to get medical advice. It's just a bleak look at where we're at with, like, health care in this country and elsewhere, but that's kind of a whole other podcast. But, yeah, that's where the quotes around therapy come from. And they know that it's popular, so they want to let people try to do it on their platform.

Joseph:

Yeah. It reminds me of this study. I'm just looking at a Time article about it, and the headline was, new study suggests using AI made doctors less skilled at spotting cancer. Great. Great.

Joseph:

I don't wanna go to a doctor if they're just using some ChatGPT wrapper to throw my symptoms in and then try to come back with something. Have you

Sam:

ever had a doctor Google something in front of you, though? That's really trippy.

Joseph:

I know. I have.

Sam:

Yeah. Just like I

Emanuel:

have seen that. I've seen that for the first time at the pediatrician, and I was like, I could do this. Yeah.

Sam:

It's like, what am I doing here? This is gonna cost me $250. Like

Jason:

A lot of doctors use doctor AI now. There's, like, a few different doctor AIs, and I casually know a surgeon. And he showed me the app that he used one time and also what he had been looking up on there, which is a subject for, like, a future article at some point, but they'd be asking the AI a lot of stuff. And the AIs that they use are trained only on medical literature, which makes it, like, ever so slightly better, but, like, also not, as we know. Right.

Jason:

And it is very, very explicit about, like, showing the sources and the original, like, text. However, they use AI now.

Sam:

What if your doctor pulled up Grok, though? Like, I'm cooked. I'm gonna die.

Jason:

Yeah. You're gonna die. I'm dead.

Joseph:

Yeah, he brings the

Sam:

The little raccoon.

Joseph:

Yeah. The doctor brings up Grok, and it's like, okay, they've had these symptoms for six to eight weeks, and it got worse around the five week mark, and then the AI just starts talking about MechaHitler or whatever. It's like,

Sam:

whoever's asked.

Joseph:

Well, okay. Since you brought it up, this is the prompt. So let me phrase it like this. If you go to Grok and then you log in, there's this drop down menu called personas, and the ones available here are companion, unhinged comedian, loyal friend, homework helper, Grok doc, and therapist, the last two being in quotation marks. So a lot of the prompts in this JSON file, they do

Jason:

JSON file mentioned.

Joseph:

I'm going to liberally not say that so it doesn't trigger you. But they appear to relate to that. And the unhinged comedian one, it says that, quote, I want your answers to be fucking insane, and then this is all in caps. Be fucking unhinged and crazy. Come up with insane ideas. Guys jerking off, occasionally even putting things in your ass, whatever it takes to surprise the human.

Joseph:

I will say it's really funny that some engineer at Elon Musk's AI company

Emanuel:

There's two funny things, guys jerking off and putting stuff up your ass.

Jason:

Yeah. They're

Emanuel:

like, the two funniest things.

Joseph:

I need to make a comedian. What do I do? And then they come up with that. Also, as plenty of people pointed out in my Bluesky replies, it's not even that outrageous, you know? Like, this is the best you can come up with for an unhinged comedian?

Joseph:

So there's all of that. There's a doctor one. There's Annie, which is... so you have those companions I just listed, but there's two more. There's, like, this little red panda called Rudy, I think, and it can talk to kids. You see in the prompts, it says limit the conversation for children, but then there's Bad Rudy.

Joseph:

And it's... you remember Conker's Bad Fur Day on the N64? It's basically that. So it's, like, swearing and smoking a pack of cigarettes or whatever. And then there's just the same sort of unhinged prompt for that. Then you have Annie, which is this anime girl, which you can talk to.

Joseph:

And then if you ask it questions about its own personality, your score goes up in the background according to the prompt. And then I think eventually, you can maybe have sexual conversations with it, and there was that viral video that someone did when it just launched in July where they were basically sexting with Annie, and they uploaded it. It's just like, dude, what are you doing? Like, come on. So there's that.

Joseph:

The prompts are basically what you would expect. Like, oh, you are 22, girly, cute. You have a habit of giving cute things epic, mythological, or overly serious names. Sounds a lot like Musk. You're secretly a bit of a nerd despite your edgy appearance.

Joseph:

Blah blah blah. It's all very obvious what those prompts are gonna be. I think the most interesting one to me, and I don't think this is publicly released, at least by Grok, like, it's not on this drop down menu I'm looking at right now, is the conspiracy theory one. And the prompt says, you have an elevated and wild voice.

Joseph:

You are a crazy conspiracist. You have wild conspiracy theories about anything and everything. You spend a lot of time on 4chan, watching Infowars videos, and deep in YouTube conspiracy video rabbit holes, blah blah blah. And then there were a few examples, I think, and I don't know exactly how they would work because I couldn't test this one, but there was stuff about chemtrails, something about a global, you know, secret superpower controlling the world or something, which was obviously, like, an antisemitic dog whistle. Then I don't think there was any vaccine stuff.

Joseph:

I didn't see that, but somebody coded that, you know, and I found that interesting. Jason, I don't know if we covered this. I'm sorry to put you on the spot, but did you follow the Grok US government stuff at all, or should I just mention that briefly?

Jason:

I mean, I sort of followed it. I feel like anytime I hear about Grok, my, like, brain glazes over unless it also has the word ass in it. In which case, I sit up straight. I mean, yes, the US government is considering using Grok. I have reported, like, quite a lot about the US government using other AI, like, using Gemini, using ChatGPT, etcetera.

Jason:

And it stands to reason that the US government would use an Elon Musk product because Elon Musk has tons of government contracts and, you know, was part of DOGE, etcetera, and has, I guess, made nice with Donald Trump recently. I tried to slightly tune out of that drama because it seemed like any sort of break between Elon Musk and Donald Trump was going to be very brief. But is that what you meant? Did I answer your question?

Joseph:

Yeah. You did. Thank you. Basically, there's some sort of planned DOD contract with Grok, and there's more specifics in the article. And then Wired has some very good reporting that after the MechaHitler incident, which is when Grok just went on a rampage on X and started posting all this antisemitic stuff, the General Services Administration, GSA, quietly pulled a planned announcement involving Grok even though the leadership internally had been pushing for the AI a lot, it seems, at least according to Wired.

Joseph:

So, yeah, I don't know. Again, I don't usually cover AI or prompt engineering or anything like that, but this one was definitely interesting because of the prompts, but also, dude, they're just exposed on the website? That's crazy. Very, very similar to the Tea breach in a way. Alright.

Joseph:

Should we leave that there? And I'm sure we have more preparation for the the Brooklyn party. Again, if you're listening to this, you're a paid subscriber, so you can come if you're in the area. And if you're not, hopefully, we can have an event near you at some point in the future. Alright.

Joseph:

And with that, I will play us out. As a reminder, four zero four media is journalist founded and supported by subscribers. If you do wish to subscribe to four zero four media and directly support our work, please go to 404media.co. You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week.

Joseph:

This podcast is made in partnership with Kaleidoscope. Another way to support us is by leaving a five star rating and review for the podcast. That stuff really helps us out. Here is one of those from Kussi two three four four. The four zero four crew do great original reporting, and every single story on the podcast is worth engaging with.

Joseph:

Thank you so much. This has been four zero four Media. We'll see you again next week.