The 404 Media Podcast (Premium Feed)

from 404 Media

Signal's Meredith Whittaker on Backdoors and AI



Transcript

We're giving paid subscribers access to this episode early! This is a special interview episode with Meredith Whittaker, the president of the Signal Foundation. I'm sure you all know, and maybe even use, the Signal messaging app. Here we sat down with Whittaker to talk all about the state of Signal today, the threat of AI to end to end encryption, what backdoors actually look like, and much more. This is a wide-ranging discussion where one of the few journalists who has revealed new details about backdoors (Joseph) gets to speak to one of the most important people in the world of encryption (Whittaker). Definitely take a listen.
Joseph:

Hello, and welcome to a special interview episode of the 404 Media Podcast, where we bring you unparalleled access to hidden worlds both online and IRL. 404 Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co, where you'll also get bonus content every single week. Subscribers also get access to additional episodes where we respond to their best comments. Gain access to that content at 404media.co.

Joseph:

I'm your host, Joseph, and with me today is Meredith Whittaker, the president of the Signal Foundation. Signal, as many of you will know, is generally considered the gold standard of an easy to use, consumer friendly, end to end encrypted messaging app. Signal has a long history. I remember way back in, I guess, around 2010, although I joined it a little bit later, there was TextSecure and RedPhone, and then they morphed into the Signal app that we all know today. And then, of course, beyond the app itself, there's the Signal protocol, which is, you know, I'm butchering it, but, basically, the encryption itself, I'll say that.

Joseph:

I'm sure I'll get some angry emails, but that is also implemented in WhatsApp and Facebook Messenger, basically making that protocol a linchpin of privacy and end to end encryption for billions of people. Right? So here, we're gonna talk about the current state of privacy, the realities of backdoors, and the threats against end to end encrypted messaging services worldwide. Meredith, thank you so much, and welcome to the show.

Meredith:

Such a treat to be here, Joseph. I am such a huge fan of your work and of 404 Media, so I'm just gonna put a plug in right at the jump that folks really should support it. You all are doing so much heavy lifting in an ecosystem that has sadly been hollowed out a bit too much in the last couple of years. So thanks for that work, and, again, really happy to be here.

Joseph:

Of course. Absolutely. So I think just to start, everybody knows Signal, I would say, but they may not know some of the numbers behind it. Can you just give us a sense of, today, how many users are we sort of talking about, and is that number growing?

Meredith:

That number is growing, and I will just, off the bat, be annoying and cagey here and say we don't give exact numbers of our user base. One reason for that is that it's pretty volatile. We will see massive growth in a region in, you know, response to a political issue or, you know, what have you, and then perhaps some drop-off, and, you know, it really expands and contracts a bit, but it is overall growing. We are seeing increased sensitivity to privacy. You know, that's evidenced everywhere from our user growth charts, to the fact that Apple and Meta are spending billions of dollars on advertising really indexing on that one value, to, you know, simply the tenor of the public conversation as things like the AT&T data breach, which you all reported on.

Meredith:

You know, it continues to make people aware of the very real material dangers of the mass surveillance that most of our core digital infrastructure is conducting on us every day. So, you know, Signal, if you want a little shortcut here, has been downloaded about 200 million times from the Play Store alone; I don't have the exact number in front of me. And, of course, that's one of three clients, and we're looking at trends that really point up and to the right, and we have, you know, some pretty big plans in the future in terms of how we can ensure that Signal as core infrastructure, both the messenger and the protocol and the work that we're doing across the stack to enable privacy in an ecosystem that defaults to surveillance, is able to grow and spread and become increasingly foundational to a digital ecosystem, the Internet, whatever metaphor we wanna use for the fact that, you know, this now is the nervous system of our social and political and economic life.

Joseph:

Yeah. For sure. And, I mean, you mentioned plans and, of course, growth. That makes me think of how Signal, over, I mean, not even the years, but almost the months, it feels, because you will release new features somewhat rapidly, or you have recently. I mean, stickers are a massive part of Signal now.

Joseph:

We are constantly using them in our group chats. I make my own stickers, you know, for in-jokes and all of that sort of thing. Is the idea there that you're trying to introduce, you know, at the end of the day, fun features? Are you introducing those to generate growth? Like, is that the thinking behind it? Or

Meredith:

Well, we are introducing features that we hear from people across the world, and these are different features depending on the region, the context, even the age group, are core ways that people communicate. So stickers are not hugely popular, for instance, in many contexts in the US, but they're wildly popular, they are core communications tools, in much of East Asia. And so a messenger app without stickers is not very useful to people who, you know, have facility with that paradigm, who, you know, think in stickers, who send stickers as part of a, you know, conversational rhythm that is, you know, very familiar. So, you know, part of what we're doing there is trying to put the core function of communication at the center of what we do.

Meredith:

People don't pick up Signal to flex the fact that they care about privacy. They pick up Signal to talk to someone they love or like or want to speak to in some other way. It is fundamentally about human communication, which itself is fundamental to being human. So we wanna respect the fact that that takes many forms and many targets, and stickers, in one case, are very important. So, you know, it's to enable growth.

Meredith:

I think, you know, often, when we're talking about communication networks, growth comes because the network effect really matters. So, you know, communication networks, to be, you know, 101 about it, increase in value with the quantity and quality of the nodes added to the network. So the more nodes, the more valuable.

Meredith:

So, you know, put another way, no one buys the first telephone. Right? Put another way, if you switch to Signal but your friends don't, you haven't really switched to Signal. You're not really able to use it. Right?

Meredith:

So we see growth perpetuated by collective events or collectives. You'll see, you know, a very easy one is political volatility, when, you know, the distance between physical safety and digital privacy collapses. Ukraine is one of our hugest markets, and, you know, we saw that rise from a very small market to a very large market, you know, obviously, in relation to the Russian invasion. We see growth in groups that happens in relation to just big tech messing up, the incumbents making a policy change. In early 2021, WhatsApp announced a change to its terms of service where they announced to their users that they were going to be, you know, giving data to Facebook, and they had promised for a long time not to do this.

Meredith:

This was a broken promise, and people noticed. People do actually care about privacy, and they switched in droves to Signal. You know, it basically DDoSed our servers. We had to recover from that. Did a very good job.

Meredith:

You know, bless the team who stayed up nights doing that hard work. And then for over three months, we were number one in the App and Play Stores across 70 countries. So, you know, what we want here is to be ready for those precipitating events, to have the features that make Signal seem easy and intuitive, you know, across the globe. And those aren't gonna be the same features in every jurisdiction. You know, stories, for instance, which are not hugely popular in much of the West, are hugely popular in South Asia, hugely popular in Brazil and other parts of South America.

Meredith:

So, again, we're, you know, we're trying to cover a lot of ground, but, ultimately, it's about making Signal as easy to use to communicate in the way that you're familiar with, and that your friends are familiar with, as possible, so that when you need it, it's truly there for you.

Joseph:

Yeah. It's funny. I totally forgot stories were even a thing in Signal. I don't think I've ever used that single feature.

Meredith:

You are missing out. You are just really missing out.

Joseph:

If I see something that looks like Instagram in any way, I'm like, no. No. No. No.

Joseph:

No. No. It's not even a security thing. It's just like, I don't know, I've never engaged with that sort of content.

Joseph:

You see what I mean? But yes.

Meredith:

Well, I'm just gonna say, I really love stories. You know, for me, like, I love seeing what my friends are doing. I love cute pets. I love a cat. Right?

Meredith:

You know, no shame in that. And I feel really uncomfortable with Instagram. I feel really uncomfortable with, sort of, you know, that stuff being scraped for AI, or, you know, having ads run against it, or sort of showing up in some stranger's feed. The whole thing, you know, went from a photo-hosting app that was slightly better than Flickr to, you know, a, you know, biometric surveillance hellscape. So having a place where I can share, like, cat photos or, you know, cool photos that I took, where I know they're not going anywhere.

Meredith:

Like, you know, they are literally deleted off of, you know, my device and everyone else's. They are encrypted. No AI is run against those. I'm not exposing a stranger's face to facial recognition that could imperil them because they overstayed their visa, and that's run against some Clearview, you know, system and the Meta back end that goes to law enforcement. You know?

Meredith:

I don't think that is happening now, but it's incredibly technically possible, and it's the exact kind of thing that could happen. So, anyway, I love stories. I think folks should, you know, enjoy stories, and it doesn't come with all of the stress that Instagram and the other surveillance apps come with. But we also give you the option to turn them off permanently and never think about them. So we're different than big tech in that way as well.

Joseph:

I think that must have been what I did because, again, I literally have not thought about that in forever. I guess just the last thing on features is that as you introduce more features, you at least theoretically introduce more attack surface. Right? That is just the nature of product development. How do you introduce these new features while keeping the app secure?

Meredith:

I mean, that's a good question. I think the, yeah, the paradigm is the more complex the code, the more room there is for bugs to hide. We, you know, we are audited regularly. We engage in, you know, security best practices. We have code review practices, and I think, you know, we're also developing, unlike almost any other large-scale messenger, in the open.

Meredith:

So people can watch even before the feature is pushed. We are developing in the open on our repos. There are people who will go in and see, you know, bits of code behind a feature flag that's sort of, you know, being iterated on, and begin to hypothesize in our forums or on Reddit or wherever they do it about, you know, what we might be building. So there's a lot of scrutiny and a lot of eyes on our code, which is, honestly, a gift. Right?

Meredith:

We have people looking at it, finding issues, raising concerns. You know, sometimes it's bad faith. Sometimes it's annoying. Whatever. But, nonetheless, that's a powerful immune system.

Meredith:

So, you know, that's not a one weird trick. There is no one weird trick. The the trick is you never stop asking that question. You never let down the vigilance. You never think you're safe and out of the woods.

Meredith:

Right? But I do think we have a lot of practices and a development paradigm in place that puts us in a much better spot than any other competitor. And, I mean, let's be real. Like, if you look at the competitors' apps, every single quarter some product development team has got to ship some crappy feature because, let's say, someone named Mark is obsessed with the newest tech hype. Right?

Meredith:

And so, you know, in terms of bloated features, I think Signal is actually incredibly lean and elegant, and it really is that balance between how do we not fall behind the norms, such that we'd be genuinely not useful for certain forms of digital communication that most messengers would be useful for, you know, while maintaining our integrity as best in class for security and privacy.

Joseph:

Sure. Alright. Last sort of setup question. Years ago, I remember The New York Times published an article based on a subpoena Signal received from the US government, and it showed, well, I think a lot of listeners already know, but it showed that Signal cooperated with the legal demand and returned the user data it had, which was, I'm just looking at the quote now, the time the user's account had been created and the last time it connected to the service. Obviously, not much data.

Joseph:

You know?

Meredith:

No.

Joseph:

Is that still the case with what data Signal will provide to the authorities if compelled to do so? Is it still basically the same as it was? Or

Meredith:

And we go out of our way to get that as close to nothing as possible. So, yeah, you can look at signal.org/bigbrother, and you can look through other subpoenas that we have received and unsealed and posted there. We fight every subpoena that we get. If we are forced to comply, if we lose that fight, we then have to comply. That's how it works, and that's why to be a truly private communications provider, you cannot have the data because, ultimately, you can be forced to comply or shut down or, you know, what have you.

Meredith:

So we do comply, but what we comply with is vanishingly small. And if you go to that site and it hasn't been updated recently, I believe it is updated, I'm not actually looking at it right now, but we're in the process of posting some newly unsealed subpoenas there now. So you'll get a number of examples of how those look and just how little data is available there.

Joseph:

Sure. Alright. That was me spending 15 minutes setting it all up, because I just think that context is important. But let's now zoom out a little bit. And today, in 2024, what do you see as the most pressing threat against end to end encrypted messaging?

Joseph:

It's a very fast-moving world. We have the EU chat monitoring proposals. We now maybe have another debate around the crypto comms around the Trump shooter as well. But is it those? What is the most pressing threat that you see against end to end at the moment?

Meredith:

I don't think in stack ranks.

Joseph:

It's not the, it's not the Threat Olympics. Yeah. I guess. Yeah.

Meredith:

Yeah. I mean, we're talking about a kind of nexus of threats, and primary among them is, you know, that centralized power tends to constitute itself via information asymmetry. And so, you know, you can look back to 1976, with the US government trying to prevent the publication of Diffie and Hellman's paper on public key cryptography, because they didn't even want the paper out there. Right?

Meredith:

There has been anxiety post-World War II, you know, around the idea that any network for communications that people can access could be off limits to government scrutiny. And that is not going away. I think, you know, we need to be on the lookout for threats to the viability of encryption itself, threats to implement a backdoor or a front door or bolt on a surveillance service as a mandatory component of any end to end encrypted communication service, thus completely nullifying the entire point and tiring us out with having to argue against rhetorical tricks over and over again, as the European Commission has just done and then withdrawn. So those are legislative threats, and we've seen many of them in the UK and in Europe. You know, we saw Australia propose some of these, but they seem to be walking it back with some fairly sane language in, you know, some of their latest regulatory and policy proposals.

Meredith:

So I think we'll see more of those, but I do get a sense that the arguments are getting through to some extent...

Joseph:

To the policy makers, you mean.

Meredith:

To the policy makers, and the people who may not have understood the type of threat that their well-meaning legislation was posing. Because I think, in some sense, we're dealing with a scenario where the pretext is incredibly inflammatory. It's very emotionally charged. Child abuse, child sexual abuse. We need to prevent this, and, you know, no one disagrees with that.

Meredith:

People say, yes, we need to prevent it. We'll do whatever we can. Well, here's what to do. And a lot of people, I think, just, you know, took the instructions and said, okay.

Meredith:

We're gonna diligently do this, not looking at the fine print, or not understanding that, you know, there were some Trojan horses in there in terms of fundamental rights and liberties.

Joseph:

Yeah. Well, just on that, I mean, because maybe not everybody listening is actually familiar with what happened in the EU recently. Could you just very briefly give us a rundown of what was being proposed and sort of what your reaction was to it?

Meredith:

Yeah. I mean, there's the long version, which I don't think is terribly interesting for the lay listener. But in effect, the conceit was that, you know, there are issues with child sexual abuse material online, and that for law enforcement, governments, and well-meaning NGOs to deal with this, they need to be able to effectively break end to end encryption and scan private communications to find, you know, those that may contain such imagery and, we assume, prosecute the culprits. And, you know, so that's the high-level rationale. And so we've gone back and forth for, I think, a little over a year.

Meredith:

You know, there's been a proposal to do certain kinds of scanning, and then, you know, scanning that would break end to end encryption, constituting a backdoor. And then the technical community and ourselves and others would sort of come out and say, no, that's actually a terrifying cybersecurity risk, and there's no evidence that it actually helps to protect children. And then a new proposal would come and say, like, oh, well, we're not gonna create a backdoor. We're gonna put the scanning in front of the encryption, so it doesn't actually hurt encryption.

Meredith:

But, of course, if you bolt surveillance onto, you know, whatever the communications platform is, as a mandatory component of an encrypted system, you have undermined encryption. You've moved the target for hackers from unbreakable math to some crappy vendor-provided software that the government is mandating, which is gonna be easy to break. Right? So we've had to kind of go through a cycle of what I would call, like, rebranding the same old thing in an attempt to get it through and get it past the technical community's vocal opposition, the long-standing consensus that, you know, this is math, not magic. You have to recognize the limitations of your desires here.

Meredith:

And so that's where we stand now. The latest round was maybe a month ago. There was a drawdown, effectively, and now the European Commission is reconstituting itself after elections. We have Hungary coming into the presidency, which is rumored to be largely supportive of this. We don't know if it'll be picked up or not, but, you know, again, coming back to the first point, power constitutes itself by information asymmetry, and this has been a long-standing wish of law enforcement, to undermine strong encryption.

Meredith:

I don't have much confidence this will ever be put to bed entirely. This is not a disagreement. This is not a misunderstanding that we can educate away. This is a battle for power that we're gonna have to contend with on those terms.

Joseph:

Yeah. It's been going on for decades at this point. Right? It goes all the way back to the Clinton administration, with the Clipper chip and all that. Yeah. And then the San Bernardino stuff, and then, yes, the pivot...

Meredith:

And before that, you know, in the eighties, Reagan was also discussing some of this. So there's never been a prospect of networks that law enforcement can't break that hasn't freaked out government, in my historical reading. I think there are other threats we could discuss that aren't legislative that might be interesting as well. And one of those that has occupied my time is the move of AI systems into the operating system.

Joseph:

Mhmm.

Meredith:

And things like Microsoft's Recall in particular. The, you know, I would say pretty alarming shift in a long-standing paradigm in which the operating system provides, you know, a trusted basis on which various applications, if not from a single vendor, right, are able to, let's say, form contracts with a given user. You know, Signal's contract is we collect none of your data. If you're using Signal on a device that isn't compromised, it is secure. It is not going to be leaked. And thus, when we fulfill a subpoena, for instance, it only has vanishingly few bits of information, in contrast to any other application. And that's, you know, an incredibly important paradigm.

Meredith:

And what we're seeing with things like Microsoft Recall, which was meant to ship with Windows 11, the new Windows operating system, is a violation of that paradigm, in the name of, you know, feeding an AI system that will do some convenient or inconvenient or completely useless thing for you. In Recall's case, it's like, anyone wants to know that you were doomscrolling at 3 AM? Right? Like, you know, this

Joseph:

is a given, to be fair. Like, they don't need an AI system to figure that out. But, yeah, it would catalog basically everything you're doing on that Windows machine to then provide some sort of efficiency service with its AI tool. But to me, it's just that I don't want that on my OS, basically.

Meredith:

No, you don't. I mean, just a, you know, plain-text honeypot on your OS that includes screenshots of your Signal Desktop messages, if you're using Signal Desktop. That fundamentally violates that contract between Signal and the person using it, which is then being subverted by the operating system manufacturer or the OEM. So I think, you know, for me, we need to be a lot cannier about simply accepting that if AI systems are running on the device and not phoning home, they somehow constitute a private system. Because what we're actually seeing is the need for data that these systems have, and the functions that these systems are being put to. Whether it's Recall and this sort of silly eidetic memory that's like a desperate search for a market that's not materializing, if you want my read on that, or Google's Gemini, which is, you know, gonna scan your phone calls for scams, but could very easily be repurposed to scan your phone calls for discussion of drugs, or scan your phone calls for people seeking abortion care, or anything else, we really need to pause before we allow these companies to make those kinds of determinations, to violate our privacy, to make those decisions about us.

Meredith:

I think we need to reevaluate that paradigm, and some of my thinking and my work right now, on maybe the scholarly side or, you know, the analytic side, is really focused on that.

Joseph:

To shift gears slightly, so you mentioned sort of front doors and back doors, and we've seen authorities launch larger and larger operations against encrypted chat platforms. Listeners will know I don't shut up about this. It's like my obsession. We had EncroChat, where the French police hacked into more than 30,000 devices that were using EncroChat software. Sky, 70,000 devices.

Joseph:

They got half a billion messages. And then Anom, which is the platform that was secretly run by the FBI. To be clear, the majority of their users were alleged serious criminals. That's apparent in the messages that I've seen. And then you can argue, well, should the authorities still have done that?

Joseph:

I think that's very much an open question and a debate that we simply have not had in public. But what do you think, just broadly, about this trend of authorities compromising entire encrypted chat platforms, even if they are, in some cases, used heavily by criminals? Because that's just such a different way of approaching the encryption problem law enforcement sees. You know, it's not hacking an end device. It's not doing maybe a side channel.

Joseph:

Like, well, let's get a wiretap of their ordinary phone calls or whatever. It's just like, we're gonna compromise the entire platform. I mean, what do you make of that approach?

Meredith:

I mean, I don't have anything clever to say about it. It's, you know, it's a threat we need to keep in mind, particularly, you know, given that private money can mean that these, you know, different startups or for-profit platforms change hands without people knowing. Right? And I often think about this in terms of Telegram. Like, would we know if the CIA put together an LLC and quietly, like, you know, gained a majority stake in Telegram?

Meredith:

You know, I don't think they'd need to, because Telegram is not actually encrypted or secure in any meaningful way, so why not just, you know, look in the window instead of buying the house, maybe? But, yeah, I don't have a sort of analysis on that trend, other than, I think, again, we have, you know, closed-source platforms that are making promises that aren't validated, aren't backed up by scrutiny or an analysis of the open code, or something like the Signal protocol, which has been in the open. The implementation has been in the open for a decade. It's been hammered. It's been examined from all sides, and it, you know, continues to stand the test of time.

Meredith:

And part of this is perhaps a symptom of the way we have guilelessly approached tech in general, allowing tech companies and tech narratives to shape what we understand about tech, and not demanding democratic scrutiny or expert validation or any real systematic checks that ensure that the marketing and the reality are the same. Now, would that actually make a difference in the context of law enforcement, you know, standing up a fake platform or taking over a platform and kind of quietly subverting its functionality? You know, likely not, but, I think, again, maybe there would be other bulwarks in place that would make it more difficult. And, you know, in general, it's not just law enforcement that could do this as well. Right?

Meredith:

You have, you know, we know there are a lot of scam platforms. You know, we can think about the, like, you know, match-your-face-to-a-celebrity apps that are gathering biometric data to train some shady AI to sell to law enforcement. Right? Like, there's a lot of, like, pretext and lying and marketing that is considered normal in the tech industry that, you know, if we were producing a food product, would not be considered normal, would be, in fact, criminal in a lot of cases. So, you know, quick reflections there, but it's something we should be wary of.

Meredith:

And I guess, you know, it's a symptom of a larger problem of of hype and opacity being the currency of the industry.

Joseph:

Yeah. Maybe just one other question before I get to sort of the the main one I want to, finish on. But you did bring up Telegram. And recently, the CEO of Telegram said in an interview that his engineers had allegedly been approached by the FBI, you know, encouraging them to include certain code bases in the app, which could potentially act as backdoors. Not many specifics were given in the interview, but, hey, that's still an interesting detail in and of itself.

Joseph:

Have any of your engineers been approached by the FBI or other authorities in that sort of way?

Meredith:

I mean, you're asking me to prove a negative, but, like, let's step back. That's sort of a fantastical story. Right? I don't know. Like, sure.

Meredith:

That may have happened. Like, hi, I'm the FBI. Here's some code. Could you put the code in your code base?

Meredith:

Like, nah, man. Get someone hired at Telegram. Like, you know, we know there are ways that this is done. I don't... I have not... yeah. No.

Meredith:

The story itself just seems a little bit like a children's book version of a real threat. Right? And, again, you know, Telegram is not open source. We don't know. You know, we know it's not secure. We know that their encryption protocol has had, you know, some bad bugs that some people thought looked like a backdoor, which, I believe, were remedied.

Meredith:

But, you know, nonetheless, I can't say that story is true. I can't say that story isn't true, but I would say it feels like a kind of mythologized version of a real concern that doesn't quite hold water. I would be... you know, if that's the FBI's tradecraft, I think we're all pretty safe.

Joseph:

Yeah. And to your knowledge, that hasn't happened to you. Yeah.

Meredith:

No. I mean, we, you know, we develop in the open. Right? The FBI can look at our code. And this is one of the reasons we don't just willy-nilly accept pull requests that are made by external contributors on our repo.

Meredith:

Right? We take a lot of care with what we accept or don't accept, and we don't accept much, because we wanna make sure that we're scrutinizing things there. We have people scrutinizing, you know, our open source code. We have people scrutinizing our encryption system, the Signal protocol, and some of the other metadata encryption that we use. So, no.

Meredith:

We have not, to my knowledge, right, and I can't... it's a bit of a... it's like

Joseph:

a key

Meredith:

question because I'm like, I don't... you know, like, did a guy meet a guy at a party? I wasn't there. But, no, I've never heard of that happening. That's not how... you know, like, I've never heard of a scenario like that in Silicon Valley, and I've worked in tech for almost 20 years. You do find out about moles and agents, but it's usually, you know, a more sophisticated operation.

Meredith:

And so, yeah, that's what I can say. And, you know, I think we have structural safeguards in place that don't rely simply on, you know, one engineer saying yes or no. Right? We have things in place that check our code, whether a bug is malicious or accidental.

Joseph:

Yeah. And I think this just leads to sort of the final main thing I really wanted to ask you about, because while I've been covering law enforcement's approach to end to end encrypted communications, especially Anom, in all of my investigations there, I've sort of seen that 3 options have emerged. Or maybe not options, because that sort of implies you can only do one and not the other. But there's sort of these 3 paths for encrypted comms. The first is one you mentioned earlier, which is sort of the front door, right, which is where apps will give data to authorities under a legal order.

Joseph:

You know, and Discord does this all the time. Telegram does not really do it because they just generally don't cooperate. Signal does cooperate, but it gives a very small amount of data because it doesn't retain much data.

Meredith:

We fight, which is a form of cooperation. We don't deny that the process has validity, because that's a good way to get your service shut down, but we do fight every request. And then, yes, we do cooperate with those that we are forced to cooperate with.

Joseph:

Yes. So you have that front door. And, you know, maybe one way to appease sort of the debate and stop these cycles is, oh, well, more apps could then give more data. Of course, that's gonna mean undermining your own security. Then you put that to one side.

Joseph:

Then you have things like backdoors, which is Anom, which is the FBI running its own encrypted chat platform and getting all of these messages. These crazy, audacious worldwide operations, and I think there's a lot of room for collateral damage there. And then the third one is targeted hacking, where authorities may think, well, we wanna get the encrypted messages of this particular person. Let's hack that particular device, use some sort of modular malware to only get the messages, do it under a legal order. Of course, also room for abuse there.

Joseph:

My question is, which one of those is, well, I suppose not the most attractive, but the least bad to you out of those options?

Meredith:

No. No. I reject the framing of this question. I do not have to eat anything at this buffet. This is not my dinner time.

Joseph:

I absolutely knew you were gonna reject the premise of the entire question. But I think that's interesting in and of itself. Right? Because we have got to this point where the FBI feels it can secretly run its own tech company for organized crime and intercept the messages, because they see that as a valid option. Right?

Joseph:

And I think what I'm trying to get at is that we have this debate, and there's the dialogue in the EU and then in the UK and Australia, as you say. In the shadows, the FBI isn't really asking. It's like, well, we're just gonna go run our own encrypted platform and do all that. So, if you reject the premise, why is that exactly?

Meredith:

Well, the premise has stayed the same for 30, 40 years. Right? We sort of skimmed through that history, but the premise is that law enforcement is perilously on the verge of being shut out from the visibility it needs to do its job. That has sustained for as long as this debate has sustained. And, of course, 30, 40 years ago, there was no such thing as the Internet.

Meredith:

Our letters were not surveilled, and indeed, postal mail has some, you know, incredibly strong laws protecting it from such surveillance. To do a phone wiretap, which was itself controversial, you had to go through a huge number of hoops. You had to get a warrant. You had to prove that you had a reason to do that. Right?

Meredith:

So they had much, much, much less information than they do today. And in the interim, particularly, you know, looking at the 19 nineties, we effectively greenlit a business model that endorsed mass surveillance as the economic engine of tech. So, you know, there's a number of reasons for this, but, basically, the Clinton administration did 2 things. One, they put no privacy guardrails on the commercial Internet. So mass surveillance for corporations? Everything goes, totally fine.

Meredith:

They can do way more than the government could even dream of in the name of profit and commerce. And 2, they endorsed advertising as the business model of the Internet. And, of course, advertising is know your customer: know your customers, know more and more and more about your customers so you can, you know, differentiate yourself in the market, and that is an incentive to surveil. So in this sort of last 20, 30 years, we have seen mass surveillance over every aspect of our lives become the norm, become accessible to law enforcement. The argument that they are somehow missing data and that this is a perilous threat to their ability to do their jobs simply does not check out when you begin to look at the details here.

Meredith:

And so, you know, you can say like, in order to get the data they're missing, here are 3 options, but you have already taken as a given a premise that I don't think holds up.

Joseph:

Yep. All I would add to that is, just to give one specific concrete example: during Anom, there was an amphetamine lab producing tons and tons of speed, run by a very senior drug trafficker, a guy called Rifkin. His name doesn't matter. What's important is that

Meredith:

Was it a Sackler?

Joseph:

No. Not quite. You know? Yeah. They were on Anom as well.

Meredith:

Yeah. Sorry. I don't I don't know if their hiding was really the problem.

Joseph:

Right. Right. Right. But so there's this drug lab, and it's producing drugs. And the key thing is that the local police, or rather the national police of Sweden, had no idea this drug lab existed.

Joseph:

They had no idea this drug trafficking organization even existed until they got messages from the FBI's backdoored app, Anom. So my question is: they were missing crimes. Is that an acceptable trade off? Like, how do you square that with sort of the rejection of the three options, if you see what I mean? They were missing crimes is what I'm saying.

Meredith:

Well, yes. Okay. That's a... it's a very compelling story, because it's very difficult to argue against it without, you know, somehow being counted in a camp that is going to be, I don't know, sympathetic to drug lab gangsters, whatever, you know, whatever's going on there. You know, I guess if we had cameras in all our bathrooms, in all of our homes, in, you know, every inch of our lives, we could argue that that would ensure that law enforcement doesn't, you know, miss crimes. Like, there is a threshold here where I think we need to recognize that fundamental liberties, the fundamental ability to communicate privately, are pretty imperative to having a functional democratic society and governance structure.

Meredith:

Now, I don't know the details of that particular story, but, you know, I would also ask: where's the sourcing on it? In terms of, you know, had they received calls from neighbors nearby the lab, who may not have been rich people, who reported bad smells? Did they have other indicators that they weren't paying attention to? Were there other issues of data deluge that were perhaps preventing them from finding the needle in the haystack of data? Which is, you know, the largest problem I see when I talk to law enforcement people, when I talk to people familiar with the actual, you know, on the ground labor practices of doing law enforcement in this day and age: there's too much data. There aren't enough people to triage it. There aren't systems that can tell the signal from the noise.

Meredith:

There is, you know, repetitive data. There's simply not a good way to sort through it and make sense of it so that, you know, within it, here is an indicator of a real crime versus a fake crime. So, like, I think what you presented is the kind of narrative that, if I were the FBI's press office, I would be really happy to find a way to tell, because it's erasing a lot of, you know, complexity. Like, did we talk to the Swedish police? Is that their story, or is that the FBI story sort of touting this

Joseph:

I mean, I spoke to the Swedish police. Yeah. I spoke to the Swedish police, and I read the messages. But yeah. Yeah.

Joseph:

But I see what you're getting at, and this is to get more theoretical and speculative now, and to broaden it out. But, like, there could be these other ways that authorities could investigate or discover these crimes, basically. As you say, the neighbors or whatever. And that also applies much more broadly, now that we have many more conversations in private. It's like, yes, well, maybe they have their phone location data or whatever. It's not like there is a crime wave because everybody is using end to end encryption.

Meredith:

Yeah. And can we draw back a little on this term crime, right? Like, criminality, criminalization. And I don't wanna get too abstract, but I do wanna recognize that, you know, what is and is not a crime changes over time. In authoritarian societies, journalism is a crime. In the US, accessing health care in a number of states is a crime.

Meredith:

Trans health care is a crime. There is a woman living in jail right now in the US. Her name is Jessica Burgess, and she is there largely because Facebook turned over messages between her and her daughter to law enforcement that were used to convict her of the crime of obtaining abortion care and dealing with its aftermath in the state of Nebraska after the Dobbs decision had made that illegal. So, yeah, they found a crime. But, like, what are we actually talking about here when we talk about, you know, the confluence of those types of authoritarian tendencies, the fact that crime is always, you know, defined and redefined according to social norms and mores?

Meredith:

And, you know, obviously, this is not to, like, relativize things like murder. Right? But, like, to be really rooted in our present moment and then to kind of argue that in order to stop crime, we need full visibility on everything: that is, we know, chilling speech, chilling dissent, you know, deadly for journalism, perilous for any meaningful social transformation of the type that will be necessary to overcome climate change, of the type that will be necessary to meet, you know, many of the inflection points that we are living through now.

Joseph:

Yeah. And that's, of course, gonna be even more relevant, at least in the US, with the upcoming US presidential election. And maybe we'll just see... I almost forgot about that. Yeah. We're gonna see what the US is gonna look like in, I guess, a few short months. But Meredith, this conversation was incredibly insightful.

Joseph:

Thank you so much for coming on the show, and I really, really do appreciate it.

Meredith:

Thank you so much, Joseph. Big fan of your work at 404 Media, and, use Signal.

Joseph:

As a reminder, 404 Media is journalist founded and supported by subscribers. If you wish to subscribe to 404 Media and directly support our work, please go to 404media.co. You'll get unlimited access to our articles and an ad free version of this podcast. You'll also get to listen to the subscribers only section where we talk about a bonus story each week. Another way to support us is by leaving a 5 star rating and review. Unfortunately, algorithms and shit.

Joseph:

This has been 404 Media. We will see you again next week.