Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Also media.
Speaker 2 (00:08):
Welcome back to it could happen here. I am Robert Evans,
and this is a podcast about things falling apart. Sometimes
it's about how to make things not fall apart, and
other times it's more about enduring it. Today is more
on the endurance side of things. And we're talking about
a subject that we get a lot of requests about here.
We discussed this a year or so ago with one
(00:30):
of our guests, the great Carl Pasarta. We're talking about security culture, and particularly the aspect of security culture that involves digital devices and how to communicate with your friends, affinity groups, whomever, via your phone, essentially, or your computer.
This is a thing where there's a huge amount of
disinformation as to which apps are safe. What does it
(00:52):
actually mean to say that an app is encrypted? How
far does encryption get you? What sort of like cultural
things come alongside the actual, like physical reality of the
security of the device in order to kind of make
a comprehensive security profile. We're gonna be talking about all
that today and hopefully giving you some good advice on
what you can trust. Because I am the furthest thing
(01:14):
in the world from a technical expert. We have two
actual experts with us today. Caroline Sinders and Cooper Quintin have both recently published a paper, alongside several other authors (Leila Wagner, Tim Bernard, Ami Meta, and Justin Hendrix), called What Is Secure? An Analysis of Popular Messaging Apps, and
(01:34):
it's basically going over the actual level of security of a number of things: Telegram, you know, Telegram's private messaging system, Facebook Messenger, Apple Messages, or iMessage I guess it's called, and obviously Signal. And kind of as a spoiler, Signal is your best bet, but that also isn't where you should end, right? I think we
(01:56):
want to also talk about kind of like why and to what extent that's the case. But anyway, I'm going to turn things over to Caroline and Cooper now, because I have talked enough about this. Hey guys, welcome to the show.
Speaker 3 (02:09):
Hey, Robert, thanks so much for having us on.
Speaker 4 (02:12):
Yeah, thank you so much. A big fan of the podcast, so always really lovely to be here.
Speaker 5 (02:19):
Yeah, thank you so much.
Speaker 2 (02:20):
Yeah, it's really lovely to have you both again. Listeners, if you want to take a look at their paper, just google "What Is Secure? An Analysis of Popular Messaging Apps." You'll find the Tech Policy Press has a summary of it that's pretty quick. The full paper is eighty-six pages or so. I also recommend reading that, but if you want to give the summary a skim before you continue, that might help. But I kind
(02:43):
of wanted to start by asking you guys: what is it that makes Signal a good option for people? Right? Because I think for most folks, you'd describe it as sort of security folklore, right? The stuff that you hear about security from your friends. And if you're not a technical person, you kind of just trust what the folks around you are saying. And that was sort of how I got into Signal. I'm not a technical person, but
(03:05):
people I knew and trusted, who were, were like, this is your best option.
Speaker 5 (03:08):
Yeah, thank you so much.
Speaker 4 (03:09):
That's such a good question, and I think Cooper and
I probably have similar but also like very different answers
to it.
Speaker 5 (03:16):
Cooper, I can go first if you want.
Speaker 4 (03:18):
One of the things I love about Signal is it's just really easy to use. It's end-to-end encrypted, it's a messaging app. There's not a lot of stuff on it, but you can do a lot with it, so you can do video calls, you can send actually pretty large files like PDFs.
Speaker 5 (03:34):
You can have drag and drop stuff.
Speaker 4 (03:36):
It's such a low threshold for users, because it is a messaging app, but it does so many different kinds of things. But then, related to that, it's also actually quite minimal. So in the paper, which everyone should read, and we'll probably get into this later, different apps like Telegram or Facebook's Messenger app, for example, have
Speaker 5 (03:58):
This thing we've been calling feature bloat.
Speaker 4 (04:01):
They are messaging services that actually feel a bit more like social networks if you look at the amount of stuff that's on there. And by stuff, I don't just mean stickers; there are all these sort of specific and strange settings you can use to have all different kinds of messages and all different kinds of privacy settings, and not all of those privacy settings are really great. Because Telegram and Facebook Messenger are not
(04:25):
encrypted by default, some of those settings can actually make you feel more secure when you're not. So kind of the beauty of Signal is that, out of the box, it's incredibly secure. It's end-to-end encrypted, and they're not holding any data about you. I believe the only data they hold is when a phone number or a profile has signed up for Signal, like when you've
(04:46):
signed up. But again, it's incredibly easy to use. And another thing is, you know, if this were a few years ago, we'd have been looking at Wire, for example.
Speaker 5 (04:56):
One of the nice things about Signal.
Speaker 4 (04:58):
And this might be controversial to some, is that it does follow modern design patterns and standards. So if you're using the iOS or Android version, there are buttons in places where you expect them to be. Signal is not perfectly designed, but it is quite usable. Yeah, so for me, that's kind of what makes it really wonderful.
Speaker 2 (05:18):
Yeah. As much as I love it, and it's my standard messaging app, I do every now and then run into the thing where my friends will call me through Signal, which is great if you need a call to be secure, but it's not nearly as good; it drops a lot more often than a regular phone call. And I'm like, we're just trying to meet at the movie theater. It's okay if the NSA knows, right? Like I've.
Speaker 4 (05:40):
Definitely had that with friends, where I'm like, yeah, we're just calling to talk about your dog.
Speaker 5 (05:47):
It's probably fine.
Speaker 2 (05:48):
Yeah, the FBI can have this stuff.
Speaker 4 (05:51):
Yeah, please send dog pics through all messaging apps.
Speaker 3 (05:57):
You know, but on that note, writing usable software that is also secure is really hard, right? And as somebody who's cryptographer-adjacent (I'm not a cryptographer), we got that wrong for a long time. Before Signal, you know, the sort of most
(06:20):
used encryption methods were probably PGP, which is a method for encrypting email, and Off-the-Record (OTR) chats, and neither of those ever got to the level of user base that Signal, and certainly WhatsApp, have. And that's largely because they were pretty much unusable. PGP is almost entirely
(06:44):
unusable, even by cryptography professionals, even by computer security professionals like ourselves. OTR chat: a total pain in the butt, just a real nightmare to use. So with Signal, there are still some rough edges, and we talk about some of those in our paper. But overall, I think the big innovation they've had is just
(07:07):
remembering that what people want to do on a chat app is not encrypt things. What people want to do on a chat app is chat, right? And the second that the security gets in the way of that, people will stop using it and go find something that's more usable. It seems like that's been Signal's guiding star:
(07:30):
doing the most secure thing that you can while still being fun and usable to actually just chat on. And I think that has served them quite well.
Speaker 2 (07:41):
Yeah, it's so important. I think one of the things that contributes to good overall security is setting yourself up for success, which means setting yourself up with a system that can function well if you're lazy. That's one of the nice things with Signal: you don't have to worry about opting in and out and selecting a bunch of stuff. It's pretty safe, especially for a
(08:03):
normal person's uses, right out of the box, which is huge. And kind of in the same line as that is the fact that, because Signal doesn't store metadata, you're not relying upon them being, like, committed anti-state actors or whatever, because they don't have access to the thing that, for example, Facebook will hand over to the cops if the cops just breathe in
(08:25):
their direction.
Speaker 3 (08:28):
Yeah, that's exactly right, and that is the other really cool thing about Signal. As Caroline said, the only data that Signal gives over in response to a subpoena is the time that the phone number signed up for a Signal account and the last time it connected to the Signal server. And the reason we know that is because Signal publishes transparency reports
(08:50):
with the full text and full response of any subpoena that they get. So we can actually just see in the responses that all they've given over is these two pieces of information, because that's all they have, and they've done some pretty clever things to make that be the
Speaker 4 (09:05):
Case, right. And that's actually so different from how other companies are, I think, reporting on either subpoenas or any kind of
Speaker 5 (09:15):
Weight that law enforcement puts on them.
Speaker 4 (09:17):
So for our report, I don't remember how much it's mentioned in the report, actually, but we did go through and look at Apple, Meta, and I think Google in their own transparency reports, to try to get a sense of how that would stack up in
(09:38):
comparison to Signal's. I think in some cases they say they received some kind of notification, but nothing really clear or specific on what they received from law enforcement or government, rather just that they received one. And so that's also the really great thing about Signal: you are getting all of this information that you're not getting from other companies or platforms.
Speaker 2 (10:01):
Yeah, you know, I wanted to stay on the same subject and go back to the concept that y'all introduced me to. I guess I was aware of this, but not the terminology: security folklore. And I wanted to chat a little bit about the most recent example of this, something a lot of folks have probably been wondering about since we started talking about Signal, which is that roughly a
(10:23):
week before y'all and I sat down to talk about this, a kind of viral info meme started going around that was like, Signal has a zero day exploit, which is basically a hole that a hacker found in an app or program. That is, one that can expose you. You have to turn off
(10:45):
link previews, right? Which is, when someone sends you a link to an article in Signal, you get a little preview, not dissimilar to how it is elsewhere. And I think, to be fair, just based on my very limited knowledge, when I think about what the potential holes in Signal are, I don't think it's unreasonable to be concerned about that specific feature. But that
(11:05):
warning was not what it kind of seemed to be, basically, or not as accurate as I think a lot of people took it as being.
Speaker 3 (11:11):
I don't know.
Speaker 2 (11:12):
I'll turn it over to you guys. I think that's the next thing I want to talk about.
Speaker 4 (11:15):
I'll turn it over to Cooper, who has a lot of feels about that.
Speaker 3 (11:21):
I have so many feelings about this. I was working
on this all weekend.
Speaker 6 (11:26):
So this copypasta, I'm calling it the Signal copypasta, which is a term from, you know, 4chan and other horrible internet places, but some
Speaker 5 (11:40):
Media audience is probably Internet enough.
Speaker 2 (11:43):
Yeah, I'm gonna guess a good half of the people
listening at least got that message.
Speaker 3 (11:47):
Yeah, yeah. And first of all, if you had a zero day in Signal, which is an exploit for Signal that is known but has not been patched by the vendor, so you can actively exploit it, there are no people in the world who would choose to
(12:08):
quietly leak it over, you know, vague Signal texts. There are two types of people: one, people like us, who would bring it to Signal immediately and get them to patch it to protect the, you know, millions of high-risk users that use Signal; or two, the type of people who would go sell the exploit to some horrible company that would, you know,
(12:30):
sell it to Saudi Arabia or something and use it to kill activists, right? There's no in between. There's nobody who is going to quietly leak this, just for fun, with vague details. So this message set off red flags immediately. And it's because I really do not like link previews.
(12:51):
In our paper we discuss some of the issues that we have with link previews. You know, we think that they can leak some information about your chats to the owner of a website. We think it's kind of a large attack surface. It's not super necessary.
Speaker 4 (13:08):
Would you mind actually explaining to the audience a little bit about what we found when looking at link previews?
Speaker 3 (13:17):
Yeah. The way that link previews work on Signal and on WhatsApp is that when you send a link to somebody, the Signal app or WhatsApp goes and fetches the web page for that link, right? It goes and downloads the content of that link, and there are some
(13:41):
special HTML tags that describe, you know, sort of what the page is about, what the title of the page is, and an image for the page. It gets those tags and puts them all together in this little package, and then sends that all as part of the Signal message. So when you put a link in Signal, your phone actually goes out and gets that web page, and it gets that web page with what's called
(14:04):
a user agent, which is a piece of text attached to the request that identifies it as being a request from Signal, and from your IP address. So when you put a link in, the owner of that website, whoever has the logs for that website, can know that somebody at your IP address is using Signal and sending this link over Signal. What our concern is, is that if that
is that if that.
Speaker 7 (14:32):
Link is unique, then anybody else who visits that link
can be inferred to be somebody that you are talking
with over.
Speaker 3 (14:43):
Signal, right. And so this can be an interesting source of intelligence for website owners, especially for big websites that can easily generate unique links with tracking parameters at the end of the URL. Like when you share an Instagram post and
(15:04):
at the end it's like question mark igshid equals, you know, a long string of numbers and letters, or a Twitter post where t equals a long string of letters and numbers. That makes a unique link, and then anybody who visits that same link can be determined to be somebody that you're speaking with over Signal, and also WhatsApp.
Speaker 8 (15:27):
So for that reason, we think that Signal and WhatsApp should turn link previews off by default, because we think that's an unnecessary information leak. Signal and WhatsApp's pushback on that is that link previews are a core feature that people demand, and if they
(15:51):
were to turn off link previews by default, they're worried that people would leave the platform for less secure platforms like Telegram.
Speaker 3 (16:00):
Yeah.
Speaker 2 (16:00):
I mean, I don't want to tell them their business,
because I'm sure they have data on this, but I've
never thought about link previews as being a thing that
I needed.
Speaker 4 (16:12):
Yeah, I think it's one of those things. And you know, we haven't necessarily done extensive general design research on this, right? We haven't surveyed three thousand people in the US, we haven't had a Pew Research survey across countries asking, what are your thoughts on link previews?
Speaker 5 (16:33):
But I would.
Speaker 4 (16:34):
Probably argue that, because it is included in so much of modern messaging
Speaker 5 (16:39):
Apps that we now assume it's like a core feature.
Speaker 4 (16:43):
One thing I will give Signal, that I think is amazing and that other apps don't do (and this is not true of WhatsApp), is that pretty much every feature except for encryption has something you can toggle or turn off. So link previews were already available for people to turn off on Signal. WhatsApp does not allow that, and
(17:05):
it seems like they're making no moves to allow that feature to be optional, to turn on or off.
Speaker 5 (17:13):
But that is I will say.
Speaker 4 (17:13):
One of the things that's really lovely about Signal, that is so different from modern design and modern big tech platforms, and just platforms in general, is that a lot of features are optional. Whereas, you know, WhatsApp and Meta's sort of stance on design is that a lot of things are not optional, that those are things users would want: why would we make foundational elements like
(17:35):
link previews optional? And you're just, like, gesturing wildly, but, you know, it's like, well, you don't know what people want. And I mean, what's the harm in turning off some of these things?
Speaker 3 (17:46):
Right?
Speaker 4 (17:47):
You know, like maybe people don't want to receive GIFs, I don't know, maybe they don't want to receive stickers. Why not let them have that option? What's the harm that could happen?
Speaker 3 (17:56):
Yeah?
Speaker 2 (17:56):
Yeah, yeah, I couldn't agree more.
Speaker 3 (17:58):
Yeah. Two things I want to say. One is that, first, we should acknowledge that it turns out there was no zero day, there was no vulnerability. Yeah, this was absolutely just something that spread virally out of nowhere. I'd be really interested to find out what the origin of this copypasta was, but I haven't been able to.
(18:19):
But yeah, I'm.
Speaker 5 (18:20):
Curious about that as well.
Speaker 4 (18:21):
Because I was in another group thread that was like, we really need outside auditors to look at these.
Speaker 5 (18:25):
And I was like, we have a whole report that we wrote that did look at this.
Speaker 2 (18:30):
Speaking of outside auditors, I gotta pause you guys just
a second, because it is time for an ad break,
So please spend your money and then come back to
learn more. Ah, and we're back. Okay, sorry about that. Cooper, Caroline,
(18:59):
you may continue as you were.
Speaker 3 (19:02):
The other thing I was going to say is that the idea that anybody would leave WhatsApp because they stopped having link previews is completely preposterous to me. WhatsApp has over two billion users. They are, you know, in a position to set the standard for what people
(19:26):
expect from a messaging app. So they could do things like turn on disappearing messages by default and change that culture. They could do things like turn off link previews by default and change that culture. They could do these things, and, you know, they would not lose enough users to even notice or
(19:50):
care about.
Speaker 2 (19:51):
Right.
Speaker 3 (19:51):
Yeah, they are the only people in the world in the position to decide what the culture should be, and this is what they've decided the culture should be.
Speaker 5 (20:00):
Totally.
Speaker 4 (20:01):
I hate to break it to you, but if WhatsApp just got rid of link previews, I'm throwing my whole phone into the garbage can, getting rid of it.
Speaker 2 (20:09):
Just tossing it back to a landline.
Speaker 5 (20:11):
Yeah, I'm just.
Speaker 4 (20:12):
Gonna yeet it into a river. I feel like I don't need this anymore. Actually, I'm going back to carrier pigeons.
Speaker 5 (20:17):
That's how far back I'm going to go.
Speaker 2 (20:19):
I mean, that does kind of lead into the next thing I wanted to talk about, which is sort of the other wing from security folklore, which is security nihilism. And yeah, you introduced this when talking about, sort of, if you do try to engage somewhat with the technology, or if you wind up just kind of in the position I think most lay people are,
(20:39):
where, you know, maybe you have some friends who know more, or maybe you have some friends who think they know more, and you get all these conflicting things: this is safe; no, it's not; you can't trust Signal; the feds could be running Signal; all this kind of stuff. And to be fair, the feds have run secure services before. It's not that I believe that's happening with Signal, but I understand where paranoia
(21:00):
like that can enter into people's calculus, especially if you're not technically knowledgeable. And that can lead to this sort of state of security nihilism, where you're just like, you can't communicate at all online, there's no way to do it securely. And obviously there's no perfect, right? You never have one hundred percent, but you don't have one hundred percent when talking in person to somebody either. There are
(21:24):
individuals in prison right now because somebody they loved and trusted ratted on them. There are no one hundred percents in this world. But that doesn't mean nihilism is the right response to trying to figure out how to set up your communication standards with people, right?
Speaker 4 (21:42):
Totally. I mean, throughout this report we were also teaching workshops to reproductive justice activists across the US, in states where abortion is banned. I'm from Louisiana, I live half the year there, and abortion is banned there. And we were also working with journalists in India. So a big thing for
(22:04):
us was also teaching threat modeling and different kinds of what Matt Mitchell, a security trainer and expert, calls digital hygiene. And so a lot of this was recognizing certain practices we were picking up on, particularly with the folks we were working with. A lot of the reproductive justice activists we were working with are new to security, new to technology; they don't have a background in tech.
(22:25):
And generally, you know, the American Deep South is super overlooked in terms of tech policy, in terms of, I think, a general focus when people are talking about tech or tech literacy or tech activism, and that is leaving really massive gaps in knowledge for people. And so, you know, when we were working
(22:47):
on this, security folklore and security nihilism were both actually very
Speaker 5 (22:51):
Almost like, I won't say a pendulum, but they were very connected. And so some of that was.
Speaker 4 (22:56):
People hearing things like, oh, I should put my phone in a microwave when I'm having a very sensitive conversation, right? And so that's where some of that security folklore is coming in. It is something that is technically safe, but it's not the thing you necessarily, totally need to do in that moment. And with security nihilism, what it kind of came down to, and this is stuff we've seen with other groups in other circumstances. A great
(23:20):
example is, you know, Palestinian activists and journalists, let's say, who are facing the threat of all different kinds of governmental censorship and surveillance, sort of saying: when there's this large threat hanging over us, and there's also physical surveillance (and this is true for a lot of journalists in other countries, like India, as well), you know, should everything go through Signal,
(23:44):
or does it really matter?
Speaker 5 (23:45):
Like does it really matter?
Speaker 4 (23:46):
And this is also something, again, we saw with some reproductive justice activists as well, where it's like, if everything is being monitored, what's safe?
Speaker 5 (23:53):
Like can I send stuff? Like can I even use Google?
Speaker 4 (23:57):
And part of this was, you know, by teaching privacy
and security workshops, by teaching things like threat modeling, which
is a framework for just assessing what.
Speaker 5 (24:06):
What are the threats?
Speaker 4 (24:08):
Like, what are all the potential threats you could face, and kind of mapping them from the most minor to the most major, and what you can do about that. That's a way to try to combat security nihilism. But I think an approach Cooper and I are also really fond of is thinking of this like safer sex. There are all different kinds of things you can do, mitigations, that are actually incredibly helpful, and we
(24:29):
can't look at it as a binary of safe or not safe. It's actually much more of a gradient.
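[Editor's note: the threat-modeling exercise Caroline describes can be sketched as a short Python exercise. Everything here is invented for illustration; real threat modeling is specific to each person and situation, and the likelihood and impact scores below are made-up numbers, not measurements.]

```python
# Toy threat model: list the threats you face, score likelihood and
# impact, and rank them so effort goes to the biggest risks first.
# The threats, scores, and mitigations are invented examples.
threats = [
    {"threat": "phone lost or seized", "likelihood": 3, "impact": 2,
     "mitigation": "strong screen lock, disappearing messages"},
    {"threat": "messages subpoenaed from provider", "likelihood": 2, "impact": 3,
     "mitigation": "end-to-end encrypted app that keeps no metadata"},
    {"threat": "targeted zero-day against the device", "likelihood": 1, "impact": 3,
     "mitigation": "prompt updates, minimal apps, separate device for sensitive work"},
]

def rank_threats(items):
    # Simple risk score: likelihood times impact, highest first.
    return sorted(items, key=lambda t: t["likelihood"] * t["impact"], reverse=True)

for t in rank_threats(threats):
    score = t["likelihood"] * t["impact"]
    print(f"risk {score}: {t['threat']} -> {t['mitigation']}")
```

The output simply puts the highest likelihood-times-impact scores first, which is the "mapping from most minor to most major" idea in miniature.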
But you know, the folklore and the nihilism, I think, come from a very similar place, which is that we're asking people, society is kind of asking or demanding, that people be experts in something that's really hard. I
(24:49):
am a fairly technical person, and even so, there are some things that I find hard to wrap my head around, and I've been working in privacy and security for quite a while. And I think, you know, it's also really hard when you think about these apps as a brand new person. So one of the things that popped up a lot in our research is: why should we trust Signal? And that's actually
(25:10):
a great question. What about Signal, in its interface and its design,
Speaker 5 (25:14):
Would cause you to trust it? Some people were like, it's a nonprofit, that's great, but I don't know what that means. And I'm like, that's actually a fantastic question: what does that mean, right?
Speaker 4 (25:23):
Like, why should you trust this? You've heard through the grapevine that you should. And I think these are kind of all the things that people are dealing with, because if you take a step back and just look at software generally, why should you trust that it's safe and secure when there have been so many different kinds of leaks or breaches or things breaking, right? Yeah. So these are,
(25:46):
I think, really closely tied. But a big thing for us is trying to combat that security nihilism whenever we can. There are things you can do. I don't want to say no matter how great the threat, but I believe, no matter how great the threat, there is stuff you can do.
Speaker 3 (26:02):
No matter how great the threat is, there's stuff you can do to make it more difficult and more expensive for that person to attack you, right? Like, we all lock the doors to our houses, for the most part; we all do things to protect ourselves that aren't foolproof. Somebody can always break a window to get into your house. Somebody can find other
(26:24):
ways to get into your house. But locking the door makes it so that somebody has to do the noisy thing of breaking a window. It makes it so that, you know, somebody has to spend more time and effort, and take more risk of getting caught, in getting into your house.
(26:47):
And when you layer these protections, the idea is that you're making it harder. You're making there be more friction to piercing your security.
Speaker 2 (26:54):
Yeah, I think that's a really good point, the concept of friction. This is something I've talked about, not that these are exactly the same things, but they're not wildly different: when it comes to how insurgents win insurgencies, right, it's not by carrying out these sort of great battlefield victories that sweep the enemy from the field. It's by friction,
(27:18):
which wears down both the culture and the kind of readiness of the opponent until they simply bounce, which is a pretty durable and effective strategy you can keep up. There's no sweeping, sudden, ninety-minute, three-act win here. It's more a
(27:39):
matter of: the more difficult, the more expensive you make it, the more you hold on to, and the more all of us hold on to. That's the other benefit. Even if you are the most law-abiding person in the world, like myself, having these security measures in place means that you're contributing to the overall immune system of a
(28:02):
kind of community of people who don't want the NSA listening to their shit.
Speaker 3 (28:08):
Yeah, exactly, exactly. And the friction thing is also exactly what Signal does. The threat model for Signal is stopping the NSA or other global adversaries from listening to all communications as they travel over the Internet.
(28:28):
When you can listen to everybody's conversations as they travel over the Internet, it's really cheap to spy on anybody. When you're encrypting that communication, then the NSA or whatever other global adversary has to go actually hack your phone. They have to target you specifically, they have to burn resources
(28:49):
and, you know, burn weapons, right, zero days, to get access to your phone. And that's a lot more costly, it's a lot more noisy, and it's a much higher risk of them getting caught. So it introduces huge friction in that area.
Speaker 5 (29:08):
Go ahead, okay, go ahead, go.
Speaker 3 (29:10):
Ahead. I'd say, and I think the sort of comparison to asymmetric warfare is exactly spot on, because none of us are ever going to have the money that the NSA or Mossad has. None of us are ever going to have the total technical acumen that the NSA or Mossad has, right? So we have to fight,
(29:32):
in terms of encryption, a guerrilla war, and we have to make things so expensive and so annoying for them that it's not.
Speaker 5 (29:38):
Worth it, totally. And just to sort of build on that.
Speaker 4 (29:41):
One of the things I love about Signal is that while they're creating friction for our adversaries, it's actually so frictionless for the user. And I think that's one of the things I find just continually impressive about it. I don't want this to turn into, like.
Speaker 5 (29:58):
We're all himbos for Signal. Look, we probably are.
Speaker 4 (30:01):
But, because, like, that's one of the things, as
researchers, Cooper and I always have to be like: we're
not paid by Signal at all. But this
is in fact one of the best things you
can use. But again, one of the things I think
is amazing is that it is so easy to use,
and it really is designed for, and I'm using the
term usability as a design term, meaning that it
(30:26):
is, they're thinking about a common user, including those with
lower digital literacy or those that have never
used any kind of security tool, and
so they're hitting a specific threshold of usability for things
to be understandable. And again, that's incredibly hard to do well,
and they are doing it quite well. Like,
(30:46):
it's, I would argue, very easy and sort
of seamless for people to make the jump to Signal
from WhatsApp, or, if you're on Android, from
Google Messages, or, if you're on an iPhone,
from Messages. It might look slightly different. It might feel a
lot more blue, it might feel a lot more black,
(31:07):
depending on how yours is configured. But for the most part,
a lot of the features are kind of where you
expect them to be, and it's not at all difficult
to get it up and running, which is not something,
as Cooper said earlier,
Speaker 5 (31:19):
We could say about things like PGP.
Speaker 2 (31:22):
Yeah, I wanted to kind of move on to talking
about other apps and their security or lack of it,
and I think we should start probably by talking about Telegram,
because that's probably close to top of the list of
things people use for secure communications that is not nearly
as secure as they think. So, yeah, I wanted to
(31:43):
kind of chat with you about why that is.
And specifically, one of the things
that is frustrating about Telegram is they
have like a secret chat or private chat,
a couple of different options that don't
necessarily mean what they sound like they mean to most people.
Speaker 4 (32:01):
Yeah, so that's actually one thing our report found. So
private chat and secret chat are in fact.
Speaker 5 (32:07):
The same thing.
Speaker 4 (32:09):
They're just called slightly different things in the app, which,
for, again, those listening
Speaker 5 (32:14):
That don't have the background in design, that's bad design.
Speaker 4 (32:17):
That's not professional, that is
a mistake. There's no reason for a feature to have
two different names inside of your software.
And so I don't know if that's an oversight on
their part. I'm assuming so, but like those two things
(32:38):
correlate to the same feature, and so they should actually
be called the same thing. But then, even further, that
being said, what does private mean to a user?
Speaker 5 (32:46):
What does secret mean?
Speaker 4 (32:49):
You know, Facebook Messenger, they call their encrypted messages secure,
or no, they also call it secret.
Speaker 5 (32:55):
Sorry, they also call it secret. But does that mean secure?
Does that mean encrypted?
Speaker 4 (32:58):
And so that's one of the
weird things where, you know, I think by
using a very normalized or culturally, almost
emotional, name like private, it makes something seem like
it's actually quite safe, when in fact it's not. And
there's a variety of reasons why Telegram is
(33:20):
not a very secure app, which I will let Cooper
Speaker 5 (33:24):
Talk about more.
Speaker 3 (33:25):
Yeah, I would never advise anybody to have a chat
over Telegram if they are concerned about the privacy of that chat.
So we were talking about friction, and the fact that
encrypted chats are not the default in Telegram creates
friction for users to have an actually secure chat.
(33:47):
Right, you have to go remember to turn it
Speaker 4 (33:49):
On, and you can only turn it on individually, per conversation.
It's not like an overall feature
on Telegram or Facebook Messenger. You have to go select
the specific conversation, conversation by conversation.
And another thing our paper gets into is how those
chats don't look very different. They look almost identical to
(34:13):
a normal chat. So for low vision users or
anyone with any kind of disability, especially a vision-related
disability, it's nearly impossible
to recognize which chat you're using if you're looking at
Speaker 5 (34:29):
The chat logs.
Speaker 2 (34:31):
Yeah, outside of that, in terms of things
that may not be options: right now,
I think for basically everyone listening, Signal is a perfectly viable option.
But it's not impossible that, for example, you might wind
up in a country where, even if there's not a
specific law against it, there is a precedent established that
if you have Signal on your phone, you know, it
(34:52):
can at least be used as a justification for charges.
Like, you know, with Atlanta,
people are getting charged because they had a lawyer's name
written on their arm, right, and so the state is saying, well,
that's evidence that they're planning to commit a crime. You know,
that doesn't mean that convictions will go through or that
kind of thing, but it may be a reason why
Signal might not be an option. Or, say, you know,
(35:15):
something comes out about it that makes it seem less secure.
What are other good, or acceptable, options? And I
know when we're talking about this, these are often options
that require more input and work from the user in
order to maximize their potential security. But I do think
it's good to like let people kind of know what
else is out there.
Speaker 3 (35:32):
Yeah, so when Signal isn't an option, WhatsApp is actually
not a bad option. So WhatsApp is owned by Meta,
which is, you know, not ideal. But WhatsApp actually uses
the same encryption protocol as Signal. So, like, under
(35:53):
the hood, the way that the math works to hide your messages from the
NSA is exactly the same, right, and they've implemented it well.
You know, there are a few more precautions that you need to take with WhatsApp,
making sure that your chats aren't backed up being
the main one. But WhatsApp is certainly good enough, right,
(36:16):
if your, you know, chat networks aren't using Signal,
if you're in a country where you can't use Signal, right?
Like, WhatsApp has two billion users. You can use
WhatsApp almost anywhere in the world,
and it's ubiquitous enough that it's not going to
mark you as, you know, somebody with something to hide, right.
And I don't want
(36:37):
to discount WhatsApp, right. Getting two billion people to
have end-to-end encrypted messaging by default, basically overnight,
was a major coup. Like, that was world changing, right,
and they really do deserve applause for that. Obviously,
you know, I think partly because of their scale, partly
(36:59):
because they're owned by Meta, right, they haven't taken all
of these same steps. Like, they do have more metadata
on their servers than Signal does, right. But if that's
your option, that is a fine option.
Speaker 2 (37:13):
Yeah, I think that's really good to know, particularly since
having options is always more secure than not having any kind
of a backup plan.
Speaker 4 (37:24):
Totally. And if people are even slightly nervous about WhatsApp,
one of the great things is they do have disappearing messages. The downside
is the fastest disappearing message is only twenty-four hours.
But that's something that you still have, and that
is an amazing feature.
Speaker 2 (37:42):
Yeah, and that kind of gets into also what kind
of stuff you can do in order to maximize the
value of features like that. Like, for example, if you're
coming back into the country, or a country, and your
phone gets confiscated by customs or whatever, because security
services have some sort of eye on you for whatever reason.
(38:03):
If you've got, you know, thumbprint login or face
login, they're going to get into that phone right
away, and your twenty-four hour delete thing may not have
taken care of everything. If you've got a
complicated eight-digit password and no biometrics enabled, maybe, depending
on where you are and whatnot, that'll keep your phone
(38:23):
locked long enough for those messages to get deleted, right? Like,
it's all about kind of maximizing the chances that something
Speaker 3 (38:28):
Like that helps. Yeah, exactly. We definitely recommend that people
turn on disappearing messages. I think that's just a
good, sensible default to have. We also definitely recommend that if
you're going to be in a situation where
there's a higher likelihood
of you interacting with law enforcement, if you're crossing a border,
if you're going to a protest, turn off the biometric
(38:50):
unlock on your phone. Certainly, especially in the US, the
case law isn't settled, but there are a lot of
state courts that have decided that police can force you
to unlock your phone with your biometrics and that that's
totally fine. So in the US context this is a good idea,
and in any context, I
think it's a good idea if you're at heightened risk
(39:11):
to turn it off.
Speaker 4 (39:14):
Totally. I mean, one thing we're also a big fan of,
and this is again where
threat modeling is so key, is figuring out: is this a
circumstance where you need your phone? Another thing
you can always do, if you are nervous
about traveling across a border, is delete Signal
and reinstall it later, and everything is gone. You can delete
WhatsApp temporarily while you're crossing a border so it's not
(39:37):
on your phone. You know, there are things like that
you can do. If you feel comfortable wiping your phone,
that's something you can also do. These are
all different options, and I don't remember how much our
report gets into this, but it's
something that at least we've been thinking about.
Cooper and I run a little lab called Convocation, and
(40:01):
one of the things we've been thinking about there is
how do we instill better
holistic practices, where we understand that a phone is
just one component of our safety, and so secure messaging,
encrypted messaging, is one component of that safety. So
what are other things we can do?
Speaker 5 (40:20):
And some of that can.
Speaker 4 (40:21):
Be you know, wiping your phone if traveling, if that
makes sense for you, or if that's a thing that
makes you feel safer, or removing certain apps and then
you know, reinstalling them, reinstalling them later.
Speaker 3 (40:34):
Yeah, yeah, and it really is holistic, right?
Like, a thing that people need to
keep in mind is that, you know, disappearing messages can't
stop an untrustworthy conversation partner, right? Like, if
my conversation partner is untrustworthy, they can take screenshots of
the messages, right, they can
(40:57):
go snitch to law enforcement about what I've told them, right?
Encrypted messaging, disappearing messages, these are not panaceas, right? You
still have to keep all of
your other aspects of security as well, right? So don't
(41:17):
entirely rely on these technologies to save you, right? You
have to also trust the people you're working with and
build these layers of security.
Speaker 4 (41:23):
It's true. I mean, Cooper, you could leak all
of my secrets right now on this podcast.
Speaker 5 (41:28):
That too, what a gentleman.
Speaker 2 (41:30):
And that is the other thing, right? When
it comes to what is secure, one thing
to remember is that with Signal, for all the good things
about it, nothing, nothing at all about that app stops
the recipient of a message from you from taking a
screen grab or just handing their phone over to their
friendly local federal agent, right? Which is, you know,
(41:54):
I'm not trying to be
a security nihilist here. I think, you know, there's no
replacing communication over phones in many situations. But if you are,
for example, going to be transferring a bunch of Plan
(42:15):
probably shouldn't go on your phone in that language, right?
Perhaps, you know, you could come up with a clever
codeword or whatever. But, you know, security
is, like you said, holistic. You should not
be looking at it as just, well, the app
is secure.
Speaker 3 (42:32):
So that's enough.
Speaker 4 (42:33):
I mean, one thing I also want people to sort of
think about too, because that's a really great point, Robert,
is that we do all different kinds of things every
day in our lives that could, you know, endanger us.
Like, a lot of the work I do
is with people facing all different
kinds of online harassment. So, like, falling in love, for example,
is a dangerous thing to do. You could have your
heart broken or that person could hurt you. Learning how
(42:56):
to trust people, you know, crossing the street, deciding to
jaywalk, right? All different things we do sort of
Speaker 5 (43:04):
Every day actually can expose us to harm.
Speaker 4 (43:06):
And so one thing I think for people listening to
keep in mind is that it's the same when we have conversations.
And I think a way to avoid nihilism is just
to remember that every day we are sort of
going out there and actually being incredibly brave just by
living our everyday lives, by deciding to be in community
and have friendships and have relationships. And in my case,
(43:28):
I love jaywalking, and no one around me does, and
that's my choice. And I have not yet
gotten hit by a car jaywalking.
Speaker 2 (43:38):
I think it's good to look at this the same way.
There's a concept that the military has sort of developed
when talking about how not to die when you're in
a gunfight or something. It's called the survivability onion, right,
And I think it's extremely useful both if you're talking
about like, well, I'm going to a protest and there
will be violence there, you know, should I wear armor,
(43:58):
et cetera. But it's also just really useful
with any kind of security. And the onion, envision
an onion, because the largest outside
chunk of it is don't be seen. Then don't be acquired,
which means somebody actually getting you in their gun sights.
Don't be hit, which means being behind cover or something.
And then the very internal part of it is like,
(44:20):
have some sort of armor in case you are shot.
Speaker 3 (44:23):
But if the
Speaker 2 (44:24):
Armor is useful, the majority of the onion has already failed.
Speaker 3 (44:27):
Right.
Speaker 2 (44:28):
If encryption is useful, that is not a dissimilar sort
of situation, right? So a degree of
canniness is super helpful in thinking about, like, what is
visible about me? If I'm doing something where I
know that I have to be extra concerned about the
state seeing me, what is visible about me from the outside,
(44:52):
you know?
Speaker 4 (44:53):
Totally. I mean, I think that's an amazing thing to think about. Like,
where are you sending a text message? Are you
in a place in which, like, someone can lean over?
Like, I'm the nosiest motherfucker, and all the time I'm
constantly looking around being like, what's that person
watching on an airplane? Or, like, if someone is sitting
next to me scrolling. So you wouldn't want to
send a sensitive text message next to me,
(45:15):
because I'd be like, that's interesting fodder.
Speaker 5 (45:18):
That's kind of a, show those texts to Cooper later, you know.
Speaker 4 (45:23):
And so I think it's important to think about that.
Like, who's around you? How are you describing something?
Do you know the person you're messaging?
If you're in a group message, do you know everybody there?
Like, do you trust all of them?
Speaker 5 (45:36):
You know?
Speaker 4 (45:37):
And if you're ever nervous, this is, I
guess, the upside also to in-person conversations. You can have,
you know, a phone call or an in-person conversation
with someone, right? If you're really not sure, or you
don't feel comfortable even sending something over Signal, that might
be the time to be like, hey, do you want
to meet up and get a coffee, and then, you know,
try to find a discreet place to have a conversation.
Speaker 2 (46:02):
Yeah, yeah, I do want to roll to ads real quick.
One second, and I think Cooper had something to say,
and we'll continue, but first, products. Ah, we're back. Cooper,
(46:26):
You look like you had something to add.
Speaker 3 (46:28):
On that? Nothing particularly serious, just that I think that
that's really good advice from the military
and absolutely justifies the nine hundred billion dollars.
Speaker 2 (46:39):
Yeah, I'm glad they put together a fucking graphic. I
wonder how many billions of dollars that took.
Speaker 4 (46:45):
I could make a graphic for hundreds of millions of dollars.
Speaker 3 (46:49):
Yeah, if anybody wants to fund us for
hundreds of millions, we will do it for less.
Speaker 4 (46:56):
We have so many good T-shirt ideas and sticker
ideas, y'all, like, so many good ones, so many unhinged
ones that the world needs to see.
Speaker 2 (47:05):
Yeah, I mean, I do, I guess, just because of
the amount of time I've spent thinking about this stuff
from my old job. There are a couple of concepts
from military planning I think about in this context, and
one of them that I also think is relevant to
what we're talking about with friction is the concept of
an OODA loop, right? Which is, how do you win
(47:27):
in combat against an opponent? And it's by disrupting this
thing called the OODA loop. The OODA loop is
how an adversary carries out actions in a conflict like this, right,
and the steps are: observe, orient, decide,
and act. And if you can disrupt any stage of that,
you can stop them from taking actions, right, which stops
(47:48):
them from being able to harm you. And good
security is going to impact all of those things, right?
It's going to stop them from being able to see
you sometimes. If they can see you, stuff like, you know,
we were just talking earlier about
link previews, right, and how that can kind of expose
who you're in communication with, potentially. Well, that could
(48:11):
allow the state to orient themselves to you and to
your friends, right? And obviously stuff like locking down your
devices and not having unnecessary info online can stop them
from being able to decide, you know, what you're doing and
how they should respond to that. And I think that's
also good if you're not just somebody
who is concerned about your security like most people are,
(48:32):
because it's good to have some security. If you're actually
dealing with the state or a corporation as an adversary
in some way, it can be useful to think about
your security culture in those terms.
Speaker 3 (48:46):
Yeah, absolutely, I think that's absolutely right. And
I think it points to, like, we should
understand what the mode of thinking of our adversaries is, right?
Like, if your adversary is the NSA, right,
which is probably actually not most people in the US,
(49:08):
Like, for most US activists, the NSA is not actually
your biggest adversary, right? Your biggest adversary is going
to be local police, right, or your biggest adversary is going
to be, you know, somebody like
your abusive partner, right. And this
is why threat modeling is important, because you need
to really think through,
(49:30):
like, you know, well, okay, wait, am I actually worried
about protecting myself from the NSA? Or am I more
worried about, you know, the racist police
officer that drives down my street every day? Right? And yeah,
probably it's the latter. And so you can
take a lot more useful actions, right. And,
you know, you can break that
(49:51):
OODA loop for him once you know what it actually is, right? Yeah,
if you're defending yourself against the NSA, you're gonna leave
yourself wide open to the actual threat. Yeah.
Speaker 2 (50:01):
Totally. I think a great example, and I don't
mean to be, like, quote unquote subtweeting somebody here,
but I've known a couple of folks like this. It's like,
if you're super paranoid, you're not putting
anything online, you're only talking with your close friends, you
use a dumb phone, you have burners, but you also
drive around with a shitload of weed in your car
in a state where that's illegal. It's like, well,
(50:23):
your threat modeling is not great in that situation, right?
Or, like, I do all that, but I carry an
illegal handgun with me wherever I go. It's like, well,
that may be more of a threat than your phone.
Speaker 4 (50:35):
My partner the other day was like, what if I
got a dumb phone? I was like, what if I
divorced you? Like, what if?
Speaker 5 (50:43):
They were like what do you mean?
Speaker 4 (50:44):
And I was like, well, I'm going to be the
one using all the maps for both of us, yeah,
and having to google all the dumb shit you want
to Google. That means I'm now your weakest link,
like, go fuck yourself. But also I was like, I'm
absolutely not going to be your Google Maps bitch,
like, I'm not doing that. But I mean,
(51:06):
I think also, you know, to both of y'all's points,
to get serious again for a second. My
threat model, for example, might be similar or
slightly different, maybe slightly less serious, than Cooper's. But,
you know, some of the
journalists in India we were working with have quite a
high threat model, right? Like, yeah, the Indian police force
(51:27):
are very much like the NSA. They're very talented, they
have a lot of money and tech at their disposal.
And that might be different for some of the activists
we're working with, let's say in Louisiana or Texas, right?
But the difference is, we're still talking about, I
would argue, two brutal police forces that just have different
(51:49):
means at their disposal. So, like, the Louisiana
police are a group you should totally be worried about.
They might not be able to hack your phone, but
maybe eventually they could.
Speaker 5 (52:00):
But there are obviously other things to worry about with them.
Speaker 4 (52:03):
But, you know, in the context of some of
the folks we're working with in the South, like
reproductive justice activists, some of the things that are probably much
more serious in terms of your threat model would be,
like, a nurse, for someone who, let's say, is miscarrying
or has sought an abortion. And this is something Kate
Bertash from the Digital Defense Fund, a friend of
(52:26):
ours, has talked about, where the people
that are supposed to take care of you might be
the ones that are actually your biggest threat, right?
The ones that have heard you say something or that you've
confided in, for example. And that is kind
of a horrifying thing to think about.
Speaker 5 (52:42):
But that is, that is a thing you.
Speaker 4 (52:44):
Have to threat model, right, is is can I trust
this person? How am I describing?
Speaker 5 (52:49):
You know? What's happening?
Speaker 3 (52:51):
Yeah?
Speaker 2 (52:52):
Yeah, absolutely. Well, did y'all have anything else you wanted
to make sure to get into in this conversation? There's
so much more in the great paper you
helped co-author, What Is Secure? An Analysis of Popular
Messaging Apps, on Tech Policy Press. But yeah, is
there anything else y'all wanted to really make sure you
hit before we roll out?
Speaker 4 (53:12):
Yeah. Please don't use Telegram, for a variety of reasons,
but also, like, it's very unclear how they respond to
any law enforcement or government. They don't say anything, and
it's kind of impossible to reach anyone that works there.
Please don't use Facebook Messenger, other than maybe for sending memes.
There's a lot of really gross surveillance capitalism inside of
Facebook Messenger that the paper gets into. But effectively, Meta
(53:35):
is building this weird, sprawling infrastructure inside of Facebook Messenger,
trying to link Facebook and Instagram.
Speaker 5 (53:41):
And one of the things we noticed is.
Speaker 4 (53:43):
That, like if you've blocked someone on Instagram or mute
to them, but you haven't blocked remuted them on Facebook,
that your stories, like all those stories are still coming
across in messengers, so you can still see content from
someone because it's linking both of those both of those profiles.
So, you know, you could see how, taking, like, an
online harassment lens, why that's really bad,
that's really harmful, and could be potentially, you know, upsetting
and triggering for folks.
Speaker 3 (54:14):
Yeah, I'll add that the major thing
I want people to think about is that encryption
really does work, and it works really well. And we
can see that because a lot of countries right now
are trying to pass laws that either weaken or ban
encryption, right? And in fact, the UK did just pass
such a law, the Online Safety Bill,
(54:37):
in the UK. And so it's really important that we
push back against these laws and
fight back against these laws however we can, right?
And I'm not coming at
Speaker 9 (54:49):
This as somebody who's a big believer in the you know,
in in incrementalism and in working with governments, but I
think that I still think that it's really important to
you know.
Speaker 3 (55:01):
Educate folks and push back against these laws and try
to not let them pass, because they will be really
bad for all of us.
Speaker 4 (55:10):
Totally. And not to defend the Online Safety Bill, because I
would never do that, I'll go to my grave not
speaking highly of it, only speaking critically. But at least
the pushback from encryption experts and encryption supporters like Meredith Whittaker,
president of Signal, did lead to lawmakers in the UK,
for example, admitting that there's no sort of feasible, safe
(55:33):
way to build a back door, right? And that is,
I think, also a win. Because of so much pushback,
because of so much research, because of so much criticism
that security and privacy folks, people that are
pro-encryption, gave, we were able to
walk back that part. And I do think that's a
(55:53):
big deal, even if there are other issues with that bill,
because I think it also sends a signal, pun intended,
to other governments as well, and I think that's
incredibly important. But yeah, I would also say, just
use Signal whenever you can.
Speaker 2 (56:13):
But yeah. Yeah, well, all right, folks, that is going
to be it for us here at It Could Happen Here.
Thank you all for listening, and thank you, Cooper and
Caroline, for coming on.
Speaker 3 (56:27):
Thank you for having us, yeah.
Speaker 5 (56:29):
And thank you for having us.
Speaker 4 (56:30):
You can find us on social media for now, I
guess until it all.
Speaker 5 (56:35):
Lights on fire.
Speaker 2 (56:36):
Yeah, whichever one you want to trust.
Speaker 3 (56:40):
I'm Cooper Q on most social medias, Blue Sky, Mastodon,
Shitter.
Speaker 4 (56:46):
Yeah, I'm Caroline Sinders, my first name, last name. Our
lab is Convocation Research and Design, Record Labs on Twitter
at the moment.
Speaker 5 (56:55):
Hopefully we'll be getting on Blue Sky very soon.
Speaker 2 (56:58):
Yeah. Yeah, probably get back on there more.
Speaker 3 (57:01):
Now.
Speaker 2 (57:01):
Twitter has gotten remarkably worse. Which, you know, back
in the day on the old Something Awful forums, there
was a thread in one of the debate forums
about this very right-wing site called Free Republic, which
is like one of the earliest reservoirs of what became
Trumpism, and the tagline for the thread, just kind
(57:24):
of watching these people, was: there is always more and
it is always worse. And boy, goddamn, if that hasn't
been a continually accurate statement about the whole of social media
Speaker 4 (57:34):
Right now. Isn't it amazing to watch someone just
light forty billion dollars on fire?
Speaker 2 (57:39):
Yeah, totally. It's
like the nihilist in me being like, wow, Comrade Musk
really taking some hits at capitalism here.
Speaker 1 (57:54):
It Could Happen Here is a production of Cool Zone Media.
For more podcasts from Cool Zone Media, visit our website
coolzonemedia dot com or check us out on the
iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
You can find sources for It Could Happen Here, updated
monthly, at coolzonemedia dot com slash sources. Thanks for listening.