
February 23, 2021 48 mins

Over the last several years, we’ve learned the hard way that disinformation, when combined with the power and reach of social media, can radicalize, divide, and destabilize communities -- and even entire countries. In this episode, Hillary talks with social media and technology expert Tristan Harris about how we got here, and what we need to do to mitigate the influence of Big Tech on our democracy. She also speaks with award-winning Filipino-American journalist Maria Ressa about why the Philippines’ shift away from democracy and toward authoritarianism should serve as a warning to us all.


Tristan Harris spent three years as a Google Design Ethicist developing a framework for how technology should “ethically” steer the thoughts and actions of billions of people from screens. A featured subject in the Netflix documentary, The Social Dilemma, Tristan is now co-founder & president of the Center for Humane Technology, whose mission is to reverse “human downgrading” and re-align technology with humanity. He co-hosts the Center for Humane Technology's Your Undivided Attention podcast with co-founder Aza Raskin.


For her courage and work on disinformation and “fake news,” Maria Ressa was named TIME Magazine’s 2018 Person of the Year, and has also been named one of TIME’s Most Influential Women of the Century. A journalist in Asia for nearly 35 years, Maria co-founded Rappler, the top digital-only news site in the Philippines. Maria has endured constant political harassment and arrests by the Duterte government, forced to post bail eight times to stay free. In June of 2020, Maria was found guilty of cyber libel charges, which carry a sentence of up to six years in prison. Maria is profiled in Frontline’s A Thousand Cuts, directed by Ramona Diaz, and now streaming online at pbs.org/frontline and on YouTube. The film is also available to stream in the PBS Video app and on the PBS Documentaries Prime Video Channel.


Full transcript here.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You and Me Both is a production of I Heart Radio.
I'm Hillary Clinton, and this is You and Me Both.
This week, we're talking about a topic that not only
has kept me up late at night, but I think
is one of the biggest threats we face to our society,
to our democracy, to who we are as a people.

(00:21):
And that's the spread of disinformation. Now, the first time
I became aware of this was during the election of twenty sixteen. I really did not understand it, and it was only
after I saw how outright lies and conspiracy theories and crazy,
unbelievable stories could take hold on the Internet, on social

(00:44):
media platforms and change how people thought and behaved. I'm
talking today to two people who have seen the dangers
of disinformation up close, and they are doing everything they
can to stop it. I'll speak with Maria Ressa, a
journalist from the Philippines and a personal hero of mine.

(01:07):
She has risked her own safety and freedom by reporting on President Duterte's corruption and propaganda campaigns, and she has an important warning for the rest of us: the rapid-fire spread of online disinformation that has happened and is happening in the Philippines, you know, can happen anywhere, especially

(01:28):
here in the US. But first I'm speaking with Tristan Harris.
You may have seen Tristan in the Netflix documentary The
Social Dilemma. He's been called the closest thing Silicon Valley
has to a conscience, and he's the co-founder of
the Center for Humane Technology, which is working to help

(01:49):
us realign technology so that it actually helps people. I
wanted to talk to Tristan because I met him shortly
after the twenty sixteen election, and he was able to describe
to me what had happened during that election in a
way that I didn't understand before. And he has been
sounding the alarm in every way that he knows about

(02:11):
what is happening to all of us because of the
way social media platforms run. I have to start by
just asking you to give a little bit of background
for our listeners, because although you're now one of the
leading global critics of the way major tech platforms contribute

(02:32):
to a toxic online ecosystem, you worked in the industry
for years. You were really on the forefront of a
lot of the technological developments that we're now coping with. Could you give us a little background? Yeah. So, you know,
I went to Stanford and studied computer science at the

(02:52):
time when many of my colleagues later became the new
heirs of this new tech ecosystem that we're now building.
I started a technology company myself called Apture, which was later talent-acquired by Google. And I say that just to have maybe the listeners understand that I don't come to this wanting to tear down the entire project of technology or be a sort of dystopian Luddite. There are very

(03:15):
specific critiques of technology that we have to fix and
that have totally warped the collective psyche of humanity. But
hopefully for people to understand that, you know, I came
to these conclusions that we'll get into through first saying,
you know, how do we build technology that helps the world,
and seeing my friends start there and then end up
in this kind of perverse race for human attention. Because

(03:36):
that's really what changed in two thousand eleven, two thousand twelve.
Many of the people who were my friends in technology
had gotten into it because we thought we could make
the world better and build things that would improve the
state of the world, and increasingly everyone was just getting
trapped in this game of what can I dangle
in front of people's nervous systems to get them to
come back more and more often. How can I turn
things into more of a slot machine? How can I

(03:56):
add more social validation? What if I could get people
addicted to getting attention from other people. I remember when
Twitter and Instagram added the feature that allows you to
follow people so you could have followers. That was an innovation, right,
the idea that suddenly I have, you know, an increasing
number of people who are listening to me every day,
and I want to log back in every day to
see did my follower count go up? They got all

(04:18):
of us addicted to getting attention from other people, which
is really the root of many of the troubles that
we see today. When you now look at where we
are in technology, how do you think we got there?
I think it's very much a product of decision after
decision that we didn't know what road we were walking down,

(04:40):
and discovering much later the harms and I think being
wrong philosophically about the narratives we were telling ourselves. For
so long, people thought technology is this neutral tool. I mean, after all, on Facebook, it's like you choose your friends, you chose the things you clicked on, you chose the things you liked, commented on and shared. So why would we call Facebook responsible for radicalizing the world or something like that? And what that misses is the asymmetry

(05:03):
of power where they are really the ones orchestrating and
pulling the puppet strings of why would certain things show
up to you in the first place, why would certain
groups be recommended to you? I think this came because
we let these algorithms run out of control. It really
was this race for attention figuring out you know, if
Facebook's looking at TikTok or Instagram competing with it. If
they don't do say live video or add Facebook groups

(05:26):
or add autoplay and the other guys are, they're gonna lose. And that's why we call it this race to the bottom of the brainstem, a race to the reptilian brain, to the kind of automated decision making. And that's really, I think, what got us here, and I think what radicalized an entire population, leading to things like the January 6th event. Right? Was it this work that

(05:47):
led you to begin your understanding of how these applications
and these deliberate designs to capture attention, truly making it an addiction, began to make you question what was happening? Yeah,
I think there's really two ingredients to my own kind
of waking up process. One was this earlier work I

(06:11):
did as part of this class at Stanford called the
Stanford Persuasive Technology Class, where the co-founders of Instagram and again many of the other alumni in the tech industry learned from a certain body of work about how do you make technology more persuasive, how do you pull on these invisible levers in the human mind, much like a magician. You know, in the film The Social Dilemma we talk about my magic background. I was a magician

(06:32):
as a kid, and so I got really interested in
how do people's minds work. What are the predictable ways
that our attention can be fooled, that we can be
led to make meaning of the coin moving from that hand to that hand, when maybe that's not true. We make false conclusions all the time, and I saw that was one of the ingredients in what I saw technology
was doing. Then getting to Google, I first started off
again as this kind of optimist, but I felt increasingly

(06:55):
concerned about where the whole industry was heading. This kind
of social media craze. Everyone was building more and more addictive apps with more and more notifications buzzing your phone.
And I made this presentation at Google after kind of
having my own questioning process, and I was kind of
just broadcasting my own questions and concerns. I wasn't trying
to tear down the company or do a whistleblower type thing.
I just made this presentation basically saying, never before in

(07:18):
history have fifty designers in California, mostly white, young, engineering, you know, male minds from twenty to thirty years old, shaped a billion people's daily psychology. We are holding the collective psyche of humanity in our hands and we don't even know what we're doing. And obviously it's leading to more distraction, it's weakening relationships. This was back in two thousand twelve. That presentation went viral when I was a

(07:41):
product manager, unexpectedly. I thought I might get fired for
the presentation and was prepared to leave if I had to.
But instead, and again to Google's credit, actually they ended
up giving me a little position, a carve out position
to research. But I didn't know what else to call
it except design ethics. How do you ethically design the
psychological experience of other people, and I really did research
for several years. I tried to change things from the inside,

(08:03):
but ultimately I had to leave because, as I'm sure
we'll get into, the business model of capturing attention is
the fundamental thing that all of these companies are trapped in, where, you know, Facebook's stock price, Google's stock price, YouTube's stock price depend on growing the amount of attention that
they have every day, every year from users. In a
funny way, even though people have been on these platforms

(08:25):
now for more than a decade, I'm not sure a
lot of them really understand the underlying business model, how
it's financed, and how in effect they are commodities. Every
single person who goes on one of those platforms is
being used, and our data, our privacy, the security of
our personal information, all of it is being commodified. Yeah.

(08:49):
You know, in the film The Social Dilemma, we quoted
the line that if you're not paying for the product,
then you are the product. So you know, how much
have any of us paid for our Facebook account recently? Well, nothing. And how are they worth almost a trillion dollars of market cap? Because they're selling the predictability of our attention. They're selling the manipulability of our minds. Their business model is advertising.

(09:09):
Now again, if people are enjoying this, it's very hard
to identify where the concern is. But if we don't
look way out and sort of see how the whole system is behaving, that it's actually polarizing us, then we're really in trouble. Because one other aspect of a business model
that relies on attention is which news feed would work
better at keeping your attention? One that personalizes the information

(09:31):
to you, so every time you scroll your finger, it
gives you a next post that says this is why
you're right and the other side is wrong, versus a
news feed that does not personalize information to you. Well,
the personalized news feed is gonna work better at keeping
your attention. So then it takes the shared truth and shreds it into three billion Truman Shows, where we're
each more and more confident and certain of our view

(09:51):
of reality. And it's done that for everyone, which has really emphasized tribalism, polarization, and broken down our capacity to talk to each other, which I think is the ultimate concern about where this leads. If we don't have a shared reality or the ability to have a conversation with shared understanding, we can't solve any problem, whether it's climate change or racial injustice or COVID. Also, what

(10:12):
we know from analyzing the top postings on Facebook and
other kinds of metrics for social media is that the
most sensational, the most controversial, you know, the most conspiracy
minded forms of disinformation are going to capture the most attention. Yeah,
which is really an extension of problems that we've been

(10:33):
familiar with around yellow journalism and sort of salacious you know,
putting car crashes and killings on the front pages of newspapers.
But this extends it because it's now automated by machines.
You know, one study we have on the Ledger of Harms website, on the Center for Humane Technology website, is a study about Twitter showing that for each word of moral outrage that you added to a tweet, it increased the

(10:56):
retweet rate. In other words, the more negative emotional language you use, the more indignant resentment, the more "look at these hypocrites," you know, that kind of thing, the more retweets you get. If you have a system that's profiting indiscriminately from whatever gets the most attention, you know,
using that logic, imagine you're driving down a freeway and
Facebook is looking at what people look at while they're

(11:17):
driving on the freeway, and everybody looks at the car crash. Well,
according to that logic of Facebook and Google, the whole
world must want car crashes. So we made a philosophical
mistake that what we look at is what we want
or what's good for us. And really it's almost the
worst of us. It's almost like the inmates are running the asylum. But the inmates are the human amygdala,
the fear and outrage center of the brain, and that

(11:39):
has become the choice making basis, the basis of authority
in our society, in our information system. Well, when you
look at where we are right now and you think
about what is it we could be doing to try
to rein back in the operations of the platforms, and

(12:00):
you know, without sounding too Pollyannaish, try to return it more to the original vision of what it was supposed to be doing in connecting people, creating community and all
of that. Are there things that platforms can do on
their own to change their business model that would make
them less toxic or is this going to have to

(12:20):
be imposed on them. This is an incredibly difficult conversation because,
you know, I actually believe there are a lot of really good-hearted people in the tech industry who would like to change things. No one actually wants this to happen, I want to make that really clear. I think there's not a mustache-twirling evil genius who wants to do all this. But here's an example of why, with Facebook for example, we

(12:42):
need to put pressure on them from the outside. You know,
as people are following all these Facebook groups with, you know, more extremist, you know, militia groups, Facebook is increasingly on the hook with public pressure to try to do something about these kinds of groups, about this kind of content, and that's getting really expensive for them. They have to
hire more content moderators, they have to take actions. When

(13:04):
they take those actions, it's more political, because they get
blowback from different sides saying you're not taking down this,
you're not taking down this. So the costs to them reputationally are going up. The costs to them in terms
of hiring content moderators and building integrity teams and monitoring
for hate speech, all that's going up, and that's getting
really expensive for them. So one of the things I
know is happening now is that they're in the process of

(13:26):
moving to an encrypted model where groups and communication are encrypted.
So they're moving from "we didn't know that January 6th was going to happen" to "we can't know." In other words,
they're making it impossible for them to actually do something
about these problems because it's too expensive. And the reason
for that is that their business model makes it impossibly

(13:47):
expensive for them to moderate all this content. They have
something like fifteen billion posts or messages that are running
through their system every single day, and they can't hire,
you know, fifteen billion New York Times journalists to say
is this credible? Is this true? Is this real? And
so we're left with the authority of virality. Whatever goes
viral wins, and that is just fundamentally toxic. So when

(14:10):
I say what do we have to do about this,
I think we have to go much deeper. You know,
is a virality-based information environment compatible with democracy? Are rules where whatever goes viral wins? It cannot be the basis of a society that defeats climate change, cannot be the basis of a society that defeats COVID. Now

(14:30):
when I say that, I don't mean that there are these magical authorities that we just have to suddenly switch back to. There are good reasons why people, I think, are skeptical of some of the sources of authority, whether it's, you know, major news media or the CDC flip-flopping on masks; there are reasons that that trust has been debased. But I think we have to ask, almost like,
what would it take to restore warranted trust? And I

(14:54):
really mean, I wonder if there's even a kind of
humility movement for the media to sort of self-examine and say, yeah, hey, you're right, there are reasons why you don't trust us on all sides. But what would it
take to regain and restore trust so that we can
actually have an information system that is credible? And this
is an incredibly complex problem, and I think the first
step is we have to realize that this happened to us,
and that hopefully is a place we can proceed from.

(15:17):
But Tristan, is there no way that smart engineers who
created these algorithms could tweak them or in some way
moderate them so that there could be a technological fix
to the virality? The problem is there's no algorithm for determining what is true. But there are no keywords or phrases

(15:40):
that would trigger some kind of reaction? No, and in fact,
when they try to do it that way, one of
the things that happened actually after COVID, people don't know this, but Facebook actually had to shut down; they lost all their content moderators. Because the content moderators used to have to go to a physical space, log into, you know, secure computers, and then do the content moderation, post after post coming through, saying is this good, are we

(16:01):
going to remove it, whatever. Because of COVID and because
of safety concerns, they didn't go into the physical offices
and there was no way to do that work remotely.
So for the early stages of COVID, Facebook was just
this big open floodgate where anything went, right? So if
you have this kind of unregulated information environment, you know,
all the sort of COVID disinformation and misinformation about it

(16:22):
being related to 5G and all these kinds of
things sort of spread like wildfire. They don't have an
algorithm for knowing what's true, and increasingly since then they've
been trying to rely on machines to classify what is
hate speech, what is a conspiracy. And what it's doing is deplatforming hundreds of thousands of people who then
get further polarized. I have friends who are in no

(16:43):
way conspiracy theorists or you know, crazy people who are
actually getting deplatformed because they might be using words
or phrases that are near people who are maybe you know,
getting tagged by the system, and that just produces so
much anger. Now, what could you do? I mean, you
could decrease virality by just not making it so sort of hyper-fast. You can increase friction. For example, Twitter

(17:05):
did this during the election. When you hit retweet, instead
of just letting you instantly retweet, they forced you to
say something about what it is you are sharing. That
tiny little step decreased the spread of information. Well, what you just said really concerns me also, which is that
if Facebook is moving toward encryption so that you wall
off these groups so that the argument is well, they're

(17:28):
not spreading it except among people who are already in
the group, then you're likely to have a different and
maybe more dangerous breeding ground not only of disinformation but
extremist behavior. That's right, and it's important to realize also
that, globally, if you think about the users
who haven't come on to Facebook yet, they're all in
the developing world. They're all in these other countries that

(17:50):
have more fragile societies, more fragile democracies, and that Facebook
doesn't have the content moderators in those languages or have
the cultural context to do it well. So if you
think about Facebook having a system based on content
moderation and integrity where they have to do all that monitoring,
that's only going to get more expensive for the remaining
users they have to bring online, which means that they

(18:10):
want to stop this problem now and not have that
responsibility in the first place. So again, this is actually
going to get more dangerous, not just here in the West,
but for the developing world, in the global South, which
often gets the worst of these things. If you look
at countries like Myanmar or Ethiopia right now that actually
have massive disinformation problems that are driving up civil wars
and conflict, and that's, you know, an enormous problem. We're

(18:33):
taking a quick break. Stay with us. You have a
great quote that I loved in a couple of things
you've written. I first saw it when you wrote a
piece a little over a year ago in the New
York Times where you quote Edward O. Wilson, who is

(18:57):
famous for his study of ants and was the éminence grise of sociobiology, and when he was asked whether humans
would be able to solve the crises that would confront
them over the next one hundred years, he replied, yes,
if we are honest and smart. The real problem of

(19:18):
humanity is the following. We have paleolithic emotions, medieval institutions,
and godlike technology. And this really does sum up
what I hear you saying that we've advanced so far
in technology, but basically our brains, the evolution of

(19:38):
our capacity, have by no means kept up with this
godlike technology. And it's hard to figure out how we're
going to be able to rein it in when the
technology keeps racing. And assuming that we have an administration
now with the Biden-Harris administration, we have interest in Congress.

(20:02):
If you could wave a magic wand and it's an
unfair question, but I'll ask it anyway, what would you
have them do? What three things would you ask them
to do right now that could at least slow it
down until we could figure out how we're going to
better control this? Gosh, there's so much in what you just shared, and I really appreciate you bringing up E. O.

(20:23):
Wilson's quote because I really think that is the problem
statement that at least guides our work at the Center
for Humane Technology, because the answer to the paleolithic emotions,
medieval institutions, and godlike technology is we have to embrace
and honestly admit and understand our paleolithic emotions, that we respond to social validation. We have negativity bias. Our minds

(20:45):
do get pulled and distracted, and it does degrade the
quality of our attention spans. We have to know that
about our own minds. We're really one species who has
the capacity to do this. So I just want to
say that first, because I think that's the real project of being humane: it means actually a deep embrace and understanding of how we really work, even when it's uncomfortable to look in that mirror. So that's the first. The

(21:07):
second is we have to upgrade our medieval institutions from
the clock rate of being too slow and not apprised
of the harms of technology as they accelerate and grow
to have a more frequent understanding and also projecting forward
the risks that different and new technologies are going to create,
because as you said, it's not just we have to
solve today's issues with social media. We have to have
the first derivative that kind of where is this all going,

(21:28):
projected into the future, and make sure that in government
and in our democratic institutions, having strong assessments of where
these different areas are going, and having some kind of
collaboration, public-private partnership maybe, to try to make sure
we're assessing those risks, dealing with the issues, prioritizing them,
and having adequate democratic process to deal with them. So
that's the medieval institutions, and then the last is on

(21:50):
the accelerating godlike technology is you cannot have godlike powers
without godlike wisdom. A phrase from Barbara Marx Hubbard is you cannot have the power of gods without the wisdom, love, and prudence of gods. Godlike powers, be they biotech weapons, be they psychological influence, be they split testing, you know, trillions of variations of psychological influence and microtargeting, cannot

(22:14):
be wielded by those who do not have the wisdom, love,
and prudence and care that a god would have. Think of if you're Zeus and you accidentally bump your elbow and don't know you're Zeus; you just scorched half the earth with a lightning bolt without even realizing it. We have these powers of gods, and we have to be able to have the responsibility and self-awareness and, frankly, rein in that power. Now, what does that look like?

(22:35):
That's a more difficult question. I think we need a
culture of responsibility in the tech industry. There should be
almost a license to practice these kinds of exponential powers,
and you could lose that license. We have that in law,
we have that in medicine, we have that in mental health.
If you think about how much compromising information a technology
company has on you compared to, say, a psychotherapist. Like,

(22:55):
there you are in the psychotherapist's room. You're sharing all your anxieties, you're sharing all these sort of secret parts of your mind and your weaknesses. Imagine that the business model of a psychotherapist was to turn to the advertiser and say, hey, do you want to manipulate Hillary here based on, you know, what she just said? That would be ridiculous, and they would lose their license if they did that.
You compare how much a psychotherapist could know about your

(23:17):
weaknesses to a supercomputer that's been pointed at your brain
the last ten years, learning every pattern of what you've clicked,
what you've watched. These things know us so much better
than we know ourselves. And when you recognize the asymmetry
of that relationship, there's a compromising amount of information that
technology companies have on each of us. They cannot have
a business model that's based on exploiting and extracting from

(23:38):
that information. So I just say this because this sets
up the kind of dramatic action that I think a Biden-Harris administration would need to take. I mean, if you really want to go all the way, you ban surveillance capitalism, you ban the microtargeting-based business model.
It has become toxic. It has warped our society. This
is not a partisan conversation. This is so critical that
this is beyond partisan. This is a transpartisan, beyond partisan

(24:01):
topic because it's really narrowed each of our views to
make it very hard to have a shared understanding of reality,
and that harms everyone, whatever it is that you care about. I want to really zero in on the individual,
because listeners could hear us speaking and just throw their
hands up and say, well, yeah, okay, it's a mess,
and yeah they know everything about me. But I'm not

(24:21):
going to get off of Facebook. I'm not getting off
of Twitter. I love watching TikTok, you know, so I'm kind of, you know, in for a dime, in for a dollar.
What could individuals do to take their own attention spans
back and not only try to have more control over
their own minds, but also to try to avoid consuming disinformation?

(24:42):
I think what people need to protect themselves is to understand how this works. And I don't mean to self-promote here, but I think that the film The Social Dilemma is a partial antibody or semi-inoculation to understanding some of the effects. At least you know what you're getting into. My colleague Renée DiResta actually is the one who showed a conspiracy correlation matrix.

(25:03):
You know, if you're a new mom and you join a do-it-yourself baby food group on Facebook, which is a great tool, you can make your own baby food, organic baby food, you're great. When Facebook went aggressive on recommending groups for people to join, it was trying to figure out, well,
which group could I recommend to these new do-it-yourself baby food moms that would be most engaging for them,
that would keep them here the longest. What do you
think was the most recommended group to them? It was

(25:25):
the anti-vaccine conspiracy theory groups for moms. If you join that group, then Facebook does the same thing again: hey, what's the most engaging group I could show people who look like that, who have joined the anti-vaccine moms group? And the next recommendation was QAnon. Gosh, so Facebook basically directed these women. Yeah, this was an automated

(25:45):
system that was finding the kind of conspiracy-minded people
and actually recommending more things just like that to them
because that's quote unquote what they like or what they want.
But it's actually not what they want, it's what they
will be persuaded to click on and to fall into.
And those echo chambers are so powerful because, as you said,
you know, when people are inside of these closed

(26:07):
echo chambers, like-minded people getting social validation and affirmation from these crazier beliefs, you can really lock yourself into a worldview. And there's a great Reddit channel called QAnon Casualties where actually a lot of former QAnon believers, especially post-January, are talking about how hard it's been
for them to sort of snap out of it and
what that process was. And thank God for this channel
because it is a story of alumni who are kind

(26:30):
of all sharing stories of why they believed it in the first place, and what was it like
and what did it cost them? And I think that's
really important. But when it comes to, you know, how can we live in this system? Again, the first thing is we have to understand what it's doing. We have
to understand that we are the product, not the customer.
So if you're using it, just understand the relationship. Every
time you use it, you have a supercomputer pointed at
your brain. It does not distinguish between what is true

(26:52):
versus what just went viral. It is not trying to
protect you. It doesn't know what's good for you. It's
just trying to figure out what keeps your attention. And
so we should use it, if you use it, with a consciousness that is adequate to understand the perversity of that relationship. And I still have a couple of social media accounts; I even use them, very minimally. I think using it less, you know, spending more time face-to-face

(27:13):
with people and actually having conversations, trying to make sense
of the world with people together. The final question we
can ask is what is worth our attention? Which is
a very spiritual and almost very basic question, which is
what is really worth my attention? What is worth our
society's attention? What is the cost of not putting our
attention on the things that we actually care about? And
I think if you just spend one day away

(27:35):
from social media, or one weekend, the difference in our whole psyche, and you talk about our attention spans, we
come back feeling so different. Just to follow up on that, Tristan,
because I think that's really helpful for people. So how
do you get your news? You know, there really should
be more out there, I think, on this, because I think that we should see social media as unsafe for getting

(27:58):
our political news, because it's basically just a drama snowball machine. It allows, you know, the most indignant, you know,
dramatic hypocrisy and people's anger about hypocrisy to just show
up constantly, never ending. So I think we have to
just recognize, you know, what we're in for. You know, I try to watch and understand views that
are different from mine. I try to follow lots of

(28:19):
media from people's different perspectives. There are some wonderful groups out there, like Braver Angels or Courageous Conversations, that are working on this depolarization work. I think there needs to be much more funding of that kind of thing. That could be something that also could be government funded. And, you know, really just unfollowing outrage media, whether it's partisan television. You know, frankly, I wouldn't watch either MSNBC or Fox News. I just think that

(28:41):
outrage media is just not healthy for our society. People
can argue about differences about how they conduct themselves, but
I just think asking what are really long-form, you know, understanding-based kinds of media that actually really do cultivate long-term understanding. Because also, so much of this is just breezing by our brains so fast that you end up reading all this news, and then you say, do I even know what I just did for the last hour? Right?

(29:03):
Your mind sort of feels empty afterwards? Yeah. Yeah, how can I be deprogrammed when I don't even know I'm programmed? Exactly. We said that in the film, which is how do you deprogram yourself from the matrix when you don't know you're in the matrix? Now, of course each side is convinced that the other side is in the matrix, but at least in The Matrix, we had a shared matrix and not two billion different ones, right.

(29:23):
I just can't thank you enough for the work you're doing,
for your incredibly thoughtful approach to these really difficult issues.
And I am such a grateful admirer. So I just
know there are a lot of us in your corner.
Thank you so much. That means so much to me.

(29:46):
Tristan Harris is the co-founder of the Center for Humane Technology. He also co-hosts a podcast of his own called Your Undivided Attention. As Tristan pointed out, the
combination of social media platforms and disinformation has been and
continues to be hugely destructive in the United States. But

(30:08):
this combination can cause even greater harm in fledgling democracies
or outright dictatorships. And no one knows that better than
our next guest, Maria Ressa. I first crossed paths with
Maria when she was leading CNN's Southeast Asia bureau. I
followed her career, and then in twenty twelve, Maria left traditional

(30:32):
media to create a new outlet of her own, an
online news site called Rappler. Four years later, she found
herself reporting on President Duterte, and there was a
lot to report. He was democratically elected in twenty sixteen,
and right after that he took aim at the country's institutions.

(30:53):
He praised martial law and the extrajudicial killings of
people he claimed were connected with drug trafficking. He targeted
freedom of the press and journalists like Maria, and every
step of the way, Rappler has been there shining a
light on it all. In twenty eighteen, Time magazine included her

(31:16):
in a group of journalists they named Person of the
Year for bravely holding the powerful accountable. You'd be hard-pressed to find someone who has experienced disinformation on
personal level like Maria has anywhere in the world, so
I was really looking forward to catching up with her

(31:38):
for an important and timely conversation. Maria, hello! You know, my Lord, I have thought about you so often. I can't tell you how much my heart is with you and everything you're going through, Maria. Let me start by
introducing you to those who may not know you. You know,

(32:00):
long before Maria started winning international awards and acclaim, she
was a high school student in New Jersey who then
went on to Princeton and then went back to the
country of her birth, the Philippines. And I'd love for
you to connect the dots for us, Maria, how did

(32:21):
you go on this life journey that brings you to
where you are today? Oh? My gosh, A search for home?
I mean, weirdly, I was looking for home. I was
trying to figure out who I was. I grew up
in America, in New Jersey, you know. But while I
was growing up in the States, I never felt completely American.

(32:42):
I wasn't as articulate. I was small. I walked into the third grade in public school, and I was afraid to speak because I spoke the dialect, right. So anyway,
so I find my way home. In eighty-six I was here on a Fulbright going the other way. That was
the only way I could have afforded to come home,

(33:02):
to come to Manila. It wasn't home yet. And then
when I got here, that's when I realized I'm not Filipino.
I am very American, and I learned about the Philippines
through the news. I set up the Manila bureau for CNN. Yes, so I guess that was

(33:23):
it, right? But it's like the search for identity again,
like for everyone. When I think about you, I think,
in a way, your returning to Manila sort of followed the arc of opening up and democracy and hopefulness about
the Philippines. How would you describe that? So you come

(33:44):
back and what did you find when you first got there?
And then how did that evolve? The People Power Revolt in nineteen eighty-six was my introduction to the Philippines as
an adult. And when I came in as a reporter here,
I covered the blossoming of democracy in the Philippines and in all of Southeast Asia. That was the privilege that I

(34:07):
had being a reporter at that point in time, and
you felt like, my gosh, democracy is just going to
steamroll through. So I became a reporter. I watched the pendulum swing away from one-man authoritarian rule in Southeast Asia.
And then now I would say, at the tail end
of my career, I begin to see the pendulum swinging

(34:28):
back and it is horrifying because it feels like the
search for justice, strengthening institutions, everything I've worked for professionally
has been thrown out. And the irony of speaking to
Hillary Clinton is that as I became a target of disinformation,
no one has felt that more than you, because in

(34:50):
the end, the disinformation is what technology had enabled: this propaganda
at an exponential scale. The Philippines was one of the
countries where it was like a testing ground for the
real target, which is America, the elections. We are the

(35:10):
Petri dish for Cambridge Analytica. They tested these tactics of mass manipulation here, and when it worked, they ported it over to you, because you were the target. Why the Philippines? A hundred and ten million people. We are now six years running: Filipinos spend the most time on social media globally. Really, the whole world? The whole world, six

(35:33):
years running. So even Katie Harbath, who was with Facebook,
we were doing something together in Berlin, and you know,
she admitted we were patient zero. And I guess I worry that this isn't over, and it isn't over for America. It's far from over. I want to circle back to that because really that's what I want to focus on today,

(35:54):
is this whole ecosystem of disinformation. But to just again
sort of center you and where you are right now: what is Rappler? I mean, when you founded it a decade ago, it was scrappy. A lot of people weren't very optimistic about its success. But boy,
have you made it a real journalistic center. So I

(36:18):
did six years heading the largest news group. When I left,
I resigned because I thought technology was taking over and
that legacy media was not going to be able to
recover fast enough to take advantage of it. And that's
what I wanted to do. At that point, I was writing

(36:38):
my second book, it's called From Bin Laden to Facebook.
So I was looking at, you know, if virulent ideology
that was hijacked by al Qaeda, if that can spread
through social media, why couldn't the forces for good use
this new technology? And that was the basis of Rappler.
We wanted to take your cell phone and be with you,

(37:02):
and we succeeded at it. But Facebook and the social
media platforms, frankly, they just got too greedy. They fine-tuned the design of the platform, to the point that this thing that was empowering people now became a behavior modification system that has been opened up to geopolitical power plays.

(37:26):
In A Thousand Cuts, the documentary about you and Rappler, there is some video of early interviews that you had with the then newly elected President Duterte, whom you are very respectful toward. You are asking the questions of
a seasoned, experienced journalist and he seems to be responsive.

(37:47):
And then as it goes on, if you ask hard questions,
if you publish embarrassing, inconvenient information that counters what he's saying,
he becomes more and more hostile and the enablers around
him are beginning to take it out on you and Rappler.
So that's a shorthand description of what you've gone through

(38:10):
with this latest leadership in the Philippines. Yeah, I mean,
let me put it over the last four years, right, and connect it to how this was used, the influence operations. I think we kept doing our job. We tried to hold Duterte accountable for the drug war and then the propaganda war. So the narrative "journalist

(38:35):
equals criminal" was seeded. It came up a million times bottom up; then President Duterte said the same thing top down in his State of the Nation address, about us.
A week later, I got my first subpoena, and the government filed eleven cases against me and Rappler. I was

(38:57):
arrested twice. There was one point I got off a
plane and, you know, they bring me into this police van, but they're all wearing SWAT gear like I'm a terrorist. All of the trials begin, I
get convicted. So I'm convicted for cyber libel for a
crime that didn't exist when we published the story, a

(39:20):
story I didn't write, edit or supervise. That's this Kafkaesque moment. So I have ten arrest warrants in less
than two years. That's how the influence operations work, information and power. It acts like fertilizer to allow democratically elected leaders like Duterte, who use the levers of power of democracy to cave

(39:43):
it in from within. It's like termites eating the wood.
It still looks solid, but the minute you step on
it it'll break. Our democracy look strong, but they are
extremely weak. We'll be right back. What do you hope

(40:09):
that this new Biden-Harris administration and other leaders can try to achieve to shore up democracy, to go after the termites? You have to think like an atom bomb has gone off in the information ecosystem. It's carnage, right?
It's going to be impossible to bring this together using

(40:31):
the old tools that we had, and we must come
together like the world came together post-World War Two, to create new structures of how to deal with this. And the first is to stop the insidious manipulation
that we are all being subjected to. It is not
possible to think that the old media ecosystem or the

(40:53):
old checks and balances are going to work the same way.
And I'm hoping America takes a lead role in this, because, frankly, you allowed American companies to do this globally. That's true. So please do something. Well, you know,
it does seem that a few platforms have recognized the

(41:14):
role that they played in the Stop the Steal movement,
the insurrection, the attack on the Capitol. They barred Trump
and Steve Bannon from posting. What do you think about that?
I mean, too little, too late? You know, there's all
of this, like everyone is talking about it, debating whether it's a free speech issue. It isn't. This

(41:36):
is a distribution issue. It is, you know, insidious manipulation.
You wouldn't have had to ban him, the former president,
if you hadn't allowed it to get to this point.
It's like, how many times can someone lie or incite violence or attack women with impunity? Right? And I

(41:59):
think that's been the problem globally, is that the social
media platforms, when it's convenient to them, claim it's free speech.
It also makes them a lot of money, and then
when it's not, they take a long time to take action.
I know this firsthand because I have asked for this. After my conviction in June, I was attacked again by

(42:24):
influence operations, and they were really, you know... I've gone through it. When you get messages by the hour, you get used to it, like I'm sure you get used to how to deal with criticism, right? There's valid criticism, and then there is
criticism that is meant to pound you to silence. It's
beyond hate speech. It's dehumanizing. It's turning you into a

(42:48):
caricature that breaks down the inhibitions of otherwise, let's say, ordinary
people to go after you. And it happens in such
a short period of time, and it cannot happen without
a concerted campaign to make it happen. And that's what
a lot of people don't understand, which you have not

(43:09):
only reported on, but lived through, Maria. This doesn't happen
by accident. You know there are bots, these are not
even real human beings that are unleashed to perpetuate the
lies and the attacks and the undermining and the dehumanization.
And you know now you're facing what any reasonable observer

(43:31):
would conclude are bogus, made-up charges, all kinds of
charges against you that could literally lead to your being
imprisoned for the rest of your life. And they're not
satisfied with one charge. They have to keep bringing charges.
They have to keep hauling you into court. And as
you said, the arrest that you faced in the airport

(43:53):
when you landed, I think from the United States after
that very long flight, is in A Thousand Cuts. And
it's all meant to destroy empathy for you, sympathy for you,
to try to create this sense that well, you know,
I always liked her. I thought she was a good reporter.
But you know there must be something to it. They

(44:16):
don't make everything up, do they? And as somebody who, you know, has had a million things made up about me: oh, yes they do. But what really touches me and
impresses me but also worries me, Maria, is that you
keep coming back to the Philippines. You get out on bail,

(44:37):
and you go off for an event for getting an award,
for making a speech, and then you come back and
I just am so, you know, admiring but also worried
about you continuing to come back. But it seems that
that's your commitment, that you are going to stand your ground,

(45:00):
so to speak, in the Philippines and not just stand
for yourself, but stand for democracy, stand for the freedom
of the press. How do you feel though when that happens? Gosh, Um,
I don't think I have a choice, you know, because
if I am who I am, and if I believe

(45:21):
in the standards and ethics and the mission of journalism,
which I do, then I don't have any other choice.
And I feel like, you know, you don't really know
who you are until you're tested, and it's up to what you compromise that defines who you are. I've lived my life with no regrets and I don't intend to have any regrets, right? In

(45:43):
this one. I have a company that has a hundred
and some odd people who are idealistic, who will
do the job, and we keep doing our jobs. We
just released another exclusive story of corruption in the government,
and of course there will be retaliation of some sort.
But it's okay, Maria. I will do everything I can,

(46:05):
through you know, my platforms, to support you, to support Rappler,
because it's not only about you, as much as I
admire and respect you, and it's not only about the Philippines.
It is, as you rightly say, it's about the entire world.
Whether we're going to maintain freedom and truth and an

(46:25):
ability to talk with one another, to have some sense
of community to solve common problems, which, as this pandemic has demonstrated, is so needed. And we need your voice
and your leadership and your example. And Maria, thank you
so much for talking with me today. Thank you. You

(46:52):
can learn more about Maria Ressa's fight for press freedom
and against disinformation in the Frontline documentary A Thousand Cuts.
It's streaming online at PBS dot org slash Frontline and
on YouTube. And if you want to help Maria and Rappler,
you can do so by making a contribution to their

(47:13):
legal fees. Go to Press Freedom Defense Fund dot org
slash donate. We all have to do everything we can
to protect ourselves and others from disinformation. I don't think
this is a partisan issue. This is a human issue.

(47:35):
It's a democracy issue. So just stay aware, check your sources.
Don't believe everything you see online or hear from people
in power. You and Me Both is brought to you
by I Heart Radio. We're produced by Julie Subrin, Kathleen Russo and Lauren Peterson, with help from Huma Abedin, Nikki

(48:00):
Ettore, Oscar Flores, Lindsay Hoffman, Brianna Johnson, Nick Merrill, Rob Russo, Opal Vadhan, and Lona Valmoro. Our
engineer is Zach McNeice and the original music is by
Forrest Gray. If you like You and Me Both, please
share it with your friends. Let them know they can

(48:20):
subscribe to You and Me Both on the I Heart
Radio app, Apple Podcasts, or wherever you get your podcasts.
Thanks for listening and see you next week.