Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
You're listening to Comedy Central.
Speaker 2 (00:06):
Think back twenty years. Maybe you were in school or college.
A friend comes up to you with a twenty dollar bill.
They say, check this out, and they start folding it
in a kind of weird way, kind of in half.
Then it comes to a point. Then you realize it's
the shape of the Pentagon, and the image on the
bill is now the twin towers with smoke coming out
of them. What did the government know about nine eleven
before it happened? If you ever experienced that, or
(00:27):
if you ever had that thought, then congrats, Alex Jones.
You're a nine eleven conspiracy theorist. This is Jordan
Klepper Fingers the Conspiracy. September eleventh really was the ground
zero of conspiracy theories. Chances are you can name one.
Jet fuel can't melt steel beams. George Bush did it.
What about Building Seven? Osama bin Laden is a CIA
(00:49):
operative named Tim. What's that, you don't know about that one? Well,
someone told it to me just a few months ago
at a Trump rally. People were talking: is bin Laden still alive?
Speaker 1 (00:58):
Jim?
Speaker 2 (01:00):
Are you doing math right now?
Speaker 1 (01:01):
No, I'm trying to remember his real name.
Speaker 2 (01:04):
Tim? Osama. Osama bin Laden. Yeah, Tim. Tim. Someone forgot
his last name. Tim is not the most Saudi name.
Speaker 1 (01:12):
And he wasn't Saudi. He's from the CIA.
Speaker 2 (01:15):
Needless to say, when we heard about Tim bin Laden,
we were like, let's get to the bottom of this huckleberry.
And even though our unverified non tipster couldn't remember Tim's
real last name, we found it. His name is Tim Osman.
Totally fake guy, but his name is Tim Osman. So
I want to go through this conspiracy theory with a
person who is a specialist in media manipulation and the
(01:37):
effects of disinformation, Doctor Joan Donovan, the research director of
the Shorenstein Center on Media, Politics and Public Policy at Harvard. Joan,
ready to hear this story of a man named Tim.
Speaker 3 (01:49):
Yeah, I know a few Tims, so I'm interested to find
out if I know him.
Speaker 2 (01:53):
You may know that this guy lives just down the
street from you. Again, disclaimer, his name is not Tim.
Speaker 1 (02:00):
There we go.
Speaker 2 (02:00):
Let me walk through this for you guys. So, this
nutter butter of a story starts in nineteen eighty six
in Sherman Oaks, California. Classic, classic bin Laden. He's twenty
eight at the time, wearing Dockers, and he's representing the
interests of the mujahideen in Afghanistan. He's at a
Hilton hotel in Sherman Oaks to meet a couple of
feds, and the name he's been assigned by the CIA
is Tim Osman. Now at the Hilton, Osama bin, Tim
(02:24):
Osman, Laden is told by the guys from the US
government that the CIA doesn't consider their group truly representative
of Afghans, and Tim gets pissed. He wants to lobby
the DC movers and shakers for support. Now, the theory
claims there is evidence that Tim tours US military bases and
other parts of the United States, including possibly the White House.
(02:45):
He's even given special demonstrations of the latest equipment, pretty
high end stuff. How do we know all this? And
by know, I mean how do they make it all up?
Because one of the Americans there to meet Tim is
a guy named Michael Riconosciuto, a man linked to
the Chinese industrial and military group Norinco, whose name is
misspelled a dozen different times on the most official looking
(03:07):
website explaining this conspiracy theory. He was apparently a loose end,
and he had to be taken care of. So he
gets arrested, accused by the US government of being delusional,
accusing him of modifying something called PROMIS software in the desert,
which obviously doesn't make sense because and I'm quoting from
the website here, sand isn't good for computers. I mean,
(03:32):
that's a fact. So Riconosciuto, which sounds like a
delicious appetizer, is put in prison and accused of making
all this stuff up. But if he were really making
it up, then why is there evidence that the modifications
to the computer software were made in an office in
nearby Indio, California. That's the story of Tim Osman. Rest
(03:53):
in power, you fake king. It's a strange way into
what is probably the original Internet conspiracy theory: nine eleven.
And that is why Joan is here. First of all, Joan,
any reactions to the tale of Tim Osman?
Speaker 3 (04:08):
I mean, it sounds legit, like, you know, clearly
we've got a reputable news organization digging up facts, and
we've got you know, layers of editors and others that
have been activated. You know, hundreds of thousands of dollars
must have been spent on this investigation, So I'm on board.
Speaker 2 (04:30):
You buy it, and you're a pro here.
Speaker 3 (04:32):
Well, of course, of course, you know what's interesting about
things like this is essentially when you're being told something
that is illicit information that you feel like you're getting
information that nobody else has, it does make you listen closer.
It makes you want to dig deeper. And when it
comes to the early internet, we you know, we think
(04:57):
a lot about well, what are you know? What is
government telling us? Right? And you have all of this
new information that you have access to, and so the
moment when the attacks on nine eleven happened, we
all were concerned, but none of us really knew what
the internet was at that point. You didn't even have
(05:18):
major news organizations taking, you know, their websites very seriously
at that stage. And so if you were going online
to find information about what happened during nine eleven,
and you were digging in, you would be drawn in
by the novelty and the outrageousness of stories like this,
and you may then find yourself moving between a network
(05:40):
of websites and message boards discussing these theories and others.
And so it's it's unsurprising, but also, we've had, you know,
twenty years of this now and it still looks a
lot like that.
Speaker 2 (05:55):
Well, when we look at nine eleven conspiracy theories,
where do we, where do you begin to hone in?
Speaker 3 (06:02):
I recently published a book with my co-authors called Meme Wars,
and in the book, we wanted to explain how basically
the Internet affects how people understand politics and communication, and
so we decided to go back into looking at Occupy,
and what we were interested in with Occupy was understanding the
(06:22):
rise of Alex Jones. And as we were digging in,
we couldn't ignore the fact that Alex Jones was also
one of the major contributors to nine eleven conspiracy theories.
But it wasn't the same then. It wasn't like he
was online pushing this so much. He had a lot
(06:44):
of television stations that were airing his show, and a
few months before September eleventh, I think it was July twenty fifth,
he had a show where he was showing
people the White House number and suggesting people call Congress
and say, we know a terrorist attack is about to happen,
(07:07):
we know that bin Laden is going to be involved.
They're going to blame it on him, and you, as
a listener, have a role to play.
Speaker 4 (07:16):
And I don't want you to believe Alex Jones. I
want you to go get these news stories off my website.
I want you to call these major newspapers. I want
you to find out if these statements by the White House
about preparing for martial law were true. And I want
you to let them know that if there is any terrorism,
we know who to blame.
Speaker 3 (07:34):
That participatory conspiracy being part of the action is something
that Alex Jones has been able to really hone in
on and bring people into these worlds as part of
his media making.
Speaker 2 (07:51):
You're saying, and you're saying this is July of
two thousand and one too. So there's people who are
paying attention, they're hearing this before it happens, and then they see
this happen and draw a connection that gives validity to a
lot of his theories.
Speaker 3 (08:03):
Does that build? It builds his base. But what
it does is he actually loses his television networks.
People are, you know, like, this is kind of crazy, this
is really out there. Uh, you know, it's very obviously
xenophobic in some ways, although cancel culture wasn't really a thing
(08:27):
then. You could be openly xenophobic or Islamophobic.
Speaker 2 (08:33):
Ah, the good old days, the good old days.
Speaker 3 (08:35):
Before, yeah, before, when you could get away with it, right.
But by and large, when we were trying to study
the rise of other kinds of political communication online,
we did keep coming back to nine eleven conspiracies and
especially memes like jet fuel can't melt steel beams. Why
(08:56):
do we even remember that turn of phrase? Nine eleven
was an inside job. You know, these turns of phrase
can become very potent and popular, and they're really sticky,
and so those kinds of key phrases also became
really important explainers or shorthand for groups of people that
(09:20):
had started to come together on message boards and in
email lists that eventually came to be called truthers.
Speaker 2 (09:28):
Now, I think what's interesting about this, you know, and
on this podcast we're looking at a bunch of different
conspiracy theories, and we often talk about how these things
spread on social media and the internet. Looking at this
as one of the, uh, births of these types
of conspiracy theories, it's also the birth of the internet
at the time. Can you give us a little bit
of background of how the Internet is being used at
(09:50):
this point, and how people are using it to
pass information, how people are getting information, understanding these theories?
Speaker 3 (09:57):
So this is before social media, so we're not in
the era of social networks in the same way that
we think about early Facebook or early Twitter. But we
are finally starting to have high speed Internet in our homes,
which allows for the transmission of video. And this is
a really important aspect of how we understand the world
(10:22):
around us, because it's no longer that you're getting your
video from cable stations. It's no longer strictly that, and
this opens up a whole new world of broadcast, creativity, innovation.
And at that moment, there were a lot of people
(10:45):
who were going online making videos, making content that were
anti mainstream media. And I would say that in that time,
even when I was using the Internet then, I was
someone who would consume these kinds of videos. I
wanted to know more about what was going on in
the world. I didn't always trust mainstream outlets. I certainly
(11:08):
didn't trust the government. I mean, I'm a child of
the Rage Against the Machine generation, right, so we always
want to question and ask more and so, but online
everything is done through hyperlinks at this point. So you're
on a website, there's a page on the website with
a bunch of links, and so you're really traveling through
(11:30):
this very labyrinth-like information ecosystem where people are linking
you to things or you're following sets of links, and
you never really know where you're going to end up,
but you always take it with a grain of salt.
You think about it. There's no institutional power behind this message.
(11:54):
You don't always know where you're getting the information from,
so you approach it with a kind of radical skepticism
at that stage. Back then, the internet was really a
place for weirdos and geeks and people who wanted to
understand more about the world, and were sharing things for
(12:14):
the love of one another, and I thought that was
really you know, it is actually kind of a nice
time in a weird way, because you could find your people.
Speaker 2 (12:25):
I remember entering in with skepticism around that time as well,
and partially because of my lack of familiarity with this
new tool. Right, it feels as if everybody was skeptical
in certain ways because we weren't experts on it. We
didn't really know how this was working or what we
were getting information on, but it was sort of like
the Wild West in a very curious way. And perhaps
I'm speaking more to myself, as somebody who's always afraid
(12:48):
of taking big steps into the unknown. So I was
always cautious about those things. I guess I'm curious about
at that time what kind of conversations or were there
conversations about the Internet and how it should be regulated
and used.
Speaker 3 (13:02):
So in nineteen ninety six, there's this landmark legislation that
is essentially a legislation of decontrol. Section two
thirty essentially says that websites or computer services are able
to moderate content as they wish, but they're not going
to be held responsible for the content on their services.
(13:23):
So that means that if you're a server, an
email host, or a domain registrar, and some crazy person
puts up stuff that's illegal, it's not your fault, right?
You're just providing this basic infrastructure. And so that law
gets passed and you start to see different web services blossom,
(13:47):
and you see groups of people still feel like they
have mastery over the means of communication. They are able
to build their own servers, they're able to register their domains,
and so essentially, at that time, online regulators and many
people using the Internet were very optimistic that there weren't
(14:11):
going to be these major crimes committed. Most legislation, or most
people, were concerned with child pornography. As we know, or
maybe people don't know, the Internet's backbone and the
innovation around the Internet actually came about as the pornography
(14:32):
industry came online. And so, the way in which we
remember Internet history, as a professor, I'm always telling my students,
you know, like, there was really, you know, it matured
around pornography. And so it's not like we endeavored
(14:53):
to build an Internet that was going to be the
place for you know, this free and open library of
information where everybody's getting access to the world's knowledge. Like,
you remember...
Speaker 2 (15:07):
AOL, right. Follow the porn, and that's, I mean, and
that's always been the history, right? Isn't that also the
innovation towards home movies, uh, like, allowing people to watch
at home? It primarily came because people wanted to watch
pornography at home, and so the technology follows the porn.
If only we could aim pornography at a
working democracy, that's what I mean. Then we can technologically
(15:30):
get to a good place. And yes, it'd be like, oh,
thank god, we have a lovely democracy that responds to
the needs of its people. How did we get here? Well,
people wanted to watch democratic porn. Fine. Okay, it's weird,
it's a little strange, but...
Speaker 3 (15:44):
No kink shaming here.
Speaker 2 (15:46):
No kink shaming, as long as my vote counts.
Speaker 3 (15:49):
But if you think about it, then, as we
describe the history of the Internet, we're not talking
about, then, like, you know, we want people to have access
to legal, you know, law libraries, and we want
people to have you know, access to the greatest science.
A lot of that stuff is still behind paywalls. And
(16:12):
so at that time, the early Internet, you know, maybe
the Wild West doesn't really even describe it, but it
was a bit of a free for all, and the
major innovations weren't you know, necessarily tied to any particular
like public interest or social good, and so conspiracy theories
(16:34):
and conspiracy communities were not just a place where you could,
you know, jump in and say things and contribute, but
these were also communities where people thought that they were
building some kind of knowledge, some kind of resistance to
(16:55):
the establishment, right. And so the Internet had in its
infancy this relationship to liberation, this relationship to if we
had the facts and we were able to communicate freely,
we wouldn't need governments, right, And so there is a
kind of techno-libertarian ethic that undergirds the rise of
(17:22):
these kinds of communities online.
Speaker 2 (17:25):
I love it. I want to take a quick break,
and when we come back, we'll be joined by Corey Rowe,
a filmmaker who created one of the first viral conspiracy
films about nine eleven.
Speaker 1 (17:33):
We'll be right back.
Speaker 2 (17:36):
Welcome back to Jordan Klepper Fingers the Conspiracy. This week, we're
talking about Osama bin Laden and his apparently rich history
as a guy named Tim from California who turned into
a CIA operative. And we're also going to look at
a few theories about what happened in the wake of
nine eleven. I'm here with Joan Donovan, who you've
been hearing from, but we also have Corey Rowe with
us today. Corey's a filmmaker and a veteran. A few
(17:57):
years after nine eleven he made a film that
went crazy viral called Loose Change. It was one of
the first conspiracy theory films on nine eleven, and
since then a lot has happened, both in the aftermath
of the film and for Corey himself. So we're going
to talk about some of that. Corey, thanks for being here.
Speaker 1 (18:14):
Thanks for having me on.
Speaker 2 (18:16):
Let's talk a little bit about Loose Change. How did
you get involved in making this film.
Speaker 1 (18:22):
I was a soldier in Afghanistan and Iraq, and my
best friend Dylan Avery, him and I were communicating from,
you know, him in the United States and myself overseas,
and, you know, just talking back and forth, and it largely
kind of came from a place of Dylan didn't really
know what was going on with me and different things
of that nature, and, you know, we started to just kind of
dig into things.
Speaker 2 (18:43):
Now, is it correct it started out as a fictional
narrative story and then morphed into becoming more of a
documentary-style film?
Speaker 1 (18:51):
Yes, that is correct. Dylan Avery, who is the director
of the film, you know, he was always aspirational and
he always wanted to make a movie, and he started
to write a script in the post nine to eleven era,
and then in doing so and you know, writing that script,
he was doing a lot of research about September eleventh,
and you know, on the Internet, researching different things and
coming across different information that the film started to split
(19:13):
kind of from like a narrative, and then there were
sections of documentary. And then he did his first screening,
and the immediate response was like, this documentary is very interesting.
You should drop all that narrative stuff. Because we had
no ability to act or do anything of that nature,
and our cameras were terrible, and you know, it was
basically still like pre DSLR days, and we had no
money or equipment to actually make a movie. But he
(19:36):
did have the ability to kind of edit together, you know,
small chunks of information on a laptop, which was really
new technology at that time. The fact that we were
even able to get a camera at all and a
laptop and be able to shoot content and edit that,
you know, on a PC was revolutionary at that time,
and it was really intriguing for us as young men,
(19:57):
and as myself coming out of the military. It was
technology that I was interested in and it was something
that I enjoyed doing, you know, shooting footage, and I
started to do it while I was in the military
making videos for my battalion and things of that nature.
And then once I got out after my second tour,
I joined Dylan in DC and he was already in
the process of releasing Loose Change, and I just kind
(20:18):
of came on board to help him produce that film
and really get it out there as much as possible,
and it just caught onto things that were really early
on at that time. Google Video, which is kind of
the predecessor to YouTube, was just coming online and it
was a way that we were able to share information
and we didn't even really do it a lot. We
uploaded like a version of the movie in English, and
then other people all around the world would download it
(20:39):
and they would change it into their language, German, Korean,
different things of that nature, and then re upload it
to Google Video. And during I think it was two
thousand and five and two thousand and six, Loose Change held,
you know, the first top video positions from one to
eighteen in all these different languages, and it was just,
again, taking off in a way that nobody expected
(21:00):
or really could have foreseen. It was just kind
of the culmination of perfect circumstances between technology that was
available to filmmakers early on, the growth of the Internet,
as you guys have been talking about as well as
you know, and this is really I think the big
thing is at that time, there was a huge response
to the Bush administration. You know, you guys just talked
a lot about why you know that these groups kind
(21:20):
of came together, and that nine eleven was the
beginning of the digital conspiracy theory, which I agree with.
It just kind of was all a response, because
the Bush administration wasn't investigating nine eleven. At a
certain point, the Jersey Girls, who were family members of
nine eleven victims, were demanding an investigation into nine eleven, and the
Bush administration, who was already entrenched in war in Afghanistan,
(21:43):
was like, no, we're not going to investigate this. We're
focused on the war right now. And that's when there
started to be this like huge uprising of people are like,
why won't you investigate it? You know, what are you
trying to hide? And then you know, for people like
myself who were overseas and fighting these wars, it was
you know, also you know, disheartening. And then you have
movies like Michael Moore's Fahrenheit nine eleven that were coming out,
and so there was a lot of anti war, anti
(22:03):
Bush administration feelings within the nation that really caused these
things to kind of culminate in different areas. And once
they did investigate nine eleven and they came out
with the nine eleven Commission Report, of course, there
was a large uproar to that as well, because it
really wasn't a sufficient investigation and didn't answer most of
the questions that the family members were asking for in
the first place, which is I believe why society and
(22:24):
members of that society like myself reacted in the way
that we did to create media that was to educate
people about things that could potentially be going on. So
they got more invested with the Bush administration and what
they were doing.
Speaker 2 (22:37):
Walk me through your headspace a little bit there, quick,
because so we're talking, you're getting involved around two thousand
and four, two thousand and five, is that correct?
Speaker 1 (22:44):
Yeah?
Speaker 2 (22:44):
How old are you at the time.
Speaker 1 (22:46):
I was twenty two coming out of the military.
Speaker 2 (22:49):
Twenty two, and you're in Iraq?
Speaker 1 (22:51):
Yeah. Actually, I turned nineteen in Afghanistan, and then I
turned twenty in Iraq. Sorry,
those were the exact years.
Speaker 2 (22:59):
You're in Afghanistan, and then you're told you're going to Iraq.
How are you feeling?
Speaker 1 (23:03):
You know, early on, like everybody, I drank the Kool-Aid.
There's even news articles out there of my hometown paper saying,
you know, terrorism's got to be dealt with. But it
was in Afghanistan that we were told that we were
going to Iraq, well before the general public was. And
then I got to live that firsthand, you know, knowing
that knowledge, coming back to the United States, seeing them
drum up the war effort for Iraq with the false
(23:23):
intelligence that we all know is false intelligence now. We
directly lied to the American people and murdered innocent
people in Iraq. Let's say it for what it was.
Speaker 2 (23:30):
Are you feeling this and doubting that as you are
in Iraq?
Speaker 1 (23:33):
You know, I remember a very specific conversation in the
emergency room of Medical City in Baghdad with a father
whose daughter's head was blown off, and he was like,
this is what's going to happen. He goes, you guys
came in here and we have let you do what
you're doing, and he's like, it's going to get worse,
and it's going to keep getting worse until you
guys leave, because we will never stop. And this is
(23:54):
what's happening, is you're killing innocent people like my daughter.
And guess what? Exactly what he described to me on
the first wave of that invasion is what I saw,
not only on my invasion, but every subsequent one after that,
as it just continuously got worse as one administration handed
it to the next, and things in that region of
the world just turned into absolute garbage. So personally, for
somebody like me, who, you know, stepped forward
(24:15):
and was fighting for the American government and then to
learn that they're just basically lying to the American people
so that them and their buddies have a blank check
to rip off American taxpayers, and then it's like, all right, well,
we should probably have a conversation about this as citizens
of our country, right because this is fucked up. I'm sorry,
I'm just going to say it for what it is, like,
this was a terrible time in American history where the
government was just running amok and citizens were genuinely upset
(24:39):
and concerned you know, and that's that's where we you know,
what I like to really focus on is the fact
of where these kind of things came from.
Speaker 2 (24:45):
It's fascinating to hear this. This is the story we don't
get to know, like what you're walking into, where you're
coming from, as you start to put together Loose Change.
I guess, so you have your experience in Iraq, which clearly
affects your point of view and your opinion towards the
American government, and clearly a lot of distrust in the
information you're getting. And did you see the internet the
way that Joan has kind of described it as a
(25:07):
place to find community, as a place to find porn?
Yeah, I guess, first of all, did you first
go and find porn and then, like, oh, I could
also use this as a place to find community and
or to put out information, seek out information? What was your
take on the Internet at that time?
Speaker 1 (25:23):
Similar. My take on the Internet is it's kind
of a cause and reaction that we always see throughout
human society as we continue to evolve. Right, information was
growing and things were happening, and so these things started
to go in one direction or the other. And really
the largest question here is: can human nature, can
humans, survive mass communication? Which is what we were really at
the beginning of here, at the beginning of the
(25:45):
internet. And so for me to just kind of
see all this different stuff was crazy. But for us
it was definitely a way to, what I would call,
weaponize information. We were able to use these new platforms
to get stuff out there in a way that was
never done before.
Speaker 2 (25:58):
So Dylan's a filmmaker, and even if the idea is,
let's create something narrative and successful in that sense, did
things shift and you saw yourselves as activists as
opposed to filmmakers at some point?
Speaker 1 (26:11):
Yeah, definitely. I mean, you know, we were given a
pretty big hat to wear. It wasn't something that we
asked for. We were young kids. Were we the best
messengers for that? Of course not. Dylan just made a
great video that was, you know, that caught on
with people. You know, people could receive it, or
they liked it, or, you know, whatever it was about
it, it was something new, and like she said, you know,
they felt like they were on the inside of information.
(26:32):
And so it grew exponentially, and, you know, there were,
you know, memes later on about, you know, college kids'
pickup lines being, have you seen Loose Change, and that
kind of thing. But it definitely morphed. Like, we were
talking about two very different eras of time. Here we're
talking about the creation of Loose Change at the birth
of the Internet, and then where we are today, which
is wildly different.
Speaker 2 (26:49):
Right, we'll get into some of the content of Loose
Change and also where we are today. Joan, I want
to bring you in here. Loose Change becomes, some say,
one of the first viral hits; something like one hundred
million people watched it or were affected by it. What was
it that made it go viral, from your perspective? Joan,
did we have even a concept of virality at the
(27:11):
time when this was launched in two thousand and six.
Speaker 3 (27:14):
No, Well, the things that used to go viral online
at that time were you know, still what goes viral
these days, which is pictures of animals, cats, you know,
funny memes, and you have to remember that, like video
is new at that stage, right, and so, but what
(27:35):
really was this groundswell of interest was small groups sharing
this link, getting involved in discussions about this film
and this documentary, and the community around it that were
(27:57):
also digging out different pieces of information and putting them,
putting this really big puzzle together on message boards where
people were communicating with one another and trying to add
to the story, right, And in that way, the early
Internet is highly participatory. And I think that one of
(28:20):
the things you don't get with the kind of conspiracy
that we would think of with JFK is the narratives
come down, but there's not a lot of ways in
which you can interact with the narrative. You can believe
it or not. But with nine eleven conspiracism, you
had this ongoing daily dialogue that you could participate in
(28:43):
and that you could add to. And so that community
building and even this idea that you were a truth
seeker rather than someone that was merely just you know,
consuming what the mainstream media was telling you. And you
were like this drone that was just living your life. Right.
(29:04):
You weren't going to look away. You were going to
look further and further and deeper and deeper into this.
And people were meeting each other, they were having you know, conventions,
they were making memes together and sharing them, and so
it was a highly participatory moment for the culture. And
because you thought that you were finding things that government
(29:26):
and other groups were keeping from you that really made
you want to dig in more and understand more. And
the military component I think is really important here, because
when people feel like they're being lied to and the
democracy is at stake, they're willing to do things that
(29:47):
they otherwise wouldn't have been willing to do. And so
at the same time, not just online, you have this
media that's traveling, you also have a fairly intense anti
war movement that is consuming this information and then bringing
(30:09):
it into the streets and trying as best as they
can to stop US imperialism.
Speaker 2 (30:18):
Corey, I know you don't think of yourself as a
conspiracy theorist, and that you have passionate views about right-
wing conspiracists like Alex Jones. What's the cleanest way to separate,
in your view, what the difference is between you and
someone like Alex Jones.
Speaker 1 (30:34):
Alex Jones is definitely someone who's turned this into a
money making operation. He's become very wealthy out of this,
and he's gotten himself into very high political places. I mean,
let's remember, and again this is something I really need
to harp on here because We've had a whole conversation
about conspiracy theories, and we need to talk about when
this really got out of control, because for a long time,
(30:54):
this nine eleven conspiracy stuff kind of really quieted down.
My life had moved on. People weren't talking about this anymore.
I wasn't getting nearly the messages that I still get
to this day until the candidacy of Donald Trump came around,
and that candidate, Donald Trump, utilized Alex Jones's platform
to promote himself and to align himself with this kind
(31:15):
of base of people, and then decided to use that
in his presidential career with the assistance of Fox News
to perpetuate these conspiracy theories on a level that's never
been seen before. Again, you're talking, we're talking about two
very different things here: two twenty-year-old kids who
made a college level movie and put it out for
free on the Internet, and then the president of the
United States utilizing Fox News to weaponize conspiracy theories to
(31:39):
ignite a base to try to overthrow the country. And
then now we're in this kind of post era, and
they used this, and they took this, and what's so
ironic about it is it's the same group of people that
hated us when we made this video, because we were
anti-war, we were leftists, we were liberals, we were
pacifists. I'm not into guns and
that kind of shit. And so now to have the
same people that hated us using this material to propagate
(32:02):
their own nonsense is kind of very interesting to me.
And furthermore, on Alex Jones, Like, you know, obviously we're
talking about him. He just got hit with about a
billion-dollar fine after you tie in legal fees and
all those different things, as he should, and so
let's really focus on what that is. That's the shooting,
and the fact that he was claiming that the people were
actors and all that nonsense, right. And so what's the
(32:23):
difference between those two events, between nine eleven and
Sandy Hook? Nine eleven was a response by family
members in an era when there was information that wasn't
being disseminated to the American public, and it was not
only conspiracy theorists who were interested in that information. The
American media was perpetuating nine eleven for decades afterwards,
with every little bit of new information that was coming out.
(32:43):
But back to you know, Sandy Hook. That kind of
conspiracy came up within a couple of months, and it
was generated on the Internet by people who were not
directly related to the event, which is very different than
the nine eleven situation where this took years to culminate.
And so for us, we were coming from a place
where we were trying to do what we believed was
honorable using the things that we had available to us
(33:05):
at the time, and we believed in what we were doing,
and we were trying to make it the most scholarly
piece of evidence that we could put out there, and
that's why we did so many revisions, and
that's why we kind of removed things, and we admitted
to our mistakes, and we consistently tried to just have
a conversation about it so that we could always get
a new investigation, and that was always our aim, and
the reason we wanted that new investigation was to support
the family members who also wanted that new investigation into
(33:28):
nine eleven, and they never got it.
Speaker 2 (33:30):
What is your relationship with it now? Knowing where
we're at, obviously we're in a very different place than
we were. Social media is very different now, and,
like, you're an older person, information has come out, there's
distrust across the board, and I know you guys have
revised the film, but there's even a cottage industry that's
(33:51):
sprung up to debunk theories that you guys were putting
out there as well, Like how do you see that
film currently?
Speaker 1 (33:59):
I mean, I'm the producer of that film and I
will be for the rest of my life. So my
job is to make sure that it doesn't disappear because
it's such an important piece of information that we need
to analyze and have a conversation about.
Speaker 2 (34:10):
And I also think, do you still have the same questions
about nine eleven that you had in that film?
Do you have those today?
Speaker 1 (34:17):
There's definitely, you know, there's a lot. That film was
put out twenty years ago, right? And during that time,
so much more information has come out from the United
States government with redacted documents and different things of that nature.
But there's still some major questions for me that need
to be answered.
Speaker 2 (34:30):
This brings up really a lot of interesting questions,
and it's a delicate conversation, I think. Corey, I can see,
I think you bring up something that I think a
lot of people on the left and the right are
grappling with right now: we should be skeptical of our
government and the institutions around us, and I think we're
(34:51):
looking for what that line is of what is healthy
skepticism and what is skepticism that is degrading faith in institutions.
I think there are critics of something like Loose Change
and some of the truther movement. There are
critics that live within victims' families who feel like this
(35:12):
takes the responsibility off of the people who perhaps perpetrated
the horror of nine eleven, and it puts disinformation
out there that erodes faith in institutions. But I'm
sure we should be more skeptical of the institutions and
the information that we have. I think there's an argument
too that if some people would argue that what you're
(35:34):
putting out there's misinformation, it's also in response to a
government that is putting out misinformation. You're fighting a war
in Iraq that is based on misinformation, which puts us
in this fucking place right now where it doesn't
feel like we're getting healthy, good information. Joan, I think
I look to you when it comes to theories where
is the healthy line? How do we show distrust in
(35:57):
uh positions of power without eroding distrust, eroding trust in
sort of our society.
Speaker 3 (36:05):
Well, what's interesting about government, or the state, is
I don't think there's anybody that's ever been really satisfied
with the state. I don't think that there's a utopia
anywhere where people are like, you know, who's doing a
good job our government? Right? Like, it's just not something
you hear, right, especially as we get into different issues.
(36:25):
But back in the early aughts, people were using, you know,
there was a familiar meme going around: Bush lied, people died, right?
And he had made these statements about, quote unquote, a
massive stockpile of biological weapons. Others had argued that,
you know, well we don't know if there were nuclear weapons,
(36:47):
but we're pretty sure, you know, And so there was
a lot of hedging back then about what to do
and how to do it. But when you say massive
stockpile and people are doubting that, governments
tend to double down on that information. We've seen that
meme repeated over and over. Obama lied, people died, you know,
(37:10):
Trump lied, people died. It keeps coming up, right? And
I think that as we imagine the role of government
in our lives and what governments should and could be
responsible for, we're at another crossroads right now with the
role of NATO in the Ukrainian and Russian war going on.
(37:32):
And is it the fact that NATO is fighting a
proxy war with you know, with Ukraine suffering all of
the serious, serious casualties, And so I think that it's
important for people to be skeptical of governments and very
powerful people making these decisions when it comes to massive casualties.
(37:58):
Now, does that mean we should just throw our
arms in the air and say everything is endlessly corrupt
and there's nothing we can do? Because I do, at
the end of the day, and I think maybe Corey
agrees with me. I do believe in the power of
people and the power of people to come together to
formulate their own ideas, to dig in and look at
(38:22):
what kind of evidence is out there. And we do
need to have more facts and public interest information circulate
throughout our society. And the last point I'll make on this,
which is to say, I think we need a lot
more journalism. I think we need a lot more investigation.
I think when it comes to who's going to hold
(38:42):
these people accountable, it's going to be journalists who are
going to be able to get the goods. I don't
think we can rely on law enforcement and those other
kinds of institutions to get to the bottom of corrupt governments.
It just doesn't really seem to be doing the job.
Journalists have always played this role of digging in, finding
(39:07):
and piecing together different bits of information and creating that narrative.
And so in many ways, Corey and those that made
Loose Change weren't necessarily your traditional-style journalists, but they
are. They are the archetype of this early form
(39:29):
of digital journalism where people were doing more than asking questions,
but really trying to make media to mobilize audiences and
to get people to think differently. And hopefully what it
does is it instills in people a skeptical attitude about
how do you critique and understand information, how do you
(39:50):
piece it together? And then further than that, how do
you hold accountable people in power that are telling massive lies?
And I think that's where the big question about studying
disinformation comes in right now, because we don't necessarily
know who's going to hold the very, very...
Speaker 2 (40:11):
Rich and.
Speaker 3 (40:14):
Powerful to account for spreading lies at scale. I think
the most recent example of that is trying to understand
who is responsible for the January sixth insurrection and what
does that mean to hold someone responsible for an event
(40:34):
like that?
Speaker 2 (40:36):
Corey, when you look at the information on the Internet, who
should we trust to ask these questions? And what information
should we be trusting on the Internet?
Speaker 1 (40:46):
I think we're in such a grey zone right now
that we don't have an answer to that, And I
think what we need to kind of come to terms
with is the fact that we as a society won't
have an answer to that. But I think, and this
is my idea, this is my solution, This is where
the line is for me, is that we need to
educate our children better right from the start. If you
ask any kid in America right now what were Columbus's
three ships, they'll tell you, right? And so we know
(41:07):
how to teach our children good information. We know how
to teach our children information. We just
need to make that good information. And so I think
we need to kind of just accept the fact that
where we are right now is kind of where
we are, and of course we need to engineer
that as best we can. But we need to do
our research on how this misinformation is affecting us and
how it's driving us and the things like Joan is
(41:27):
doing and trying to grapple with, and then figure out
a way that we can instill that information into our
children early on, so that they grow up with the
right tools to be able to discern good information from bad.
And I think that's a solution. Of course, it's not perfect,
but it at least pushes us in the right direction.
And it's very much like what Joan said: there's no utopia.
Humans aren't perfect and we never will be, and so
(41:48):
we need to just kind of keep working towards something
better and leave it better than we found it. And
so in this instance, with this new digital age, we
have created this new weapon of mass communication, and we
need to figure out how it really adjusts to
humans and how we can use it as a benefit
instead of what we've created, which is this kind of
individualistic society where everybody thinks they're the center of the universe,
(42:09):
and figure it out and retool it into something that's
more beneficial for society, you know, like, how
are the societies of the twenty-one hundreds going
to be using the internet? Can we envision that? Can
we envision how they transmit information, good information, factual information,
and try to reverse engineer that for our own society
and start to implement those rules so that we can
get to that place for the next generation. Because as
(42:30):
I see it right now, our current generations, we just
got to let us go. We're done, Like we don't
even have a chance.
Speaker 3 (42:35):
Oh, come on, come on, Okay, I don't know.
Speaker 1 (42:38):
It's a very optimistic point of view. I've been watching
Jordan's pieces, and, like, what I see out there
is, like, scary, so, like, I'm not sure there's ever any
coming back from that. Right? And we think Trump was
so bad, wait till the next one comes down. Because
when I was in the army, the one thing is I
always had a new first sergeant, like, every eight months,
and I was hoping that the next first sergeant would
be better than the last one. And he was never better.
(42:58):
He was always worse, always with new rules and restrictions
and and so I mean, I hope you know I
used to be optimistic, like ten years ago, right, we
all used to be optimistic, but then the last ten
years happened and now we're a lot more pessimistic.
Speaker 2 (43:11):
We'll put a bow on this, and, I want to,
we're going to talk about a couple other things, but
kind of the final question for both of you within
this segment here, what do you think the legacy of
Loose Change is? Corey?
Speaker 1 (43:26):
I think Loose Change is the first viral video of
the Internet. It's the only documentary that people are still
talking about now, all these years later. There's a lot
of different stuff that comes out, and I'm proud of that fact.
Like, I helped make a piece of media that is
truly just, it's going to live past me probably,
and that's cool. And what I think it's turned into
is the digital version of a banned book, and, you
know, the statement goes,
(43:47):
any banned book is worth reading, right? And so
again I think that Loose Change needs to exist on
the Internet so that we can have the conversations about it.
Are humans going to continuously use pieces of information like
Loose Change or anything else to, you know, push their
own views? Of course they are. That's human nature, and
it doesn't matter if it's Loose Change or Zeitgeist or
something they saw on Fox News. They're going to use
(44:09):
whatever they need to use to propagate their point of view.
But I like to look back at Loose Change as
the culmination of an amazing series of events that nobody
could have seen coming, and it really did rock the world,
like it still goes on to this day. And what's
super interesting about it is how it's morphed throughout these
years, and, to me, to see, you know, how it's
been used incorrectly by other people, especially American presidents
(44:30):
and Alex Jones and these different people. We need to
latch onto that and not be afraid of it. We
need to understand why it's happening and do the studies
that she's talking about so that we can understand why
these things happen and then again equip our children to
be able to deal with them better. I was not
trained for the society that I was pushed into. Right
in high school, I was like, Hey, we're gonna do this.
(44:51):
Niña, Pinta, Santa Maria, it's Columbus Day. And then, you know, on
my eighteenth birthday, essentially, I'm invading Afghanistan. And then on
my nineteenth birthday, I'm invading, you know, Iraq, and I
get to live firsthand at this early stage in
my life seeing American, you know, foreign policy just
as horrible as it really is. And I mean imagine
the psychological like just breakdown that I went through as
(45:12):
a human being, trying to understand that everything that you
were raised to believe in is an utter lie, like,
and that it is just a complete facade, and the thing
that you thought you were believing in is
long, long gone.
Speaker 2 (45:22):
Joan, what do you see the legacy of Loose Change
as?
Speaker 3 (45:27):
I think, you know, I think about it in a broader sense,
in that it wasn't just the video and the evidence presented
in it, but it's part of a moment where, you know, Corey,
I appreciate you talking about how it was translated into
many different languages. People felt that they could pick it
up and take elements of it, translate elements of it,
(45:51):
and make it their own. And it really shows us
how this kind of participatory internet culture was going to develop,
was that people were going to take information, they were
going to remix it in many ways. And you know,
no shade, Corey, but we don't even remember the authors
(46:13):
of it, right? Like, it's anonymous in that sense; it
becomes a piece of the culture, and, you know, clips
of it, people I'm sure will remember, and memes that
come out of it are definitely something that have lived on.
But by and large it was you know, born of
the Internet, and then created and became the infrastructure and
(46:36):
the content on which many different kinds of communities based
their worldviews. And I think that when you come into
contact with those ideas very early on as you're
making your identity, and you know, I'm sure at eighteen
other people in your life were either going off to
college or starting new businesses or you know, not going
(47:02):
to war. But it was, you know, it's a really
unique time in American culture with the technological shifts that
people were grappling with and the uncertainty. The
thing that nine eleven itself introduces to the American
psyche is that it can happen here, that the war
(47:25):
can be brought home, and as a result, you get
this paranoia in society about the other and about being attacked,
and you don't feel as if you have protection and
security from the government, and so finding one another and
using information and building knowledge together becomes an incredibly powerful
(47:52):
mode of solidarity. And I think that, you know, as
the Internet has progressed and things have changed, those groups
of people, people that found each other in those moments
after nine eleven that were sharing these kinds of theories,
continue to be in community with one another and continue
(48:15):
to be critical of the state. And the last thing
I'll add about this moment, especially around conspiracy, is sometimes
communities have to use conspiracy as a way to protect
themselves from governments and government overreach. It's not uncommon for
(48:38):
if you take a situation like Flint, where people were
saying there's something wrong with the water, there's something going on,
and people were really dismissive at the beginning of the
Flint water crisis because people hadn't really learned how to
do science and to build science around the pollution in Flint.
And so sometimes rumors and conspiracy can help communities come
(49:03):
together and focus on a problem. And sometimes it's true.
And I think that elements of what came out of
Loose Change, or out of that moment, that we would
have called conspiracy end up challenging power and becoming an
important way in which we resist tyranny and authoritarianism.
Speaker 2 (49:28):
Well, we need to take a quick break, but when
we come back, I want to dive into how social
media companies are dealing with disinformation in twenty twenty two,
or if they even are at all. Welcome back, everybody. Joan,
If posting a conspiracy theory on YouTube is media manipulation,
is the company that lets it remain posted participating in
(49:51):
that manipulation.
Speaker 3 (49:53):
Well, it's a good question. Right now, legally, the answer
is no, although there is an interesting case that's being
picked up by the Supreme Court where there was some
terrorism content that was posted on YouTube and the terrorists
made money off of it because it was monetized, and
(50:15):
so now the Supreme Court is trying to figure out
if YouTube was funding terrorism essentially, and so that is
a very unique thing though, but by and large, companies
get a big pass on their products being used to
spread conspiracy. It's only been since about twenty eighteen that
(50:39):
companies have decided that they're going to enforce terms
of service around lies and disinformation. I think twenty
eighteen was the first time we saw Infowars and
Alex Jones get deplatformed. He's probably one of the most
famous people that have been moved off of these platforms,
(51:00):
and that had a lot to do with public pressure
by activists and advertisers to ensure that the information that
was being provided on these platforms, even if it was entertainment,
was not defamatory, libelous, hate, harassment, or incitement.
Speaker 1 (51:19):
The question always becomes, you know, where is that line?
And if you have to censor him for this, then
you have to censor this person for this, and before
long no one can say anything. And I mean I've
dealt with this personally as well. Like, Loose Change lived
on YouTube for years. It has hundreds of millions of views;
so many people have put it up. I had it
on my own channel just because I needed a place
to park it for free so that people could see it,
analyze it, have conversations about it, what have you. And
(51:40):
of course over the years people have complained to YouTube
about it, and they would send me warnings about it
and things of that nature. But one day, essentially right
around twenty eighteen, I just got an
email from YouTube, and it was like, we've taken down
Loose Change for hate speech. You can't fight
against this. This is just something we're going to do.
And of course I write back, like, what
exactly is the hate speech within Loose Change? Because there's
nothing in Loose Change that's trying to incite a riot,
(52:03):
there's nothing in it that's defamatory towards anybody, and it's
just a piece of information after information that
we're putting forward. And so YouTube has the ability as
a content provider to not allow me to put my
video on them, and that's their business, and that
I understand. I think that's a great line for companies
to have the ability to shut those things off. I
think it was amazing that Twitter was able to turn
(52:23):
off Donald Trump, right, and I hope he never comes back.
But at the same time, these are all crazy people.
They're going to keep talking. It's our responsibility not to
listen to them. You know, I'm driving through Amsterdam,
New York, the other day, and there's a guy at the
bus station just yelling at everybody that drives by. If
I stop and listen to him and start broadcasting him
on national television, well that's more on me and the
(52:44):
people watching than the person who's yelling at the bus station.
And so that's my point: we need to be able
to live our life. We need to be a free
person in the world, not because we're Americans, but
because we're humans. And you have the right to live
your life. And as long as you don't hurt another
person physically, alter, you know, change their life in
any way, then you should have the ability to live
your life however you want. And we're seeing that pushback
(53:05):
between regulation, the state, and people who want to live
their life and do their own thing and, you know,
self-identify as a cat.
Speaker 2 (53:11):
Well, but we're in a tricky spot though right now,
right? Like, you keep talking about Loose Change or all
these things as pieces of information, and you're right, we
should be able to have access to information, to have
conversations around information. I'd love to live in a society
that can have complicated, thoughtful conversations that can be extended
(53:32):
and interesting. Sadly, it doesn't feel like we're in that
society very often. But putting something controversial on an online
space might not just be information anymore. I mean, it
is an act that incites distrust, and it's an act
that could incite excitement and interest and curiosity for sure,
(53:53):
but I don't know if it is neutral anymore. And
so is it a cop-out to say it's just information?
Alex Jones can put that out there, it's just information.
Like, this information has a reaction and causes a reaction.
Speaker 1 (54:07):
It is. And people should be held accountable when that information takes
things to the next level. And again, why was I
never invited on The Daily Show before Donald Trump? Right?
Even when Loose Change was at its heyday, like, you
guys wouldn't even talk about it. And now twenty years later,
post Donald Trump, we're having these conversations not because a
DVD was made twenty years ago, but because a president
used the national platform to propagate lies to the American people,
(54:30):
which caused them to try to overthrow the United States Capitol.
And every single person that was there should be held
accountable and they should be put in jail, and the
president should be held accountable, and we should learn from
that as a country and as a society. And that's
the line, right, because if you go over the line, you
start to hurt other people, you take away their freedoms,
you're impeding them from living their free life. That's where
(54:50):
the line is. And we were never there before. We
were never having those conversations; it wasn't even part of it.
Now, post Donald Trump, we live in a world where
we have to deal with all this craziness. And it was
there because corporations wanted to make money, because politicians want
to be reelected. And exactly like you highlighted in your last piece,
how many people that are running for office right now
(55:11):
believe that the election was stolen, right? And it's a
ridiculous amount of them. And that's not because of Loose Change.
That's because of a president who used Fox News to
propagate lies to the American people. And this is a
trend throughout current American history and new media, where these
administrations are using media to lie to the American people,
to propagandize for their own profit and personal gain, and
(55:33):
then they just get to retire and go do whatever
they want to do. And so of course people are
starting to get pissed. And so, yes, it is information.
Yes, it does stir stuff up. But I wasn't into
conspiracy theories before Loose Change, and I'm not into conspiracy theories
afterwards, because I don't believe it's conspiracy theory. I believe
these are things that we actually need to talk about,
that these are things that are actually happening in our country.
And as Joan just supported me on, we know
(55:55):
that the American government was lying to us about the
war in Iraq, and no one's been held accountable. So
where does that line go? Again, if people are hurt
or people are killed and their freedoms are impeded in
any way, then somebody has to be held accountable for that.
But people having conversations and discussing free information, we can't
limit that, otherwise nobody gets to say anything.
Speaker 3 (56:13):
Yeah, I think, Corey, one of the things in this
that I think a lot about is that the
scale is different. So social media introduces a different relationship
between free speech and audiences, or listening. Right? There's no
obligation to listen. There's also no right to broadcast. There's
(56:35):
no right, in that sense, to
reach eighty million people. We don't have that. We actually have
laws against using broadcast to incite things. And
so for me, you know, Alex Jones isn't necessarily just
(56:58):
having conversations; he's moving between that and mobilizing audiences,
and he...
Speaker 1 (57:05):
Was held accountable, right? He crossed the line, and so now
we have a consequence to that, which is exactly the way it should be.
Speaker 3 (57:12):
And I wonder if that consequence is actually, you know,
reflective of how out of scale, or out of
touch with reality, the Internet and social media companies have become.
Like, fining someone a billion dollars almost seems comical.
But when it comes to the scale of the Internet,
more is different. It's different when millions of people are
(57:35):
doing a thing versus even a regional radio station. And
we've never had broadcast rules attached to the Internet in
the same way that we have broadcast rules for television
and radio. And so, you know, what I would
love to see is us moving more towards accountability for
(57:55):
people that have access to and are broadcasting to larger audiences.
So maybe it's the case that if someone's, you know,
talking to their twenty-five friends on a
Discord server, maybe that's not something we need to bother with.
But when somebody is reaching a million people and they
have these calls to action, and they are, especially in
(58:17):
the case of profiting from political oppression or profiting from
lies and disinformation, then we should have some new regulations
to ensure that they're not able to hurt people, and
so I think ultimately, until we understand the scale question
and how more is different, we're not going to be
(58:39):
able to completely address, well, what does free speech mean
in the context of the Internet, especially when I could
just say your name and say you did this dastardly thing,
and there's really no retraction, there's no way to...
Speaker 1 (58:56):
Yeah, but it's gotten so much worse than that, right? Like,
we're way past that at this point too, because
now you can have a kid walk down the street
with an AR-15, shoot and kill people, and you have
half the country that supports that person, and you have
political candidates and news stations who then fight for that
person. And what's even worse is, again, it's not just
about groups on the Internet.
(59:17):
Now we have CNN and Fox, where no matter what
the question is, it's going to be a debate from
one side or the other. And it is sickening no
matter what side of the conversation you're on, right? If
you're a conservative and you're watching Fox News and you
see CNN, you're like, oh my god, this is just
absolutely ridiculous. But if you're a liberal and you're watching Fox,
you're in the same position.
Speaker 3 (59:34):
There's a really interesting book one of my team members
wrote called Network Propaganda, and it's about these media ecosystems
and how the media has developed over the last twenty years,
but particularly looking at the twenty sixteen election, and the
right-wing media ecosystem is very different from left and
center media. And what's interesting about the right-wing media
(59:58):
ecosystem is how quickly they will coalesce around
a story and a narrative, and if the facts don't fit,
it's party over the news, right? You've got to toe
the party line. And you know, this isn't in the book,
but there's the controversy around Dominion voting machines, and how if
you said negative things about Dominion on television, Dominion is
(01:00:22):
able to sue you. If you're saying negative things about
Dominion on the Internet, it's going to be decided by
the courts. And I think that's the moment where we
start to realize that these media companies are constricted in
some ways by these different mediums, and the regulatory systems
(01:00:42):
around those mediums eventually are going to be tested in
the courts. And you know, when it comes to left
and center media, they do not have the same kind
of infrastructure online. They don't have the same kind of motivated
audiences to spread and distribute the news as
(01:01:06):
the right does. The right has an incredible distribution muscle
through Facebook and Twitter and YouTube. And so we're going
to see over time how these different media ecosystems interact.
But I don't know, you know,
I'm a big joker. I get it: Clinton News Network, MSDNC,
(01:01:30):
I'm with you, I'm with you, you know. And I
don't know if cable news is really going to survive
the Internet era. But what we're dealing with is a
question of, well, do we want news or do we
want partisan politics that looks like news? Right? And some
of this, I know, I can tell, Corey, for
(01:01:52):
you it comes down to, well, who's getting paid out?
And, you know, I agree with you, we should
follow the money, always follow the money. But also I
think the light for me, or the optimism, comes in
where the Internet is a huge international project and we
(01:02:13):
could reimagine some technology, some design, so that we have
room for news, we have room for fact-based discussion.
Right now, what we have is social media,
which is essentially trying to monetize any bit of information
(01:02:34):
that it can and it's not designed specifically to spread
public interest information. And I think that that's where we
get into a lot of our problems because we used
to rely much more on traditional media to get information
out there, and now the gates have shifted, and I wonder,
(01:02:55):
you know, at the end of the day, are we
going to be able to depend on Elon Musk and
Mark Zuckerberg? And, you know, Kanye is buying Parler, we've
got Trump with Truth Social. Are we going to be
able to trust social networks to get this public information
out there? And if not, what do we build, right?
(01:03:19):
And how do we get there? And that's those for
me are the big questions moving forward.
Speaker 2 (01:03:23):
Well, you know, I want to ask one final
question in that world, because I know there's disagreements here,
but it sounds like we want a similar thing, which
is to have free-flowing information and conversations. The question
is, where do those conversations live? And what you just described,
Joan, is a sloppy social media system that doesn't know how
(01:03:48):
to manage disinformation. Twitter is now being run by Elon Musk,
who is throwing stuff willy-nilly at
the wall. Where are we supposed to have these conversations? One,
what can these platforms do? Or is there a platform
where this type of healthy discourse can live, or
(01:04:11):
are we just screwed? Corey, what do you think?
Speaker 1 (01:04:15):
Well, again, I mean, this stuff's always been around, right?
It's just more visible now. The KKK existed before the Internet,
and they had their little meetings and they put on
their costumes and they did all those different things. For me,
this is an issue with information and the way
it's broadcast throughout American society: we just broadcast hypotheticals.
Like, we'll broadcast information about a case before we know
all the facts, and it's immediate, like, you know,
(01:04:35):
here's a car chase, we've got to cover it. And
so we've gotten away from fact-based journalism;
we're just broadcasting whatever we can
to keep people's attention.
Speaker 2 (01:04:43):
What do you trust? Where do you go when you're
looking for information?
Speaker 1 (01:04:46):
I don't trust anyone. Like, I've blocked every major news
application because I just can't handle it. It's all nonsense.
Speaker 2 (01:04:53):
So what's on your phone? What do you click? You wake
up, and what do you click?
Speaker 1 (01:04:56):
I read stuff about cameras. I read stuff about New
York State legalization. I'm into just different articles. I let
the Google News feed give me stuff that's tailored to my interests,
and I block anything about Biden or Trump because I
just can't handle it. I think if you support a
politician at this point, it's basically the same as supporting
a football team; they're just there
so you can buy a jersey. And so for me...
what was the question? I don't know if I
(01:05:19):
got lost again. That's what it was, sorry. So we're
broadcasting hypotheticals, right, and the question is how do we
fix this?
Speaker 3 (01:05:27):
Right?
Speaker 1 (01:05:28):
So I think the conversations can exist online, because
even if they don't exist online, like I was saying
with the KKK, they'll have their little meetings anyway. But
it's up to the mainstream media to really grow a
backbone here, and again, it's part of the conversation we
need to have about how we evolve mass communication. We
need to trust the media again, and I think that's
one of the major problems in America and the world
(01:05:49):
right now: people don't trust the media, and
that's because they're reporting on hypotheticals. They're going to report
for a political base, and there's no true information that
you can follow anymore, where you would normally just clock
into the six p.m. NBC News and get the world report.
You can't do that anymore without hypocrisy. And again,
like I said at the beginning, the hypocrisy is more visible.
People are upset because they know the government's been lying
(01:06:11):
to us, and it's been proven at this point, over major
things, for at least two decades now in my life.
And so, you know, how do we hold people accountable?
How do we adjust this? And again, like Joan said,
let's focus on changing some regulations. Let's focus on, you know,
putting information where it belongs. And like I said, let's
focus on envisioning how future societies communicate accurately, and
(01:06:32):
let's try to reverse engineer that for our society and
start to build those building blocks.
Speaker 2 (01:06:36):
But do you feel the same responsibility, as
somebody who put information out there, as the mainstream networks do?
Speaker 1 (01:06:43):
Do I? Absolutely no responsibility over anything. No, I mean,
I'm living my own life. I'm living my own life.
If you want to do your thing, go do your thing.
If you want to make videos, because, again, everybody makes videos.
And what we're really talking about is a technological evolution
where people are able to carry a camera and disseminate
information online. And guess what, it's in everybody's hands right now.
We have all of human knowledge in our pocket. We
have a camera that can broadcast to everyone in the
world at the same time. And what do we do
(01:07:05):
with it as a society? And we're seeing it: we're
not doing the right thing. We're not growing as a society.
We're making things worse. So I don't know, but...
Speaker 2 (01:07:12):
You have a clear distrust for the media ecosystem, but
you yourself are a part of that.
Speaker 1 (01:07:18):
I mean, twenty years ago, I made a DVD. I
don't post on Facebook. I'm not out there promoting Loose Change.
I don't talk about it unless somebody reaches out to
me to ask about it. And I only do major
news at this point, because the littler guys are
normally tailored in one conspiratorial direction or the other, to
the right or the left. And so I like to
have real conversations with people like yourself, so we can
have a real conversation about this stuff and kind of
(01:07:39):
push it in a direction so that people understand it.
I've seen too much lazy journalism where they're just like,
Loose Change is responsible for all the disinformation on the Internet.
That is the laziest thing you could do.
You're not digging into the conversation at all. You're not
even looking at the information. You're just trying to get clicks.
And that's where we are with reporting right now: we're
just trying to get clicks. And you yourself know that
you have to do crazy things. You have to go
to Trump rallies and ask people insane questions that I
(01:08:01):
would be terrified to ask them in person.
Speaker 2 (01:08:03):
Don't blow up my spot, Corey, don't blow up my spot.
Speaker 1 (01:08:06):
It's crazy. You're a crazy person, man. But I love
you and I really support everything you're doing too. And
I just want to say I love The Daily Show.
And Jon Stewart was my fucking hero and still is.
As a veteran, to have him standing up for my rights
when no other political candidate is, like, I would vote
for him if he ran. That's the only person I'm
interested in.
Speaker 2 (01:08:21):
I tell you, I think he'd have some backers, for sure.
Joan, if we can't trust the people running these platforms,
how are we supposed to trust and use these platforms?
Speaker 3 (01:08:33):
Yeah, I think you know, it's up to us to
work together. I think journalists have a huge role to play.
Journalism organizations have a huge role to play outside of
news media and corporations. I think journalists still, like academics,
have a passion for the truth, right? And I think
that we are truth seekers, and I think that that's
(01:08:56):
an important thing to hold on to in a time
when people feel like there's no anchor, that there's no
truth out there that we can access. And in some
ways I think that a post-truth society really
favors authoritarians. It really favors those who are willing to
(01:09:19):
lie to us at scale and depress us because we
then deactivate, We then step aside and walk away from
the responsibilities that we have to one another. So when
it comes to someone like Elon Musk, you know, he's
not your typical homo economicus, rational actor. He didn't buy Twitter
(01:09:45):
to make money, right? He spent forty-four billion for
a product that he probably could have built on his
own for less than a billion dollars. But what he
was buying were the networks that we're all a part of.
He was buying the networks of journalists, he was buying
the networks of politicians. He essentially bought the chessboard that
(01:10:08):
global politics is being played on at this stage. There's
really not a lot of ways in which anyone else
could have that kind of influence other than being the
owner of a large platform. And I think that Musk's
political aspirations in terms of being part of the global
(01:10:32):
conversation about the war in Ukraine, what's going on in
Taiwan, at the end of the day, are also being
driven by his business decisions around selling cars and which
markets are going to buy these cars.
And he is going to be able to, you know,
(01:10:57):
gain some kind of political favor with different governments if
he uses Twitter in that way. And so I think
that there's a very big risk to allowing our communication
commons to be owned by single individuals that don't have
the public interest at the core, especially when it comes
(01:11:19):
to communication. You guys are old enough to remember long-distance
calling. You know, if you wanted to call three towns
over, it was going to cost you twenty-five cents
a minute. Now we have a remarkable new innovation
here where we can call across the world. I'm calling
you from Ireland right now. I mean, we can call
across the world for free and reach our family, reach
(01:11:44):
our friends, reach our collaborators, colleagues. And that's something I
don't want to lose in this moment when we're going
to see this massive shake-up around what social media is,
how much platforms cost, and eventually how these networks are
(01:12:06):
going to change our society, especially our politics. And so
I think the time has come, if we are going
to fix this, for regulation around truth in advertising and knowing
your customers. Political advertising online needs to have much more oversight.
We do need to know exactly how much money these
(01:12:29):
platform companies are making and where it's going, how much
they're investing in content moderation, and whether they can actually
enforce their terms of service. And as we move into understanding social
media as an industry, I think we can start to
fashion a public interest Internet that will provide the kinds
(01:12:52):
of information and forums that people need in order to
participate in elections and to participate in our political systems.
But right now we're at a very very early stage,
and it's going to take a lot of work to
(01:13:13):
build those institutions.
Speaker 2 (01:13:17):
Follow the money, follow the pornography. We'll get there.
Speaker 3 (01:13:22):
And follow me on Twitter.
Speaker 2 (01:13:25):
Oh self promotion.
Speaker 1 (01:13:26):
Yeah, don't follow me. Don't look for me. I don't
want you.
Speaker 3 (01:13:28):
Don't look for me. I'm not here.
Speaker 2 (01:13:31):
I was gonna say, Corey, I can't imagine you're big
on the TikTok, no?
Speaker 1 (01:13:34):
I watched it for like a week, and then I
got tired. I've got an Instagram account with sixteen followers.
I don't do anything anyway.
I own a business. I make videos. That's my life.
Speaker 2 (01:13:45):
Well, Joan Donovan, Corey Rowe, thank you guys for a great conversation,
healthy skepticism, trust, and blowing all that shit up. I
love it. Listen to Jordan Klepper Fingers the Conspiracy from
The Daily Show on Apple Podcasts, the iHeartRadio app, or wherever
you get your podcasts.
Speaker 1 (01:14:02):
Explore more shows from the Daily Show podcast universe by
searching The Daily Show wherever you get your podcasts.
Speaker 2 (01:14:09):
Watch The Daily Show weeknights at eleven, ten Central on
Comedy Central, and stream full episodes anytime on Paramount Plus.
Speaker 1 (01:14:19):
This has been a Comedy Central podcast.