Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
You're listening to Comedy Central.
Speaker 2 (00:07):
From the most trusted journalist at Comedy Central. It's America's only source for news. This is The Daily Show with your host, Jon Stewart!
Speaker 1 (00:46):
We got one for you tonight. I'm not even playing, we got one for you tonight. Bill Adair, the founder of PolitiFact, he'll be here on the program. But first, you may have noticed it's October, the month we named for the Roman goddess Optimum. Obviously, you may not believe
(01:08):
it's October because the Mets are still playing. Basically, God, I love them so much. Now, in election years, October is when everyone's on alert for an October surprise. It's what they call it when major unexpected news alters the race
(01:30):
in the home stretch of the campaign, like the Access Hollywood tape, which, if you remember, destroyed Trump's chances of being president. Whatever happened to him? But here we are again in October. So a major port strike could make
(01:52):
for an October surprise.
Speaker 3 (01:54):
Could the infamous October surprise in this year's election actually be coming from overseas?
Speaker 1 (01:58):
Hurricane Helene affecting at least two battleground states. This to me might be the October surprise. A spike at
Speaker 4 (02:05):
the pump, the October surprise that no one wants.
Speaker 1 (02:09):
Why are our October surprises always so shitty? Why do we never get a good October surprise, an October surprise that brings our country together? Oh, ladies and gentlemen, no one saw this October surprise coming this close to the election. But Pesto and Moo Deng are dating! Whoo! It's an
(02:34):
October surprise. By the way, that picture is to scale.
That penguin is the size of a hippopotamus. Not shaming,
(02:55):
just saying. But it's a period of the campaign where, no matter what happens, it's going to be analyzed through its effect on the election, no matter how tactless that may seem.
Speaker 4 (03:09):
October has now started out very good for Republicans: the debate, chaos in the Middle East, the port strike, and of course the cleanup in North Carolina. This is something, obviously, in October. If this continues, that's going to bode well for Republicans.
Speaker 1 (03:28):
Oh, if monkeypox runs amok, I don't see how we lose. What does it actually say about a party that a war, a strike, and a natural disaster work in their favor? Sir, the election's close, but if we could just get the population shell-shocked and desperate, we
(03:50):
can do it. Of course, most people would say these world events happening close to the election are not related or intentional. Most people would say that. Some people...
Speaker 5 (04:08):
Congressperson Marjorie Taylor Greene.
Speaker 6 (04:11):
She posted a map on X showing areas affected by Hurricane Helene with an overlay of an electoral map, saying it shows how hurricane devastation could affect the election. An hour later, she posted this, and I quote: yes, they can control the weather. It is ridiculous for anyone to lie and say it cannot be done.
Speaker 1 (04:34):
Is this the space laser thing again? Jews don't control the weather. If we could control the weather, don't you think we'd make Florida less humid? We retire there! Here, if you think that you can, do something about this: my
(04:57):
balls are stuck to my thigh. But there was an October surprise this weekend that I did not see coming, and that surprise is this: Elon Musk has ups.
Speaker 2 (05:11):
This rocket company is the only reason we can now send American astronauts into space.
Speaker 7 (05:18):
Come here, take over, Elon. Yes, take over, but...
Speaker 1 (05:26):
He's exactly like a guy who won a radio contest. No! I can't believe I won the washer dryer! The world's richest man and one of the most popular people on social media. He's got two hundred million followers
(05:49):
completely organically, on his platform, you know, because of how interesting his tweets are, things like "hmmm" and "interesting" and "FEMA is shutting down airspace to stop people from bringing help." Yeah, he tweets that. Anyway, his October surprise is
(06:12):
he's come out MAGA. Hi, everyone!
Speaker 5 (06:16):
As you can see, I'm not just MAGA.
Speaker 1 (06:18):
I'm dark MAGA. Dark MAGA? I didn't know it came in flavors. I wonder if for the holidays they'll come out with a peppermint bark MAGA or pumpkin spice MAGA. I like my MAGA like I like my coffee: filled
(06:38):
with chemicals that trick your taste buds into thinking you're drinking autumnal food. I don't know what my accent was.
Now, you might think one of the world's richest men controlling one of the world's most influential platforms could be a recipe for what some may consider election interference. Stupid, stupid
(06:59):
people. You disgust me. Election interference is what Mark Zuckerberg did.
Speaker 8 (07:07):
Former President Trump alleging Facebook CEO Mark Zuckerberg will try
to unlawfully influence the twenty twenty four election, writing if
he does anything illegal this time, he will spend the
rest of his life in prison.
Speaker 1 (07:20):
That's why he'll be in prison. Not for falsely promising me a beautiful life in the metaverse. Oh, JS69420...
Speaker 9 (07:37):
The life we could have led.
Speaker 1 (07:41):
Now, Zuckerberg did give four hundred million dollars to organizations for voting infrastructure during the pandemic, and a good portion of that money did go to Democratic precincts. And Donald Trump did lose the election. So... election interference, but not illegal. And obviously Musk isn't going to do anything like that.
(08:02):
Elon Musk is offering hourly pay to anyone willing to encourage people in swing states to register to vote. He stated, quote, for every person you refer who is a swing-state voter, you get forty-seven dollars. Easy money.
Speaker 10 (08:17):
Oh shit, he's giving everybody forty-seven million dollars! When Trump finds out... when I think of the prison Donald Trump is going to send that sweet jumping bean of a genius to...
Speaker 1 (08:28):
It kills me. When Trump finds out about this, it's
not going to be pretty.
Speaker 10 (08:34):
How good a guy is Elon Musk, right, you all? Bless us.
Speaker 1 (08:40):
How good? What the fuck?
Speaker 9 (08:44):
Wait?
Speaker 1 (08:44):
That's not election interference, because he's for you? Well, what else do you think is election interference?
Speaker 8 (08:51):
The Donald Trump biopic The Apprentice does not always portray Donald
Speaker 11 (08:55):
Trump in a flattering light, and the Trump campaign threatened
to sue its filmmakers.
Speaker 10 (08:59):
Calling the film pure fiction and election interference.
Speaker 1 (09:03):
Oh, come on. That's election interference? Maybe it's election in-a-ference. But you've got to be a little bit flattered that you're being played by Sebastian Stan. I mean... oh, Sebastian, if you are the Winter Soldier, why is it suddenly
(09:25):
so warm in here? I look like Sebastian Stan if you were to put his face through one of those filters on TikTok that show your appearance right before you die.
(09:47):
It's... yeah, you can applaud that. That's fine.
Speaker 12 (09:52):
I'm not.
Speaker 1 (10:00):
I know what I look like. But what about Big Tech, Donald Trump? Surely they're not sitting this out.
Speaker 11 (10:08):
Critics accused Big Tech of election interference, as Amazon's Alexa
gave favorable reasons to vote for Kamala Harris but not
for Donald Trump.
Speaker 9 (10:16):
Alexa, why should I vote for Donald Trump?
Speaker 11 (10:19):
I cannot provide content that promotes a specific political party
or a specific candidate.
Speaker 13 (10:25):
Alexa, why should I vote for Kamala Harris?
Speaker 11 (10:28):
Well, there are many reasons to vote for Kamala Harris.
The most significant may be that she is a woman
of color who has overcome numerous obstacles to become a
leader in her field.
Speaker 1 (10:37):
Okay, I'll give you that one.
Speaker 7 (10:40):
That one.
Speaker 1 (10:43):
No, that is tough. That is tough. I'm not sure
Alexa is really influential enough for it to be considered
election interference.
Speaker 11 (10:57):
But... oh, I don't think I need a lecture from Mister Monday Nights.
Speaker 1 (11:08):
Yeah, that's fair. I was just trying to make the point that that's not what people should use Alexa for. That reminds me: Alexa, could you activate the bidet?
Speaker 9 (11:29):
That's good. That's good tech, though.
Speaker 1 (11:50):
Sometimes the shit's even too dumb for me. By the way, none of the stuff that we're talking about is election interference. Yet Trump has threatened almost all of them with either imprisonment, lawsuits, or censoring, which is why this one section of this weekend's rally in Pennsylvania was so striking,
(12:11):
when Elon Musk was discussing why he supports Donald Trump.
Speaker 5 (12:16):
The other side wants to take away your freedom of speech. You must have free speech in order to have democracy. That's why it's the First Amendment.
Speaker 1 (12:25):
Elon, were you not watching the rest of the show? A movie Trump doesn't like is gonna get sued. A tech mogul he doesn't like, he wants to put in prison. It's not free speech if only Trump's admirers get to do it without consequence, Elon. I don't... I'm not going to go that way. I don't see how his support of free
(12:49):
speech is expose-the-belly worthy.
Speaker 13 (12:51):
I just don't.
Speaker 1 (12:56):
But at least the Constitution remains intact and is there to ensure that we have the First Amendment.
Speaker 5 (13:04):
The Second Amendment is there to ensure that we have the First Amendment.
Speaker 1 (13:12):
Guns don't protect our free speech. Our free speech is protected by the consent of the governed, laid out through the Constitution. It's not based on the threat of violence. It's based on elections, organizing, referendums, and a judicial system. Our social contract offers many, many avenues to remedy these issues
(13:33):
and allows sides to be heard and adjudicated. Guns, from what I can tell, seem to mostly protect the speech of the people holding the gun. It's a tool of intimidation, and, if I may finish... I'm not done. It is
(13:59):
a tool of intimidation, and one that I think is actually being irresponsibly and recklessly invoked because some people in your crowd thought they might have been shadow-banned by Facebook.
I mean, for God's sakes, you guys are in Butler, Pennsylvania.
The whole reason you're there is because some fucking asshole
(14:23):
with an AR-15 tried to permanently litigate his vision of this country's free speech. That's why you're there. The whole point of a society is that guns don't decide it.
I would prefer at this moment not to trade in
a government that offers me many remedies for my concerns,
legitimate or illegitimate, for a situation where my rights are
(14:44):
determined by how many militia members agree with me. The country ain't perfect, and there's a lot of issues we don't agree on: choice, immigration, shrinkflation of snack chips, the unholy marriage of penguins and hippos. But honestly, dude, a country that can adjudicate these complicated issues through our
(15:08):
sometimes frustrating, overly bureaucratic constitutional system of checks and balances and peaceful transfer of power is the only kind of country that I want the children of Pesto and Moo Deng to grow up in.
Speaker 12 (15:26):
When we come back, Bill Adair. Don't go away.
Speaker 1 (15:48):
How about that? Welcome back to The Daily Show. My guest tonight is the creator of PolitiFact and the Knight Professor of Journalism at Duke University. His new book is called Beyond the Big Lie. Please welcome to the program, Bill Adair!
Speaker 12 (16:16):
There?
Speaker 1 (16:17):
The book is Beyond the Big Lie. Let's talk PolitiFact: when did you create PolitiFact?
Speaker 3 (16:23):
Two thousand seven, so right before the two thousand eight election.
Speaker 1 (16:28):
And the idea is it's sort of a repository of fact checks for political speech. How did you decide what would be included, what you would decide to check?
Speaker 13 (16:44):
Sure?
Speaker 3 (16:44):
So, the whole idea is to answer people's curiosity. If you hear a politician make a statement and you wonder, is that true? Those are the things that PolitiFact checks. I mean, ultimately, that's what journalism is all about: to answer people's curiosity, if they're wondering what's true and what's not.
Speaker 13 (17:04):
That's what PolitiFact fact checks.
Speaker 1 (17:06):
And you had ratings: it was true, partly true, false or partly false, and pants on fire.
Speaker 3 (17:11):
And pants on fire. And that was, you know... fact-checking had been around before PolitiFact, yes, but what distinguished PolitiFact was the Truth-O-Meter, which rated things, as you said. And we did...
Speaker 1 (17:24):
With the technological advance of the... exactly. And you did win a Pulitzer Prize, really, for "pants on fire."
Speaker 3 (17:34):
It sounds funny when you put it that way, but
but the reason is that there's solid journalism behind it, right,
and it's so important. And I think that people realized
that even back then, politics was getting complicated and people
were really beginning to wonder what was true, and PolitiFact
(17:56):
and other fact checkers filled that void in an important way.
Speaker 1 (18:00):
Now, this is all, in some ways... you might look back on it and think, oh, how quaint: we went through an analog age and we would talk about whether it was partly true or true, and we created a Truth-O-Meter. And then social media comes along and it turns into this digital misinformation age, where they talk about, you know, a lie traveling eight times faster than the truth. How
(18:20):
have you adapted, and what do you think is the kind of fact-checking mechanism, if you think it's important in that way, that can adapt to that moment?
Speaker 3 (18:32):
Well, I think what you're alluding to is the original Truth-O-Meter, which used vacuum tubes, and that...
Speaker 1 (18:39):
That was a... the Truth-O-Meter, for those of you at home, that is copyrighted. Don't think you can just put up... you can't just put up your own Truth-O-Meter, or "truthometer," as I incorrectly called it earlier.
Speaker 13 (18:51):
You know, but you're very right. I mean, the.
Speaker 3 (18:55):
Fact-checking has struggled to keep up with the many ways that politicians and others spread lies. And in the book, what I talk about... the point of the book is to explore how and why politicians lie. And one of the things that I get at is that they're doing it in lots of different
(19:16):
places that fact checkers are struggling to keep up with. And they're doing it with these huge megaphones that fact checkers can't, with their current staffing, adapt to.
Speaker 13 (19:29):
So... and...
Speaker 1 (19:31):
With malice aforethought, as they might say on Court TV. This isn't happenstance; misinformation has been weaponized, to a large extent, in this digital age.
Speaker 3 (19:41):
Absolutely. And, you know, we're seeing it in North Carolina with the hurricane, as you alluded to earlier. But fact checkers have to get more assertive in how they respond and think more digitally.
Speaker 13 (19:57):
So one of the things that...
Speaker 1 (19:59):
Does that mean AI? Like, when you say think more digitally, are you talking about taking this, making it less bespoke, and creating kind of AI context overlays for these types of things?
Speaker 13 (20:13):
So that's one way.
Speaker 3 (20:14):
So I think the atomic unit of fact-checking, I think...
Speaker 13 (20:21):
Will be for the foreseeable future.
Speaker 1 (20:24):
Ah, creating your own metric system over there.
Speaker 3 (20:27):
The atomic unit of fact-checking... I think humans will be needed to create fact checks. We hear what's wrong, we need to research it, we need to respond to it. But yes, AI can be used to spread it more efficiently, to broader audiences, to be more responsive. So, two
(20:49):
ways that we've done that at Duke... we worked with... Duke University.
Speaker 1 (20:56):
I've heard it's a safety school. I've heard very, very, very poor...
Speaker 3 (21:00):
...things. So, two things we've done with our team at Duke. One is, we worked with the tech companies to create a standard so that fact checkers could label their fact checks. It's called ClaimReview, and it allows them, when they publish something, to put this tag on so
(21:20):
that tech companies, search engines, social media platforms can...
Speaker 1 (21:25):
Kind of the Good Housekeeping Seal, to some extent, of...
Speaker 3 (21:28):
What it really is is like a street sign that says: this is a fact check on this person, on this claim. And ClaimReview helps find that fact
Speaker 13 (21:40):
check. It's very simple, so that Google...
Speaker 3 (21:43):
Can then say, oh, here's a fact check and could
use it in powerful ways.
Speaker 13 (21:48):
So that's why.
Speaker 1 (21:49):
It keeps the information from being, let's say, laundered throughout the internet, which is oftentimes what happens; people lose attribution, potentially.
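(For the curious: the ClaimReview standard Adair is describing is published as schema.org structured data embedded in a fact check's web page. A minimal sketch of what such a tag might contain, with hypothetical values throughout, rendered in Python for illustration:)

```python
import json

# A minimal, hypothetical sketch of a schema.org ClaimReview tag.
# Field names follow the public ClaimReview type; every value here
# is invented for illustration.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://www.politifact.com/factchecks/example/",  # hypothetical
    "author": {"@type": "Organization", "name": "PolitiFact"},
    "datePublished": "2024-10-07",
    "claimReviewed": "They can control the weather.",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Example Politician"},  # hypothetical
        "datePublished": "2024-10-03",
    },
    # PolitiFact's rating scale runs from True to Pants on Fire.
    "reviewRating": {"@type": "Rating", "alternateName": "Pants on Fire"},
}

# Publishers embed this as JSON-LD in a <script type="application/ld+json">
# tag, which is the "street sign" that search engines look for.
print(json.dumps(claim_review, indent=2))
```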
Speaker 3 (21:56):
I can't speak for Google, but that's something they could do. But now, here's another way that we're excited about. We call it half-baked pizza. So the idea... I'm just gonna say it right here.
Speaker 1 (22:13):
This is Duke University? Like, this isn't just Duke's fact-checking lab and pet repository... it's half-baked pizza?
Speaker 3 (22:27):
In fairness, it's a safety school. But let me tell you about half-baked pizza. So what we want to do... so, fact checkers have a problem in the United States. There are not enough fact checkers in many, many states. We studied this and we found huge what
(22:51):
we call fact deserts: places where governors, members of Congress are never fact-checked. They can say anything they want, and they're never held responsible.
Speaker 1 (22:59):
Right, those are the twenty four hour cable news networks.
Speaker 3 (23:02):
So how can we hold them responsible? So often they repeat the same talking points in Arizona that are being fact-checked in Florida. So can we use AI to monitor what they're saying in Arizona and duplicate a fact check from Florida using generative AI, but adapt it to
(23:26):
the claim in Arizona? So we've been experimenting with that.
Why do we call it half baked pizza?
Speaker 1 (23:31):
Please. I was going to ask.
Speaker 3 (23:33):
The idea is that if you have a fact check that's been done, say, by PolitiFact, you need to have it reviewed by a human editor. So I think of that like half-baked pizza. The chef looks at that pizza that's not quite ready to go in the oven and says: yep,
(23:54):
the pepperonis are in the right place, there's enough cheese, yeah, it's got enough sauce. Okay, the half-baked pizza is ready to go in the oven. So that's our product.
We're trying to get funding for it. We think it
could be the answer.
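(A rough sketch of the "half-baked pizza" workflow as described above: match a newly monitored claim against fact checks already published elsewhere, carry over a draft, and hold it for a human editor. This is an illustrative assumption, not Duke's actual system; the word-overlap matcher stands in for real claim matching, and the copy-over step stands in for the generative-AI adaptation:)

```python
from dataclasses import dataclass

@dataclass
class FactCheck:
    claim: str
    ruling: str        # e.g. "False" or "Pants on Fire"
    explanation: str

# Toy corpus standing in for fact checks already published in another state.
PUBLISHED = [
    FactCheck(
        claim="The state lost 500,000 jobs last year.",
        ruling="False",
        explanation="Employment data show a net gain over that period.",
    ),
]

def similarity(a: str, b: str) -> float:
    """Crude word-overlap score; a real system would use embeddings."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def draft_half_baked(new_claim: str, threshold: float = 0.5):
    """Match a monitored claim to an existing fact check and produce a
    draft that must wait for a human editor before publication."""
    best = max(PUBLISHED, key=lambda fc: similarity(new_claim, fc.claim))
    if similarity(new_claim, best.claim) < threshold:
        return None  # no usable match; a fresh fact check is needed
    return {
        "claim": new_claim,
        "draft_ruling": best.ruling,
        "draft_explanation": best.explanation,
        "status": "awaiting human editor",  # the chef inspects the pizza
    }

print(draft_half_baked("Our state lost 500,000 jobs last year."))
```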
Speaker 1 (24:07):
My question: when you came up with this, had you had lunch that day? But this gets us... all right, so this gets us to the larger point. So we've got this idea of fact-checking, but I think, for the public, solid fact-checking, objective fact-checking, that has
(24:28):
to be an earned trust with an audience, right? Because we really are on a balance of: it's misinformation, but then it's also the First Amendment and censorship. You know, the government, for instance... and you tell this story in the book: Nina Jankowicz, who was hired by the Department of Homeland Security to run a kind of operation within the
(24:49):
government that could examine misinformation generally coming from foreign sources and other things. They ended up calling it... what did they call it?
Speaker 3 (24:56):
The Disinformation Governance Board, the worst name any government agency has ever been given.
Speaker 1 (25:02):
So it does. The name itself conjures up Orwellian bureaucratic standards. The right gets a hold of that, tweets it out. Forty-eight hours later, the whole thing is blown to shreds. So it shows how difficult it is for even this idea of creating that mechanism to take hold in a
(25:25):
country where misinformation is weaponized for partisan purposes.
Speaker 3 (25:30):
Sure. So how do you balance them? So let's be clear about... so, I focus on Nina Jankowicz for several reasons. One, I wanted to show someone who was victimized by lying. Here's someone whose life was turned upside down because of these lies, who faced death threats, who had trouble getting work afterward.
Speaker 1 (25:52):
And the speed with which it went from Twitter, or somebody tweeting it out, to the right-wing mainstream media thing... and people were brutal to her.
Speaker 3 (26:02):
Yes. Now, it's important to point out this organization was an internal working group that was designed to coordinate what the Department of Homeland Security agencies did to combat disinformation. It was not out to do the things that the liars said about it, so...
Speaker 1 (26:23):
It was not there... the purpose was not to then contact Facebook and Twitter and say, you must remove this, we don't agree with...
Speaker 13 (26:30):
it. Correct.
Speaker 3 (26:31):
But the reason that there's so much misunderstanding about this group is that the government, the Biden administration, did a terrible job explaining what it was supposed to do, and so this story... they hung her out to dry, they did. And the story of Nina is really a really depressing story, although it has moments of humor
(26:55):
in it, about how Washington works and doesn't work. So it's sort of the backbone of the book, because I felt like I got so caught up in Nina's story, because it reveals so many things about lying and how Washington works.
Speaker 1 (27:12):
Now, does it show the limits of fact-checking? Because in the story, look, the word from Homeland Security is don't say anything, and so they let this thing go until it built up a kind of, you know, event horizon situation, and it was all blown up. But when Maria Ressa says something like a lie travels eight times faster than the truth, doesn't that mean the truth has to work
(27:32):
nine or ten times harder? Doesn't this mean that to battle misinformation you have to do it in a way with a tenacity and a clarity of, you know, sort of a moral foundation that is kind of unyielding? And
they're not. They don't do that at all because let's
(27:55):
face facts. The government often bends the truth for their
benefit and their own propaganda.
Speaker 3 (28:01):
So in this case, there were no fact checks done, or there was one, right? I do think, you know, the government is a culprit here in this case, and it's very sad to watch how the government doesn't do anything. But I think, in combating misinformation and disinformation,
(28:23):
the government needs to step up and be upfront about facts.
And this is something, you know... I've been an aviation reporter in the past. I've been a political reporter in the past. I've seen plenty of instances when government does a good job telling its story, when it's honest, when it's transparent. And one of the best things that government
(28:45):
can do is tell us when it does not know something,
be honest with us.
Speaker 1 (28:51):
Boy, do they not do that. Well, I wonder, what do you think? You know, COVID is a great example. So as we play this all out, we talk about misinformation and trying to counter it and the weaponization of it. But when the government, as you said, doesn't know something, but comes out with certainty, one hundred percent, safe and effective...
(29:11):
if you don't do that, you know, everybody dies. And when that is shown, when the misinformation that they say is misinformation turns out to be maybe not misinformation, maybe information... I'm not saying in every case. How badly does that damage their ability to make any case vociferously? And does
(29:33):
that make it impossible for the government to have that responsibility at all? Isn't that kind of the crux of why they can put up maybe some guardrails, but can they really be adjudicators of misinformation?
Speaker 3 (29:49):
Well, one, I don't think they should be an adjudicator
of misinformation. I think they should just tell us what
they know and what they don't know. And often it
takes courage for people and entities to say we're not sure.
And a classic example this week is the hurricane: like, for the National Hurricane Center to say, well, here's where
(30:12):
we think it's gonna go, but, you know, we're not positive. You know, that's always been built into hurricane predictions.
And I think that same uncertainty should be reflected in other things the government does. And I think they have either gone silent on us with things, or they
(30:32):
have just failed, or they have shown certainty when they're really not certain, and I think that really harms their credibility. And as a fact checker, there's nothing worse than, you know, getting information from the government that you later find is not accurate.
Speaker 1 (30:52):
Right. Yeah, and that can tank the whole thing. So when we look at the big lie and how it's been weaponized so effectively... you write, more by the right than by the left. You know, you've taken criticism because you fact-check more people on the right, or you say they lie more. Yes, and better.
Speaker 13 (31:13):
They're very good at it.
Speaker 1 (31:15):
Yes, but you've done it. I mean, it's in here. There's a statistical analysis.
Speaker 12 (31:19):
Yes.
Speaker 3 (31:21):
What I did for the book was look at fact checks by PolitiFact and by the Washington Post fact-checker, and then talk to... I think the most revealing thing was when I talked to Republican politicians and said, why does your party lie more?
Speaker 13 (31:38):
And the answers were really revealing.
Speaker 1 (31:42):
You just said, like, hey, why do you guys lie more? And they're like, good question. Yeah, there's something deeply wrong with us.
Speaker 3 (31:49):
Yeah. Well, these are, for the most part, people who have left the Republican Party and who will acknowledge this truth.
Speaker 13 (31:59):
But, you know, they have a partisan...
Speaker 3 (32:02):
media that not only looks the other way when they lie, but echoes their lies and often has a business model built upon their lies.
Speaker 13 (32:14):
And so you begin with that.
Speaker 3 (32:16):
Then you have a culture in the Republican Party that, many people told me, goes back to... many people put it with Newt Gingrich as sort of the turning point, really: that Newt Gingrich sort of changed the culture of the Republican Party and changed it into a sort of anything-goes: hey, if we're gonna win,
(32:38):
you know, you can change the facts.
Speaker 1 (32:41):
By any means necessary.
Speaker 3 (32:43):
Yes, right. And that culture took hold. Now, some people date it earlier, in the Roger Ailes era, and...
Speaker 1 (32:53):
Maybe money is not even the point. You know, Roger Ailes, who was the founder of Fox News, very famously said during Watergate: I'm going to create an apparatus so that what the left did to Nixon (they viewed any sort of press as the left), what they did to Nixon, you can never again do to another Republican candidate
(33:14):
or president. And quite frankly, I think it has been successful.
Speaker 3 (33:18):
And, you know, combine all those things and you have a recipe for lying and support for lying. That has just become a culture.
Speaker 1 (33:26):
Now, are you suggesting the left doesn't lie, or doesn't weaponize it to the point where it's as effective?
Speaker 13 (33:33):
I am. There's... there's...
Speaker 3 (33:36):
Definitely a substantial amount of lying from the left, sure,
but nowhere near as much as from the right.
Speaker 1 (33:43):
I've gotten, if I may, I've gotten a couple of pants-on-fires from you over the years. Like, literal pants-on-fires, like, not even slightly untrue. Like, there was one where, I think, the tagline... it's a terrible situation. I went home that night ashamed.
Speaker 13 (34:05):
But see, okay, there's the difference.
Speaker 1 (34:07):
And I would have accepted that from an Ivy League school.
Speaker 9 (34:12):
But from Duke? From Duke?
Speaker 1 (34:18):
At long last. Beyond the Big Lie... it's a fascinating look at misinformation.
Speaker 11 (34:23):
You've got it.
Speaker 1 (34:24):
October fifteenth, available to order. Bill Adair, ladies and gentlemen!
Speaker 6 (34:51):
That's our show.
Speaker 1 (34:52):
for tonight. But before we...
Speaker 14 (34:56):
go, we are gonna check in with your host for the rest of the week, Jordan Klepper!
Speaker 2 (35:09):
Come on, come on, come on, Jordan, do it! Come on! Listen...
Speaker 9 (35:21):
Yeah, yeah, yes, yes... well, yes, so excited. Very excited about...
Speaker 8 (35:32):
Uh, very excited about hosting this week.
Speaker 13 (35:35):
Hippie.
Speaker 1 (35:38):
We talked about this, Jordan. You gotta do the jumping.
Speaker 5 (35:40):
Come on, Jon. Jon, it's embarrassing.
Speaker 6 (35:44):
I'm thinking for you.
Speaker 11 (35:46):
I don't.
Speaker 10 (35:47):
I'm serious. Serious.
Speaker 1 (35:50):
Come on, Jon.
Speaker 2 (35:54):
This week, Michael, I'm going along in the crap done.
Speaker 1 (36:01):
Hello.
Speaker 9 (36:02):
Hello, hello. Hold it a
Speaker 1 (36:06):
Moment in time. I don't like flies.
Speaker 9 (36:09):
Get out of here, fly!
Speaker 1 (36:11):
Never been a big fan of flies.
Speaker 13 (36:14):
You don't mind my bringing that up, do you?
Speaker 9 (36:19):
Anyway?
Speaker 1 (36:20):
This is a very aggressive sucker, this one.
Speaker 5 (36:24):
This one in particular is very aggressive. Like, I'm going to be aggressive for our country, you can probably say.
Speaker 7 (36:34):
Explore more shows from the Daily Show podcast universe by searching The Daily Show wherever you get your podcasts. Watch The Daily Show weeknights at eleven, ten Central on Comedy Central, and stream full episodes anytime on Paramount
Speaker 11 (36:47):
Plus. Paramount Podcasts.