
July 23, 2024 65 mins

In episode 1712, Jack and Miles are joined by bestselling author and co-host of Big Feets, Jason Pargin, to discuss…  Assassination Conspiracy Theories, Cyber Attacks, Psychological Impact Of Being Online, Entertainment vs. News & more observations from Jason's new book, I'm Starting to Worry About This Black Box of Doom.

LISTEN: It Was A Good Day (Footsteps in the Dark) by OMA

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Wait, Justin, you were saying you watched Sprint. Did you
watch that Netflix show?

Speaker 2 (00:09):
I haven't seen it. I was just aware of it.

Speaker 1 (00:10):
I'm just... I saw Receiver, I saw Sprint. I saw,
like, golf, hot dog man, every single part of sports.

Speaker 3 (00:20):
I know. Is it good? Is it worth it?

Speaker 2 (00:23):
Is Sprint, like, enthralling? Because I find it... I think
the egos that are in F1, I find it
hard to have that, like, mapped onto other spaces.

Speaker 1 (00:32):
Well, it's a little bit different because at the end
of the day, there is no car except your legs, right,
you know what I mean. And it's like the purest
form of like competition is like, all right, dude, who's
got better reflexes and stronger legs? And that is the
Track and Field Association of America's tagline: there is
no car except your legs.

Speaker 2 (00:52):
There is no car except your legs: the US Olympics.
And that is how I need them to make sense
of it too. There is no car, only your feet. Wait,
where's the car? That's what I say as
I'm watching a two hundred meter. Yeah, getting out of the blocks?
Yeah, all right, yeah, runners on your marks? Oh, where's

(01:12):
my car?

Speaker 1 (01:12):
Sorry, sorry, guys, I think you forgot... where's the car, dude?
I'm telling you right now, I'm gonna get smoked by
this dude unless I'm in a car.

Speaker 3 (01:19):
So what are we doing? This isn't even fair. This
isn't even fair.

Speaker 2 (01:29):
Hello the Internet, and welcome to season three forty eight,
Episode two of The Daily Zeitgeist, a production of iHeartRadio. This is a
podcast where we take a deep dive into America's shared consciousness.
And it is Tuesday, July twenty third, twenty twenty four.
Oh of course.

Speaker 1 (01:47):
There's some weird shit happening on July twenty third. It's
National Vanilla Ice Cream Day. That's fine. National Lemon Day,
and also Gorgeous Grandma Day, so shout out.

Speaker 2 (02:02):
I think it was written by somebody who was horny.

Speaker 1 (02:05):
Unfortunately, all the gorgeous greats.

Speaker 3 (02:07):
Look.

Speaker 1 (02:07):
It's a day where we recognize all the women who
embrace the age of a grandma, whether they are grandmothers
or not. Gorgeous Grandma Day encourages them to flaunt
their granny attitudes with a purpose and...

Speaker 2 (02:19):
Style. You gorgeous hell... out. You're like this bear with
these big fucking claws, baby. I think swinger? I guess so, yeah,
all right. My name is Jack O'Brien, aka Karma Karma
Karma Karma Kamala. Kamala is now in, no longer Joe, no

(02:39):
longer Joe. Winning would be easy if your platform was
like my dreams: healthcare be free, healthcare be free. That
is courtesy of Blake Rogers of the Ohio State University. Also,
Halcyon Salad took a crack at a Kamala Karma Chameleon aka,

(03:02):
but I couldn't get the phrasing right on yours, even
though it was first. I'm so sorry, Halcyon Salad. Shout
out to both of you, though, and shout out to
Kamala Kalala. I donated to her.

Speaker 4 (03:14):
Don't like you see this.

Speaker 2 (03:16):
I'm thrilled to be joined as always by my co-host,
Mister Miles Gray. Oh man.

Speaker 1 (03:23):
It's Miles Gray, aka: one two threes, it's Miles Gray,
Jack O.B. We look at hummingbirds and cast pod,
music explodes when Jack takes off. Weirdo, I said one
two threes.

Speaker 5 (03:37):
Zeitgeist legs are plump, really, our pants rip at just
the side of our quads. Boost it up to dunk
sick lobs. That's also a Halcyon

Speaker 1 (03:48):
Salad Carter, shout out to you. Halcyon Salad got
on the board to split the Blake Rogers stealing on...
on mind.

Speaker 3 (03:57):
Doing your thing. Love to see it?

Speaker 2 (03:59):
You do love to see it, Miles, And you know
what else you love to see? Tell me is when
we can have this guest on. A best selling author
of books like John Dies at the End, Zoe Punches
the Future, and the Dick and the new standalone novel
which you can pre order now. I guess can I
say it's a standalone? I mean, it's not part of
a previous franchise, but it's one that I wouldn't mind

(04:22):
seeing a sequel to, or more from these characters. Also
one of the hosts of the podcast Big Feets, which,
if I'm reading this New Yorker review correctly, is quote
the only Mountain Monsters podcast officially endorsed by Big Feet.
He's my former coworker at Cracked dot com, a co

(04:42):
creator of the Cracked podcast. Welcome back to the show,
Jason Pargin.

Speaker 4 (04:49):
First of all, I'm not dying. Don't tell the papers
that I'm dying. If I start coughing in the middle
of the show... I realize in a movie or
TV show drama, if a character starts coughing, that means
they're going to die.

Speaker 2 (05:01):
You did just pull back the napkin that you coughed into.

Speaker 4 (05:04):
And then just quietly, quietly hid it from you.

Speaker 2 (05:09):
Yeah, you looked around, your eyes darted back and forth,
and then you tucked it into your waistband for some reason.

Speaker 4 (05:16):
But no, I'm just recovering from a cold. If I
start clearing my throat, I am not trying to get
their attention and demand that they let me talk. I
am trying to clear phlegm from my throat, just...

Speaker 2 (05:27):
The way that you do it tends to be "ahem." Uh. Yeah,
but that's all right. We'll just power through like we
do all the time.

Speaker 1 (05:37):
Yeah, if we hear that horn take
at the top, we know you're not doing a DJ Kool

Speaker 3 (05:41):
"Let Me Clear My Throat" either.

Speaker 2 (05:48):
How are you doing? We're excited to have you back.
I'm excited about this new novel, which I blew through
in a single weekend. It's a good one for
fans of Jason Pargin novels. I think they're really going
to enjoy it. But it's probably a distant... like, writing
it was probably a distant memory to you at this point.
How long ago was that, writing it?

Speaker 4 (06:09):
Yeah, but the dozens of interviews I'll have to give
about it means I have to keep in mind what
I wrote in it. But, uh, for listeners who are
in the middle of the total world chaos wondering why we're doing
a book review here: the book is relevant to what's, yeah,
to what's going on. I promise you, you'll see why.
And in fact, there's going to be at least one
very annoying review. I predicted a couple months ago it'll say, wow,

(06:32):
this is so prescient. How did you predict that there
would be an event that triggers all sorts of
misinformation and chaos?

Speaker 3 (06:40):
Yeah, that was not hard.

Speaker 2 (06:41):
Did you know that? That was my first question? Actually,
we can just.

Speaker 4 (06:47):
I love seeing the people with the little screenshots of
the Simpsons from twenty years ago. It's like, Wow, they
predicted that elections would become stupid in the future.

Speaker 3 (06:55):
Well, that's.

Speaker 6 (06:57):
Not quite Nostradamus, the one... Cypress Hill, though, did
actually perform with the London Symphony. Oh really? You remember
the Homer fest thing. He's like, someone ordered the London Symphony.
Did someone do it when they were high and forget?
Like Cypress Hill's like, oh shit, did we do that?

Speaker 1 (07:12):
And they're like, yeah, yeah, they're actually performing with them now,
and they're like, we kind of had an opportunity and
decided to make the Simpsons prediction.

Speaker 2 (07:19):
Right. So, I mean, you hear those violins on some
of the early Cypress Hill stuff, you can't... man. Yeah,
it was inevitable. All right, Jason. Well, we
are going to talk about some of the ideas in
your book, I'm Starting to Worry About This Black Box
of Doom?

Speaker 3 (07:37):
Is that?

Speaker 2 (07:37):
What did I say that?

Speaker 3 (07:39):
Right? Yes?

Speaker 2 (07:43):
Did I not say the title?

Speaker 3 (07:44):
And the Well?

Speaker 2 (07:45):
First, yeah, I'm a pro folks. But first we do
like to get to know our guests a little bit
better by asking you, what is something from your search
history that is revealing about who you are.

Speaker 4 (08:01):
I have a whole series of searchers over the last
few days trying to figure out what CrowdStrike is, because
it turned out all of modern civilization ran on it
and I'd never heard of that company. It's an eighty
billion dollar company. Yeah apparently, yeah, and I still don't
know what it does. And this before the show I
tried to Google is the CrowdStrike thing over? Is that
old news or flight still grounded around the world. I

(08:23):
can't tell, because Google Search is broken, so I don't
know. Does anybody at home know? Email me, let
me know.

Speaker 2 (08:31):
Yeah, I think, like... it seems like there's still
reports of flights being fucked up. Mainly, like, you have
to go to, like, the markets to see, you know,
like, people reporting on the financials, and they'll
be like, yeah, our reporting indicates... because they have
to get that right, because money is at stake. But

(08:52):
it does seem like it's taken a while to untangle
CrowdStrike, and there it was just, like, an update
that was ill-advised, and somehow that became one
of the biggest technology fuck-ups yet to this point
in human history. A company that sounds like just a

(09:13):
board game from the nineteen eighties.

Speaker 3 (09:15):
Yeah, that would have been fun to play.

Speaker 1 (09:17):
Yeah, from the makers of Crossfire: CrowdStrike. But CrowdStrike, there's...
I mean, like, the stat that I saw that
I was like, oh right, because everything's just consolidating, is
like, they handle these IT services for like five
hundred and thirty eight of the top one thousand companies. Yeah,
so, like, yeah, that chunk of, like, that kind of

(09:37):
critical software going down, yeah, will cause a lot of
chaos. Unless you're like a Linux person, in
which case you were fucking laughing because you're like, well,
I'm on Linux.

Speaker 3 (09:48):
Fuck that.

Speaker 4 (09:49):
I think it's the range of things that went down
that shocks people, because it's like, okay, flights got grounded. Also,
when we tried to buy coffee at Starbucks, their point
of sale software was dead because of CrowdStrike. And then
also, it's like a whole range of things across
all of society. Nine one one service went down for
some people.

Speaker 2 (10:06):
For a while.

Speaker 4 (10:07):
Yeah, hospital, it's like, oh so this all runs. It's like, hey,
it all runs on a computer somewhere, and if those
computers are running Windows, and if they're using CrowdStrike to
do whatever CrowdStrike does, I apologize, audience, I still don't know. Then, Yeah,
they pushed out that updated brick the computer. They could
not simply push out another update because the only way
to fix it was to like manually install and delete

(10:27):
a file, because the computers were stuck in a
boot loop. Yeah, yeah. So they couldn't just push out...
so, yeah, but one, I don't know, one string of
code crippled much of the world's economy instantly.
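For context, here is a minimal sketch of the kind of per-machine fix being described, assuming the widely reported workaround of booting each stuck Windows machine into Safe Mode and then deleting the faulty CrowdStrike channel file; the directory and filename pattern below come from public reporting, not from this episode:

    # Hypothetical sketch of the manual cleanup described above.
    # Assumes the widely reported workaround: boot the bricked machine into
    # Safe Mode, then delete the bad channel file so Windows can boot normally.
    import glob
    import os

    CROWDSTRIKE_DIR = r"C:\Windows\System32\drivers\CrowdStrike"  # reported location
    BAD_CHANNEL_PATTERN = "C-00000291*.sys"                       # reported bad update file

    for path in glob.glob(os.path.join(CROWDSTRIKE_DIR, BAD_CHANNEL_PATTERN)):
        os.remove(path)  # repeated, by hand or by script, on every affected machine
        print(f"removed {path}")

The point being made in the conversation is that nothing like this could be pushed out remotely; each bricked machine had to be touched individually.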

Speaker 2 (10:40):
The solution was ultimately the solution to all IT problems,
which is just, like, restart, throw it on the ground. Yeah, throw
it at the ground.

Speaker 1 (10:47):
Well no, because like the one version was like you'd
have to go into every individual computer and like take
this file out, which is like.

Speaker 3 (10:53):
We do oh yeah, like that's this.

Speaker 1 (10:56):
You can't just do this over the waves, or at
least at the time I had last I was like
looking for videos. I'm like, please explain to me, like
I don't know what a computer is, what the fuck.

Speaker 3 (11:06):
Is going on? And I still I came away with
a very vague understanding.

Speaker 2 (11:10):
Yeah, but it does feel like this becomes more and
more possible the more we turn everything over to the
machines, and we're just like, they got it. We don't
need humans on the ground doing, you know, local deployments
that then, if those work, get deployed to more places.
It's... yeah, or, yeah, yeah. Or shout out to the

(11:33):
I-don't-want-to-update-my-software crowd, because they
also may have, like, just inadvertently gotten around things too. Yeah,
some of the, like, anecdotal stuff I heard in the
early days, or in the early hours I guess, was
that the airlines that were actually functioning were the
non-major carriers, and so it could be a situation

(11:56):
where, like, CrowdStrike is the expensive, higher-end option that,
like, sells for millions and millions of dollars that only
a major corporation can eat. And so they got fucked.

Speaker 1 (12:09):
But, oh yeah, our services? We go through Cloud Clown.

Speaker 3 (12:13):
I it's it's affordable.

Speaker 1 (12:16):
But yeah, we don't have any problems.

Speaker 4 (12:18):
It's the open source.

Speaker 3 (12:21):
Cloud clown.

Speaker 2 (12:23):
What's something, Jason, that you think is underrated?

Speaker 4 (12:26):
How many world-changing, like, mass shooters, terrorists, assassins just
want to be famous, as opposed to having some kind
of ideology. Because we're now enough days past... which, I
don't know if you guys remember, but somebody almost killed
Donald Trump several days ago. I know this has been
long forgotten from the headlines, but that guy, Thomas Matthew Crooks.

(12:49):
By now we're far enough along that the FBI, like,
they, you know, they cracked open his devices that day.
We're far enough along that by now we should have
have everything he had ever posted. I have no indication
that he cared much about politics at all. He had
on his phone, I guess photos and Internet searches for

(13:09):
various politicians, and then maybe Trump was the one
that came closest to his house. Yeah. Like, he literally
picked the one that was in driving distance, because he
just wanted to go out with... wanted the world to
know who he was, right? And I don't know if
that is a distinctly modern phenomenon. Like, you don't think
of, like, John Wilkes Booth trying to boost ticket sales

(13:31):
to his next play. Like that was about Lincoln and
slavery and ideology. It wasn't just like, yeah, they'll know
my name after this. But these days, I don't know,
it seems like you see more of them where it's
just no, I want everybody to know my name. This
is this will be the mark I leave on the world.

Speaker 2 (13:48):
Yeah right, Lee Harvey Oswald was also like this, like
to the degree that people still aren't sure that he
was trying to shoot Kennedy. He may have been trying
to shoot the governor. He had already like tried to
shoot a different Texas politician like weeks before or I
guess it was probably months before. And yeah, he didn't

(14:09):
appear to have any big ideological ideas other than that
like he should be famous and he was going to
do something big. So yeah, but it's it's hard to
get your mind around that. I mean, we have an
entire over you know, sixty years at this point of
Kennedy conspiracy theories that and I'm not like here to

(14:33):
say none of those are true or there's nothing suspicious
about that, but it is uniquely unsatisfying for people to hear
now that it's just, like, some incompetence on the side of
security and somebody who wanted to be really famous,
like a school shooter, when his search history is literally stuff
like "Donald Trumps near me," and, like, that seems to

(14:57):
be kind of like, yeah, like he was looking at it,
went into Google Maps and searched "Donald Trumps,"
and then you looked at where all the flags were.

Speaker 1 (15:04):
All right, Joe Biden's near me, Okay, on my way.

Speaker 4 (15:07):
But I feel like we've had a number of mass
shooters in a row where when they tried to figure out, like, well,
why did he pick this as his target? What grievance
did he have? And unless they were shooting up their workplaces,
it's just: this is where they thought they could get the
high score. This was a place that was easy
to get into, with a lot of victims all packed,
you know, packed together in one spot, and that was it.

(15:27):
Just, this is what will get my name out there. And ironically,
I can't remember any of their names at this point.

Speaker 1 (15:33):
Yeah, I had to look up Stephen Paddock, the Vegas one,
because I remember that was another one where everyone was like,
what's what, What's what's going on? What was he trying
to say?

Speaker 4 (15:43):
To this day, we don't know. He left nothing behind.
All we know is that he planned it for
like a year in advance. He looked at different targets.
And there's some that think he was mistreated by casino
staff or disrespected by the casino, like they weren't treating
him like the high roller he was, so he decided to
try to kill hundreds of people.

Speaker 3 (16:04):
I don't.

Speaker 4 (16:04):
I'm not trying to make light of it. But every
time you look for some mission, it's like, no...
because you want them to be like a Batman villain, like
they've got an ideology: my job is to tear asunder
the fabric of society in the name of some, you know,
stupid movement or whatever. So often it's just rage
or frustration, or more likely just, I'm not gonna get

(16:26):
famous doing anything else. I have no other skills. I've
tried going viral multiple times with my wacky YouTube pranks.
Nobody cares. So this is the only thing left to me.
It's the one thing that guarantees I'll get on the news.
But if they believe they'll be remembered by history... I mean,
I guess Crooks would have been, if that shot
had been a few inches the other direction. But, uh, yeah, man,

(16:48):
that's nuts. That seems like a uniquely modern American sickness.

Speaker 2 (16:54):
Yeah, what is something, Jason you think is overrated?

Speaker 4 (16:57):
Apparently the resilience of the Internet, because those of you
who are not around. Back when they first invented the
Internet in the nineteen sixties in the Cold War era,
the entire concept was that you could it was supposed
to survive a nuclear war because all of the computers
are networked to each other. If you take out this half,
the other half continues working. So it was originally about

(17:17):
like, how you can continue communicating. But the system we've
built can go down; there are so many single points
of failure, because people have largely forgotten. Like, there was
the Nashville bombing in twenty twenty, Christmas Day twenty twenty:
the guy blew up an RV that took out an
AT&T office. Yeah, and that's largely been forgotten.
That knocked out internet across parts of seven states. It

(17:41):
knocked out nine one one and knocked out air
traffic control at one airport. It was that same thing,
because it turned out it wasn't... it was an AT&T
switching station that the other ISPs, like, rented
out to. So we had Verizon, our Verizon Internet
went down, because they just... So there's all these single
points of failure, because you look at the map and

(18:01):
it's like, well now hold on. That kind of makes
it look like, with I don't know seven or eight bombs,
you could make the whole country go dark, right, Because
it took them a while to get it back up.
It was not as easy as just re-routing the
traffic elsewhere. They had to like get generators and bring
them into the building and try to fire up the
servers from within the rubble to try to get people

(18:22):
connected again. It just seems like such a fragile system,
and this CrowdStrike thing is like far more widespread than that.
But there are all of these single points of failure, and
I'm surprised that's not a bigger complaint or priority, or
that you don't hear more about why... why are there
no backups? It's... I mean, how

(18:44):
many thousands of flights got grounded by this thing? How
many people missed appointments and whatever.

Speaker 3 (18:50):
I don't know.

Speaker 4 (18:51):
It feels like there should be congressional hearings, and I
think if they do something like that, it would just
be for show where they get like the CEO of
CrowdStrike and they yell at them for a while, and
then it's just, like, yeah, but we all depend
on this. Everything we do depends on this. Hospitals could
not do surgeries because their system went down due to CrowdStrike. Like,

(19:12):
lives can be lost if something like this goes down
for a prolonged period of time, and it does not
seem like it takes much.

Speaker 3 (19:18):
Yeah, yeah, it yeah.

Speaker 1 (19:21):
You, like, to your point, you'd think there should be
some kind of, like, you know, inquest into understanding...
it's like, this is critical infrastructure. And is it just because
every company is like, well, I'm a Fortune one thousand company,
and all the other Fortune one thousand companies use this,
so I'm mindlessly just, like: what do you do?

Speaker 3 (19:36):
Yeah, we're doing that too.

Speaker 1 (19:38):
I could... you know, like some people pointed out, like
you could hire consultants to actually help sort of replace
what CrowdStrike would do. Again, this is from my very
cursory understanding, but there's just a momentum where like this
is just sort of seen as like the default company
to use and here we are, Yeah, people writing paper
tickets now for airlines. I saw that like in India

(19:59):
where they're just like writing it down to be like.

Speaker 3 (20:00):
I don't know, man, we just have to go back
to full on analog.

Speaker 2 (20:03):
It's like a bar fight, where, like... in an
eighteen hundreds movie where they have the tickets
and people are just running up and being like, oh.

Speaker 3 (20:14):
Yes, exactly like a bar fight.

Speaker 2 (20:16):
Yeah, it's troubling. It does seem like the sort of
thing that this version of capitalism is just uniquely bad
at a dressing right, Like it just feel feels like
when a problem happens, it's so quick to be onto
the next thing that there's not it's not a ton

(20:36):
of people, you know, including like the global financial meltdown
of like two thousand and eight, like that we still
haven't really solved or like that like addressed to any
real degree, like a lot of the problems that caused that, Yeah, we.

Speaker 1 (20:53):
Just, I guess it's... well, the people got bailouts that
needed them, you know, in that financial mess. Except for,
like, the people that, you know, really... like, people that
needed them.

Speaker 3 (21:01):
The companies did.

Speaker 1 (21:02):
And I wonder if, because of the actual amount of
losses that the airlines experienced, they're like, okay, we're not
doing this, and, like, that was terrible. And I think
that would be the only way: it's, like, how
much of an effect it had on capital, for there
to really be the kind of thing where you're like, well,
you fucked my money up, guys, So what's going on?
That's really the only question.

Speaker 2 (21:22):
I think that It feels like we're like addicted to
the velocity of the emotion, you know, like the how
much it's causing, like how big the feelings are, how
big the story is more than anything that's like just
coherently thinking through it and being like, how do we
make this thing work the best? It's like people get

(21:45):
their charge out of the intensity of the story.

Speaker 4 (21:49):
There were viral Twitter posts within hours talking about how
this had to be a DEI situation, how it had
to have been an employee who was not white who
was hired for DEI reasons because only that kind of mistake,
because you don't see white people making dumb mistakes like that.

Speaker 2 (22:05):
Seen. Yeah, name one. Yeah, all right, let's, uh, yeah.
So I mean, that also makes sense, that, like, people
are so loud attributing it to the exact wrong
things that it gets drowned out. So there's no
reason, or no real urge or driver, to get the

(22:29):
thing actually addressed in any real way. All right, well,
I think all of this ties into some of the
ideas in your books. So let's take a quick break
and we'll come back and get into it. We'll be
right back and we're back, and I mean just to

(22:54):
continue talking about the CrowdStrike outage, because that I
think is the most recent example that I've had of
this thinking. But I mean, first of all, I should say,
like I said up top, the novel's very entertaining. It's
called I'm Starting to Worry About This Black Box of Doom.

(23:15):
It's packed with very interesting ideas, and among other things,
it's about how information stories evolve on the Internet, like
in real time, with people kind of focus grouping the
story and not really focusing on what is the truest

(23:39):
thing so much as what is the most entertaining thing
or the thing that most kind of rhymes with their
preconceived notions, and that whatever that is seems to gain
the most momentum as opposed to whatever the truest thing is.

(23:59):
I had an experience on Friday where I called a
friend whose work was being affected by the CrowdStrike outage,
and they were too busy to do the googling that
I had done because they were dealing with the CrowdStrike
outage. And they were like, I don't know, everyone
around here thinks it's probably some sort of cyber attack.

(24:22):
And I was like, well, so CrowdStrike themselves are
saying that it's not, and it would be especially in
their interest, you know, to have it even seem like
it might not be their fault. And, you know, just

(24:46):
went through the thinking of why it seems like it's
definitively just a fuck-up on CrowdStrike's part, and they
were like, I don't know, it's suspicious. I don't know.
I guess. I guess there's a reason the plot of
most Hollywood movies don't hinge on someone accidentally fucking up

(25:08):
or like vague incompetence or when that is the explanation,
like in Titanic, we need to like make up a
sneering rich guy who is like, I would rather kill
you than let you hang out with a poor person.
But yeah, so I don't know, is that, like, was
that explicitly an idea you had heading into the novel

(25:29):
or did it kind of come out and am I
getting any of that right?

Speaker 4 (25:33):
Yeah, so just for context, the setup of the book
I'm Starting to Worry About This Black Box of Doom
is that, on the first page, there's a mysterious
woman who hires a driver who she's never met before.
She says, I want you to take me across the country, me
and this box. It's like a footlocker-sized black box.
She says, I will pay you two hundred thousand dollars

(25:56):
in cash, but it has to be kept quiet and
you absolutely cannot look inside the box. So she needs
to be driven from southern California to Washington, d C.
She tells the guy, you can't tell anyone you're leaving.
You have to leave behind any devices that can be tracked.
No phone, no laptop, no GPS navigation. So why this
is relevant is that as soon as they start on

(26:18):
this trip, a rumor starts online, among, like, true crime
types and just bored people, that this is a terrorist
attack, because she's heading toward the capital, and that this
box is part of it. So specifically, there's a theory
that quickly goes around that this attack is carefully calculated
to plunge the country into a civil war by striking

(26:40):
it just the right way that will inflame people in
just the wrong way. So there's this ticking clock where
these strangers online have to try to coordinate to somehow
stop this thing before the vehicle reaches Washington, D.C.,
As it goes across the country. So as the story
unfolds and as this information kind of spreads across social
media and this horrible information ecosystem we've created in which

(27:04):
no facts can survive, that becomes the challenge because all
of the incentives pull people in directions other than the truth.
So the book is this is me having tried to
follow because I'm the most terminally online person you have
ever met. I've you know, going even before my crack days,

(27:26):
I was just on social media. I basically never close Twitter.
I've tried to track many, many, many news stories online
and I've watched this happen in front of me. So
if I can be said to be an expert in
any subject, it's this because it's always fascinating to see
how the human brain gravitates toward the most entertaining story

(27:47):
and not just what the what the facts seem to
be in front of them.

Speaker 2 (27:53):
Yeah, it reminds me of when we've covered like fan
theories in the past, like how fan theories come
up with ideas of, like, what the plot should be
of the movie, or what the plot, like, for the
sequel could be. It feels like exactly the same thing,
like where it's they're pretty good at like spinning a
pretty entertaining narrative, you know, like fan theories about movies

(28:16):
like why is Bella able to you know, be in
love with this vampire when she's allergic to them?

Speaker 3 (28:24):
I don't know, and.

Speaker 2 (28:27):
You know, like Twilight fan theories like get made up
this way, and the internet is good at, like,
spinning a yarn, but unfortunately they do it with
the news and fun.

Speaker 1 (28:39):
I mean, like, yeah, you know, Jason, you're saying, like, watching story
after story, like, evolve one way on the Internet
and then ultimately finding out the truth and being like, guys,
it was not as, you know, spectacular or
salacious as you thought it would be. Is that just,
like... do you see that as just, like, a factor
of, like, our boredom that wants to make sense of
our world?

Speaker 3 (28:59):
Like at times.

Speaker 1 (29:00):
We were like we're yearning for something to like mimic
like the like drama or the narratives that we see
on TV. Or it's something to do with our powerlessness
or what is it that's so intoxicating about.

Speaker 3 (29:11):
Doing the Yeah, I don't know.

Speaker 1 (29:12):
I mean, like, the Occam's razor explanation is this thing,
but I'm gonna go with alien conspiracy theory, hit job.

Speaker 4 (29:21):
I just feel like the and this is not scientific.
I'm not a scientist. I feel like the part of
the brain that wants to be entertained and the part
of the brain that observes the world around you to
make decisions are two distinctly different things, because the part
of you that wants to be entertained is just we
have evolution, right. Due to evolution, we have a need

(29:42):
for novelty. We enjoy seeing new things, things we've never
seen before, hearing new things. So, you know, we
like narratives, but those are things we seek
out to distract us from, like, our miserable, mundane everyday lives. Ideally,
you would not be using that part of your
brain to browse the news because the news should just

(30:05):
be Okay, I have to make decisions from my household.
Here's what I'm seeing on the news that's going to
tell me what decisions I need to make. But those
things have completely merged, and as there is more and
more media, the difference between is this person talking to me,
are they trying to entertain me? Or are they trying
to inform me? Gets bored and I realize I'm saying

(30:27):
that on a show where your guys' job is to
try to make the news entertaining, because that's a way
to engage people like this. This is good. We want people.
What John Oliver does is good because these people otherwise
maybe wouldn't care about politics, but he packages it in
a way that makes it fun to listen to and
kind of takes you through it.

Speaker 3 (30:46):
Bitch.

Speaker 4 (30:47):
The people doing that have to be very ethical about
how they do it, because otherwise, if you're Alex Jones,
if you're Alex Jones, it's very easy to just make
up a fiction that is more fun than the truth.
For the same reason a candy bar tastes better than
an ear of corn, Like one of them was made

(31:09):
to be. It's full of sugar and fat. It was
made to be addictive. The other one is something that
happens relatively naturally. So if you can make up stuff,
it's always going to win in terms of engagement because
you're not restrained by reality, and we do not. I
don't think our brains are very good at filtering the

(31:31):
fact from the more entertaining fiction, and the economic incentives
definitely are not set up to punish people who fail
to do it right.

Speaker 2 (31:42):
Do you feel like there are trends that I think
this is something we tried to do at Cracked. Sometimes
is just in addition to debunking like myths that gets
spread around is like here are the types of lies
that our brain or the Internet tends to gravitate towards,

(32:03):
and it's you know, like one that I would say that,
you know, I feel like we're seeing this process of
like you know, internet focus grouping and writers rooming a
real event in real time with the attempted assassination of Trump,

(32:23):
as we've referred to. And I think one of the
themes that we're seeing there, and also in the CrowdStrike
story, is, like, people have a real aversion to
incompetence as being the explanation or you know, accident, somebody

(32:43):
fucking up. It's just not a satisfying plot point in
your movie. Like, if Die Hard had, just, like, the story
had resolved itself because the hacker had accidentally, like, detonated
a bunch of the bombs while Hans Gruber was
on top of the, you know... like, something like that,

(33:04):
and then it's it's just a fuck up along the
way that doesn't happen. That doesn't happen in movies really
because it's not satisfying. The part of our brain that
craves novelty and like good storytelling resists that sort of thing.
And so I believe like that it's a bigger part

(33:26):
of the story of the JFK assassination than we tend
to think, and I think it's probably a bigger part
of the story of the Trump attempted assassination than some
people are willing to admit. Like, I think it seems
to be pretty surface level that there is a fuck
up there, But are there other do you? First of all,

(33:48):
do you agree that that's a trend, And then are
there other kinds of trends that you've noticed as
you've kind of been studying the sort of the.

Speaker 4 (33:57):
Well, yeah, that's misinformation, Like I get that part of
it is you just want to simplify the world. So,
for example, I have one extremely unpopular political opinion, which
is this is the perfect time to get it out
when you're trying to sell a book and you've got it,
which is that I think most of the world's problems,

(34:18):
most of the things that frustrate you in your life,
are not anybody's fault. I think the world's an imperfect place,
and I think it's hard to run a society in
a way that's perfectly fair to every single person. I think,
you know, it's lots of times when prices would go
up or whatever. It's not necessarily that some evil person

(34:39):
has a scheme. It's just it's market forces, and it's
a company that's trying to maximize the revenue because the
shareholders demanded, and like the blame for things spreads in
so many directions that it just kind of disappears because
it's just a system that we're all trying to survive
in. And that is incredibly unsatisfying. We would love to

(35:01):
hear that there's a villain because in a movie, if
there's a problem like this, I don't know if you've
seen the Jason Statham film The Beekeeper.

Speaker 2 (35:10):
A lot about it.

Speaker 4 (35:12):
Yeah, it's a... it's a great boomer fantasy of,
like, everything that is terrible about the world, all the
way up to the president. There's, like, a cabal of
just cartoonishly evil people that, if you could kill them,
the world would finally be at peace. And that's... that's

(35:32):
very satisfying to think, because, yeah, every movie's got to
have a villain, a human villain that is causing the problems.
Like even a film like The Martian, which is supposed
to be all about like troubleshooting and smart people and
competence porn, they still had to have, like, the villain character,
the one guy who was, like, being obstinate
and saying no to all of their plans, because there's

(35:53):
got to be a bad guy. And this is something
that I think is true across the whole political spectrum.
Everybody wants there to be a bad guy, and not
just... sometimes, like, with the pandemic, sometimes pandemics happen. We
exist in nature, and we actually, I don't know, it's

(36:13):
I think most people did their best, and most people
didn't freak out, and most people did what they thought
was most reasonable. And I don't think we like that.
I think we like the thought of there being somebody
we can yell at and hate, and then if we
could get rid of them, everything would be fixed. That
seems to be to me the most common bias, which
is I want to believe that somewhere there is a person,

(36:36):
a bad person, who has caused this, because then I've
got an opponent, and then if we could defeat them,
everything would be fine. And most things in life are
not like that.

Speaker 3 (36:44):
I believe.

Speaker 1 (36:45):
But in that version, does that sort of like absolve
people of any responsibility for, like, the actions of
an organization that they, you know, are the
figurehead of? Or how do you look at, like, that
sort of piece of it? Like, I get the sort
of allure of being able to, like, say, this
is where it's all focused, and that's, like, it's in
these four or five people kind of thing. But how, like,

(37:07):
at what point is there... Obviously there are systems that
have lives of their own, but are you saying
that everyone is just completely powerless to those things or
nothing can be done or how do you square that part?

Speaker 4 (37:18):
I think, for example, I could go on Reddit right
now and I could find memes talking about how the
boomers ruined the world, how the boomers when they were alive,
jobs were easy, lifetime employment, houses were cheap, they had everything,
and then they intentionally screwed over the next employee or
the next generation after them because they were so greedy
and so you know, sociopathic and narcissistic. If you could

(37:41):
actually grab a random.

Speaker 2 (37:45):
Which I keep hearing recently.

Speaker 4 (37:47):
You could go grab a random boomer off the street,
somebody in their seventies, and say, hey, why did you ruin
the world? He's going to say, I worked at a
muffler shop for forty years. What are you talking about?
I don't even... I rented for most of my life.
I got to take a vacation like once every five years.
What are you... you're talking about, like, the CEOs and
the politicians. And it's like, no, we've now distilled all

(38:10):
of the boomers into, like, one evil person. And guess
what, gang: whatever generation you are... like, let's say there's
some Gen Z kids listening to this a couple of
generations from now, they're going to blame you for what
happens with AI. And you're going to say, I didn't
do anything with AI. I thought it was stupid. I
barely used it. And the kids in the future can say, well,

(38:31):
why didn't you stop it? And you say, I don't
even know who who did it. I don't even know
who was in charge of it. Every company just started
doing AI, and then suddenly there was AI in all
my devices, and they're gonna be like, well, why didn't you,
Why didn't you vote to stop it? Why didn't you
boycott those companies? Why didn't you? And you're gonna say,
I was just trying to live my freaking life. I
was trying to survive. No, I did not have time

(38:53):
to go firebomb a server farm where they were, where
they were operating chat GPT five. I was just trying
to And so what you find is you get that
same answer all the way up to the President saying, look,
I was voted. People voted for me to carry out
an agenda. They could have voted for somebody else. This
was the agenda. This is what I did. I did
what I thought was right. And this is the most

(39:15):
terrible truth that nobody likes to face, which is that
most people are doing their best, and the flaws that
happen are because you have different factions in society with
different interests. For example, like housing prices. Every time somebody
talks about why housing is so expensive, they want to
come up with this theory that like, there's like one

(39:37):
corporation that is secretly buying up all the houses. It's like,
no... they may be doing that. The issue is
that half the country are already homeowners and they like
the fact that their house costs twice as much because
that's their retirement. It's not a secret cabal of guys
in a shadowy room. It's an entire section of the country,
and their interests are separate from yours, and they're not billionaires.

(40:00):
They're just retired dentists or whatever. It's like, well, no,
my entire retirement is based on... I'm going to sell this
house when I turn seventy, I'm gonna move to Florida
and rent a condo. But yes, the fact that my
house costs four hundred percent more than what it did
when I bought it in nineteen ninety five... like, no, I'm...
I don't want housing prices to go down. This is
you know, this is my retirement right here. So there's

(40:21):
times when some people just want different things from you.
And if you're always trying to look for a specific
villain or a cabal or a conspiracy, you're going to
be disappointed more often than not. A lot of times
it's just people acting out of short term interests or
out of ignorance, or you know, they're just being oblivious.

Speaker 1 (40:41):
Yeah, but is there... I mean, yeah, I guess, in
that... like, that feels like sort of, like, a bleak world.
Like, how, in that instance... how would we solve
things if we're willing to always say, like, well, this
person is just trying to do their best? Not that
I think, like I get the point about like trying
to find like this cabal or like darker angle to
explaining certain things like that, but does like at a

(41:03):
certain point, like if how would that world view, how
do we try to change things like from that perspective.

Speaker 4 (41:10):
But things have changed. None of us would prefer to
go back and live in the year nineteen twenty four.
Think about what you lose if you go back. Think
about how many civil rights get rolled back. Think about
how much shorter people lived, how many more babies died
in childbirth. Think about how much more contaminated the
food was back then, and how nobody had air conditioning,
Like we have improved the world immeasurably because while everybody

(41:33):
was yelling at each other, the normal people were just
out doing their jobs and building houses and building safer cars.
And there's bureaucrats that are just quietly passing, you know,
ordinances that make things slightly safer, and yeah, the you know,
none of us would go back and live one hundred
years ago. Things were worse, I think, by every
possible measure.

Speaker 2 (41:55):
I think some of the yelling at each other is
part of things getting better, right? Like, not necessarily
yelling at each other in the public square, like on
the internet, as it tends to happen now. But I
mean that like the disagreeing and pushing for better at
a systemic level and criticizing the way the system currently
works is at least partially what drives some of that progress, right,

(42:19):
So it does, I get your meaning. But I think
taking that anger, that energy and focusing it on criticizing
how the system is actually working or failing to work,
feels like it's still pretty crucial given that model of
humanity where people are just doing their best, but they're

(42:41):
you know, their best is actually inadvertently harming other people, right?

Speaker 4 (42:46):
And that's something that comes up in the book because
the idea is and again you can disagree with this
or not, but in this particular era, in the social
media era, all of the incentives are toward keeping you
glued to a screen. So if you are arguing, the
system makes less money if the argument progresses toward a

(43:09):
resolution and a consensus. What the system needs is for
you to always be arguing, forever and ever and ever.
And that's why so much of what you're arguing about
is not something that's based in reality at all. Like, ideally,
if the systems working correctly, when COVID happened, and then
they got a sense of what exactly worked. What you know,

(43:32):
what cures worked, you know what treatments worked, how the
vaccines work. Once you arrive at that truth, it's like, Okay,
we have worked through it. We've been arguing the whole time,
but we eventually figured out we have the data. Here's
what we know. Vaccines worked for most people. You know
a lot of the other you know systems did not necessarily,
but this is this is what works. That's not what

(43:53):
you get now. You get people still to this day
yelling about, like... what was the name of that
horse paste drug that the conservatives...

Speaker 2 (44:03):
Fucking I took a bath on that thing.

Speaker 3 (44:04):
Man.

Speaker 4 (44:06):
To this day you'll get people claiming that all
of the deaths were actually from the vaccine, none of
the deaths were real, like they've got graphs showing that actually
there was no... there was no pandemic, it was completely...
That's the malfunction of the system, because that is money, that
is traffic, that is engagement. And if you're a company that
only cares that people are glued to the screen, you

(44:27):
by definition are pushing people in the wrong direction because
you're not pushing people toward a consensus. You're always trying
to invent new things for people to argue about. For example,
I can... I'm looking at my Twitter right now, and
there is a meme that somebody posted where it's a...
it's a picture of a guy with a

(44:48):
beard, and it said, hey, ladies, if your boyfriend doesn't have
a beard, you have a girlfriend. And people are yelling over
whether or not a man without a beard is scientifically
a male. Like, that's not leading anybody toward anything.
That is purely invented to get people to yell at

(45:09):
each other and stare at their screen for just a
few minutes more and nothing else.

Speaker 1 (45:14):
Sure, so yeah, right. So what you're saying is, as
it's set up, no, there's no incentive to
actually arrive at a conclusion, because the discord, the disharmony,
and the controversy is what's sort of being
brought to us. And we're increasingly motivated to this because, yeah,
there's just so many, like, engagement-type tweets that just
there's just so many like engagement type tweets that just

(45:36):
come out saying the wrong thing and being like, watch this,
I think I'm going to win the next presidential election
for the Democrats in the comments section of that tweet
by pointing out that JD Vance actually sucks. Right, sure.

Speaker 4 (45:50):
And this is I don't even need to talk about
this in nebulous terms. The way x has run, the
way Twitter has run Sinceilon Musk took it over. There
are measurable changes he made. He unbanned specific counts, all
of the policies against spreading intentional misinformation, all of that
stuff he lifted that I can watch this change happen
in front of me, because he thinks the only way
to make it profitable is to bring back the bs

(46:13):
that gets people yelling. Whereas before this they at least
felt some responsibility, like they banned Donald Trump for a reason,
like they felt some responsibility that this is the public square,
this is where news breaks. I mean, freaking Joe Biden
announced he was dropping out on Twitter. Like, that's... it
still holds that position. The former owners took that responsibility

(46:36):
seriously to some degree, but not so seriously that they
would not sell to Elon Musk, who openly promised I'm
going to trash all of that, like Google's whole thing
like that don't be evil slogan that they're like, hey,
we understand, we have tremendous power here, and we're going
to try to wield that responsibility, even if we technically
could make more money, but with nonsense and with whatever,

(46:59):
with the entertaining lie rather than getting people accurate information. Well,
it seems like with corporations, there is... eventually the shareholders
are like, hey, you're leaving money on the table. There's
a lot of money in the evil stuff.

Speaker 3 (47:11):
Why are you?

Speaker 4 (47:13):
And no one feels like it's their fault because like, well,
I'm just a shareholder. You know, they could do what
they want. But it's like, yeah, but you're saying you're
not making as much money as you could be because
you know, if you allow the BS... it is profitable, of
course it is. It doesn't cost anything to research. You're
just making it up.

Speaker 2 (47:31):
Yeah, all right, let's let's take a quick break and
we'll be right back. And we're back. And one version
of this that was kind of interesting that I noticed

(47:53):
following you on social media, Jason, is people were angrily,
very very seriously criticizing that. So somebody posted a clip
from Jason and the Argonauts and was like, Luma, the
AI like video thing basically is a game changer. Look

(48:16):
at how much better it made Jason and the Argonauts
and the like special effects, and then.

Speaker 4 (48:23):
The scene was the classic stop-motion skeleton battle, which
is one of the most famous effects sequences of all
time because of the era it came out and how amazing.

Speaker 2 (48:30):
It looks. And then it just... the video just
looks like complete shit. It's a very, like, I am
such a sucker for this very it's extremely funny. But
you had to come in and retweet it and say
this is a joke. Like the person who's posting this

(48:52):
is doing a bit. Like, look at their profile, just
look at the video. Yeah, this is not
a game changer. And people's response to you was, like, mad.
They were, I think, more mad at, like... they're like,
one person was like, if you need to explain to
people that you were making a joke, then you did

(49:13):
it badly. And it's like, no, this is.

Speaker 3 (49:15):
They did it good.

Speaker 2 (49:17):
They did a good job, Like this is actually a
very funny joke. Another person was like, it still sucks.
All it is doing is wasting resources, generating it
all for a simple joke. Which, yeah, now I get
the like AI is wasting resources thing, but like it's
I don't know they they definitely didn't get the joke

(49:39):
at first, And then I guess you were doing the
online rage equivalent of, like, yucking a yum, in that
you're, like, calming a rage buzz, and, like, that really
seemed to anger people. But yeah, that I guess that's
a new kind of thing where they're, like, angry instead

(50:01):
of happy, like instead of laughing.

Speaker 4 (50:04):
Anger is addictive, even if you're not if you've not
been diagnosed as a rage aholic, which is a term
I don't think I've heard in a long time, but
anger is addictive. It's more addictive than cocaine. It's all
the chemicals are released in your brain because that's your
fight or flight response, right, Like that's your your body
trying to charge you up to go to go take
on somebody. And it's felt as pleasure, of course it is.

(50:26):
You can see how badly people want to stay mad online.
If they're mad about something and it turns out it
was a misunderstanding, they don't stop being mad, they just
find they just find something else to put their anger.
And this is the one thing that I like, I
hate that, like the cancel culture, discourse has been seized
by the right and now you hear somebody complaining about
cancel culture, it's always a sex pervert who's got charges

(50:52):
in their past, or trying to... why should I be blamed
for that? But the one thing that I've observed with
this is that if you explain that they misunderstood, or they
took something out of context, or you showed them that
the screenshot was faked, that doesn't calm their anger. It
is only once in a blue moon that you would hear somebody say, oh,
thank god, I thought you were... yeah, okay, you're right,

(51:14):
I misunderstood. I thought you were no. Because it's all
the people that already wanted to hate you, or the
people that liked you a while ago, and that they've
reached a curve where they've decided it would now be
more fun to hate you. And it's so you can
see that this is addictive behavior. You can see that
this is compulsive behavior because it's like you sought out
something to be mad at just because you wanted to

(51:36):
feel something. And, highly unscientific, I'm not a scientist, I'm
not a doctor: I think this is a form of
self-harm that people do. I think compulsively, like,
refreshing outrage headlines and making yourself feel miserable or doomed
or scared or angry over and over again, I think
it's a self-harm behavior where you're just trying to

(51:58):
hurt yourself so you feel alive somehow.

Speaker 1 (52:01):
Yeah, I mean that there's that I definitely got easily
swept up like many years ago, like in that kind
of like that sort of social media rage bait kind
of stuff, and I could stay even when people are like, dude,
this isn't even real. I would still pivot to like,
but I actually just need to be mad is where
I'm at, and it's really I'm realizing it might not
be this thing. And having that realization helped me definitely

(52:24):
to be like, Okay, what am I? Is it because
I'm observing something and I'm trying to be objective or
is it it's again it's that sensation.

Speaker 3 (52:31):
To be like they're fucking wrong.

Speaker 1 (52:33):
I'm with these other people that are right, and like,
you know, I don't care like how much it raises
my blood pressure or my heart rate or whatever. It's
just like that sensation in some way just I needed
it for whatever reason.

Speaker 4 (52:47):
And you'll hear people say things like, well, the fact
that this could have been true is just evidence
of how evil these people are. It's just proof of
how bad they are, that we thought this was all true.
Like, okay. Because even, like, Donald Trump, you know, who
produces three or four outrage headlines a minute, people

(53:07):
will still invent stuff. There's... there's anti-Trump conspiracies
that went around that, if you try to debunk them,
it's like, well, oh, here we go, you know, there's
just another MAGA idiot. It's like, no, I'm... I'm
trying to clean up our own information ecosystem so people
will trust us, because if we don't, if we don't
clean up that stuff from our own house, then people

(53:27):
will think everything we say is just a lie. So
it's like, no, you've got to reject that. You've got
to say, no, actually, this is not what he said.
That is a thankless task, and nobody wants you to
do it, because no one is, like, happy that you
took away their anger, and they're like, oh, thank you,
I'm in a better mood. This is I was mad
all morning, but you have lifted this burden from me.

(53:49):
It is very rare to hear that.

Speaker 1 (53:51):
I mean, like, as you talk about cleaning up, like,
the information systems... is your hope... like, I mean, not
that this is, like... you're, like, wrestling with that or
something like that: I have to make sure our eco...
our information ecosystems are cleaned up. But do you also, is
there, like, a sense of nihilism that, based on, like,
the way our engagements are structured and controlled by these

(54:12):
computers and the people that are able to sort of
like, emphasize certain things, that it's, like, a futile effort?
Or is it just something that maybe, with increased awareness, we
slowly are able to sort of take some semblance of,
like, parity back?

Speaker 4 (54:27):
I would like to think so, but I think it's
a case where the technology has moved faster than the culture.
You know, our rules in our culture around things like
screen time, and you know, there was a time where
the culture didn't change for like a thousand straight years.
All the rules you had about do the chores before dinner,
do this, wash your hands, like these things were they
were established based on their lifestyle, based on the world,

(54:48):
based on what would keep everybody healthy, keep you from
getting sick, what would keep you safe. Well, now the
world changes so quick. If I suddenly had a child,
I would have no idea what age I would let
them have a smartphone, or what age I would let
them on social media. I can't give any advice on
that because I have no clue. I don't even know

(55:09):
what the science is in terms of brain development, because
I don't think we have the data. You
would need to be able to study someone through their
formative years, on and offline, and see, like, are
they seeing greater, you know, levels of stress
and lower levels of, like, self-image, that kind
of thing. In the time it takes you to study it,
the world has moved on from Facebook to Instagram, then

(55:31):
from Instagram to tech talk, and then from TikTok to
whatever it is now, and you can't like, we just
don't know. But I'm not. I'm not a pessimist or
a centic. My my whole thing. I have this weird
sort of like optimism where I just think that very
few people in the world are truly evil. I think

(55:52):
most people in the world are not very smart about
things or don't have time to think about these things.
They're busy, they're raising kids or working two shifts. No,
they've not sat down and pondered the role of freaking
TikTok in their child's life. They're worried about
trying to pay the bills, or trying to
get them to soccer practice on time. I think

(56:13):
most people are doing their best. I don't think there
are that many. There are some true villains in the world.
I've read history books, but I think that number of
people is very small. I think most people, including like
most Trump voters, I think they've just been exposed to
an information ecosystem. I think if you watched Fox News
all day, if you only followed like these

(56:33):
right wing pundits over time, voting for him seems like
the only sensible choice. I think all of us are
more susceptible to that kind of thing than we like
to think. But I think mostly it's just people not
having the right information, or not having great critical thinking skills,
or just not having the time. But I think the

(56:54):
culture eventually evolves. For example, one unexpected thing that's
happened is people have stopped having sex and stopped having kids.
Nobody predicted that, but yeah,
you can see the rates like dropping radically,
like the culture was definitely not ready for that to happen.

(57:16):
We didn't have the data showing us that, oh yeah,
if the teenagers, if you train them to just stay
home and interact over a screen, then they're not going
to be in proximity with each other. And one side
effect is you get much less crime, because if you
don't leave the house, you can't do
a crime, unless it's like some lame cybercrime, that
sort of thing. But they're also not dating as much, or

(57:37):
they're not, you know, they're not getting married. We will
have to figure out how to deal with that. But
the technology just moves so fast that the culture has
to catch up, and you can see people struggling with it.
You can see like right wing media personalities
talking about wanting to go back to like the nineteen

(57:58):
fifties, because that's the only way they can think to
turn back the clock, is to turn it back all
the way, yeah, before the civil rights movement, right? Yeah.

Speaker 2 (58:09):
Yeah, I mean I spent my adult life writing for
and interacting with the Internet, but yeah, when it
comes to my children, like my plan currently is basically
just like no Internet as much as possible, really, like,
yeah, I mean no social media, no phones,

(58:30):
like I mean, it just seems like there's too many
little shortcuts built into the system that are specifically designed
to like hijack our brain's risk reward center. But you
know, I think it's much harder. And also, yeah,

(58:50):
it's based on just like a sense that the net
result on mental health is bad. And the anecdotes about
like the people who designed all this shit being like, well,
my kids are never allowed to do that sort of thing,
like go use the products that I designed.

Speaker 3 (59:07):
What do you know about it that we don't?

Speaker 2 (59:09):
Nothing, just nothing, just yeah, you just shut up. But
I don't know. It's definitely strange times, and I don't
want to make people think like the whole book is
just like dark. You know, there's a lot of it's
very funny, very entertaining, very edge of your seat. There's

(59:30):
great observations about packing underwear before a road trip that
really resonated with me.

Speaker 4 (59:37):
Everybody packs way too much, so much underwear. Yeah,
but what if I poop myself four times?

Speaker 2 (59:45):
I'm gonna be closed with myself constantly on this road trip. Yeah,
And like how we feel about pop culture references in,
uh, pop culture, like, seems to be changing, and you
have some good, interesting thoughts on that, but you can
find them in the book. Anything else before we
say goodbye?

Speaker 4 (01:00:07):
No, just that I'm on TikTok as Jason K. Pargin,
where I have five hundred and twenty thousand followers.

Speaker 7 (01:00:13):
Yeah, I hope I don't come across as a very
elderly and geriatric Gen Xer being mad about technology.

Speaker 4 (01:00:24):
I'm on here all day, every day.

Speaker 3 (01:00:28):
I need it.

Speaker 2 (01:00:31):
Amazing. Where else can people find you? Follow you? Where
can they pre order the book? All that good stuff?

Speaker 4 (01:00:36):
The book is called I'm Starting to Worry About This
Black Box of Doom. It is up for pre-order
in all formats including audio. I do not read the audiobook.
They hired a professional to do that. But yeah, you
can just search that title wherever you like to pre-order books.
If you want to pre-order from an actual physical
indie bookstore in your city, they will kiss you right

(01:00:58):
on the lips if you do that, they will. That
would make them extremely happy. But I know that eighty
five percent of you are going to order it through Amazon,
so it's there too.

Speaker 2 (01:01:07):
There you go, And is there a tweet or some
other work of media that you've been enjoying?

Speaker 4 (01:01:13):
Uh, yeah, I had a couple. One is from the
Twitter user Peachy Black Girl, who says, we're really living in
the most difficult section of somebody's AP Government exam in
twenty fifty three.

Speaker 1 (01:01:29):
Yeah, like, explain the omni-crisis of twenty twenty four
in America?

Speaker 2 (01:01:33):
Like, there's so many, and they're so close together.

Speaker 4 (01:01:38):
Yeah, there's so many things that require so much explaining
of context. I try to compare it to if somebody, like
six hundred years from now, listened to that Kendrick Lamar
song about Drake, yeah, and then how difficult it would
be to explain because it's so full of references to
scandals and pop culture and stuff that you would have

(01:01:58):
to write an entire book. It would be like that,
every little thing that has happened, there's so much
context that's going to be impossible to explain.

Speaker 2 (01:02:08):
It's gonna be like entire.

Speaker 3 (01:02:11):
Academics.

Speaker 2 (01:02:12):
Academic careers are built on explaining it. Amazing. Well, thanks
again for joining us, great work on the book, and yeah,
looking forward to having you back on again soon. Miles.
Where can people find you? Is there a work of media you've
been enjoying?

Speaker 1 (01:02:27):
Yeah, you can find me at Miles of Gray, Twitter, Instagram, Threads,
wherever they got at symbols, I'm there. PlayStation Network.

Speaker 3 (01:02:33):
You know, it's called consistency. Baby.

Speaker 1 (01:02:36):
You can also find Jack and I on the basketball podcast
that we do. You're gonna find me talking about
ninety Day Fiancé on four twenty Day Fiancé. A
couple of tweets I like, let's see. First one is
at Cuckoo Kado, who tweeted, it feels like a lot of
women in their thirties are torn between sticking with their BF,
who's honestly not a great candidate, but he's the nominee,

(01:02:57):
or risking the chaos of.

Speaker 3 (01:02:58):
An open convention.

Speaker 1 (01:03:00):
Oh, that was like, yeah, no, it feels like similar calculus.
Another one is like this picture of JD Vance. It's
from Pod Leasia, a great account, and it
just said, I can't explain it, but he looks like
the guy at the bachelor party that doesn't know any
of the other groomsmen. Feels like big JD Vance energy. And

(01:03:23):
also very dear friend of the show, Jamie Loftus, her
father passed away over the weekend and I just wanted
to send all of our condolences collectively to Jamie. She
tweeted on the twenty-first, we lost my amazing
dad this morning. The funniest person alive, endlessly supportive, a
massively talented and hardworking writer, a true friend. I don't

(01:03:43):
know what life looks like without him. Mike Loftus, nineteen
fifty nine to twenty twenty four. So Jamie, our thoughts,
our love, all of that is with you and your family.

Speaker 3 (01:03:52):
But I just thought, if you guys.

Speaker 1 (01:03:53):
Hadn't heard, you know, maybe, you know, just send some
support, some love.

Speaker 2 (01:03:57):
Yeah, all right, tweets I've been enjoying. Taffy Brodesser-Akner tweeted,
hurt people. Hurt people is what I think as I
reclined my plane seat because the person in front of
me did.

Speaker 4 (01:04:11):
That's a perfect metaphor.

Speaker 2 (01:04:13):
Yeah, and at E R r AI, I don't know,
I don't know how we're pronouncing that, tweeted dude, don't
be a dick. It's just a straw. Come on, your
camel won't even notice it.

Speaker 3 (01:04:27):
You can find.

Speaker 2 (01:04:30):
Us on Twitter at Daily Zeitgeist. We're at the Daily
Zeitgeist on Instagram. We have a Facebook fan page and a
website, Daily Zeitgeist dot com, where we post our episodes
and our footnotes, where we link off to the information
that we talked about in today's episode, as well as
a song that we think you might enjoy. Miles, what
song do you think people might enjoy?

Speaker 1 (01:04:48):
Oh man, I mean the title. I don't know if
it was a good day, but that's the name of
this song, It Was a Good Day, parenthetical Footsteps in
the Dark, obviously a reference to the Ice Cube track,
but they're rocking the Isley Brothers sample, and it's by
that band OMA again, O-M-A, they're just like those
Brits playing live sort of hip hop instrumentals, and you

(01:05:10):
love to hear it. It Was a Good Day (Footsteps
in the Dark) by OMA.

Speaker 2 (01:05:14):
Oh my God. All right, so well, we will link
off to that in the footnotes. The Daily Zeitgeist is
a production of iHeartRadio. For more podcasts from iHeartRadio, visit
the iHeartRadio app, Apple Podcasts, or wherever you listen to
your favorite shows. That is going to do it for
us this morning. We are back this afternoon to tell
you what is trending, and we will talk to you
all then. Bye bye.
