
March 28, 2025 62 mins


This is a re-release of an episode from our second season when we spoke with philosopher C Thi Nguyen. We think it bears re-listening in our current moment.

=====

What happens when we seek simple answers in a complex world? Philosopher C Thi Nguyen takes us into the machinery of belief, understanding, and value formation, exploring how we navigate information landscapes designed to manipulate us.

Thi introduces the concept of "moral outrage porn"—representations that give us the satisfaction of moral righteousness without requiring meaningful action. We discuss conspiracy theories and his notion of "the seduction of clarity"—the powerful feeling we get from explanations that seem to make everything simple. This feeling is particularly dangerous because we're limited beings who need mental shortcuts to navigate the world.

We also tackle echo chambers and why perfectly rational people can end up in them. Thi distinguishes echo chambers (where we systematically distrust outside sources) from filter bubbles (where we simply aren't exposed to contrary views), explaining that people inside echo chambers often follow logical procedures based on who they've decided to trust. This challenges the dismissive assumption that those with radically different beliefs are simply stupid or lazy.

Weaving through discussions of game design, social media metrics, and institutional incentives, Thi reveals how our values are increasingly captured by simplified scoring systems that reshape our priorities according to what can be easily measured. The result? We outsource our complex human values to technologies and institutions that weren't designed to handle them.

Uncomfortable yet?


Content note: this episode contains profanity.

=====

Want to support us?

The best way is to subscribe to our Patreon. Annual memberships are available for a 10% discount.

If you'd rather make a one-time donation, you can contribute through our PayPal.


Other important info:

  • Rate & review us on Apple & Spotify
  • Follow us on social media at @PPWBPodcast
  • Watch & comment on YouTube
  • Email us at pastorandphilosopher@gmail.com

Cheers!


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Kyle (00:15):
Well, welcome to A Pastor and a Philosopher Walk into a Bar. Today we have a guest that I may be more excited about than any guest we've had. I'm not sure, I don't want to oversell it, but I've been following this guy's work for a long time. He writes basically all the things I wish I had written. Yeah, you said that he publishes in all the coolest philosophy journals. All of his arguments are novel and interesting and get a lot of

(00:35):
traction, and he's also a pretty up-and-coming public intellectual, so he's able to somehow get op-eds in all the coolest places as well, and he's just so interesting, I don't know. I've heard him on a couple podcasts, and he just seems like a really fun guy to talk to. So I'm really pumped about this.

Randy (00:51):
Well, now you've just set us all up for disappointment, no matter what he says. No, I've listened to a couple podcasts as well, a couple of him speaking, and, holy cow, he grabbed me within three minutes. I mean, I was in: super, super fascinating, interesting, philosophical, but really relevant stuff to our culture.

Kyle (01:10):
Yeah, so we're talking to a guy named Thi Nguyen, and he's an epistemologist like myself, except a real one, I guess.

Randy (01:23):
He works at the University of Utah and has graciously agreed to appear on our podcast, even though he doesn't really work on anything religious. Yeah, go figure. I mean, maybe he's Mormon, he's in Utah.

Kyle (01:27):
Maybe we're going to find out. That'll be the first thing I ask. Well, Thi Nguyen, thanks so much for being on A Pastor and a Philosopher Walk into a Bar.

Thi (01:39):
Thank you. It's great to be here.

Randy:
So Thi, can you tell us, just tell our listeners, about who you are, what area of philosophy you specialize in, and just your whole world.

Thi (01:49):
I'm Thi Nguyen. I am Associate Professor of Philosophy at the University of Utah. Apparently my specialization in philosophy is so weird that when some of my colleagues were trying to describe it to someone else, they just gave up in convulsions of laughter. I work in philosophy of art, so, okay, stuff I've written

(02:12):
about recently: games as an art form, gamification, echo chambers, trust, porn. Yeah, some of our listeners just did a double take. I mean, I think it's all related, and one way to put it is that I think a lot of philosophy has historically been interested in the individual out of context, and I get super interested in the individual in context. A lot of the ways that people work on the individual in context just

(02:34):
involve talking about, say, communities or governments, and I think that's right, but technology is also part of it. Technology is part of the way we communicate, and I ended up working across two fields I think not many people work in. One is social epistemology, which is the study of how people know things in communities, as groups, and the other is the philosophy of art, which to most people is a

(02:55):
weird connection. But to me, the philosophy of art has always been the study of the relationship of communication and technology, because the philosophy of art studies things like: oh, what happens when we get photography? What happens when we get film? How does each of these change the ways that we connect to each other and express things, and express subtle things? And so the last seven years of my life

(03:17):
were obsessed with trying to get a better theory of what games could communicate that was unique. That's what I work on.

Kyle (03:26):
Were you a gamer prior to that?

Thi (03:27):
Oh my god, was I a gamer? Yes. Though I barely play computer games these days, partially because I'm married with kids and I don't have time, and partially because I feel like mainstream computer games are getting better and better at the technology of addiction, which I'm sure we'll talk about. Yeah, so just for the gamers who are listening, what are, like, two or three games that you lost years

(03:50):
of your life to?

Kyle (03:51):
Oh my god, the Civilization games. Like, yeah, I'm never allowed to touch any Civilization game again. I've never played one, specifically for that reason; I know myself too well. And then, I mean, I can't really... I was really into the old-school D&D computer world, like before Skyrim.

Thi (04:12):
There was, you know, Ultima and Baldur's Gate and that era. Or anything very tactical, like XCOM, like the original. I mean, I had an original Atari 2600. But anything that involves leveling and grinding is really

(04:34):
dangerous. This sounds like a dance club to me.

Randy (04:36):
I have no idea what you're talking about. You have no idea what I'm talking about? Leveling and grinding? You lucky dog, you lucky. So, Thi, you've talked about what you've written about recently, and I've listened to a few things that you've talked about, the things that you've written, and was

(04:58):
fascinated, so let's just dive right in. You talk about porn, and specifically what I loved was you talking about moral outrage porn. So can you first give us your definition of porn, whether that's sexual or whether that's food or whether that's real estate porn, all that stuff? Give us that definition, fit our listeners in there, and then tell us about moral outrage porn and what you see, what you think.

Thi (05:16):
So this... I wrote a paper with Bekka Williams. This paper is a product of our thinking together, and the conversation actually started as a drunk late-night conversation between the two of us on somebody else's Facebook post. And I was just starting to joke around, like, you know what, we don't have a good philosophical account of porn in

(05:38):
the general sense, like, you know, food porn or real estate porn. So, my wife loves this Tumblr called Things Organized Neatly. It's just pictures, close-up loving pictures, of, like, pencils arranged by color or corks neatly stacked.

(06:00):
This is all... if I told people, like, oh, this is organization porn, everyone knows what that means, right? So it seems like there's a general notion of porn that we get. Actually, one of my favorite examples: we were writing this paper, and Bekka called me, and she was like, oh my god, I was watching Saturday Night Live. It was, I don't know, like five months after Trump had gotten elected, and one comedian was like, just stop it

(06:21):
with all this impeachment porn. And everyone laughs, because they know exactly what this means. And, funnily enough, if you look at all the discussion of porn in philosophy, the definitions of what porn is are all inherently sexual, and so we were trying to figure out: no, we know what this means. What could it be? And Bekka actually found this paper by Michael Rea, who's

(06:41):
a really good philosopher, about sexual pornography.

Kyle (06:43):
We're hoping to get him on the podcast.
Oh yeah, he's awesome.

Thi (06:47):
But his definition of sexual pornography is basically that sexual pornography is images of sexual content used for immediate gratification, while avoiding sexual intimacy or the development of a relationship. Because what he was really interested in, and I think this is deeply right, is that two people in a loving relationship

(07:10):
can exchange, like, erotic nude photos with each other, and that's not porn, right? And I think this is really deeply interesting: that mere nude sexual erotic content isn't porn if it's part of a relationship, right, if it goes somewhere. And so we were like, oh my god, we can generalize this. And so our definition of porn, of any

(07:31):
kind of porn: philosophers, we call it, you know, X porn, where X is an algebraic signifier, right? So X porn is any case where you have representations of X which are used for immediate gratification, while avoiding the costs and consequences of genuine entanglement with X itself. That's so good. So, like, food porn, right?

(07:52):
Food porn is pictures of food you use for immediate gratification, but you don't have to cook it or deal with the calories or deal with nutrition, or try to go to a restaurant. Or real estate porn: you just see images and it makes you feel good. You don't have to buy it, you don't have to maintain it, you don't have to keep it clean.

Randy (08:07):
Right. SNL had this amazing skit this last year about how Zillow has turned into, like, the 30- and 40-year-olds' 1-900 call-in line, and it's exactly what you're talking about: all the gratification without any of the property taxes, without any of the upkeep and maintenance.

Kyle (08:24):
It's porn.
This is HGTV, yeah, yeah.

Randy (08:27):
Continue, Thi.

Thi (08:28):
Oh yeah. So we were like, okay, if you have this definition... a good definition should be useful; it should help you identify new kinds of porn. And so here's one kind of porn: moral outrage porn. Moral outrage porn is representations of moral outrage used for immediate gratification while avoiding the

(08:48):
costs and consequences of actual, genuine moral outrage. Oh, so many listeners are just slayed right now. So, I mean, I think people immediately know what this is. I think in the paper we were like, maybe 50 percent of

(09:08):
Twitter is moral outrage porn. But I just want to say something. This paper of ours, and we should have seen this coming, has already been abused, and I just want to tell you about the abuse that annoys me the most, because it's the opposite of what we want to say. What annoys me the most is people who want to be like: oh, this means that any moral outrage is bad, don't express moral outrage.

(09:28):
No: genuine moral outrage is oriented at the morally terrible and the unjust. That is what gets you to act.

(09:50):
So I believe philosophers like Martha Nussbaum, who say things like: well-tuned emotions are perceptions of moral states of the world. Well-tuned anger and outrage is a rendering of how unjust the world is. That's the important stuff. The problem with moral outrage porn is that it undermines the

(10:11):
importance of genuine moral outrage, right? So either moral outrage porn is something you enjoy without taking action, or it's something that you enjoy because you want to have the pleasures of it: you simplify your morality, you get attracted to simple expressions of outrage instead of trying to figure out the nuanced, complex, nauseating thing which genuine moral life actually is.

(10:35):
So this was never intended to be an attack on moral outrage. This is intended to say something like: moral outrage is so important that the pornification of it threatens the ability to have genuine moral progress against injustice.

Randy (10:52):
Yep. And would you say a facet of moral outrage porn is the act of feeling righteous, and like my voice will be heard, without actually doing anything about it?

Thi (11:05):
Well, yeah, I mean, okay. So, again, I just want to be so cautious, because it's not that every feeling of righteousness is a case of moral outrage porn. You can be righteous because you're on the side of right. Abolitionists in the antebellum South were righteous. They should have been, and feeling good about being righteous in that case is really valuable, right?

(11:28):
It keeps you going. The worry is that if you just want the feeling of righteousness, devoid of actual moral truth, then what you're motivated to do is to find the simplest, easiest position you can that just lets you get moral outrage all the time. And, in a sense, it won't lead to action, because action is hard, right? The pleasures of simplified moral outrage are

(11:49):
easy.

Randy (11:50):
Yeah. There's this little nuance between this idea of moral grandstanding and moral outrage porn, and you'd say there's probably a difference between the two, right?

Thi (11:59):
I mean, they're really similar in a certain way. So the notion of moral grandstanding is basically using expressions of morality for social status, and the idea of moral outrage porn is using judgments of morality for pleasure. I think both of them have the same structure: what moral expression is supposed to do is track genuine morality and get you to take genuinely moral action.

(12:22):
Both of these are, you might say, perversions of morality, right? They're perversions aimed either outwardly or inwardly. But you also see the parallel problem. One of the things that really irritates me about the current social uptake of moral grandstanding is that people have used it to make accusations against anyone that

(12:43):
makes any moral claim, and they're like: oh, you're just grandstanding.

Elliot (12:47):
And it's like, no, no, no, no, no, no, no, no.

Thi (12:49):
It's not grandstanding if it comes from a genuine moral belief. It's grandstanding if it's been re-aimed at status, and it's often really hard to tell. One of the things I notice is that a lot of people who make accusations... I've seen people using this stuff about

(13:10):
moral grandstanding and moral outrage porn to attack anyone that expresses a moral stance. I keep thinking, you know what, it's really funny that people keep accusing their political opponents of grandstanding and moral outrage porn, but not their own side, and that itself is a sign that something has gone funky and wrong.

Kyle (13:22):
Yeah. So if this is too nerdy, we can take it out. But in your conception of moral outrage porn, is it deontic, consequentialist, virtue-theoretic, or none of the above? So here's a case. Let's imagine that Kim Kardashian gets really pissed off about something, genuinely morally outraged, and posts about it to her however many million Twitter followers,

(13:43):
and, because she has that many Twitter followers, actually changes the situation for the better. Is that an instance of moral outrage porn? Why or why not?

Thi (13:50):
So on this account of pornography, the essential part is a particular user's intention in taking it up. So the same thing might be porn for one person and not for another, and this is actually built into the original Michael Rea account. This is a lovely point of his: that, you know, a couple

(14:11):
in a relationship can exchange nude photos, and that's not porn, and then someone can take that out of context and use it as porn. And so the fact that someone's expression of moral outrage was genuine doesn't mean that somebody else couldn't also use it as moral outrage porn. That's the first question. We tried to write this thing so it was compatible with almost

(14:34):
any moral theory. It didn't tie you to anything. Yeah, we were trying to write it, basically, so that as long as you think that moral belief should track actual states of affairs, you should get on board.

Randy (14:49):
Okay, yeah. These sound similar to me, or seem similar to me. You've also written and talked about the seduction of clarity, and what comes from the seduction of clarity: understanding as orgasm. Fascinating stuff. Can you bring our listeners and us into these ideas and concepts?

Thi (15:06):
This is another case where there's this thing that's really good, clarity, and then people give you, I don't know, a fake version of it, and that's the thing.

Elliot (15:15):
I'm reading.

Thi (15:16):
I got really interested... So, I mean, I started thinking about this just because, when I was working on some other stuff, I spent a lot of time on conspiracy theory websites, just hanging out reading forums, trying to understand how things were working. And it seems like one of the incredibly interesting things about conspiracy theories is that they offer you a single potent explanation for everything.

(15:38):
And people say things like: it's very powerful, it's very clarifying. And I think part of the thing is... actually, I don't think the world is that simple, right, but it feels so good to have a single explanation that just takes everything into account. So I was getting really interested in how this might work, and basically my thought is something like: we're limited

(15:58):
cognitive beings, and we can't investigate everything forever, for all time, and so we need to basically guesstimate what's worth investigating and what's not worth investigating, right? Because you can't think about everything. You could go down the rabbit hole about any question forever, right?

(16:19):
So you need to make this estimate. And it seems like there's a lot of empirical evidence, in various forms of sociology, that a lot of us are using, as our heuristic for when we should stop investigating, a feeling of clarity, right?

Randy (16:36):
Can you explain "heuristic" for us non-philosophers?

Thi (16:39):
So, in many cases, getting a calculation exactly right is really, really difficult, and a heuristic is a really simple rule of thumb that gets you through things. So let me give you an example of a heuristic. At one point I used to eat like crap, and I tried to eat

(17:00):
better, and the first thing I did, which is what a lot of people do, is start trying to track the calories and nutritional content of every single thing you eat. Right, and this is death. No one can do this; people burn out. And then you start looking for really simple rules of thumb, and one I ended up using was: don't eat processed carbs.

(17:22):
Right, like, don't eat stuff that has flour in it. This is not a perfect rule of thumb, and you can move beyond it, but as a beginning maneuver it's a really useful first step. So, because we're limited beings, we need a way to quickly estimate what's worth investigating or not, because we can't investigate everything to figure out if it's worth

(17:44):
investigating. And so one thought is: we use the sense that something is clear, right?

Elliot (17:52):
So what's that?

Thi (17:54):
So at this point there's actually a really interesting amount of philosophy about this: about what actual clarity is, what actual understanding is. And one of... oh, thank you, I've just been brought a cocktail.

Randy (18:04):
Good interruptions.

Thi (18:06):
And a bottle of booze. Thank you; my spouse understands me and what I need for a podcast. So in this literature, in the philosophy of science and the philosophy of education, there's this idea that what it is to really understand is to have a coherent model that can explain as much as it can, right, that can explain as many things and

(18:27):
find as many connections between things as it can, and that can be communicated easily. So, for those of you who have some philosophy background: old-school philosophy and epistemology used to think the goal is knowledge, having true beliefs, right? And what a lot of people ended up saying was: that's not enough, because under that model you can have a ton of individual knowledge but

(18:48):
no coherency or overall picture. What do you want in science? You don't just want a bunch of true facts; you want a model that can unite all the true facts, explain them, make predictions, and connect new phenomena. So my thought was: okay, if that's what it is to actually

(19:08):
understand, what would it be like to fake that feeling? Right? How do you give people that feeling without actual understanding? What you want is to give them a really powerful model that can fit and explain all kinds of things, and handle any new phenomenon. And one suggestion is that that's why conspiracy theories are so compelling: because they're like a cartoon,

(19:31):
simplified, powerful version of understanding. It just explodes out and can give you everything. And I think it's interesting in particular because, if you believe in science and the specialization involved in science, no actual human being can explain everything. Right, and that's kind of a sad place to be in. But a lot of these conspiracy-theoretic systems

(19:55):
make it feel like everything is suddenly in your grasp and you have a model that can explain everything. So it feels more like understanding than the actual world does, because, if you believe in science, you can never actually get that feeling.

Randy (20:12):
I can be totally honest, because I've got a really thick spiritual or religious skin: would you put religions in that category as well?

Thi (20:34):
Yes. And then let me say more. It's interesting. One of my favorite undergraduate teachers was this English professor named Richard Marius, who was just a lovely guy. I remember we were sitting outside one day, office hours, talking about Thomas Pynchon, and he said: oh yeah, what Thomas Pynchon makes me think about is that there's a

(20:54):
deep similarity between the aesthetic of mystery novels and the aesthetic of religion, because in both you have all these seemingly random events, and then you find out this thing that provides a unifying explanation where everything makes sense. One caveat: it's not like there aren't true theories that make sense of a lot of things.

(21:14):
Right, that's what science is trying to get us, and in many cases the whole point is that what you want is a theory that can explain everything. So the fact that a theory has explanatory power and great unificatory power doesn't necessarily mean it's bad, right? My worry is that there are certain systems that have been optimized for the feeling of understanding and not actual

(21:37):
understanding. One thing I should say, though, that might soften the bite here: I think another place where you find exactly this effect is a lot of bureaucratic systems of justification.

Elliot (21:46):
That has nothing to do with religion.

Thi (21:49):
I see this constantly in administrative life in universities. We have simple metrics, and we're trying to create justificatory systems in which you can explain any action in terms of a couple of simple metrics. Sometimes I think there are a lot of intellectual systems whose appeal is that you can get everything from a single principle, utilitarianism, libertarianism, right, and you

(22:13):
might also make such an accusation of some versions of such systems.

Randy (22:18):
Yeah, sure. I mean, as a pastor, I will say: if you're a Christian or a spiritual person who enjoys the quick and easy answer, or your church leader has a habit of giving you a quick, pat answer for every profound question about the universe, question that. Question the motivations behind it, and question

(22:38):
your own motivation for feeling good about simple answers to complex questions, because it's usually not the truth, and there's a whole lot of mystery within... there should be a whole lot of mystery within our spirituality. So I'm fully endorsing what you're saying. We should be a little bit suspicious when we hear easy answers to really heavy questions.

Kyle (22:55):
Yeah, like, where did the world come from? It's like that old joke, right? The right answer to every Sunday school question is "Jesus." There's a reason that's funny.

Thi (23:05):
One thing I say at the end of this seduction-of-clarity paper is something like: what's the response? And it's something like: heuristics are good until people know what they are and start gaming them. For example, in our evolution we probably evolved to have an instinctual heuristic, and that heuristic is: consume as much sugar and fat as you can. And that heuristic, I think, works

(23:28):
in what the evolutionists call the environment of evolutionary adaptedness, because there's not that much sugar and fat around; if you just cram in your mouth as much fruit and animal as you can, you'll be fine. But then that heuristic gets gamed, and what that gaming looks like is Cheetos and Nilla wafers. And so I think what you have to evolve, what I have to

(23:49):
evolve, because I speak as someone that's capable of taking down a Costco-sized bag of kettle chips in a single go, yeah baby, is this: now that we know that there are people out there that are trying to game our sense of deliciousness, we have to develop something. It's not to say that deliciousness is bad, but when

(24:10):
you eat something and it's just so addictively tasty, you have to maybe be like: wait, wait, wait, hold on a second. Let me look at this bag. What's going on? I think there's something similar here. It's not, again, that things that make sense and are easy are

(24:33):
necessarily wrong. It's that we are in a world in which people are trying to game our heuristic of easiness of understanding. If something just feels good, you should immediately be suspicious. You shouldn't just accept it: the response to ease of understanding should be suspicion, in an environment where people are trying to game you.

Randy (24:46):
Yeah, yeah. Oh, man. Politically, can you imagine how much that would change things in our QAnon... whatever, I'm not going to go there. So this is kind of speaking to that. Can you put on your philosophical life-coaching hat? I don't know if you've ever put on a hat like that before, but yeah. I've heard you speak on whether or not we actually love the truth, whether or not we're dedicated to the truth, or if we

(25:08):
just want to have everything that we already think and believe affirmed, you know, confirmation bias, whether or not it's true. Philosophy is all about loving the truth and following where the argument leads and trying to get to the bottom of it. So, putting on that coaching hat, can you tell us how to actually seek and love the truth, rather than living in constant confirmation bias? How do we examine ourselves and our own motivations when we're in an environment where people are trying to give us cooked versions of the truth that go down easy?

Thi (25:37):
Okay, so let me try it this way. If you were trying to manipulate people and get them to accept what you wanted them to believe, a good strategy

(26:01):
would be to make believing your chosen belief more pleasurable and easy, right? So, given that, I think we need to be suspicious of belief systems that are pleasurable and easy, which is not to say that pleasure and ease are always false. Yeah, right: there are plenty of good things to eat that are

(26:24):
delicious, and there are plenty of truths that are incredibly pleasurable to grasp. But given that we know that there are manipulators who have a lot of motivation to get a lot of power by using the manipulation of pleasure and belief to get us on their side, we should immediately be suspicious and ask what's going

(26:44):
on. That's good. Yep.

Kyle (26:46):
Yeah. So I heard you say something similar in a different podcast interview, and I had this thought, and I'm curious what you think about it, because it kind of sums up my psychology in some ways. It seems like, on reflection, maybe the best answer, the most convincing answer, the one I feel in my gut is convincing, the one that feels clear to me, to any big question, like why

(27:10):
should I care about truth, or why should I care about a good method of gathering information, or why should I care about morality, or what's right or wrong, or what's healthy, or whatever, why should I do any of that stuff... maybe the most convincing answer to me is: because it's hard, and the other thing is easy. Now, maybe that's something peculiar to my psychology, that

(27:30):
that's what hits home for me the most. But I've heard religion summed up in that way. Kierkegaard kind of sums up religion in that way, Christianity, right: his version of it is very hard, maybe even impossible, and that's why it appeals to me. So does that say something deep about human nature, and, if so, can we weaponize it to actually combat conspiracy theories or

(27:54):
misinformation?

Thi (27:55):
I think that theory is too easy. Why do the hard thing? That's... I mean, I've actually said that to people before, but I don't... I mean, so let me...

Kyle (28:05):
Let me complexify it a little. So I don't think it's actually effective on people who just want the easy thing, but I think it's a very good way of weeding out who wants the easy thing and who actually wants the truth. Right? If you're someone who is actually drawn to the truth, or maybe has the capacity to desire it or something, coming to the recognition that this is going to be hard, and I'm going to have to dig deep to get it, is maybe the best sales pitch for

(28:28):
that kind of person.

Thi (28:29):
Yeah, maybe it is a good sales pitch, but again, in the background, I think there's a big difference between that and a theory that says all falsehoods are easy and all truths are hard. That's too easy, right? Then you find the truth by just doing the hardest thing.

(28:49):
That's a meta-easiness for someone that's trapped in some Protestant work ethic of "the hardest thing is always the best thing; be more productive, be more overworking." I don't... yeah, I don't think there's any... I mean, I think the nutritional equivalent is "eat the most disgusting thing." That's not actually...

(29:10):
There's incredibly beautiful, delicious, wonderful food that's actually deeply nutritious and deeply...

Kyle (29:16):
Yeah, but balancing it is more difficult than just eating the most disgusting thing, right? So I mean, you're still aiming at what is actually going to challenge me.

Thi (29:25):
Right, yeah. "You're still aiming at..." but that's the question: what is actually going to challenge me? I think one thing that I might accept is: given the presence of manipulators who are trying to game you, it is very unlikely that the easiest path is the right path. But that's really different from saying "always do the hard thing," right? Actually, I think the hard thing is sorting out the fact that

(29:50):
some easy things are true and some easy things are false. If it were the case that all easy things were false, then this would be trivial: don't believe any easy things. But that's too easy.

Randy (29:55):
Now I'm all wrapped up in it. This is fun; being able to sit in on a philosophy debate is very enjoyable.

Kyle (30:05):
This is what every after-conference drinks session feels like. Nice, very good.

Randy (30:08):
So, switching gears. You've spoken about epistemic traps, or epistemic filter bubbles and echo chambers, and the differences between them, and it's all very fascinating to me, especially being a church leader. So can you bring us into: what are echo chambers? In particular, what are epistemic filters and filter bubbles, all

(30:30):
the words that you use, right?

Thi (30:32):
The words are important. I mean, I started writing on this because I got irritated by the way people use words.
So, basically, for me there are two different concepts that people have been blurring together, and it's really important to keep them separate. One concept is an echo chamber, and the other concept is sometimes called a filter bubble, but I want to call it an epistemic bubble, for complicated reasons.

(30:52):
The bubble concept is the concept that most people have become obsessed with lately. A bubble is some kind of social phenomenon where you don't hear the other side, or you don't get exposed to the other side's arguments.
This got really famous from a book by Eli Pariser, The Filter Bubble, and he was really interested in the fact that, you know, if all your friends on Facebook share your politics,

(31:15):
you'll just never hear the other side. You'll never be exposed to the evidence.
Lately, people have been using the term "echo chamber" and the term "bubble" synonymously to refer to that. But if you actually look at the early research that leads to this concept of an echo chamber, in particular a book called Echo Chamber by Kathleen Hall Jamieson and Joseph Cappella, they

(31:35):
have a different concept of an echo chamber. An echo chamber, for them, is a community where people distrust everyone on the outside. And the difference between never being exposed to the ideas of people on the outside and systematically distrusting everyone on the outside is huge; these are different concepts.
So the first thing I want to say is: people blur these things

(31:58):
together a ton. And there's a lot of research that says, oh, there's no such thing as echo chambers or filter bubbles, which is all showing that actually conservatives know what the liberal arguments are, and liberals know what the conservative arguments are, and climate change deniers know what the climate change arguments are. And I'm actually fairly

(32:19):
sympathetic to the idea that there actually aren't many filter bubbles or epistemic bubbles in this world.

Elliot (32:21):
At least right now, given the media environment we're in.

Thi (32:24):
Most of us know what the other side's arguments are. I'm progressive; I know what Trump's arguments are. We just inherently distrust the other side.

Randy (32:32):
Yeah.

Thi (32:33):
It's that we think that the other side is systematically
biased.

Kyle (32:38):
So what's a runaway echo chamber?
Oh, a runaway echo chamber.

Thi (32:43):
A runaway echo chamber is a case where the following happens: you pick all your advisors based on your estimation of who's expert or good. But if your notion of who's expert or good is flawed, then you're going to pick bad advisors.
For example, if you're a white supremacist, you're going to pick moral advisors who are other white supremacists, and

(33:05):
they're just going to confirm your white supremacy, right?
Similar thing, actually. I think people found this interesting, although it has a lot of political implications: I actually started thinking about this by thinking about art. I was interested in artistic echo chambers because I was in one, and my version of this was, I was raised on

(33:28):
European classical music, and all the people that I trusted were people who were good at European classical music, right? And all of them thought rap was shit. So I grew up having no ability to understand rap, and also having picked, because of my classical background, only advisors who thought that European classical was the highest form.
It turns out that not only is rap amazing, but part of the

(33:52):
problem is that rap is rhythmically complex in a way that is kind of skew to the complexities of European classical, so if you're raised in European classical you won't have the rhythmic skill to hear what's going on in rap. But you can maintain your belief if everyone you trust is like, "oh, that rap stuff is crap, don't spend any attention on it," right?

Kyle (34:08):
So there's a point about epistemology here too, yeah. So let's keep with that analogy: what brought you out of that echo chamber? What was it that enabled you to appreciate rap?

Thi (34:20):
That's an interesting question. So I think in my case there were two things that happened, and I think this can be generalized.
One was: at some point I looked at my shelf of music and I was like, everyone here is white; there's probably something wrong with me. I'm Vietnamese.

(34:41):
This is like... and it's important that I am from the kind of Vietnamese who were wealthy enough to go to French schools and had a conception of French culture as... And it's probably not that white people are just better at culture; there's something, some systematic racial bias, that has gotten into your education.
The other is someone I trusted.
I met someone who knew a lot of classical, and they were like, "no, you should listen to rap; listen to this." And they had really complex, subtle views about classical. So I trusted them, and they got me to pay attention to rap.
And I think that's a similar thing across all echo chambers. One of the interesting things is, when you see stories of

(35:37):
people leaving echo chambers, it does seem to be because of personal relationships of trust.
There it is.

Kyle (35:43):
Yeah, and this, honestly, is the most terrifying fact about echo chambers to me, because it's not a scalable solution. To depend on the patience and dedication of someone outside your chamber, who has also taken the time to understand your chamber, is not a scalable solution. We're not going to fix climate change if that's what it

(36:05):
takes.

Thi (36:06):
Yeah, I mean, since I've written this stuff, people keep asking me: what's the large-scale policy solution? I don't know. Maybe there's not one.

Kyle (36:23):
Yeah, no seduction of clarity there.
So let's talk about how rationality works in an echo chamber. Because one of the things that you've pointed out beautifully, that I try to convince other people of all the time, including in a talk that I just gave yesterday, is that being in an echo chamber does not make you irrational, and that writing off people as stupid or lazy or irrational or

(36:47):
uneducated, or fill in the blank of your favorite easy dismissal, is just going to... For one, it runs afoul of the evidence, but it's also just not going to be helpful with any social problem facing us. But it's the easy thing to do, right?
So explain to us how it's possible for an echo-chambered person to be acting rationally.

Thi (37:09):
To understand this, we have to get rid of this profoundly false conception of how knowledge works that seems to affect a lot of us but shouldn't. And that profoundly false conception is that we're capable of knowing everything we need to know individually, that we have the ability to know everything that matters on our own.
That's the ideal of intellectual autonomy, and that's

(37:29):
just obviously false. In the current era, with the proliferation and sheer size of science, no human being can master even one millionth of the amount of knowledge that's out there.
I mean, I had a kind of conversion experience because of a book by a philosopher named Elijah Millgram called The Great

(37:49):
Endarkenment, and the book's argument is basically that the essential epistemic condition of our era is that knowledge is so hyper-specialized that every single genuine practical conclusion comes from crossing so many fields that no person can actually master the whole argument.
Right? Chemical engineers trust statisticians, who trust physicists.

(38:13):
Right?
There are these huge, long chains of trust that run totally out of our control.
I think the way that we start our lives intellectually is that we end up trusting other people about tons of things. We trust large institutional structures. I trust my doctor with my life.

(38:33):
Literally, my doctor says "take this pill," and I'm like, okay. I don't understand any of it. And not just that: if I asked my doctor to explain, my doctor probably can't explain all of the chemistry involved behind that and all the statistical modeling behind that.
So, my wife is a chemist, and I asked her, you know, how much of chemistry can you explain? And she was like: look, I understand maybe one hundred-thousandth of

(38:56):
chemistry, right? Like, here are neighboring fields of chemistry I know nothing about; it's just so complicated. I just know my little patch of chemistry. I'm good at reading organic chemistry on one set of instrumentation. That's my specialty. It took me ten years to learn.
So our essential position is one in which we're born into the world and we have to start trusting people

(39:17):
without understanding them, right? We can't monitor and check all the people that we trust. We trust large-scale institutions.
Like, I mean, I believe that climate change is real, but can I give you the evidence? Nope. It involves... I actually did this as an experiment once: I checked out climate change models, and I can understand like

(39:42):
maybe the first hundred words. It's complicated statistical modeling of meteorological events, and I'm like, I've got no clue, right?

Elliot (39:52):
Why do I trust them?

Thi (39:53):
I trust them because they're professors from places like Princeton and Yale who are published in Science, right? So what I'm trusting is large-scale institutions.
So if you grow up with your trust settings set to the wrong large-scale institutions, you can go through the same procedure, using the people that you trust and the knowledge gathered from your trust networks to check on new things.

Kyle (40:18):
Yeah.
So, for the average person who believes that, you know, the election was stolen from Trump and handed to Biden, how can that be rational? What large-scale institutions are they trusting, in a way that's kind of blameless?

Thi (40:35):
Yeah... this is the point where I genuinely don't know if that position is blameless. I can imagine blameless positions. At that point you might start to worry about how much blatant counter-evidence is available that people are dismissing.

Kyle (40:48):
On the other hand, you tell another story.

Thi (40:50):
That story looks like: a person might arrive at the view that large-scale mainstream media is corrupted and that only a small news source is to be trusted, and that's not an inherently false view. We know of plenty of positions like that. I mean, people always make fun of philosophers for talking

(41:10):
about Nazi Germany, but imagine you're a resistance fighter in Nazi Germany. Right? There, it's true that most news sources are corrupt and that only a tiny, tiny fraction of the news sources are trustworthy. That's an available position.
But yeah, I don't know if I'm willing to say that that particular case is one in which there's a clearly rational

(41:31):
procedure to enter.
I mean, it's just so hard. Because... let me think about my views about the importance of vaccination and the deadliness of the COVID epidemic, which are denied by the other side, right? My views come from believing in the deadliness of COVID and the effectiveness of vaccines.

(41:51):
But where do I get that information? I get that information from the New York Times and the New England Journal of Medicine, right? I haven't collected that for myself. That information comes from a pre-existing set of trust in a large-scale set of institutions. So I don't know.

Kyle (42:09):
Yeah. So this will be my last question about echo chambers. Do you think there is some kind of moral flaw somewhere in the causal history of all echo chambers? Is there a liar back there somewhere? I don't know; I suspect not.

Thi (42:25):
That's not necessary, right? You can get echo chambers without that. All you need is someone to generate a plausible-sounding explanation that's a little too easy, right? All you need is someone to get too excited by an easy explanation and then come up with one and be convinced by it, and conspiracy theorists and whatnot.

Kyle (42:45):
And there's this... I don't know, this method, this tendency of manipulation that a lot of them have, that seems remarkably effective, that I want to hear you riff on a little bit.

(43:06):
So I don't know if you could call it just information overload or what, but they have this way of just dumping really expert-sounding information in front of you and then kind of making it appear as though the ball's in your court. What are you going to do with that, right?
And so either you just give up, because you don't have the time to engage or whatever, and of course that's going to make it

(43:28):
seem like you lost the argument, right? And everybody else watching is going to be like, oh look, our guy won, or whatever. So they make the costs of engaging too high, which makes all the reasonable people self-select out, and you're just kind of left with the people who can't see through it, or are too lazy to see through it, or something like that.
So do you want to talk about that at all?

(43:50):
I don't know. Like, what can the average person who encounters something like that do about it?

Thi (43:56):
The thing you're talking about, I think, is a really interesting and subtle strategy, and I was working it out with a guy named Aaron who runs a podcast called Embrace the Void. We were talking about, basically, the philosophy of spamming. The essential idea of spamming is that you are generating content, and the cost of generating the content is really low for you while the cost of dealing with it is quite high,

(44:19):
yeah.
So I think one bad-faith debate strategy is to just generate an incredibly large number of theories and responses that would cost a huge amount of energy to reply to, but which you can generate easily. I think that's a really hard debating strategy to deal with, and one of the reasons we want good-faith

(44:43):
conversations is because in good-faith conversations we don't spam each other, right? We don't just overload each other. And I don't know how you deal with a bad-faith arguer who's spamming

Kyle (44:54):
you. Or even as somebody who just wants to... You're a good-faith arguer; you know the other side is a bad-faith arguer. But you also know there's a large audience here that has a number of good-faith arguers in it, or at least a number of good-faith observers, and you want to communicate with them, because you know you're not going to communicate with the other side. What's the strategy for dealing with this kind of information

(45:18):
dump?

Thi (45:21):
I honestly don't know. If I did, I would tell people, but I don't.

Randy (45:25):
Yeah. Damn it, Thi, we want that.

Kyle (45:27):
We want the seduction. In my younger days, you know, when I was a certain kind of fundamentalist, I would engage, and I would do my best to humiliate the other person, and I would chase down every rabbit hole. And the last person to give up wins, and I was the last person. Like, I get the last word, and therefore the people watching think that I won, and that vindicates my view.

Thi (45:50):
One of the things that's interesting... so, just a background thought: a lot of the stuff I've been working on in this space is about the problem of expert recognition. This is a problem that's as old as Socrates, right? If you're not an expert, how do you pick the right expert? In particular, how do you pick a real expert over someone who's posing, trying to present as an expert?

Kyle (46:08):
The sophist.

Thi (46:09):
Yep, the sophist. And a lot of the work I do is based on pessimism about this problem; I don't think there's a good solution to it. I got into it partially because of this work about how juries respond to expert witnesses. And it turns out that, you know, juries tend to treat as experts the people who say clear, unqualified,

(46:32):
confident things. But actual experts are often like, "these things are super complicated." They qualify things. They say that things are unsure for this or that reason. And because of that, juries typically treat them as not expert, as not knowing what they're talking about.
So I mean, I think part of the problem is: if your audience members are already inclined to take clear, confident statements

(46:56):
as signs of expertise, and worrying and fussing about details as not, then you're already fucked. I don't know what to do about that, except if you can somehow teach people ahead of time that clear, confident statements are not actually inevitable signs of expertise. I don't actually know how to do that. Maybe more philosophy classes.

Randy (47:17):
Nice. So, changing directions: you've done a lot of research on the philosophy of games, and you've tossed around the word "gamed" as a verb. Can you tell us what the philosophy of games is, what gamification is, and why you're interested in it? And after that, I'll ask you: what do gamification and QAnon

(47:39):
and conspiracy theories and cults have in common, and how do they gamify things?

Thi (47:44):
I mean, I literally just wrote a book about this, and you're like, "quickly tell me about it."

Elliot (47:48):
Okay, I can do this in 30 seconds. Okay, really briefly.

Thi (47:51):
What is the philosophy of games? So, I got into this question partially because I was irritated at people who were talking about video games as an art form as if they were a kind of movie. And I was like, games aren't just a kind of movie; they're something special, they're different. So I went down a rabbit hole for five years, and I ended

(48:12):
up with this theory. And the theory is that games are unique as an art form because they work in the artistic medium of agency itself.
What a game designer is doing is not just making an environment or telling a story; a game designer is creating an alternate self for you to be, an alternate agency, designing that agency, and then you pick that agency up and enter into it.

(48:34):
So part of what that is is that the designer gives you abilities.
I mean, I think everyone recognizes that a game designer tells you: oh, you can run and jump, or, oh, you have a portal gun, or, oh, you can trade money or bid. But most importantly, a game designer tells you what to want in the game.
So I got this idea from one of my favorite game designers,

(48:54):
Reiner Knizia. He's this German board game design genius; he's been called the Mozart of game design. And in an interview he says the most important tool in his toolbox as a game designer is the point system, because the point system tells the players what to care about, right? It tells the players whether they are cooperating or on a team, or against each other, or trying to

(49:14):
collect money or trying to kill each other. Right? That tells you what to want.
As a philosopher, though, when you hear something like this, you're like: holy shit, that's right, a game does tell you what to want. That's the core of the theory.
Right: a game designer is not just creating a world, but creating a self with an alternate value system, one that cares about competing or killing, or building an efficient
(49:37):
railway network or collectinggold.
So the basic theory is that gamedesigners, through the point
system, specify an alternatevalue system and you enter into
it and this gives you certainpleasures, and one of the
biggest pleasures for me is thatgames give you a sense of value
, clarity, that in our normalworld values are complex and

(49:59):
plural and nauseating and unclear. But in games, for once in your life, you know exactly what you're trying to do, you know exactly what counts as success, and you know exactly where you stand. So yeah, that's great in games. That's fantastic in games, because in games this is a temporary system: you step into it for a moment and you step back.

(50:19):
So now let's move to gamification. A standard view in the industry is that games are good, so gamification is good. Gamification is any process where we take a kind of normal activity and then we add points and levels to it. Right? Fitbit gamifies fitness, Duolingo gamifies language learning, a lot of educational software.

(50:40):
So an early gamification I had as a child, I think it was in my elementary school: we got a certificate for a free Pizza Hut pizza for every 500 pages we read.

Randy (50:55):
Yes, I did that too. Okay, that's a lot of pages, right? You're getting points and clear levels and clear awards. Those personal pan pizzas paid off.

Thi (51:04):
So here's a worry; here's my worry. Gamification increases your motivation by simplifying the value system. You're not reading for pleasure or richness or curiosity, but just for the numbers. Pages.

Kyle (51:16):
You're probably going to read pulp, because it's easier to get the page count up. Right, right.

Thi (51:21):
I mean, if you want pizza, you should read the dumbest shit possible, right? So, similar thought with Twitter. There are all kinds of complex values for communication, but Twitter doesn't measure all of them; it just measures short-term popularity.
So the worry with gamification, for me, is that in a lot of gamified systems you get an increase in motivation, but you get it for

(51:45):
being pegged to, and allowing yourself to be motivated by, a simplified value system, where that simplified value system often has to meet the requirement of being easily instantiable in a mass-producible technology. Yeah. Twitter can't measure empathy or understanding.

Randy (52:05):
It measures people punching the like button, which is much simpler. Yeah. So now I feel like almost everything that we've been talking about is crashing down upon itself, with, you know, the seduction of clarity and moral outrage porn and gamification. All of it happens within social media in really potent ways, and I'm scared to ask, because I don't really want to change my

(52:26):
social media habits, but can you just... how is social media messing with and fucking up our brains?

Thi (52:33):
Right, let me go simple, and then I'll go philosophical. The simple version is simply that it is capturing your motivations and redirecting them along pre-established lines.
So, by the way, this is not a guarantee; it's not like this will happen instantly if you put on a Fitbit or start using Facebook. But insofar as

(52:54):
you're motivated by likes, then you are now motivated to aim at a pre-established value system, one that is there partially because it's easily instantiable in a mass technology.
So in the games book, I call this phenomenon "value capture," and value capture is any case where your values are rich or

(53:16):
subtle or in the process of becoming more developed, and you get put in a place where the world gives you a simple, often quantified value system, and the simple version takes over.
Since I've written the book, I've been working more on this stuff, and I have a better way to put it. I think what's going on is you're outsourcing your value system. You should be figuring out, in

(53:37):
response to the rich emotional experience of being in the world, what you care about; but instead, you're outsourcing what you care about to Facebook or Twitter.
And I just want to be clear: there's a way of putting this that's technophobic, and that's not what I mean. I think these are just new wrinkles on something that's been going on for a while.
So I think one of the best and most empirically well-studied

(53:59):
examples is the coming of law school rankings and university rankings in US News and World Report.
Wendy Espeland and Michael Sauder have a book, Engines of Anxiety, that I think really has a lot of carefully researched empirical evidence for what I would describe as people outsourcing their values about their education and their

(54:21):
career to US News and World Report. And US News and World Report is really insensitive to any particular person's cares about their life or legal education. It just tracks a few simple, easily accessed data points.

Elliot (54:35):
Yeah.

Kyle (54:35):
Yeah, and easily manipulated data points too, right? The last university I taught at limited the class size to 15 students, but only in the fall, because that's...

Elliot (54:45):
That's the only time US News and World Report looked at it.

Kyle (54:48):
In the spring, it was double, right?

Thi (54:50):
Yes, filthy gaming. If you read this book, it's just horrifying. So, one of the main things US News and World Report tracks is the employment rate at the nine-month mark after graduation. And so law schools started telling their law students to take any job, including at a nail salon, nine months out, because it would make the ranking go up.

Randy (55:09):
Yeah. So good universities are changing their best practices and their value systems in order to rank higher on

Kyle (55:15):
US News and World Report, even though it might be a shittier method of education. And not even just US News and World Report, which is at least an organization that hires people who try to be professional. They outsource that shit to, like, one guy who has a blog that's influential. You know who I'm talking about.

Thi (55:35):
For people not in philosophy: a lot of philosophy, for a while, was ruled by the equivalent of US News and World Report, which was just one dude ranking journals and universities, and which became incredibly influential. There's a footnote in my paper about this, by the way.
I mean, I think you see the same thing in academic research with things like citation rates and impact factors.

(55:56):
I think it's really broad, and I just want to say that if you read stuff on the history of bureaucracy and quantification, it should be clear that this isn't just social media. Social media is one instantiation of a really long trend towards hyper-simple metrification. I mean, other examples that might be familiar to people:

(56:18):
grade point averages, right? Grades.

Randy (56:24):
Yeah, you're right.

Kyle (56:26):
Or, to bring it home: church.
Well, yeah, I was just going to say, for us...

Randy (56:29):
For you churchy people listening: gamification happens in churches. I mean, the way that many churches measure success is by number of baptisms, because number of baptisms equals the number of people who follow Jesus.
And so, literally, churches will have a baptism service and they'll have people planted there who've already been baptized and already know, and they'll say: when we invite

(56:51):
people to be baptized, you stand up and come down to get baptized again, because that'll motivate everybody else to go get baptized. And instead of saying "following Jesus is a lot of hard work; that's going to be a lifelong thing," we just want you to get dunked so that we can have it in our stats on our website: this is how many people we've baptized this year.
Oh my God. It happens. It happens.

Kyle (57:11):
I had no idea. The church I grew up in had a sign hanging on the wall with numbers on it.

Thi (57:16):
Yeah: this is the number of people that committed this week, this is what we got in the collection plate last week.
Yeah, that is amazing. I mean, another version of this is: get any journalist in a room and start asking them about clicks, and how trackable clicks have completely changed everything about journalism.

Kyle (57:36):
Yeah, I mean, we're on a fucking podcast; we care about that. Trackable downloads are a big deal to us.
Yeah, people ask. So, we've kind of already tackled this a little bit, but just segueing from my comment about church there: if you have any insight about this at all, great; if not, no worries. I don't know how much religious epistemology you're familiar with, but do you think, as an expert on echo chambers, that there's anything...

(57:58):
Is there a good explanation of why religious people seem to be so prone to them? And there's some empirical data on this too, right; that's not just my sense of it. I just came across a paper the other day, the title of which was "Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism, and reduced analytic thinking." So there's a pretty good amount of empirical data for this too.

(58:21):
So, any insight as to why that might be, that religious people are particularly prone to living in echo chambers?

Thi (58:28):
I should offer a proviso here, or a qualification, which is: I'm not a religious scholar. Anyway, I don't know; this is a raw guess. I think for many people one of the appeals of religion is having a comprehensible understanding of the world.
So one of the interesting things for me, thinking about conspiracy theories, is the way they're like a parody of

(58:53):
certain scientific and Enlightenment values. This is a thought from Elijah Millgram, the philosopher who got me into this stuff about the size of scientific knowledge: the whole thing that started science was this Enlightenment ideal of intellectual autonomy. We should all be thinking for ourselves, not trusting, not just taking things up from others. What it created is this world with so much information that no

(59:15):
one can think for themselves and everyone has to trust this vast realm of experts. And one of the appeals of a certain kind of conspiracy theory is that you get to throw away the experts, and you get to put it all back in your head again and explain everything from something that you can hold.
And I think, not all, but from my experience, some religions, or at least some expressions of some

(59:37):
religions, offer something like a complete, holdable explanation of the world.

Kyle (59:44):
But I want to push on that just a little bit, because in both cases that's an illusion. The average QAnon conspiracist or the average vaccine denier can no more explain to me the deep state, or what precisely is wrong with the Pfizer vaccine, than I could explain general relativity. We're both accepting that on authority, so it's just an

(01:00:05):
illusion that I've got this simplistic explanation.

Thi (01:00:07):
Yeah, I mean, that's the point.
It's an illusion, and if you're attracted to that illusion, and
that illusion is behind a particular brand of religiosity,
then certain conspiracy theories, which offer another version
of that illusion, should also be appealing.

Randy (01:00:22):
Sure, yeah, yep. So, Thi,
you've referenced a number of books already.
Where can we find your stuff?
We're going to put links in our show notes to your books, but
what's the easiest way to find your stuff?

Thi (01:00:33):
My website is objectionable.net.
There are links there to all my papers.
My book is called Games: Agency as Art, and I'm on Twitter still
as @add_hawk ("ad hawk"), and you can find me in any of
these places, along with a lot of increasingly weird papers.

(01:00:55):
I've been getting emails from people saying they're finding my
papers more and more disturbing.

Kyle (01:01:00):
Like the newest one on transparency.
What is that about?

Thi (01:01:03):
It's called "Transparency is Surveillance," and it's the
claim that institutional transparency is also a form of
monitoring that undermines expertise and trust.

Kyle (01:01:12):
Okay, I am immediately suspicious, so I'm going to go
read that.
Excellent, awesome.

Randy (01:01:19):
Well, Thi Nguyen, this has been super fun, hilarious,
and really insightful.
Really appreciate you spending time with us. Thank you so much.

Thi (01:01:26):
Thanks for having me.

Elliot (01:01:26):
It's been a good time. Thanks for listening to A Pastor
and a Philosopher Walk into a Bar.
We hope you enjoyed the episode and, if you did, please rate
and review the podcast before you close your app.
You can also share the episode with friends or family members
with the links from our social media pages.
Gain inside access, extra perks, and more at patreon.com

(01:01:47):
/apastorandaphilosopher.
We're so grateful for your support of the podcast.
Until next time.
This has been A Pastor and a Philosopher Walk into a Bar.
Bye.