
November 11, 2021 · 43 mins

Dope Labs is BACK and WEEKLY! And so much has happened since our last Lab, it’s hard to even know where to start. So we’re kicking off Semester 4 by talking about science denial with Dr. Gale Sinatra and Dr. Barbara Hofer. Their book, “Science Denial: Why it Happens and What to Do About It,” explores the psychological issues that keep folks from having a broad understanding of science.




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, y'all, we are finally back. Oh my goodness, it's
been too long, man. So much life has happened, So
much has happened, so much science has happened. Yes, so
much non-science has happened, honey. And that's why we
had to come back. I felt like when the world
needed us the most, we disappeared like Avatar Aang.

(00:22):
It just was not right. Well, we are finally here
to make it right. I'm TT and I'm Zakiah and
from Spotify, this is Dope Labs. Welcome back to semester

(00:56):
four of Dope Labs. We have missed you so much,
but let's just jump into it. For the uninitiated, Dope
Labs is a weekly podcast that mixes hardcore science, pop culture,
and a healthy dose of friendship. This week, we're talking
all about something that's been heavy on our minds and
hearts for the last few months, science denial. But before

(01:20):
we get into that, we have some exciting news that
we want to tell y'all about. You thought you couldn't
get enough before? Well, we're gonna find out. More, more, more,
more Dope Labs, more Dope Labs, more Dope Labs! Hey,
y'all asked for it. No more of the bi-weekly stuff.
We're gonna be in your ears every single week. And

(01:41):
that's not the only big change that's happening. Semester four
is coming exclusively to Spotify for free starting December sixteenth.
So if you already listen to us on Spotify, keep
doing what you're doing, and don't forget to follow Dope
Labs and tap that little bell icon so you never
miss when a new episode drops. Now, after December sixteenth,
you won't be able to hear new episodes of Dope Labs anywhere else.

(02:03):
So if you don't listen to us on Spotify yet,
be sure that you go ahead and make that change.
Spotify is where you can listen to Dope Labs plus
all your other favorite shows for free. All right, Z,
I hope you're ready. Let's start the show. We're starting
off this new semester with a real banger. This week,
we're talking all about science denial. This has been a
huge topic, especially when it comes to the pandemic, but

(02:26):
we've also seen a lot of science denial reports in
recent years with other issues too, TT. Like climate change? Absolutely,
we're really passionate about scientific information and combating science denial
in general. I mean, it's why we started the podcast. Yes, yes,
we want science to be accessible for everybody, and part
of that means having good information and the right tools

(02:47):
to make decisions, especially when it comes to your health.
So we really wanted to understand science denial, its history,
and the motivations behind it. And trust us when we
say this issue is not as simple as it might seem.
So let's get into the recitation.

Speaker 2 (03:01):
But it's a sham Sam Sam Sam, shams Sam, But
it's a sham sham.

Speaker 1 (03:12):
If you're new to Dope Labs, we typically structure our
episodes into three main parts: the recitation, the dissection, and
the conclusion. The recitation's at the beginning of the lab
where we get everybody on the same page and define
what we know already and what we want to know
by the end of the episode, right, And that's followed
up by the dissection where we answer those questions in
the recitation, where we talk to our guest expert and

(03:34):
learn all the information we do our deep dive during
this part of the episode, and then we get to
the conclusion where we put a nice bow on everything.
We kind of round up everything that we have learned
throughout the rest of the lab and talk about any
conclusions that we can make. All right, So for this
episode about science denial, what do we know? Why are
we talking about this? Well, I feel like science denial's

(03:57):
on the tip of everybody's tongue right now because of
the virus spreading that shall not be named. No, we're
in the middle of a pandemic. So this is a
new experience, a new shared experience for all of us.
And so there's a lot of people who are very confused,
who are trying to get up to speed with the
science around viruses, virus transmission, vaccines and everything like that.

(04:23):
And there's a lot of fear, absolutely, I think in
addition to all those things you just said and fear,
there's a lot of information of varying quality and truth
being spread. If you're trying to make some decisions, it's
hard to know who or what to believe, and you're
just constantly bombarded with information. Yeah, and then we also
know that science denial is affecting very specific communities more

(04:48):
than others. The other thing I want to know is
where do we draw the line from skepticism to denial,
Because I feel like a healthy dose of skepticism is good, right.
I think that helps you have like really great conversations. Right,
But there's like this really thin line where things start
to go left. Another question that I have is what
can we do to check in on ourselves. I'm not

(05:10):
coming from a place where I'm on a high horse.
How do I check in with myself? How do I
check myself if I'm falling victim to that? I think
all of those are really good questions. Let's jump into
the dissection. Our guests for today are doctor Gale Sinatra
and doctor Barbara Hofer.

Speaker 2 (05:27):
My name is Gale Sinatra, and I'm a professor at
USC, the University of Southern California, in the Rossier School of Education.

Speaker 3 (05:35):
And I am Barbara Hofer. I'm recently retired from Middlebury College,
so I'm professor emerita, and that's in Vermont.

Speaker 1 (05:42):
Doctors Hofer and Sinatra published a book earlier this summer
called "Science Denial: Why It Happens and What to Do
About It." Their book explores the psychological issues that keep
folks from having a broad understanding of science. It also
offers solutions for those wondering what they can do to
help curb the spread of misinformation. And when we say
we want to know about science denial, what we mean

(06:04):
is we want to know why people may flat out deny,
or maybe just a little bit doubt or resist scientific
fact or general scientific conclusions. What's keeping them from accepting
what's already been proven. It can feel really easy to say,
not me, I'm not a victim of science denial. But
it's not just accepting big issues like climate change or

(06:27):
understanding that vaccines work. It could also be how you
decide to take risks, or if you choose to buckle
up in your seat belt even though you know it
can protect you in a crash. I think it's also
these smaller nuanced things in our day to day lives
as well. And everyone is susceptible, all of us, even
the people that you know have been highly trained in
the science field. We can all be a part of

(06:49):
that group. And I think that is something that folks
have to understand. What I still want to know a
little bit more about is that difference between skepticism and
stepping all the way over to science denial.

Speaker 3 (07:00):
We want people to be skeptical if you see one
study with a small sample and there's some clickbait headline,
be suspicious, be skeptical. That's the time to question it
if it has not been substantiated, corroborated, supported with additional studies.

Speaker 1 (07:15):
TT, you posted something the other day and I was like,
spot on, Oh no, what did I say. I've seen
a lot of crazy stuff on Twitter. You said it's
been a year and a half or a year and
nine months. If you're still doing your research, what kind
of research is it? Yeah? I just feel like people
are still saying I'm doing my research on coronavirus. I'm like, hello, yeah,

(07:37):
you doing research? You're kind of just not doing anything
and being stuck in your thought process, which I understand.
This is a big topic to chew and swallow. Yes,
especially if you have to get all the background skills,
if you need to understand virology, immunology, molecular biology, vaccine design, sociology,

(07:57):
human behavior, risk management, that's a big mountain to climb. Yes.
And you know, we've talked about skepticism. We mentioned it
a little bit earlier, but I think there's a difference
between skepticism of information that you know, you don't know
where the source is, it's just tumbling down your feed
versus the Johns Hopkins University Bloomberg School of Public Health

(08:18):
telling you that the cases are rising in your area,
and you're like, hmm, I don't believe what they're saying.
That's not just skepticism right there. So how do we
identify science denial?

Speaker 2 (08:27):
But you don't see people who are very doubting and
resisting science, hesitating to use an iPhone or get on
a plane. They're not denying physics, they're not denying the
technology that goes into Wi-Fi. So it is this
phenomenon of selective denial, which really is driven by your motivations,

(08:52):
your emotions. So you're picking and choosing what you like
about science and what you don't like, and science doesn't
work that way.

Speaker 1 (09:00):
That's such a good point. You know, science does not
care about your feelings. It's not about our opinions or
what we want to be true.

Speaker 2 (09:06):
It's about what the evidence suggests is our best understanding
of the science at the time.

Speaker 1 (09:11):
Yes, it's so important to remember that science is backed
up by research and evidence. For example, with masking and vaccine,
scientists are doing studies to see how effective those measures
are and then creating guidelines accordingly. And yes, these guidelines
can change as the evidence changes as we learn more.
But we'll talk a little bit more about that later.

(09:32):
But I think we should start with the history of
science denial. Tt let's rewind a little bit. Have we
seen science denial before in different forms? And how did
we get here?

Speaker 2 (09:42):
The history of science probably starts with science denial, doubt
and resistance.

Speaker 3 (09:49):
We try to trace it back to Galileo and you
think about how he was under house arrests for the
beliefs that he had, how long it took for people
to accept his theories. Think about Darwin, It took more
than a hundred years for scientists to accept fully
what he was proposing in the way of evolution.

Speaker 1 (10:05):
So, for real, it feels like science denial has been
going on since the beginning of science itself, and in
the last fifty years it's become more pervasive as there's
been some outside meddling, so corporations realizing that fostering some
science denial could help their bottom line. It all goes
back to the money.

Speaker 3 (10:21):
Beginning with the tobacco industry, for example, who were interested in
trying to deflect the idea that somehow it was cancer-causing,
and they hired PR firms to sow doubt, and the
same PR firms are being used by Exxon and other corporations
to make it look as though climate change isn't a
certain fact.

Speaker 1 (10:38):
In fact, even as recently as twenty ten, Philip Morris
routinely argued that Marlboro Gold cigarettes actually decrease the
risk of cancer. That's wild, but that brings us to today.
With a global pandemic and a steadily warming planet. It
feels like people are holding their noses up at scientific
evidence left and right. So this has made me ask
is there an increase in science denial?

Speaker 2 (11:01):
I think the difference that we see is the amplification
of misinformation through social media, and that's coupled with us
living in our information bubbles where we get the same
information, and if it's misinformation, that same misinformation is reinforced over
and over again and it becomes more credible. There's the

(11:24):
joke that misinformation travels around the world before the truth
gets up and puts its pants on. Misinformation is really compelling.
It's sometimes interesting or intriguing or even funny to some people,
and that gets the clicks. And as we know the
way the algorithms are shaped, that more clicks gets more attention.

Speaker 1 (11:45):
We've talked about algorithms on social media before. What goes
viral isn't always true. It really helps us understand why
it's so important to talk about science denial right now.
So when you think about that amplification and what we
know about the brain, and the more you see something,
the more is reinforced and you begin to believe it.
I think all that makes sense in the current context.

(12:06):
Sometimes people who are science deniers go overboard and say
I'm just waiting for the science. Well, part of the
science is assessing risk. Early on and even later in
the pandemic, was people outright saying no to mass like,
it's not gonna keep you one hundred percent safe. Well, ma'am,
if it's going to keep you ninety percent safe, I'm
gonna say, that's still useful, right, And I think that's

(12:29):
the part that we start to see this kind of
doubling down on. I'm so scientific. I know ninety is
less than one hundred, but I think you also know
ninety is higher than zero. You know, It's like if
you look at the forecast and it says there's a
seventy percent chance of rain, you see that and then
you're like, Okay, let me take my umbrella just in case. Right,
this is the same thing. You don't say, I'm not
gonna take my umbrella because it's not one hundred percent

(12:51):
chance of rain exactly. So why don't you apply that
same logic to masks. Now that we have an understanding
of what science denial is, we want to understand what

(13:14):
is causing people to flock to science denial. Let's get
into the reasons. Doctor Sinatra and doctor Hofer outline five
explanations for science denial, doubt, and resistance. The first is
mental shortcuts and cognitive biases, second is understanding beliefs on
how and what you know. The third is motivated reasoning,

(13:35):
fourth is social identity, and the fifth is emotions and
attitudes and not attitude like the keys attitude, different attitude.
The first explanation is mental shortcuts and cognitive biases. Right.
Cognitive biases are kind of these mental gymnastics that we
do so we don't have to run through all the

(13:56):
processing every time. Yeah, so our brain is learning along
the way. You know, A equals Z, and you don't
have to do ABCDEFG all the way through. But sometimes
these brains can trick us, and they learn something early
on and they reinforce it over and over again. We're
going to talk about that in a later episode Mind
Over Matter. But you know, one type of cognitive bias

(14:16):
is known as confirmation bias.

Speaker 3 (14:18):
Confirmation bias is this implicit tendency to seek, recall, affirm
things that already fit with your existing beliefs. So everybody
who's listening can probably think of a time when you
googled something to find an answer. You already thought you
knew what the answer was, and you're quick googling. As
soon as you find it, right, you think, okay, that
supports it, but you don't search laterally across to see

(14:41):
if it's confirmed or if there's anything that contradicts it.
That's confirmation bias.

Speaker 1 (14:46):
I think we all can remember stuff that we saw
on the early Internet or like heard through the grape
vine at school. Do you remember me sharing with you
on Twitter where this guy said that he found shrimp
tails in his Cinnamon Toast Crunch. And I was like, huh,
I don't doubt it, because you know, a long time
ago I saw this thing that said that like up
to ten percent or something like that of cereal product

(15:08):
could be unknown material. And as soon as I said
it to you, I was like, hmm, let me check that,
because I was like, I have never heard this. I
don't believe that. And also, my today-years-old
brain knows that ten percent is a lot. I've eaten
a lot of cereal in my day. I've never seen
anything strange. That's confirmation bias, you know. I think we've

(15:29):
been trained to always look for a countering point, make
a liar out of me, make me wrong. That's how
my Google searches look. I think. The other piece of
this right, So, if we think of these mental shortcuts,
the second arm of this is just how we think
about knowing and learning in the first place.

Speaker 3 (15:47):
Another chapter that we have is on what psychologists call
epistemic cognition, So it's what people believe about knowledge, how
they think they know. And one of the issues is
epistemic trust. Who do we trust as a source of knowledge.
One of the things we talk about in the book
are reasons why some people might not trust the medical community.

Speaker 1 (16:06):
This feels so relevant TT, especially in the face of
people deciding whether they trust or don't trust the government
and regulating organizations like the FDA, and even when we
see these organizations overstepping each other, just like we see
the CDC overruling the FDA, who is our regulatory agency,
and the CDC is saying, yes, everybody should have a
booster shot, right. I mean, when you see stuff like that,

(16:27):
how do you know who to trust? Because they both
are organizations that we look to for the facts, and
especially after seeing such political influence within those organizations, it's
hard to know. Hey, if it was susceptible, then is
it susceptible now? Is it still unbiased? You know, it
makes it really hard. We see the same thing with

(16:47):
people being skeptical of mainstream media or which news stations
they go to for their information, and it's concerning because
the information is not the same. And we talked about
this in an article that we wrote for Scientific American.
You know, the roots of folks' distrust of the scientific community,
the medical community to be real, run from forced sterilization to
the latest evaluations of disparities in health. Yeah, I mean, historically,

(17:11):
bad things have happened to minoritized folks and to poor folks,
and now that leads to poor outcomes for those people.
It's embedded in the system, and it feels like a
snowball effect because it's self-perpetuating. So you have folks
who go to receive medical treatment and receive subpar
care. That subpar care translates to terrible outcomes.

(17:31):
And when they see that terrible care and terrible outcomes,
the other people that are on the periphery, you know,
family members, children, parents, they then say, I will not
trust the medical system, and so they don't go get
any type of preventative care, or maybe they don't have
access to preventative care, and so then they continue to
present with medical issues that are at much later stages
and then they get poor care then, or even if

(17:53):
they get good care, then they still have poor outcomes. Right, Yeah,
it's a vicious cycle of things. We even see things
like that present day because I know that there are
probably some people who think that's old school medicine, no sir,
But when you think about the care that Serena Williams
had when she was giving birth to her child, she
almost died, right. She kept communicating that she was in

(18:14):
a lot of pain, but she wasn't being believed. And
that is something that studies have come out that have
said there are a large group of doctors who believe
that black people have a higher pain tolerance, and so
they're treated differently, exactly, treated differently from top to bottom.
So that means that black people are less likely to
get pain medication. It's not even that you can earn
enough money to move you into a different economic class

(18:35):
and that protects you. It's about being black, even if
I go to the best hospital. Look at Serena Williams,
a world class athlete, the Serena Williams, so many grand slams,
all of that, and she's still a victim of this.
And so when you consider this right, it makes sense
that people would have this mistrust or this hesitancy or
resistance to information from the medical community, or the scientific community,

(18:59):
or even the government.

Speaker 2 (19:00):
We also hear people say ask your doctor, as if
everyone has a doctor they can just get on the phone.
Do you have access, do you have a relationship with
the doctor, do you know who you can go ask?
Not everyone has that kind of access. Some people have
hypothesized that Great Britain has had a larger percentage of
people vaccinated because they have a universal health care system

(19:23):
and everyone knows who their doctor is, and everyone knows
where they can go. And here, people don't necessarily know
where to go, and they don't necessarily have good

Speaker 1 (19:33):
access. Preach, doctor Sinatra! And that makes all the sense, right?
Along with this historic and current difference in treatment for
different groups, there's also the matter of access that you overlay.
And we've heard a lot of things around vaccines where
people are saying, oh, wow, well, people just get vaccinated.
I'm like, hey, it's a little deeper than that. You know,
it's not just am I going to go do this thing.

(19:54):
I think that that's something that scientists, scientific communicators, and
folks in the medical community need to take into account
when we are communicating with folks who are skeptical or deniers,
is that it's not coming from a place of misinformation.
It's coming from real, lived experience, a real place, and

(20:14):
it should be respected as such. And TT you hit
the nail on the head saying that science, communicators and
organizations need to consider who folks trust right and what
their lived experiences may be.

Speaker 2 (20:26):
It's also about trust, So you trust people you identify with,
and then you have mistrust for people you don't identify with.
So while it's hard for us to understand why somebody
would take a livestock dewormer rather than a vaccine.

Speaker 1 (20:44):
That's right. Folks have been taking ivermectin and that's a
drug that's typically used as an antiparasitic dewormer for
livestock.

Speaker 2 (20:52):
It's about where they're finding that information. They don't trust
the voices that talk about the safety and efficacy of
the vaccine, but they are trusting people who say there's alternative
mechanisms, medications to treat COVID, which have no basis. But
they're hearing this information from people they identify with and

(21:12):
that is who they trust.

Speaker 1 (21:15):
So someone in your community who you trust says something
but there's no supporting scientific evidence, that can still sway
people to action or inaction. We saw that with Nicki Minaj.
What people grabbed onto and ran with is Nicki didn't
take the vaccine and didn't go to the Met Gala
because of it, and then she starts talking about some

(21:35):
cousin's friend who has swollen testicles. And that kind of
misinformation is so dangerous because people won't do their due diligence.
They're going to say, I love Nicki, okay, I love Roman,
and they will run with that information and they'll say
that's all I need to know.

Speaker 3 (21:51):
We have realized that nobody trusts just one person. We
all have multiple people in our worlds that we trust,
and doctors and pastors, for example, can be very influential
in terms of the vaccine.

Speaker 1 (22:02):
And this brings us right back to that algorithm problem though, right,
because if the multiple people you trust are all in
your bubble, they're all seeing the same shared misinformation, then
it feels like everybody you trust is saying don't get vaccinated.
The problem then is when people like I know somebody
who didn't get vaccinated. They got COVID and they were
really sick and they were in and out of the hospital,
but then they wrote this really like cryptic post about

(22:24):
maybe you should get vaccinated. I'm not going to tell you
who to believe, this and that, but I had this
terrible experience. You think they got shared like all of
their other misinformation? Do you think they came with that
same hot fire? No? No, And part of that may
be that it wasn't shared because other people have their
own what we call motivated reasoning behind what they will
and won't share or what they will and won't believe.
And doctors Sinatra and Hofer told us that motivated reasoning

(22:46):
is another explanation for science denial.

Speaker 2 (22:48):
Motivated reasoning is that you can either reason towards what
we call an accuracy goal, like in other words, you
want to find out the accurate information, or you can
often subconsciously reason towards a desired conclusion. So that comes
into play when you are weighing information that you've read online.

Speaker 1 (23:11):
Doctor Sinatra gave us an example of motivated reasoning around
stem cell therapy's potential to help with Parkinson's disease.

Speaker 2 (23:19):
So perhaps you have a friend who has Parkinson's, and
so you read articles about whether stem cell therapy can
help with Parkinson's. You may be overly enthusiastic about the
potential for this therapy and you may reason that it's
great when it may be only okay or even not great. Conversely,
if you have concerns about the use of stem cells

(23:41):
and you question where they come from and you're wondering
if they've been used ethically, and then you look at
a stem cell therapy online, you may reason that, oh,
this stem cell therapy isn't any good, it doesn't work
at all. So that's a motivated reasoner. Whether you're reasoning
too positively or too negatively, based on wanting the
outcome to go towards what you're already believing.

Speaker 1 (24:04):
That's a really good point. It almost feels like how
you do those Googles, you know, if you're already deciding something.
As we start typing into Google, Google starts
to guess what you want to type. And if Google,
which it does, knows like your search history, it's collecting
all this data from your emails and all these things
like that, it'll probably lead you to the exact place

(24:25):
you're looking for, the exact answer that you want answered
in the exact way that you want it answered to
confirm your thoughts. Another psychological challenge that can lead to
science denial is related to our social identity.

Speaker 3 (24:38):
We are all tribal people. We all belong to certain groups,
and we draw our identity from those groups. And when
the groups believe certain things, we tend to believe certain things.
It's a shorthand for thinking about what to believe without
even maybe looking into it in a lot of depth.
So if you think about the things that many people
believe right now, about whether, for example, the vaccine causes infertility,

(25:00):
which it does not, we know that conclusively. But if
people have heard that on Facebook or heard it from
their friends or their neighbors or their identity group, they
go online, it's not hard to find confirming evidence for
that and just quit without looking at the fact that
there is no scientific evidence behind it. And so we
have seen some serious tribalism around science denial in ways

(25:23):
that shock even us who have been writing and thinking
about this for a long time, of looking at the
degree to which people will think, this is what my
people believe, this is what I'm going to believe. And
we were both dismayed to find that in Missouri last
week there were people wearing disguises when they went to
get vaccinations because they didn't want people they knew to
see them, violating the values that they had upheld that

(25:46):
masking was bad and that vaccinations were unnecessary.

Speaker 1 (25:50):
You know, this reminds me of and it goes right
back to Missouri. There was this state representative, Bill Kidd,
and he had written this post. He said, no, we
didn't get the vaccine. We're Republican. That's like a social
identity thing, right. Yeah. I wonder if there was any
other time in the history of this country where things
are so strongly tied to a political affiliation where you

(26:15):
can guess someone's stance on a medical issue outside of
abortion based on their political party. That's wild to me.
I think the thing that we both understand, and we're
seeing more and more people start to understand, is that
all of this relates to emotions and attitudes and feelings.
A lot of times, as scientists, we're trying to just
look at the facts and only think about the facts,

(26:37):
and we think of people as these vessels that we
just pour the facts into. Okay, now, they got it.
But what we know is how we feel and our emotions,
they affect how we understand and feel about scientific evidence
when it's presented to us. Right, And that's the fifth
reason for science denial.

Speaker 2 (26:51):
Our emotions are part of how we think and reason,
and they have to be You can't put your emotions
in a box. But you have to use your
emotions in service of good thinking and reasoning, and you
have to be thoughtful about that. So you can't let
your emotions derail a good reasoning process. So if you're
too anxious about climate change, for example, you can shut

(27:15):
down and not want to engage. And if you're too
angry about climate change, maybe contributing to a change in
how you'd have to live your lifestyle. You also shut
down and don't want to engage. So you have to
think about your emotions and how they're affecting your thinking
and then use them in service of your thinking.

Speaker 1 (27:38):
Yes, and is it just me or does it feel
like it could apply to many areas in our life
and not just science denial. It sounded like doctor Sinatra
was preaching a little bit. Maybe it is a read. Okay,
you already know some of y'all just got your edges
snatched and you don't even realize it. Check the mirror.
Are you bald? So let's take a break and when

(28:02):
we come back, we'll get into some of the solutions
for challenging science denial. We're back and we've been talking

(28:28):
to doctor Gale Sinatra and doctor Barbara Hofer about their
fascinating new book. It's called "Science Denial: Why It Happens
and What to Do About It," out now from Oxford
University Press. In the first half of the dissection, we
learned what science denial is and what it isn't. Just
to recap, we went through five reasons for science denial:
mental shortcuts and cognitive biases, beliefs on how and what

(28:50):
you know, motivated reasoning, social identity, and emotions and attitudes.
So now let's get into the solutions. What can we
do about it?

Speaker 3 (28:58):
Often the solutions are talked about as though it's one
on one individuals making change in their own thinking, and
it's more than that. We need solutions at a higher level.
And for example, a couple of years ago, Twitter started
responding: if you tried to retweet something that you had
not even opened, you just liked the headline, you'd get
a little message back that says, would you like

(29:19):
to read it first? And that moves people from system
one to system two thinking in that moment.

Speaker 1 (29:25):
Nobel Prize-winning psychologist Daniel Kahneman talks about system one
and system two thinking in his book "Thinking, Fast and Slow."

Speaker 3 (29:33):
So system one is that very quick intuitive response, that
is that gut level confirmation bias, for example, and system
two is the slower, analytical, thoughtful aspect of the mind.
And a lot of the times we're operating on system one,
and it works for us.

Speaker 1 (29:50):
A lot of times we're using system one, and that's okay.
You often need to make fast decisions, and you don't
need to tire your brain out over and over. So,
for example, if you're driving and you need to make
a split second decision, System one is your go to then,
But it's.

Speaker 3 (30:03):
Not a great thing when we're trying to figure out
should I inject bleach into my system in order to
address COVID. Do some more work. Don't just do it
because you just found it online, or some friend said
it to you, or you saw it on Facebook.

Speaker 1 (30:17):
Instead, slow down, yes, absolutely, take a beat and really
look for substantial evidence. Like it does not serve you
to get to the answer quickly if it is the
wrong answer. So this is great to think about in
this kind of System one versus System two. And it
seems like, you know, Twitter and even the things on

(30:37):
Instagram that say this is about vaccine blah blah blah,
those things are prompting system too, trying to get you
to engage more analytically. I mean, it's great to see
this kind of stuff on social media and where information
is being shared, but it still feels like there's a
lot we can do as individuals to combat science denial
as well. Yeah, and one of those things is practicing
more balanced and informed research, especially when you're doing your

(30:59):
own research.

Speaker 2 (31:00):
Do your own research means google it. For most people,
can't go do research on ice cores or ocean acidification.
That's just not going to happen. So when we say
do your own research, it's really not realistic because you
really can't dive into the research the way the scientists do.
You look for information online and you have to be

(31:25):
very discerning. That takes time, it takes effort, and you
have to know what you're looking for, what to be
aware of, for example, the source who paid for this research,
who's sharing this information, and to be able to evaluate
that takes a lot of awareness and education.

Speaker 1 (31:46):
The whole point of googling something is to get answers quickly.
When you think of it that way, it's kind of
counterintuitive to slow your brain down and really approach a
subject analytically. And that's okay if you're looking for the
best fall boot right, But I think when it comes
to making big decisions about your health, that kind of
quick judgment is not going to serve you well. One

(32:08):
of my favorite things to do when I'm really trying
to get knee deep into the information is scholar dot
Google dot com. For peer reviewed research. Yes, you know,
when we think about it, that's what these PhDs are. Well,
at least a large part of it is in research
and the ability to look for information, judge it, combine
it with other pieces of information to figure out what

(32:30):
the landscape is and to say, here are some of
the holes or here are some of the unknowns, and
knowing whether or not you have the tools to answer
some of those questions. That's always what I say. Is
One thing that I learned from getting a PhD is
that I don't know anything. I'm skeptical of anybody who
thinks they know everything about a topic. I establish myself
as an expert in a very specific field. People come

(32:51):
to me and they ask me questions, and I feel
absolutely confident saying I don't know. That's one of my
favorite answers. But the next is saying, how do we
get to the right answer?

Speaker 3 (33:01):
Right? Like?

Speaker 1 (33:02):
I don't know? But what questions can we ask? Yes?
Like and we can do that together Dope labs. You know,
I think this really makes me think about how we
teach people to ask questions and even what we teach science,
as I think so often science is taught as this
series of facts, and the truth is that it should

(33:22):
be more of kind of probing questions, right to understand,
to find the boundaries of what you do and don't know,
like you just said. And I think that's been a
lot of the conversation, like, Oh, we've been lagging in
STEM and science education for so long. Is science education
the answer to all of this? I don't know. I
think maybe it's just a piece of the puzzle to
getting us to a better place.

Speaker 2 (33:42):
We would argue, yes, let's improve science education, but you're right,
it's not just about more science content. What we think
students need to learn is more about how science is done,
the process of science. For example, at the beginning of COVID,
information kept shifting about masks and whether to wear them

(34:03):
or not, and whether you could contract COVID from touch
and surfaces and whether you had to spray down your groceries.

Speaker 3 (34:10):
They didn't understand that this is what scientists do. They
chip away at a problem, they work on it, they
try to corroborate what they know, and that this has
been done very very well in this period of time.
But a lot of people have dismissed science because they think, ah,
what do they know? They just keep changing their minds.

Speaker 2 (34:28):
But in fact, the strength of science is that it
does change based on new evidence, and I think we
have not taught that enough.

Speaker 1 (34:37):
Absolutely, as doctor Sinatra and doctor Hofer explain, it's also
about educating people on how science and the scientific process
actually works. And by the way, that's also why we
decided to structure this podcast the way that we do. Yeah.
I think we're constantly asking new questions and taking in
the information we have and saying, what kind of conclusions

(34:57):
do we come to based on what we learned, and
what else do we see that we don't know? You know,
often our conclusion is just more questions. And I think
we've also seen this over and over again during COVID, right. Yeah,
if you think back to the early stages of the pandemic,
people are like, we just want something to make this over,
and it's like, oh, hey, we have vaccines, and then
folks are saying, I don't know if I'm going to

(35:18):
have a vaccine. And then now people are saying we
should get a booster, should we not, whether it's effective,
who should get them? You know, I think we're constantly
just collecting data. We're seeing what's happening in other countries.
But we're also seeing that there are some things separate
from just the hardcore science, but around social interaction and
behavior that make some things transferable to the United States
and some things are not, you know, And all of

(35:41):
that is part of that reiteration, right, and that constant
morphing of science, of everybody bringing things in and some
people saying, oh that's no good, toss it out. You know,
the quality is poor there. All of that is the
constant proofreading and editing of the scientific narrative, I think.

Speaker 3 (35:57):
And then the research that Gail and I have each
done independently and coincidentally, we've discovered that students are overly
schooled in the scientific method. They think that every scientist
does this controlled experiment with a hypothesis and a control group,
and so as a result, they dismiss some of the
findings that require more abstraction, more inferential reasoning, more observation. So,

(36:22):
for example, climate change is really confusing to people. Like, Well,
how do they know? They didn't do an experiment.

Speaker 2 (36:28):
Well, that's why I think some people really were taken
aback when the science changed so quickly about COVID, because
perhaps they were taught that here's a textbook full of
facts about science, and they're the same textbook we use
five years ago and nothing's changed. Then this is how
science is. And of course science is not a collection

(36:49):
of facts. Science is a process. Science is an
approach to evidence. It's an attitude, as Barbara said, and
we need to teach it like that. People understand that,
of course science changes. Of course there's new information, and
you can use a scientific attitude in your day to
day life.

Speaker 1 (37:09):
TT you always say this, You've got to be willing
to change your mind. Yes, you've been talking about scientific
attitude all this time, and I really believe that for
most people, the hardest part is unlearning. Yes, going into
something feeling like you know something is a fact and
then finding out that it is not. Unlearning that fact.
It's really really difficult. I think that's something that's hard

(37:31):
for everyone. But you have to be open to the
idea of unlearning. And once you are open to it,
then you can really enter into these conversations and say, Okay,
I'm open to having my mind changed because new
information comes in. And the last piece of the puzzle,
beyond organizations and individuals, is science communicators, researchers and professionals themselves.

(37:56):
We need to open up the scientific community and make
it more accessible to everyone.

Speaker 2 (38:01):
We have too many scientists who just talk to each other,
who publish in journals that only other scientists have access
to, they're behind firewalls, and then when they go to
talk to the general public, none of us humans can
understand them. So we need to do a better job
training our scientists to be science communicators. We need to

(38:23):
develop their ability to communicate better about their work. Dope
Labs is an excellent example of what we can do,
which is make science more accessible to the general public.

Speaker 1 (38:39):
Yeah, I think we have a lot to do as
scientific communicators. We do a lot of work with this show,
trying to bring science to the people and do it
in a way that makes sense for everyone, in a
way that's fun for us and you know, hopefully fun
for everybody else to listen to. But I think that
for such a long time, the way that science was
communicated, it was communicated in a way to big up

(39:01):
the scientists. But now we're finding that that does not
serve the people. No, and we do science in order
to advance our world, and if we don't include the
people we are trying to serve as scientists, what is
the point? If we.

Speaker 2 (39:18):
In education don't do a better job promoting digital literacy,
algorithmic literacy, critical literacy so that we can have critical
thinkers and students in K through twelve and higher education
who can evaluate evidence and think critically about it, then
we're going to continue to have these challenges.

Speaker 1 (39:41):
So we're trying something new. Every now and then, TT
and I will share one thing that we either came across, experience,
want you to experience, or know about in our lives. TT,
what's your one thing this week? So my one thing
this week is that I actually saw on Instagram that
Jordan Peele was selling the get Out screenplay with all
this extra information and the entire script, and so I

(40:05):
jumped on that asap and it's really really cool. It
has some words from Tananarive Due, who I know
you're a big fan of, and then we get
some extra context from Jordan Peele. There's a section in
the back that has deleted scenes, so it lets you
know like what they were thinking about adding but ended
up on the cutting room floor, And there's an alternative

(40:28):
ending that's at the very end. So I'm really looking
forward to reading this and just seeing all the little
notes from each scene that made Get Out become what
we know it as today. Awesome. I didn't even know that
was happening. What's your one thing? My one thing is
really based on preparing for this lab. When I started
reading Science Denial, I really became interested in what I

(40:51):
considered irrational behavior, and so I picked up a book
that was already on my shelf. It came out in
two thousand and eight, but it felt so timely and
felt like it read me for filth. Okay, Predictably Irrational
by Dan Ariely, who is actually at Duke right now.
Was he there when we were there? I don't think so. But it's
like behavioral economics. It helps us understand why we do

(41:13):
some of the things that we do, and how we
actually are irrational, and we can predict some of our
irrational decision making. I love that. Okay, So when you're
finished with your book, I'll give you the Get Out
book. We'll do a book exchange so that I can
swap my book with yours, and you'll have all my
notes and highlights. I love that. That's my favorite thing.

(41:43):
That's it for Lab thirty seven. If you have some
other stuff to think about, some more questions, please be
sure to call us at two O two five six
seven seven zero two eight and tell us what you thought.
Or give us an idea for a lab you think
we should do this semester. You know we like to
hear from you. That's two O two five six seven
seven zero two eight. If you love today's episode, there's so much

(42:03):
more for you to dig into on our website. There
will be a cheat sheet for today's lab, additional links
and resources in the show notes. Plus you can sign
up for our newsletter check it out at Dope labspodcast
dot com. You can find us on Twitter and Instagram
at Dope Labs Podcast, and TT's on Twitter and Instagram
at dr Underscore t Sho, and you can find Zakia

(42:25):
on Twitter and Instagram at z Said So. And don't
forget to follow Dope Labs on Spotify and tap the
bill icon so you never miss when a new episode
drops. Special thanks to today's guest experts, doctor Gail M.
Sinatra and doctor Barbara K. Hofer. Their book Science Denial:
Why It Happens and What to Do About It is
available now from Oxford University Press. Check out IndieBound dot org,

(42:47):
where you can find your nearest independent bookstore and pick
it up. Dope Labs is a Spotify original production from
Mega Ohm Media Group. Our producers are Jenny Radelet Mast and Lydia
Smith of Wave Runner Studios. Editing in sound design by
Rob Smerciak, Mixing by Cannis Brown. Original music composed and
produced by Taka Yasuzawa and Alex Sugier from Spotify. Our

(43:10):
executive producer is Gina Delvac, and creative producers are Baron
Farmer and Candace Manriquez Rinn. Special thanks to Shirley Ramos,
Yasmin of Fifi, camu Elolia, Till krat Key and Brian
Marquis, executive producers from Mega Ohm Media Group. Alright, it's
TT Shodiya and Zakiya Whatley.