Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to Stuff You Should Know, a production of
iHeartRadio's HowStuffWorks. Hey, and welcome to the podcast.
I'm Josh Clark, and there's Charles W. Chuck Bryant over there,
and we've got the scoop. Jerry's around here somewhere, and
(00:21):
this is Stuff you Should Know. Off to a great start.
She's in hers, she is, she's got this remote thing
going on. Uh, it's like the COVID special, that's right. Uh.
And this has been what I've been wanting
to do since two thousand sixteen, and, well, it seems
(00:43):
like the fire kind of went down on it and
now the fire's back up again. In election season,
I thought no better time than to talk election polling
and this weird sort of black magic, which is really
not black magic at all. And it was, um, the
polling wasn't even really that off. No, it was great.
(01:06):
There was a furious, we'll talk about it in a second,
but there was a furious reaction by the media that just left
polling and pollsters out to dry, saying like, you're terrible,
your whole craft is useless. As the pollsters went
back after election night of two thousand and sixteen, which,
by the way, was a bit of a surprise to
(01:26):
everybody involved. Um. Yeah, when the pollsters went back and
looked at their stuff, they said, wait a minute, no,
this is all fine. It was you guys, the media,
you screwed up. You don't know what polling is or
what it does, or how to talk about it, most importantly. Yeah,
and then you, the public, you have no idea what's going on.
You just see some percentages and you automatically leap to
(01:49):
some conclusions, and this is way off. So in part it was
that the media was misrepresenting it. Some polls weren't very good.
And then, um, the public in general just needs to
be a bit more educated on statistics to understand what
they're hearing. And that's what we're here for. Because I
took statistics three times in college, the same course
at Georgia. At Georgia, I took one of those classes.
(02:11):
I hated it, intro to statistics, right? Yeah, boy, I
hated the class, the professor. Finally I walked up to
her on the last day of the third time and was
like, please, and she bumped my D up to a
C and I never looked back. They say
you have a one in four chance, and you're like,
but what does that mean? Right? What is four? But
(02:31):
so if I can understand this after doing some research,
then anybody can understand at least the gist of it
enough to understand polling and not be taken in by
bad representation of what poll results are. Yeah. So, if
you remember, in twenty sixteen there were pollsters saying, or, I'm sorry,
and I'm gonna say that wrong over and over again,
you had media saying that Hillary Clinton is going to
(02:54):
win in a landslide. Um, she's got a chance to
win, some set it as high as ninety-five percent. She's gonna
win the popular vote by three percentage points. Um, all
the battleground states in the Midwest, um, she's
gonna win those narrowly. And it did not work out
that way. And like you said, there was a furor
over how could everyone be this wrong with the polling.
(03:17):
And there's a man named Nate Silver, who everyone probably knows
at this point, who has made his name as a
data specialist and runs the FiveThirtyEight blog, and he said,
you know what, um, polling is flawed. And that's probably
the first thing that everyone should understand, is all polling
is a little bit flawed. Um, state polling is
(03:39):
definitely a little more flawed than national polling. But here's
the deal, everybody, these polls from twenty sixteen were not only not
so far off, but historically, dating back to nineteen
seventy-two, they actually performed a little better than a
lot of elections. Yeah, and the state polling, while worse
than average, wasn't far off from the average error rate.
(04:02):
So what do you, what do you want? So there's
a lot of stuff, like we said, there was
a lot of post mortem that was done on the
two thousand and sixteen polls and what was gotten
wrong and what was gotten right, and we'll talk about
that later. But um, the point is that overall
it wasn't that far off. And so the idea
isn't that the polls failed, that there's something inherently flawed
(04:24):
with polling, or that there's even something inherently wrong with
the media. Like, I want to go on record here,
especially in this climate, the media is not our enemy.
Like, any healthy democracy needs a vital, robust, independent media
that's as free from bias, as objective about reality, and
as judicious as possible. But there's also such a thing
(04:47):
as a twenty four hour news cycle, and you've got
to fill that. And that's given rise to opinion
news and pundits and, um, basically trying to capture
as much market share as possible, which is definitely the
wrong track for media in general. But I just want
to go on record, while we're gonna be kind of
beating the media up a little bit, that does not
mean that the media is inherently flawed or evil or
(05:09):
seeks to, um, to kill you and your
family and your family dog. So Silver goes back and
a bunch of people go back and look at, um,
history and kind of what went wrong here in twenty sixteen as
far as the polling goes. He says, you know what,
we went back for the past twelve presidential cycles, since
nineteen seventy-two, and he said the polling error was
(05:31):
four point one. He said that in twenty sixteen national polling error
was three point one, so technically, by a full point,
it was a full point better. He said, uh,
we predicted that she would win the popular vote by
three percentage points. She actually did win the popular vote
by two percentage points. Um, the state polls were the
(05:52):
real difference maker. They actually did underperform, at a five
point two error rate, and that doesn't sound like that much.
I think the overall error rate for state polls since
nineteen seventy-two is four point eight, so four point eight versus five point
two doesn't sound like much. But if you're talking about
a percentage of error in just a handful of swing states,
that can make something look like a landslide even though
(06:14):
you lose the popular vote. That's exactly what happened, right,
That's exactly what happened, because you gotta remember Trump didn't
win the popular vote, he won the electoral college, and
it came down to those swing states. But the fact
that they were off just by point four points from
the average for the error rate um goes to show
(06:34):
you just how close that race actually was, which again
is the opposite of how it was being broadcast throughout
the election. It was supposed to be a landslide, like
Hillary Clinton might as well just be like taking measurements
for curtains in the Oval Office right now. Like, it
was just that settled. So it was presented one way,
when in reality, if you really looked at the polls
(06:56):
and the polling results, if you looked at them with
a sober face, it was a much closer race than
it appeared or than it was being broadcast. I haven't
had a sober face since that night. So, uh, we
should talk about the margin of error, um, in polling.
Anytime you see a poll, they talk about
(07:17):
the margin of error. It's usually plus or minus
three or four, and that is on each side, so
for each candidate's poll, uh. In other words, it could
be a potential, like, seven to eight point swing and
still be within that margin of error. So when Trump
is winning states by a point two percent margin, or
a point five, or a point seven percent margin, that's well,
(07:40):
well, well within the margin of error, right? Right.
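To put rough numbers on that, here is a minimal sketch of the textbook margin-of-error formula for a polled proportion, assuming a typical sample size of a thousand respondents (the episode doesn't give one):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95 percent margin of error for a polled proportion p from a sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

n = 1000                       # assumed sample size; the formula is worst-case at p = 0.5
moe = margin_of_error(0.5, n)
print(f"n={n}: about +/- {moe * 100:.1f} points per candidate")

# A small reported lead sits comfortably inside the combined margin:
# 48% vs 46% is a 2-point gap, while the plausible swing is roughly twice the margin.
print(f"combined swing: about {2 * moe * 100:.1f} points")
```

With a thousand respondents that lands right around the plus or minus three points they mention; smaller samples push it wider.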
So um, that margin of error, by the way, is
just built in. We'll talk about it a little more
in a little bit, but it's like, there's just no
way around it. To get around any margin
of error, you would have to literally go through and
interview every single voter in America and then compile the
(08:01):
evidence, or their data, perfectly, without any miskeys
or anything like that. And it's just impossible. So everyone
accepts that any poll is going to have a margin of error,
but you want to keep it within plus or minus
three points. Right, or four. Yeah. So, a little history
of polling. Um, we've always been, um, pretty spellbound
(08:25):
by polls in this country. We put a lot of
stock in polls, especially in the presidential race. Um, the word
straw poll, if you've ever heard that, that comes from
the idea that you hold up a piece of straw
to see which way the wind is blowing. So a
straw poll is kind of like, here's how things stand
today on something, like, this is the way the
wind is blowing today on this matter. Yeah, and they're
(08:45):
just kind of informal. They used to take them, like,
on train cars, as journalists would ask the people who they
were on the train with who they were going to vote for.
Nothing formal or anything, but it was, it
does kind of reveal how long-standing our fascination with
polls really is. Yeah, it got pretty serious in
the nineteen thirties, specifically the nineteen thirty six election, where
(09:07):
Literary Digest, it was a pretty big
magazine at the time, polled its subscribers, and it's just
kind of funny even saying this sentence, they predicted a
landslide win for Republican Alf Landon over FDR.
So if you've never heard of Alf Landon, uh, you
(09:27):
know why, because Alf Landon did not beat FDR. Uh.
And the magazine's editor said, you know what, we didn't
even think about the fact that we just polled our
subscribers, and that they're wealthy people, or at least wealthier
on average, and they're probably going to vote Republican. So, um,
Alf Landon was their man. Right. So if you go
out, even today, and just interview Republicans and say, hey,
(09:52):
who are you going to vote for, and then take those
results and apply them to the entire population of the nation,
you've got a flawed poll. And that's what Literary Digest did.
But in doing so, they pointed out a real design flaw that now is just
one of the first basic things that anybody conducting a
poll gets rid of. That's right. Gallup came on the scene.
(10:13):
They galloped onto the scene. Uh, sorry. And they
were one of the first big polling companies to say,
all right, we gotta get this right. We gotta get
a representation of all of America here. So we're gonna
send our people door to door. We're gonna go to
every zip code in America. And they did that from
the nineteen thirties to the nineteen eighties, uh, and got basically within about
(10:35):
three percentage points, doing a pretty good job. Um, but
it was really expensive. So in the eighties, in the
mid-eighties, they switched to calling people on the telephone, which,
which, I mean, that's still, today, that is the
gold standard, is for a human being to dial up
another human being and ask them some questions. And we'll
(10:55):
talk a little more about it. But what Gallup
does, and what Pew does, and what a few others,
um, do, is called randomized sampling or probability sampling,
which is where you basically leave it to chance that
any registered voter in America is going to get
a phone call from you. So what Gallup
(11:18):
is doing and what Pew does is randomized sampling
or probability sampling, where any voter in America has
an equal chance of receiving a phone call from Gallup
or from Pew and being asked these questions. And it
worked pretty well for a while when they moved from
UH in person over to the phone, because they were
(11:39):
still asking people questions and they could still, um, get
their answers and harass them, which is a big thing,
as we'll see, about this type of sampling. Um,
the problem is, when people started to use caller ID,
they stopped picking up the phone as much, and so
the response rate went down dramatically. Yeah, so they would
(12:01):
call people using random digit dialing, which is a computer
system where it fed in an area code and then
the first three digits and then randomly dialed the last four.
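As a rough illustration of what random digit dialing means, here is a minimal sketch; the area code and exchange are made-up placeholders, and real polling systems are far more careful about which number blocks they sample.

```python
import random

def random_digit_dial(area_code: str, exchange: str) -> str:
    """Hold the area code and exchange fixed and randomize the last four digits."""
    last_four = f"{random.randrange(10_000):04d}"
    return f"({area_code}) {exchange}-{last_four}"

# Hypothetical example values, not from the episode.
for _ in range(3):
    print(random_digit_dial("404", "555"))
```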
So you've got a pretty good start there on the
random sampling. But even then they said, you know what,
women tend to answer the phone more than men. So
to truly randomize that, whoever picks up the phone,
(12:22):
we have to then follow up and say we want
to talk to the person in the house who's had
the most recent birthday, further randomizing. Um. I got
kind of a laugh about this because I don't know
that I've ever, literally ever seen my father pick up
a telephone in his life, or at least growing up
for the first eighteen years of my life, I don't
think I ever saw him answer the phone. It's all
(12:44):
ham radio, huh? Not that he went into that, but
just literally not not one time. He would just let
it ring. If no one was around if my mom
wasn't around to answer it, and granted it was usually
never for him, no one ever called to talk to him.
But I picked up on that and my friends used
to get really frustrated back before texting that I would
just never answer my phone. And I always just thought
(13:06):
it was an option, Like when the phone rings, it
doesn't mean you're obligated. It just means now you have
an option you can answer it or not. Well, technically
that's true. I mean like it depends no, you don't
have to answer the phone, but it depends on you know,
who in your life could possibly be calling. I didn't
think it was rude or anything. I just thought it
was literally, like, you know, I'm gonna hedge my bets here,
(13:29):
that one of my friends isn't stuck on the side
of the road. They can leave a message and if
they are, I'll go get them. So, UM, what you're
talking about, Chuck, is what's called a non response, and
that's factored into the response rate, which, with phone polling
from the nineteen eighties until the nineteen nineties, um,
(13:49):
it was manageable. I think the response rate peaked
at thirty-six percent, which is good. Now
it's down to like nine percent, because, like I said,
people have caller ID, and if some unknown number is calling,
you typically don't answer. And that actually affects things because
there is a certain kind of person who answers the
phone no matter what, and they are not like every
(14:12):
single American, and that actually factors into the kind of
poll you're conducting. Plus, also, you want, like, a certain
amount of responses. I think, for a sample size,
you want a minimum of eight hundred survey responses. And
back in the day, you got a thirty-six
percent response rate, meaning thirty-six percent of those people
(14:32):
you called would answer the phone and go through all
of the questions and answer them fully and complete the survey.
Um, since it went down to nine percent, you went
from having to call a little over two thousand people to
up to nine thousand people now, just to get eight
hundred surveys completed.
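The arithmetic behind that jump is simple enough to sketch; the eight hundred completed surveys and the two response rates are the figures from the discussion above.

```python
def calls_needed(completed_surveys: int, response_rate: float) -> int:
    """How many dials it takes, on average, to land a target number of completed surveys."""
    return round(completed_surveys / response_rate)

target = 800
print(calls_needed(target, 0.36))  # roughly 2,200 calls at a 36 percent response rate
print(calls_needed(target, 0.09))  # roughly 8,900 calls at a 9 percent response rate
```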
And that made the whole thing a lot more expensive. On the one hand, because it was expensive,
(14:55):
it meant that there were fewer and fewer companies that
could conduct these polls, which meant the polls you were
seeing were more and more legitimate. But on the other hand,
it also, um, usually decreased sample size a little bit, because,
as Gallup pointed out, like, you can kind of
fiddle with the numbers a little bit with a smaller
response rate and smaller sample size. Yeah, and it also
(15:16):
led to robocalls, because of expense, because of people
not answering their phone as much, and those systems, uh,
I mean, I love how Dave Roos put it. He
said they range from okay to terrible, um, in
how well they work, along with online polls and these other new techniques.
But I think we should take a break and then
talk about, uh, what I found to be the very interesting, um,
(15:39):
way that they further randomized this thing from this point
forward right after this. All right, so we've already talked
(16:10):
about the fact that they randomly called someone, and then
they take one further step on that call by saying,
let me speak with whoever had the most recent birthday,
even if it's, I guess, your three-year-old, right?
And one other thing I kind of made mention
of that I have to interject, dude, like harassing people,
like if you've been picked by this computer, if your
(16:31):
phone number has been picked, they're going to keep calling
you and calling you. And that is because, as a
person who doesn't normally participate in phone surveys, you are
a specific kind of person that can't be
left out of the population, because you represent a large
number of people and they want your opinion. So part
of this phone standard of calling people is to call
(16:53):
them over and over and over again, to basically harass
them into participating, to get their answers for this survey,
because it's as important, if not more important sometimes, than
the people who are like, oh, yeah, I'd love to
answer this phone survey. Two totally different kinds of people. Yeah. Absolutely.
And I was totally kidding, by the way, to the
listener when I said they will speak to a three-
(17:14):
year-old. They ask for the most recent birthday of
someone of voting age, obviously, right. Um, so then you've
got a pretty pretty decent random sampling to begin with.
And then you have to start, uh, the process of weighting, uh,
which comes in a lot of different forms. Um. If
you want an example of, like, a really good political poll,
it's gonna be paid for by a neutral source. It's
not gonna be, um, like, you know, a CNN poll
(17:36):
or a Fox News poll or a super PAC or anything
like that. Um, you're gonna have a random sample of
the public, which we just talked about. You're gonna be
dialing cell phones and landlines these days. That's a big one. Also,
they'll ask you if you have a cell phone and
a landline, and if you say yes, I have both,
they're going to adjust your response based on the fact
(17:59):
that you had a higher chance of being selected because
you have two numbers that the computer could have picked, right.
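A minimal sketch of that adjustment, assuming the simplest possible rule, a base weight inversely proportional to how many of your numbers could have been dialed; real dual-frame weighting is more involved than this.

```python
def base_weight(phone_lines: int) -> float:
    """Down-weight people who could have been reached through more than one sampled number."""
    return 1.0 / phone_lines

print(base_weight(1))  # cell-only or landline-only respondent: weight 1.0
print(base_weight(2))  # has both a cell phone and a landline: weight 0.5
```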
And another thing is, like you mentioned, they're gonna keep
calling you. The best ones use live interviewers still. And
then what they want to do, and this last one
is really important, is they're gonna try and improve
the accuracy of the results by weighting the responses to match.
(18:22):
What they want to do is just match real-
world demographics, age, race, your income level, your education level,
and all of that stuff is factored in, and all
that stuff is weighted out, because, um, well, we'll talk
about it, but you know, there are many different kinds
of Americans, and if you want a really good sampling
(18:42):
of different kinds of Americans, you're gonna, like, like you said,
have to fiddle with the numbers to make it
a true representative population, right. So, um, because even if
you just get it exactly right demographically and weighted, which,
like you said, we'll talk about some more in a second,
you still have that margin of error. And again,
that's that, um, you know, plus or minus three points,
(19:05):
and that means that it could be a few points higher or it
could be a few points lower, they don't know, but somewhere in between that, most
of your answers are going to be, like, the
correct answer is somewhere in there. That's what that means with
that margin of error. And the reason that that's
built in is because it is basically impossible to perfectly
represent the larger population through random sampling. You're just not
(19:29):
going to pick everybody correctly just by the fact that
it's random and it's a sample. Yeah, and that's important
because um, like that's why you hear so much hay
being made over a double digit lead in a poll um,
which Biden had sort of semi recently. I know it's
it's gotten a lot tighter since then, but you know,
when Biden was up by I think like ten percentage points,
(19:50):
people were flipping out because you know, like we said,
it's plus or minus four for each candidate. So that's
a total of eight. And so basically the press started
screaming, like, he's outside of the margin of error, everybody,
like nothing can beat him, right? Right, yeah. But now
things are back within that margin. I saw PBS News-
(20:11):
Hour, um, they interview Mark Shields and David Brooks. Um,
Brooks is a New York Times columnist and I think
Mark Shields is an independent columnist. And one of them
actually said, and this is in July, America has clearly
made up its mind on who's going to be the
next president. And I was like, this is July. Did
you not learn anything from two thousand and sixteen?
(20:34):
I couldn't believe those words, and he said it so
matter-of-factly. Yeah, it's irresponsible. And there have been studies
about this, too, that have suggested that words like that,
that, um, polls that say chance of winning, that this
kind of stuff actually has a negative impact on
(20:54):
the leader, because it makes people think, well, I don't
need to go out and vote. Everybody else is going
to go vote, and the turnout might be lower than otherwise.
There's also, well, there's people who
dispute that. They say, yes, it makes sense intuitively and anecdotally,
but we've yet to actually see genuine data that
says clearly that this has this effect. But it's something
(21:14):
that's still being studied right now, whether it actually does
or not. Well, and I also saw an article the
other day about the quote-unquote silent majority, and
that another reason those polls were so wrong back then,
and, they're saying, are probably wrong now, is because
they say that there's a substantial bloc
of voters who very privately and secretly vote for Trump. Yeah,
(21:37):
the term for them among pollsters is shy Trump voters.
They won't admit that they're going to vote for Trump,
but they're going to vote for Trump, and that
affects polls. I saw that that's actually not been proven
to actually exist. Um, but I think it was a Pew,
there's a really great Pew article. If this stuff is
speaking to you at all, go check out Pew's Key
(22:00):
Things to Know about Election Polling in the US, and
it has a bunch of great links that you should
follow in there. And there's also SciLine, um, they have
Surveys and Polling, which is a guide for journalists to polling.
But I found out you don't actually have to
be a journalist to read it online. So if you
want to go check those out, they have some great, like,
um, like, just some breakdowns of some of the stuff
(22:23):
we're talking about, but also about how to read polls
and what to trust and look for in general. Uh.
And then, a little known fact, Pew was actually originally
called, uh, Pew Pew, until nineteen seventy-seven when Star Wars came out
and they were like, we gotta change our name now, guys. Yeah,
I can't do it, man, it is dant-o-rama
today with you, huh. So back to the weighting thing. Um.
(22:46):
And by the way, we should mention that Gallup said
if they wanted to, um, increase that sample size and
actually get the margin of error down to, like, plus
or minus two, that they could do that, but that
would be, like, a literal increase in the cost. So, like,
everyone just please live with plus or minus three or
four points. Yeah, and now everybody generally does. And
Dave uses this really good example, Dave Roos helped us
(23:08):
out with this, and he said, um, this margin
of error is best understood where, if you selected a
hundred marbles, um, there's a jar, five hundred
blue marbles, and you pick out a hundred of them, um,
you might pick out fifty of each one time. And
then, what did he say, five hundred marbles? Oh no, I'm sorry,
(23:29):
a thousand marbles. I've lost my marbles. Yes, there's a
thousand marbles. Okay, five hundred are red and five hundred
are blue. Your task, Chuck, is to pick out a hundred.
So you go to the trouble of picking out a hundred,
fifty are red, fifty are blue. And I say do
it again, and this time it's forty-seven and fifty-three,
and I keep saying again, again, right, and I smack my
(23:50):
riding crop on the desk that you're sitting at,
at times, because you get super turned on. Yeah, I did
it a hundred times because dear leader told me to.
And at the end you get a little bell curve
and basically a plus or minus four. Right. So yeah,
almost all of them, this is where confidence intervals come in, almost
(24:12):
all of them are going to fall in that bell curve.
There's going to be some outliers. It's just gonna be
that one time where it was just absolutely insane, you
actually picked one hundred red marbles randomly, blindfolded, from this
jar, and that's so insignificant statistically, it's just such
an outlier. But almost all of them are gonna be in there.
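If you want to run the marble experiment yourself, here is a minimal simulation of it; the number of repeats is arbitrary, and the exact spread depends on how many marbles you draw.

```python
import random
from collections import Counter

jar = ["red"] * 500 + ["blue"] * 500   # the thousand-marble jar from the example

def draw_reds(sample_size=100):
    """Pull a random handful of marbles and count how many are red."""
    return sum(marble == "red" for marble in random.sample(jar, sample_size))

results = [draw_reds() for _ in range(1000)]
print(Counter(results).most_common(5))   # the counts cluster around 50
```

Plot those counts and you get the bell curve they're describing; make each draw bigger, closer to a real poll's thousand-ish respondents from the whole electorate, and the curve tightens.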
So when you're polling, like, this large group of people,
(24:33):
like American voters, and most of them are falling within a
couple of percentage points of either side of this middle,
you can pretty much feel confident about that. And that
is the basis of election polling, of political polling,
of all polling, really, that they have this built-in
margin that they know exists, but everybody can live with it.
(24:54):
The problem is when you're hovering around that fifty
mark and you're talking about a two-party system, one
of them has a little more and the other one has a little less, but
there's a plus or minus of, like, two points. It
means we have no idea. And some people would say, well,
why even do polling, because what you're showing there is
not who's going to win. That's not the point of polling.
(25:17):
But the point of polling is to take a snapshot
of how America, or whoever you're polling, is feeling at that
moment, about who to elect, about what laws to pass,
about religion, about, um, the Cleveland Indians. It doesn't matter, right,
like, that's what a poll does. But
you can pervert polls into making them talk a different
language and say, hey, look at this percentage. You
take these polls and you convert them into something else.
Now you have something like a chance this person is
gonna win. Go shout that, Wolf Blitzer, and Wolf
(25:38):
Blitzer goes and shouts it as loud as he can. So, uh,
we need to talk a little bit more about weighting.
I mentioned earlier that there's other things they do to
(25:58):
sort of, um, tip the scale, and that sounds like
a bad term, so I guess I shouldn't say it
that way, but, um, things they do to make it
equitable and a true representation of the American population. For instance, um,
African American voters make up twelve percent of voters. So
if they did a poll and in the end they
only got six percent of respondents who were African American,
(26:19):
then they just double it, basically. Um, if the respondents
were overwhelmingly Caucasian, they would weight that down to the
true representative number, which is about, I think, sixty-six percent. Yeah,
sixty-six percent of the electorate is white, and if, you know, eighty
percent of the people that respond are white,
(26:40):
then they're going to kick that down.
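Here is a minimal sketch of that kind of demographic weighting, using the twelve percent and sixty-six percent figures from the discussion and an assumed, made-up breakdown of who actually responded.

```python
# Share of the electorate (figures from the discussion) vs. share of respondents (assumed).
population_share = {"Black voters": 0.12, "white voters": 0.66}
sample_share     = {"Black voters": 0.06, "white voters": 0.80}

# Post-stratification weight: how much each respondent's answer should count.
weights = {group: population_share[group] / sample_share[group] for group in population_share}

print(weights)  # Black respondents count double (2.0); white respondents count about 0.83
```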
And again, this is just adjusting the poll to the proper weight, so
you have a really legitimate snapshot. And you know, if
it sounds crazy that they're using a thousand people's responses,
uh, and drawing that out to the size of the
voting population of America, it is. But if you're a statistician,
(27:03):
it isn't, you know. I mean, you know, it reliably
works as long as you present it with plus or
minus this margin of error. Um, it's crazy to
just an average Joe on the street. It does seem to
be like, they ask a thousand people and we're supposed
to know and extrapolate that? And a statistician, who are
number wonks and data wonks, would say, yeah, that's exactly
what that means, shut up. That's really all. That's really
(27:26):
all you need. But it really is a testimony to
the power of those statistics and that data and
the analysis of them. Yeah, weighting is really important. It
goes far beyond just, like, age, political party. Um, I
think Gallup uses eight different variables. The New York Times
Siena College poll uses ten, like, and they
(27:46):
include things like marital status and home ownership. Pew uses
twelve variables. Um, they ask things like, do you have
home internet access? Do you volunteer, or do you
engage in volunteerism? Um. And all of these things have
been shown to be associated. So, like, if you're a
white woman age sixty-five to seventy-five who volunteers
(28:07):
twice a month and lives in the suburbs, you're a
very specific person, where there's a group of
people out there who vote a certain way, and
you represent, like, all those people with that. So
they'll weight the results based on these additional questions that
you answer. They don't just ask you, are
you going to vote for Trump or Biden? And there's
(28:27):
also built into that question a really important point, are
you going to vote? Yeah, that's a huge
thing we haven't talked about. It's one thing to poll
registered voters. But here in America, somehow, uh, presidential elections
only get about sixty percent turnout still, which is shameful, shameful
and crazy. But, um, that's another podcast. But, uh, so,
(28:51):
most of the really, really good polls drill down,
to get a real, real good representation of what might
actually happen. They try to drill down to whether or
not you're most likely to actually vote, because who cares
what your opinion is if you're not gonna vote. And
they, I mean, they generally take your word for
it that you're telling the truth, you know. Um, yeah,
(29:13):
but they do have, like, nine, I think Pew, yeah,
Pew has nine questions that they basically use to establish
that you are planning on voting, like you're actually
going to vote, you're not full of hot air, you know. Yeah.
I don't know what those questions are, but I imagine they
have to do with, do you know where your polling place is?
Do you have transportation? Stuff like, I was thinking they
(29:34):
were going to be like, are you really, really gonna vote?
was, like, question three, and they just kept adding reallys,
right? Yeah. So, um, so you've got these people
who have been called and they have answered these questions,
and they have participated in this survey whether they wanted
to or not, and they've finally done it. Built
(29:58):
into the poll is
that understandable margin of error that just comes from the fact
that it's a randomized sample, right? But what Pew and
any other legitimate, um, polling group will point out
is that the margin is actually greater than that. That
the margin of error for the average poll, according to Pew,
is something more like six points, not
(30:21):
three or four, it's actually six. And the reason why
is, built on top of that margin of error that's
automatically part of the poll just by virtue
of it being a randomized sample, are things like the
person typing in the wrong key accidentally, and those kind
of things add up, or that the question isn't worded
(30:43):
clearly enough that anybody who hears it knows the intent
and knows what their answer is, that there's some sort
of, um, miscommunication involved. There are also things that they can't control,
like people who have pseudo-opinions, who don't want
to sound dumb, so they just answer yes or no
about something they really don't care about either way,
um, and because they don't actually have an opinion, that
(31:04):
actually weights things the wrong way. So when
you add all these things up, um, you
have these additional, um, errors that lead to, like, a bias overall in, um,
the poll, which can affect the outcome. But again, the
companies that have the money to conduct like these genuinely
(31:28):
big gold standard polls, they know enough to
know how to kind of control for those as
much as possible. But still, what Pew says is, if
you're listening to a poll and somebody says plus or
minus three points, you should probably go ahead and double
that in your mind. Double it in your mind, double
(31:48):
your pleasure, double your fun, double your margin of error.
So let's take a break and we're gonna come back
and talk about what exactly they think went wrong with
those state polls right after this, Sorry, George, all right,
(32:27):
So I think it's generally acknowledged that the, um, and
again, I want to say the polling was off,
but apparently the polling wasn't off, but the way it
was reported on was off. But what really happened in
twenty sixteen, what was off, was the state polling, and what
they think, they've, like you said, gone back and obsessed
(32:49):
over these polls since then, um, you know, because they
were already statistical wonks, but when something like this happens,
they really sort of get worked up into a dander and
get to the bottom of it. I mean, people were
calling for the end of polling. They said it was
a failed profession. Basically, he was like, I'm getting rich
off this, man. Yeah, we can't end polling. Jimmy Pew
(33:12):
was like, stop talking like that. So what happened
in twenty sixteen, uh, they think, is that, uh, a lot
of non-college-educated white people came out in big, big
numbers for Donald Trump. And that was sort of
a new, not a new factor, because they had always
(33:34):
talked about college education, but a new factor in how
outsized of a factor that was. It had never been
that outsized. And so all these state pollsters, they didn't
weight it, and they didn't adjust their polls to reflect
this, um, fact that college-educated people are more likely
going to respond to these surveys. So their polls were
(33:54):
just off. Yeah, and they knew that college-educated people
were more likely to respond to the survey. That wasn't
news to them. What caught them sleeping was that they
had not picked up on the fact that this group
of people, um, non-college-educated white voters, were going
(34:15):
to go to the polls in numbers like never before,
and that they were going to vote for Trump. They
did not pick up on that. That was brand new,
like that didn't exist before. Trump basically brought out a
new electorate that helped get him elected, especially in battleground
states like Wisconsin, in Michigan, uh, in Indiana, although I
think in Indiana he was a shoo-in because of Pence.
(34:37):
But this group of voters that did
not exist, with this line between college-educated and non-
college-educated white voters, that partisan gulf hadn't existed
before election day. The pollsters didn't pick up on it,
and so they didn't weight those responses, because they never
had to weight the responses before based on college education. Yeah,
(34:59):
so suburbs, exurbs, and especially the rural vote counted like
it had never counted before. Um, which is obviously why
you see what's going on right now, like a very
hard push by the Trump campaign to, um, to get
the same people out again. Uh, and in the
way that they do that, that is the nicest way
(35:19):
I can put it, there genuinely is. So, um. Yeah.
So the idea is that this was
already kind of a close race, a closer race than
was being broadcast, um, and that these, um, huge
electoral battleground states got flipped. Uh. That was basically
(35:41):
the reason that, um, Trump was able to take the
Electoral College. But the idea is that these voters
kind of came out of nowhere and voted for Trump,
and that there were some other things that happened, too,
um, that the pollsters didn't anticipate. One, that the undecided voters,
people who said I'm legitimately undecided at this point a
(36:03):
week before the election, uh, for whatever reason, they broke hard
in favor of Trump. On election day, when they made
their decision, they voted for Trump. That hadn't been predicted. Um.
That was another big one. And then one of the
other things too, is that the polls were just doing
what polls do, which is sometimes they're right, sometimes they're wrong.
(36:24):
But polls had gotten so good in the two thousand
oughts that people came to be overconfident in their
ability to predict and pick winners. And the two thousand
and sixteen race reminded us, like, polling is not perfect,
let's stop pretending it is. Yeah, and it's, um, a
lot of it has to do with, like we've been
(36:44):
kind of harping on, the way the media presents it.
And then a lot of that has to do with
just how we're conditioned to look at things
like underdogs. Um, and it's different in politics. And I
remember when these aggregators, especially, uh, FiveThirtyEight,
they had these predictive models, and they started talking about
(37:04):
the fact that, and I think the Washington Post even
wrote a good comparison to sports, and you know, if
someone is a real big underdog going into,
like, a Super Bowl or a World Series and they
end up winning, people don't get angry and go after
the people who said they had a fifteen or twenty
percent chance of winning. They just say, wow, what a story,
(37:27):
the underdog won. But there are so few presidential elections, uh,
you know, one every four years, that it's the
same thing, but people just look at it differently. Like,
Trump was an underdog that supposedly had,
like, a fifteen to thirty percent chance of winning. Some
people said, one, yeah, well, that's ridiculous. But a thirty
(37:49):
percent chance of winning is a real shot at winning,
for sure. Yeah, that's the way it's framed. It doesn't
seem that way in politics, no. And so that's one thing.
But another thing is that we shouldn't even be talking
about presidential elections with, like, chance of winning, chance
of winning, like that is not how we should present it,
and that's not how we used to present it.
(38:09):
We used to present it saying, like, this poll found that, um,
that Clinton was going to lead Trump with forty-eight
percent or something like that, plus or minus two points,
and that would have shown you, like, okay, well, this
is a really close race, way closer than I think, um.
And that's, there's my information. Now the problem is
(38:30):
that you can take that same statistic, the lead
plus or minus a four-point, um, margin of error,
and if you convert that to a normal distribution, you come
up with a probability of a win. That's the problem,
is that the statistics, the data
that's being produced by these polls, are being converted in
(38:53):
ways that they shouldn't be, and then that's what the
media jumps on. That's what the public laps up, because
that is the horse race statistic. An eighty-four percent
chance of winning, a fifteen percent chance of winning, that's
what we think about. That's what we look at.
And so rather than realizing that actually this is a
close race, plus or minus four points, we see an eighty-
(39:15):
four percent chance of winning, and that's a foregone conclusion
that that person is going to win. That, ultimately,
is where the media and the public are culpable for this.
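To make that conversion concrete, here is a minimal sketch of the back-of-the-envelope version; the three-point lead and four-point margin are assumed, illustrative numbers, not figures from any particular poll.

```python
from statistics import NormalDist

def implied_win_probability(lead_pts: float, margin_pts: float) -> float:
    """Treat the polled lead as the center of a normal distribution and ask
    how much of that distribution sits above zero."""
    std_error = margin_pts / 1.96  # turn a 95 percent margin of error back into a standard error
    return 1 - NormalDist(mu=lead_pts, sigma=std_error).cdf(0)

# An assumed 3-point lead with a 4-point margin of error: statistically a close race,
# but it converts into a headline-friendly "chance of winning" north of 90 percent.
print(f"{implied_win_probability(3, 4):.0%}")
```

Which is exactly the gap between "a close race, plus or minus four" and "a ninety-plus percent chance of winning" that they're describing.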
Yeah, I don't think, uh, I don't think they were meant
to be extrapolated like that to begin with. And,
you know, polls are valuable, but, like, I haven't looked
(39:36):
at any polls, partially because of the way twenty sixteen went down.
Um And in fact, for the past week, I've taken
a complete uh internet news and social media break and
it's been pretty great actually because yeah, I mean I
literally haven't looked at a single news thing. Very sadly
(39:57):
found out that Chadwick Boseman passed away, like, three days
afterward. Like, that's how dark I've gone, uh,
in not looking at the Internet unless it's something that
brings me joy, which is to say, you know, old
Led Zeppelin and Van Halen YouTube videos. I was looking
up classic Mad magazine covers of the eighties. That's all
I've been doing. If it doesn't bring me joy
(40:17):
on the internet, I'm not doing it. Um. And you know,
I gotta break that soon, because I do think you
should be active and involved and in the know.
But yeah, taking a break fairly regularly is definitely
mentally healthy. But that aside, I'm not looking
at any polls and I don't care what any poll says.
We'll see. So I was I was thinking very similar
(40:39):
stuff too, and, um, like, what's the point of polls? Okay,
well, I finally found it. If you look that up
on Google, there's just very little on it. But
I found somebody who explained it pretty well, I
thought. Um, polls aren't meant to tell you who's
going to win. They're not. They're not forecasting models, like
I said before. They're to be, like, a snapshot of
(41:01):
how whoever you're polling feels at the moment, right, um.
And in doing that, because you are sampling American people,
and these are independent news organizations typically who are carrying
out these polls, you get to tell everybody else
how America is feeling, rather than the leaders saying, I'll
(41:24):
decide how you're feeling. I can decide what you want
and what you need and what you think is important.
Polls prevent that from happening by telling the rest of
the people, Hey, this is how everybody else is feeling
right now too. And in some ways it is kind
of sheepish, where you know, the idea is like, oh, you know,
is that supposed to sway my opinion that everybody's going
(41:44):
to vote for this person and not for that person.
That should have no bearing or impact on your vote,
And it feels like that that's how polls are used sometimes.
But if you step back and look and see that
they're actually kind of an important part of of sharing
what other people are thinking, rather than being told what
we're thinking or you know, what, what to think, then
they actually are pretty legitimate in that sense. Yeah, well,
(42:07):
you know, I say, take your polls and sit on it. Well,
one more thing we've got. We cannot talk about polling
and not talk about Internet polling real quick. This is
a completely different style of polling than has ever been done before.
Rather than a randomized sample, you actually just say, hey,
you want to take the survey, and people click it.
So it's called opting in, opt-in surveying, and very
(42:30):
specific kinds of people take surveys on purpose on the Internet.
So, because they're new, they're really now
figuring out how to weight these things, um,
and how to use them, because they can
produce legitimate, um, data, but it depends on who's conducting
the poll, if they know what they're doing, that
(42:50):
kind of stuff. But just like everything else, the
moving of things online has democratized polling, and so anybody can
conduct a poll now and basically enter the news cycle.
That's how Kid Rock almost became a senator in Michigan
for a second there. But so on the one hand,
it's good, but it's also we're in a big period
of disruption as far as polling is concerned. So for you,
(43:11):
the polling consumer, either go like Chuck and just stop
listening to polls altogether, or, um, look for things like transparency.
Do you recognize the company or the name that produced
the poll? Are they sharing their data, like how the
questions exactly were worded, what their sample size was, how
they weighted it, all this stuff? Um, if
(43:32):
all that stuff is included, you can
probably trust the poll. And then, um, beyond that, just
remember what you're looking at, that this isn't a predictor
of who's gonna win. It was a snapshot, for a
very, very brief moment, of a very specific sample of
America, just to show how people would vote right then.
And it was right then too. This is not election
(43:53):
day we're talking about. Yeah. And I, you know, I
want to be clear, I'm not poo-pooing polls. I just, uh,
they're valid and useful, but I just don't care
to look at them right now. I understand. Yeah, that's
my jam. Well, you got anything else about polls?
Nothing else about polls. Well, if you want to know
about polls, uh, start looking around, and go check
out Pew's stuff and, uh, SciLine's stuff and all that stuff. Um.
(44:16):
And since I said stuff three times, here it comes,
from will stillt Kid or a candy man. So, uh,
this is from Keiley Price. And Keiley says this, Hi, guys,
I'm writing today not only to confess my unending love
for Stuff You Should Know, but also to share a
link to some black-owned bookstores. It would be so
(44:38):
cool if all of your listeners purchased your book. She
should just say, period. Uh, comma, from a black-owned
bookstore. Couldn't agree more. By the way, a couple of
podcasters that I listen to while I wait for Stuff
You Should Know have books out and coming out soon,
and they encourage their listeners to support black-owned businesses
through the purchase of their book. Win-win. I
(44:58):
don't know why it's taking me so long to think
to write this to you guys. I blame it on
Corona madness. But last, but not least, I'll say I
love the End of the World with Josh Clark and
Movie Crush as well. Any chance to hear you guys
talk is a chance worth taking. When we get a
COVID vaccine and you guys can do your live shows again,
please come to Nashville. Oh yeah, for sure. I think
(45:19):
we planned on Nashville. Yeah, Nashville got scuttled by COVID
this time around. You're gonna come now? We might
not ever be able to come. No, I know, it's
super close to Atlanta. I'd lose my mind if I
got to see you guys here. All the best, Keiley
Price. And so Keiley sent a link to a handy
website that lists black owned bookstores near you. I made
(45:40):
a little, uh, URL shortener to make it
easier on everyone. Oh, let's have it. So you can
go to bit dot ly slash s y s k
b l m, uh, and find black-owned bookstores near
you to purchase Stuff You Should Know, an incomplete compendium
of mostly interesting things, at the very least. Um, we
like to urge people to go to IndieBound dot
(46:01):
org and support indie bookstores. I don't know if there
is an actual black-owned indie bookstore website, but I
would imagine most of the black-owned bookstores are indie bookstores.
Uh, yeah, probably. So check it out, bit dot ly
slash s y s k b l m. Go out
(46:21):
and buy our book. Everybody. You're gonna love it. It's
really great, um, and thanks for that, Chuck,
and thanks for setting us up for that too, Keiley,
much appreciated. We'll see you in Nashville. I guess Keiley
will be the one, like you said, losing her mind
in the crowd. If you want to lose your mind
on us via email, we love that kind of thing,
kind of, um. You can send it off to Stuff
(46:44):
Podcasts at iHeartRadio dot com. Stuff You Should Know
is a production of iHeartRadio's HowStuffWorks. For
more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.