
July 5, 2016 42 mins

People often ask us how we do our research. We're not going to disclose all of our secrets, but we'll give you some tips on how to root out the bad studies from the good ones. Learn all about shady studies and reporting right now!



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to Stuff You Should Know from HowStuffWorks dot com. Hey, and welcome to the podcast. I'm Josh Clark, with Charles W. "Chuck" Bryant and Jerry. This is Stuff You Should Know. Um, Josh, we're gonna do something weird today.

(00:22):
I'm gonna do a listener mail at the head of the podcast. I know. All right, what? All right, let's do it. Okay, this is from B... wait, hold on, can you have the listener mail music going? Oh, I don't know. Jerry, should we go the whole nine yards? So let's do it. People might freak out. All right, this is from Bianca. Uh, voice is what I'm gonna say.

(00:46):
I think that's great. Hey guys, I wrote you not too long ago asking how you research your own podcast. I just got back from a class where we talked about research misrepresentation in journal articles. Apparently journals don't publish everything that is submitted, and a lot of researchers don't even publish their studies if they don't like the results. Uh. Some laws
have been put into place to prevent misrepresentation, such as

(01:07):
researchers having to register their studies before they get results
and journals only accepting preregistered studies. But apparently this is
not happening at all, even though it is now technically law.
This ends with the general public being misinformed about methods
and drugs that work. For example, if there are twenty-five studies proving a drug works and twenty-five that don't, it's

(01:28):
more likely that twenty of the positive results have been
published and only one or two of the negative. Uh.
And that is from Bianca, and that led us to
this article on our own website, Ten Signs That Study Is Bogus. And here it is. Nice, Chuck. Well,
we get asked a lot about research from people usually

(01:49):
in college. They're like, you guys are professional researchers. How
do I know I'm doing a good job and getting
good info? And it's getting harder and harder these days,
it really is. You know. One sign that I've learned
is if you are searching about a study, and all
of the hits that come back are from different news organizations,

(02:11):
and they're all within like a two- or three-day period from a year ago, nothing more recent than that, then somebody released a sensational study and no one put any actual effort into investigating it, and there was
no follow up. If you dig deep enough, somebody might
have done follow up or something like that, but for
the most part, it was just something that splashed across

(02:32):
the headlines, which more often than not is the case as far as science reporting goes. So that's a bonus: that's number eleven. Boom. How about that? Yeah, should we just start banging these out? Let's do it. Uh, do you have some other clever ones? Well, part and parcel with that, I don't know if it's clever, you

(02:53):
do come across people who you know can be trusted
and relied upon to do good science reporting. So like
Ed Yong is one. Another guy named Ben Goldacre has something called Bad Science; I don't remember what, yeah, outlet he's with. And then there's a guy, I think at Scientific American, named John Horgan, who's awesome. Yeah. Or some journalism organizations that have been around and stood the test of

(03:16):
time that you know are really doing it right, like
Nature. Yeah, Scientific American. They're like really science. Yeah, like, I feel really good about using those sources. Yeah, but even they can, you know... there's something called scientism, um, where there's a lot of, like, faith and dogma associated with the scientific process, and you know, you have to root through that as well. Try it. I'm done. Uh.

(03:40):
The first one that they have here on the list
is that it's unrepeatable, and that's a big one. UM.
The Center for Open Science did a study, UH was
a project really where they took two hundred and seventy
researchers and they said, you know what, take these one hundred psychological studies that have been published already and pore over them. And, uh, just last year... it

(04:03):
took them a while, took them several years... they said,
you know what, more than half of these can't even
be repeated using the same methods. They're not reproducible. Nope,
not reproducible. That's a big one. And that means that when they carried it out, they followed the methodology. Um, Scientific Method podcast, you should listen to that one. That was a good one. They found

(04:25):
that their results were just not what the people published, not anywhere near them. Um. For example, they
used one as an example where a study found that
men were terrible at determining whether a woman was giving them, um, some sort of, like, clues to attraction or just being friendly. Sexy, sexy stuff. Or yeah,

(04:49):
or good to meet you, or buzz off, jerk. Um.
And they did the study again and as part of
this Open Science Center for Open Science study or survey,
and they found that that was not reproducible or that
they came up with totally different results. And that was
just one of many. Yeah, and in this case specifically,
they looked into that study and they found that it was, um,

(05:13):
one was in the United Kingdom, one was in the
United States. May have something to do with it. But
the point is, Chuck, is if you're talking about humanity,
I don't think that the study was like the American
male is terrible at it. It's men are terrible at it. Right.
So that means that whether it's in the UK, which
is basically the US with an accent and a penchant

(05:33):
for tea. I'm just kidding. UK, see you soon. Um, it should be universal. Yeah, you know, unless you're saying, no, this just only applies to American men, the American men, right, then it's not even a study. Yeah. The next one we

(05:54):
have is, uh, it's plausible, not necessarily provable.
And this is a big one because and I think, um,
we're talking about observational studies here more than lab experiments,
because with observational studies, you know, you sit in a
room and get asked three questions about something, and all
these people get asked the same questions, and then they

(06:15):
pore over the data and they draw out their own observations.
And, very famously, one observational study that led to false results found a correlation between having a type A personality and, um, being prone to risk of heart attacks. Um. For a long time, you know,
that the news outlets were like, oh, yes, of course,

(06:38):
that makes total sense. This study proved what we've all
known all along, UM, And then it came out that no,
actually what was going on was a well known anomaly
where you have a five percent um risk that chance
will produce something that looks like a statistically significant correlation

(06:59):
when it's not, when really it's just total chance.
And science is aware of this, especially with observational studies,
because the more questions you have, the more opportunity you
have for that five percent chance to create a seemingly
statistically significant correlation when really it's not there. It was

(07:19):
just random chance where if somebody else goes back and
does the same same study, they're not going to come
up with the same results. But if a researcher is, I would guess, willfully blind to that five percent chance, um,
they will go ahead and produce the study and be like, no,
it's true, here's the results right here, go ahead and

(07:41):
report on it and make my career. Yeah. Well, and they also might be looking for something. In fact, chances are they are. Um, it's not just some random study, like, let's just see what we get if we ask a bunch of weird questions. It's like, hey, we're looking to try and prove something, most likely. So that Baader-Meinhof thing might come into play, where you're kind of cherry-picking data.
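That five percent problem compounds quickly when an observational study asks lots of questions. Here is a minimal Python sketch, not from the episode, that simulates surveys where no real effect exists at all and counts how often chance alone still produces at least one "statistically significant" correlation; the group sizes and the 1.96 cutoff are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def chance_findings(n_questions, n_per_group=100, trials=2000):
    """Simulate surveys with NO real effect: every answer is pure noise.
    Returns the fraction of simulated surveys in which at least one
    question still looks significant at the usual 5 percent level."""
    surveys_with_a_hit = 0
    for _ in range(trials):
        hits = 0
        for _ in range(n_questions):
            a = rng.normal(size=n_per_group)  # group 1 answers (noise)
            b = rng.normal(size=n_per_group)  # group 2 answers (noise)
            se = np.sqrt(a.var(ddof=1) / n_per_group + b.var(ddof=1) / n_per_group)
            z = (a.mean() - b.mean()) / se
            if abs(z) > 1.96:                 # "significant" by chance alone
                hits += 1
        if hits > 0:
            surveys_with_a_hit += 1
    return surveys_with_a_hit / trials

print("1 question:  ", chance_findings(1))    # roughly 0.05
print("20 questions:", chance_findings(20))   # roughly 0.64, chance "finds" something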

(08:03):
Yeah, that's a big problem that kind of comes up. A lot of these are really kind of interrelated. Totally. The other big thing that's related
is how the media reports on science these days. Yeah,
you know, it's a big deal. Yeah. Like, John Oliver just recently went off on this and NPR did
a thing on it, like they might even like the
researcher might say plausible, but it doesn't get portrayed that

(08:23):
way in the media. Sure, remember that poor kid who
thought he found the ancient Mayan city. The media just
took it and ran with it. You know, yeah, I
think there was a lot of maybe, or it's possible, we need to go check, kind of thing. But the media is like, no, he discovered an ancient Mayan city never
known before. Yeah, and let's put it in a headline.
And that's I mean, that's the That's just kind of

(08:45):
the way it is these days. You do have to be able to sort through it. I guess that's what
we're doing here, aren't we, Chuck, We're telling everybody how
to sort through it, or at the very least take
scientific reporting with a grain of salt, right, Not like
you don't necessarily have the time to go through and
double-check that research and then check on that research and

(09:07):
you know, right, so take it with a grain of salt.
Um, unsound samples. Uh, there was this study that basically said, um, how you lost your virginity is going to have a very large impact and play a role in how you feel about sex and experience sex for the rest of

(09:27):
your life. Yeah, it's possible. Sure, it seems logical, so
we'll just go with it. But when you um only
interview college students, and, uh, you only interview heterosexual people, then you can't really say you've done a

(09:47):
robust study, now, can you? Plus you also take out of the sample size, your sample population, anybody who reports having had a violent encounter. Throw them out, throw that data out. That's not gonna inform how you feel about sex, right? Exactly.
You're just narrowing it down further and further and again,
cherry picking the data by throwing people out of your

(10:08):
population sample who will throw off the data that you want. Yeah. And I'd never heard of this acronym, WEIRD. Um, a lot of these studies are conducted by professors and academics, so a lot of times you've got college students as your sample, and there's something called WEIRD: Western, educated, from industrialized, rich, and democratic countries. Right,

(10:31):
those are the participants in the studies, the study subjects. But then they will say, men, right? Well, what about the gay man in Africa? Like, you didn't ask him. So that's actually a really, really big deal. Um.
In two thousand and ten, these three researchers did a survey of a ton of social science and behavioral

(10:55):
science studies and found that eighty percent of them used WEIRD study participants. So basically it was college kids for eighty percent of these papers. And they surveyed a bunch of
papers and they took it a little further and they
said that, um, people who fit into the weird category

(11:15):
only make up twelve percent of the world population, but
they represent eighty percent of the population of these studies. And a college student, Chuck, in North America, Europe, Israel,
or Australia is four thousand times more likely to be
in a scientific study than anyone else on the planet.

(11:36):
And psychology and the behavioral sciences are basing their findings about everybody else on this small tranche of humanity. And that's a big problem. That's extremely misleading. Yeah, and it's also a little insulting because
what they're essentially saying is like, this is who matters.

(11:57):
Well also, yeah, but what's sad is this is who
I am going to go to the trouble of recruiting
for my study. It's just sheer laziness. And I'm sure
a lot of them are like, well, I don't have
the funding to do that. I guess I see that.
But at the same time, I guarantee there's a tremendous
amount of laziness involved. Yeah, or maybe if you don't

(12:19):
have the money, maybe don't do that study. Is it
that simple? I'm probably oversimplifying. I don't know.
I'm sure we're going to hear from some people in
academia about this one. Well, stop using WEIRD participants, or at the very least say, um, like, this is sexually active Dartmouth students; this applies to them, not everybody in the world.

(12:46):
These studies used those people as study participants, and they're not even emblematic of the
rest of the human race. Like college students are shown
to see the world differently than other people around the world.
And so it's not like you can be like, well,
it still works, you can still extrapolate. It's like flawed

(13:06):
in every way, shape, and form. We should probably take a break. Come on, yeah, let's take a break because, uh, you're getting a little hot under the collar. I love it, man. Uh, we'll be right back after this. Just so much. All right,

(13:40):
what's next, buddy? Uh? Very small sample sizes. Right, if
you do a study with twenty mice, then you're not
doing a good enough study. No. So they used this, um, in the article. They use the idea of

(14:02):
ten thousand smokers and ten thousand non smokers, and they said, okay,
if you have a population sample that size, that's not bad.
It's a pretty good start. And if you find that fifty percent of the smokers developed lung cancer but only five percent of non-smokers did, then your study has what's called a high power. Um. If you had something

(14:24):
like ten smokers and ten non-smokers, and two of the smokers developed lung cancer and one non-smoker developed lung cancer as well, you have very little power and you should have very little confidence in your findings. But regardless, it's still going to get reported if it's a sexy idea. Yeah, for sure. Um.
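That difference in power is easy to see with a quick simulation. Here is a minimal Python sketch, not from the episode, that reuses the article's made-up rates (fifty percent of smokers versus five percent of non-smokers developing lung cancer) and estimates how often a study of each size would actually detect the difference at the usual five percent significance level; the two-proportion z-test is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)

def estimated_power(n_per_group, p_smokers=0.50, p_nonsmokers=0.05,
                    trials=5000, z_crit=1.96):
    """Simulate many studies and return the fraction in which the
    smoker vs. non-smoker difference comes out statistically significant."""
    cancer_smokers = rng.binomial(n_per_group, p_smokers, trials)
    cancer_nonsmokers = rng.binomial(n_per_group, p_nonsmokers, trials)
    p1 = cancer_smokers / n_per_group
    p2 = cancer_nonsmokers / n_per_group
    pooled = (cancer_smokers + cancer_nonsmokers) / (2 * n_per_group)
    se = np.sqrt(pooled * (1 - pooled) * (2 / n_per_group))
    safe_se = np.where(se > 0, se, 1.0)          # avoid dividing by zero
    z = np.where(se > 0, (p1 - p2) / safe_se, 0.0)
    return np.mean(np.abs(z) > z_crit)

print("10 per group:    ", estimated_power(10))      # low power, easy to miss
print("10,000 per group:", estimated_power(10_000))  # power is essentially 1.0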

(14:47):
And because these are kind of overlapping in a lot of ways, I want to mention this guy, a scientist named Ulrich, uh, Dirnagl. Uh, he and his colleague Malcolm Macleod have been trying,
I mean, and there are a lot of scientists that
are trying to clean this up because they know it's
a problem. But he co-wrote an article in Nature, uh, that's called Robust Research, colon, Institutions Must Do Their

(15:10):
part for reproducibility. So this kind of ties back into
the reproducibility thing, like we said earlier. And his whole idea is, you know what, they should tie funding to good institutional practices, like you shouldn't get the money if
you can't show that you're doing it right. Um, and
he said that would just weed out a lot of stuff.

(15:31):
Here's one staggering stat for reproducibility and small sample size. Uh. Biomedical researchers for drug companies reported that only a fraction of the papers they published were even reproducible. That was like an insider stat. And it doesn't matter; drugs are still going to market, yeah, which is

(15:54):
that's a really good example of why this does matter
to the average person. You know, like if you hear
something like, um, uh, monkeys like to cuddle with one another because they are reminded of their mothers, study shows. You could just be like, oh, that's great, I'm
going to share that on the internet. Doesn't really affect

(16:15):
you in any way. But when there are studies being conducted that are creating drugs that could kill you or not treat you, or that kind of thing, and it's attracting money and funding and that kind of stuff, that's harmful. Yeah. Absolutely. I found another survey,

(16:37):
did you like that terrible study idea that I came up with? The monkeys like to cuddle. A hundred and forty trainees
at the MD Anderson Cancer Center in Houston, Texas, Thank
you Houston for being so kind to us at a
recent show. They found that nearly a third of these

(16:58):
um, trainees felt pressure to support their mentors' work, like, to get ahead or not get fired. So that's
another issue. As you've got these trainees or residents, uh,
and you have these mentors, and even if you disagree
or don't think it's a great study, you're you're pressured
into just going along with it. I could see that
for sure. There's there seems to be a huge hierarchy.

(17:21):
And, um, in science, in a lab, you know, you've got the person who runs the lab; it's their lab, and you don't go against them. But there are people, um... like, Science and Nature, two great journals, are updating their guidelines right now.
They're introducing checklists. UM Science hired statisticians to their panel
of reviewing editors, not just other, you know, peer reviewers,

(17:44):
like, they actually hired numbers people specifically because that's a big part of the process. That's a huge part of studies. It's
like this mind-breaking statistical analysis that can be used for good or ill. And I mean, I don't think the average scientist necessarily is a whiz at that, although I guess it has to be part of training, but

(18:06):
not necessarily. And that's a different kind of beast altogether.
Um, stats, we talked about it earlier. I took a stats class in college, had so much trouble, was awful at it. It really... it's a special kind of... does it even matter? Yeah, I didn't get it. I passed it, though. I passed it because my professor

(18:29):
took pity on me. Um, that Ulrich Dirnagl... Ulrich Dirnagl. Um, he is a... he's a big-time crusader; his jam is making sure that science is good science. One of the things, um, he crusades against is the idea of... remember that virginity study where they just threw

(18:52):
out anybody who had a violent encounter for their first
sexual experience. Um. Apparently that's a big deal with animal studies as well. If you're studying the effects of a
drug or something like there was this one in the article. Um,
if you're studying the effects of a stroke drug and
you've got a control group of mice that aren't taking the drug, and then

(19:15):
a test group that are getting the drug. Um, and
then like three mice from the test group die even
though they're on the stroke drug. They die of a
massive stroke, and you just literally and figuratively throw them
out of the study, um, and don't include them in
the results. That changes the data. And he's been on

(19:35):
a peer reviewer on a paper before. He's like, no, this doesn't pass peer review. You can't just throw out... what happened to these three rodents? You started with ten; there's only seven reported in the end. What happened to those three? And how many of them just don't report the ten? They're like, oh, we only started with seven. Who's gonna know? You know? Well, I was about to say
I get the urge. I don't get it because it's

(19:56):
not right. But I think what happens is you work
so hard at something yeah yeah, and you're like, how
can I just walk away from two years of this
because it didn't get a result? Okay, but that's the point of real science, though: you have to walk away from it. Well,
you have to publish that. And that's the other thing too,
And I guarantee scientists will say, hey, man, try getting

(20:18):
a negative paper published in a good journal these days. They don't want that kind of stuff. But part of
it also is I don't think it's enough to just
have to be published in like a journal. You want
to make the news cycle as well. That makes it
even better, right, Um So, I think there's a lot
of factors involved. But ultimately, if you take all that
stuff away, if you take the culture away from it,

(20:39):
you're if you get negative results, you're supposed to publish
that so that some other scientists can come along and
be like, oh, somebody else already did this using these
methods that I was going to use. I'm not gonna
waste two years of my career because somebody else already did.
Thank you, buddy for saving me this time and trouble
and effort to know that this does not work. You've

(21:00):
proven this doesn't work. When you set out to prove it does work, you actually proved it didn't work. That's
part of science. Yeah, I wish there wasn't a negative
connotation to a negative result, because to me, the value of proving something doesn't work is the same as proving something does work. Right, again, it's just not sexy. Yeah,

(21:22):
but I'm not sexy either, so maybe that's why I
get it. Uh. Here's one that I didn't know was
a thing: predatory publishing. You've never heard of this? So here's the scenario. You're a doctor or a scientist, and, um,
you get an email from a journal that says, hey,
you got anything interesting for us. I've heard about your
work and you say, well, actually, I do have

(21:44):
this study right here? They say, cool, we'll publish it.
You go great, my career is taking off. Then you
get a bill that says, where's my three grand for publishing your article? And you're like, I don't owe you three grand. All right, give us two. You know I can't even give you two. And if you fight
them long enough, maybe they'll drop it and never work

(22:06):
with you again. Or maybe it'll just be like, well,
we'll take you to court. Exactly. That's called predatory publishing.
And it is I'm not sure how new it is.
Maybe it's pretty new. Is it pretty new? But it's
a thing now where uh, you can pay essentially to
get something published. Yes, you can, um it kind of

(22:29):
it's kind of like a Who's Who in Behavioral Science kind of thing, you know. Um. And apparently it's new
because it's a result of open source academic journals, which
a lot of people push for, including Aaron Swartz, very famously, who, like, took a bunch of academic articles and published them online and was prosecuted heavily for it. Persecuted,

(22:50):
You could even say, um, but the idea that science
is behind this paywall, which is another great article from
Priceonomics, by the way, um, really just ticks a lot of people off. So they started open-source journals, right,
and as a result, predatory publishers came about and said, okay, yeah,
let's make this free, but we need to make our

(23:10):
money anyway, so we're going to charge the academic who
wrote the study for publishing it. Well, yeah, and sometimes now it's just a flat-out scam operation. There's
this guy named Jeffrey Beall, who is a research librarian. He is my new hero because he's truly, like, one of these dudes that, uh, is trying to make

(23:33):
a difference and he's not profiting from this, but he's
spending a lot of time creating a list of predatory publishers. Yeah, a significant list too. Yeah, how many, four thousand of them right now? Um. Some of
these companies flat out lie, like they're literally based out

(23:54):
of Pakistan or Nigeria and they say, no, we're in New York, a New York publisher. So it's just a flat-out scam.
Or they lie about their review practices. Um, like they
might not have any review practices and they straight up
lie and say they do. There was one called Scientific
Journals International out of Minnesota that he found out was

(24:15):
just one guy like literally working out of his home,
just lobbying for articles, charging to get them published, not
reviewing anything, and just saying I'm a journal, I'm a
scientific journal. He shut it down apparently or tried to
sell it. I think he was found out. Um and

(24:35):
this other one, the International Journal of Engineering Research and Applications.
They created an award and then gave it to themselves, and even modeled the award on an Australian TV award, like the physical statuette. That's fascinating. I didn't know they could do that. We're gonna give ourselves... Yeah, let's do the Best

(24:56):
Podcast in the Universe Award. It's gonna look like the Oscar. Yeah, okay, the Oscar crossed with the Emmy. Uh, this other one, med med No Publications, actually confused the meaning of STM, Science, Technology, Medicine. They thought it meant Sports Technology in Medicine. No. Well,
a lot of UM science journalists or scientists too. But

(25:20):
watchdogs like to send like gibberish articles into those things
to see if they publish them, and sometimes they do.
Frequently they do. Sniffed them off the case; it's the big time. How about that callback? It's been a while. It needs to be a T-shirt. Did we take
a break? Yeah, all right, we'll be back and finish
up right after this. Just so much so, here's a

(26:05):
big one. You ever heard the term follow the money? Hm.
That's applicable to a lot of realms of society, and
most certainly in journals. UM. If something looks hinky, just
do a little investigating and see who's sponsoring their work. Well,
especially if that person is like, no, everyone else is wrong.

(26:28):
Climate change is not man made kind of thing. Sure,
you know, if you look at where their funding is
coming from, you might be unsurprised to find that it's
coming from people who would benefit from the idea that
anthropogenic climate change isn't real. Yeah, well we might as
well talk about him Willie Soon. Yeah, Mr Soon. Is
he a doctor? He's a He's a physicist of some sort. Yeah,

(26:52):
all right, m M. I'm just gonna say Mr. Or
doctor Soon because I'm not positive. Uh. He is one
of a few people on the planet Earth, um, professionals, that is, who deny human-influenced climate change, like you said. He said the fancier word for

(27:13):
it, though: anthropogenic. Um. And he works at the
Harvard Smithsonian Center for Astrophysics. So hey, he's with Harvard.
He's got the cred right. Um. Turns out when you
look into where he's getting his funding. Uh. He received
one point two million dollars over the past decade from

(27:34):
ExxonMobil, the Southern Company, the Kochs, the Koch brothers, their foundation, the Charles G. Koch Foundation. Exxon stopped funding him, but the bulk of his money and his funding came... and I'm sorry, I forgot the American Petroleum Institute... came from people who clearly had a dog

(27:54):
in this fight. And it's just how can you trust this?
You know? Yeah, well, you trust it because there's a guy, and he has a PhD in aerospace engineering, by the way. All right, he's a doc. He works with this, um,
this organization, the Harvard Smithsonian Center for Astrophysics, which is
a legitimate place. Um, it doesn't get any funding from Harvard,

(28:16):
but it gets a lot from NASA and from the Smithsonian. Well,
and Harvard's very clear to point this out when people
ask them about Willie Soon. Um, they're kind of like, well, here's the quote: Willie Soon is a Smithsonian staff researcher at the Harvard-Smithsonian Center for Astrophysics, a collaboration of the Harvard College Observatory and the Smithsonian Astrophysical Observatory. Like, they

(28:38):
just want to be real clear. Even though he uses
a Harvard email address, he's not our employee. No, but
again he's getting lots of funding from NASA and lots
of funding from the Smithsonian. This guy, Um, if his
scientific beliefs are what they are, and he's a smart guy,
then yeah, I don't know about like getting fired for saying,

(28:59):
you know, here's a paper on on the idea that
climate change is not human made. Yeah, he thinks it's
the Sun's fault. But he didn't... he doesn't reveal any of his, um, conflicts of interest. Uh, that should
go at the end of the paper. He didn't reveal
where his funding was coming from. And I get the

(29:20):
impression that in academia, if you are totally cool with
everybody thinking like you're a shill, you can get away
with it. Right. Well, this stuff, a lot of this
stuff is not illegal. Right, Even predatory publishing is not illegal,
just unethical. And if you're counting on people to police
themselves with ethics, a lot of times will disappoint you.

(29:43):
The Heartland Institute gave Willie Soon a Courage Award, and that tells you he's not caring about what other scientists think. If you've heard of the Heartland Institute, you might remember them; they are a conservative think tank. You might remember them
in the nineties when they worked alongside Philip Morris to
deny the risks of secondhand smoke. Yeah, that's all chronicled

(30:04):
in that book I've talked about, Merchants of Doubt. A bunch of scientists, legitimate, bona fide scientists, who are, like, up for, um, being bought by groups like that. Sad. It is sad. Um, and the whole thing is, they're saying, like, well, you can't say, beyond

(30:25):
a shadow of a doubt with absolute certainty, that that's
the case, and science is like, no, science doesn't do that.
Science doesn't do absolute certainty. But the average person reading
a newspaper sees that, oh, you can't say with absolute certainty? Well, then maybe it isn't man-made. And then there's that
doubt that the people just go and get the money
for saying that, for writing papers about it. It's

(30:47):
millions. Yeah, it really is. Um, self-reviewed. Uh,
you've heard of peer review. We've talked about it quite
a bit. Peer review is when you have a study and then one or more, ideally more, of your peers reviews your study and says, you know what, you had best practices, you did it right, um, it was reproducible, you followed the scientific method. Um, I'm gonna give it my stamp
(31:08):
the scientific method. Um, I'm gonna give it my stamp
of approval and put my name on it, not literally
or is it? I think so? It says who reviewed
it in the journal when it's published, but not my
name as the author of the study, you know what I mean? Um, and the peer reviewer... yeah, as a peer reviewer, and
that's a wonderful thing. But people have faked this and

(31:31):
been their own peer reviewer, which is not how it works. No,
who is this guy? Uh? Well, I'm terrible at pronouncing
Korean names, so all apologies, but I'm gonna say Hyung-In Moon. Nice. Dr. Moon, I think, yeah, let's call him Dr. Moon. Okay. So Dr. Moon, um, worked on

(31:54):
natural medicine, I believe, and was submitting all these papers that were getting accepted very quickly, because apparently part of the process of peer review is to say, this paper is great, can you recommend some people in your field that can review your paper? And Dr. Moon said, I sure can.
He was on fire. Let me go make up some

(32:14):
people and make up some email addresses that actually come
to my inbox and just posed as all of his
own peer reviewers. He was lazy, though, is the thing, Like,
I don't know that he would have been found out
if he hadn't been um careless. I guess because he
was returning the reviews within like twenty four hours. Sometimes

(32:36):
a peer review of, like, a real, um, study should take, I would guess, weeks, if not months. Like, the publication schedule for the average study or paper, I don't think it's a very quick thing. There's not a lot of quick turnaround. And this guy
was like twenty four hours. Dr Moon. I see your

(32:57):
paper was reviewed and accepted by Dr Mooney. It's like
I just added a Y to the end. It seemed easy. Uh.
If you google peer review fraud, you will be shocked
at how often this happens and how many legit science
publishers are having to retract studies. Uh. And it doesn't

(33:21):
mean they're bad. Um, they're getting duped as well. But
there was one based in Berlin that had sixty four
retractions because of fraudulent reviews. And they're just one publisher
of many. Every publisher out there probably has been duped. Um.
Maybe not everyone, I'm surmising that, but it's a big problem.

(33:44):
I'll review it. It'll end up in the headlines now: Every Single Publisher Duped, Says Chuck. Uh. And speaking of
um the headlines, Chuck. One of the problems with science
reporting or reading science reporting is that what you usually
are hearing, especially if it's making a big splash, is

(34:04):
what's called the initial findings. Somebody carried out a study,
and this is what they found, and it's amazing and
mind blowing and it um, it supports everything everyone's always known.
But now there's a scientific study that says, yes, that's
the case. And then if you wait a year or two,
when people follow up and reproduce the study and find

(34:25):
that it's actually not the case, it doesn't get reported on.
Usually, yeah. And sometimes the scientists or the publisher are doing it right, and they say initial findings, but the public, and sometimes even the reporter, will say initial findings. But we as people that ingest this

(34:46):
stuff need to understand what that means. Um. And the fine print is always, like, you know, more studies needed, but no one... if it's something that you want to be true, you'll just say, hey, look at
the study, right. You know it's brand new and they
need to study for twenty more years, but hey, look
what it says. And the more the more you start

(35:08):
paying attention to this kind of thing, the more kind
of disdain you have for that kind of just off
hand um sensationalist science reporting. But you'll still get caught
up in it. Like every once in a while, I'll
catch myself like saying something you'd be like, oh, did
you hear this? And then as I'm saying it out loud,
I'm like, that's preposterous. Yeah, there's no way that's going

(35:29):
to pan out to be true. I got collebated, I know. I mean, we have to avoid this stuff because we have our name on this podcast. But luckily we've given ourselves the back door of saying, hey, we make mistakes a lot. It's true, though; we're not scientists. Uh. And then finally we're gonna finish

(35:51):
up with the header on this one is it's a
cool story. And that's a big one because, um, it's
not enough these days. And this all ties in with
the media and how we read things as people. But
it's not enough just to have a study that might
prove something. You have to wrap it up in a
nice package to deliver people, get it in the news cycle.

(36:14):
And the cooler the better. Yep. It almost doesn't matter
about the science as far as the media is concerned.
They just want a good headline and a scientist who
will say, yeah, that's that's cool. Here's what I found.
This is going to change the world. Loch Ness Monster is real.

(36:35):
This kind of ended up being depressing somehow. Not somehow... yeah, like, yeah, it's kind of depressing. We'll figure it out, Chuck. Well, we do our best, I'll say that. Science will prevail, I hope. So, uh, if
you want to know more about science and scientific studies
and research fraud and all that kind of stuff, just

(36:57):
type some random words into the search bar at HowStuffWorks dot com. See what comes up. And since I
said random, it's time for Listener Mail. Oh no. Oh, yeah, you know what, it's time for Administrative... All right, Josh, Administrative Details. If you're new to the show, you don't

(37:19):
know what it is. That's the very clunky title for saying thank you to listeners who send us neat things. It
is clunky and generic and I've totally gotten used to
it by now. Well you're the one who made it
up to be clunky and generic, and it's stuck. Yeah.
So people send us stuff from time to time, and
it's just very kind of you to do so, yes,
and we like to give shout outs whether or not

(37:40):
it's just out of the goodness of your heart, or
if you have a little small business that you're trying
to plug either way, it's a sneaky way of getting
it in there. Yeah, but I mean I think we
we brought that on, didn't we? Didn't we say, like, if you have a small business and you send us something, we'll be happy to say something? Exactly. Thank you.
All right, so let's get it going here. We got
some coffee right from uh, from one thousand faces right

(38:02):
here in Athens, Georgia from Kayla. Yeah, delicious, Yes it was.
We also got some other coffee too, from Jonathan at
Steamworks Coffee. He came up with a Josh and Chuck blend.
Oh yeah, it's pretty awesome. I believe it's available for
sale to Yeah. That Josh and Chuck blend is dark
and bitter. Uh. Jim Simmons, he's a retired teacher who

(38:26):
sent us some lovely handmade wooden bowls and a very
nice handwritten letter, which is always great. Thanks a lot, Jim. Uh.
Let's see. Chamberlayne sent us homemade pasta, including a delicious
savory pumpkin fettuccine. It was very nice. Um. Jake Grafft, two F's, sent us a postcard from the Great Wall of China.

(38:47):
It's kind of neat. Sometimes we get those postcards from
places we've talked about. I was like, thanks, here, let's
see the Hammer Press team. They sent us a bunch
of Mother's Day cards that are wonderful. Oh those were
really nice, really great. You should check them out. The
Hammer Press team. Yeah. Uh, Misty Billy and Jessica. They
sent us a care package of a lot of things.

(39:10):
There were some cookies, um, including one of my favorite
white chocolate dipped Ritz, and peanut butter crackers. Oh yeah, man,
I love those homemade right yeah. And uh, then some
seventies macramé for you, along with seventies macramé magazines because you're obsessed with macramé. We have a macramé plant holder

(39:30):
hanging from my, um, microphone arm. A coffee mug sent to us by Joe and Lynda Hecht. Oh, that's right,
and it has some pens in it. Uh. And they
also sent us, Misty, Billy, and Jessica, a lovely little hand-drawn picture of us with their family, which
was so sweet, awesome. Um. We've said it before, we'll
say it again. Huge thank you to Jim Ruaine. I

(39:52):
believe that's how you say his name and the Crown
Royal people for sending us all the Crown Royal We
are running low. Uh. Mark Silberg at the Rocky Mountain
Institute sent us a book called Reinventing Fire. They're great
out there, man, they know what they're talking about. And
I think it's Reinventing Fire, colon, Bold Business

(40:12):
Solutions for the New Energy Era. Yeah, they're basically
like um, green energy observers, but I think they um,
they're experts in like all sectors of energy, but they
have a focus on green energy, which is awesome. Yeah,
they're pretty cool. Um john whose wife makes Delightfully Delicious
doggie treats. Delightfully Delicious is the name of the company.

(40:33):
There's no artificial colors or flavors. And they got um
sweet little Momo hooked on sweet potato dog treats. I
thought you're gonna say, hooked on the junk, the the
sweet potato junk. She's crazy cuckoo for sweet potatoes. Nice.
That's good for a dog, too. It is. Very, uh... Strat Johnson sent us his band's LP. And if you're

(40:53):
in a band, your name is Strat. That's pretty cool. Uh,
Diomaea, still... mhm. I think that was great. Yeah, I'm not sure if I pronounced it right: D, I, O, M, A, E, A. Uh. Frederick, this is long overdue.
Frederick at the store one five to one store dot

(41:15):
com sent us some awesome low profile cork iPhone cases
and passport holders, and I was telling him, Jerry walks
around with her iPhone in the cork holder and it
looks pretty sweet. Yeah, so he said, awesome, I'm glad
to hear. Joe and Holly Harper sent us some really
cool three D printed stuff you should know, things like

(41:36):
S-Y-S-K, uh, you know, like a little desk... oh, as in, like, after Robert Indiana's LOVE sculpture. Yeah, that's what... I couldn't think of what that was from. Yeah,
it's awesome. It's really neat and like a bracelet um
made out of stuff you should know, three D carved
like plastics, really neat. Yeah, they did some good stuff.
Thanks Joe and Holly Harper for that. And then last

(41:57):
for this one, we got a postcard from Yosemite National
Park from Laura Jackson, So thanks a lot for that.
Thanks to everybody who sends us stuff. It's nice to
know we're thought of and we appreciate it. Yeah, we're
gonna finish up with another set on the next episode
of Administrative Details. You got anything else? No, that's it.

(42:17):
Oh yeah. If you guys want to hang out with
us on social media, you can go to s Y
s K Podcast on Twitter or on Instagram. You can
hang out with us at Facebook dot com, slash stuff
you Should Know. You can send us an email to
Stuff Podcast at HowStuffWorks dot com, and as always, join us at our home on the web, StuffYouShouldKnow dot com. For more on this and

(42:43):
thousands of other topics, visit HowStuffWorks dot com.
