Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies. History is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know. Welcome
(00:22):
back to the show. My name is Matt, my name
is Noel. This just in they call me Ben. You
are you? And that makes this stuff they don't want
you to know? And we're broadcasting today from an active
tornado zone. No kidding. Yeah, tensions are high in the
office right now. People are staying away from windows. It's
kind of fun and scary. Thankfully, the podcast studio is
a tiny windowless box, so we are actually in the
(00:42):
safest possible location for this sortie. Actually, yeah, except we
do have two windows. Well, I was just trying to,
and then you let the facts stand in the
way of a good story, Ben. And also, you're right,
these are interior windows, they're not exterior, which is totally different.
Uh, I would say that that still passes as factual.
But yes, the rumors are true. Tornadoes were sighted. Tensions
(01:04):
already running high in the office are amplified by our
meteorological circumstances. Speaking of not letting the facts stand in
the way of a good story. What are we talking
about today? Yeah, I was gonna ask as a way
of lead-in. Where do you, uh... We're talking about news.
Where do you guys go for your news? You're two
of the smartest people I know. Apparently my method of
(01:25):
getting news is flawed, which we'll talk about today. But
I get it from the Facebook feed and I subscribe
to things, and um, I kind of find that the
things that I want to read pop up pretty often
and I don't really have a problem with it, and
I don't really have to venture much farther than my
my Facebook feed. Yeah. Yes, for me, it's in the
car generally, so listening to National Public Radio or Georgia
(01:46):
Public Broadcasting here in Atlanta, or various subreddits. So essentially
social media. I guess that's what Reddit is. Worldnews, news.
There are so many of them. I do a lot
of scrying myself. Uh, goat entrails, stuff like that
can be a little bit vague. Um. Yeah, I'm a
fan of a lot of internet news. Like most people
(02:09):
in our generation, I tend to get a lot of
news online. I don't currently. I was subscribed to the Economist,
which got a little bit expensive for my taste. Uh,
and I do, and I do read um print when
I don't think they'll run me out of the UH
news agents or the Barnes and Noble. But if you
(02:31):
are like most people, UH, then you also get your
news from some form of social media. And I would argue,
I would argue that Reddit counts as a form of
social media. I think it is. I don't know if
it's like a really elaborate forum, right, yeah, yeah, exactly.
But we have some statistics so it's not just you know,
(02:52):
the three of us talking about this. We have some
statistics from the Pew Research Center about how news watching
breaks down in percentages across the US population. Well, to
begin with, sixty-two percent of United States adults get their news
from social media. And that's all of them. That could
be Facebook, that could be Twitter or whatever you're using, Yeah,
(03:16):
insert your favorite app here. And eighteen percent of that
population does this on a frequent basis. I mean, for me,
the Facebook feed is crack. I mean it's I'm sad
to say it, but I I look and refresh that thing,
even knowing that nothing new is gonna come up. It's
a it's quite a compulsion, and I share with I
think many others in my generation. I'm not proud of it,
(03:36):
but for our purposes here, I'm willing to come clean.
You're an honest man, and you know what, I will
also follow your lead here. I was lying earlier. I
get most of my news from Tinder, and it's, yeah, yeah,
because you can't read the whole story unless you swipe
the headline. You realize those are all bots, right, then, well,
they do have a lot of They do have a
(03:58):
lot of bot related news. Yeah. Yeah, I don't know
where else to go with this joke, So let's go
to two thousand and twelve, when pollsters used a slightly
different question than the Pew Research Center used, and then
they found that forty-nine percent of adults reported seeing news on
social media, rather than getting it, just seeing it. Oh
(04:18):
it was there, I'm pretty sure, right, which is a
little bit weaselly. Uh. Since Facebook is the largest social
site, reaching slightly more than two thirds of US adults,
it means that the Facebook users who get news there
equal about forty percent of the population. So almost half
of the population is getting news from Facebook. Twitter
(04:40):
users do that too, and thirty-one percent of Tumblr users. And
we should also mention YouTube here because that's one of
the places where where we thrive, and that reaches about
forty-eight percent of the US population, but only about
one in five of the viewers on YouTube say they get
news from that site, which makes it about ten percent
(05:01):
of the U s population overall, so quite a bit smaller,
and that was surprising to me. I feel like I
end up watching a lot of news on YouTube. Well
you're you're an exceptional person. Well, I think I'm just
I think we're on there all the time just because
of our job, so maybe I see it more. You
just sit there and watch your own videos all day,
don't you. Man, Yeah, I get my own news from
(05:22):
me, preaching to the choir. What's interesting here, though: there
isn't a whole lot of overlap, which is interesting,
but to me, not that much of a stretch. The
majority of people get their news from only one site.
But like I said at the top of the show,
I think for a lot of people, that one site
is like an aggregator like Facebook, and you might be
looking at things within that site, but it's all content
(05:45):
from other sources that's then being pulled together into this
singular feed, and sometimes there'll be versions of something that
exists in other places that's tailored just for that Facebook experience.
So you are still getting your news from other sources,
it just happens to be pulled together in this one
like mega site that, as we're going to talk about,
gives a lot of power to certain... Right, and that's
an excellent point about aggregation. So one thing that I
(06:10):
don't really care for in the world of print readers
is people who will say, well, you know, I'm
subscribed to the New York Times, I'm not
the sort who gets their news from the Internet or Facebook,
because Facebook is not necessarily bad in that way. You know,
like you could subscribe to the BBC page and the
(06:31):
Al Jazeera page and and whatever other media you want,
and you would get you could probably get or hopefully,
in theory, get these opposing viewpoints and you can always
visit their pages to see what they've published about a thing, right,
And that doesn't I think it's unfair for people to automatically,
um, dismiss that as a legitimate source of getting news,
(06:52):
especially when darn near half of the US population does
that primarily, you know. But what happens when you see that
sponsored post that pops up and says Donald Trump gives
birth to devil baby. Right. What do you do then? Well, uh,
I would say the first part is to check the sources,
(07:14):
make sure you know where you were nine months ago,
or however long it is that gestation takes, just in case.
I mean, just logistical contingencies. But Noel, you raise
a really great point because not all of these news
sources are created equal, even if they seem to be
displayed equally in a news feed. And of course I
was being a little bit over the top with that headline,
(07:37):
but uh, in that same little sponsored spot, you may
well see a headline that has some seems to have
some merit, or maybe it even aligns with an opinion
that you already have, and you're kind of coming from
that place where you want to believe it. You maybe
even want to share it without checking into its veracity.
Of course, you've got stuff to share, you're, you're moving
(07:59):
and shaking, and in the, in the ever-hungry world
of social media, you don't have time to do the
uh the source research. Uh, but just in case you do,
let's go ahead and lay out what what are the
basic rules of research we talked about before? A quick
and dirty version. You guys know, I know. Always verify
(08:19):
source information, multiple sources if possible. It doesn't matter if
just the Washington Post or the Denver Citizen or whatever
said something happened. See if other places, preferably with typically
opposing perspectives, also have the same thing. Learn about the
source presenting the information, like, Noel, the devil baby thing? Uh?
(08:42):
Who published it? If that comes out of Weekly World News...
Weekly World News? Right. So Weekly World News, no one's
saying it's a bad paper, they have a bit of
an agenda, right? Most places have a bit of an agenda. Uh.
And you know, look for sources that claim to have
disproved the info you're examining. That's the most difficult part
(09:03):
for a lot of us, right. Uh. And if I'm
convinced that something is real, like if if I'm convinced
the Simpsons will objectively be the best cartoon in human history,
or that Arkansas was originally pronounced "Ar-Kansas," then what I
(09:23):
should be doing is looking for someone who believes The
Simpsons was objectively the worst cartoon in history, and then uh,
looking for someone who believes it was always "Arkansaw," and
asking them why they think that. But that's a very
hard thing. To do because it means that we are
escaping our bubble, and escaping the bubble was once the
(09:45):
once the prized skill set of, um, a dying, a
dying race of humanity called investigative journalists. Whoa. I've seen
those in a museum once. 'Member journalists? 'Member? I 'member.
We've probably heard people lament the death of journalism, and
while it's not completely extinct, it's certainly in poor health. Yeah.
(10:10):
When I was a journalism minor back in school, newspapers
were still seen as the hallmarks of what journalism could
and should be. Integrity. Yeah, absolutely, integrity, truth, being objective, finding,
finding the facts and leaving out opinions, one of the
things we try really hard to do on this show.
(10:31):
But guess what, guys, those newspapers, they're going away,
at least the print versions and the jobs of journalists.
They are nowhere near as secure as they once were, right, Yeah, absolutely,
So we have some statistics on this as well. So
compared with print, nearly twice as many adults UM get
(10:52):
news online, and that's thirty eight percent of adults either
from news websites or apps, like a proprietary, you know, app
version of the New York Times, for example, on social
media, eighteen percent, or both. And you know, one of the big,
really big problems with these online versions of newspapers is
that I've just experienced it before we came in here.
(11:12):
I looked at ten New York Times articles this week,
and now I can't look at any more New York
Times articles unless I subscribe. Same thing with the Washington Post. Uh,
And why don't you just subscribe? Well, I know, and
I should subscribe, And I keep thinking I should just subscribe,
but thinking about giving a dollar a month or you know,
whatever tiny amount it is, it's hard for me to
(11:34):
to justify that when I can get roughly the same
thing from free sources somewhere else. I'm just playing Devil's
advocate here, but I mean to me, it's just a
matter of like it's too much, Like what am I
going to get a subscription to the Times and the Post?
And like when I can get it all from like
an aggregator or like get the high points from different
different sources like we talked about like with Facebook and
(11:54):
you know, um Twitter. Yeah, there's a pattern that occurs
when people are when people are getting something online. Let's
just be frank it's the ideas, there's all this other
free stuff that purports to have this information, why should
I pay for article eleven? Just like you said, Noel,
(12:16):
you know, there's another place probably that I can find
basically the same thing. And one of those big ones
is of course television, which I can't believe we haven't
talked about yet. Turn that news on, yeah, I mean,
especially with some of the bombast that was being
thrown around on TV news during this most recent election.
The ratings are up for TV news and it continues
(12:37):
to be the most widely used news platform, with United
States adults getting TV-based news either from local TV,
cable, or network television, or some combination of the three. Yeah,
and that pattern emerges when people are asked not just
where they get news, but what kind of platform they prefer.
So TV is still at the top, despite all the
(13:00):
cord cutting that's happening. TV just had such a lead
that it's just less in the lead, but it's still
number one. I wonder how they how they count streaming
of TV news, like a network TV news. Oh wow, yeah,
it's a it's a huge disaster because who gets responsibility
for the numbers or maybe a quagmire is a better
(13:20):
word for it. And then second would be the web.
Radio and print are trailing behind. The one thing that's
really saving radio would be online radio, of which podcasts
are a part, arguably. And then you know the fact that
most cars have a radio that plays free broadcast stuff
(13:40):
inundated with commercials, or in NPR's case, "today was brought
to you by this organization and our listeners." So what's
replacing journalism, then? That's, that's the thing that you've probably
heard a lot about in the news recently. Right, Facebook
(14:04):
got dinged for it. Various people purporting to be reputable
organizations get dinged for it. What is fake news? And
we'll get into that right after a quick word from
our sponsor. Here's where it gets crazy, but not really,
(14:29):
because what do we mean when we say fake news?
It's like news that's fake. Nailed it, nailed it. So
so we live in a world now where everyone is
battling to get your eyeballs and your fingers to click
on a tiny little section of a screen somewhere. That's
that is the major battleground in our world right now.
(14:50):
How do I get you to click my thing that's online? Gosh,
that sounds dirty, but it's not. It's just like my
thing that's online dot com. Well, it's clear, it's literally
"click my picture that has a couple of words below it," right?
That's all it is, click me. And, and because what
happens then is if you do click on that, somebody
gets paid somewhere down the line just for you viewing
(15:13):
that and clicking on that, because of the ads that
will be adjacent to that or will pop up on there.
And what ends up happening is that when everything is
kind of crammed into this format like a Facebook feed
for example, everything kind of exists side by side, so
things tend to visually kind of carry the same weight. Right,
(15:36):
So if you see exactly if you see like a
news feed that has you know, um, maybe CNN or
BBC or New York Times or, what, like Scientific American
or something like that, and then, uh, the Economist,
and then you see sandwiched in between them there's something
for like, you know, stop the Welsh dot org and
(15:58):
it's like it's like, you know, the sheep genocide continues
unreported in Wales, and so on. And especially if people
already have something against the Welsh or yeah, I guess
this is a bad example, but then you know, it's
just like Noll said, it has kind of the same
credibility that there's like a an osmosis of credibility that
(16:23):
occurs when you see these things displayed in the same
format on the same page, with only an icon
separating them. So we've, we've seen this, um, and there's,
there's a great article in a place called Vox, which,
I'm sure if you like this show, you've heard of
Vox, where they, they report on what's called the fake
(16:46):
news problem. Right, that's that's that's a big thing that
and it popped up quite recently. But it's a practice
that you know, as you point out, the format is
is old. It's so much it's older than the Internet,
you know. And Mark Zuckerberg originally said, well, less than
one percent of the content on the site could be
(17:07):
called fake news or a hoax, you know. Uh. The
devil Baby stuff is a very is a very small minority.
But pundits and self-appointed policy wonks and experts are
raising Cain about it because they are convinced, some of them,
that it it directly influences people's opinions by exploiting their
(17:31):
reasoning ability, and there's solid psychological science behind how you
could get your rational thoughts sort of circumvented. But let's
be honest, the reason they're saying that is because of
the election in the US. Yes, it was a big
deal in the election, because you could be a third
party site that doesn't actually have news, but has some
(17:52):
story that you want to tell, and you can buy
an ad. Let's say for Facebook, if we use that
as an example of what Zuckerberg's talking about, you pay
enough money into Facebook and then your your fake news
or your whatever it is, Your ad is going to
go out to X number of people and it's usually
in the thousands, if not tens of thousands or more.
(18:13):
And so let's say you happen to hit the one
in five people who say they change their political views
due to social media. Then what happens is, uh. What
happens is maybe maybe an article comes out that says
candidate A is responsible for the uh, the double homicide
being covered up of the FBI agent and their spouse.
(18:38):
And the problem is for people who read the article
that it turns out this place only posts bad things
about candidate A, and then the weather, which they pulled
from another site. Sure. And it turns out the homicides,
uh, weren't homicides, because you can't actually kill fictional
(19:00):
characters and the FBI people or the FBI family never existed.
But for people who see that they go, oh wow,
I don't want to support this kind of thing. Then
that's that's the argument, and I'm not I'm not even
applying it specifically to this election. I'm saying the argument
there is that people can be so easily swayed because
our analytical and our emotional parts of our brain don't
(19:25):
work very well at the same time. What was interesting
to me about the way Zuckerberg reacted, um, it was
kind of a too little, too late kind of scenario
where you know, it's clear that these fake news sites
did gain some traction on Facebook, and you know, while
it's not clear that they necessarily played a huge discernible
part in the outcome of the election, there's no doubt
(19:47):
that it was a tool used by some. And you know,
Facebook and Google in the past have basically shut down
sites and types of sites from using their ad
services for much less, you know, for, for supposedly
having quote unquote, you know, low quality content or like
some of these click farms that have kind of like
(20:09):
they just farm out a bunch of writers to write
these kind of a little short click baity articles that
give you like ten ads to half a paragraph per page.
And you know, Facebook and Google have changed their algorithms
to kind of deal with those sites. It just is baffling
to me that it took until after the election for them
to actually do something about this, because there's a lot
(20:29):
of money involved there. For Facebook, I just have to
say that they're, they're getting ad dollars right there. They're
getting ad dollars using, of course, your personal information. And
the the strange thing is, yeah, Google did some good
work to restrict what they would consider, what, what
did you, what term did you use, Noel? I really
(20:50):
liked it. Low quality or low value? Just low. Yeah,
I mean low quality content, I guess, in terms of,
like, it being not particularly robust in its scholarship. Yeah. Yeah,
here's here's a little bit, a little bit behind the
scenes inside baseball and some of this stuff. So the
phrase that we used earlier there, the click farm is
(21:13):
just like was described: uh, a bunch of ads on a
page that has a short amount of text or actual
information and has ten steps. What got me, and I'll
go ahead and say it, although I don't usually mention
specific names on here. It's like those eHow articles man
(21:34):
where where they'll have this very strange thing that you
don't really need information on, like how to how to
tie your shoes five different ways? But then they're twenty
I'm making this example up, by the way, But then
there are twenty five slides and every fifth one is
(21:55):
just an advertisement, not counting the ads that are on the page,
which refresh every time you click to the next, which
refresh continually. Yeah, and so there's there's a clear financial
motivation for this for some people. But also that financial
motivation is in competition with a site like Google, which
wants to be known for giving you the best answer
(22:17):
right in the most efficacious way. However, as, as
we see, this ability to control what search generates is
very much a ring of power situation. Well, before we
get to that, what are what are people when when
we hear the phrase fake news? What are these people
talking about? What? What are the categories of fake news?
(22:41):
So I've seen this summed up and parsed out by
several different sources, but one that was a little surprising
was uh to see that CNN made a little list
of the different types, So let's go through. Let's go
through that list as an example, the first one would
be called fake news, straight up fake news. Uh, fake news.
(23:04):
UM Jonathan Strickland and Josh Clark discovered at a KKK
rally disguised as, uh, jackalopes. Whoa. I want to
click on that. But it's, it's... These are the easiest
to debunk because they're just obviously from some kind of
(23:27):
sham site. It's designed to look like a real news outlet.
But then if you do a little more digging, um,
you just find that there's there's nothing here. There's gonna
be misleading photographs, some kind of headline that just makes
you dive headfirst into that thing. Uh it's you know,
at first read, it sounds like it could be real
right with a lot of this fake news, and the
(23:49):
name might sound similar, might sound something like it could
be plausible, you know, like the um, let's just make
one up, like the Gwinnett Picayune or something. Sure, yeah,
you know, and that, that's not a real paper, I
hope. Editorial staff of the Gwinnett Picayune, I apologize if
I have offended you. But yeah, so that's that stuff
(24:10):
is pretty pretty easy. And then they might have uh
misleading paragraphs or photographs, but they're also designed not to
be read in their entirety. Yes, it's designed to hey,
look at this, share it like. Do you see this
picture that's a little misleading. Do you see this catchy title?
Share it out with your friends right now? Yeah, And
(24:31):
then maybe there's a and then there's a second category,
which would be misleading news. This is a little bit
trickier to to parse, uh, as you said, Matt, because
this could be something as simple as an out
of context quote. And oddly enough, this happens a lot
in fandom when people are trying to farm out spoilers
(24:52):
for their click bait stuff. Um but one example, if
we're if we're just making one up, one example would
be maybe Tom Cruise is in a daytime talk show interview,
and in his daytime talk show interview, he says that, uh,
(25:13):
he would never, um never tell a woman what to
do with her body as far as like carrying a
child to term or having an abortion. And then someplace
you know that names itself the Gwinnett Picayune or whatever.
And again, I hope you're not real, but if if
(25:33):
you are, I apologize. Um, I'm sure you would never
do this, but you know, some fake news site would
take the context quote the phrase would never and have
that in quotation marks. And and he goes, Tom Cruise
says he would never save an innocent child's life, you know,
And that's not what he said. He said he wouldn't get,
(25:56):
you know, in the way of somebody else's decision. Also,
celebrities don't have the authority to tell people what they
can do. There was an interesting twist on that where
they are like localized versions of fake news stories. I
don't know if you guys have seen these, but um,
speaking of the Gwinnett Picayune, maybe there's a story
that comes up as from Gwinnett Picayune dot co
(26:17):
dot biz, and it says Miley Cyrus, um, helped a
poor stranded old lady change her tire in Madison, Georgia,
and decides that she's gonna move there because it's just full
of wonderful people. So you'll see that article, and if
you google the text of it, you'll realize there's fifteen
(26:38):
twenty other different ones that have all the same content,
but the places are changed, so everyone wants to share
Oh my gosh, Miley Cyrus is moving to my community,
you know, and so everyone will just share that blindly.
And those are the kinds of stories where you'll see
on Facebook, someone comments and says, you know this is fake, right?
(27:00):
Why? People are so excited because they want to believe
that their, you know, pop star crush or whatever is
moving to little sleepy old Madison, Georgia. That reminds me
of another example, just the straight up click bait that's
just trying to get you to click on something, but
to what end, for what means? I guess it's just
getting those ad dollars. Experts hate them! Yeah, yeah. Or,
(27:23):
oddly enough, one of the most effective: "don't click
on this." I was so happy. I saw some stranger
on the internet who had replied on a "don't click on
this" thread, and their reply was just, okay, and
that was great. It's like the people who
hear our intro on the show and decide to turn back,
(27:44):
you know, which has happened and I respect your integrity.
So another thing would be highly partisan news and this
is you know, interpretation of a real news event where
the facts are manipulated to fit a particular agenda,
omitting facts or adding in a little extra opinion, so
cherry picking. You know, here's what you have to do.
(28:06):
Now you have to figure out is this fake news?
The the article that you're reading that has the list,
what's what's the what's the website? Right, we're getting a
little too meta here. No, this is no. This is
what we are struggling with. Are we real people? That's
what I mean? You guys are guys are freaking me out.
Should we take an ad break so I don't totally
(28:27):
lose it? You get it together? Well, we have one
more thing and then we'll have an ad break. Satire. Well,
I love satire. That's like the Onion and stuff. Well, yeah, exactly,
it's fun. It's the Onion. But we've seen examples where
an Onion article gets posted perhaps internationally by another country,
and people believe it. So, one of my favorite examples is
(28:50):
that Joe Biden is a legend on the Onion because
of the way they portrayed him. He is a legend,
and he is everyone's uncle Joe. They portray him as
this, uh, burnt-out, you know, former, like,
rock and roll roadie, concert roadie type, and they have
(29:10):
all these articles about him trying to do things like
get Scorpions tickets or selling bad weed or bootleg Hillary
Clinton campaign t shirts. And one of these articles I
saw somebody who had posted it as though it were true,
and people who agreed with them were like, can you
believe that that's the guy who's in office? Like, you know,
(29:33):
like our founding fathers would be rolling over in their graves,
And so I commented, and you know, I said, this
is a fake news website. It's pretty funny. I was,
I was saying, like, yeah, look, man, I wish Joe
Biden was doing stuff like that, but he's probably doing
more grown-up things. Yeah. I don't know if you
guys remember these. It seems like I've been seeing fewer
(29:54):
of them recently, But there were a lot of fake
news sites making the rounds that kind of purported to
be satire, but were really just kind of poorly written
fake news stories with some kind of punny name somewhere
in the body, but like it was, you know, a
story that could be believed and was written in such
(30:15):
a way. Where, where's the comedy? There's, there's really not
much comedy going on here, right? Right. What, one thing
that reminds me of, and this is related, is uh,
the prevalence of reality shows and things that purport to
be accurate depictions of one person's life for a certain
moment or event or phenomenon and are clearly clearly faked
(30:38):
or as people in the industry like to call them produced.
I'm loving these air quotes by the way, you know what,
you know, I've been using them ever since we ever
since you brought this in here. But but probably the
most egregious example for us on this show was a
company that we were attached to at one point, uh, wanted
(31:01):
to do this documentary, this mockumentary pretending that mermaids were real,
and a bunch of people, you know, believed this organization,
and we did put our foot down and refused to really,
uh in any way support that because we don't under
(31:21):
you know, we didn't we didn't see a benefit in it.
But the defense there, which was very tenuous, was going
to be like, oh, well, this is clearly a mockumentary
and this is clearly a satire, and it absolutely was not,
you know, I mean, it's it's kind of like viral marketing.
It's like trying to do what the Blair Witch Project
did very cleverly, but you know, they established this whole
(31:44):
thing where, hey, maybe this is real, you know, and
like it makes you, it has this air of mystique
about it, and then it makes you want to share
it with your friends because you're the one who knew
that it was real. And that's the same impulse that
goes into sharing all these BS news stories. You want
to be the one that, that is right. You want
to be the one that that disseminates it because it
(32:04):
supports something that you want to believe, or something that
you have espoused yourself on your page, on your Facebook page. Oh,
this supports everything that I've been saying. Well, and also
there's another thing that's even more that I I totally agree,
it's more dangerous because we all have this drive. It's
the human drive to be the authority in the room,
the Prometheus bringing the fire, the Moses with the commandments,
(32:26):
the Lucifer with the light, to be the one who says,
not only does this agree with what I believe, but look
upon me, my Internet friends, I have led you to wisdom.
And that's a very dangerous thing. I remember feeling that
in my twenties hard. Oh yeah, oh yeah, that's how
I feel every time a celebrity dies, and I want
I want to be the one that breaks the news.
(32:49):
You want to I'm kidding you, guys. Just before we
get off the satire really fast, I'm on the Onion
dot com and here I'm just gonna read one of
their headlines because it fits this so perfectly. "Longtime
reader of LibSlaves dot info sick of mainstream bias
on sites like WideAwakePatriot dot com." I love it. Yeah,
(33:12):
that's the thing. It goes, It goes both ways. And
I found some interesting statistics about the percentages of I
guess the percentages of politically identified splinter sites doing this,
and they're distressingly high. Yeah. I feel like this is
a pretty decent spot to take a quick break, how
about you, guys? Yeah, totally. And we're back, but we're
(33:42):
not in the normal place that we were earlier. Nope,
changed, changed spots here, still in the windowless tornado shelter
bunker place, but mentally, mentally. Now, we were talking about something;
now we're finally inside it, and you're in here with
us. Just listen and you can hear the internet echo chamber,
(34:03):
Chamber, chamber. We got real meta while you were away,
and now we are kind of like stacked inside of
each other like matryoshka dolls, you know. And um, we're
going to communicate as one single organism from this point on.
I'm gonna stop saying check this out and start saying
click this. Uh yeah. So yeah. So here's the big
question though, we we we should definitely address why do
(34:26):
people make fake news? And there are a couple of
different reasons. Why do people do most things? Because they
want to make money. They want to make a quick
buck, or entertain themselves, or entertain themselves or others, you know,
and get a quick buck. Um, but I digress. Uh.
NPR did what I thought was one of the coolest
(34:48):
pieces of you know, we talked about the lack of
investigative journalism in our day and age, and I felt
like this was a pretty fine example of that. So
a couple of NPR producers, um, with some assistance, traced
some prominent fake news U r L s uh such
as and I love this because this is something you
can really see for yourself. If you're on Facebook and
(35:09):
you see a suspicious site, check out what the actual
URL is. So they looked at the URLs
National Report dot net, USA Today dot com dot co,
Washington Post dot com dot co, and all
of these URLs pointed to a
single rented server, um, that's owned by Amazon, who does
(35:33):
a ton of web hosting and you know, high level
cloud storage and you know, things like that. And they
were able to find an email address. The NPR producer
Laura Sydell and, uh, an intern, I believe, um, went
and found this person's address and knocked on his door,
(35:55):
and at first he answered but pretended he didn't know
what they were talking about. Get out of here, get out of here,
get out. And, um, they, you know, I guess, slipped
a card under his door, and he had a change
of heart and gave them a call and his reason
for doing the fake news site. UM. He did talk
(36:16):
a lot about how he makes a decent living um
somewhere between ten and thirty thousand dollars a month. Not
too bad. It was just him and his wife, you know,
in a suburban Los Angeles home. That's, that's not bad. That's
not bad. Yeah, that's real good actually, because I mean,
you know, they're paying freelance writers a pittance. I'm sure
(36:38):
to turn out these stories and you know, a couple
of social media folks here and there, but not a
hugely costly operation in terms of overhead. So that's those
pretty good returns. Um But what did he say, Ben,
what was his reasoning behind it, besides raking in that cash? No,
you're you're asking me on this one because you know
that I frankly don't believe it. Uh So he he
(37:00):
claims that he had done this because he wanted to
out the essentially the credulity of people that he did
not politically agree with. So a lot of the stories
that were published on these sites were what you can
think of as red meat, you know, stories where people
would say, oh, I always knew that Barack Obama was
(37:22):
a bad character, and now, like Prometheus, like Lucifer, like Moses,
I am the smartest in the room, I'm bringing you
this knowledge online. And he said he doesn't believe in that,
but he was amazed the first time he did it,
how crazily well it worked because a completely fictitious story
that he made up went viral and then some uh
(37:46):
local legislator began to, like, put a bill out based
on worries created by this story. One headline mentioned in
the story was "FBI agent suspected in Hillary email leaks
found dead in apparent murder-suicide." Um, Coler admitted the
story is completely false, but just the same was shared
(38:09):
on Facebook over half a million times. Jeez. There was
another one about people in Colorado using food stamps to
buy weed and so and what actually happened in real life?
A state representative in the House of Colorado, I presume, proposed
actual legislation to prevent people from doing that. I think, yeah,
I think that was the one I had heard. But
(38:32):
you know, that brings up another question. Maybe it's for
a different episode. But you can use food stamps to
buy butter, right? EBT, like, kind of banking stuff. So
my take is, yeah, couldn't you buy, couldn't you buy, like,
hemp butter or weed butter or whatever they sell in Colorado, Ben?
Oh yeah, not here, not here, obviously not here on
(38:54):
the internet and in the echo chamber, chamber, chamber. So yeah,
that's that's interesting because that's a motivation for making the
fake news. This guy, uh would have us believe that
he is more on a crusade of sorts or you know,
he's on a mission where money is nice but doesn't
(39:15):
completely compel it. However, there was another piece. This was
by the New York Times. I know I've mentioned them
several times in the show. Uh Luckily I I still
have this article up without exceeding my ten, because, full
disclosure, I too am going to get a subscription
(39:35):
one day. Uh. So in Tbilisi, Georgia, the country Georgia,
different Georgia across the Atlantic, a computer science student at
the premier university there decided that he can make money
from America's appetite for partisan political news. So he set
up a website and he posted stories that were gushing
(39:56):
stories about how great Hillary Clinton was, and he waited
for ad sales to to soar, but it didn't work.
So the student, whose name is Beqa Latsabidze, was only
twenty two at the time, and he said, you know
what does drive traffic? Stories that applaud Donald Trump while
(40:17):
mixing real and fake news that's anti-Hillary Clinton. And
so he started making money from from that, and for him,
it was entirely out of uh profit motive, you know
what I mean. For his part, he didn't seem, apparently,
(40:38):
to really have a horse in the game for either candidate,
except insomuch as they might be an income source for him.
So he would get this, you guys, He would often
cut and paste, like not rewrite, not rephrase. He would
just cut and paste stuff he found other places and
then put it as a news uh news headline and uh.
(41:00):
Then he ran into a guy named John Egan who
was in Canada, and he had a satirical site. And
maybe this is close to what we were comparing earlier.
He had a satirical site called the Burrard Street Journal. Um,
so it was not trying to fool anyone, in his opinion,
(41:20):
it was satirical. He called it a gold mine. But this
guy in Georgia starts cutting and pasting the satirical stuff,
putting it with these real fake news things, and boom,
a monster has grown. And this this is happening so
often that it is if you are on Facebook, it
is statistically it's darned near impossible. It's very unlikely that
(41:47):
you have not seen one of these things, and you
shouldn't feel bad if you saw one and fell for it.
Before you know. I I have read stuff before where
I caught myself at the headline, or I caught myself
at the by line and said, hang on, hang on,
who is Ben Bowlin Stinks dot blogspot dot org?
(42:14):
I just that somebody you might want to bag. So
how does this echo chamber work? We talked about this
a little bit before, But what what's happening when when um,
let's say, Nolan, I am Matt share a story that
(42:38):
that is true. Let's say we have a true one.
So many social media sites uh, and enterprises like Hulu
or some other cable providers, they want their users to
have a good time, to be highly engaged, clicking
left and right, and you know also click click, hint hint,
wink wink, nudge nudge, cough cough, to buy stuff when
you're hanging out with them. So they know, for example,
(43:02):
what's, like, a random person's name, real or fake? Random person? Okay,
what's Gregory's last name? Trumbull? Okay. So, so they know,
for example, that Gregory Trumbull tends to click on stories
that concern puppies, Russia, underwear, or whatever, and so they'll
tend to display other stuff or prioritize that in their
(43:23):
algorithm related to that, uh, you know, puppies, Russia, underwear,
or whatever. But they'll also find other users who like puppies, Russia,
underwear, or whatever, and say, well, what do they like?
And if it turns out that they also like, I
don't know, Big League Chew or custom, um, what's something
(43:44):
dumb? Custom beach snowshoes and sand shoes. Yeah, and, uh,
And then he'll just start getting ads for those and
you might be like, why is this happening? This gets
dangerous when it's brought to bear on topics other than
cute animals or consumerism. So if it's touching on world news, politics,
perceptions of science, and so on, it can put some
protections of science, and so on. It can put some
serious limits on the scope of our world views. I mean,
I'm sure that my worldview is is relatively limited just
because I largely I overwhelmingly read the Internet in English,
read things online in English rather than other languages.
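To make the mechanic described above concrete, here is a minimal sketch, in Python, of the kind of "people who clicked what you clicked also clicked..." co-occurrence scoring a feed might use. This is not any platform's actual algorithm; the users, topics, and weighting below are invented purely for illustration.

from collections import Counter

# What each hypothetical user has clicked on recently (all made up).
clicks = {
    "gregory": ["puppies", "russia", "underwear"],
    "user_a": ["puppies", "russia", "underwear", "big_league_chew"],
    "user_b": ["puppies", "underwear", "beach_snowshoes"],
    "user_c": ["local_sports", "weather"],
}

def recommend(user, clicks, top_n=3):
    """Suggest topics that similar users clicked but this user has not."""
    seen = set(clicks[user])
    scores = Counter()
    for other, topics in clicks.items():
        if other == user:
            continue
        overlap = len(seen & set(topics))  # crude similarity: shared topics
        if overlap == 0:
            continue  # dissimilar users never influence this feed at all
        for topic in topics:
            if topic not in seen:
                scores[topic] += overlap  # weight suggestions by similarity
    return [topic for topic, _ in scores.most_common(top_n)]

print(recommend("gregory", clicks))  # -> ['big_league_chew', 'beach_snowshoes']

The point of the toy is that nothing in it ever surfaces what a dissimilar user reads; the heavier the overlap weighting, the less a user ever sees outside their own cluster, which is exactly the bubble being described here.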
(44:29):
And I probably have some some pretty crazy viewpoints that
just seemed normal to me. That's the other thing. That's
the mind blowing thing. It's completely possible. It's completely possible
and completely like you said, normal to you. So you
are going to have a hard time knowing that you're
inside this bubble, right, or at least that you can't
see the bubble. Yeah, I might be thinking something odd,
(44:51):
like I might be very offended by the fact that
goats don't wear pants, and in my world, because I've
spent time looking at that and they've built this profile
of me, they'll they'll be like, look at you know,
I'll have fake news articles about it or something. And
even in like a Facebook situation, you know, sometimes people
will kind of cull their friends list if they start
(45:14):
seeing too much stuff that exists outside of that bubble. Which,
let's be honest, sometimes we've we've got some relatives maybe
or some friends we haven't seen in a long time
that we're Facebook friends with, who post some things that
we find offensive maybe, or, like, just kind of stupid.
Maybe we, we don't want to see that anymore.
So maybe you don't unfriend them, but you can kind
of hide those posts. But then sometimes we do it
(45:36):
when faced with things that directly oppose our own viewpoints,
and maybe that's not such a good thing. Yeah, I'm
guilty of that. Yeah, I think you know, we saw
that happening a lot recently in in the US. We
think we know better, though, is the issue we think
we don't need this because we know what we believe.
I got it figured out, guys, I got it figured out,
(45:58):
all right. But that, that's... And imagine, ladies and gentlemen,
imagine a world where liking an article by Al Jazeera or
The Nation... In a world where liking an article by
Al Jazeera and The Nation, perfect, means that you will
only see articles relating to that publication, or the interests
of people who like that publication, you're going to be
(46:18):
less and less likely to understand the other side of
a story or opposing perspective, or maybe even hear about it,
you know what I mean. So the problem is
that there's proven psychology behind this. There's an excellent thing
I'd like to talk about called the backfire effect. So
(46:39):
there's this misconception that when confronted by information that challenges
our existing views, we as human beings, will incorporate this,
will learn from it, we'll move forward, and we'll be
better prepared to confront the world around us. Because we're
all smart, right, yeah, we're all uh, we're we've all
pretty much got it, got it figured out. Like Fox Mulder.
(47:01):
We all search for the truth. We want that over
we want an ugly truth over a pretty lie. However,
that's incorrect. Yeah, we want to believe that the Socratic
method is alive and well and we all use it
and we're good to go. Right, But no, no, no,
no, no no, no, no, my friends, what actually happens when
we see something that uh contradicts our worldview, the one
(47:25):
that we hold true and dear and close to our hearts. Well,
spoiler alert, we double down. Yeah, we go, nuh-uh, brother,
I know what's what. We take our opinions and we
turn them into a delicious sandwich made of two deep
fried chicken cutlets, just check out the YouTube comments on
(47:48):
our flat Earth video. Yeah, do you remember the double down? Right? Yeah?
I love the double down. I've never had it. I
never had the pleasure. I have this problem where I
fall in love with ideas of things instead of actual things,
and I was obsessed by the idea of the double down.
This is, here, here's the worst part, you guys know
I don't really talk about this in my personal life. One time,
(48:08):
after after we had recorded a bunch of stuff and
I was almost by myself in the office, I printed
out a picture of a double down and, like, to
take it with me. You should print out another picture
of the Cheetos-covered chicken fries from Burger King. Doesn't
that sound delicious? Doritos tacos as well, let's turn it
(48:31):
into a different show. Yes, yeah, we we do. We do.
We we take the we take the double down approach,
which is not healthy in any sense of the word. Uh. So,
for instance, you know, if I, um, if if I
(48:53):
believed that mineral water caused yeah, if I believe that
mineral water caused some sort of strange cancer. Uh and
I was convinced it did. And then my relatives thought
I was crazy, and they sent me this thing that said,
here's a study that objectively proves that there's no carcinogenic
(49:17):
material in there. Uh. Then I would instead say I
would find a problem with it, like, you know, oh,
they were in the pocket of Big Perrier. You
scientists are paid, son. Exactly. And we've got, there was
a great there was a great article on this backfire
effect from a website called You Are Not So Smart,
(49:39):
which is not the nicest name, but it's got some
great stuff in there, and we'd like to read an
excerpt from it for you. In two thousand six, Brendan
Nyhan and Jason Reifler at the University of Michigan and
Georgia State University (what, what! Sorry, I'm just calling out
where I went to school) created fake newspaper articles about
polarizing political issues. The articles were written in a way
(50:03):
which would confirm a widespread misconception about certain ideas in
American politics. So as soon as a person read a
fake article, researchers then handed over a true article which
corrected the first one. For instance, one article suggested that
the United States found weapons of mass destruction in Iraq,
(50:23):
the next said the US never found them, and that
second one was the truth. Those who opposed the war
who had strong liberal leanings tended to disagree with the
original article and accept the second. Those who supported the
war and leaned a little more towards the conservative camp,
tended to agree with the first article and strongly disagree
(50:44):
with the second. And these, these reactions, they shouldn't surprise you.
What would maybe give you just a little dash of pause
is how conservatives felt about the correction. After reading that
there were no WMDs, they reported being
even more certain than before that there actually were
WMDs and their original beliefs were correct. The double down, Yes,
(51:09):
the double down, The dangerous double down. And look if
you, if you think of yourself as a more conservative person,
or if your political ideology is very important to you,
don't feel attacked. This is an example of a mechanic
of a bio almost a biomechanical process, a neurocognitive process
that occurs to all people: the really, really left-leaning,
the really really right leaning, the centrists, the people who
(51:32):
don't care about politics. There's going to be something in
their lives that that makes them double down like this.
And what's troubling about this, What this means is that
whether the talk is of farm animals wearing pants, whether
it's politics, whether it's Perrier, despite our best intentions
(51:55):
and possibly self and possibly overinflated self regard, we don't
tend to fact check things that we already agree with or
further our existing world view. And this happens when people
move to, like, take an area, like San Francisco
(52:16):
is a pretty liberal place, right, somebody moves from San
Francisco and lives in a very conservative area for a
certain amount of time, and slowly over time, their views
tend to align. Just because there's some psychological part, some
tribal part of us that says, you gotta root for
the home team, and uh, it's a scary thing. Like
(52:37):
if you think candidate A in an election is already
a thief, you're way closer to just getting slippery
sloped into a little worse and worse and worse stuff about
him, until, until all of a sudden one day you're like, oh,
devil baby. Yeah, obviously, that checks out. Obviously it all
comes back around to the Trump devil baby. But just
(53:00):
wait until you see a story that you disagree with, you'll
fact check that bad boy to the moon and back.
Washington Post, what a propaganda thing. New York Times sort
of propaganda thing. Fox News, Well, you know they're a
bunch of shills, right and so on and so on. Um,
and we've seen these internet arguments when people don't do this.
This makes me think that we should mention, should make
(53:21):
an important distinction here between fake news and propaganda. Oh
the P word, the other P word? Yeah?
You know what I mean. That's, that's what Bernays said.
He was famously quoted as saying, I'm gonna grab 'em
by the propaganda. Yeah. So, uh. The problem with propaganda
(53:47):
is that it's everywhere, all all countries do it. You know,
Al Jazeera does some solid research, but they'll get shut
down when they do stuff about Yemen. They're based in Yemen.
You're not really going to hear the BBC doing its
due diligence. In my opinion on a lot of things
in British politics, uh, and Russia Today, I
(54:10):
think now exclusively called RT is gonna be you know,
kind of slanted when it comes to stuff in the US.
But but the point here is that propaganda is meant
to persuade a larger segment of a population to support something,
(54:30):
whereas a lot of fake news is looking for a
specific thing. Right, Um, they know that Matt Frederick already
loves orange Fanta, so I do. But so that that
would be aimed, you know, a lot of the fake
news is a little more specific in in its mindset.
(54:54):
And Ron Paul came out recently and said that the
the US government was responsible for a lot of fake news.
Well yeah, and that's that line between propaganda and fake news.
And where where does it lie? Where is the line
between fake news and advertorials that let's say CNN will
produce for a, for a country. Well, I can tell
(55:17):
you where the line is. I mean, propaganda is basically state sponsored,
whereas fake news is a bunch of just individual kind
of rogue Internet entrepreneurs trying to make some cash while
maybe steering an election. Well, propaganda can also be
(55:38):
corporate sponsored. That's true. Yeah, I think. I think it
being state sponsored though, is probably one of the bigger
definitions yet. And I like that you make that point.
And I've had this conversation before. A lot of people
are under the impression that the United States does not,
uh, do domestic propaganda. For a long time, there
(56:02):
was a law preventing them from doing that. That law
or that practice, that policy has since been repealed, abolished,
It's gone. So it is completely legal for the federal
government to exercise propaganda domestically. Did they do it before, sure,
of course, of course, I'm sure. But are they doing
(56:23):
it now? Yeah? Man, now more than ever. I mean,
one could argue that like a communications director for a
particular branch of the government is essentially their head propagandist.
I mean, they're the ones that are writing the narrative
that they want to steer the story with. Now, that doesn't
mean that they can control what journalists write, right? But the
(56:43):
principle is still there. It's all about, like we want
to own the narrative, whether it's something bad, maybe
it's something good. Maybe it's something good they've done. But
there's a detail they don't want people to know about
as much, so they don't put that. They bury the lead.
You know. I'm just saying that the machine is there.
It's just not it's it's a little more. You know,
we're living in a society of free press, and thank
(57:06):
god for that. But a couple of clicks away, yeah,
we might not be there anymore. Yeah, that's a great point.
Like the same factories that make cars can be turned
into factories that make bombs or tanks. So the mechanism
is there. Just as you said, that's kind of spooky
when you think about it that way. We do know,
(57:28):
of course, going to open up those libel laws. Yeah,
open them up. Though that's a topic for another episode. Just
in case, I apologize to the Gwinnett Picayune. Well,
technically I'd be doing slander, wouldn't I be committing slander,
since it's spoken? You're fine. I did several searches. The
(57:50):
closest thing I could find was the Times-Picayune
and a couple of small Gwinnett papers that I won't mention. Well,
are there any papers that are not owned by Illumination
Global Unlimited? And what is a picayune? Answer
so many questions, Well, we do want to point out
just some other things here. Uh. So, yes, these are
(58:11):
not conspiracy theories. They're active conspiracists working online, either for themselves,
for personal profit, for their government, for their corporate overlord,
or just for laughs, or just for the lulz, just to
troll you. Um, yeah, Matt keeps thinking this phrase,
So we bring that up. But but our point being
(58:33):
that these people are trying to deceive you, to trick you,
to exploit a, um, to exploit a neurochemical, neurocognitive vulnerability
in every human brain. And it's not a, it's not
a situation that's always easy to see, and it's not
(58:54):
a situation where the bad guys are the same people
every time. So we do know that some stories are
genuinely suppressed. Of course, Facebook got in trouble for actively
suppressing some stories. Uh. And then we do know that
there are stories that are advertised by their supporters as
being suppressed, right, They don't want you to know the
(59:14):
truth about the flat earth. Here's a question, though, are
we entitled to complete and total freedom of expression on
the Internet? Like Facebook might get dinged, you know, from
a philosophical standpoint for suppressing certain stories, But who says
that they aren't allowed to do that, right if it's
(59:37):
a private sandbox too? And who says that they are
required to not, um, like the New York Times should be
required not to publish uh fake stories? Right? But Facebook
is very different. It's an aggregator. So what is their
due diligence? I think that's a very good question, you know,
(59:59):
to what extent is it up to them to define those parameters based
on how much ill will they think they're getting as
a result of the way they're behaving. It's a pr move, right,
And then ideally wouldn't they Ideally wouldn't history tell us
that if that doesn't work, a competitor would arise that
would have more curated powers or something, and then that
(01:00:23):
would have a backlash. I don't know. I think these
are great questions because we have we do have to
ask ourselves how much responsibility can we put on a
third party entity? I'm going to go out on the, gonna
go out on a limb and say most of us
have not met Mark Zuckerberg, right, so how much of
that responsibility should be with our own action? I met
(01:00:44):
Bill Gates? You did meet Bill Gates. That's true, you guys.
Noel Brown met Bill Gates. You, you got to talk
with him. He said it was cool. In a tiny
little hotel room. I did. I went with Josh and
Chuck for Stuff You Should Know, and I got to
set up audio stuff and uh, it was pretty crazy.
But I digress. No, But like no, I think the
(01:01:04):
burden of, the burden should be on the individual to use
their thinking noggin caps and say, hey, maybe I shouldn't
believe everything that I read on the Internet, especially on an
aggregate, like an aggregator like Facebook. Maybe I should do
my own research and not just sit back and have
this stuff just like wash over me and just believe
(01:01:25):
it because it makes me feel cool or it makes
me feel good that I'm the one who's sharing it. Yeah,
what do you think about it? I think that kids should
be taught media literacy at a young age. And when
we say media literacy, you know it goes beyond learning
to read. It's the stuff we've been talking about, like
seeing the sources that people should check, having the personal
(01:01:46):
responsibility to say, like, again, I keep going back to
this example, but for me to say, like, yeah, you know, personally,
I think goats should wear pants, and that's normal for
people like me. But let me see what these goat
nudists think, like to find these opposing viewpoints. And another
thing we've talked about before, like the way people fall
(01:02:07):
into buzzwords, the way we live in a world that
wants its explanations simple, high level, quick, and most importantly
less than five minutes long. Quick and dirty. Simple, quick
and dirty. Uh, so, uh, we, we also think another
way to guard against fake news is to support journalism.
(01:02:29):
There are some great places like ProPublica, which are nonprofit,
nonpartisan outfits. Yeah, and the thing is, you know the
real hard-nosed journalists because they have
a wealth of enemies on both sides of most stories,
and ProPublica, as you know, has not pulled punches
(01:02:52):
where some other places would have. So I would recommend
checking out some of their stuff if you're interested in
supporting journalism. And of course, you know, if we wanted
to be more preachy, we'd tell people to go out
and support their local papers, but I don't know how
you feel about that. Or subscribe to the New York
Times. Well, and then other people will tell you that
(01:03:14):
the New York Times is worthless. Pennies a day,
pennies a day, but people will tell you it's worthless,
and it's definitely more worthwhile when you have
two different sources to compare.
I think what maybe is coming here is that we
need a new model for journalism, because, you know, clearly
people aren't paying; they expect everything to be free.
(01:03:36):
And I'm not saying that's good. I think it's kind
of crappy, but it just is. So what's the new model?
Maybe it is something like ProPublica or like NPR,
where you have, you know, contributors. Maybe you don't buy
a subscription, but you do, you know, some sort of
pledge drive. I don't know how you do it exactly,
but I think, for important news, as we know from NPR,
the fact that they continue to exist and do interesting
(01:03:57):
work and do all these podcasts and stuff, and they're
completely, you know, member-supported or whatever. I want to
actually ask you about this before you finish the
thought you were going to finish, and I'm sorry, but I
want to jump in, because NPR is not completely
listener-supported. Okay. And they're, like,
(01:04:18):
the news of the state of the nation. That's true.
But if we're talking about individual regional branches of
NPR and public radio, a lot of that does come
from, you know, listener support, and they do receive a
portion of their funding from NPR proper. But I mean,
you know, it's not completely self-funded. It's more
(01:04:39):
grassroots than a lot of places. All those things,
I think people want that, and I just wonder what
the model is to make that possible without it being
a for-profit model. I don't know. Well, here's
what the model is now, if we're being honest,
here's what it's trending towards now. It's trending toward a
world where every single individual believes that they are somehow
obligated to be continually reporting their lives. Right. We have
(01:05:03):
become the surveillance state we were warned about for decades
and centuries before. So every person in
the world is kind of a journalist,
right, with their own stance. You know, a protest breaks out,
and the first way you hear about it is
through Twitter or Instagram, with a bunch of people locally
(01:05:24):
posting things. Then the folks who become the gatekeepers and
the arbiters of that information, the ones
serving the higher-level functions of a journalist or replacing
them, are those aggregators, the Twitters, the Facebooks, and so on,
and the way that they decide to display it. I personally
(01:05:44):
believe that's a very dangerous model, because it leads to
the suppression of stuff, you know what I mean. It
can lead to suppression, and it can also lead to
witch hunts and, you know, not having all the facts gathered.
And one more thing before we head out for the
week to get back to our various non-sketchy pursuits:
(01:06:06):
shout-out corner. First shout-out today comes from Facebook. Hi,
Matt and Ben and the rest of the crew. Dirk
from Belgium here. Just to make clear, STDWYTK
is also popular across the Big Pond.
That is really good to know. I love things across
the Big Pond. First of all, thanks for the great podcast,
(01:06:28):
big fan, he says. He'd like to
suggest some topics for podcasts below, or some curious stories
that might interest your listeners. Number one: Operation Vula,
an underground anti-apartheid network, and its story about secure communications.
I don't know this one. Do you guys? Actually, a
lot of these are new and exciting to me. We've
(01:06:49):
got Jack Barsky, an East German spy with a double
family life, trying to make his career in the US.
Gotta check this one out. And then finally we
have Castle Feuerstein, or how a German WWII
scientist built a castle to develop all kinds of technology,
with a curious postwar career involving intelligence services and the
NSA. I hope this stuff might bring some
(01:07:11):
inspiration for a future podcast. Keep up the nice work. Hey,
thanks so much for writing in, Dirk.
We didn't read all of his topics,
but a lot of these are pretty fascinating. He's got
links attached, so we're putting them on that Google calendar. Our
next shout-out is from Misty T. Misty says: Looking
(01:07:33):
around online for toothpaste that remineralizes enamel, I found that
toothpaste containing NovaMin actually does that. Sensodyne Repair and Protect
is the leading brand in many other countries. It contains
NovaMin, but it's also not allowed here in the US.
The FDA made it nearly impossible to get into the
country, and only allowed GSK to sell it in the
(01:07:54):
US because the active ingredient had been changed from NovaMin
to stannous fluoride. I'd be interested to know what you
guys could find out. Yeah, anything about fluoride, I'm interested in. Yeah.
And also, you know, years ago, I read this
science fiction story, this dystopian thing about this guy who
invents a, um, they called it a dentifrice, and all
(01:08:15):
you had to do was take it once and your
teeth would be immune to decay for life. Well, it's
a science fiction story, and it was, yeah, it was
about how this evil group of dentists came
to overthrow him and kill him, and it ends, spoiler alert,
with him pouring the dentifrice down the sink. Excellent story. Um,
(01:08:37):
but, Misty, this is a topic that would
be a lot of fun to look into. Finally, we
have a message from Charlie. Charlie says: Hey, guys, absolutely
love what you do, but there's a running sort of
reference that's immensely intriguing to me, the implications of Ben's
demonic heritage. It makes me wonder, though, are your
(01:09:00):
coworkers outside of Stuff They Don't Want You to Know
aware of the potential danger? I envision maybe the
co-host of CarStuff, Scott Benjamin, making a friendly bet
or wager and having no idea of the real ramifications of
the deal. The possible motivations behind laying this sort of
storyline set the imagination aflame as well. I imagine
(01:09:20):
the sparse references and the vague nature of the atmosphere being
created are intentional. But I always end an episode with
these references nearly salivating for more. I am desperate to
understand the conclusion to these happenings. I hope this hasn't
been as confusing to read as I fear it might be,
but I'm deeply curious about Ben's involvement with the topic.
(01:09:45):
So, Ben? No comment. All right, well, that's the show.
If you want to reach us on Facebook or Twitter,
we are Conspiracy Stuff at both of those. You can
find us on Instagram at Conspiracy Stuff Show. If you want
to watch our videos, go to YouTube: Conspiracy Stuff. You
can also check out our fake news site at conspiracy
(01:10:06):
stuff and things dot biz, dot co, dot UK, dot
c o c e. Again, we have to make that now.
But hey, if fake news isn't your bag and you
want to send something to us directly, and we
will see that thing, you can reach us by email.
We are conspiracy at HowStuffWorks dot com.