
January 17, 2019 56 mins

In Part Three, Robert is joined again by Maggie Mae Fish and Jamie Loftus to continue discussing Mark Zuckerberg, Facebook and fake news. 


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
What's up, people? That's not how I open the show, but it happened. Now you've turned into a tech bro. You guys are both really positive about this, but Sophie is giving me the thumbs down and looks livid. I'm loving casual Evans over here. Yeah. We're in hour three of the Zuckcast, the Podmark, this weekend. I'm Robert Evans.

(00:27):
This is Behind the Bastards, the show where we tell you everything you don't know about the very worst people in all of history. And again, this is part three, so listen to the other two episodes first. Don't be like Mark, like that song about the Jeeps. Yeah, rules. As you can tell, we are all deep into bags of Doritos. We are deep into bags of Doritos in order to

(00:49):
handle the stress of being in a hardcore Zuck hole. Yeah, fully, fully zucked. We're getting Z'd in the EU. We're getting Z'd in the E. Thankfully, eating a nice D helped me deal with that, the demon 'Rito, that

(01:14):
the D and the M. That's what busy people say to save time, 'Rito. It's too much going on in this workaday world. In March of two thousand ten, some asshole named Robert Evans published an article on cracked dot com titled Five Reasons the Internet Could Die at Any Moment. I am not proud of the title. It was a different time,

(01:37):
but one of the entries on that ultra-clickable listicle is relevant to our current topic. It was about the worry that something called the strip mall effect was rapidly destroying the wild and weird Internet that most of us grew up on, the place where each new click was as likely to bring you to Goatse as it was to some dude's meticulously archived coin collection, or a website full of Roy Orbison cling wrap fetish fiction and nothing

(01:57):
in between. Nothing, no, I mean literally everything in between. It was a fun place, and two thousand ten was the year that Facebook hit five hundred million users. People were spending more time there than they'd ever spent on a website. The worry was that Facebook and a few other giant, consolidated social media sites would swallow up the weird little websites that had given the Internet its character, much in the

(02:18):
same way that Walmart and Target had swallowed up the tiny shops and family-owned businesses that once dominated Main Street America. Fortunately, that didn't happen. Yeah, I mean, I don't need to say that. That's exactly — yeah. I mean, I didn't make that call. Somebody else that I was citing in the article did, but it was prescient. That's exactly what happened to the Internet. The term they used at the time was splinternet, which is not

(02:38):
a term we wound up using, but I kind of like it. It defines what Facebook did pretty well now. Something else happened, though, that almost no one predicted. I certainly didn't. Rather than being an engine for the spread of knowledge, as many techno-utopians had hoped at the turn of the millennium, the Internet became the greatest engine for the spread of bullshit ever conceived. Rather than bringing people together, it facilitated division and hatred on an unprecedented scale.

(03:01):
Facebook is not the only corporation behind this, but it's probably the largest one. I don't think this was inevitable. I think most of the negative impacts we've seen Facebook have can be tied directly to the things we already know about Mark Zuckerberg. Based on the first two parts of this podcast, there are a couple of clear facts we've established about the man, and I'm going to list them. Number one: he believes Facebook fundamentally is good, and so

(03:21):
keeping people on the site longer is also fundamentally good. Number two: because he guessed one thing about the future correctly once, he thinks he is always right about where the future is headed. And three: he has no problem with lying, cheating, and stealing to expand Facebook and further the things he believes to be inevitable. Ha ha ha. Now,

(03:41):
in April two thousand sixteen, Mark Zuckerberg announced to the world that within five years, Facebook would be almost entirely video. Video, he assured us, was how most people now preferred to consume their content. This is what the kids wanted. He didn't know that, as Jamie Loftus knows, the kids wanted Doritos. Doritos hooking up. They're making love, when it's Doritos. They

(04:03):
are always making love and never hooking up. Scissoring, Robert, lovingly scissoring. Let them have their beautiful romance. This is my Christ in a chip. This is my The Notebook. Uh, yeah, no, that whole video content thing worked out great,

(04:24):
Speaking of, aren't we all still employed? It all worked out? Yeah? I remember when my friends and I all had healthcare. Yeah, and then this happened. Congrats on getting that far. Yeah. Actually, I'm a little older than you. I had a couple of extra years, you know. I hit the internet at its sweet spot. I came in

(04:45):
too late. They were like, butt chug something and we'll give you seventy five dollars. Okay, and I did, and it's there forever, and I can never run for office. No, I will say this, the one good thing about President Donald Trump: there is no way in which you're disqualified from office for butt chugging. A guy got to the Supreme Court and talked about boofing in Congress. Boofing in Congress

(05:08):
will be the name of my memoir, because we're just gonna cut the mental manager. Yeah. I was so weirded out to hear a term that my friends and I used when I was damaging public property as a nineteen-year-old, especially from a man with gigantic pores. Yeah,

(05:29):
always my favorite term. And again, I don't want the creepiness of Brett Kavanaugh to make boofing look like a bad thing, because boofing is an inherently noble action. Yeah, it's gorgeous. Yeah, much like the two Doritos making love. Turn away, you guys. They've started moving, they're moving together. They're

(05:49):
always moving in your heart. Yeah. So Mark Zuckerberg tells everybody that within, you know, five years, Facebook is going to be almost all video. Video is how most people consume content. He tells everyone that is what people want. By this point, Facebook had more than one point six billion monthly users. The site was increasingly the place where ad dollars were being spent. In the first quarter of two thousand sixteen, eighty five cents

(06:11):
out of every dollar spent on online advertising went to either Facebook or Google. So when Facebook says make video, you make video. Now, Maggie, you and I were both working for a website named Cracked right around this time. We're not working there anymore. Do you remember that fun six months when we all got extra money to make videos? You know what? That was really fun. That was a fun

(06:32):
six months. Fun? That was wild. I didn't get to make video, but you guys did. Yeah. I liked those videos, quality videos, directed by great people. Yeah. And then, anyway, video companies have just shut down suddenly, and then they're like — I just contacted them to get, like,

(06:53):
tax information, and they're just like, we just destroyed everything. We burnt our files. Yeah. The one I was working for, a different company that did, like, little news videos, um, and they too — like, it was like, one minute we were working in the office, then my friend and I complained about one of our sexist bosses and we got told to work from home, and then the company

(07:14):
went under. Well, it worked out. Yeah, yeah, I got paid. So Jamie, you have an out. You're going to flee now. You have to go back. Guys, I didn't want to, like, make a big deal of this, but there is someone, like, waiting for me outside, and I am

(07:35):
going to go back and have my name kind of, like, bleeped out every time it's mentioned, because — are you on a date with Martha? No, no. Jamie, look, I mean to say, no, I'm fucking the Winklevosses so much. Wait, I've got to ask before you go. When one

(07:57):
of them climaxes, does the other make this sound? It's, it's — but, like, pitched up an octave. Pitch it up. That's it. And they come bitcoins, which is pretty — do they go right into your wallet? Sopping wet, absolutely disgusting.

(08:18):
Sorry, guys. They're wading in their bitcoin. You go, you go find love like these Doritos. Goodbye, friends. Well, we've lost a Loftus. Yeah, I feel it. I feel the absence. We are one fewer, but we still have a Maggie Mae Fish, which is pretty great. I'm all right. No,

(08:41):
I'm no Winklevoss. Well, you're one and a half Winklevoss, I would say. So, thank you. I'm so happy those weird guys are stuck in pop culture forever now. They're like a permanent fixture. And the only image we'll ever have of them is them rowing in a boat

(09:02):
and being angry about Facebook. And that's them forever. They'll never do anything else, and we don't care. We won't ever care. Yeah, I will never care. Beautiful. Now, uh, Facebook offered a lot of incentives to websites that pivoted to video. For a short time, they even offered a partnership where they would pay you to post videos on their site. The entire digital media world very quickly swerved

(09:25):
to oblige what they thought were just the new realities of the world. In two thousand sixteen and seventeen, MTV News, Vice, Vocativ, Mic, and Mashable all fired writers and editors and put more resources into hiring new video teams. Now, Maggie and I both worked for, I would say, a medium-sized digital media company at the time. Cracked did not lay off its editors and writers to hire video people.

(09:46):
But we started pumping a ton of company resources into making videos, and the people who had been writing articles started spending more and more of their time pivoting for them sweet, sweet Facebook dollars. Which honestly, like, as someone who didn't know any of the, like, tech stuff, made sense. It seemed great. It seemed great. It seemed to make sense with, you know, YouTube, and, you know, so this is what the kids wanted. Zuckerberg knew. That's what we're

(10:10):
about to get to now. This was not happening in a vacuum. From about two thousand thirteen to two thousand fifteen, I think Facebook was very good to most of us, because people liked sharing our articles, and under Facebook's algorithm, the articles people liked to share got shared to huge audiences. That really started to shift in two thousand sixteen. We stopped seeing the same kind of traffic from Facebook. Every few months, they tweaked their algorithm again and traffic would fall. This was all part of a strategy Mark

(10:32):
Zuckerberg outlined in an internal email back in two thousand twelve. Quote: the answer I came to is that we're trying to enable people to share everything they want and to do it on Facebook. Sometimes the best way to enable people to share something is to have a developer build a special purpose app or network for that kind of content and to make that app social by having Facebook plug into it. However, that may be good for the world, but it's not good for us unless people

(10:53):
also share back to Facebook and that content increases the value of our network. So ultimately, I think the purpose of platform, even the read side, is to increase sharing back to Facebook. So Facebook was doing well for us, but Facebook did not think they were doing well enough by us, because they didn't want people ever off of Facebook. The splinternet — because it is, it is a good —

(11:14):
it is, it's inherently good. And it's inherently good if, like, pissed-off Boomer News x nineteen looks just as credible in headline form as the New York Times, because it's all on your Facebook timeline. And yeah, and, uh, we're not going to check. And we will, we will check, but we're not gonna, you know, pull any cards here. Now,

(11:34):
For a year or so there, digital media companies worshipped Facebook's algorithm as if it was some sort of mysterious elder god, inscrutable but capable of delivering vast rewards if properly appeased. We published Facebook Instant Articles because Facebook paid us for those and tacitly promised that they would increase traffic for our other content. And of course, we filled the Internet with videos. Now, unbeknownst to most of us, in August of two thousand sixteen,

(11:56):
Facebook quietly published a blog post on its advertising help page admitting that it had wildly overestimated the amount of time people spent watching videos on Facebook. It turned out they'd been counting only video views longer than three seconds for their "average duration of video viewed" metric, and discarding any views of less than this amount of time. If you understand basic mathematics, you may recognize this as providing wildly inaccurate information about how valuable video ads on Facebook were.
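To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The watch times are made up for illustration — they are not Facebook's actual data — but they show how throwing away every view under three seconds inflates an "average duration viewed" figure:

```python
# Illustrative only: invented watch times, not real Facebook metrics.
# Dropping every view under 3 seconds inflates "average duration viewed."
watch_times = [1, 1, 2, 2, 2, 2, 2, 2, 30, 60]  # seconds per view; mostly quick scrolls

true_average = sum(watch_times) / len(watch_times)

counted = [t for t in watch_times if t > 3]  # only views longer than 3 seconds are kept
reported_average = sum(counted) / len(counted)

print(f"true average:     {true_average:.1f} seconds")     # 10.4 seconds
print(f"reported average: {reported_average:.1f} seconds")  # 45.0 seconds
print(f"inflation:        {(reported_average / true_average - 1) * 100:.0f}%")  # ~333%
```

With these hypothetical numbers, the metric more than quadruples simply because the short views vanish from the denominator, which is the kind of distortion the lawsuit described.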

(12:20):
So they were ignoring any time someone scrolled past the video, like you do ninety percent of the time. That didn't count; it only counted if you watched it for more than three seconds. Right. Well, uh, yeah, that makes sense. Yeah, yeah, it does for Facebook. Yeah. Now, even that blog post hid the reality of the situation. Here's Vanity Fair. Quote: According to a new lawsuit that

(12:41):
allowed a group of small advertisers in California to review
some eighty thousand pages of internal Facebook records, it appears
that Facebook was actually aware of the issue long before
it claimed. At the time, Facebook told advertisers that it
had overestimated views by about eighty percent at most, but
in Tuesday's complaint, the plaintiffs alleged that average viewership metrics
had been exaggerated by a hundred fifty to nine hundred percent.

(13:06):
Oh my god, it's so infuriating. That's why my buddies and I don't have health care no more. It's just, it's so, it's wild. Shout out to Tom Reimann and David Bell and the Gamefully Unemployed network. Back them on Patreon, because they don't have health care anymore. They're great people. And it's so, it's infuriating, because it's just — how can someone think that

(13:29):
they are good at business and then do things like this? Like, insane. That is — that — okay, that's like if Marie Kondo came to your apartment and, as she was cleaning up, just stole most of your clothing and walked away. It's not on the ground anymore. Isn't it clean? Isn't it? Anyway, you'll make a profit. I'm gonna go sell

(13:50):
this shit to Buffalo Exchange. The result of all this was the digital media crash we are all still dealing with today. Which is not to say that companies did not make mistakes, like all those companies that fired their writers in order to hire video people. Errors were made, but they were made based on the fact that the company that was responsible for all of the ad money lied to us blatantly, and so blatantly that I don't think

(14:14):
anyone could have predicted that they were inflating their numbers by a hundred fifty to nine hundred percent, as alleged by the advertisers who also got screwed over. Although I'm not super sympathetic. Actually, I love advertisers. They're great — well, when they are ethical. When they're ethical by advertising on our podcast. Doritos. Doritos. Now, uh,

(14:39):
you're wonderful, Maggie. Um, the digital media crash was exacerbated by a number of things, including the fact that after the election, Facebook de-emphasized and limited the spread of content from brands, largely as a reaction to complaints about the spread of fake news on their platform. Thousands upon thousands of journalists, writers, and other creatives lost their jobs. The results of this were earth-shattering to many of us. To Mark Zuckerberg, it all came down to a couple

(15:00):
of speeches he probably only half remembers. Now, I started with this example of a thing Facebook broke because it's very personal to me. But the consequences of Mark Zuckerberg's bad decision making have amounted to a lot more than a few thousand lost jobs. Let's talk about Myanmar. Oh, here we go. The ethnic cleansing of the Muslim Rohingya in Myanmar by the Buddhist majority has to date forced more than six hundred and fifty thousand people out of

(15:21):
their homes. Tens of thousands have been massacred. Forty-three thousand dead seems to be the low end of the body count estimates. Last year, the United Nations announced that Facebook had played a determining role in the massacre. The United Nations: determining role. Other social media was also blamed, but Facebook was by far the big kahuna. Quote from the United Nations: it has substantively contributed to the level of

(15:45):
acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly, of course, a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media. The how here has a lot in common with the whole issue of fake news and the spread of violent, divisive content that's turned American politics upside down. It also has connections to Russia, because of course it does. Remember how in

(16:06):
two thousand nine, Facebook introduced that News Feed thing? Oh yeah. In addition to turning the Internet into a walled garden sucking in ever more ad dollars, it also ensured that divisive content would spread further and faster than it ever had before. This is because, per Mark Zuckerberg's stated desires, Facebook's algorithm prized time spent on Facebook more than anything else. What kind of content drives that sort of engagement? Why,

(16:28):
the kind of content people argue over and get angry over. To Facebook, pissed-off people are the people who aren't going to leave Facebook. They'll keep commenting, fighting, and sharing. It took a little while, but oppressive regimes around the world realized this. One of those regimes was Myanmar's military junta, partly pushed out of power in two thousand eleven, but still very powerful and very shitty. Here's the New York Times: They began by setting up what

(16:49):
appeared to be news pages and pages on Facebook that were devoted to Burmese pop stars, models and other celebrities, like a beauty queen with a penchant for parroting military propaganda. They intended the pages to attract large numbers of followers, said the people. They took over one Facebook page devoted to a military sniper turned monk who had won national acclaim after being wounded in battle. They also ran a popular blog called Opposite Eyes that had no outward ties

(17:11):
to the military. Those then became distribution channels for lurid photos, false news, and inflammatory posts, often aimed at Myanmar's Muslims. Troll accounts run by the military helped spread the content, shout down critics, and fuel arguments between commenters to rile people up. Often they posted sham photos of corpses that they said were evidence of Rohingya-perpetrated massacres. This is interesting. So

(17:33):
you know, we watched the Frontline documentary, which talks a little bit about this, um. And to bring it back to how Zuckerberg never learns and never grows up: the way he cheated on his final exam at Harvard was to make a fake account on Facebook, post a divisive article about the art that he was supposed to appraise,

(17:54):
make another fake Facebook account to stoke arguments on his page so that he could write an essay made off of other people's arguments, and he went on the site to make sure people kept talking about it, kept fighting over it. And that's included in the Social Network movie. And it's just wild to see him do

(18:15):
the exact same thing that the military of Myanmar did in order to engage in an ethnic cleansing. It is the exact same thing. And how he can keep claiming that he had no idea — and the military of Myanmar received training from the Russian government, because they were pretty good at doing this sort of shit. Yeah, yeah, for all they do, they're pretty good. It's funny

(18:36):
that shitty people all think the same. Yeah, good times. Oh, we are having fun, you know. Every once in a while, I just glance at the two chips having sex, just to remind myself there's still beauty in the world. That's what Doritos is there for, to remind you of the world's wonder. The splendor of the world. Doritos. Now, for most

(18:57):
people in Myanmar, Facebook is the Internet. This is because of a plan launched by our man, Mark Zuckety Zuck himself, the Ur-Zucker, the Ur-Zucker. Now, we're gonna get into that plan, but first we're gonna get into both products and, occasionally if we have time, services. Products. We're back.

(19:24):
We've been producted and serviced. That's not the way to frame that. And we were — the ads were products. I'm gonna have me a Dorito to cover that up. That crunchy taste. I like licking the cheese off of the chip. Fantastic.

(19:47):
This is going to be either the best episode or the worst episode for the ASMR crowd. It's really hard to predict. Dog, she's laying on my coat. She likes the way your coat smells. Dogs. It's my sweat. I'm Mark Zuckerberg in that interview. I'm just sweating out of all of my holes. And dogs love that because

(20:09):
they don't judge. A dog would love Mark Zuckerberg if he were capable of human affection. He has a dog named Animal. Oh my god. No, no, that's just — I don't know if he has a dog or not. If he did, he would name it Animal, or Farm Animal,

(20:29):
or Farm Animal, and then rate whether or not girls he met were hotter than it. I don't think I'm hotter than a cow. I've been thinking about this the entire time since part one. He has a dog. It looks like a mop. He has an expensive dog that looks like a mop. It does look like a mop. He would choose a dog that looks like an object, because objects are valuable to him, and dogs are objects to some people. Now,

(20:51):
for most people in Myanmar, Facebook is the Internet, and this is because of a plan launched by Mark Zuckerberg, as I stated in the last one. The plan had its roots in two thousand twelve, when Facebook first went public. As part of an IPO, investors get research on both the business's potential and its potential pitfalls. One problem that was noted for Facebook's future is that by the time it went public, it had already connected virtually every human being in the parts of

(21:13):
the world with widespread internet access. There just wasn't a lot of room for the company to grow. Everybody in Europe and America — everybody in the countries that have a lot of Internet money. Yeah, where the money comes from, because it gets sucked out the other — but yeah, yeah, yeah. Uh, so, Zuckerberg realized that in order to expand Facebook, he would need to connect the world, so he started partnering with makers of cheap mobile phones

(21:35):
and service providers, initially in the Philippines, providing their customers with free data when they used Facebook, and just Facebook. These first steps seemed to work well, so Mark announced a formal plan in two thousand fourteen at the Mobile World Congress in Barcelona, which is a fun week if you're a tech journalist. That's for another drunken Matt vomiting story. I had to go, like, deal with a bunch of — like, one of these product showcases at

(21:57):
a hotel, and I got really sick, either because of the drinking or because I ate some bad paella, and I vomited in front of the hotel and then got into a cab, and the cab driver asked me, did you see the King? And I was like, what do you mean, did I see the King? And he pointed, like, at a flag flying on top of the hotel, and he was like, whenever that flag's there, the King's at the hotel. And I was like, I think I might have just puked in front of the King's limousine.

(22:19):
That's my, that's my "Matt hasn't changed or learned anything, and Robert hasn't changed or learned anything, in thirty years" story. I'm kind of proud of that. Thank you. I'm picturing it, and I'm proud. I still have problematic substance issues, and I'm fine with that. You know why? Guess who didn't exacerbate an ethnic cleansing? This guy. This guy. You're fine, exactly.

(22:42):
You know, you read about these people like Dick Cheney or like George Bush, who had horrible substance abuse problems and then sobered up and then killed millions. What if they'd kept drinking and doing coke and died at fifty? Better world. If your only other option to drugs is the presidency, choose drugs. Please, please don't go into politics

(23:04):
after sobering up. Yeah. Oh boy, we are on dangerous ground with this podcast. Well, it is funny that Zuckerberg did try to float the idea of a presidency. I have been on record as saying that I think a great TV show idea would be about a time-traveling drug dealer who finds horrible people in history, like Saddam Hussein, and gets them hooked on pills before they can kill

(23:24):
people. Like, if Hitler had just had oxy, there'd be no Holocaust. He's just sitting in the room listening to fucking Wagner, into a shitload of pills, until he dies. What a beautiful world. Time Traveling Drug Dealer. If anyone listening works at a network, or is a time-traveling drug dealer, go for it and cast Maggie Mae Fish as your lead. I saw those pictures you did

(23:46):
when you were like a twenties detective. You could, you could rock the look for the episode in the twenties about Hitler. Oh my god, I'll do it. Yeah, I'll do it. It would be great. Boy howdy, that was quite the digression. I don't know how we got there. It's, it's a Facebook status. We keep pausing to discuss and then come back to the feed. Yeah, well, here's back to the feed. So Mark announced his plan to connect the

(24:08):
world at the Mobile World Congress in two thousand fourteen. Here's a quote from The Guardian covering a speech he gave that day. This is Mark: There was this Deloitte study that came out the other day, he told his audience, that said if you could connect everyone in emerging markets, you could create more than a hundred million jobs and bring a lot of people out of poverty. The Deloitte study, which did indeed say this, was commissioned by Facebook, based on data provided by Facebook, and was about Facebook. Now,

(24:33):
the crux of Mark's plan involved giving people in poor countries free internet access to a limited selection of websites. Mark started with Zambia, but India was the real prize, with six or seven hundred million potential new users. Now, there were some signs that just rolling Facebook out for free in these places might be bad. In two thousand twelve, a series of fake images began circulating on Facebook purporting to show the massacre of Muslims by Buddhists. This sparked

(24:55):
a riot that left several dead, but Mark did not pay this much heed. He rolled right ahead with Internet dot org and began connecting the world: Zambia, India, the Philippines, Sri Lanka, and a little country called Myanmar. Now, last year, in an interview with Vox, Mark directly responded to the claims made by the United Nations about the genocide his social network was enabling and making much worse. He brought

(25:18):
up a recent success story: two fake news chain letters that had been circulating on Facebook before they were caught and deleted. Quote from Mark: So that's the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detect that that's going on. We stop those messages from going through. Now, as soon as the interview was published, it provoked fury from activists

(25:40):
and social media researchers in Myanmar who were actually working to stop the spread of fake news and save lives. Their response to Mark is pretty damning. I'm going to read a healthy excerpt from it: As representatives of Myanmar civil society organizations and the people who raised the Facebook Messenger threat to your team's attention, we were surprised to hear you use this case to praise the effectiveness of your systems in the context of Myanmar. From where we stand,

(26:02):
this case exemplifies the very opposite of effective moderation. It reveals an overreliance on third parties, a lack of a proper mechanism for emergency escalation, a reticence to engage local stakeholders around systemic solutions, and a lack of transparency. Far from being an isolated incident, this case further epitomizes the kind of issues that have been rife on Facebook in Myanmar for more than four years now, and the inadequate response of the Facebook team. It is therefore instructive

(26:23):
to examine this Facebook Messenger incident in more detail, particularly given your personal engagement with the case. The pictures were clear examples of your tools being used to incite real harm. Far from being stopped, they spread in an unprecedented way, reaching countrywide and causing widespread fear and at least three violent incidents in the process. The fact that there was no bloodshed is testament to our communities' resilience and to the wonderful work of peace-building and interfaith organizations. This resilience, however,

(26:46):
is eroding daily as our community continues to be exposed to violent hate speech and vicious rumors, which Facebook is still not adequately addressing. Damn, that's eviscerating. They reported those posts to Facebook, which eventually, a couple of days later I think, removed them, and then Mark Zuckerberg lied in an interview and said that Facebook caught them and removed

(27:08):
them. The motherfucker. One of the things they hit Facebook on most in that letter was an overreliance on third parties. In this case, the third parties, of course, were the people writing this open letter. Quote: We identified the messages and escalated them to your legal team via email on Saturday the ninth of September, Myanmar time. At that point, the messages had been circulating for three days, and they continued to circulate for several days after they were reported. So let me be clear exactly about what happened. Number one:

(27:30):
After years of bloodshed and racism spread by Facebook, local activists managed to warn Facebook in a timely manner about a new threat. Number two: in a miracle, Facebook listened to them and removed the threat, days after it had first been posted. Number three: when Mark Zuckerberg took flak for enabling an ethnic cleansing, he touted this as a success, erasing the existence of local activists in Myanmar and pretending Facebook itself had done this. Yes. The way that

(27:51):
Zuckerberg speaks, because he does view himself as such a genius, he often says no one could have seen it coming. No one could have. It's been happening for years, for years. People have been telling you for years. Several people could have stepped in at various moments. Now, Facebook, not Mark Zuckerberg, did issue a response and apologize for erasing the local activists

(28:15):
in Mark's first response. Uh, now, Myanmar is the most shocking example of Facebook enabling unspeakable evil, but it is not the only one. While Internet dot org seems to be something of a failure in India, oddly enough, in part because of a massive grassroots net neutrality campaign — they got a bunch of Indian peasants to understand net neutrality and realize they wanted it, and like, it's a really cool story, but we will not cover it

(28:35):
today, because this is a sad podcast about bad people. Yeah, yeah, but it is a cool story. Check it out. Fake news spread through Facebook has exacerbated ethnic tensions between Muslims and Buddhists as well as Muslims and Hindus, leading to numerous angry mobs and several deaths. There has been quite a lot of bloodshed as a result of Mark's relentless desire to connect the world. Here's a quote from the fantastic, utterly indispensable New York Times article "Where Countries Are Tinder-

(28:57):
boxes and Facebook Is a Match." Solid title, solid titling. That's a sexy title. Almost makes me forget that they platformed the Dictator of Turkey in an op-ed recently. But that's, but that's in the op-ed section. You know, that's different from these guys, the hard-working journalists. Quote: Last year, in rural Indonesia, rumors spread on Facebook and WhatsApp,

(29:19):
a Facebook-owned messaging tool, that gangs were kidnapping local children and selling their organs. Some messages included photos of dismembered bodies or fake police flyers. Almost immediately, locals in nine villages lynched outsiders they suspected of coming for their children. Near-identical social media rumors have also led to attacks in India and Mexico. Lynchings are increasingly filmed and posted back to Facebook, where they go viral as grisly tutorials.

(29:42):
That kind of content does really well. You don't get to five hundred million friends without making a few enemies. Oh my god. One of the spokespeople that, um, Facebook put up for Frontline, when she was confronted with that, she would say, um, yes, so we are aware. We are aware. We are aware, guys. We just, I mean,

(30:04):
we're a company that came from a dorm room. I don't know if you saw the movie. A dorm room! Quirky, isn't that quirky? We're just, like, you know, we were founded in a dorm. And now college students are being dragged out of their dorm rooms and beaten to death in several countries for being gay. We hear you, we hear you. It's
not our fault that fake news about them assaulting people

(30:28):
spread on Facebook and then they got murdered. It is our fault. We like dorm rooms. And if you try to put in any law to stop us, we'll just get slower and worse at doing this. So don't you fucking dare. Dare. We're Facebook. Now, this shit has happened in Sri Lanka too. Last year, in the capital city of Colombo, an anti-Muslim video went viral. Activists and

(30:50):
government officials watched in horror as prominent racists posted things
like kill all Muslims, don't even save an infant, and
let Facebook's algorithm carry it off to millions of angry,
armed people. Now, social media analysts in Sri Lanka flagged
the video and that baby killing post and then sort
of sat back to see if anything would happen. Despite
repeatedly complaining about the horrific violence unleashed by Facebook, the

(31:11):
company had not provided these activists with any kind of
direct line. Facebook had assured them the tool would work
well enough. The anti-Muslim video and the "kill even babies" post were found not in violation of Facebook standards.
One of these researchers who helped flag the videos told
The New York Times, quote, you report to Facebook, they
do nothing. There's incitements to violence against entire communities, and

(31:31):
Facebook says it doesn't violate community standards. Now, Facebook's standards can be hard to parse out or understand on, like, a human and emotional level. Yeah, since they are a private company, we have no right to that information. The best that we can do is look at some of the comments Mark Zuckerberg himself has made on similar matters. Yeah, let

(31:53):
us, shall we. In an interview with The Guardian, he was asked about the proliferation of Holocaust denial talking points on his site. He called such content deeply offensive, but said, quote, I don't believe that our platforms should take that down, because I think there are things that different people get wrong. I don't think that they're intentionally getting it wrong. It's hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those

(32:14):
examples are, I think the reality is that I also
get things wrong when I speak publicly. Oh so, oh well,
I mean it makes sense. He is a liar and
a thief, so he should allow a site that allows
other liars and thieves, because that is, he's a billionaire.

(32:35):
I get things wrong and accidentally claim credit for the
work of diligent activists who are trying to stop the
damages of my platform. And so also Holocaust deniers get things wrong too. And I can't, can't be angry at him, you know, I can't, because who am I to get angry at that? Now, Facebook is a private company. They
are publicly traded, but they're a private company, and they

(32:58):
can set a policy of censorship for any of their, like, any of this internal stuff. They can claim that this is all, like, our standards and stuff is like a business-like thing that we need to keep secret, otherwise other social media companies could compete. Exactly. So I just love that quote, that, like, well, if they're honest Holocaust deniers, why, why would we censor them? If they

(33:20):
honestly think Muslim babies should be killed, why why would
we censor them. It's their opinion that Muslim babies should
be murdered, and that's okay on Facebook in my online country.
That's okay with me. That's okay. Now, in the interview, Zucky Zuck Zuckaroo stated his opinion, and presumably Facebook's stance, that offensive speech only crossed the line when it endangered people.

(33:41):
We are moving towards a policy where misinformation that is aimed at or going to induce violence, we are going to take down. If it's going to result in real harm, real physical harm, or if you're attacking individuals, then that content shouldn't be on the platform. So it's sweet to know that "kill these Muslims and their babies" did not cross that line. Yeah, it didn't for him personally cross that line. I don't think he understands what danger, or dangerous, means or is. No, his life would change irrevocably if someone
(34:04):
or is No, his life would change irrevocably if someone
just punched him in the face once, which is why I am in favor of punching rich young men in the face. I agree. I think he would therefore maybe understand: oh,
what these people in Sri Lanka and Myanmar are going through.
It might be like that one time I got hit
in the face and I hated that. Maybe that would help,

(34:31):
Oh boy. In that same interview, Mark Zuck also addressed the hate speech ethnic cleansing problem in Myanmar: But people use tools for good and bad. But I think that we have a clear responsibility to make sure that the good is amplified and do everything we can to mitigate the bad. Because one of their, um, I don't know, spokespeople verbatim said about the Myanmar, um, they had

(34:54):
a bad experience with some death mobs and an ethnic cleansing, and so we as a company, we want to make sure that we reduce the bad and increase the, increase the good. Now, don't ask me any more questions. Just imagine holding, like, a microphone, talking to some lady who's been beaten to death in the street for being a Muslim in Myanmar. So,

(35:17):
could you tell us how Facebook could improve your experience? How can we make this better? How can we, how can we fix this just a little bit? No, not that — we're not gonna take it down. We're not gonna take it down. When he said your baby should die, that was not a violation of our terms. Because maybe your baby should die. If I improved the timeline, would that help you? How about we make it

(35:40):
easier for you to tell people that your baby got murdered? Okay, okay. So maybe that does not at all clear up exactly what Facebook's line is. But thankfully, an internal guide they handed out to their content moderators did leak out, and it included the clearest statement from the company yet on when violent, hateful speech crosses a line. It's like a Power-

(36:01):
Point slide. "Introduction" is up at the top, and then it says, why do we IP block content? And then there's some bullet points: the content does not violate our policies. We face the risk of getting blocked in a country or a legal risk. We respect local laws when the government has made clear its intention to pursue its enforcement. Holocaust denial: illegal in fourteen countries. We only consider it for the four countries that actively pursue the

(36:23):
issue with us. So Facebook will only block the spreaders of dangerous content when the governments of those countries actively go after them. We don't care that it's illegal in your country to deny the Holocaust. Only if you threaten Facebook's bottom line will we block Holocaust denial content. Yes,

(36:45):
then and only then. Then and only then, freedom of speech. But also, if you're a sex worker, you can't use Facebook, and stuff — that's not okay. Deny the Holocaust? Absolutely, do it all day long. Advertise your business as a sex worker? No, no, no, sir or madam. No. Violence yes, sex no. Very American, very American. Very Mark Zuckerberg,

(37:11):
Mark Zuckerberg. Violence yes, sex no. Actually, that should be our T-shirt. That should be it: violence yes, sex no, Mark Zuckerberg's face in the middle. I think we got us a T-shirt. Oh my god, I'd wear it. I'd wear it twice. I'll buy it and then I'll donate money to my local sex workers. There you go. They will appreciate it. Yeah, all right, we got some ads, some products,

(37:36):
maybe a service or three, and, uh, while we wait for that — are you just licking that Dorito? This is — I actually licked it earlier, and this is my second lick. Maximizing the flavor potential. Yeah, this is, oh my god, that's what we call him. The bisin. Yeah, products,

(37:59):
we're back. We, we just finished talking about when Facebook's cool with Holocaust denial, and the answer is usually, usually, usually, unless you're Germany and you threaten to go after them. Which — good on Germany. Good on Germany. And also, I realized there must have been a day where someone wrote down the four countries that will come after us, where we

(38:19):
have to stop Holocaust denial, and the places where it's illegal but we don't care. Yeah, but it won't affect us. But it won't affect us, so we're fine with it. Yeah. Now, it would be unfair of me to acknowledge all of this without acknowledging that Mark Zuckerberg himself does not want any of this shit to spread. There are anecdotal stories

(38:40):
that when Donald Trump first announced his run for president with that uber-racist wall speech, Mark wanted to ban Trump from Facebook, and his campaign from Facebook. He was reportedly talked down from this. I have no doubt that if Mark Zuckerberg had been on the other end of that flagged "murder the babies" post when it came through, he himself would have deleted the comment and IP banned that person. Any human would. But Mark didn't put

(39:01):
a human in charge of that job. He chose a robot.
The biggest problem with Internet dot org, the reason it has been responsible for so much bloodshed — it was also, by the way, absolutely critical to the rise of Rodrigo Duterte, who has now killed twenty thousand people. He has a social media army who harasses and sends death threats via Facebook to his detractors. Anyway, Facebook also sent people out to help train his team in how

(39:22):
to use Facebook. That's fun. They do that, don't they. But the big reason that all of this violence has been possible is that Mark Zuckerberg launched his groundbreaking, society-altering technology into countries that neither he nor anyone else at his company understood. This is the Silicon Valley equivalent of the Iraq War, which, by the way, was planned without

(39:44):
the input of anybody who spoke Arabic, let alone understood Iraqi culture. Uh, they did have one guy who wanted to take Saddam down so that he could make a bunch of money, because he's a corrupt motherfucker, and to be in power himself. But they didn't, they didn't, they didn't have any, like, experts on the culture. And other than that, this is what Mark Zuckerberg did too, you know, when he brought the Internet via Facebook. There's a very

(40:06):
short distance for a tool to become a weapon. You just have to hold it above your head. And there were no people from Myanmar, no people from Zambia, and no people who were, like, working for Facebook and, like, on the ground in the country. They're letting algorithms deal with it, figuring that it'll be fine. Now, Victoire Rio,
a social media analyst in Myanmar, told The New York
Times a major issue in the spread of violent rhetoric

(40:26):
via Facebook in that country was the fact that Facebook had almost no Burmese speakers that local watchdogs could communicate with. The few people who speak that language and work for Facebook are based in Dublin, which you may note is not Myanmar. Not at all, not at all. Uh.
And this is where we see, in my opinion, the
clearest downside of the move fast and break things ethos.

(40:50):
Sometimes things you break are people. Maybe, if you're going
to introduce a service to a new country and that
service has the potential to absolutely revolutionize the way they communicate,
you should not do that until you have a sizeable
team of people who speak the language and understand the
country actually working for you. Maybe doing anything else is
unspeakably irresponsible and perhaps even evil. This is why Mark

(41:11):
Zuckerberg is my pick for the worst monster of the
twenty first century so far. He is not a murderous,
violent man like Vladimir Putin or Rodrigo Duterte. In fact, many,
if not most, of the people who spend a lot
of time around him, describe him as warm, decent, a
good listener, and an empathetic person. But the twenty first
century so far is a period defined by arrogant, mostly
men making rash decisions based on little evidence that have

(41:33):
a shattering, violent impact on the lives of millions of
people who live far away from them. Mark Zuckerberg is
the equivalent of a little kid who asked for a
BB gun for Christmas and was given a nuclear warhead. And that is the positive way to spin this, the conclusion that gives him the most credit as a human being.
There is a lot of evidence that this is exactly
what Mark wanted, that his dream all along was to
become basically the dictator of a digital nation, and that

(41:54):
connecting people has been less important to him this entire
time than building an empire. In October of two thousand ten, Vanity Fair declared Mark Zuckerberg our new Caesar, in an
article lauding him as the greatest of the Silicon Valley Titans.
I'm going to guess that was a comparison Mark really enjoyed.
Here's another quote from that fabulous New Yorker article. He

(42:15):
first read the Aeneid while he was studying Latin in high school, and he recounted the story of Aeneas's quest and his desire to build a city that, he said, quoting the text in English, knows no boundaries in time and greatness. Zuckerberg has always had a classical streak, his friends and family told me. Sean Parker, a close friend of Zuckerberg who served as Facebook's president when the company was incorporated, said, there's a part of him — it

(42:35):
was present even when he was twenty-one — this kind of imperial tendency. He was really into Greek odysseys and all that stuff. At a product meeting a couple of years ago, Zuckerberg quoted some lines from the Aeneid. On the phone, Zuckerberg tried to remember the Latin of the particular phrases. Later that night he IMed me to tell me two phrases he remembered, giving me the Latin and then the English: fortune favors the bold, and

(42:56):
a nation empire without bound. We are all duly aware of how fake news spreads on Facebook, and the impact it may have had on the election in the United States, and, for some of us, on our relationships with our family members. Mark denied this at first, but he has gradually copped to a tiny amount of responsibility for the hundreds of thousands of fake news pieces Russia's Internet Research Agency managed
to promote, although he thinks it's silly to blame the

(43:18):
results of the election on that. There have been numerous calls since two thousand sixteen for Mark to step down from his creation. He has so far refused them all. At present, despite a falling stock price and, uh, all the corpses, Mark Zuckerberg has no plans to give up his empire, and I don't think he ever will. No, why would he at this point? Yeah, he is

(43:38):
my pick for worst person of the twenty-first century. Because — it's like Hitler. Pretty much everyone's gonna call Hitler the worst person of the twentieth century. Mao killed more people, but Mao killed more people mostly by accident. Hitler killed them because he wanted to kill them, and killed them faster. Like, concentration camps are a huge deal in that count. Hitler is the guy who really figured out how to make the most efficient, most vile, most, like, terrible and effective —

(44:00):
Mark Zuckerberg is not that thing. George Bush is another one: just a dumbass guy, got in power, fucked with a place he didn't understand, and caused unspeakable harm. And Mark Zuckerberg is that guy, but for the world. It's just fascinating. The reason I kept saying that Marker, Markerberg, Markatberg, Zuck is dumb is because I think,

(44:21):
one, it would hurt his feelings the most. Yeah, that's the thing he clearly values. He clearly values it. And two,
I don't understand how he just blatantly didn't want to
consider anything outside of his I'm gonna connect this country,
but I'm not going to have an office in this
country where there are people who can deal with, say,

(44:41):
the spread of violent and, like, racist propaganda. Yeah, that would reduce my stock value because it's more expensive. It's just, it's baffling. Uh, and I don't know if a cautionary tale is correct, but kind of just, like, a diagnosis, like a problem. It's a problem. Like, Facebook is here.
You're not going to stop people from connecting on social media.

(45:01):
He needs to be removed, right, he needs to be gone. Yeah,
and we need to understand like a different set of
values for what our online space is going to look
like and how the tool is going to be used.
If you have to put a guy in charge, pick someone like Hamdi Ulukaya, the Chobani CEO, who was a refugee as a kid and understands the dangers of hateful rhetoric spreading like wildfire. Yeah, it's just, it's sad.

(45:28):
It's sad and also speaks to the problem when power
is hand in hand with money, because this will always
be a problem when that's the case. And Mark will
always have power because he will always be one of
the richest people in the world, and he will do
god knows what with it next what God knows. But
I'll bet you he doesn't think it's I bet he's

(45:52):
not going to do like what LeBron James did and just start good free schools for poor kids, even though he could, he could do hundreds of them. There could be Facebook schools all over. But instead there is a Facebook hospital that does not accept anybody's insurance. But does it accept bitcoin? Um, yeah, you know. Also,

(46:14):
a good time to point out that he doesn't have
a charity. He has an LLC. Everything is for profit.
"He's giving away his wealth." Yes, sure, yeah. Are you telling me that when he said he was making a charity, he actually built a perpetual money machine for himself and his family? That is, like, what I'm saying. That sounds like — oh, Zucky Zuck. Oh man, you crazy fuck. Yeah? Wow.

(46:40):
It — also it's fascinating, um, you know, watching all these interviews of Zuckerberg, how many times he cites his origin story, um, as portrayed by the film. Yeah, even though he says he doesn't like it. Yeah, he has really glommed onto that idea of himself. And there's, again, just a snake eating its own tail. He believes in the mythos of

(47:01):
his own genius. Mark the Zuck, Zuckman, with a samurai sword in one hand and a dead Burmese baby in the other. That's how we should start picturing the Zuck. That's the shirt.
That's the shirt. That's the shirt. That's the shirt. Sophie looks like she thinks this is a terrible idea, and

(47:22):
it may be. It may be. Let let's do it anyway.
Let's do it anyway. My income is no longer tied
to Facebook. Boy, Maggie, do you have any research that
you didn't get to in this podcast, because I know
you did a bunch. Normally our guests come in cold, which allows me to have the illusion of being smart. Right. Well, again, I think we did basically

(47:43):
cover everything. The idea that he, um, buys into his mythos — when there was talk of him running for president, that motherfucker would have been running off of the reputation that the film and the book gave him, none of his own work. Or the fact — this is a personal opinion:

(48:05):
I have no problem with that picture of Trump in a truck. I have a huge problem with that thing of Mark on a tractor. Because with Trump in the truck, he's clearly having a great time. He got a chance to be behind the wheel of a big truck. Of course, why not? It's the most human thing he's ever done. He's like, yeah, I want to be in this. Fine. Mark looks like a fucking — ah, this is how you humans

(48:25):
make money? Yes, yeah, um. And it's also crazy he wanted to run for president, and yet when you hear him talk, he's a void of charisma. It is — he's just as straight as a rod. He should be on the T-shirt holding up a ball. It's, it's wild

(48:51):
the way that he was able to um adopt his
own I don't know. He became a Greek legend in
his own mind. Um. And so literally nothing will stop him.
He will keep stealing and keep breaking until there are
laws in place. Which — again, as a company, every way they word their responses is basically a warning, like, look, these

(49:15):
are problems that we, like, may have, could have fixed. But — first of all, like, who could've seen it coming? No one saw it coming. Yeah, like who? And then secondly, um, basically, we just don't have a lot of resources, so, like, you better not try to, like, regulate us or make us hire more people. We don't need to be regulated. Why would

(49:36):
that be necessary? That's for a utility, which we're not, even though people use us in their daily life and we're indispensable. No, we're not a utility, so we can't be regulated
or controlled in any way, shape or form. Yeah, it's
a problematic mindset that comes from his background and who
he is and his early success. And you know the

(49:57):
way that he talked about women, being compared to farm animals. What are people signing up for Facebook but more farm animals, more dumb fucks in his mind, people giving us their data to hand off to others for profit. Um, and for the sake of completion, which I do think is a big factor: he just wants, he wants, he wants six billion,

(50:19):
He wants six billion, he wants all of it. He wants — you know, when we tried to take over America, what is that called? The expansion — manifest destiny. Yeah, you're right, you're right. This is that exact attitude. It's manifest destiny for the Internet and people's, um, data. Yeah, that's a really good comparison to draw. Yeah. I want to amend —

(50:43):
we were joking a little bit earlier about, you know, people who have drug problems and then sober up and do terrible things. I think the answer is, really, really rich kids should be encouraged: just, no, don't go — don't start an app, don't start a company. Your dad's got all the money; do a shitload of drugs, Mark. It's fine, fine, it's fine. Be lazy and be by yourself. Let some

(51:04):
kid who's been at the other end of the hate
mob and survived start the social network and be like, yeah,
but we gotta make sure it can't be used to
do this, because that's because that's true. I know how
bad that is. Yeah, yeah, I mean again, it's just
like another one in a long line of why, like
diversity will make your company — yeah — better. Not just will

(51:25):
make it better. It'll stop it from committing horrific crimes against humanity. Or ruining itself. Ruining itself and its brand by contributing to horrible crimes against humanity. Well, Maggie,
as we close this out, I want to propose something. We have these, these two Doritos, which are merged in the Dorito equivalent of coitus. I'm gonna grab one, and

(51:48):
you grab the other. We'll pull them apart and we'll have these last Doritos. Here we go. Okay, all right. Mmm. Mmm. We're gonna need a cigarette. That's three episodes of a lot of stuff and two Doritos with a lot of

(52:08):
love in them, because Doritos. I love my friends. Maggie, I'm gonna have you plug your pluggables before we close this episode out. I do, um. Well, first, I will say, since the Loftus left — the Loftus left for her new life with the Winklevoss, with the wink, um — so do make sure to follow her on Twitter, Jamie Loftus — Jamie Loftus

(52:29):
Help, you get it — and Jamie Christ Superstar, Jamie Christ Superstar, um, please, and also check out her podcast, The Bechdel Cast. If you ever see her in public, don't say anything to her, but hand her a single orange and then wordlessly walk away. Oh my god, she'll love it. That's so Godfather — she's gonna be assassinated the next day. Well, she should listen

(52:53):
to the episode, probably, and that ruins the joke, but it wasn't a nice joke. Do that, or break a light bulb. Or light bulbs. See, this is, this is why you need a diverse crew around you. Otherwise I'd just be a ruinous, lightbulb-chucking, orange-throwing wreck there. Yeah,
I would also be terrible without the people around me.
We all are. That's why society exists. Because we're gross

(53:18):
monsters on our own. We need to help each other
be better. Um. Yeah, so to follow — follow me on Twitter at Maggie Mae Fish, and Instagram. Uh, you can find my video essays on film and cultural phenomena on YouTube at Maggie Mae Fish, um, including a really fantastic one on David Fincher's Fight Club. It's great. Also,

(53:45):
when I was rewatching The Social Network — uh, what a lot of heavy lifting off of Fight Club. And I will say,
much as Zuckerberg stole a lot of his data and ideas,
Fincher stole a lot of his ideas from lesser known
French and Russian filmmakers. Well, who among us hasn't stolen
from a Russian? Hey, that's that's why they're so angry.

(54:08):
I mean, that's why they're so mad. Sophie's admitting to stealing from a Russian. I don't know where your dog came from genetically. I don't know. That's true, entirely possible it came from some wolf on the Siberian steppes. Could
be no way to know. I mean, probably a way
to know, but I don't know. I don't know. And
I'm Robert Evans, the host of Behind the Bastards, which
you can find on the internet at Behind the Bastards
dot com, where there will be the copious source list

(54:29):
for this episode — well, these episodes. Um, you can find us on the 'Gram, courtesy of Mark Zuckerberg, at at Bastards Pod. You can find us on Twitter, courtesy of another creepy guy, at at Bastards Pod — a guy who seems to have a real penchant for Nazis. And, uh, you can find my book A Brief History

(54:51):
of Vice on Amazon. And there's a Bezos episode coming. I mean, how could there not be? How could there not? Very excited. Whatever, I'm glad she's getting a bunch of money. I mean, that's what I'm excited about. She might become the richest woman in the world. I hope he recognizes, like, a hundred and thirty billion and sixty-five

(55:14):
billion are the same amount of money. Yeah, maybe, maybe. Maybe. I don't know the man. Anyway — I know the man. Yeah, we all kind of do. Don't we? Wait, we know him.
We know him. I do love. One of my favorite
things is to look at pictures of Silicon Valley billionaires

(55:34):
back from before they were billionaires, like, like Elon Musk before
he got money. It's so cute. Oh he's a totally
different man. He looks like Dana Carvey in that movie
where Dana Carvey played a turtle. God. Oh my god,
Oh my god, oh my god. Oh that's a good

(55:59):
note to end on, a little bit of joy. Look up Elon Musk before he got rich, and just have you a good time. Until next week, I'm Robert Evans, and I love you, roughly.
