Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Uhmerica, I don't know why I opened the show this way.
I really, I really don't know why you did. I
don't know why I did that. Um, Jamie, Jamie, Jamie, Hi, Hi,
is this a podcast? Yeah, it's like, after all these years,
(00:24):
it's a podcast. It's a podcast. It's a podcast. Sometimes
podcast I get on the phone to podcast and I
think that I'm just talking to my friends, But then
I remember that every relationship in my life is dominated
by podcasting, and I don't even know how to interact
with people outside of the filter of a zoom call anymore.
That it is incredibly accurately. I really, that's actually the
(00:49):
saddest thing I've ever heard in my entire life. Thank you, Jamie.
How are you doing it to me? Like? I'm gonna agree?
I was? I was terrible, and then I was great,
and now I'm fine. Excellent. That sounds like a solid trajectory.
That's a good little hero's journey. That's like the that's
like The Green Knight, more or less. I didn't see
that movie. It looked long. It is it is, but
(01:11):
it's quite a film. I would not see a movie
over an hour and forty minutes anymore. That's marred out.
You know what I do recommend for short movies is Herbert West Re-Animator, or Re-Animator, sorry, it was before
the Herbert West one, which was a Halloween movie. I
loved and then rewatched this Halloween and it was really
great up until one one really horrifying scene that I
(01:34):
had kind of forgotten. Re-Animator, it's got some amazing
stuff in it if you're a horror movie fan, and
then there is an incredibly uncomfortable sexual assault scene. Um
that is like really bad, like really bad. Wow, thank you.
I think we're opening this episode in a really strong,
powerful than you focused way. I want to recommend a
(01:56):
horror movie to you. If it's Midsommar, I'm not going to listen. It's not. It's not. I only watch Midsommar to get all horny for Will Poulter, and then I turn it off after he dies. But that's a reasonable
thing for a person to say. Absolutely, when the person
that I want to have sex with dies, I turn
(02:16):
off the movie because it's boring. After that, it's a
very healthy way to go through life. Thank you so much. Okay,
So the movie, which I will show you at some
point is called Pin heard of Pin so Pin Pin
is a It's about a pediatrician who has like a
life size medical dummy sitting in his office. Oh, this
(02:39):
doesn't seem like it's going a good place. The pediatrician
the only way. It's like a psychological thriller. It's not
really that gory. But the pediatrician the only way that
he can communicate with emotional honesty with his children is
by making a little ventriloquist ventriloquist voice for the dummy.
And so the ventriloquist dummy gives the kids sex ed,
(03:02):
the ventriloquist dummy does all this stuff, and then um,
and then one of the kids thinks that the ventriloquist dummy, whose name is Pin, is real. And then
Pin starts to like control his thoughts and actions. And
then there's a scene where a nurse has sex with Pin,
but he's just a dummy. Okay, it's the greatest movie
(03:24):
I've ever seen. I'm not doing it justice that that
that sounds like quite a film, Jamie. That sounds like
quite a film. A nurse has sex with Pin, Robert, and
I don't. Yeah, I mean that definitely does sound like
a scene that would make me have a very specific physical reaction.
And the best part about that scene is that the
nurse never comes back and it's never addressed in the
(03:45):
movie again. And that now that now you're speaking the language,
the language of shoddy filmmaking, something horrible happens and then
you just canonically have to forget in order to watch
the rest of the movie. No, I'm on board with that.
And you know, Jamie, now that you mentioned quitting a
movie as soon as the person you want to have
sex with dies, that may explain why I've never made
(04:06):
it past the halfway point in the first Star Wars movie.
Really wow, Yeah, once Alec Guinness is out of the picture,
why even keep watching? You know? See? Well, yeah, you're like, well,
the hottest person is gone. I don't want not a
single fuckable face. And the rest of that film a
bunch of ug goes Jamie. So you know who else
is a bunch of ug? Gosh that we did it.
(04:28):
It's so perfect. The people who run Facebook, it is
it is. We're talking today about the Facebook papers, which
is, we'll talk about this a little bit more in detail, but an enormous cache of internal Facebook documents that just got leaked, revealing a tremendous amount of fucked up shit.
And I think we have to start with the uncomfortable
(04:50):
situation that is everybody talking shit about how Mark Zuckerberg
looks like an android. I feel so mixed about it
because on one hand, yes, the thing that's bad about
him is not his appearance, but also yes, he does.
He does hit the uncanny valley. There's something missing in
his eyes. Look, there's something and and that's ivy league
(05:10):
boy syndrome, right, Like, that's not just him, that's anyone
who who graduates screen photos. I truly. I mean, even
though we did just refer to the entire cast of
Star Wars one as a bunch of ugos hideous, do
feel that it's like the worst, the most lazy thing
(05:31):
you could do is go after how someone looks when
there are so many other evil facets of him. I
will agree that there is no light in his eyes.
There's certainly no light in his eyes. There's nothing like
the pupils are very There's not that little thing that's
supposed to be there is not there. Yeah, he could
watch a kitten get murdered and it would just be
a dial tone inside his soul. He looks like a
(05:53):
character from the Polar Express. He looks he looks like
Herbert West from the movie Re-Animator, but less charismatic. He looks like Pin. Look up Pin. I'm not going to look up Pin. At this point
in the history of the show, Jamie, we've recorded a
lot of podcast episodes about Facebook. You have been there
(06:14):
for what three of them? Um? Yeah, I forget. I
forget entirely how many episodes we did on Facebook. We
did three episodes on like the creation of Facebook, and
it's kind of a brief list of its crimes. I
think we did at least one follow up, maybe two
follow ups. Um. And then we've mentioned Facebook foppery fakery
in episodes of Like It could happen here in Worst
Year Ever. Facebook is personal to me for a couple
(06:36):
of reasons. Number one, number of people who helped raise
me have slowly lost their grip on reality in the face of viral propaganda spread via Facebook's engagement algorithm, and that's kind of bummed me out. Um yeah. And
number two, my friends and I all lost our jobs
in the company we built for a decade due to
the fact that Facebook told criminal lies about video metrics
(06:57):
that they have currently been find forty million dollars for,
which also frustrating. Actually over it, it sounds like you're
really over it. And you know, I I we've talked
about this. I don't know if it was on mic or not, but yeah, I also lost my job to that. I
mean we worked at the same place. Yeah, worked it
like yeah, I mean they all the whole industry went
(07:17):
up in flames. It's so sace. I'm still I'm still
mad about it. Yeah, I am mad about it, even
though like things have been going fine, great for me
career wise. Um, it's just like kind of bullshit. It's
kind of frustrating. It used to be. Did you catch
that Robert's career is going great? Yea? In my head,
I went, You're welcome. I'm a human rocket ship. I'm
(07:39):
the Iggy Pop of talking about Facebook. You're welcome. It's kind
of nice that we all just had to pivot to, like, Okay,
you can still talk about what you're passionate about, but
just no one has to look at you anymore. And
I'm like, that's actually not the worst thing. That's that ideal.
And in my case, it was like you don't have
to write articles. You just have to tell on a microphone,
(08:01):
which involves writing an article. But I don't know, they're easier.
It's true. You can, like you can really have a
series of bullet points and be like, well, you just
don't have to edit anymore. That's that's what that's what
we that's what we got rid of in this pivot
to podcasting. Editors. We're all, we're all Greenwalding a
(08:21):
little bit. Yeah, it all worked out, but you know,
but also QAnon, and the fact that there will soon be death squads roving many of our neighborhoods.
Like there's there's downsides to it too, you know. Um,
but then also meta, you know, and then yea meta,
thank god, we're getting meta. We'll be talking about meta
at some point. Um. But yeah, like it's it's Facebook's bad.
(08:42):
I don't like Facebook. But in one of those episodes,
and I forget which of those episodes, I said something
along the lines of, at this point, there's no moral
case for remaining employed by Facebook. Um. And earlier this year,
a Facebook employee named Frances Haugen came to the same
conclusion on her own. Rather than jump out with her
stock options or whatever perks she'd accrued, and then get
another tech industry job, which is what a lot of
(09:04):
people do. I know people who have done this when
they were like, Facebook's kind of fucked up. I'm just
gonna go hop to another company and make even
more money. Uh. Instead of doing that, Frances spent weeks painstakingly photographing internal documents. Facebook has its own internal communications network
that is patterned off of Facebook, but it's for like
the corporate the the employees to use. Um well, I
(09:29):
mean it's it's like it's like Slack, but probably more
all consuming and soul destroying. Um yeah, but she is.
There's like nothing worse. I mean, personality wise, I can't
stand someone who's like really killing it on Slack. That's
one of my least favorite traits in a person, as
if they're, they're really giving it a hundred percent on Slack.
(09:51):
I'm like, just I'm I'm asleep. I hate it. No,
I mean, Sophie barely hears from me when we need
to work, let alone, when we don't need to work
Hundred percent not true. Good stuff. So rather than, you know,
take the money and run, so she gets in this
(10:12):
like internal communications app that Facebook has and because they
have protections about like because like they know that this
is a risk people leaking internal documents as a risk,
they have like security things set up and to get
past them, she just photographs a bunch of internal documents
on her phone. Um, a huge, like, she gets a lot of shit. Uh, and then she leaks those files to several news agencies and our good friends at
(10:34):
the SEC. Uh. This week, we're going to go through
some of those documents and all of the damning shit
they reveal. The first and perhaps most important revelation is
this many Facebook employees understand that their company is a
force for evil. Some of them have vowed to fix
it from the inside. Others are convinced the evil is
outweighed by some nebulous good. But at the core of it,
(10:56):
they know that what they were a part of is problematic,
and a lot of them hate themselves for it. You
can really see that coming across in some of these conversations,
evidence of the yeah it's good stuff, don't you love? Yeah,
when human beings compromise the very nature of their soul
uh in the in seeking profit Yeah. And then and
(11:16):
then you know, you watch the you watch the light
leave their eyes, and then you're supposed to feel bad
for them. Uh. Evidence of the struggle over the soul
of Facebook can be found in the reactions of employees
to the growth of the Black Lives Matter movement. After
the murder of George Floyd by a cop that June,
as protests reached their height, a Facebook employee posted a
(11:37):
message to the company's racial justice chat boards stating, get Breitbart out of News Tab. He was enraged at the
fact that the far right publisher was pushing disinformation about violence at protests, and included screen grabs of Breitbart articles with titles like Minneapolis Mayhem: Massive Looting, Buildings in Flames, Bonfires! and BLM Protesters Pummel Police Cars.
(12:00):
Wonder how much more attention they paid the police cars
than the man who was choked to death by a cop. Anyway,
good stuff, good journalism, nailing it. This employee claimed that
these articles were part of a concerted effort by Breitbart
and other right wing media sites to quote paint Black
Americans and Black lead movements in a negative way. He
argued that none of those hyperpartisan sites deserved to be
(12:21):
highlighted by the Facebook news tab, so Facebook's News tab
consists of two tiers of content providers. Right, It's just
like the tab that tells you what's going on in
the world, and all of the people whose stories get
in there have been vetted to some extent by Facebook.
So there's a first tier of big publishers like the
New York Times, the Wall Street Journal, like the big dogs,
(12:42):
and they get paid. Facebook gives the money to be
a part of the News tab. And then there is
a second tier of news sites who are not paid
but did have to get vetted as a reliable news
source for Facebook to put them on their News tab.
And bright Bart is in that ladder tier, which means
Facebook isn't giving them money directly, but is institutionally pumping
a shitload of, of, like, traffic towards their propaganda and
(13:03):
throwing a lot of their propaganda out into people's news feeds.
In their public facing statements, Facebook claims to only include
sites on their News Tab who do quality news reporting. Sites
that repeatedly share disinformation, it claims are banned. This functions
on a strike system. In July, President Trump tweeted a
Breitbart video claiming you don't need a mask to protect
(13:24):
against COVID nineteen. The video also spread misinformation about hydroxychloroquine.
Despite the fact that this video clearly violated Facebook's stated standards,
it was able to reach millions of people through the
News Tab before Facebook took it down. From the Wall Street Journal:
According to Facebook's fact checking rules, pages can be punished
if they acquire too many strikes, meaning they published content
deemed false by third party fact checkers. It requires two
(13:47):
strikes within ninety days to be deemed a repeat offender,
which can result in a user being suspended from posting content.
More strikes can lead to reductions in distribution and advertising revenue.
In a town hall, Mr. Zuckerberg said Breitbart wasn't punished for the video because that was its only infraction in a ninety day period, according to internal chats described by the Journal. Yeah, now that seems wrong, right, knowing Breitbart, that
(14:09):
they would have one strike and that's so I mean,
and was the reason that that video reached so many
people before it was taken down. It was that just
like a delay in fact checking, does that mean that
a certain number of people need to like? No? I mean?
Trump tweeted it and it spread and Facebook didn't want
to take it down until it had it had already
kind of made them some money, I think. I also
(14:31):
think it's just like they don't put a lot of
work into checking on this stuff. They don't. They don't
want to piss it. We're talking about all this, but
like they also just don't want to piss any conservatives off.
Like there's a lot of things going into why this
stuff is not enforced to any degree. Now you
express surprise at the fact that Breitbart only had
one strike in ninety days. Let's talk about why. Yeah, So,
(14:53):
thanks to Frances Haugen's leaked documents, we now know that Breitbart was one of the news sites Facebook considered managed partners.
These sites are part of a program whereby the social
network pairs handlers who work at Facebook with the website.
These handlers give Facebook a back channel to sites that
spread disinformation, which allows them to have content removed or
(15:13):
altered without giving the content maker a strike. So, in
other words, they put out the content, it gets viewed millions of times, one Facebook employee, like, messages an editor and says, like, hey, you need to change this now. It gets changed after it's spread around, and they avoid a strike and thus stay on the News Tab.
That's a good way to do a back channel. That's
(15:33):
so dark, Okay, I mean that. I I guess if
you're looking for a way to keep misinformation up, that
is a logical way to go about it. Yeah. Yeah,
they do a perfect job. So actually, you're saying that Breitbart is accurate. Yes, perfectly accurate. That is what we always say about Breitbart and Andrew Breitbart, a man who did not do cocaine right up until he died.
(15:56):
Um about someone who's who's got light in his eyes.
Not anymore, it doesn't. So, actual strikes were automatically escalated for review by senior Facebook executives, who could decide to overturn the punishment and remove a strike. Through these methods, Facebook's strike system for spreading disinformation actually proved to be nothing at all, and any sufficiently large right wing website
(16:18):
was given numerous opportunities to avoid strikes without being delisted.
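Just to make the mechanics of that concrete, here is a rough sketch of the strike-and-escalation logic in Python. To be clear, this is my own illustration, not anything from the leaked documents; the page names and dates are placeholders, and the only real rules in it are the two-strikes-in-ninety-days threshold and the managed partner escalation described above.

```python
# A rough illustration (not Facebook's code) of the strike logic described above:
# two fact-check strikes inside ninety days makes a page a repeat offender, but a
# managed partner's strikes can be escalated and quietly overturned before they count.
from dataclasses import dataclass, field
from datetime import date, timedelta

REPEAT_OFFENDER_WINDOW = timedelta(days=90)
REPEAT_OFFENDER_THRESHOLD = 2

@dataclass
class Page:
    name: str
    managed_partner: bool = False          # has a Facebook handler / back channel
    strikes: list = field(default_factory=list)

    def add_strike(self, when: date, executive_overturns: bool = False) -> None:
        # For managed partners, the escalation queue lets an executive remove the
        # strike (or the post gets quietly edited), so nothing is ever recorded.
        if self.managed_partner and executive_overturns:
            return
        self.strikes.append(when)

    def is_repeat_offender(self, today: date) -> bool:
        recent = [s for s in self.strikes if today - s <= REPEAT_OFFENDER_WINDOW]
        return len(recent) >= REPEAT_OFFENDER_THRESHOLD

ordinary = Page("small_local_news")
partner = Page("large_partner_site", managed_partner=True)

for when in (date(2020, 7, 1), date(2020, 8, 15)):
    ordinary.add_strike(when)
    partner.add_strike(when, executive_overturns=True)

print(ordinary.is_repeat_offender(date(2020, 9, 1)))  # True: reduced distribution
print(partner.is_repeat_offender(date(2020, 9, 1)))   # False: stays on the News Tab
```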
This was a problem that went further than Breitbart, as
The Wall Street Journal reports: In an internal memo, the
engineer said that he based his assessment in part on
a queue of three dozen escalations that he had stumbled onto.
The vast majority of which were on behalf of conservative
content producers. A summary of the engineer's findings was posted
to an internal message board. One case he cited regarding
(16:41):
pro Trump influencers Diamond and Silk, third party fact checkers rated as false a post on their page that said... Wait, were they, like, porn stars? Diamond and Silk? Oh, do you not know about Diamond and Silk? Who are they? Are they a band? They're not great. No, no, but they're bad. They, they're not, not nice people, not good people. Um, so they
got, yeah. So fact checkers rated false a post
(17:04):
a post that Diamond and Silk made stating, how the hell is allocating millions of dollars in order to give a raise to House members that don't give a damn about Americans going to help stimulate America's economy? When fact checkers
rated that post false, a Facebook staffer involved in
the partner program, argued that there should be no punishment,
noting the publisher has not hesitated going public about their
concerns around alleged anti conservative bias on Facebook. So this
(17:26):
is a pretty minor case, but it shows what's going
on there. They post something that's not accurate. This raise
is not something that's like going through UM and fact
checkers flag it as inaccurate, which should mean it gets
But then someone at Facebook is like, if we remove it,
they're gonna yell about us removing their post and it's
going to be a pain in the ass for us.
So just like fuck it. Yeah, this is I feel
like this is always the route that Facebook does. It's
(17:49):
just like this big, gigantic, bureaucratic style operation that people
uh, do shitty things so that they're not inconvenienced or
yelled at by someone else. Like it's also insidious and
also so boring at the same time. Yeah, it's it's
the consequences that aren't boring. Um. And to some extent,
(18:13):
this is true of a lot of the worst things
in history. There were an awful lot of men in totalitarian
societies who signed effectively or literally the death warrants of
their fellow man because like, well, otherwise it's going to
be a real pain in the ass for me, My
day at the office is going to be terrible. I
don't want to take this to the boss. I don't
(18:35):
want to escalate this, yeah, just kill them. Yeah, yeah, I mean it is. So, like, the most evil stuff is done in a very slow and boring way.
I feel like it's just because if you can get
people to, you know, fall asleep, you can get away
with fucking murder. Like literally, Yep, it's good stuff. Loving
these papers, Robert, Yeah, they're very fun. So Diamond and
(18:58):
Silk were able to pressure the third party fact checker into changing the rating on their article down to partly false, and with the help of the managed partner
escalation process, all of their strikes were removed. Um the
chat conversations that the Journal reviewed showed that inside the
Facebook employees repeatedly demanded that higher ups explain the allegations. Quote,
(19:19):
we are apparently providing hate speech policy consulting and consequence
mitigation services to select partners, wrote one employee. Leadership is
scared of being accused of bias, wrote another. So that
seems bad. That doesn't seem good. Now, that seems like the
root of a lot of problems we've been having as
a society. Then like, well, conservatives are loud when they're angry,
so let's just let them lie and try to get
(19:41):
people killed. Diamond and Silk was doing that in that case,
but that, that's a thing in right wing media. Now when
you're saying, I don't know what to picture when you
say diamond and silk. So at first I was picturing
porn stars. Then I was picturing a hair metal band. They look more like gospel singers. I was picturing two springer spaniels most recently. I don't think... I'd stay there. Yeah, no,
(20:03):
I wouldn't. I wouldn't. You should? You should look them up. UM, Yeah,
they're, they're, they're. There are two musicians who, like, posed with Trump and have, like, a... I think they're on TikTok.
They're just like right wing media influencers, and they're they're
they're not they're not great people. UM. In a farewell
memo to colleagues, one member of Facebook's Integrity
(20:24):
team, um, and the Integrity team, their job is to reduce harmful behavior on the platform, uh, complained that Facebook's tolerance for Breitbart stopped them from effectively fighting hate speech. Quote:
We make special exceptions to our written policies for them,
and we even explicitly endorse them by including them as trusted partners in our core products. Yeah, it's, it's bad, and
(20:45):
you can see, like, there's this constant, which the Facebook papers reveal, there's this constant seesawing aggression between the
integrity team, the people whose job is to reduce the
harm the site does, and everyone else who's only real
job is to increase engagement on the site. Right. That
is how you get your bonus, That is how you
get the kudos from the boss is keeping people on
(21:06):
the site for longer. So most of Facebook that is
their job, and a small number of people their job
is to try and make sure the site doesn't contribute
to an ethnic cleansing, and the ethnic cleansing people like
the people trying to stop that. The best way to
do that is always going to be to do things
that cut down on engagement with the site, and so
they nearly always lose the fights they have with everybody else. Um,
(21:28):
Jesus Christ. Yeah, it's great, Okay, okay, Yeah, that is
the scariest extension of that logic, yep ye. One thing
we know thanks to the Facebook papers is that the
social network launched a study in two thousand nineteen to
determine what level of trust its users had in different
media organizations. Out of dozens of websites across the US
and UK, Breitbart was dead last. Facebook themselves rated
(21:51):
it as low quality, which, again based on the company's
own claims about how they decide who to include in
the News Tab, would disqualify Breitbart. And guess what, Breitbart is still a trusted Facebook partner. Oh hey,
what's this unrelated news clip from a November twenty twenty one Washington Post article doing in my script? Quote: Breitbart is the most influential producer of climate change denial
(22:11):
posts on Facebook, according to a report released Tuesday that
suggests a small number of publishers play an outsize role
in creating content that undermines climate science. Good shit, right,
that's still number one after all these years. Good Isn't
that a good thing? Isn't that a good thing? So
they haven't said two inaccurate things in the last ninety days,
which I find hard to believe. Facebook's terror at the thought of offending conservatives
(22:35):
by cracking down on hate speech and rampant disinformation started.
I don't know if it started, but it really, it
really hit the ground running in two thousand sixteen, during
the only election that was even worse than this last election.
In May of that year, Gizmodo wrote an article reporting
that Facebook's trending topics lists suppressed conservative views. A handful
of ex employees made claims that seemed to back these allegations.
(22:58):
up. Now, reporting later in the year by NPR made it clear that this was bullshit. Quote: NPR called
up half a dozen technology experts, including data scientists who
have special access to Facebook's internal metrics. The consensus there
is no statistical evidence to support the argument that Facebook
does not give conservative views a fair shake. But truth
never matters when you're arguing with conservatives. They needed a
(23:20):
reason to threaten Facebook with regulation, et cetera. And when Trump won later that year, the social network decided these threats might have teeth, and so they were going to spend the next four years allowing them to say whatever the fuck they want, no matter how racist, no
matter how conspiratorial, no matter how many shootings it may
help to inspire, no matter how no matter how many
shootings may be live streamed on the platform, like the
(23:41):
Christchurch shooting. Um, we're gonna let it all in, all because, yeah, money. Well, because otherwise they get yelled at and maybe regulated. The conservatives might get angry. I don't
want the conservatives to get angry. The funny thing is
there's no stopping them from getting angry. Right, you know
(24:02):
how this works. I know how this works. They're gonna
be angry, and they're going to claim bias no matter what,
which is what they do. And so as Facebook gives
them a free pass and their content is consistently the
most influential stuff on the entire site, allegations of anti
right wing bias continue to spread, even though again like
eight or nine out of the ten top shared posts on any given day are from right wing media.
(24:24):
But you know what's not from right wing media? Jamie?
What all these products and services that you're not at
all sure? You can't at all, not at all. We
have a different we have a different brand of brain
pills than the ones Alex Jones sells, and ours have less than half the lead. That is, that is a promise, Jamie. However much lead you think a pill should have, it's less than that.
(24:47):
Because we care, I'll take your sick little centrist brain pill.
See if I care. I could start watching MSNBC at any moment. Okay, get brain cooked, get brain cooked. Uh,
here's some other products. All right, so we're back. Okay,
(25:11):
So we're back. So we're back. You think it's gonna get happy? You think it's gonna get happy? I think it's gonna get funny. Not really. Okay, just checking. Yeah,
that's not really. I mean Mark Zuckerberg will like, I
don't know, fall down a man hole someday. Maybe we're lucky.
That would be funny. That would be funny. In two
(25:32):
thousand eighteen, a Facebook engineer claimed on an internal message
board that the company was intolerant of his beliefs, the
reality is almost certainly that his co workers found him
to be an obnoxious bigot. I say this because he
left the company shortly thereafter and hit the grifting circuit,
showing up on Tucker Carlson's show. He just he does
the thing that like you remember two eighteen nineteen, a
bunch of these guys were like leaving big tech companies
(25:53):
and like going on the Alex Jones show. There was
one guy who left Google and, like, brought a bunch of leaks, but they never amounted to anything, because it was never anything beyond people being like, this guy kind of seems like he sucks. It's very funny. Those press
tours were Yeah, that was truly that feels like it
was ten years ago, but yeah, it was. It was
funny because like I think the first one of these
(26:15):
dudes did all right money wise, but after that, like
the spigot dried up and so they were just like
detonating their careers in the tech industry for nothing, going
to work for Gab afterwards. So, after the election,
and I apologize for the rate that we're jumping around
here on the timeline, but it's unavoidable. Facebook became the
(26:37):
subject of bad pr from the left as well. The
Cambridge Analytica scandal broke, and the outrage in the
wake of Trump's election meant that Facebook was being pressured
to do something about bigotry and disinformation spreading on their platform.
At the same time, the Republicans are in charge now,
so they can't actually do anything otherwise they'll be attacked
for being biased and maybe regulated. So they tested a
(26:58):
couple of different changes. One was a tool called sparing sharing,
which sought to reduce the reach of hyper posters. Hyper
posters are exactly what they sound like. These are users
that have been shown to mostly share false and hateful information,
and so reducing their virality was seen as potentially helpful.
This seems like a sensible change, right. Oh, these people
(27:18):
are are sharing at an incredible rate and it's all
violent trash. Let's reduce the number of stuff. I guess
that's like, that's a that's a real band aid just
be like, Okay, we're gonna have them. They can still
share stuff, but just less hateful stuff. Yeah, and it's
not less garbage. It's not even a shadow ban because
(27:39):
the shadow ban would imply that, like you are actually reducing,
like artificially the spread, You're no longer artificially inflating their
reach because their stuff gets great engagement, right, because it
pisses people off even though it's untrue, and the algorithms
the default is, oh, this pisses people off. Let's let
everybody see what this asshole saying. And they're just being like, well,
(28:00):
let's not do that for these specific assholes, right, that's
all they're doing. Um, it's not a ban, it's a, we're going to stop inflating these people's reach to the
same extent that we were. Seems like a sensible change.
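Since that distinction between a ban and a ranking tweak matters, here is a toy version of what sparing sharing amounts to. This is my own sketch, not leaked code, and every number in it is invented; the only real idea is that the feed stops giving full ranking credit to reshares from flagged hyper posters instead of hiding their posts.

```python
# Illustrative only: "sparing sharing" as a ranking tweak, not a ban.
# hyperposter_weight = 1.0 is the status quo (full credit for their reshares);
# lowering it stops artificially inflating their reach. All numbers are placeholders.
def rank_score(base_engagement: float,
               reshares_by_hyperposters: int,
               reshares_by_everyone_else: int,
               hyperposter_weight: float = 1.0) -> float:
    """Higher score means the post is pushed into more feeds."""
    return (base_engagement
            + reshares_by_everyone_else
            + hyperposter_weight * reshares_by_hyperposters)

post = dict(base_engagement=10.0, reshares_by_hyperposters=500, reshares_by_everyone_else=50)

status_quo  = rank_score(**post)                          # 560.0
stronger    = rank_score(**post, hyperposter_weight=0.2)  # 160.0 (a heavier reduction)
weakened    = rank_score(**post, hyperposter_weight=0.6)  # 360.0 (a softer reduction)
print(status_quo, stronger, weakened)
```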
You know who disagreed with that, Jamie Loftus, who Robert Evans,
Joel Kaplan, former deputy chief of staff to George W.
Bush and Facebook head of Public Policy UM famous right
(28:24):
wing shithead Joel Kaplan, who is huge at Facebook
UM and is a major driving force behind don't piss
off Conservatives. That's that's the guy that he is. That's
his whole job. How are we supposed to work together
if we're pissing on the conservatives? It actually at rising tide. Yeah,
so Kaplan's like, most of these hyper posters are conservatives.
(28:47):
This is this is you know, unfair, and he convinces
Zuckerberg to weaken, to have his engineers weaken the tools so
that they do kind of reduce the influence that these
hyper posters have, but not by as much as they
wanted to, and it doesn't really seem to have had
much of an impact. As we will talk about later,
this is still the way Facebook works. So however, to
whatever extent they did reduce the harm, it was not
(29:09):
by much. Hyper posters is also, like, way too cool of a word to describe what that is, which is spreaders of hate speech. Why give them a cool name
like that? Yeah, why give him a cool name like that?
Although I don't know that might have that sounds like
something we might have said as like an insult to
people when I was young and on the internet. You're
(29:32):
a hyper poster. I don't know, dude, you're like hyper
posting right now. You need to chill the funk out.
I'm picturing someone sitting at their, at a filthy keyboard in a Power Ranger suit. I am, I am imagining the filthy Power Rangers suit, Jamie. Oh, it's
really dirty and it doesn't fit either way too big
or way too small. Yeah. They they have soiled themselves
(29:55):
in it on more than one occasion because they can't
stop posting, Robert, because they're post too much. It was
It was not an accident. It was a choice they
made in order to finish an argument. I'm going to
make an oil painting of that exact image, Jamie. I
swear to you, I will put that up in my
living room. I will put it on my roof like
(30:17):
the Sistine Chapel. Really? Absolutely. Don't underestimate how much free
time I have, Robert, I would never do that. You
work for the Internet. So another attempted tool to make
Facebook better was called informed engagement. This was supposed to
reduce the reach of posts that Facebook determined were more
(30:37):
likely to be shared by people who had not read them.
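Sketched out, the informed engagement idea is simple enough to fit in a few lines. This is just my own toy version with made-up thresholds, not Facebook's actual rule: estimate how often a post gets reshared by people who never clicked the link, and quietly demote the ones where that fraction is high.

```python
# Hedged sketch of "informed engagement": demote posts that are mostly shared unread.
# The threshold and demotion factor are invented for illustration.
def informed_engagement_multiplier(shares: int, shares_without_click: int,
                                   threshold: float = 0.6,
                                   demotion: float = 0.5) -> float:
    if shares == 0:
        return 1.0
    blind_share_rate = shares_without_click / shares
    return demotion if blind_share_rate > threshold else 1.0

# An outrage-bait headline shared mostly unread gets its distribution halved;
# a post people actually open is left alone.
print(informed_engagement_multiplier(shares=1000, shares_without_click=850))  # 0.5
print(informed_engagement_multiplier(shares=1000, shares_without_click=200))  # 1.0
```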
This rule was instituted, and over time, Facebook noticed a
significant decline in disinformation and toxic content. They commissioned a study, which is where the problems started. From the Wall Street Journal:
The study, dubbed a political ideology analysis, suggested the company
had been suppressing the traffic of major far right publishers,
(30:58):
even though that wasn't its intent. According to the documents, very conservative sites, it found, would benefit the most if the tools were removed, with Breitbart's traffic increasing, the Washington Times's by an estimated eighteen percent, the Western Journal's by sixteen percent, and the Epoch Times's by eleven percent, according to the documents. The study was designed... That's why you never conduct a study, Robert. Yeah. Yeah.
(31:22):
They find, basically, hey, if you reduce the reach of posts shared by people who have not read the article, Breitbart's traffic drops. I still, I still think that
that those like um, the tools that have developed over
time to be like, are you sure to read the article?
(31:44):
Our soul? Goofy? I I do like when Twitter, um,
I feel like they're like I just picturely a little
shaking person. Are you Are you sure you don't want
to read the article before you retweet it? What do
you think? I was like No, I felt so bad
because there's been times when I've like retweeted, like shared
my own articles that I've written, and because like I
(32:07):
wrote them, I didn't necessarily click the link before sharing.
I just like woke up and I like shared it,
and it's like, are you sure you don't want to
read this? And I just click to share because it's
like nine or ten in the morning and I haven't
had my coffee yet. But I feel bad even though
it's like, well, I wrote the motherfucker, like I know
what's in there. Usually by the time I share something,
I have already read it. But I do I think
(32:29):
that that function. I think it has a good purpose
and it like pains something in your brain that is
you know, yeah, I think it is a good thing.
It's just funny. It's this little Oliver Twist that appears in front of you and it's like, are you sure you don't want to read the article before you share? We're following, like, like, I'm good, I can read, it's all good. Would you like to maybe read the article before suggesting that an ethnic group be slaughtered for their crimes? Right,
(32:52):
And that's where he really comes in handy, doesn't he? Yeah, yeah,
that guy shouldn't be British. Um. So, so the study, like,
the reason they conduct this study is in the hopes
that it will like allow them to say that it's
not biased. But then it turns out that, like, I
wouldn't call it biased. But this change, which is unequivocally
(33:16):
a good thing, impacts conservative sites, which are lower quality
and more often shared by people who haven't read the
articles but are incensed by a shitty, aggressive headline like
the Breitbart ones we just read. Those get shared a lot and they don't read the article, and that's great for Breitbart, um. But they decided, like, oh shit, actually,
this study, the results of this study were absolutely going
(33:37):
to be called out for bias. One of the researchers
wrote in an internal memo, we could face significant backlash.
for having experimented with distribution at the expense of conservative publishers.
So the company dumps this plan, they kill it. It
is bad for Breitbart, good for the world. If it's bad for Breitbart, if it's bad for the Bart, we gotta, we gotta can the plan. Bad for the Bart? Can it. They have always said that at Facebook. Yeah, I say
(34:01):
you are Sheryl Sandberg? Actually, not a lot of people
know that. Um, yeah, I, you know, lay off of making fun of Sheryl. She's very... So the good
news is that Facebook didn't just make craven decisions when
threatened with the possibility of being called out for bias.
They were also craven whenever a feature promised to improve
(34:23):
the safety of their network at the cost of absolutely
any profitability. In two thousand nineteen, Facebook researchers open to
study into the influence of the like button, which is
one of the most basic and central features of the
entire platform. Unfortunately, as we're discussing more detail later, likes
are one of the most toxic things about Facebook. Researchers
found that removing the like button, along with removing emoji
(34:45):
reactions from posts on Instagram, reduced quote stress and anxiety
felt by the youngest users on the platform, who all
reported significant anxiety due to the feature. But Facebook also
found that hiding the like button reduced the rate at
which users interacted with posts and clicked on ads. So
now this is more of yeah, yeah, no, this is
(35:05):
I will say, more of a mixed bag than the
last thing because removing the like didn't like, it made
one group of young people feel better, um, but not
other groups of young people like. It didn't reduce it
reduced like kids social anxiety, but it didn't have as
much of an impact on teens really, so it's not
as clear cut as the last one, but still a
protective effect had been found among the most vulnerable people
(35:28):
on Instagram in particular. Um, but they don't they don't
do anything about it because you know, that's so frustrated.
That is genuinely, very very valuable interesting information where I
don't know, I mean, I I feel like it probably
didn't affect teenagers because by that point it's like, I mean,
you don't want to say like too late, but by
(35:50):
that point you're so hardwired to be like, well, I
can tell what is important or like what is worth
discussing based on likes, and once that that's just such
a sticky I mean, I still feel that way, even
though it's like you can objectively know it's not true,
but once you've been kind of pilled in that way.
It's it's very hard to undo. Yeah, it's Um, I
(36:13):
don't know what it is, Jamie, it's not great. Um
upsetting that. Yeah, it's not great. Uh So, as time
went on, research made it increasingly clear that core features
of Facebook products were fundamentally harmful. From BuzzFeed quote, time
and again, they determine that people misused key features or
that those features amplified toxic content, among other effects. In
(36:35):
August and in August two, nineteen internal memos, several researchers
said it was Facebook's core product mechanics, meaning the basics
of how the product functioned, that had let misinformation and
hate speech flourish on the site. The mechanics of our
platform are not neutral, they concluded. So there's Facebook employees
recognize internally, we are making decisions that are allowing hatred
(36:56):
and other and just unhealthy toxic content to spread, and
we're not. The bias is not in us fighting it.
Our biases in refusing to fight it, like we are
not being neutral because we're letting this spread. The people
making the site work recognize this. They talk about it
to each other. Um, it makes it. They feel guilt
over it. They talk about it. You know, we know this. Yeah, yeah,
(37:18):
I mean we've discussed that before. Of just like the
existential stress of working at Facebook. Not the most empathetic
problem in the world, but not at all. A lot
of them are making bank but clear paper trail though
of deep existential guilt and distress. Now, uh, it's it's
(37:38):
pretty cool. Yeah, it's pretty cool. Pretty cool. So, rather
than expanding their tests on the impact of removing the
like button on a broader scale, Mark Zuckerberg and other
executives agreed to allow testing only on a much more
limited scale, not to reduce harm, but to quote build
a positive press narrative around Instagram. So not not to
actually help human beings, but to have us something to
(38:00):
brag about. Right, for him to be like, we're so nice,
We're so cool, We're gonna have fucking rad we are
I am begging the guy to stop the dated slang. He's
going to be like, yeah, this is gonna be uh,
we gotta get some lit press around Instagram, know what
I mean. Yeah, in three years, someone's going to tell
him the word poggers and then he's gonna say it
(38:22):
like thirty times in a week to all of his
imaginary alien friends on Meta. Like, that's, bro, Zucker, I hate that. Screaming into a void. In September of two thousand twenty, Facebook rolled
out a study of the share button. This came in
the wake of a summer of unprecedented political violence, much
(38:43):
of its stoked via viral Facebook posts and Facebook groups. Uh,
the shit at Kenosha started on Facebook in a lot of ways. Uh, we were tracking it that night. A hell of a lot of that shit got started on Facebook.
A lot of the like let's get a militia together
and protect businesses. You know, it's good stuff. Company research
identified reshare aggregation units. So they identify one of the problems
posts that you're so they identify one of the problems
leading to all of this is what they called reshare
aggregation units. And these were automatically generated groups of posts
that were sent to you that were posts your friends liked, right,
and they recognize this is how a lot of this
bad shit is spreading so right, right, that's creating a
feedback loop on purpose. Yes, in part because users are
(39:26):
much more likely to share posts that have already been
liked by other people they know, even if those posts
are hateful, bigoted, bullying, or contain inaccurate information. So
if somebody gets the same post in two different ways,
if they just like see a bigoted article pop up
on their on their their Facebook feed, but they're not
being informed that other people they know have liked it
or shared it, they're less likely to share it than
(39:47):
if like, well, my buddy shared it. So maybe now
I have permission, right, and you can think about how
this happens on like a societal's level, how this has
contributed to everything we're dealing with right now. Um, so
I feel like everyone knows someone who has probably was
very very influenced by that exact function. That's yeah, God,
that's awful. So company researchers in September like these research
(40:08):
aggregations the fact that we don't have to do it
this way, right, we can only show people that the
articles that their friends comment on at the very least
as supposed to just like or just share without much commentary. Like,
they have a number of options here that could at
the very least reduce um um harmful content because when
every like I think the numbers like three times as
likely to share content that's presented to them in this way.
(40:30):
So in May of that year, um, while you know,
so this is actually months before Facebook researchers find this out.
Myself and a journalist named Jason Wilson published what I
still think is the first proper forensic investigation into the
Boogaloo movement. Um. It noted how the spread of the
movement and its crucial early days was enabled entirely by
Facebook's group recommendation algorithm, which was often spread to people
(40:53):
by these reshare aggregation units. You see, oh my buddy
joined this group where everybody's sharing these like Hawaiian shirt
photos and pictures of maybe I'll hop in there, and
you know, the cycle goes on from there. When you've
joined one group, it sends you advice like, hey, check
out this other group, check out this other group, and
it starts with like we're sharing memes about like Hawaiian
shirts and and you know, the boogaloo. And then five
(41:15):
or six groups down the line, you're making serious plans
to assassinate federal agents or kidnap a governor. You know, yeah,
I mean that he's uh, you know, I remember where
I was when I read it, because the how steep
the escalation is and how quickly it like, it's not
I guess not completely surprising, but at the time I
(41:36):
was like, oh, that that is a very short timeline
from yeah, so I don't like it. It's not good.
It's not good, and in fairness, um, there are actually
some face like it kind of becomes accepted the stuff
that Jason and I were writing about in May by
a lot of Facebook researchers around September of that year.
(41:57):
But there were people within Facebook who actually tried to
blow the whistle on this earlier, um, in fact, significantly earlier.
In mid two thousand nineteen, an internal researcher created an
experimental Facebook account, which is something that like certain researchers
would do from time to time to see what the
algorithm is feeding people. This experimental account was a fake conservative mom, and they made this account because they
(42:18):
wanted to see what the recommendation algorithm would feed this account.
And I'm gonna read from BuzzFeed again here. The internal research,
titled Carol's Journey to QAnon, detailed how the
Facebook account for an imaginary woman named Carol Smith had
followed pages for Fox News and Sinclair Broadcasting. Within days,
Facebook had recommended pages and groups related to QAnon,
the conspiracy theory that falsely claimed Mr Trump was facing
(42:39):
down a shadowy cabal of democratic pedophiles. By the end
of three weeks, Carol Smith's Facebook feed had devolved further.
It became a constant flow of misleading, polarizing, and low
quality content. The researcher wrote, Yeah, how did so so
some some some jerk was like, let's call it Carol,
like they Carol stereotype. Statistically not unlikely that it could
(43:02):
It could have been a Carol that is. I mean
that that's interesting that we also all know a Carol.
A Carol. Yeah, unfortunately. So they're in They're in the
they're the Dunkin' Donuts drive through. I, I walk among them. Yeah,
they live among us. They are dating you walk among them.
That's so funny. You eat hot dogs next to all people. Yeah,
(43:28):
we're in mine. I think that hot dog eaters are
maybe more politically astute bunch. But the Dunkin' Donuts line
is just absolutely unmitigated chaos. There could be the politics
of the Dunkin' Donuts line are all over the place.
They are they are anarchists. In the line. You have
you have the scariest people you've ever met in the line.
(43:49):
You have Ben Affleck in the line looking like his entire family just died in a bus crash, Ben Affleck in the line and you can see his little dragon back piece. Oh my god. It's a phoenix, Jamie, come on,
come on, sorry, that was disrespectful. That was disrespectful. And
you're right and you're right, thank you, thank you. So
(44:10):
this this study with this fake account that immediately gets radicalized.
This study, uh we it comes out in the Facebook
papers right this year, but it was done in two
thousand nineteen. And when this year, like information of this
dropped and journalists start writing about it, um, they do
what journalists do, which is you you, you put together
your article and then you go for comment. Right. And
(44:31):
so the comment that Facebook made about this experiment that
this researcher didn't two thousand nineteen was well, this was
a study of one hypothetical user. It is a perfect
example of research the company does to improve improve our
systems and helped inform our decision to remove QAnon from the platform. That did not happen until January
of this year. They didn't do shit for two years after this.
(44:53):
They only did shit because people stormed the fucking Capitol
waving QAnon banners. You motherfuckers. Sounds like them, sounds like that. They're like, oh, let's wait until things
get so desperately bad that the company will be, you know,
severely impacted if we don't do something. The huge amount
of the, of the radicalization. QAnon gets supercharged
by the lockdowns, right because suddenly all these people, like
(45:16):
a lot of them, are in financial distress, they're locked
in their fucking houses, they're online all the goddamn time.
And they knew they could have dealt with this problem
and reduced massively the number of people who got fed
this poison during the lockdown. If they've done a goddamn
thing and do that's a nineteen. They had the option,
they did not. Yeah, okay, so no surprises here that
(45:38):
researcher sighed. Nothing bad happened, right? It did not. Name
one bad thing that happened. Well, a lot happened, actually, and was
pretty heavily tied to this. Uh. In August of that
researcher left the company. She wrote an exit note where
she accused the company Facebook of quote knowingly exposing users
(45:58):
to harm. We've known for over a year now that
our recommendation systems can very quickly lead users down a
path to conspiracy theories and groups. In the meantime, the
fringe group slash set of beliefs has grown to
national prominence, with QAnon congressional candidates and QAnon hashtags and groups trending on the mainstream. Out of fears,
out of fears over potential public and policy stakeholder responses,
(46:21):
we are knowingly exposing users to risks of integrity harms.
During the time that we've hesitated, I've seen folks from
my hometown go further and further down the rabbit hole.
It has been painful to observe. Okay, okay, and I
mean it is I mean, no arguments there. It is
very painful to observe that happen to people who are
(46:43):
not Yeah, that's a Facebook employee who's not going to
get any shit from me. Um. She identified the problem,
she did good research to try to make clear what
the problem was, and when she realized that her company
was never going to take action on this because it
would reduce their profits, she fucking leaves. And she, she
(47:03):
does everything she can to let people know how unacceptable
the situation is within the company. You know, I mean
that is good that's what that is the minimum. That
is the minimum, right, Yeah, I mean it is a
little silly to be like, and I just recently realized
that I don't think Facebook is ethical. You're like, like,
shut up, no way, I don't. I don't know when
(47:23):
this person started, but like, yeah, there they she got
there and she's clearly horrified by what she like realized
the company was doing. Like again, we've all been where
she is. Where you just see these people you grew
up with lose their goddamn minds and it's bad. It's bad.
And you're like, oh and I'm I'm and I work
in the nucleus of the problem, of the problem. Interesting. Yeah,
(47:47):
I mean that's why I had to quit working at
Purdue Pharmaceuticals. Yeah. I do miss the free Dilaudid. I know,
I know, I know, I know all those I was
a great salesman. You're so good. Who plays a Purdue Pharmaceuticals salesman?
Or no, maybe it's that Will Poulter. Oh yeah, okay.
You know who didn't? Alec Guinness. Because he never
(48:08):
loved He never lived to see opiate's become what they
are today. Tragic. He never lived to taste the sweet
taste of tramadol or Dilaudid, and we don't talk about
that enough. We don't talk about that enough. What a shame?
What heartbreak. I will say, what if Alec Guinness had access to high quality, pharmaceutical grade painkillers: an essay. I
(48:30):
think it would have been sweet. I'm pitching it. I'm
pitching it. Okay, Um, someone who's better at movies than
I could have made a Trainspotting joke there, because Ewan McGregor is in the heroin movie and then he played Obi-Wan.
I don't know, there's some joke there, but I didn't
come up with it. Okay. Someone Yeah, someone figure that
out in the reddit, and then don't tell us about it.
(48:51):
Do not tell us, because I've never seen Trainspotting. I'm just aware of it. Yeah, I haven't seen it,
but like, you know what it's about. I'm so really yeah,
I know what it's about. I also get so stressed
out when I anytime it's not often, but anytime I've
had to say Ewan McGregor's name, I say it so weird. It's the worst thing in the world. Saying Ewan McGregor's
(49:12):
name is the most frightening experience a human can have.
I can't. I can't make my mouth make that shape.
It's embarrassing. I think what he has to live with.
Thankfully he's gorgeous. That must make it easier. Yeah, I
mean being sexy has to help. It has to help. Yeah.
You know who does know how to pronounce Ewan McGregor's
name and never feels any anxiety over it because their
(49:35):
friends they hang out on the weekend? Oh nice? Who
Who? The products and services that support this podcast are all good buddies with Ewan McGregor. He hangs out with
the highway patrol. Ah, we are back. As the election
(50:01):
grew nearer, disinformation continued to proliferate on Facebook, and the
political temperature in the United States rose ever higher. Facebook
employees grew concerned about the wide variety of worst case
scenarios that might result if something went wrong with the election.
They put together a series of emergency break glass measures.
These would allow them to automatically slow or stop the
formation of new Facebook groups if the election was contested.
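For what it's worth, a break glass measure like that is conceptually simple, and here is a rough sketch of the shape of it. This is purely my own illustration, nothing from the documents; the flag names and the join cap are invented.

```python
# Purely illustrative "break glass" switch: once flipped, new group creation is
# halted and group joins are throttled instead of blocked outright. The flag
# names and the per-hour cap are invented for the example.
BREAK_GLASS = {"halt_new_groups": False, "max_group_joins_per_hour": None}

def activate_break_glass() -> None:
    BREAK_GLASS["halt_new_groups"] = True
    BREAK_GLASS["max_group_joins_per_hour"] = 5   # slow the growth machinery, don't erase it

def can_create_group() -> bool:
    return not BREAK_GLASS["halt_new_groups"]

def can_join_group(joins_this_hour: int) -> bool:
    cap = BREAK_GLASS["max_group_joins_per_hour"]
    return cap is None or joins_this_hour < cap

# Normal operation: groups form freely. If the election looks contested and
# violence looks likely, flip the switch and group formation pauses.
activate_break_glass()
print(can_create_group(), can_join_group(joins_this_hour=12))  # False False
```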
(50:25):
This was never stated, but you get the feeling. They
were looking at Kenosha and how Facebook groups had led
to spontaneous and deadly militia groups forming up due
to viral news stories. My interpretation is that they were
terrified of the same sort of phenomenon that it would
lead to Facebook like fueling a civil war. Like I
think that we're literally worried that like something will go wrong,
people will form militias on Facebook and there will be
(50:45):
a gunfight where a shitload of people die, and
that escalates to something worse, and everyone will say it
started on Facebook because that happened in Kenosha, Like, this
is not a stretch. It happened with the Boogaloo stuff. Yeah, it happened several times last year. It's a
fair anxiety. We've seen it happen. Yeah, it was never
a massive exchange of gunfire, thank fucking god. Um, but
they were they thought they saw that possibility. This is
(51:07):
the thing I was worried about for the entirety of that year, um,
and we got really close to it several times. I
was there for a few of them. It sucked. UM.
So they're worried about this and they start coming up with, they try to figure out, like, break glass, like, emergency measures they can take to basically, like, shut all that shit down,
like stop people from joining and making new Facebook groups.
If they have to write if like it becomes obvious
(51:30):
that something needs to be done. Um and yeah, so
they they uh yeah. In September, Mark Zuckerberg wrote on
an internal company post that his company had, quote, an obligation, had, quote, a responsibility to protect our democracy. He
bragged about a voter registration campaign the social network had funded,
and claimed they'd taken vigorous steps to eliminate voter misinformation
(51:52):
and block political ads, all with the stated goal of
reducing the chances of violence and unrest. The election came,
and it went all right, from Facebook's perspective. The whole
situation was too fluid and confusing in those early days
after the election. You know where we're getting the counts in.
Everything's down to the fucking wire. It was all too
messy for there to be much in the way of
violent on the ground action to occur. Wow, like that
(52:14):
was getting sorted out because people just didn't know where's
it going to land. Um, so they think, oh, we
dodged a bullet. Everything was fine because they're dumb. Oh baby. Yeah.
The reality, of course, is that misinformation about election integrity
spread immediately like wildfire. On November five, one Facebook employee
warned colleagues that disinformation was filling the comments section of
(52:36):
news posts about the election. The most incendiary and dangerous
comments were being amplified by Facebook's algorithm to appear as
the top comment on popular threads. On November nine, a
Facebook data scientists reported in an internal study that one
out of every fifty views on Facebook in the United States,
and fully ten percent of all views of political information
on the site, was content claiming the election had been stolen.
(52:59):
Ten percent of Facebook's political posts are, the election was stolen? Yeah,
one out of fifty views on Facebook is someone saying
the election was stolen. This shit is out of control. Presumably, everyone
engaging to agree. Wow, Okay, I honestly, honestly, I would
have guessed that it would have been higher. But but
one intent is still there's a lot going on on Facebook. Yeah.
(53:21):
The researcher noted there was also a fringe of incitement to violence. And I would quibble over the word fringe, because I don't think it was very fringe. Um. But otherwise the data is interesting. You know, like, ten percent is a lot of people. It's not the fringe. He's saying the incitement to violence was a fringe of the posts claiming the election had been stolen. I disagree with his interpretation of
(53:44):
that based on my own amount of time that I
spent looking through these same posts. But whatever, maybe we're
looking at different posts. You know, there's a lot going on on Facebook in those days. Facebook did not... yeah, Facebook did not blow the whistle or sound the alarm or do anything but start canceling its emergency procedures. They were like, we get it, the critical period's over. Everything's gonna be fine and dandy, baby. Um. They thought
(54:08):
the danger of post election violence was over. And most
of all, they thought that if they took action to
stop the reach of far right propaganda users, then conservatives
would complain. As we now know, the most consequential species of disinformation after the November election would be the Stop the Steal movement. The idea behind the campaign had its origins in the election, as essentially a fundraising grift from Roger Stone.
(54:30):
Ali Alexander, who is a ship head, adapted it in the wake of the election, and it wound up being a major inspiration for the January sixth Capitol riot. As
we now know from a secret internal report, Facebook was
aware of the Stop the Steel movement from the beginning.
Quote from Facebook: The first Stop the Steal group emerged on election night. It was flagged for escalation because it
(54:51):
contained high levels of hate and violence and incitement, V and I, in the comments. The group was disabled and an investigation was kicked off looking for early signs of coordination and harm across the new Stop the Steal groups that were quickly sprouting up to replace it. With our
early signals, it was unclear that coordination was taking place
or that there was enough harm to constitute designating the term.
It wasn't until later that it became clear just how
(55:13):
much of a focal point the catchphrase would be, and
that they would serve as a rallying point around which
a movement of violent election delegitimization could coalesce. From the
earliest groups, we saw high levels of hate, V and I, and delegitimization, combined with meteoric growth rates. Almost all of the fastest growing Facebook groups were Stop the Steal during their peak growth. Because we were looking at
(55:33):
each entity individually rather than as a cohesive movement, we
were only able to take down individual groups and pages
once they exceeded a violation threshold. We were not able
to act on simple objects like posts and comments because
they individually tended not to violate even if they were
surrounded by hate, violence and misinformation. After the Capitol insurrection... Yeah,
that is such garbage. I mean, it's like, and I
(55:56):
know that you have examined far more of these posts
than I have in depth, but it's just the fast-and-looseness with which people interpret... like, it's just, it's like free-form jazz, the way people interpret Facebook community rules.
Because I feel like in groups like that and in
groups like less inflammatory than that, there is just constant
(56:17):
breaking of the Facebook community rules. It's just uh, it's yeah, yeah, yeah,
they're not really moderated at all. Um. Yeah, So this
is interesting to me for a few reasons. For one,
it lays out what I suspect is the case these
researchers and Facebook employees needed to believe and be able
to argue in order to not hate themselves. Um, the
idea that like, we just didn't recognize it was coordinated.
(56:39):
We thought it was all it was. It was all
kind of grassroots and happening like organically, and so it
was much more complicated for us to try to deal with.
I think they need to believe this. I'm going to
explain why it's not a good excuse. Starting in December of two thousand nineteen and going until May, the Boogaloo movement expanded rapidly in Facebook groups. Incitements to violence semi-regularly got groups nuked,
(57:00):
and members adopted new terms in order to avoid getting deplatformed. It became gradually obvious that a number of these groups were cohesive and connected, and this was revealed throughout the year in a string of terrorist attacks by Boogaloo types in multiple states. When these attacks began, Facebook engaged in a much more cohesive and effective campaign to ban Boogaloo groups. The Boogaloo movement and Stop the Steal
(57:21):
are of course not one-to-one analogs, but the fact that this occurred earlier in the same year, resulting in deaths and widespread violence, shows that Facebook fucking knew the stakes. They could have recognized what was going on with the Stop the Steal movement earlier, and they could have recognized that it was likely much more cohesive than it may have seemed. A decision was made not
to do this, not to act on what they had
(57:42):
learned earlier that year, and I would argue, based on
everything else, we know that the reason why this decision
was made was primarily political, like they didn't want to
piss off conservatives, you know. Yeah, I mean, and that's
that is like a criminal level of negligence. I would argue that's leaving a loaded gun with a six-year-old,
you know yeah, and being like, well, I was pretty
(58:04):
sure it had a safety. Yeah. I just, I'm like... God, there's just... I miss when they were radicalized by FarmVille. Yeah. White supremacy. Yeah. So my
critiques aside, this internal report does provide us with a
lot of really useful information info that would have been
very helpful to experts seeking to stop the spread of
(58:26):
violent extremism online if they had had it back when
Facebook found it out. So it's rad that Facebook absolutely
never intended to release any of this. Isn't that cool?
And that they were never going to put any of
this out? There's, like, really useful data. I have a
quote in here. I don't think I'll read it because
it's a bunch of numbers and it's only really interesting
to nerds about this, but about like how many of
(58:47):
the people who get into Stop the Steal groups come in through, like, invites, and like how many people
are actually responsible for the invites. What the average number
of invites for a person is? Like, it's really interesting stuff.
I'll have the links for it in there. You can read these in the internal Facebook post. But you know what, I'll read the quote. Stop the Steal was able to grow rapidly through coordinated group invites. A majority of
(59:08):
Stop the Steal joins came through invites. Moreover, these invites were dominated by a handful of super-inviters: a large share of invites came from just three point three percent of inviters. Inviters also tended to be connected to one another through interactions. They comment on, tag, and share one another's content. Out of six thousand, four hundred and fifty high engagers, four thousand and twenty five of them were directly connected to one another,
(59:31):
meaning they interacted with one another's content or messaged one another, or, using the full information, seventy percent connected to one another. This suggests that the bulk of the Stop the Steal activity was happening as part of a cohesive movement. This would have been great data to have in January, right? That would have been really good to know. Yeah.
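To make that connectivity stat a little more concrete, here is a minimal sketch, purely illustrative and not Facebook's actual code or data. It shows one way you could compute "share of high engagers directly connected through interactions" from an edge list; the function name, user IDs, and edges are made-up assumptions, and the only real numbers are the ones quoted above (four thousand twenty five out of six thousand four hundred fifty, which works out to roughly sixty-two percent).

# Illustrative sketch only; assumes a simple list of interaction edges
# (comment, tag, share, or message between two accounts).
def share_directly_connected(high_engagers, interaction_edges):
    """Fraction of high engagers with at least one tie to another high engager."""
    engager_set = set(high_engagers)
    connected = set()
    for a, b in interaction_edges:
        # Only count ties where both ends are high engagers.
        if a in engager_set and b in engager_set:
            connected.add(a)
            connected.add(b)
    return len(connected) / len(engager_set)

# Toy usage with made-up IDs; on the report's real data this kind of query
# is what would yield 4,025 of 6,450 (about 62 percent) directly connected.
engagers = ["u1", "u2", "u3", "u4"]
edges = [("u1", "u2"), ("u2", "u3")]  # u4 has no direct ties
print(share_directly_connected(engagers, edges))  # prints 0.75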
That is speaking as a guy who does this professionally.
(59:53):
That would have been great to have. But this is all just internal. Like, okay, so we know, so we know this, let us never speak of it again. Problem is, yeah, now we'll deny it to anyone who alleges this while we have this excellent data that we will not hand out, because we're pieces of ship. Yeah yeah,
(01:00:14):
Come January six, Facebook employees were as horrified as anyone else by what happened in the Capitol. This horror was tweaked up several degrees by the undeniable fact that their work for the social network had helped to enable it. In that it was their fault. Yeah, in that, yeah,
in the same way that like when I finish having
a gasoline and match fight in my neighbor's house and
(01:00:34):
then, inexplicably, tragedy ensues, I can't help but feel somewhat responsible, you know. Wait, hold on, I'm feeling this vague melancholy, and I know it's not my fault. Don't worry, I know it's not my fault. Um, but I feel that way. No, you just lit a match. I mean, I think we can hold the match accountable. The match was responsible. The match, the neighbor for having a house. A lot of people
(01:00:55):
are to blame here them, That's on them exactly. So
they're all horrified. Everybody's horrified. Much of the riot itself
was broadcast in a series of Facebook live streams as
Mike Pence's Secret Service detail scrambled to extricate him from the Capitol grounds. Facebook employees tried to enact their break-the-glass emergency measures, originally conceived for the immediate
(01:01:16):
post election period, this was too little, too late. That evening,
Mark Zuckerberg posted a message on Facebook's internal messaging system
with the title Employee FYI. He claimed to
be horrified about what had happened and reiterated his company's
commitment to democracy. Chief technology officer Mike Schroepfer, one of
the most internally respected members of Facebook's leadership, also made
(01:01:36):
a post asking employees to hang in there while the
company decided on its next steps. The theme from the Trolls song, like when Jeffrey Katzenberg fired his entire staff. Hang in there, folks, hang in there, you're hanging in there. Here's an amazing song by Miss Anna Kendrick. That's right,
(01:01:57):
that's what he did. Uh. So he tells them this, uh, and an employee responds, we have been hanging in
there for years. We must demand more action from our
leaders at this point, faith alone is not sufficient. Another
message was more pointed, all due respect, but haven't we
had enough time to figure out how to manage discourse
without enabling violence. We've been fueling this fire for a
long time, and we shouldn't be surprised it's now out
(01:02:18):
of control. The Atlantic. Yeah, fair. The Atlantic... the poster with a little kitten hanging from the branch that says, we have been hanging in there for years? Yeah? Or how about, we have been fueling this fire for a long time and shouldn't be surprised it's now out of control. We could do that with the this-is-fine meme, the guy sitting in the fire. This is on us.
(01:02:40):
You know, we shouldn't be... this isn't surprising. We had ample warning of the fire. But hang in there, hang in there, kiddos. So I think The Atlantic has done
some of the best reporting I found on this particular
moment when Mark and Schrep get up and, like, say, don't we like hanging in there? We love democracy. And, like, people go off on them January six. Like, there are...
(01:03:03):
people were, like, a little more open about the frustration they felt about all this stuff then, and then they stopped being that open, um. It's frustrating, and everybody's treating these people with kid gloves, whatever. The Atlantic has done really good reporting on this, on this exact moment, which seems to have been kind of a dam-breaking situation
for unrest within the company. One employee wrote, what are
(01:03:23):
we going to do differently to make the future better?
If the answer is nothing, then we're just that apocryphal
Einstein quote on the definition of insanity. Another added, to Mike Schroepfer: please understand, I think you are a
great person and genuinely one of the good people in
leadership that keeps me here. But we cannot deal with
fundamentally bad faith actors by using good faith solutions. Um,
(01:03:44):
that's a good way to put it, but yeah, yeah, I would
also like democratic leadership to know that. But well, let's
not set the bar too high. They're they're not going
to figure that out now. In the wake of January six,
an awful lot of people, me included, exhaustively documented Facebook's contribution to the attack and criticized the company for enabling political violence on a grand scale. The company responded the way they always respond: with lies. Mark Zuckerberg told Congress in
(01:04:08):
March that Facebook quote did our part to secure the
integrity of our election. Sheryl Sandberg, boss girl and chief operating officer for the company, claimed in mid-January that the Capitol... yeah, claimed in mid-January that the Capitol riot was, quote, largely organized on platforms that don't
have our abilities to stop hate. I mean, Robert, first
of all, as you know, as a big Sandberg advocate,
(01:04:31):
you can't talk about her that way because she told
women that they should negotiate their own salary. You fucking loser.
Did you ever think about negotiating your own salary? Fucking
dweeb dollars? And I love that. I do love that too.
That's my favorite thing that she did. So Sandberg is
a lot smarter than Mark Zuckerberg, and her statement was
(01:04:53):
the very clever sort of not technically a lie that
spreads a lot more disinformation than just a normal lie. Whatever, man,
Because it's technically true that more actual organizing for the
riot was done in places like Parler as well as
more private messaging apps. But it's also a lie because
the organizing of the actual movement that spawned the riot, the Stop the Steal ship, that was almost entirely Facebook.
(01:05:14):
So like, yeah, people didn't like go and open Facebook
groups and do like most of the people didn't like
go in there and be like, Okay, we're doing a caravan,
although some people did and we have quotes of that. Um,
a lot of that happened in other apps, but like, overall, they wouldn't have been in those other apps if they had not first been on Facebook. No, yeah, that's what I'm saying. Like, that's why she's smarter than him, because Mark Zuckerberg is
(01:05:37):
just lying to Congress, um, because they didn't... Sheryl Sandberg is being very intelligent, um, and also kind of backhandedly complimenting Facebook in its hour of most blatant failure within the United States, at least. Not most blatant failure overall; that would be Myanmar. I mean, check the date that year, listen,
(01:05:57):
you know, check the date. As we record this, we may have had an ethnic cleansing enabled by Facebook by the time this episode drops. Yeah, as of this recording. Of the non-ethnic-cleansings, and also the things that have been done in the United States that were worse, that Facebook did... this tops the list. It does not top the list of their overall crimes, which include the deaths of tens of thousands of us. True. Yeah, good stuff, good stuff.
(01:06:22):
I just I felt like the need to celebrate I do.
Oh those were nice. Those are nice. My crush used
to send me those in high school. And now that
website is responsible for the deaths of tens of thousands
of people. Yeah, it's it's good stuff. So what I
find so utterly fascinating about the leaks we have of
Facebook employees responding to their bosses on the evening of
(01:06:43):
January six is that it makes it irrevocably, undeniably clear
that Zuckerberg and Sandberg and every other Facebook mouthpiece lied when they claimed the company had no responsibility for the violence on January six. The people who worked for Facebook on the day it happened immediately blamed their own company for the carnage. Quote: Really do appreciate this response, and
(01:07:03):
I can imagine the stress leadership is under, But this
feels like the only avenue where we opt to compare
ourselves to other companies, rather than taking our own lead.
If our headsets shocked someone, would we say, well, it's still much better than PlayStation VR and it's unprecedented technology? I wish I felt otherwise, but it's
simply not enough to say we're adapting, because we should
have adapted already long ago. The atrophy occurs when people
(01:07:25):
know how to circumvent our policies and we're too reactive
to stay ahead. There were dozens of Stop the Steal groups active until recently, and I doubt they minced words about their intentions. Again, hugely appreciate your response and
the dialogue, but I'm simply exhausted by the weight here.
We're at Facebook, not some naive startup. With the unprecedented
resources we have, we should do better. Yeah, but I've
(01:07:47):
that's, like, part of the Zuckerberg ethos, is to continue to behave like he's fast and... yeah. Or, well, or more iconically, um, what is the quote that's on my shirt? Um? Oh: and that's how I live
(01:08:09):
my life? Ha ha the key And he's still like, yeah,
I mean, and I feel like that is whatever. I'm
sure that does say so much about him, because like
a good person can say, you can break the law
and not be unethical, and that's how I live my life.
And that's fine, because the law is generally trash. Zuckerberg is
(01:08:30):
specifically saying, I get to be a piece of ship
as long as I don't technically break the law. And
because I have money, I'm never technically breaking the law.
Is that not sweet? And then don't forget ha ha
ha ha. You can be unethical and still be legal. That's the way I live my life. Ha ha. I'll never forget, I'll never, never, never forget. I'm getting that tattooed right above my come-back-with-a-warrant tramp stamp.
(01:08:54):
Yeah and yeah and then and I would also recommend
getting it on on the other side, right next to
your phoenix that I know you're you're planning out. Oh
full back. It's actually gonna be a perfect replica of
the tattoo that Ben Affleck has. And then over my
chest, a perfect photo-realistic tattoo of Ben Affleck picking up Dunkin' Donuts and looking like he's just watched his
(01:09:16):
dog shoot itself. No, I don't... I, like, appreciate his devotion to Dunkin' Donuts. I don't know how he... I mean, I guess he's just tired, because I'm like, I don't look that way at Dunkin'. Every picture I've seen of myself at Dunkin' Donuts, I look so happy to be there.
How could you not be thrilled? I don't know. I
don't know. I mean, i'd say it's Boston, but you're
from Boston, right, I'm from Boston and I'm so happy
(01:09:37):
to be there. Yeah. People keep saying, No, he doesn't
look miserable. He just looks like he's from Boston, and
I think he just as to that. I don't go
to other girls, Robert. I don't know if you know that.
I'm not like other girls. So I'm, I'm happy at the Dunkin' Donuts. Okay, okay, fair enough? Um, so yeah,
I don't know. At the beginning, I talked about the
(01:09:58):
fact that I have said I think working for Facebook
is an amoral activity today given what's known. That said,
there were some points made during this employee bitch session
that do make me kind of hesitant to suggest employees
just bounce from the company en masse. Quote: please continue
to fight for us, Schrep. This is the person talking to the CTO. I'm sorry, you care, you're hearing... I know,
(01:10:22):
I know, I know. Facebook Engineering needs you representing our
ethical standards in the highest levels of leadership, lest Zuck wants his products built by a cast of mercenaries and ghouls. We need to employ thousands of thoughtful, caring engineers, and
it will be difficult to continue to hire and retain
them on our present course. That's not a terrible point, Okay,
that feels like a half step. Yeah. Yeah, It's one
(01:10:45):
of those things where like, on one hand, it is
bad to work for a company whose entire job is
to do harm at scale, which is what Facebook does.
On the other hand, if they are replaced by people
who don't have any ethical standards at all, that also
probably isn't great. Now I don't agree with that, Yeah,
it's it's complicated because, like I guess you could argue that, like, well,
(01:11:07):
if all of the good engineers leave and they have
to hire the ghouls, like, it'll fall apart eventually. And
I guess it's the question of like when does the
damage done by Facebook, like fading in popularity hopefully eclipse
the damage done by the fact that everyone working there
is the Blackwater equivalent of a guy coding a fucking algorithm.
Like I don't know, it's yeah, it's whatever, it's a
(01:11:30):
it's just something to think about. I guess, um, yeah, yeah,
Facebook is having a hiring crisis right now. I think
it's gotten a little better recently. Um, but they've had
massive shortages of engineers and failed to meet their hiring
goals in two thousand nineteen and twenty twenty. I don't know if
they're gonna if they have, if it's gotten better this
year or not. Um, I don't know how much any
of that's gonna like help uh matters. I don't know.
(01:11:52):
It seems unlikely that anything will get better anytime. I mean, yeah,
I think the real solution is to make the company run by even worse people who are less qualified... that, I don't know. That doesn't, that doesn't sound great either. I
don't know. I think there are volcanoes and that probably
has part of the solution to the Facebook problem in it.
There you go, cast their servers into the fires. So
(01:12:15):
Mark Zuckerberg and his fellow Nerd horsemen of the Apocalypse
have basically built a gun that fires automatically into a
crowd called society every couple of seconds. The engineers are
the people who keep the gun loaded. It's good, yes,
So the engineers they keep the gun loaded, but also
sometimes they jerk it away from shooting a child in
the face, and if they leave, the gun
(01:12:37):
might run out of bullets. But it's just going to
keep shooting straight into the crowd until that point. So
maybe it's better to have engineers. Yeah, I had to
end with a metaphor um. I don't know, it's complicated
whatever I want to end this episode. Yeah, it's it's
just a mess. It's a messy thing to think about.
We should never have let it get this far. We've
been putting fuel on the fire for a while. We
(01:12:58):
shouldn't be surprised that it's burning everything. Truly, it has become... it feels like it is slowly becoming just, like, an annual document drop of, like, yeah, things have steadily gotten worse. Yeah, the Hell Company is pretty shitty here. Yeah, Hell Company's bad. Stuff at Nightmare Corp: bad. It is
(01:13:22):
really bad. It's funny how bad it is. I want
to end this episode with my very favorite response from
a Facebook employee to that message from CTO Schrep. Facebook employee, if you happen to be listening to this episode,
please hit me up because I love you. Here it
is: never forget the day Trump rode down the escalator in two thousand fifteen and called for a ban on Muslims
(01:13:42):
entering the US, and we determined that it violated our policies,
and yet we explicitly overrode the policy and didn't take
the video down. There was a straight line that can
be drawn from that day to today, one of the
darkest days in the history of democracy and self governance.
Would it have made a difference in the end, We
can never know, but history will not judge us kindly. Wow. Yeah, yeah,
(01:14:02):
they know what they're doing. They know exactly what they're doing,
and we get an annual reminder, we get an annual
little Pelican summit drop of documents saying that no, they
still know what they're doing. They have not forgotten what
they're doing. I might suggest that overtakes "Are we the baddies?" as, like, the moment you know things need to change?
(01:14:24):
When when you're like, boy, I think history might judge
me for my employment decisions. I think I may be
damned by like the historians of the future when they
analyze my role in society. Right, And it's like, if
you're at that place, uh, that's not good. But we've passed the point of no return, you know, like five
(01:14:45):
years ago with Facebook. It's just, good Lord. I
mean yeah, and I and I do like applaud the
whistleblowers and the people who are continuing to drop documents.
And at this point it also does just feel like,
you know, getting punched in the face repeatedly because it's like, well,
I'm glad that there is the paper trail, I'm glad
that there's the evidence, but it's who at this point
(01:15:05):
is surprised? Like, there's, like... when people... it's whatever. I mean, technically,
I think by the definition of the word, these are revelations,
but they're also very much not They're just confirmations of
things that appeared very obvious based on the conduct of
this company already. Yep, yeah, Jamie, do you have any
plugables for us? Do you do you perhaps have a
(01:15:28):
Facebook-owned Instagram meta thing? We don't... we don't need to call it Meta. I, I think no one needs to call it that. There's only one Meta and it's Metta World Peace. Wow. Basketball player, basketball player Ron Artest changed his name to Metta World Peace, like, many years ago, so he did the Meta first. Okay, alright,
(01:15:51):
the metaverse was already taken. I... you can, you can listen to my podcasts. I got a bunch. You can listen to The Bechdel Cast, My Year in Mensa, Aack Cast, Lolita Podcast, or none. I won't know. Well, actually, Sophie will know. So maybe you'd better
listen to them, or I'll lose my livelihood, which would
(01:16:12):
be which would be interesting? Uh? You could follow me
on Twitter at Jamie Loftus Help, or Instagram at Jamie Christ Superstar, where I bravely continue to use, uh, Zuckerberg's tools of having a livelihood. Yeah, isn't it funny, Jamie, that we call it
our livelihood, which is just a nicer way of saying
(01:16:33):
our ability to not starve to death in the street. Yeah? Yeah, whoever,
whoever back in the day rebranded survival to livelihood. Really
real genius, sleight of hand. Incredible. This is why I
think we should have a program in schools where we
determine which kids are going to be good at marketing,
(01:16:54):
and then we shipped them to an island, and we
don't we don't think about it after that point into
the island. If they go, it's a nice island, like
a good one, like a solid island. Marketing the island
you're marketing to the island. That's good. I think we
put them on an island and we divert all of
the world's military forces to making sure that nothing gets
to that island that isn't a marketer or leaves it,
(01:17:15):
and then make sure that... and, and make fucking sure there's no WiFi signal on the island. Oh good god. No, absolutely not, absolutely none. Once a day they can
watch Shark Tank together, but that's all they're getting. Oh,
that sounds... that actually kind of sounds nice, living on an island and watching one episode of Shark Tank a day.
That's like, that's my like dream lobotomy. That's kind of nice. Yeah,
(01:17:38):
well, that's the episode, y'all. Robert, Jamie, how are you doing?
How are you feeling? I feel like, you know, I
am feeling just kind of a low thrumming of despair,
but but I have good Yeah, that's right, that's how
you're supposed to feel. I have felt worse at the
end of this show, which I don't know if that
says more about like my threshold for despair or you know,
(01:18:00):
happens to be a coincidence, but you know, I'll say
I'm hanging in there. But also I've been hanging in
there for years. Yeah, that's all you can do is
hang in there. Yeah, keep hanging in Are you hanging
in there? Robert allegedly Yeah. I mean, by all accounts,
you're hanging in there. But like internally, there's no who
could say no. No, I'm I'm, I'm. I'm as unmoored
(01:18:21):
and adrift as the rest of us are in these
I'm gonna... when I visit, I'm going to show you Pin, and I really think you're gonna like it. Oh good, okay. Well,
how's