Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Cool Zone Media. Hello, and welcome to Better Offline. I'm
your host, Ed Zitron. Over the last decade, few
platforms have declined quite as much as Facebook and Instagram.
(00:25):
What used to be apps for, like, catching up with
your friends and family and seeing things from bands you
might like are now pretty much algorithmic nightmares, and they
interrupt you constantly with suggested content and advertisements that far
outweigh the stuff you actually logged onto the platform to see. Conversely,
those running Facebook groups or pages routinely find that their
content isn't even being shown to people that follow them,
(00:49):
and that's all thanks to Meta's basically abusive approach to
social media. The customer is not only always wrong, but
they ideally will have little to no control over the
things that they see on the platform that monetizes
them. It's all very frustrating, and in the next
two episodes, I'm going to walk you through the decline
of these platforms, in particular Facebook and Meta of course
(01:12):
as a company, and I'm going to start today with
the events that led to its decay and those I
believe are responsible for turning the world's most popular consumer
apps into Skinner boxes for advertising agencies. In this episode,
I want to frame Facebook's decline around two specific
features: People You May Know, which is an innocent yet
(01:33):
kind of horrible feature that really explains itself, and the
news feed, which is the central hub of information on
the platform, one that's become so algorithmically charged that it's
far more about what Facebook wants to show you than
any conscious action by a user. And it's my belief
that these products, piloted by the deeply insidious people led
by Mark Zuckerberg himself, are central to the darkness that
(01:56):
has enveloped this company. But I want to be clear,
none of this is really new. It's a common trope that, oh, well,
Facebook went downhill because they went public, which is partially true,
don't get me wrong, But I really want to be
clear that the rot began early. Facebook has always been
(02:17):
a rotten, manipulative company, and they've always acted with very little
regard for their users, even at times when they pretended
to do so. The goal's always been more, not just
more money, but more engagement, more time on app, more
visits to the website, perpetual growth, all mandated by Mark
Zuckerberg himself, and a rogue's gallery of rot economist assholes.
(02:41):
But as ever, I need to give you a little
history lesson, so we need to go back in time.
We start our story in two thousand and four, a
few months after the company was founded, when Peter Thiel
became Facebook's first investor, and he put five hundred thousand
dollars into the company and got an astonishing ten point
two percent stake. As you might have seen in twenty
ten's The Social Network by Aaron Sorkin, Thiel was introduced
(03:04):
to Zuckerberg by Sean Parker, the co founder of Napster
and if you're a little younger than I am, that
was a website used to illegally download music,
and he was fresh from a beatdown by Lars
Ulrich and Metallica, who again, if you're younger than me,
much younger, I guess. They sued Sean Parker, who was
running Napster, which was basically, imagine if Spotify was just stealing everything. Anyway,
(03:27):
They sued him, he had to shut down the website,
made him very sad. But what was significant about this
event when Sean Parker introduced Thiel to Mark Zuckerberg wasn't
really the people involved or the amount of money, but the
terms of the deal, terms that would permanently and inextricably
ruin Facebook forever. Parker, ever the advocate for founders,
(03:48):
negotiated with Peter Thiel to allow Mark Zuckerberg to
retain two of Facebook's five board seats. When Parker resigned
in two thousand and five, he insisted that Mark Zuckerberg
be given his seat. Just to be clear, that's three
out of five board seats for one guy. Unilateral power.
While this may have seemed normal and maybe even
(04:09):
ideal for a founder in these early days, this single
move has allowed Mark Zuckerberg to wield complete power over Facebook,
which I know is now called Meta. And it was
the first step along a road that made him completely
impossible to fire, which has doomed this company to
this endless, miserable death march, all thanks to Harvard's most
(04:30):
quirked up white boy, Mark Zuckerberg. I really cannot express
how important this moment was. This was when the mistake
was made, albeit innocently, to keep Zuckerberg in power. I'd
be kind and suggest that Sean Parker didn't know what
Zuckerberg was like at this point, except for the fact
that he absolutely did and he should have known better,
or maybe it's more about Sean Parker just selecting people poorly.
(04:54):
I guess we'll never know. A few years later, in
April two thousand and six, Zuckerberg was once again able
to negotiate terms that kept him at the top, in
part thanks to Facebook's pretty strong revenue for the time,
six million dollars a year, pretty impressive for
a social network, especially at the time, by the way,
very early days, and the fact that he owned three
(05:15):
out of the five board seats and could do whatever
the hell he wanted allowed him to pretty much dictate
the terms. Venture capitalists were desperate to avoid missing out
on this company. It had rocket-ship growth, and at the
time MySpace was actually a very successful business, and
thus what were they going to do, try and negotiate
with the petty king? Absolutely not. They needed that cash.
(05:37):
In September two thousand and six, Facebook would launch the newsfeed,
a relatively unsophisticated chronological feed of your friends' statuses
that is, on some level, kind of one of the
most important launches in software history, and it's a great
example of a good product that got completely bulldozed by
executives that don't really care about their users and want
(05:58):
to just use advertising and algorithms to torture them.
The feed, which Facebook product manager Ruchi Sanghvi described
as quite unlike anything you can find on the web,
led to an immediate revolt among Facebook's nine point five
million users at the time, as it effectively showed a
stream of literally every action you took on the platform,
which users found and I quote creepy and stalker esque,
(06:20):
according to a group made at the time called Students
against Facebook Newsfeed. For those who weren't around when it
launched or didn't use Facebook, which I don't blame you for,
this is all kind of hard to grasp. It would
show you literally every activity you and your friends did.
If you joined or left a group, or made a
new connection, everyone could see it. It was almost like
(06:42):
a panopticon, except you were both the jailer and the prisoner, somehow.
A few days later, Zuckerberg would apologize, saying that Facebook
and I quote really messed this one up, and that
Facebook spent two days coding non stop to add a
special privacy page that allowed users to choose which types
of stories went into the feed. While this feature was
(07:03):
actually released, it's important to know that the rest of
the world might not have known what you were doing afterwards,
but Mark Zuckerberg and Facebook absolutely did. This was the
same year that Facebook opened to the general public. It had
previously been just for college students, then for certain organizations,
and I think then they had high schoolers. But then
they opened up to everyone. And they also turned down
(07:26):
a billion dollar acquisition offer from Yahoo and signed a
three year advertising deal with Microsoft. Ah, Yahoo in
two thousand and six, which was where Prabhakar Raghavan worked, by the way. Anyway,
moving on. Yet Facebook's rot really began in earnest in
two thousand and eight, when a former McKinsey analyst,
then the vice president of Global Online Sales and Operations
(07:48):
at Google, was made chief operating officer. Her name
was Sheryl Sandberg, and she'd be the force that would
grow Facebook into one of the most evil companies alive.
Now that I've given you the basic history and the
(08:08):
basic terms here, you've got Mark Zuckerberg completely in power, cannot
be fired, never will be fired. And then you've got
a COO, chief operating officer, Sheryl Sandberg, former McKinsey analyst,
chief of staff under Larry Summers at the Department of the Treasury. A
real management consultant brain is in here, and it's only
(08:30):
going to get worse. According to an excerpt from Stephen
Levy's Facebook The Inside Story, sometime in two thousand and eight,
Facebook's growth had stalled somewhere around ninety million users, with
Zuckerberg telling Levy in an interview that his company had
hit a wall. Chamath Palihapitiya, a former venture capitalist who
was at the time Facebook's VP of Platform and Monetization,
(08:51):
came to Mark Zuckerberg with an idea: that Facebook should
focus on a new metric, monthly active users. This metric,
which is now a cornerstone of most tech companies' growth metrics,
would be a measure of whether the users were actually
sticking around on the platform, and boosting this number would
mean that users were coming to Facebook and they were staying.
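To make that metric concrete, here's a minimal sketch of an MAU count in Python, with entirely made-up data and a hypothetical function name; Facebook's real pipeline is obviously far bigger, but the core idea is just counting distinct users who did anything at all in a given month.

```python
from datetime import date

# Hypothetical activity log of (user_id, day_of_activity). Made-up data.
activity_log = [
    ("user_a", date(2008, 6, 2)),
    ("user_a", date(2008, 6, 20)),  # same user twice still counts once
    ("user_b", date(2008, 6, 5)),
    ("user_c", date(2008, 5, 30)),  # active in May, not June
]

def monthly_active_users(log, year, month):
    """MAU: the number of DISTINCT users with at least one action in the
    given month. Distinct is the whole point: a hundred visits from one
    person count once, so the number only rises if people stick around."""
    return len({user for user, day in log
                if day.year == year and day.month == month})

print(monthly_active_users(activity_log, 2008, 6))  # -> 2
```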
(09:12):
Please remember this metric. It's going to be important in
the second episode, and even more important right now. In
a meeting with Zuckerberg and Sandberg, Chamath was asked to
name it, and he came up with the term MAUs.
Sandberg responded that they should just call it growth. At
the next board meeting, Chamath would present what Levy would
(09:32):
call aggressive growth techniques that would double or even triple
Facebook's user base and then use the platform itself as
an engine for growth. After a tepid response at the meeting,
Chamath was given license to build a new growth team
from both within and outside of Facebook, one that would
eventually become pretty much the most evil super team in tech,
(09:54):
like the Avengers of Assholes, including Facebook head of product
Naomi Gleit, and Alex Schultz, who would go on to become
Meta's chief marketing officer and VP of Analytics, along with
fellow teammate Javier Olivan, who replaced Sandberg as chief operating
officer in twenty twenty two. All of these people, now,
by the way, have senior roles at the company. I
(10:14):
want you to remember that throughout this story. Gleit, Olivan,
and Schultz are at the epicenter of almost every single
choice that Meta has made to put growth above the
user experience. Chamath wanted their team, which also included early
Facebook data scientists Danny Ferrante and Blake Ross, who previously
co created the Firefox web browser, to become and I
(10:35):
quote the growth circle, what Levy refers to as a
power center in the company, with special status and distinctive subcultures.
Not great, but he succeeded. Chamath succeeded. In two thousand
and eight, Facebook would launch a feature called People You
May Know, a seemingly innocent feature that would, as the
(10:57):
name suggests, suggest people that you might know on Facebook.
Yet this feature seemed a little too good at its job.
Enter reporter Kashmir Hill, who spent over a year investigating
the feature for Gizmodo between twenty sixteen and twenty seventeen.
Hill said that it mined information users didn't have control
over to make connections that they might not actually want
(11:19):
to make, such as suggesting patients find their psychiatrist or
outing a sex worker's real name to her clients. Despite
doggedly researching People You May Know, Hill never really got
Facebook to explain how it works. Nevertheless, Stephen Levy was
able to get Chamath to reveal one horrifying detail: that
(11:40):
Facebook's growth team would take out Google ads on people's names,
targeting those who hadn't joined Facebook, with something called a
dark profile, a fake link that would suggest that somebody
was already using your name on Facebook, conning you into
going to the site. This is absolutely goddamn disgraceful. It's
insane to me that more people don't know about this and
(12:01):
that more people aren't angrier. And by the way, you're wondering
how they found out who the Facebook holdouts were? Were
you wondering where that came from? Well, it's because they
would happily ingest your entire contact list, allowing them to
see which friends were on there and which weren't, using
that data. And you may think, Ed, you're jumping to
bloody conclusions, you're just putting words in their mouths. Incorrect.
(12:25):
Now, the CTO of Facebook, a guy called Boz, Andrew Bosworth,
and I'll get to him later, has already said, quite
literally, that that feature was justifiable. But we continue. While
many might consider People You May Know a well
meaning feature, indeed a useful one for a social network,
the algorithm behind the service is powerful and sinister, and
(12:48):
it's capable of dragging up these distant relationships, honed
by both the data you give Facebook and other shadowy
sources that we may never know. I do, however, believe
there's another person responsible, a man named Lars Backstrom, a
relatively unknown yet extremely powerful figure in Facebook's history. Backstrom's
(13:10):
LinkedIn notes that he built PYMK, People You May Know,
and he built its back end infrastructure and machine learning
system starting from September two thousand and nine through February
twenty twelve. In twenty thirteen, and this is nasty by
the way, Backstrom published a paper in tandem with a
computer scientist called Jon Kleinberg, and it was called Romantic
Partnerships and the Dispersion of Social Ties: A Network Analysis
(13:33):
of Relationship Status on Facebook. Try saying that fast twice.
It's actually very difficult. This paper focuses on an algorithm
that was able to independently identify someone's spouse,
according to Stephen Levy, sixty percent of the time, and
it was even capable of predicting when someone would break
up with someone else. The paper hinged on the idea
(13:56):
that the raw number of mutual friends isn't really an indicator of a couple's
relationship status, but rather the dispersion of those mutual friends,
the extent to which those mutual friends aren't themselves connected
to each other. This is a sensible idea that, when
framed as an algorithm made by an engineer who made
Facebook's extremely successful growth tool, feels a lot more creepy.
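To make the dispersion idea concrete, here's a minimal sketch of the paper's basic measure on a made-up toy graph; the published system adds normalization and a recursive variant that I'm leaving out, so treat this as an illustration, not the authors' production code.

```python
from itertools import combinations

# Toy friendship graph as adjacency sets. All names are invented.
graph = {
    "alice": {"bob", "carol", "dan", "erin", "frank"},
    "bob":   {"alice", "carol", "dan", "erin", "frank"},
    "carol": {"alice", "bob", "dan"},
    "dan":   {"alice", "bob", "carol"},
    "erin":  {"alice", "bob", "frank"},
    "frank": {"alice", "bob", "erin"},
}

def embeddedness(g, u, v):
    """The naive signal: how many mutual friends u and v share."""
    return len(g[u] & g[v])

def dispersion(g, u, v):
    """The paper's basic signal: count pairs of u and v's mutual friends
    who are neither friends with each other nor share another mutual
    friend besides u and v. A high score means u and v together bridge
    social circles that otherwise never touch, which is exactly what a
    spouse or partner tends to do."""
    mutual = g[u] & g[v]
    score = 0
    for s, t in combinations(sorted(mutual), 2):
        linked_directly = t in g[s]
        shared_besides_uv = (g[s] & g[t]) - {u, v}
        if not linked_directly and not shared_besides_uv:
            score += 1
    return score

# Bob and carol both share friends with alice, but only bob's mutual
# friends with alice span circles that don't overlap at all.
for candidate in ("bob", "carol"):
    print(candidate,
          embeddedness(graph, "alice", candidate),
          dispersion(graph, "alice", candidate))
# bob scores 4 on dispersion, carol 0: the algorithm flags bob as the partner.
```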
(14:18):
Backstrom, in a twenty ten talk relayed again by Stephen Levy,
said that People You May Know accounted for a significant
chunk of all friending on Facebook, and that friends of
friends are the most powerful part of the tool.
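Facebook has never explained how People You May Know actually ranks anyone, so what follows is only a toy sketch of the friends-of-friends signal Backstrom is describing, with invented names; even this crude version shows how a second-degree scan can surface exactly the connections Kashmir Hill warned about.

```python
from collections import Counter

# Another toy adjacency-set friendship graph. All names are invented.
friends = {
    "you":               {"amir", "bella"},
    "amir":              {"you", "bella", "your_psychiatrist"},
    "bella":             {"you", "amir", "your_psychiatrist", "old_classmate"},
    "your_psychiatrist": {"amir", "bella"},
    "old_classmate":     {"bella"},
}

def people_you_may_know(g, user, top_n=5):
    """Toy friend-of-friend recommender: score everyone exactly two hops
    away by how many mutual friends they share with `user`, then return
    the highest-scoring candidates."""
    scores = Counter()
    for friend in g[user]:
        for fof in g[friend]:
            if fof != user and fof not in g[user]:
                scores[fof] += 1  # one more mutual friend in common
    return scores.most_common(top_n)

print(people_you_may_know(friends, "you"))
# -> [('your_psychiatrist', 2), ('old_classmate', 1)]
```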
People You May Know's power wasn't just that it found
people that you knew well, but people that you kind
of saw around you, offering you the tantalizing idea of
(14:39):
proximity to them. And when you friended them, Facebook's news
feed would automatically make their content more prevalent, forcing a
kind of intimacy that may not have actually existed
by making a new connection, even if it was super tangential,
and the feed made it feel more immediate, giving Facebook
more power over you and also actively manipulating you. And this
(15:00):
isn't even the half of it. This subtle feature is
responsible for much of Facebook's growth, and it was deliberately
engineered by rot economists like Schultz, Gleit and Olivan to
do so at any cost, even if it endangered the
lives of children. And by the way, I'm not being dramatic.
According to the Wall Street Journal, in twenty eighteen, David Erb,
(15:21):
an engineer in charge of one of Facebook's community integrity teams,
found that and I quote, the most frequent way adults
found children to prey upon was using People You May Know.
The Journal also reported that a few days later, Erb
found that Meta was planning to add encryption to the
Facebook messages system on the platform, something that would prevent
(15:42):
the company from fixing this problem, and he threatened to
resign in protest. Now you'd think at this point that
the company would go, ah, yeah, that sounds dangerous, so
we probably don't want it. No, no, no. They placed
him on administrative leave, and then he eventually resigned. Facebook,
by the way, did introduce encryption to its messages despite
Erb's stark warning, and this is a quote, that millions
(16:04):
of pedophiles were targeting tens of millions of children. People
you may know was and is a dangerous tool. And
it was created and maintained and proliferated by people that
now effectively run Meta. Schultz is the CMO, Bosworth is
the CTO, Olivan is the COO, Gleit is the head
(16:25):
of product and from what I've been told by sources,
pretty much the Darth Vader to Mark Zuckerberg's Emperor. And
if you've not watched Star Wars, just google it. I'm
not gonna help you there. Once Palihapitiya left Facebook in
twenty eleven to start a venture capital firm, his team, Schultz,
Gleit and Olivan, continued to accumulate power, and as
(16:46):
I've mentioned, I believe these people, along with Lars Backstrom,
are responsible for making People You May Know into this reckless, dangerous,
scurrilous growth tool for a company that doesn't really seem
to give a shit about its users. But we continue
because they're not done rotting yet. In January twenty twelve,
(17:06):
that's when it really would set in, with the launch
of sponsored stories in Facebook's news feed, which Josh Constine,
allegedly a journalist at the time, claimed he would rather
see, as they informed him about the activity of friends, than traditional
ads that can be much less relevant, a statement that
frames exactly how little the tech media criticized this company.
(17:27):
Makes me feel crazy. He also added one interesting tidbit.
That Facebook had tested letting advertisers pay for sponsored content
in the news feed in two thousand and six, but
discontinued doing so in two thousand and eight because they
decided that advertisers shouldn't be able to show content in
the news feed unless it could appear there naturally, a
(17:48):
statement that Constine wrote without a single hint of alarm.
Isn't that funny, though? It is really funny that half
of these companies have said at some point, no, we will
never build the Torment Nexus, then a decade later it's like, yep,
we got the Torment Nexus going, fellas, it looks great,
it's making us billions. Anyway. In February twenty twelve, Lars Backstrom,
as mentioned, the architect of People You May Know, would
(18:11):
move over to manage Facebook's news feed rankings, and then
a few months later, Facebook would acquire Instagram for a
billion dollars, possibly one of the best acquisitions in history,
at least for a company. In May twenty twelve, Facebook
would have what would be considered a disastrous IPO, with
Wall Street concerned about its lack of growth on mobile devices.
(18:31):
After its first day of trading, the share price was
barely over its debut, and by September it had shed
more than half of its value. Something had to change,
and sometime in twenty twelve, Facebook would find one man,
one man who could fix these problems. They'd promote a
product manager called Adam Mosseri to the head of Facebook's
newsfeed division. It was during this period when it started
(18:54):
recommending content into your feed. In Mark Zuckerberg's statement of
intent to potential investors in Facebook's IPO, a standard letter
that you'll see in pretty much every run up to
a public offering, he declared that there was a huge
need and huge opportunity to get everyone in the world connected. Well,
this is what I like to call phase one of Facebook's decline.
(19:16):
Behind the curtain, Zuckerberg's vision wasn't, by the way, to
get everyone connected in the world or anything like that.
He wanted to connect everyone in the world to Facebook,
and by the end of twenty twelve, the site had
a billion users, or about a seventh of humanity. He also
made one very weird statement in this letter. He said
(19:38):
that Facebook didn't build services to make money, but they
made money to build services, and as I'll shortly reveal,
he was completely lying. Look, I'll level with you.
For over a decade, Facebook has deliberately made itself worse
to make more money. Thanks to Mark Zuckerberg and his
growth team perpetuating a culture that manipulates and tortures users
(20:01):
to make number go up. It really is that simple.
While reporting this, I genuinely had to check with multiple
people just to make sure I hadn't accidentally written a
rot economy fan fiction piece. This shit is completely bonkers
and also a lot of it is public. Has no one put
this together? Anyway. In a twenty sixteen memo leaked to BuzzFeed
(20:23):
several years ago, then vice president Andrew 'Boz' Bosworth, now
Facebook's chief technology officer, wrote a horrifying screed. It's a
real stomach turner, and it romanticized Facebook's growth at all
cost culture. Boz went on to note that connecting
people on Facebook, and these are quotes, by the way,
(20:43):
may cost a life by exposing someone to bullies, and
that and again I quote, maybe someone dies in a
terrorist attack coordinated using Facebook's tools, but that because Facebook's
mission was to connect people, all the work that Facebook
does in growth is justified. Boz makes it clear that
this is a moral judgment by adding that again I quote,
(21:06):
all the questionable contact importing practices, such as the one, by
the way, that I mentioned with People You May Know,
all the subtle language that helps people stay searchable to
their friends, which again refers to terms and conditions, and
all the work that Facebook does to bring more communication
in was justified in the pursuit of growth. This man,
(21:27):
this horrible, nasty man, is the chief technology officer of
Meta, and he's the mastermind, by the way, behind both Meta's
metaverse and their artificial intelligence goals. This is the guy.
This is the guy. Anyway, as I'll explain and as
you can probably kind of work out right now, Facebook
(21:47):
is culturally a rot economy company. It is at its
core a growth at all costs enterprise, and they will
make any changes they need to their products to make
a number go up, and it can be as simple
as revenue, or it can be a completely made up
number that they made up just to make Mark Zuckerberg happy.
In December twenty twenty, a Facebook engineer published a document
(22:10):
that they wrote in twenty nineteen called When User Engagement
Doesn't Equal User Value, shared with me by a source
that assures me that these documents are available through Harvard.
This remarkable document is a stark warning, one posted internally
at the company, and it's warning them that Facebook's brutal
(22:31):
focus on user engagement and I quote time spent on
the app damages the user experience, and the person in question
explains it through the context of cable TV versus
broadcast TV, and how cable TV needs to maximize its
value for its fixed subscription fee, and that broadcast TV
needs to maximize people watching shows intently because one makes
(22:52):
money off of subscriptions and one makes money off of advertising,
which means people need to really watch it in the
case of broadcast TV, and what the engineer was
getting at when they were talking about this was that
this sometimes leads broadcast television to do some nasty things,
some kind of anti user things, by having constant recaps,
padding scenes with flashbacks, adding cliffhangers, giving people a reason
(23:15):
to keep watching that isn't just that the show's good. But
I'm going to continue here. The engineer then goes on
to explain several ways that Facebook has deliberately made its
products worse to maximize engagement. They list how Facebook deliberately
limited the amount of information in notifications to make people
come to the site to increase engagement, as people had
(23:37):
to keep checking to see what was happening and couldn't
rely on notifications for, well, I don't know, notifying them
about stuff. The engineer in question referred to this as a
clear value-engagement trade-off. Facebook also deliberately stopped sending
out emails telling people what happened on their feed, so
that they would have to visit the site. Facebook would also,
(23:58):
by maximizing time spent on the site, incentivize a kind
of bad ranking, I imagine, because when you're incentivizing things
to keep people on the site, you're not actually trying
to show them the things they want. You're trying to
get in the way of it. One other note is that
Facebook's algorithm kept prioritizing headlines that said misleading things and
(24:18):
subtly exaggerated things to get people to click links, and
the Engineering Question says it did this all the time,
and the solution to deal with it, again quoting was
very crude. To be clear, all of these things were
deliberate actions by Facebook. Facebook could put more information in
notifications, and indeed they used to. Facebook could get rid
(24:38):
of clickbaity stuff like this, but doing that would bring
the numbers down, and that just isn't gonna make Mark
Zuckerberg happy. This piece was a dire warning to Facebook's
internal staff, and it includes numerous really worrying warnings
about what the company's doing to keep people engaged. It's
layered with evidence from news and academia. The engineer warns
(25:01):
that higher Facebook use is correlated to worse psychological states,
that an experiment found that a month-long break from
Facebook improved self reported well being in people. They found
that a large fraction of Facebook users struggled with their
Facebook and Instagram use, and that a significant minority of users,
about three point one percent of them at the time,
(25:22):
reported serious problems with sleep, work, or relationships that they
attributed directly to Facebook, and concerns or preoccupations with how
they used the website. Underneath these warnings, the engineer gave
some suggested solutions, and they noted that over the last
couple of years, Facebook's news feed has slowly switched from
maximizing time spent towards maximizing sessions, meaning that Facebook had
(25:46):
now changed to focus less on how long a user
was on the platform and more towards how many times
they visited, which the engineer noted was a strong predictor
of problematic use.
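As a concrete illustration of the difference between those two metrics, here's a small sketch computing both from the same hypothetical app-open log; the thirty-minute session cutoff is a common industry convention I'm assuming here, not a documented Facebook value.

```python
from datetime import datetime, timedelta

# Hypothetical app-event timestamps for one user on one day. Made-up data.
events = [
    datetime(2019, 6, 1, 9, 0),
    datetime(2019, 6, 1, 9, 25),   # 25 minutes later: same session
    datetime(2019, 6, 1, 13, 0),   # hours later: a second session
    datetime(2019, 6, 1, 21, 30),  # a third visit that evening
]

SESSION_GAP = timedelta(minutes=30)  # assumed cutoff, not Facebook's number

def time_spent(ts, gap=SESSION_GAP):
    """Sum the gaps between events inside a session: this metric rewards
    keeping someone glued to the app for as long as possible."""
    ts = sorted(ts)
    return sum(((b - a) for a, b in zip(ts, ts[1:]) if b - a <= gap),
               timedelta())

def session_count(ts, gap=SESSION_GAP):
    """Count distinct visits: a new session starts whenever the gap since
    the previous event exceeds `gap`. Maximizing this rewards getting
    people to come BACK over and over, which is the pattern the engineer
    flagged as a strong predictor of problematic use."""
    ts = sorted(ts)
    return (1 if ts else 0) + sum(1 for a, b in zip(ts, ts[1:]) if b - a > gap)

print(time_spent(events), session_count(events))  # -> 0:25:00 3
```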
This document, again, just to be clear, written in twenty
nineteen and published in twenty twenty, really frames how craven
Facebook was about twisting its users' habits, and
(26:06):
how intimately aware the company was of these problems. One
commenter on the piece added, in a larger thought about
addiction that they personally worried that driving sessions incentivized Facebook
to make its products more addictive without providing much more value,
and that, my friends, is exactly what Facebook has become,
a gratuitous social experiment where the customer is manipulated to
(26:29):
make a number go up. And for over a decade,
as I have said, Facebook has knowingly and repeatedly taken
these steps to maximize activities on the website at the
direct cost of the user experience, and they've made countless
tweaks to the product to increase internal metrics, metrics obsessed
over by Mark Zuckerberg, Alex Schultz, and Javier Olivan. In
(27:01):
another document from October twenty twenty provided to me by
the same source, an engineer explained why the Facebook app
took on sessions as a top-line metric in twenty nineteen.
This piece, posted on an internal channel, centered around Facebook's
app strategy of building and I quote socially powered services
that were focused on and I quote again the possibilities
(27:22):
of what people could do with Facebook, largely because Facebook
has no easy way to measure how it satisfies users'
needs. And by the way, that last part is
a quote. This document's important because it reveals some very
very specific details about the company, such as that in
twenty fourteen, Mark Zuckerberg unilaterally decided that time spent on
(27:42):
Facebook was a top level engagement goal and forced it
upon the News Feed team despite their protests that it was
too easy to game. Now you may think too easy
to game, right, that's referring to exterior actors. Surely, right,
there's just some people who would manipulate the feed. No no,
no, no no. When someone at Facebook says something is too
(28:03):
easy to game, it means that they're warning that people
at Facebook will build products specifically to game the system
to hit those metrics, even if it means making them worse,
such as in the case of time spent on the app,
making the product more convoluted to use, meaning that people
use the app more because it takes more effort to
do the thing you want to. Hey, this kind of
(28:26):
reminds me of something. When Prabhakar Raghavan led a coup
to take over Google Search from Ben Gomes, I mentioned this
a few episodes ago, his specific obsession was with queries, or
how many searches people made on Google. Mark Zuckerberg is
effectively doing the same thing here, demanding in twenty fourteen,
as I mentioned, that Facebook increase two specific metrics, daily
(28:47):
active people and time spent on the app, and he
was looking for a perpetual increase, ten percent year over
year. It didn't matter if these things made the
product worse or didn't actually show that their users
were happy, enjoying the website, or finding it useful.
All that mattered was that the number went up. But I continue.
(29:10):
The document notes that in twenty seventeen, engagement metrics started
to take a dive, but the company's focus on time
spent meant that nobody actually noticed because the number that
Mark Zuckerberg cared about was going up until the alarm
was sounded by some internal engineers, and then Facebook moved
to the concept of sessions. To be clear, time spent
always stuck around, but it was no longer the prize
(29:33):
pig that must be fattened up. In this document, they
discuss a term called meaningful interactions, which is the underlying
metric which allegedly guides Facebook today. In January twenty eighteen,
Adam Mosseri, then head of News Feed, would post an update
about the feed, claiming it would now prioritize posts that
spark conversations and meaningful interactions between people, which may explain
(29:56):
both the chaos and the rot in the news feed thereafter.
And to be clear, you've really got to think about that,
got to think about what he meant with that. Prioritizing
posts that spark conversations and meaningful interactions sounds like engagement
bait to me. And if you've used Facebook at any
time in the last few years, I don't think you're
going to disagree. As I've said, metrics around time spent
(30:17):
hung around this company like a stinky fart, especially with
regard to video, and Facebook has repeatedly and intentionally made
changes to manipulate its users to satisfy these metrics. In
his book Broken Code, Jeff Horwitz notes that Facebook changed
its news feed design to encourage people to click on
the reshare button or follow a page when they view
the post, with engineers altering the Facebook algorithm to increase
(30:40):
how often users saw content reshared from people they didn't know, which,
by the way, think about that for a second. They
changed the algorithm to make your feed more full of
stuff that was not from people you knew, just strangers. It's just
so annoying. Horwitz also notes that Facebook began hunting for
and I quote friction, anything that was slowing users down
(31:03):
or limiting their activity, with the goal of eliminating it,
something that manifested in Facebook allowing users to create an
unlimited amount of pages and cross-post the same material
to multiple groups at once, and even a change to
people you may know that prioritized recommending accounts that were
likely to accept a friend request. Naturally, this led to
negative incentives, and in an internal document from twenty eighteen
(31:26):
called The Friending One Percent: Inequality in Friending, an
unnamed internal Facebook worker noted that Facebook's aggressive optimization to
make people send friend requests had created a weird situation
where point five percent of Facebook accounts were responsible for
fifty percent of friend requests. Worse still, accounts that were
(31:46):
less than fifteen days old now made up twenty percent
of all outgoing friend requests, and more than half of
friend requests were sent by somebody who was making more
than fifty of them a day, heavily suggesting that Facebook
was basically growing its platform by allowing spammers to spam.
It's astonishing how stupid this company is. Or maybe they're
(32:07):
quite brilliant if you just think of them as a
deeply evil and manipulative software company built to grow metrics
and revenue, in which case they're amazing. In a later
document published in mid twenty nineteen, another engineer proposes limiting
the amount of invites that Facebook users can send to groups,
specifically noting that and I quote a larger proportion of
invites into bad groups come from a small subset of
(32:30):
whale inviters that sent out massive amounts of invites in
a small amount of time, specifically noting that this effect
was really helping vaccine misinformation groups, which posed and I
quote an integrity and product relevant risk to the company.
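The proposed fix is a plain rate limit. Here's a minimal sketch of the sliding-window kind of cap the engineer seems to be describing; the threshold and window are invented numbers, since neither the document nor Facebook ever said what the real limits were.

```python
from collections import defaultdict, deque

MAX_INVITES = 20          # invented threshold, not Facebook's real number
WINDOW_SECONDS = 3600     # invented one-hour sliding window

recent_invites = defaultdict(deque)  # user_id -> timestamps of recent invites

def allow_group_invite(user_id, now):
    """Sliding-window rate limiter: refuse the invite once a user has sent
    MAX_INVITES within the last WINDOW_SECONDS. A 'whale inviter' blasting
    out thousands of invites gets stopped after the first twenty."""
    window = recent_invites[user_id]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()          # drop invites that have aged out of the window
    if len(window) >= MAX_INVITES:
        return False              # over the cap: block this invite
    window.append(now)
    return True
```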
A few years ago, Mark Zuckerberg grudgingly established some limits
(32:50):
on invites. It's not really clear what they are, though.
For Facebook and Meta at large, anything that might get
in the way of growth is kind of an oh.
Sometime in twenty twenty, another document I've reviewed through the
same source proposed limiting the distribution as in the recommendation
of groups before they've proven to be trustworthy. So give
(33:11):
you an example here. The idea would be that if
a group came up that was outwardly racist or anti-vax
or, I don't know, a January sixth style group,
just throwing that out there, you would want to stop
them from being recommended to others so they don't grow
and do something bad like I don't know, visit Washington DC.
The problem, as the unnamed engineer describes, is that and
(33:33):
I quote, unproven movements were able to get traction on
Facebook way before community standards could review them, and some
of them were gaining millions of followers in a single day,
and that the groups had immediately begun to violate community standards.
And yeah, by the way, if you're thinking that this
ends with me saying and then Facebook realized that this
(33:54):
was a problem and they changed something, You're wrong. They
did nothing. Facebook never put in any limitations. They never did.
I'm sure that they're going to claim they have, but
its bollocks in broken code. Hollwitz notes that quote, even
when the safety benefits were high and the business risks
were low, Facebook would choose not to use its emergency
(34:14):
break the glass playbooks to take action, which by the way,
were their emergency measures that allowed them to quickly
make groups have to review every post, and various other measures.
And even when Facebook used these measures, it was very
quick to roll them back. Horwitz gives an example of
the run up to Myanmar's twenty twenty election, when the
company rolled out the break the glass measures, limiting the
(34:35):
spread of reshared content and replacing it with content from
user's friends. These countermeasures were quite successful, and they produced
an impressive twenty five percent reduction in viral inflammatory posts
and a forty nine percent reduction in viral hoax photos,
and it only cost them about two percent of meaningful
social interactions, which was, and still is, Facebook's favorite metric.
(34:59):
Horwitz then notes that despite the small change in engagement,
an internal team chose to roll these changes back a
few months later. Can't stop that number going up. But
I got a worse story. This one, This one really
turns my stomach. One particularly horrible story, and it really
pisses me off, is that of something called Project Daisy,
(35:20):
which was a pilot program that would remove likes from
Instagram in an attempt to reduce the anxiety and negative
feelings that teenagers felt using the app, and this idea,
by the way, it was retired after Adam Mosseri, who
is now the head of Instagram, claimed it had a
and I quote, very little impact, and the result was neutral,
relegating it to a feature you have to opt into. Yet
(35:41):
a complaint filed by several US states, including California and
New York, North Carolina, and Pennsylvania in May twenty twenty
three quoted a Meta researcher saying that project Daisy, so
turning off visible likes on Instagram, was one of the
clearest things supported by research that Meta could do to
positively impact social comparison and well being on Instagram, and
they advocated for shipping it, with another researcher saying that
(36:04):
Daisy is such a rare case where a product intervention
can improve well being for almost everyone that uses Meta's products. Chillingly,
one Meta employee said that if Meta refused to implement
Daisy despite the research, they were doubtful that Meta would
implement any broad product changes with the purpose of improving
user well being. I think we can all agree they
(36:26):
were correct. That story, just, it's so disgusting, because usually
with these big tech companies, they do something horrible and
you kind of have to work backwards. You're like, okay,
so they probably did it for this reason. This was
something where Adam Mosseri, utter scumbag, lied. He lied so
that Facebook could make this, so that Facebook could handwave
(36:49):
away not doing something that would help many people, including
millions of children, and then lied. And it's just disgraceful.
These people have names. These people's names need to be
attached to the words scumbag and liar. It's so disgusting. Anyway,
the suit in question is labyrinthine, and it specifically makes
(37:10):
one allegation that aligns with the current state of both
Facebook and Instagram. And I quote that Meta's algorithm alters
users' experience on the platform and draws unwitting users
into rabbit holes of algorithmically curated material. And as I've
mentioned before, the people involved from the very beginning are
those perpetuating this abuse of users. An email thread from
(37:34):
late twenty seventeen and early twenty eighteen, cited on page
fifty four of the aforementioned complaint, is one between Adam Mosseri
and other executives discussing significant declines in US engagement metrics,
naming how reducing notifications was associated with reduced engagement, with
an unnamed employee stating that there would be a trade
off between making a better notification experience for users and
(37:55):
recovering flailing numbers in Facebook's daily active people metric. In
the same thread, chief product officer Chris Cox said that
if the team believed that a filtered notification experience was better,
they shouldn't make changes because the metric was down, adding
that Meta needed to get better at making the harder
decisions when the main decision criteria was choosing the customer
(38:17):
experience over a particular metric. The then VP of Analytics
now chief marketing officer Alex Schultz responded that he fundamentally
believed that Meta abused the notifications channel as a company.
Though the suit doesn't quote the rest of the thread,
it notes that the director of Growth, Andrew Bocking, ended
the discussion by saying that Meta would prioritize engagement over
(38:40):
reducing notifications and that he just got clear input from
Naomi gLite that US daily active people is a bigger
concern for Mark Zuckerberg right now than user experience. Fuck. Look,
when I wrote about Prabhakar Raghavan's destruction of Google Search,
it was a lot harder to find real, tangible proof
(39:01):
of his actions, though one could easily chart a path
of intent in how he wanted to increase queries and
revenue in Google Search at any cost, even if it
meant making changes to make it worse. But in Meta's case,
it's just so much easier. Mark Zuckerberg is personally responsible
for the state of Facebook and Instagram today, and he's
(39:21):
assembled this marvel super team of growth fiends that will
at times happily and at times begrudgingly make the user experience
worse to increase engagement. And it isn't even a new
phenomenon. From the early days of Facebook, Mark Zuckerberg has
acted without remorse. He's tricked, he's schemed, he's screwed over others,
and it's all in pursuit of dominance and financial gain
(39:44):
and growth. And it's absolutely goddamn disgusting. Yet Zuckerberg could
not do these things, He could not perpetuate these disgusting
acts without the help of people like Chief marketing officer
Alex Schultz, who saw to it that Meta shut down
CrowdTangle, a public insights tool from Meta that allowed
(40:06):
researchers to easily analyze what was happening on the platform.
Jeff Horwitz notes in Broken Code that Facebook, led by Schultz,
killed CrowdTangle because reporter Kevin Roose kept posting a
list of Facebook's most engaged with content and that Facebook
was dominated by right-wing assholes and lunacy and misinformation
like the movie Plandemic, an obvious COVID conspiracy film that
(40:27):
Joel Kaplan, head of Meta's public policy team and a
former Bush White House official, by the way, initially blocked the health team
from removing until Roose reported that it was Facebook's number
one post. Just to be abundantly goddamn clear, the head
of public policy at Facebook deliberately allowed the spread of
COVID conspiracies and would have continued to do so if
(40:49):
a reporter had not used a Facebook tool to show
them how popular the post was and the result. By
the way, they killed the tool that allowed Kevin in
Rous to find out that this happened. Thanks Alex. Every
single terrible thing you see on Facebook, be it some
(41:09):
sort of insane right-wing frenzy or a confusing, annoying
product decision, every single thing has been done and pushed
in pursuit of growth. Every bit of damage that Meta
has caused to the world has been either an act of
ignorance or an act of deliberate harm, and many times
tweaking the product to make it harder or more annoying
to use so that you'll log onto Facebook or Instagram
(41:31):
multiple times a day and spend as much time on
the app as possible. This should also explain why both
Instagram and Facebook are just so terrible to use. Meta's
company culture is one of sycophancy and user abuse, and
both of these apps are deliberately engineered to get in
the way of what you want to do as a
means of increasing your engagement, even if that engagement is
(41:54):
won because the thing you're engaging with is actively fighting you.
And the rot begins at the top. And all of
these horrifying, infuriating, disgusting, stomach-churning choices have been perpetuated
by Mark Zuckerberg and his enforcer, Naomi Gleit, making demands
of engineers desperate to appease the almighty Zuck, having to,
and I quote Horwitz, be careful not to frame decisions
(42:16):
in terms of right and wrong, because, goddamn, we
wouldn't want Mark Zuckerberg having a fucking conscience. All right,
let me calm down for a second. Sorry. But this
really is the petty king of the tech industry, a
multi billionaire that can never be fired. That's billions of
people on websites that he's deliberately made worse to boost
engagement rather than provide any kind of service for a
(42:38):
company that makes billions and billions and billions of dollars.
You can probably hear how frustrated I am with this company,
and I apologize if I've gone a little bit off
the deep end, but Mark Zuckerberg is a goddamn scumbag, and
he hurts his users for profit. The entire tech industry
should know who he is, the culture he's created, and
(42:58):
the ways in which he's brutalized this company into finding
ways to increase numbers, all at the cost of user happiness.
And my anger is because there was a time that
Facebook was useful. There was a time that Instagram was useful.
There are friends and family that I've kept in touch
with thanks to Facebook products, thanks to Instagram. There are
things that these products have actually done for people that
(43:20):
have made their lives better, and all of that is
going away. All of it's being crushed by the rot economy,
and its king, Mark Zuckerberg. There is no one who
is more of a rot economist than Mark Zuckerberg. It's disgusting,
it's unconscionable. It's also utterly pathetic. Mark Zuckerberg created probably
(43:41):
one, no, I would say, within Facebook's history, at
least three or four different products, the idea of news feeds,
profile pages, these are all things that Facebook really popularized.
Mark did something kind of incredible, as did the people
under him, and then he just wanted more. He always
wanted more. Mark Zuckerberg must have more. He must have
(44:02):
more of Hawaii, he must have more of the Internet.
But I will give you a funny side to this.
Everything's kind of falling apart. In the next episode, I'm
gonna show you how Mark Zuckerberg's disgusting attitude to running
a company is genuinely destroying these products. I've shown you
(44:22):
the harm he's perpetuated and the deliberate actions behind it.
And in this next episode, I'm gonna show you how
bad things have got and, honestly, that Facebook's had it. Dang.
Thanks for listening, Thank you for listening to Better Offline.
(44:44):
The editor and composer of the Better Offline theme song
is Matt Osowski. You can check out more of his music
and audio projects at mattosowski dot com, M A T
T O S O W S K I dot com. You can
email me at ez at betteroffline dot com or
visit Better Offline dot com to find more podcast links
and of course, my newsletter. I also really recommend you
(45:06):
go to chat dot whereisyoured dot at to visit
the discord, and go to r slash betteroffline to
check out our Reddit. Thank you so much for listening.
Better Offline is a production of Cool Zone Media. For
more from Cool Zone Media, visit our website coolzonemedia
dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.