Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Bloomberg Audio Studios, podcasts, radio news.
Speaker 2 (00:08):
In late June of last year, the CEO of OpenAI,
Sam Altman, sat down for an interview with Bloomberg's Emily Chang.
She asked him a question I think a lot of
us may have had over the past few years about
Altman's role in the development of artificial intelligence.
Speaker 3 (00:23):
You have an incredible amount of power at this moment
in time.
Speaker 4 (00:27):
Why should we trust you?
Speaker 5 (00:30):
You shouldn't. If this really works, it's like quite a
powerful technology, and you should not trust one company and
certainly not one person with it.
Speaker 2 (00:39):
It was an astonishing moment. You had the head of
the world's most prominent AI company saying we shouldn't take
him at his word. But it caught the attention of
Ellen Hewitt, a Bloomberg Features reporter.
Speaker 6 (00:50):
I think that, you know, it's a really smart answer
because it strikes people as like candid and self effacing,
and it's also a way for Sam to push away
the actual question, which is should Sam Altman be trusted?
You know, is Sam Altman a trustworthy person? Like should
we trust what he says? We thought that that was
a worthy question to make kind of a focus of
(01:12):
the show.
Speaker 2 (01:13):
Ellen has spent months tracking down everything she could about
Sam Altman for the latest season of Bloomberg Technology's Foundering podcast,
which drops today.
Speaker 6 (01:23):
We call Sam the most Silicon Valley man alive, and
I don't think that's an overstatement. He has, since a
young age, become a really integral player in this startup
ecosystem that has powered the tech boom of the last
twenty years.
Speaker 2 (01:39):
Ellen has spoken to a lot of people who know Altman,
including some who knew him well before he became the
most Silicon Valley man alive.
Speaker 6 (01:47):
I think some of the most surprising things that I
learned while reporting this season came from talking to Annie Altman,
who is Sam's sister. I think she you know, it's
a really sensitive and complicated and messy story. I think
Annie's story provides this very interesting counterbalance to some of
the promises that Sam makes publicly.
Speaker 2 (02:09):
It's part of a complicated and messy story that's unspooled
over the season's five episodes. Ellen says there's a lot
even for people who think they know the OpenAI drama.
Speaker 6 (02:19):
My hope is that by the end of the series,
the listener has a lot more material. You know, I
want people to come to their
own conclusions about this, but I do think that there
is a pattern of times when Sam, and by extension
OpenAI, have made promises that sound really good in
theory and then in practice are more complicated or are
(02:42):
not upheld in the same way that you might think
that they would be. We come to the conclusion that,
like when Sam says something, you can't be sure that
he really means it.
Speaker 2 (02:54):
That wariness is shared by a lot of people, including
a group of current and former OpenAI employees who
released a letter on Tuesday warning that frontier AI companies
are not doing enough to inform the public about just
what it is they're building. OpenAI denies this, and
the company says it is taking the necessary precautions. But
(03:14):
all of these headlines, which we've covered here on The
Big Take, are adding to the need for all of
us to better understand just who is running these companies,
starting with Altman.
Speaker 6 (03:25):
And so to me, this question is more urgent than ever.
Speaker 2 (03:31):
I'm David Gura. Today on The Big Take, episode one
of Foundering: The OpenAI Story, The Most Silicon Valley
Man Alive.
Speaker 6 (03:45):
It's a Wednesday in November twenty twenty three. I'm standing
in a dark restaurant in San Francisco. There are a
few dozen people. It's the after party for an AI conference.
People are standing and chatting in small circles. Waiters are
bringing trays of hors d'oeuvres around. I'm biting into a tiny
(04:06):
Porcini mushroom doughnut when I hear a whisper. Sam Altman
has walked into the restaurant. I try to spot him
in the dim lighting. I've talked to Sam here and
there over the years. He's been around the startup scene
for more than a decade, and he's friendly to journalists
and over time, Sam Altman has become basically the most
(04:28):
Silicon Valley man alive. He's the CEO of OpenAI,
which shot to tech stardom a year ago with the
release of ChatGPT. His company is worth eighty six
billion dollars. It's one of the most valuable startups in
the world. For the past six months, Sam had been everywhere.
It seemed like every major news outlet wanted to profile him,
(04:50):
and he was saying yes to each one. Headlines referred
to him as the ChatGPT King, the Oppenheimer of
our age, and an AI overlord. He was like
an ambassador for the AI future, zipping around the globe
meeting with world leaders. He testified in front of Congress
and amid all that, he decided to pop by this
(05:11):
AI conference party a little surprise visit. So I see
him standing in a corner of the restaurant. A few
people have already clustered around him. I decide to say hi.
He's wearing a suit and a tie, more dressed up
than usual, and he's shorter than I remember. We shake hands.
He eyes my conference badge and says, it's good to
(05:33):
see you. I'm surprised by how personable he is, how
friendly he comes across. He acts like he remembers me.
He already knows I'm working on this podcast. So I
say to him, I'm about to put in a request
to your comms team to see if we can find
a time to interview you. He's like, sure, sounds good.
Several people are hovering, hoping to get his attention for
(05:54):
a minute. So I step away, and after about ten
minutes at the party, he leaves too. The next day,
Sam continues his tour as the statesman of AI. He
speaks at APEC, a big conference that President Biden and
China's leader Xi Jinping both attended. Here's Sam at that conference
talking about AI.
Speaker 5 (06:14):
I think this will be the most transformative and beneficial
technology humanity has yet invented. I think this is like
the greatest leap forward of any of the
big technological revolutions we've had so far. So I'm super excited.
I can't imagine anything more exciting to work on, and
getting to do that is like the professional honor of
a lifetime. It's so fun to get to work on.
Speaker 6 (06:35):
The most transformative technology humanity has yet invented. The
professional honor of a lifetime. Sam clearly believes the world
is about to change dramatically and fast, and he's aware
that a lot of that change is connected to him.
(06:55):
And the day after that, less than forty eight hours
after I saw Sam at that party, less than twenty
four hours after he spoke at a major international gathering
of world leaders, he was fired.
Speaker 4 (07:08):
Breaking news.
Speaker 7 (07:09):
Sam Altman is out as CEO of OpenAI.
Speaker 6 (07:13):
This is a stunner. The tech world has been thrown
into chaos over the.
Speaker 8 (07:16):
weekend when the company that gave us ChatGPT fired
Speaker 6 (07:19):
its CEO. The move came as a complete surprise to everyone,
including OpenAI's biggest investor.
Speaker 9 (07:26):
Microsoft. The board says it pushed Altman out after a review
found he was quote not consistently candid in his communications
with the board.
Speaker 6 (07:37):
It's hard to emphasize enough how shocking this was. Sam
Altman got fired. It was the juiciest tech news of
the year and completely unexpected. The news came on a
Friday afternoon, and my colleagues and I all immediately knew
our weekends were out the window. The next few days
felt like a hurricane of news. The board implied Sam
(07:59):
had lied to them. They announced a new CEO. Then
came a regretful public apology by someone who had fired Sam.
There were pledges of loyalty, an employee revolt, and at
the same time something else was happening. It started to
seem like Sam was actually going to come back. He
(08:19):
was rallying the support of his employees and Microsoft. It
was looking like Sam could win. On Tuesday night, Sam
was reinstated as CEO at the company that just five
days earlier had basically tried to destroy him. Sam managed
to get the upper hand once again. It was high
(08:41):
drama, a jaw-dropping turn of events. But for those
who know Sam, it actually made perfect sense. One of
Sam's mentors is an investor named Paul Graham. He once
told a reporter Sam is extremely good at becoming powerful.
To me, this seems like Sam's defining characteristic. Often in
(09:03):
Silicon Valley we talk about tech visionaries who are programming
geniuses or obsessed with the details and design of the product.
That's not Sam. His strongest, most unique skill is wielding power,
and that might have consequences for all of us. You're
(09:25):
listening to Foundering, I'm Ellen Hewitt. In this season of Foundering,
we're going to chronicle the rise of Sam Altman, the
all out arms race to build the leading AI company,
the claim that this new technology could threaten to wipe
out all of humanity if we're not careful, and then
the coup that almost took down the guy at the
top of it all before he managed to claw his
(09:48):
way back. For this season, you'll hear reporting done by
me and my colleagues at Bloomberg who have been covering
AI during the boom of the last few years. We
interviewed some of the leading minds in AI to try
to cut through the hype and understand the debate about
whether AI will be a tool to improve human existence
(10:09):
or to extinguish it. But this is also the story
of Sam, the man at the center of it all,
and we spoke with Sam's friends, family, and collaborators to
demystify him and how he rose to power. This first
episode is all about how Sam got here. He's a
man who has always understood the importance of being in
(10:31):
the right room at the right time, with exactly the
right few people. The full story of Sam's rise is
important because understanding who he is and what he believes
will shed light on an urgent question, should we trust
this man to oversee this technology. In the summer of
(10:53):
twenty twenty three, about five months before he was fired,
my colleague Emily Chang asked him this exact question at
a Bloomberg conference.
Speaker 3 (11:02):
You have an incredible amount of power at this moment
in time.
Speaker 6 (11:06):
Why should we trust you?
Speaker 5 (11:08):
You shouldn't. If this really works, it's like quite
a powerful technology, and you should not trust one company,
and certainly not one person with it.
Speaker 6 (11:18):
You shouldn't trust me. If he believes that, then why
did Sam fight his way right back to the top
of OpenAI, back into that position of control, as
if to insist he should be the person to lead
this company as they develop AI as fast as possible.
To a lot of people, the stakes are terrifying. They're
(11:39):
higher than anything else really in the valley. Some people
refer to artificial general intelligence, or AGI, as the last invention,
something so potent that everything that comes after will look
unrecognizable compared to everything leading up to it. Inside OpenAI,
some employees talk about how they are building God. We'll
(12:01):
be right back. To get a better sense of the
kind of person Sam is and how he got where
he is now. I want to take you back to
his adolescence. Sam had a privileged upbringing in Saint Louis.
He's the oldest of four siblings. His mom was a
(12:22):
dermatologist and his dad was a real estate developer. He
attended a private high school called John Burroughs. There's an
anecdote about him from that period that sticks out. When
some students wanted to boycott an assembly about sexuality, Sam
stood up in front of the whole school and announced
he was gay. It's a pretty gutsy move for a
teenager in the early two thousands. Unsurprisingly, Sam was smart.
Speaker 9 (12:46):
And generally, Sam was, he was an exceptional student, he
was an exceptional writer, he was an exceptional, a big personality.
Speaker 6 (12:57):
That's Andy Abbott. He was one of Sam's English teachers
and he's now the head of school. And this is a
pretty nerdy school where it's cool to get good grades
and be a high achiever. And even in that environment,
Sam stood out.
Speaker 9 (13:11):
Sam's just a really natural leader, incredibly charismatic, curious guy.
He's a typical, you know, he was the editor of
the yearbook and he represented the school in the Model
United Nations. He designed our website, you know, before we
(13:33):
hired people to do our website.
Speaker 4 (13:35):
He could just do that stuff.
Speaker 6 (13:37):
Sam even played water polo.
Speaker 9 (13:39):
He was pretty good. I'm not a connoisseur, but I'm like,
he was pretty good.
Speaker 6 (13:45):
He remembers Sam as being really confident, and apparently for
good reason. Sounds like Sam was just this exceptional kid.
Speaker 9 (13:52):
Well, he's the smartest guy in the room, and he's charismatic.
I remember thinking, and I'm just this is just an
embarrassing confession. I hope he doesn't go into technology. He's
so creative and he's so he's such a good writer,
and I hoped he would be an author or something
(14:15):
like that. And I mean, nobody could have anticipated the
magnitude of open Ai, but everybody knew that this guy's
better at most things than most of us are.
Speaker 6 (14:31):
This speaks to a pattern that'll become a crucial factor
in Sam's career. He's very good at impressing people, especially
the right people, older people, people with influence, people who
are in a position to help him. Someone who knows
Sam says his superpower is figuring out who's in charge
and charming them. So we have young Sam, even though
(14:55):
he was a teenager, he acted like someone older with
more agency and confidence. Adults found this quality of his admirable,
and he acted like this towards his three younger siblings too.
In a big New Yorker profile on Sam, his younger
brother said that as kids they used to play a
board game called Samurai, and Sam always won because he
declared himself the leader and said, I have to win
(15:18):
and I'm in charge of everything. When Sam's brother told
this story, it was a jocular exchange. But Annie, their
youngest sibling and only sister, sees it differently. These days,
she's estranged from Sam and the rest of her immediate family.
But when she was a kid, she remembered that same
quality of Sam's wanting to be in charge, and to her,
(15:39):
it wasn't funny, it was domineering.
Speaker 8 (15:43):
From my perspective, with the nine year age difference, he
very much wanted to be and acted like the third
parent and liked being the older sibling in charge, in control.
Speaker 6 (15:58):
For instance, even though the family was Jewish, they used
to get a Christmas tree until Sam put his foot down.
Speaker 10 (16:05):
I don't have memories of a Christmas tree because when
Sam got bar mitzvahed at thirteen, he decided that we
as a family unit were Jews and needed to no
longer celebrate Christmas.
Speaker 6 (16:18):
There were no more Christmas trees. When their dad passed
away in twenty eighteen, Annie remembers that Sam dictated to
each of his younger siblings how many minutes they could
talk at the funeral.
Speaker 1 (16:29):
To be at your dad's funeral, to be like, oh,
I'm the oldest sibling, so I get to choose how
long all the siblings talk, which is bizarre, and there's
a level of it that's so hilarious and so benign.
Surface level classic older sibling bullshit where it's like, all right,
older sibling wanting.
Speaker 6 (16:47):
To like make up the rules to the game.
Speaker 1 (16:49):
Like it's there's there's a level of it that's very
light and funny, and there's also a level of it
that's very dark and deeply unsettling of how does that
behavior come up in other place if you believe that
you get to be the authority on something that you
are not the authority on.
Speaker 6 (17:07):
A spokeswoman for OpenAI told us that Sam recalls
these incidents differently, but she declined to elaborate. And so
when Sam finished high school, he started down this path
that's pretty textbook for the tech industry, studying computer science
at Stanford, founding a startup, and dropping out of Stanford.
And he made one incredibly important decision he applied to
(17:31):
Y Combinator. Y Combinator is a startup accelerator. It's basically
a boot camp for startups. You and your co founders
apply and if you're accepted, you spend three months hacking
away trying to build a company. At the end of
that period, you give a demo to investors and try
to raise venture capital. Sam was actually in the first
ever group of founders at Y Combinator. Everyone calls it YC,
(17:54):
by the way. It was two thousand and five, so
YC was totally unknown, just a bunch of young guys
hanging out in Cambridge, Massachusetts for a summer writing code.
But YC would eventually become this enormously powerful network. Now
it's basically the number one elite program for startups. It's
really hard to get in, and the alumni network is
(18:15):
incredibly strong. Sam was nineteen years old when he joined YC,
and once again he impressed just the right person, Paul Graham,
the head of YC. The first batch included some other
really impressive people, like the founders of Reddit and Twitch,
but someone who knows Sam says that he was immediately
Paul's favorite. Paul later wrote, within about three minutes of
(18:38):
meeting him, I remember thinking, ah, so this is what
Bill Gates must have been like when he was nineteen.
The startup Sam was building was called Loopt. Loopt is
one of the forgotten apps of its time, the early
two thousands, when people were really excited about having GPS
on their phones for the first time. It used location
data to connect people to their friends and local businesses,
(19:01):
kind of like a mix of Yelp and Foursquare.
Here's Sam making the pitch at a developers conference.
Speaker 11 (19:07):
Loopt is about connecting with people on the go, which is,
after all, the main reason you have a phone. We
show you where people are, what they're doing, and what
cool places are around you.
Speaker 6 (19:16):
Sam started building the startup in two thousand and five.
The iPhone didn't exist yet, so Loopt was trying to
do this for flip phones and it was kind of
hard to get traction. At one point early on, Sam's
company was in a desperate situation. They really needed to
get a deal with a mobile carrier. They learned that
(19:38):
Boost Mobile, which was part of Sprint, was looking to
add a location feature and needed a partner, but they
were about to sign with someone else, so Sam flew
down to Boost's headquarters in Irvine in Southern California. When
he tells the story, he says that he just showed up,
waited outside the right executive's office, and asked for just
ten minutes. Here's how that executive remembers it.
Speaker 3 (20:01):
As I recall, I got a phone call from Sam
when he was in Irvine, and he said that he
explained who he was and what Loopt was. Somebody at
Sprint had told him to get in touch with us.
Speaker 6 (20:16):
That's Lowell Winer. He was at the time the head
of business development for Boost. And he's going to tell
a story that has a few asides, but that I
think captures a lot about what Sam was like.
Speaker 3 (20:27):
Early on, we were a day or two away from
signing a contract with another startup that was further along
than Looped. He asked to come by that day, you know,
which is incredibly unusual, but given the timing that, you know,
we were at the eleventh hour, we were about to
sign this contract. He had come, you know, referred to
(20:48):
us by our parent company, it was worth at least
a meeting. So Sam shows up at the office with one
or two other guys from Loopt. We go sit in
the conference room, you know, we share what we were
looking to do. Sam started to share about Looped. He
was I think nineteen at the time, you know, I
(21:08):
think maybe in cargo shorts, sitting cross legged in
a chair in a conference room and just kind
of holding court.
Speaker 6 (21:17):
I want to pause here for a second on this
cargo shorts detail. For a lot of Sam's young life,
he was a cargo shorts devotee. He wore them all the time.
People kind of poked fun at him for it, to
the point where he felt the need to address it
on a podcast called Masters of Scale.
Speaker 5 (21:34):
Honestly, I don't think they're that ugly, and I find
them incredibly convenient. Like I I, you can like put
a lot of stuff like I like to. I'd still
read paperback books. I like paperback books. I like to
carry one around with me. I have like an iPhone
seven plus, which is kind of like works really well
on cargo pockets. I carry like computer chargers, cables. They're
(21:54):
just like, you know, efficient. Why people care about that so
Speaker 6 (21:59):
much, that I can't tell you. That last comment. That's
very Sam to remark that the things normal people might
talk about don't make sense or aren't rational. It's like
he has no patience for the things most of us
might think are funny. He has more important things to
think about anyway. Here he is a nineteen year old
(22:20):
in a meeting with mobile network executives, wearing cargo shorts,
sitting cross legged in a conference room chair. Even though
this encounter was almost twenty years ago, Lowell remembers vividly
what Sam looked like in that moment because it was
such an odd picture.
Speaker 3 (22:34):
He was small in stature. I don't think
he's a big guy now, but he was. You know,
he was quite slim at the time. You know, it's
no easy feat to sit cross legged in a conference chair.
I mean, he looked like he could have still been
in high school.
Speaker 6 (22:53):
I've heard other people describe Sam's weird way of sitting.
He's older now, so he doesn't do it as much,
but one person who knows him told me he used
to squat on the seat of a chair like a
perched bird, with his knees up toward his chin. Despite
this unconventional way of presenting himself or because of it,
(23:13):
he almost immediately convinced Lowell that Boost Mobile should switch
plans at the eleventh hour and go with this other partner.
Speaker 3 (23:20):
It was pretty clear within a half hour of this
meeting starting that Loopt and Sam were the right partner
to do this. There was both excitement and like shit,
we have to go sell it internally. But I recall
stepping outside of the room with the colleague that I
was in the meeting with and saying to him, we
(23:40):
need to switch gears.
Speaker 6 (23:42):
And Lowell still remembers this way that Sam looks: unassuming,
but he's not.
Speaker 3 (23:48):
On the one hand, visually he looked incredibly young, but
if you'd shut your eyes and were just listening to him,
his command of material, in his ability to communicate and
engage was on par with anyone you know I had
met with over the course of my tech career. It, uh,
it was freakish. Yeah, it was freakish, and not his
(24:11):
appearance but his, his poise and command for that
Speaker 6 (24:15):
age. Sam's relentlessness paid off. He knew he had to
get that deal, and he did what he needed to
do to get it, including flying across the state to
surprise someone at their office. Sam later said he learned
an important lesson the way to get things done is
to just be really persistent. So he inks this deal
(24:39):
for Loopt. It's going to power location sharing for Boost Mobile.
It's the partnership that led to this two thousand and
six ad campaign.
Speaker 1 (24:47):
Yo, you see where I'm at here, but I
know where you at. Boost Loopt the GPS.
Speaker 2 (24:52):
Now you know where your friends
Speaker 6 (24:54):
at. Sam's peers at YC were surprised that he pulled
this off. Loopt's business model was pretty wobbly, and the product
wasn't all that impressive, but Sam's particular strength was starting
to become clearer.
Speaker 7 (25:06):
It was obvious Sam was an incredible deals guy.
Speaker 6 (25:09):
That's a clip from a podcast interview with Emmett Shear,
one of the other guys in that very first YC batch.
You can hear how impressed he was by Sam's particular talent.
Speaker 7 (25:19):
He was somehow convincing the phone companies to give his
startup that like didn't really have a product like deals.
I still don't know how he did that, And like
that was the only obvious thing about Sam at the time,
was like that he was ambitious, but most of us
were pretty ambitious, and he was a great, great
deals guy.
Speaker 6 (25:39):
For the next few years, Loopt kept growing. Sam presented
at Apple's Developer Conference in two thousand and eight in
a distinctive outfit. He wore two polo shirts layered
on top of each other, so he's in a hot
pink polo with a second lime green collar poking out underneath.
Speaker 7 (25:55):
I'd like to invite up Sam Altman.
Speaker 11 (26:01):
We are incredibly psyched about Loopt on the iPhone.
Loopt is about connecting with people and what cool places
are around you. The orange pin up there is where
I am right now, and the blue pins represent my friends.
We make serendipity happen.
Speaker 6 (26:18):
And when you listen to Sam pitching Loopt, you can
hear this other part of him, this earnestness and optimism.
I've listened to and read a lot of interviews with
Sam and he's always using the words super and excited,
sometimes super excited.
Speaker 5 (26:34):
It's super cool, super easy to make, super important.
Speaker 6 (26:37):
It's been super great.
Speaker 5 (26:38):
I'm super excited to announce. I'm super excited, super excited
for that.
Speaker 6 (26:43):
But he was not super excited about what happened next
at Loopt. After several years, Loopt fizzled out. Sam made
a deal to sell the company for a modest sum
in twenty twelve. He walked away with a reported five
million dollars. Most people would be pretty happy with this outcome,
but in Silicon Valley terms, Loopt was kind of a failure.
(27:05):
But that's okay because by then Sam had won over
other people who could help him. One was Peter Thiel,
a billionaire investor and the co-founder
of PayPal and Palantir. He's also one of the most
powerful gay men in Silicon Valley, which lent him and
Sam a sense of camaraderie. When Sam left Loop, Peter
gave him a bunch of money to invest. They were close,
(27:29):
and Sam's peers noticed because Peter Thiel is notoriously pessimistic
and even nihilistic, and Sam's public image, by contrast, is
pretty earnest and optimistic. At the same time, Sam was
deepening his relationship with Paul Graham, the head of YC,
and that closeness was giving Sam tangible benefits. When Paul
(27:50):
had the chance to invest very early on in Stripe,
the payments startup, he invited Sam to invest too. Sam
later said that that was by some measures, his most
profitable Angel investment ever, and he got it purely because
of this personal relationship he had built. Paul is known
for writing essays about how to build startups, essays filled
(28:11):
with blunt, quotable entrepreneurial wisdom. Often in those essays he
praised Sam. He advised young eager founders to emulate Sam,
and Paul is also responsible for one of the most
infamous quotes about Sam. You could parachute him into an
island full of cannibals and come back in five years
and he'd be the king. At first, this quote sounded
(28:37):
to me like a compliment, but lately I've wondered if
maybe it's not anyway. Around twenty twelve, YC had become
very influential. It had relocated its headquarters from Cambridge to
Silicon Valley and was now a breeding ground for some
of the most successful Internet businesses. Airbnb, Dropbox, and Stripe
all got their start at YC, and from the outlie
(29:00):
it looked as though Sam was beginning to mimic Paul
as a startup guru. He was also advising young founders
at YC, and just like Paul, he started writing essays
filled with mysterious, often perplexing advice for startup founders, such
as the most successful founders do not set out to
(29:21):
create companies. They are on a mission to create something
closer to a religion. Or here's another: a big secret
is that you can bend the world to your will
a surprising percentage of the time. Most people don't even try.
In a blog post called how to Be Successful, Sam
told founders that they should quote have almost too much
(29:43):
self belief. The most successful people I know believe in
themselves almost to the point of delusion. That last point
will sound familiar to people who know Sam. One of
them told me that Sam has complete self belief. That
Sam gives off the impression that he believes in himself
one hundred percent without that nagging feeling most of us have,
(30:06):
like a little voice of fear or uncertainty. And here's
Lowell again, the former boost Mobile executive.
Speaker 3 (30:14):
He was extraordinarily self assured, and not in an egotistical way,
but just very comfortable with himself, with his capacity both
intellectually and relationally.
Speaker 6 (30:29):
For years, Paul had built up Sam's image, made him
this kind of startup demigod. Then he decided to anoint
him fully. Paul stepped down in twenty fourteen and named
Sam President of YC. This was a big deal. YC
is the center of Silicon Valley, and now Sam was
its leader. He was twenty eight years old. With both
(30:52):
Peter Thiel and Paul Graham, Sam cultivated these close relationships
with people in positions of power. Then they gave him,
like, money, connections, influence, titles. Essentially, they transferred some of
their power directly to him, and that in turn gave
Sam the ability to think big, even when he was
(31:12):
working on something pretty silly sounding like Loopt, Sam had
sky high ambitions. Paul noticed this. Here he is at
a conference explaining why he picked Sam to be his
successor at YC.
Speaker 2 (31:25):
It's turned into this giant thing and I'm no good
at running giant things. Sam, however, is going to be
good at running a giant thing.
Speaker 6 (31:33):
At this point, there was no evidence Sam could run
a giant thing or even a medium sized thing. But
once he took over, Sam did make YC much bigger
than before, sprawling. He gave more money to more startups,
and he expanded overseas.
Speaker 12 (31:49):
The Sam Altman era at YC was about expansion.
Speaker 6 (31:54):
That's John Coogan. He's a startup founder who went through
YC twice, including once for his company Soylent, you know,
the one that made food powder for tech people.
Speaker 12 (32:05):
YC went from, you know, a summer program for
startups with just a couple companies to doing all sorts
of things venture investing with the Continuity Fund, nonprofit work
with YC Research. There's definitely a feeling of like, okay,
how much should YC really be doing? Should this organization
(32:28):
be doing everything.
Speaker 6 (32:29):
During his time at YC, John noticed a couple
things about Sam.
Speaker 12 (32:33):
He's very good at switching gears and listening very intently.
I think that's what's really that's really like a superpower.
I heard someone describe him as the Michael Jordan of listening.
Speaker 6 (32:46):
A lot of people have described this intensity of Sam's
to me. When he listens, he also stares. It can
almost feel unsettling. And now that Sam was running YC,
he could become kind of a covert operator. He had
all of his deal-making skills from his Loopt days,
and now his network and influence was way bigger. If
there was ever a problem, he could make a call
(33:07):
and fix it fast.
Speaker 12 (33:09):
Early, very early in my business career, I had a
very tough negotiation going on, and I wrote Sam this
email saying like, Hey, I'm in this negotiation, I just
want to reality check some stuff with you, and he
called me immediately. We talked for like five minutes and
he completely solved my problem, and it was like one
(33:29):
of the best business deals in my life. It actually
left a really big impact on me.
I've personally seen Sam resolve a one hundred million dollar
issue in a fifteen minute phone call.
It's really remarkable, and I think Sam just thinks about
it in human terms. It's like, this person wants X,
this person wants Y. How can we bring these two
(33:51):
people together.
Speaker 6 (33:53):
YC was growing, but Sam's ambitions were even larger.
When he became president, he started taking on pet projects.
One of his interests was nuclear fusion. To encourage more
people to build startups in this area, he expanded yc's
scope to include hard tech startups, that is, companies building
tech where there's some doubt that it'll work at all.
(34:16):
Until that point, YC had been almost exclusively about Internet software.
He also spun up a research arm at YC and
assigned researchers to projects he wanted to explore. One example
was universal basic income. The idea is to give every
person a regular income, regardless of whether they work. So
under Sam's command, YC created a study to give money
(34:40):
no strings attached to families in Oakland. This move is
one that Sam turns to often. He'll have an idea
for something he thinks should exist, and then he pulls
in people and money to encourage someone to do it,
bending the world to his will. As he wrote in
his blog, he's kind of like the maestro of an orchestra,
not playing the instruments himself, but conducting his symphony. And
(35:05):
in twenty fifteen he'll have the chance to do this again,
this pattern of finding an important topic, then arranging to
get the right people and the right money on board
to build it. This time the focus is AI. It'll
all take place at a dinner, a dinner that'll change everything.
We'll be right back. Okay, here's a bit of Silicon
(35:32):
Valley lore. It's often told that in twenty fifteen Elon
Musk was worried, or at least Elon was going around
telling everyone he was deeply distraught over the state of
artificial intelligence. Back then, the biggest powerhouse in AI was Google.
It had so much money, and it had hired all
the best researchers. They had Google Brain, and then they
(35:54):
acquired deep Mind. It's a British lab that was working
on some of the most exciting things in AI at
the time, AI that could be more fluid and self taught.
Google's early lead particularly disturbed Elon because he was worried
about the possibility that AI could start to grow too
powerful, especially if an AI entity began to improve itself,
(36:16):
and he was really worried about the guy in charge
of it all.
Speaker 13 (36:18):
So I used to be close friends with Larry Page,
and I would stay at his house and we'd have
these conversations long into long into the evening about AI,
and I would I would be constantly urging him to
be careful about the danger of AI, and and he
just he was really not concerned about the danger of
AI and was quite cavalier about it.
Speaker 6 (36:40):
That's Elon on CNBC. I'm always surprised by how casual
he sounds when he's telling the story of basically ending
a friendship. But he loves this anecdote because it makes
him sound smart and forward thinking.
Speaker 13 (36:53):
And at the time, Google, especially after their acquisition of
deep Mind, had three quarters of the world's AI talent.
Speaker 4 (36:59):
They had a lot of computers and a lot of money.
Speaker 13 (37:02):
But the person who controls that does not, or at
least that did not seem to be concerned about AI safety.
Speaker 4 (37:09):
That sounded like a real problem.
Speaker 13 (37:11):
So and then the final straw was Larry calling me
a speciesist for being pro human consciousness instead of machine consciousness.
And I'm like, well, yes, I guess I am.
Speaker 6 (37:26):
At the time, a lot of experts thought the idea
of developing AI so powerful that it threatened the human
species was laughable because AI was still having trouble distinguishing
between a picture of a chihuahua and a picture of
a blueberry muffin. But to Elon, it was a real threat.
Here he is speaking with Walter Isaacson at a conference.
(37:47):
He sounds serious, even alarmed.
Speaker 13 (37:49):
I don't think most people understand just how quickly machine
intelligence is advancing. It's much faster than almost anyone realizes.
Even within Silicon Valley, people really.
Speaker 4 (38:01):
Have no idea. Why is that dangerous?
Speaker 13 (38:04):
If there's some digital superintelligence, particularly if it's engaged in
recursive self improvement, and its optimization or utility function is
something that's detrimental to humanity, then it will have a.
Speaker 4 (38:19):
Very bad effect. You know.
Speaker 13 (38:22):
It could be just something like getting rid of spam
email or something, and it concludes, well, the best
way to get rid of spam is to get rid of humans,
you know, because then there'd be no source of spam.
Speaker 3 (38:34):
I know we've all watched HAL in two thousand and one.
Speaker 6 (38:38):
The audience actually laughed at him because what he was
saying sounded so outlandish at the time. So Elon was
increasingly feeling like he had to do something to dilute
Google's power, But he might have also been driven by
something else. Here's Ashley Vance, my colleague at Bloomberg, who
wrote a biography of Elon in twenty thirteen.
Speaker 14 (38:58):
Elon was he was not the guy we know today.
I mean, he was doing okay, but Tesla was kind
of just barely starting to hit its stride. SpaceX was doing
pretty well. Elon was worth probably a few billion dollars.
Definitely not the richest person in the world.
Speaker 6 (39:18):
Elon's friends at the time were among the richest people
in the world, Larry Page and Sergey Brin, the co
founders of Google.
Speaker 14 (39:26):
I was interviewing Elon a lot then. I was working
on a book about him, and it was clear to
me that he was looking over at his friends who
were doing really well and everything they did, and they
had these software empires and this growing AI empire. And
my sense of where Elon's thoughts about AI started to
(39:49):
originate was part jealousy. I thought he looked at Google
and all the success it was having, and the success
his friends were having, and he had nothing like that.
He would never admit this out loud, that
Speaker 4 (40:02):
he was jealous.
Speaker 6 (40:03):
Okay, So AI was top of mind for Elon, and
in twenty fifteen he attended this dinner at the Rosewood Hotel.
It's this swanky place on Sand Hill Road in Menlo Park,
right near some of the biggest venture capital firms in
the valley. There were about ten people at this dinner,
but for our purposes, here are the four most important ones.
(40:25):
There's Elon, There's Sam. Then there are two other people,
Ilya Sutskever and Greg Brockman. Ilya worked at Google and
was a really well respected researcher in AI, and Greg
had been one of the most important people at Stripe,
taking it from a team of five people to a
company worth billions of dollars. At dinner, they talked very
(40:46):
seriously about AI and the threat that it could be
misused or that it could lead to catastrophe. They talked
about what it might take to build something that could
compete with Google. They had the pieces there: Ilya's AI skills,
Greg's operational experience, Elon's money, and Sam was there to
orchestrate it all. At that dinner, Elon pledged to put
(41:08):
a billion dollars toward this project. He came up with
the name too. You can hear the pride in his
voice when he describes the idea on CNBC.
Speaker 13 (41:18):
OpenAI refers to open source. So the intent was,
what's the, okay?
Speaker 4 (41:22):
So what was the opposite? What's the opposite of Google?
Would be an.
Speaker 13 (41:27):
Open source nonprofit because Google is closed sourced for profit,
and that profit motivation can be potentially dangerous.
Speaker 6 (41:36):
So that's the birth of OpenAI. They had an
initial vision: OpenAI would be a research lab and
it would share its work openly with the public instead
of trying to keep it private for its own gain.
And it would be a nonprofit not focused on enriching
the company, but instead focused on building a safe AI
that would benefit humanity. It sounds good in theory, but
(41:58):
those nonprofit open source ideals were about to get complicated fast.
OpenAI's co founders soon found themselves locked in a
power struggle and then urgently racing to raise billions of dollars,
and Sam would come out on top again in a
way that cemented his power even further.
Speaker 2 (42:22):
That was episode one of Foundering, hosted by Ellen Hewitt.
Bloomberg dot Com subscribers can listen to the entire series of
Foundering right now. Just connect your subscription to Apple Podcasts
for early access. The second episode in the series is
available for free wherever you get your podcasts. Sean Wen
is Foundering's executive producer. Molly Nugent is the associate producer.
(42:42):
Blake Maples is the audio engineer. Mark Milian, Anne Vander
May, Seth Fiegerman, Tom Giles and Molly Schuetz are Foundering
story editors. Foundering had production help from Jessica Nix and
Antonia Mufarech. This is the Big Take from Bloomberg News.
I'm David Gura. Be sure to subscribe to Bloomberg for
all our great podcasts, and if you like our show,
leave a review. Most importantly, tell your friends.