Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Cool Zone Media. Try spinning. That's a good trick. I'm Ed Zitron.
This is Better Offline and this is the second episode
of my two part series where I explain how OpenAI
(00:25):
has become a systemic risk to the tech industry,
even with its massive forty billion dollar funding round and
birdbrained benefactor in the form of SoftBank, the
world's foremost authority in losing money. Now, before I continue,
shameless request: Better Offline has been nominated for a Webby
and I want to win this thing. I've linked to
it in my Twitter and my Bluesky, and if
you could vote for me, well, the link will be in
the episode notes too. Moving on, back at it. Okay,
(00:49):
all right, OpenAI now has forty billion dollars somehow,
right? Great? Right? Well, hold your horses. As part of its
deal with SoftBank, OpenAI must also convert its
bizarre nonprofit structure into a for profit entity by December
twenty twenty five, or it will lose ten billion dollars
from that forty billion dollars of funding. And just
to be clear, by the way, they've only really got
(01:10):
ten billion dollars of that so far. The rest is
at the end of the year. Furthermore, in the event
that OpenAI fails to convert into a for profit
company by October twenty twenty six, investors in its previous
six point six billion dollar funding round can claw back
their investment, with it converting into a loan with an
attached interest rate. Naturally, this represents a nightmare scenario
for the company, as it will increase both its costs
(01:32):
and its outgoings. This is a complex situation that almost
warrants its own podcast, but the long and short of
it is that OpenAI would have to effectively dissolve itself,
start the process of reforming an entirely new entity, and
distribute its assets to other nonprofits, or sell or license them
to a for profit company at fair market rates, which
they would not set. It would also require valuing open
(01:54):
AI's assets, which in and of itself would be
a difficult task, as well as getting past the necessary
state regulators, the IRS, state revenue agencies, and the upcoming
trial with Elon Musk, which only adds further problems.
I've simplified things here, and that's because, as I've said,
this stuff is a little complex and pretty boring. Suffice
to say, this isn't as simple as liquidating a company
(02:14):
and starting afresh, or submitting a couple of legal filings.
It's a long, fraught process, and one that has been, and
will be, subject to legal challenges, both from
OpenAI's business rivals as well as from civil society
organizations in California. You may have heard the last monologue.
Based on discussions with experts in the field and my
own research, I simply do not know how OpenAI
pulls this off by October twenty twenty six, and honestly,
(02:37):
I'm not sure how they do it by the end
of this year. It's insane. It's really something. Every
time I read this stuff and write it out, I'm like,
how is nobody else reading this
and going, what the fuck is going on? You see,
this is a big problem, this nonprofit thing, because open
AI really has become a systemic risk to the tech industry,
(02:58):
and anything that increases that is bad news for everybody.
OpenAI, they've become a kind of load bearing company
for this industry, both as a narrative, as I've discussed
multiple times, as ChatGPT is the only large language
model company with any meaningful user base, and also as
a financial entity. Its ability to meet its obligations and
its future expansion plans are critical to the future health
(03:19):
or in some cases survival of multiple large companies. And
that's before the aftereffects that will hit its customers
as a result of any kind of financial collapse. The
parallels to the two thousand and seven to two thousand
and eight financial crisis are starting to become a little worrying.
Lehman Brothers wasn't the largest investment bank in the world,
although it was pretty big, just like OpenAI isn't
the largest tech company, though again it's certainly large in
(03:42):
terms of alleged valuation and expenditures. Lehman Brothers' collapse sparked
a contagion that would later spread throughout the entire global
financial services industry and consequently the global economy. Now I
can see open AI's failure not having as big an effect,
but I can imagine a systemic effect. Still, you have
to realize that the whole AI trade, the narrative, the bubble,
(04:06):
it's holding up the economy. I think like thirty to thirty
five percent of the US stock market is in the
Magnificent Seven, and all of their bullshit numbers right now
are held up by this nonsense. And like the financial crisis,
the impact in this case won't be limited to just
bankers and insurers. It will bleed into everything else. This
episode is going to be a bit grim. I'm not
(04:26):
going to lie. I want to lay out the direct
result of any kind of financial crisis at open Ai,
because I don't think anybody is taking this seriously. Let's
start with Oracle, who will lose at least a billion
dollars if OpenAI doesn't fulfill its obligations, per the Information. Oracle,
which has taken responsibility for organizing the construction of the
Stargate data centers with unproven data center builder Crusoe,
(04:48):
and I quote the Information here, may need to raise
more capital to fund its data center ambitions. Oracle has
signed a fifteen year lease with Crusoe, and to quote
the Information, is on the hook for one billion dollars
in payments to that firm. To further quote the Information: while
that's the standard deal length, the unprecedented size of the
facility Oracle is building for just one customer makes it
riskier than the standard cloud data center used by lots
(05:08):
of interchangeable customers with much more predictable needs, according to
half a dozen people familiar with these types of deals.
In simpler terms, Oracle is building a giant data center
for one customer, OpenAI, and has taken on the
financial burden associated with them. If OpenAI fails to
expand or lacks the capital to actually pay for its
share of the Stargate data center project, Oracle is on
the hook for at least a billion dollars, and based
(05:30):
on the Information's reporting, it's also on the hook to buy
the GPUs for the site. This is me quoting them again:
even before the Stargate announcement, Oracle and OpenAI had
agreed to expand their Abilene deal from two to eight
data center buildings, which can hold four hundred thousand
Nvidia Blackwell GPUs, adding tens of billions of dollars to
the cost of the facility. In reality, this development will
likely cost tens of billions of dollars, nineteen billion dollars
(05:53):
of which is due from OpenAI, which does not
have the money until it receives its second tranche of
funding in December twenty twenty five from SoftBank, and this
is contingent partially on their ability to convert into a
for profit entity, which, as mentioned, is extremely difficult and
extremely unlikely. It's unclear how many of the Blackwell GPUs
Oracle has had to purchase in advance, but in
the event of any kind of financial collapse at OpenAI,
(06:15):
Oracle will likely have to eat at least a billion dollars,
if not several billion dollars. And then we get to CoreWeave,
a company whose expansion is likely driven entirely by OpenAI
now, and which cannot survive without OpenAI fulfilling
its obligations, if it doesn't die first. Anyway, now, I've written and spoken a lot about
publicly traded AI compute firm CoreWeave, and it would
(06:36):
give me the greatest pleasure of my life to never think
or talk about them ever again. Nevertheless, I have to.
This is my curse. This is my curse. CoreWeave
has become my curse. Every time I think about this,
fuck. Okay. The Financial Times revealed a few weeks ago
that CoreWeave's debt payments could balloon to over two
point four billion dollars a year by the end of
twenty twenty five, far outstripping its cash reserves, and the
(06:59):
Information reported that its cash burn would increase to fifteen
billion dollars in twenty twenty five. As per its IPO filing,
sixty two percent of CoreWeave's twenty twenty four revenue,
a little under two billion dollars, with losses amounting to eight
hundred and sixty three million dollars, was Microsoft compute, and based
on the conversations I've had with sources, a good amount
of this was Microsoft running compute for OpenAI. Starting
(07:19):
October twenty twenty five, OpenAI will start paying CoreWeave
as part of its five year long, twelve billion
dollar contract, picking up the option that Microsoft declined. This
is not great timing, or maybe it's perfect timing, because
this is also when CoreWeave will have to start
making payments on their massive, stupid, multi billion dollar
DDTL two point zero loan mentioned in previous episodes. But really,
(07:40):
there's a newsletter if you want to hear me
go mad. You want to read me go mad? You
read my CoreWeave piece, because it really
drove me insane. Nevertheless, these CoreWeave payments, the ones
from OpenAI to CoreWeave that October, they're pretty
much critical to CoreWeave's future. This deal also suggests that
OpenAI will become CoreWeave's largest customer. Microsoft
(08:01):
had previously committed to spending ten billion dollars on CoreWeave
services by the end of the decade, but CEO
Satya Nadella added a few months later on a podcast
that its relationship with CoreWeave was a one time thing. Man,
Satya really, like, really fucking around there. Satya
don't love CoreWeave. Assuming Microsoft keeps spending at its
previous rate, so about one point, like, sixty six percent
(08:23):
of two billion dollars, whatever, and that's something that
isn't guaranteed, by the way, it would still only be
half of OpenAI's potential revenue to CoreWeave. Core
Weave's expansion at this point is entirely driven by OpenAI.
Seventy seven percent of its twenty twenty four revenue came
from two customers, Microsoft being the largest (and yes, I
just fucked up a number: it's sixty two percent), and
using CoreWeave's auxiliary compute for OpenAI. As a result,
(08:46):
its future expansion efforts, the theoretical one point three gigawatts
of contracted (and by the way, that means it doesn't exist yet)
compute at CoreWeave, are largely, if not entirely, for
the benefit of OpenAI. In the event that OpenAI
cannot fulfill its obligations, CoreWeave will collapse. It's
that fucking simple, and then the shock waves will ripple further.
Nvidia relies on CoreWeave for more than six percent
(09:06):
of its revenue, and CoreWeave's future creditworthiness to continue
receiving said revenue. Well, much of that is dependent on
OpenAI continuing to buy services from CoreWeave now,
and basically this is backed up in a comment I received from the
legendary Gil Luria, Managing Director and Head of Technology Research
at analyst firm D.A. Davidson and Co. I quote him: since
CoreWeave bought two hundred thousand GPUs last year, and
(09:27):
those systems are around forty thousand dollars, we believe
CoreWeave spent eight billion dollars on Nvidia last year.
That represents more than six percent of Nvidia's revenue in
twenty twenty four. He said last year, but I just
wanted to make it sound better. CoreWeave receives preferential
access to Nvidia's GPUs, though Nvidia kind of denies that,
and makes up billions of dollars of Nvidia's revenue. CoreWeave
(09:48):
then takes those GPUs, and then they raise debt using
the GPUs as collateral, as well as customer contracts. Then
they use the money they've raised to buy more GPUs
from Nvidia. You may think that doesn't sound right. I
am being completely serious; this is factual information. At this
point, Nvidia was the anchor for CoreWeave's IPO, and
CEO Michael Intrator said that the IPO would not
(10:10):
have closed without Nvidia buying two hundred and fifty million
dollars worth of their shares. Nvidia also invested one hundred
million dollars in the early days of CoreWeave, and
for reasons I cannot understand, also agreed to spend one
point three billion dollars over four years to, and I
quote the Information, rent its own chips from CoreWeave.
Fun fact: I can't find a single fucking mention of
(10:30):
CoreWeave in any of Nvidia's filings. Now, buried in
CoreWeave's S-1, the document every company publishes
before going public, was a warning about counterparty credit risk,
which is when one party provides services or goods to
another with specific repayment terms and the other party doesn't
meet their side of the deal. While this was written
as a theoretical, as in it could, theoretically speaking, come
(10:52):
from any company to which CoreWeave acts as a creditor,
it only named one: OpenAI. Now, as discussed previously,
CoreWeave is saying that should a customer, any customer,
but really they mean OpenAI, fail to pay its
bills for infrastructure built on their behalf or services rendered,
it can pose a material risk to the company. Now.
As an aside, the Information reported that Google, and someone's
(11:14):
going to email me there, so I just want to
get ahead of it, that CoreWeave is apparently in
advanced talks with Google to rent GPUs to it. It
also added another thing in this story, just so that
I don't have to hear from any of you:
Google's potential deal with CoreWeave is significantly smaller than its
commitments with Microsoft, according to one of the people
briefed on it, but could potentially expand in future years.
(11:34):
Do not come to me and claim that Google's going
to save CoreWeave. I'll be so mad. Anyway, even
with Google and OpenAI's money, CoreWeave's continued
ability to do business hinges heavily on its ability to raise
further debt, which I have previously called into question in a
newsletter that gave me madness, and its ability to raise
future debt is, to quote the Financial Times, secured against
its more than two hundred and fifty thousand Nvidia
(11:56):
GPUs and its contracts with customers such as Microsoft. Now,
any future debt that CoreWeave raises will be based
off of its contract with OpenAI, you know, the
counterparty credit risk threat that represents a disproportionate share of its
revenue I just mentioned, and also whatever GPUs they still
have left that they can get debt on. As a result,
a chunk of Nvidia's future revenue is dependent on
(12:18):
OpenAI's ability to fulfill its obligations to CoreWeave,
both in its ability to pay them and its timeliness
in doing so. If OpenAI fails, then CoreWeave
fails, and then that hurts Nvidia. Jensen's going to
have to go, he's going to have to go to
a cheaper leather jacketarium. And it gets worse. OpenAI's
expansion is dependent on two unproven startups, one of them
I just mentioned, who are also dependent on OpenAI
(12:41):
to live. With Microsoft's data center pullback and OpenAI's
intent to become independent from Redmond, future data center expansion
is based on two partners supporting CoreWeave (I know,
we'll get there) and Oracle. Now, I'm referring, of course,
to Core Scientific, which is the data center developer for
CoreWeave, and of course Crusoe, the data center developer
for Oracle. Now, if you were wondering, and I kind of hinted
(13:03):
at this earlier: how many data centers
do you think Crusoe has ever built? The answer is none.
And Core Scientific, how many do you think they've built?
The answer is also none.
These are the fucking companies underpinning the AI boom. I
also really must explain how difficult it is to build
a data center, and how said difficulty increases when
(13:24):
you're building an AI focused one. For example, Nvidia
had to delay the launch of its Blackwell GPUs because
of how finicky the associated infrastructure, so the servers and
the cooling and such, is for customers. This was for
customers that had already been using GPUs and therefore likely
knew how to manage the temperatures created by them. Also,
as another reminder, OpenAI is on the hook for nineteen
(13:44):
billion dollars of the funding behind Stargate, and neither
it nor SoftBank has that money. I just want to remind you of that,
because it costs so much money to build a fucking
data center. And imagine if you didn't have any experience
and effectively had to learn from scratch. How do you
think it would go building these data centers? Let's find out.
So let's start in Abilene, Texas with Crusoe and the
(14:07):
Stargate Data Center project. Now, Crusoe is a former cryptocurrency
mining company that has now raised hundreds of millions of
dollars to build data centers for AI companies, starting with
a three point four billion dollar data center financing deal
with asset manager Blue Owl Capital. This yet to be
completed data center has now been leased by Oracle, which
will allegedly fill it full of GPUs for OpenAI.
Despite calling itself, and I quote, the industry's first vertically
(14:29):
integrated AI infrastructure provider, with the company using flared gas,
a waste byproduct of oil production, to power
AI infrastructure, Crusoe does not appear to have built a
single AI data center, and is now being tasked with
building one point two gigawatts of data center capacity
for OpenAI. It's just so fucking stupid. Crusoe is the
sole developer and operator of the Abilene site, meaning, according
(14:52):
to the Information, that it is in charge of contracting
with construction contractors and data center customers, as well as
running the data center after it is built. Oracle, it seems,
will be responsible for filling said data center with GPUs,
as mentioned. Nevertheless, the project also appears to be behind schedule.
The Information reported in October twenty twenty four that Abilene
was meant to have fifty thousand of Nvidia's Blackwell AI
(15:13):
chips in the first quarter of twenty twenty five, and
also suggested that the site was projected to have a
whopping one hundred thousand of them by the end of
twenty twenty five. Now you can join me back here
in reality, because a report from Bloomberg in March twenty
twenty five said that OpenAI and Oracle were expected
to have sixteen thousand available by the summer of twenty
twenty five, with, and I quote, OpenAI and Oracle
(15:35):
expecting to deploy sixty four thousand Nvidia GB two
hundreds at the Stargate data center by the end of
twenty twenty six. That's very delayed. That's really delayed. Again.
How is it that I run a PR firm, and I record
a podcast, I write a newsletter, I have a book
I'm writing, I've got all this shit on, and
I'm the asshole who notices this? Anyway, as discussed previously,
(15:58):
OpenAI needs this capacity very badly. According to the Information,
OpenAI expects Stargate to handle three quarters of its
compute by twenty thirty and these delays call into question,
at the very least whether this schedule is reasonable, or
logical or even possible. And I actually really question whether
Stargate itself is possible at this point. But it can
get dumber because we're about to talk about Core Scientific,
(16:19):
and they are CoreWeave's friends. They're the people building
data centers for CoreWeave in Denton, Texas. Now, as
you can probably tell, I've written a great deal about
CoreWeave in the past. That got a monologue, got
(16:40):
a newsletter, and I got a therapy bill for it.
And specifically, I've written about their build out partner, Core Scientific,
a cryptocurrency mining company, yes, another one, that has exactly
one customer for its AI data centers, and you'll never
guess who it is. It's CoreWeave. Now here's a
few fun facts about Core Scientific. Core Scientific was bankrupt
last year. Core Scientific has never built an AI data center,
(17:02):
and its cryptocurrency mining operations were built around ASICs,
specialist computers for mining bitcoin, which led an analyst
to tell CNBC that said data centers would, and I
quote, need to be bulldozed and built from the ground
up to accommodate AI compute. That's the stuff. Core Scientific
also does not appear to have any meaningful AI compute
of any kind. Its AI slash HPC, which is high
(17:24):
performance computing, revenue represents a teeny tiny, teeny little percentage
of overall revenue, which mostly comes from mining crypto, both
for itself and other parties. Now, hearing all of this,
would you give this company your compute? Would you
think, these are the people that I am going to
(17:44):
call to build my data centers? If you said no,
you are smarter than CoreWeave, which has given its entire
one point three gigawatt build out to Core Scientific. Core Scientific,
it seems, is also taking on like one
point one four billion dollars of capital expenditures to build
these data centers, which, by the way, is not enough money.
But nevertheless, CoreWeave has promised to reimburse them for eight
(18:05):
hundred and ninety nine point three million dollars of these costs.
This is all from public filings, by the way. It's
also unclear how Core Scientific actually intends to do
any of this shit. While they've taken on a good
amount of debt in the past, five hundred and fifty
million dollars in a convertible note towards the end of
last year, this would be more debt than they've ever
taken on. Core Scientific also, as with Crusoe, does not appear
(18:26):
to have any experience building AI data centers, a point
I keep repeating because it's very important. These are the
companies behind the growth of OpenAI, except, unlike Crusoe,
Core Scientific is a barely functioning, recently bankrupt bitcoin miner
pretending to be a data center company. Crusoe, on the
other hand, is possibly also doing the same thing, but
is less egregious about it. Now, how important do you think
(18:47):
CoreWeave is to OpenAI exactly? Well, per Semafor:
CoreWeave has been one of our earliest and largest
compute partners, OpenAI chief Sam Altman said in CoreWeave's
roadshow video, adding that CoreWeave's compute power led to the
creation of some of the models that we're best known
for. CoreWeave figured out how to innovate on hardware,
to innovate on data center construction, and to deliver results
very, very quickly. Did it, though? But even if it did,
(19:11):
will it survive long term? Going back to the point
of the contagion: if OpenAI fails and CoreWeave fails,
so too does Core Scientific, and I don't really fancy
Crusoe's chances either. But let's take a step back for
a moment. We've been going so hard, haven't we? I've
got a genuine question, just for the fact finders out there:
does Microsoft book OpenAI's compute as revenue? Now, up
(19:34):
until fairly recently, Microsoft has been the entire infrastructure backing
OpenAI, but recently, to free OpenAI up to
work with Oracle and, you know, see other people, released it from
its exclusive cloud compute deal. Nevertheless, per the Information, open
AI still intends to spend thirteen billion dollars on compute
on Microsoft's Azure this year. What's confusing, however, is
whether any of this is booked as revenue for Microsoft.
(19:55):
Microsoft claimed earlier in the year that it surpassed thirteen
billion dollars in annual recurring revenue, by which it means
its last month multiplied by twelve, by the way, and
they said it was from AI. OpenAI's compute costs
in twenty twenty four were five billion dollars, and that's
at a discounted Azure rate, which works out to about
four hundred and sixteen million dollars
in revenue a month for Microsoft. It isn't, however, clear
(20:18):
whether Microsoft counts OpenAI's compute money, which is
really fucking weird. You'd think with all this money they're
making from this company, they'd be saying there was money
coming in. It's peculiar. I've yet to find a real answer.
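If you want to check my math on those two figures, here's the arithmetic as a quick sketch; the inputs are just the numbers quoted in this episode, and the variable names are mine:

```python
# Rough arithmetic behind the figures quoted above (billions of USD).
# These inputs are the episode's quoted numbers, not audited financials.

# Microsoft's "annual recurring revenue" claim: last month's revenue x 12.
claimed_arr = 13.0
implied_monthly_ai_revenue = claimed_arr / 12  # ~$1.08B a month implies $13B ARR

# OpenAI's 2024 compute spend on Azure, at its discounted rate.
openai_azure_annual = 5.0
openai_azure_monthly = openai_azure_annual / 12  # ~$0.417B, i.e. ~$416M a month

print(f"${implied_monthly_ai_revenue * 1000:.0f}M/month implies the $13B ARR claim")
print(f"${openai_azure_monthly * 1000:.0f}M/month of OpenAI compute revenue for Microsoft")
```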
Microsoft's earnings do not include an artificial intelligence section. No,
they're made up of three separate segments: Productivity and Business Processes,
(20:39):
which includes things like LinkedIn, Microsoft three sixty five and
so on; More Personal Computing, which includes Windows and gaming products;
and then Intelligent Cloud, including server products and cloud services
like Azure, which is likely where OpenAI's compute is included,
and where Microsoft books the revenue from selling access to
OpenAI's models, but not the OpenAI compute in question. As
(21:00):
a result, it's hard to say specifically where OpenAI's
revenue might sit. Even guessing Intelligent Cloud might not be right,
but based on an analysis of Microsoft's Intelligent Cloud segment
from financial year twenty twenty three Q one through its
most recent earnings, there was a spike in revenue
from twenty three Q one to twenty four Q one.
In financial year twenty three Q one, which ended on September thirtieth,
(21:23):
twenty twenty two, a month before ChatGPT's launch, the
segment made twenty point three billion dollars. The following year,
in FY twenty four Q one, it made twenty four
point three billion dollars, a nineteen point seven percent year
over year growth, or roughly four billion dollars. This
could represent the massive increase in training and inference costs
associated with hosting ChatGPT, and they peaked at twenty
(21:45):
eight point five billion dollars in revenue in financial
year twenty four Q four, before dropping dramatically to twenty
four point one billion dollars in financial year twenty five
Q one, and rising a little to twenty five point five
billion dollars in financial year twenty five Q two. I'm
so sorry, none of this is easy to read. This
is a plausible explanation: OpenAI spent twenty twenty three
(22:06):
training its GPT-4 Turbo model before transitioning to
its massive, expensive Orion model, which would eventually become GPT
four point five, as well as its video generating model Sora.
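To make those segment numbers a little easier to follow, here's the same arithmetic written out as a sketch; the figures are the ones quoted above, and the quarter labels are mine:

```python
# Microsoft Intelligent Cloud revenue as quoted in this episode (billions of USD).
intelligent_cloud = {
    "FY23 Q1": 20.3,  # quarter ending September 30, 2022, pre-ChatGPT
    "FY24 Q1": 24.3,
    "FY24 Q4": 28.5,  # the peak
    "FY25 Q1": 24.1,  # the dramatic drop
    "FY25 Q2": 25.5,  # the small recovery
}

# Year-over-year jump from the pre-ChatGPT quarter to one year later.
growth = intelligent_cloud["FY24 Q1"] - intelligent_cloud["FY23 Q1"]
yoy_pct = growth / intelligent_cloud["FY23 Q1"] * 100
print(f"FY23 Q1 -> FY24 Q1: +${growth:.1f}B ({yoy_pct:.1f}% year over year)")
```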
According to the Wall Street Journal, training GPT four point
five involved at least one training run costing around half
a billion dollars in compute costs alone. These are huge sums,
but it's worth noting a couple of things. First, Microsoft
(22:28):
licenses OpenAI's models to third parties, so some of
this revenue could be from other companies using GPT on Azure.
We've seen lots of companies launch AI products, and not
all of them are based on LLMs. Muddying things
further, Microsoft provides OpenAI access to Azure cloud services
at a discounted rate, as I've mentioned in the past,
and so there's a giant question mark over OpenAI's
(22:50):
actual contribution to the various spikes in revenue for Microsoft's
Intelligent cloud segment, or whether other third parties played a
significant role. Furthermore, Microsoft's investment in OpenAI isn't entirely
cold hard cash. Rather, it's provided the company with credits
to be redeemed on Azure services, kind of
like Chuck E. Cheese tokens. I'm not entirely sure how this
would be represented in accounting terms, and if anyone can
(23:11):
shed any light on this, please get in touch. Would
it be noted as revenue or something else? OpenAI
isn't paying Microsoft, or are they? Are they doing the
tech equivalent of redeeming air miles, or have they spent
a gift card on Azure? It really isn't obvious,
and is Microsoft doing some accounting bullshit here? I'm not
suggesting impropriety, not suggesting anything illegal. I'm just saying it's
(23:32):
insane that they have this company spending billions of dollars
theoretically on their services and it's just nowhere. Additionally, while
equity is often treated as income for tax purposes, as
is the case when an employee receives RSUs as part
of their compensation package, under the existing open Ai structure,
Microsoft isn't actually a shareholder, but rather the owner of
(23:53):
profit sharing units. This is a distinction worth noting. These
profit sharing units are treated as analogous to equity, or
at least in terms of open AI's ability to raise capital,
but in practice they aren't the same thing. They don't
represent ownership in a company as directly as, for example,
a normal share would. They lack the liquidity of a
share, and the upside they provide, namely dividends, is purely theoretical.
(24:14):
Another key difference: when a company goes bankrupt and enters liquidation,
shareholders can potentially receive a share of the proceeds after creditors, employees,
and so on are paid. Well, that often doesn't happen,
as in the liabilities generally can exceed
the assets of the company, but in many cases it's at
least theoretically possible. Given that profit sharing units aren't actual
(24:35):
shares, where does that leave Microsoft? This stuff
is confusing, and I'm not ashamed to say that I
just fucked up a word, and that complicated accounting questions
like these are far beyond my understanding. If anyone can
shed some light, drop me an email, buzz me on
Twitter or Bluesky, hit me up on clerk or
gorp, or post on the Better Offline subreddit. Someone might
take your wallet though. Anyway, back on track. I think
(24:57):
it's worth understanding the scale of the OpenAI vortex,
and how it's distorting the tech investment market, and why,
even without having failed, it represents a systemic risk. Without
OpenAI, American startup investment is flat, and even
with it, fewer startups are receiving investment. Crunchbase News
reported in early April that North American startup investment spiked
in Q one due to OpenAI, hitting eighty two
(25:18):
billion dollars. Great, right? Sounds great? This statement sadly has
a darker undertone. American startup investment was actually like
forty two billion in Q one twenty twenty five when
you remove the deal, which is appropriate because none of
the money is actually received by OpenAI yet, and
at best only ten billion dollars of it will be
received before December twenty twenty five. This quarter also included
a three point five billion dollar investment in OpenAI competitor
(25:42):
Anthropic, run by Dario Amodei, making the appropriate number
a paltry thirty eight point five billion dollars. Now, this
is still an improvement, though a marginal one, over the
thirty seven point five billion dollars raised in Q one
twenty twenty four. Nevertheless, Crunchbase News also has a far,
far darker story. Deal volume in American startups has
begun to collapse, trending downward almost every quarter. Now, deal
(26:05):
volume isn't the direct result of OpenAI's financial condition.
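Here's that funding arithmetic laid out as a sketch, using the Crunchbase News figures quoted above; note what the subtraction actually comes to:

```python
# Q1 2025 North American startup funding, per the Crunchbase News figures
# quoted in this episode (billions of USD; illustrative, not audited data).
q1_2025_total = 82.0
openai_round = 40.0      # announced, but mostly not yet received by OpenAI
anthropic_round = 3.5    # Anthropic's raise this quarter

ex_openai = q1_2025_total - openai_round
ex_openai_and_anthropic = ex_openai - anthropic_round

q1_2024_total = 37.5
print(f"Without OpenAI: ${ex_openai:.1f}B")
print(f"Without OpenAI and Anthropic: ${ex_openai_and_anthropic:.1f}B")
print(f"Improvement over Q1 2024: ${ex_openai_and_anthropic - q1_2024_total:.1f}B")
```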
The so called revolution created by OpenAI and other
generative AI companies' technology appears to be petering out, and
the contagion is starting to impact the wider tech sector.
It's important to understand how bleak things are: the future
of generative AI rests on OpenAI, and OpenAI's
future rests on near impossible financial requirements. I've done
(26:25):
my best to make this argument in as objective
a tone as possible, regardless of my feelings about the
bubble and its associated boosters. OpenAI, as I've said
before and argued countless times in interviews and podcasts and newsletters,
is effectively the entire generative AI industry, with its nearest
competitor being less than five percent of its five hundred
(26:46):
million weekly active users. Anthropic, Google, Microsoft, XAI, they're all
rounding errors in the grand scheme of things. But open
AI's future is dependent, and this is not an opinion,
this is an objective fact, on effectively infinite resources in
many forms. Let's start with the financial resources. If open
AI required forty billion dollars to continue operations this year,
(27:09):
it's reasonable to believe it will need at least another
forty billion dollars next year, and based on its internal projections,
will need at least forty billion dollars every single year
until twenty thirty, when it claims somehow it will be profitable.
And I quote the Information, with the completion of the
Stargate Data Center project. You may be wondering: how's that possible, Ed?
(27:29):
How do you think the Information wrote that down? Fuck no,
Jessica Lessin's too busy humiliating people she let go
by name on Twitter. Jessica Lessin, I like the Information.
I think you're a fucking asshole for how you treated
your people. I say it on my podcast, I say it
on Twitter. Anyway. Let's keep talking about some of these
resources that open ai is dealing with, specifically the compute
(27:50):
resources and expansion. Open ai requires more compute resources than
anyone has ever needed, and will continue to do so
in perpetuity. Building these resources is now dependent on two partners,
Core Scientific and Crusoe, who've never built a data center,
as Microsoft has materially pulled back on data center development
and has, as aforementioned, pulled back on two gigawatts of
(28:10):
data centers, and slowed or paused, of course, some of its
early-stage data center projects too, with TD Cowen's recent analyst
reports saying that the data center pullbacks were, and I quote
their March twenty six, twenty twenty five data center channel
checks letter because it's so good, "driven by the decision
to not support incremental OpenAI training workloads." That's the stuff.
(28:31):
In simpler terms, OpenAI needs more compute at
a time when its lead backer, which has the most GPUs
in the world, has specifically walked away from building it.
Even in my most optimistic frame of mind, it isn't
realistic to believe that Crusoe or Core Scientific can build
the data centers necessary for open AI's expansion, even if
soft Bank and open Ai had the money to invest
in Stargate today, which they do not. Dollars do not
(28:53):
change the fabric of reality. Data centers take time to build,
requiring concrete, wood, steel and other materials to be manufactured
and placed, and that's after the permitting required to get
these deals done. Even if that succeeds, getting the power
necessary is a challenge unto itself, to the point that
even Oracle, an established and storied cloud compute company run
by a very evil man at one point, to quote
The Information, "has less experience than its larger rivals in
(29:16):
dealing with the utilities to secure power and working with
powerful and demanding cloud customers whose plans change frequently." A
partner like Crusoe or Core Scientific simply doesn't have the muscle memory
or domain expertise that Microsoft has when it comes to
building and operating data centers. As a result, it's hard
to imagine, even in the best case scenario, that they
are able to match the hunger for compute the open
Ai has now. I want to be clear, I believe
(29:38):
open ai will still continue to use Microsoft's compute and
even expand further into whatever remaining compute Microsoft may have. However,
there is now a hard limit on how much of
that there's going to be, both literally, in what's physically available,
and in what Microsoft itself will actually allow OpenAI
to use, especially given how unprofitable GPU compute seems to
be, based on how every single company that isn't
(29:59):
Nvidia loses money running them. But really, and we're
coming to the end of this, which leads to a question,
(30:21):
how does all of this end? Last week, a truly
offensive piece of fan fiction framed as a report called
AI twenty twenty seven went viral, garnering press with the
Dwarkesh podcast and gormless childlike wonder from dope New York
Times reporter Kevin Roose, and reporter, I think, is a
fucking stretch. Its predictions vaguely suggest a theoretical company called
(30:44):
open Brain will invent a self teaching agent of some sort.
It's total bullshit, but it captured the hearts and minds of
AI boosters and other people without object permanence, because it
vaguely suggests that somehow, large language models and their associated
technology will become something entirely different. I don't like making predictions like
these, because the future, especially in our current political climate,
(31:04):
is utter chaos. But I will say that I do
not see, and I say this with complete objectivity, how
any of this bullshit continues. I want to be extremely
blunt with the following points, as I feel like both
members of the media and tech analysts have categorically failed
to express how ridiculous things have become. I will be
repeating myself, but it's fucking necessary, as I need you
(31:26):
to understand how untenable things are. SoftBank is putting
itself in dire straits simply to fund OpenAI once. This
deal threatens its credit rating, with SoftBank having
to take on what will be multiple loans to fund
this forty billion dollar round, and open Ai will need
at least another forty billion dollars a year later. This
is before you consider the other nineteen billion dollars that
(31:46):
soft Bank has agreed to contribute to the data center
project with Stargate, money it does not currently have available. Now,
OpenAI has promised nineteen billion dollars to the Stargate
data center project too, and again they do not have it,
and they need soft Bank to give it to them.
And again I've said it, and I'll say it again.
Neither of these companies have the money. The money is
(32:09):
not there, and open Ai needs Stargate to get built
to grow much further. I see no way in which
open ai can continue to raise money at this rate,
even if open Ai somehow actually receives the forty billion
dollars it's been promised, which will require it to become
a for profit entity, which I don't think it can
fucking do. While it could theoretically stretch that forty billion
(32:32):
dollars to last multiple years, projections say it will
burn three hundred and twenty billion dollars in the next
five years, or more. I can't see a realistic
way in which OpenAI gets the resources it needs
to survive. It will need an insane streak of good fortune,
the kind of which you only really hear about in
Greek poems or JoJo's Bizarre Adventure. You know, the more
cultured choice. But let's go through them. Somehow SoftBank
(32:54):
gets the resources and loses the constraints required to bankroll
this company forever. The world's wealthiest entities, those sovereign wealth
funds mentioned in the last episode and so on,
pick up the slack until OpenAI
reaches profitability, which is a huge assumption. It's also assuming
that open ai will have enough of these megawealthy benefactors
(33:15):
to provide it with the three hundred and twenty billion
dollars they need to reach profitability, which it won't. They'll
also need Crusoe and Core Scientific to turn out to
be really good at building AI infrastructure, which they've never
done before, which is very possible, I'm sure. And
then Microsoft will walk back its walk back on
building AI infrastructure and recommit to tens of billions of
(33:36):
dollars of CAPEX, specifically on AI data centers, and also
will give it to open Ai. And then, of course
Stargate's construction happens faster than expected and there are no
supply chain issues in terms of labor, building materials, GPUs
and so on. Now I don't know, I haven't checked
the news in the last three weeks, but is there
anything going on that might increase the costs of materials?
(33:58):
Probably not. Anyway, if those things happen, I'll eat crow.
I'm not particularly worried. In the present conditions, OpenAI
is on course to run out of money or run
out of compute capacity, and it's unclear which will happen first.
But what is clear is it's time to wake up.
Even in a hysterical bubble where everybody is agreeing that
(34:19):
this is the future, OpenAI is currently requiring more
money and more compute than is reasonable to acquire. Nobody, nowhere,
ever, anywhere, has raised as much money as OpenAI needs to.
And based on the sheer amount of difficulty that soft
Bank is having raising the funds to meet the lower tranche,
(34:39):
the ten billion dollar one of its commitment, it may
not actually be possible for this company to continue, even
with the extremely preferential payment terms months long deferred payments
for example that open ai probably has. At some point
someone will need a dollar. I'll give Sam Ortman some
fucking credit. He's found many partners the shoulder the burden
of the rock economics of open Ai. With Microsoft, Oracle,
(35:02):
Crusoe and Core We've handling the upfront costs of building
the infrastructure, and SoftBank finding the investors for its monstrous
stupid round, and the tech media mostly handling marketing for him,
which is really nice. Great job, everybody. He is, however, overleveraged.
Open Ai has never been forced to stand on its
own two feet or focus on efficiency, and I believe
the constant enabling of this ugly nonsensical burn rate has
(35:26):
doomed this company. OpenAI has acted like it'll always
have more money and compute, and that's kind of because
everyone's acted as if that would be the case. No one's
really called Sam Altman out on his bullshit. There are
some people, but really, no one in the mainstream media
has bothered. Really, Sam Altman has been enabled. OpenAI,
by the way, cannot just make things cheaper at this point,
(35:48):
because the money has always been there to make things
more expensive, as has the compute to make larger and
larger language models that burn billions of dollars a year.
This company is not built to reduce its footprint in
any way, nor is it built for a future in
which it wouldn't have access to infinite resources. Worse still,
investors and the media have run cover for the fact
that these models don't really do much more than they
(36:09):
did a year ago, and for the overall diminishing returns
of large language models writ large. Now, I've had many
people attack my work about open ai, but none of them,
not one of them. Nobody has provided me any real
counterpoint to the underlying economic argument I've made since July
of last year, that OpenAI is unsustainable. Now this
is likely because there really isn't one other than open
(36:31):
ai will continue to raise more money than anybody has
ever raised in history, in perpetuity, and will somehow turn the
least profitable company of all time into a profitable company.
This is not a rational argument. It's a religious one.
It's a call for faith. And it's disgusting to see
well-paid reporters with one hundred and fifty thousand subscribers
(36:53):
to their newsletters and a really shitty podcast with a
major news outlet constantly just ignore them. And I
see no greater pale horse of the apocalypse than Microsoft's
material pullback on data centers. Well, the argument might be
that Microsoft wants open ai to have an independent future.
That's fucking laughable when you consider Microsoft's deeply monopolistic tendencies,
and for that matter, it owns a massive proportion of
(37:16):
open AI's pseudoequity. At one point, Microsoft's portion was valued
at forty nine percent, and while additional fundraising has likely
diluted Microsoft's stake, it still owns a massive portion of
what is, at the very least, if you believe any
of this nonsense, the most valuable private startup of all time.
And we're supposed to believe that Microsoft's pullback, which limits
(37:36):
OpenAI's access to infrastructure it needs to train
and run its models, and thus, as mentioned, represents an
existential threat to the company. You're meant to believe that
this is because of some sort of paternal desire to
see open ai leave childhood behind to spread its wings
and enter the real world. Are you fucking stupid? Sorry?
I shouldn't be calling people stupid. I shouldn't. I really shouldn't.
(37:59):
But I am. More likely, Microsoft got what it needed
out of OpenAI, which has reached the limit of
the models it can develop, and which Microsoft, by the way,
already owns the IP of due to their twenty nineteen
funding round. There's probably no reason for Microsoft to make
any further significant investments other than just kind of throwing
a little cash in there, and then I imagine some
sort of tax dodge. I'm just guessing. It's also important
(38:19):
to note that absolutely nobody other than Nvidia is making
any money from generative AI. Core Weave loses billions of dollars,
open Ai loses billions of dollars, Anthropic loses billions of dollars.
And I can't find a single fucking company providing generative
AI-powered software that's actually making a profit. The only
companies even close to doing so are consultancies
providing services to train and create data for models, like
(38:40):
Turing and Scale AI, and Scale isn't even fucking profitable now.
The knock on effects of open AI's collapse will be
wide-ranging. Neither CoreWeave nor Crusoe will have tenants
for their massive, unsustainable operations, and Oracle will have nobody
to sell compute to, because they've leased that thing for
fifteen fucking years to one customer. Who else is going
to take that? Anyway, CoreWeave will likely collapse under
(39:02):
the weight of its abominable debt, which will lead
to a six, seven percent or more revenue drop for
Nvidia at a time when revenue growth has already
begun to slow. On a philosophical level, too, OpenAI's
health is what keeps this industry alive. Open ai has
truly the only meaningful user base in generative AI, and
this entire hype cycle has been driven by its success.
(39:22):
Meaning any deterioration or collapse of open ai will tell
the market what I've been saying for over a year:
that generative AI is not the next hyper-growth
market and its underlying economics do not make sense. But look,
I'm not saying this to be a hater. I'm not
saying this to be right. This stuff has driven me insane,
(39:43):
but I'm not doing it to be a pundit, to
be a skeptic, to be a cynic, to be someone
that hates because I want to hate. And I hate
them not because I think people like me because I
hate them. I hate them because I have brainworms. I
have something wrong with me inside my brain that tells
I have to be like this, and I have to
look at these things and I have to try and
(40:04):
find what's going on, otherwise I will be driven mad,
which is why I'll say if something changes, if I'm
wrong somehow, I promise you I will tell you exactly how,
exactly why, and what mistakes I made to come to
the conclusions I have in this episode and the episodes before.
But I don't believe that my peers in the media
(40:25):
will do the same when this collapses. But I promise
you that they will be held accountable because all of
this abominable waste could have been avoided. Large language models
are not on their own the problem. The tools capable
of some outcomes, doing some things, But the problem, ultimately
are the extrapolations made about their abilities and the unnecessary
drive to make them larger, even if said largeness never
(40:48):
really amounted to much. Everything that I'm describing is the
result of a tech industry, including media and analysts, that
refuses to do business with reality, trafficking in ideas
and ideology, celebrating victories that have yet to take place,
applauding those who have yet to create the things that
they're talking about, cheering on men lying about what's possible
(41:09):
so that they can continue to burn billions of dollars
and increase their wealth and influence for barely any fucking reason.
I understand why others might not have said what I've said.
What I am describing is a systemic failure, one at
a scale heretofore unseen, one that has involved so
many rich and powerful and influential people agreeing to ignore reality,
(41:30):
and that'll have crushing impacts for the wider tech ecosystem
when it happens. Don't say I didn't warn you. Thank
you for listening to Better Offline. The editor and composer
of the Better Offline theme song is Matt Osowski. You can
(41:51):
check out more of his music and audio projects at
Mattosowski dot com, M A T T O S O
W S K I dot com. You can email me at ez
at better offline dot com or visit better offline dot
com to find more podcast links and of course, my newsletter.
I also really recommend you go to chat dot where's your ed
dot at to visit the Discord, and go to r
(42:12):
slash Better Offline to check out our subreddit. Thank you
so much for listening. Better Offline is a production of
Cool Zone Media. For more from Cool Zone Media, visit
our website coolzonemedia dot com, or check us out
on the iHeartRadio app, Apple Podcasts, or wherever you get
your podcasts.