
October 23, 2024 38 mins

In this episode, Ed is joined by Paris Marx, author of the Disconnect Blog and the host of Tech Won't Save Us, to talk about the historic - and horrifying - push of hyper-scalers to build out and control vital parts of the world's infrastructure.

Tech Won't Save Us: https://techwontsave.us/episodes

Data Vampires Ep 1: https://techwontsave.us/episode/241_data_vampires_going_hyperscale_episode_1

Data Vampires Ep 2: https://techwontsave.us/episode/243_data_vampires_opposing_data_centers_episode_2

Paris on Twitter: https://x.com/parismarx 

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/zitron.bsky.social

https://www.threads.net/@edzitron

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
All Zone Media.

Speaker 2 (00:06):
Hello and welcome to Better Offline. I'm your host and
the single most punished man alive, Ed Zitron. Everyone hates me,
but the people don't hate the man who's joining me today.

(00:27):
I've got Paris Marx, the host of Tech Won't Save Us,
the writer of the Disconnect blog, and the host
of the brand new series he's about to tell us
about, Data Vampires. Paris, thank you for joining me.

Speaker 3 (00:38):
Absolutely, it's so fun to be on the show.

Speaker 2 (00:40):
So tell me about the new show. I'm excited.

Speaker 3 (00:43):
Yeah, it's, uh, you know, I think everyone has kind
of been paying attention a bit more to, like, the energy
use of data centers with cryptocurrency and all this
kind of stuff, and, you know, especially what powers generative AI,
and, you know, what's been kind of fueling the hype
for the past two years. And so in this series,
I wanted to go into why we're actually building so
many data centers, what is behind these things, how much

(01:05):
energy are they actually using, And on the one hand,
like the commercial impulses to do this, but also like
the kind of weird ideological reasons that these people want
to build these massive AI systems in the first place.
Relating back to all these like things about AGI and
you know, trying to make computers with human level intelligence
and all this kind of stuff where they eventually want
to merge their brains with machines. So it goes into

(01:28):
like the wide range of all these sorts of things
over four episodes.

Speaker 2 (01:32):
So talk to me about the scale of this. How
many of these data centers are they building? Like, give
the audience some idea, because from what I know,
it's not great.

Speaker 3 (01:43):
Yeah, absolutely, and it's quite a lot, right, especially when
you see it scale up over the past number of years.
So we've seen this like really significant escalation over the
past five years, in particular in the construction of these
data centers. So there was something like five hundred or so,
you know, about five years ago, and now we're looking
at more like a thousand that were completed by the

(02:04):
end of or in the early part of twenty twenty four.
We have many more than that now, and we know
that Microsoft, Amazon, and Google in particular are making these
huge investments to increase the number of data centers that
they're building and to build them more quickly. And so
you know, the thing is some people might listen to
this and say Okay, Yeah, but data centers have been
around for a long time, and I think that this

(02:25):
is an important distinction to make, right. Data centers as
we knew them were like a floor in an office
building that a company was using for their own purposes,
or you know, maybe even a specific room in the
basement or something. But what we're talking about in particular
with Amazon, Microsoft, and Google and this move to the
cloud and this kind of centralized computation is the hyper

(02:45):
scale data centers that exist on a scale that is
so much greater than what we used to have before
the cloud, and how we've really seen that escalate over
the past few years in particular. So that is the
specific problem, right. It's not that we have computation, and
it's not that there is some degree of centralized computation.
It's the scale at which these infrastructures exist, and then
the question of why we're actually building all these in

(03:06):
the first place.

Speaker 2 (03:08):
So why are we building them? What? Like, what are
they actually putting in there?

Speaker 3 (03:13):
Yeah, And it's an essential point, right, And I think
there are a few different ways to look at it, right,
Some computation is always going to be necessary for the
types of things that we're doing online. You know, there's
there's no question about that, right. I think it's fair
to say that we want some degree of Netflix, we
want to have easy access to our email, you know,
we want to be able to socialize with one another

(03:33):
on these various platforms that we use. And I think
that that is totally legitimate and that we should be
able to do those sorts of things, right. But then
you ask, okay, but what is really driving this significant
growth that we have been seeing over the past while.
And we see on the one hand, certainly that is
generative AI. The cryptocurrencies are a little bit separate from
that because generally that stuff is not being run on

(03:54):
say, like, the Amazon, Microsoft and Google servers. Those are
like specific things, I don't even think they'd let you, right,
So these are specific infrastructures being set up by like
crypto miners and things like that. So it's a little
bit different. But we can still kind of, you know,
look at the amount of energy use that's being put
into those sorts of use cases. But the other piece
that I think that we tend to you know, not

(04:15):
think about so much, especially when we talk so much
about generative AI, is this sort of you know foundation
that we have built the Internet on over the past
number of decades. And for a lot of these companies,
the business model is collect as much data as possible
so that then we can target ads, and we can
target product recommendations and all this sort of stuff to you,

(04:35):
And that like surveillance piece and storage of massive amounts
of data on all of us and everything that we
do online is also a big component of it as well,
which I think is underconsidered. And so my provocation
is like, do we really need to be collecting all
that data in the first place? That requires so much
you know, computation but also storage, which is driving the

(04:56):
creation and the building of so many of these hyperscale
data centers.

Speaker 2 (05:00):
Are they building them just for generative AI, or is generative AI
just kind of an excuse?

Speaker 3 (05:05):
Yeah, I certainly see it as an excuse. You know,
if we're really digging into it, it is a bit
of both, right. On the one hand, these companies are
trying to build a ton of data centers in order
to power generative AI, because we both know how computationally
intensive not just the training of those models is, but
also then using those products. As these companies are trying
to roll them out into so much of what we do,

(05:26):
so many of the services that we use.

Speaker 2 (05:28):
Right.

Speaker 3 (05:28):
But then the other piece of it is that even
before the generative AI moment like I was talking about,
over the past five years, we can already see that
there was this scale up in the number of hyperscale
data centers being built in particular, and so in
that way, I see generative AI as an excuse to
continue building these things that they were already building in
the first place. And this is like the foundational point here, right,

(05:51):
is that if you think about a company like Amazon,
Microsoft or Google that has this massive cloud business, and
especially Microsoft and Amazon, where, you know,
the profits from those businesses are so
important to their wider business model and to being
able to expand into so many different areas of business.
You know, there is an inherent incentive then, and you know,

(06:13):
we know how capitalism works. These businesses always need to grow.
They need to be making more profits in order to
keep shareholders happy. So if your business is, you know,
providing centralized computation at scale, you need the amount of
computation that we all collectively use to continue growing year
on year, and you know they want it to be
growing very quickly, and that means you're not just going
to need more hyperscale data centers, but you need

(06:35):
to sell people on more computation and to make the
things that we use more computationally intensive to justify, you know,
this kind of business incentive that's driving you.

Speaker 2 (06:46):
So it almost kind of sounds like they have just
been trying to find computationally expensive, or intensive even, things
so that they could build more ways to compute and
justify these expenses. It's kind of sickening, it pisses you off.
It's like, when I think about that, people say to me, oh,
you're angry. Why? I don't know why other people

(07:06):
aren't angry. I feel like you and I are about
as angry as each other about this.

Speaker 3 (07:11):
Maybe we express it in different ways.

Speaker 2 (07:12):
Sometimes, perhaps. Agreed, yeah, you're a little nicer than I am.
Perhaps it's just frustrating because I was just on a
podcast actually just before this, and I was talking to
them and they were talking about the promise of AI.
But when you get past the promise, it's really just
kind of shit, like, it's mediocre, and they're building
all of this infrastructure for nothing. And one theory I

(07:35):
have is that it's not being used. I don't know
if you've seen anything that suggests that, or, I mean,
you're likely working off of public documents, I'm guessing. But
what if this is for nothing? What if they've just
built a bunch of data centers for no reason.

Speaker 3 (07:51):
Yeah, And that is something that I've talked to a
number of people about, right, because I'm trying to understand
that question as well. And I think that there's two
potential paths that this takes. So one of them is
that maybe they don't end up using some of these
data centers that they're currently building, or they scale back
some of the projects that they're currently planning to build.

Speaker 2 (08:09):
Right.

Speaker 3 (08:09):
And so, you know, my kind of example to point
to for that being a potential pathway is you know,
remember in the early kind of probably year or eighteen
months of the pandemic, there was a lot of people
who were using like Amazon to get more things delivered.
And as a result, Amazon kind of created this plan
to build a ton more warehouses than they were actually

(08:32):
going to need. And as like you know, people went
back to kind of shopping a bit more normally. As
you know, the threat or as the lockdowns ended and
things like that, what you saw is the demand for
Amazon or the demand that Amazon had projected decreased, and
so they canceled a bunch of warehouse projects and expansions
across the United States and their wider warehouse network because

(08:54):
they felt that they weren't actually going to need all
that capacity that they thought they were going to. So
that's one potential path. But I think, you know, because
of what we were saying, and because of what
I was explaining about, you know, this need to
grow the amount of computation that we are, like, collectively using,
I think that there is less of a chance with data
centers that that is the pathway that we see, and instead,

(09:17):
I think that the number of data centers
that these companies are creating, if generative AI falls off,
which, I think, you know, we both think is
going to happen ultimately, that there's going to be
this collapse, this crash, I think they will ultimately try
to find some other use case to justify building out
that computation because I think that that is like the

(09:38):
inherent project that they want to realize. On the one hand,
on the commercial side, but also because the people who
are running these companies really believe that everything that we
do in our society needs to have computation like inserted
in there somehow, and that this is the way that
we build like the future right.

Speaker 2 (09:54):
Kind of makes me think of Peloton. Not sure if
you remember the twenty twenty one stories of Peloton, it was
supposedly the next trillion dollar company. Everyone was furiously
excited about Peloton, and then the moment people got allowed
outside again, it started to kind of fall apart. Well, okay,
I'm getting my dates wrong, because twenty twenty one was
when it started to fall off. But nevertheless, I have

(10:14):
to wonder if that could happen here, because while you
say they'll come up with a reason, what is it
and how does that lead to money? I just had
Zephyr Teachout on and we were talking about this.
It's like, what if there is no plan? What if
they don't have one, other than bigger? Because based on
what the series you're doing is about, it seems

(10:36):
like they're committed to this. At least, like, this is,
they need to do this to prove that they are cool.
Rather than researching or developing things, it almost feels as
if they think this is inevitable.

Speaker 3 (10:47):
That's what I feel, right, And part of that comes
not just from the commercial imperative, but also like these
broader ideological things that I'm sure that you've discussed on
your show and that I've been discussing with people as well.
But when you hear people like say Sam Altman or
Eric Schmidt or you know, Elon Musk talk about how
ais are going to become so intelligent, and you know
Sam Altman specifically saying that he wants to make sure

(11:09):
that we, like, all merge our brains with machines and
all these sorts of things, Like I think that suggests
also like the direction that these people want to go,
not because you know, not just because of commercial reasons,
not just because they think they're going to make you know,
more profits or try to realize some kind of world
that they want to realize this way, but because they
think that if this is not the direction that we

(11:30):
take as a society, then you know, the future that
they think is the one that we need to realize
in order to have this you know, wonderful, glorious future
where humanity exists forever and we colonize all these planets
and all this kind of stuff. If we don't take
this direction, if we don't develop these incredibly powerful ais,
then in their mind, this isn't going to happen. So
it feels like, yes, there's the commercial element of this, right,

(11:53):
They're trying to drive investment with this AI boom
by promising so much about it. But like ideologically, in
their kind of minds and worldviews, they
think that this is the direction that we have to
go or we're basically all doomed, even though in the
process they're kind of dooming us by destroying.

Speaker 2 (12:08):
At the same time, destroying the environment to make sure we
don't doom ourselves. That's the Eric Schmidt thing. Yeah, so
let's talk costs. Yeah, what is being put into this?

(12:28):
Because I have ballpark numbers, but maybe you have some
better ideas. How much is actually going into this?

Speaker 3 (12:33):
Yeah, so it's hundreds of billions, let me
get that right, if not trillions of dollars, right? And
I can't remember the exact figures off the top of
my head, but over the past year, say from June
of twenty twenty three to July of twenty twenty four.
Microsoft has been plowing hundreds of billions of dollars into
data center expansions around the world. Amazon committed I believe

(12:57):
it was one point five trillion dollars early this year
or last year to data center expansion, with about five
hundred billion of that going out in the short term.
So these are like massive numbers. We know that Microsoft
and open Ai have talked about building a one hundred
billion dollar data center complex that would need nuclear energy
in order to power it, because it would be so energy intensive. Yeah, right, exactly.

(13:20):
So like these are the kinds of numbers that we're
talking about, and I feel like these are the types
of numbers that we often hear, like, you know, governments
talk about when they're making, like, massive investment
projects or something, right, not like private companies. But because
these private companies are so huge, it's like they can
build out these infrastructures that have, you know, such a
huge impact, not just in communities,

(13:43):
but, like, nationwide, globally, because they are so significant.

Speaker 2 (13:47):
Almost feels like a kind of authoritarianism. It feels like
they're building out a new governmental system that they control,
that they mete out, which includes, I guess, a
vampiric element to it as well, where they control the
power of America, even though power is theoretically a civic good,
like a civic utility.

Speaker 3 (14:06):
Well, it definitely feels that way, especially when you look
at like the politics that they're embracing too, right where
you have so many of these influential people in Silicon
Valley not just embracing like Donald Trump, but embracing this
form of far right politics that is, you know, gaining
steam in a number of countries around the world that
is explicitly anti democratic. You know, Elon Musk is talking

(14:27):
a lot about that these days and saying
that he thinks the Democrats are going to end democracy. Meanwhile,
like Donald Trump is saying quite explicitly that you know,
he wants to impinge on democratic rights. And these people
have no problem like working with anti democratic leaders around
the world as long as they see it as the
way to like advance not only their kind of personal

(14:50):
interests and to make sure their companies do well, but
also again to try to realize this like type of
future that they think is desirable. I was just recently
talking to Julia Black, who's a reporter at The Information,
about this, and she did a profile on Curtis Yarvin,
who is, like, you know, this
guy who, like, calls himself a dark elf, but, you know,
is one of the co-founders of the Dark Enlightenment,

(15:10):
this, like, you know, this political movement that
wants to see a monarchy established in the United States,
that's structured more like, you know, the CEO of a
corporation, to have, you know, the United States run as
an authoritarian state. And we know that Peter Thiel and
Marc Andreessen and a ton of these guys, like, support
these ideas, so, you know, like, you can

(15:32):
see that with the data centers, you can see this
like concerning power grab as these infrastructures expand and they're
in the hands of these major tech companies. But even
when you look at like the leaders of these companies themselves,
you can see that they very much want to make
sure that, you know, the power of the people through
democracy is curtailed so that they can do whatever they want.

Speaker 2 (15:51):
And if you want to know more about Curtis Yarvin,
somehow Behind the Bastards has had Ed Helms on to
talk about Curtis Yarvin. Want to hear comedian Ed Helms
talk about one of the worst people alive? Like, definitely
a bottle popper. When he pops his clogs, we will

(16:11):
all be celebrating. So has there been any pushback against
these data centers you've seen, either from governments or people. Yeah.

Speaker 3 (16:19):
Absolutely, And that's part of the reason that I wanted
to make the series and also like why it came
much more on my radar because I started hearing these
stories of pushback happening really around the world, right. And
it started with just hearing like the story of the
Dalles in Oregon, where these people were trying to find
out the amount of water that Google's data centers were using.

(16:41):
And then Google or sorry, then you know, there was
actually a lawsuit because Google wouldn't share that information and
it took a year for them to share it with
the Oregonian, which is the local newspaper, and they found
that over the past few years, Google's water use, I
think it was over the past five years, Google's water
use had tripled, you know, just in that city alone. Yeah,
so these are really significant developments, right, But then I

(17:03):
started to hear about how, say in Phoenix or northern
Virginia or many other parts of the United States, there
were also growing concerns about data centers. And then I
started to learn about in Ireland how now twenty one
percent of all electricity goes to data centers, and that's
causing concerns about the ability of the power grid to
keep up, you know, but also about whether you know
it makes sense to be allowing this many data centers

(17:24):
to be built. You know, concerns in Spain about the
amount of water usage in areas that are increasingly facing droughts.
You know, concerns in France and the Netherlands, but also
in parts of Asia, in South America, where communities in
Chile were saying like, if you build this massive Google
data center, will we still have running water into our homes,
and Google not being able to say like, yeah, definitely,

(17:46):
we can make sure that will happen, and so then
they campaigned to try to stop it. Like, around the world,
you're seeing the opposition to these things grow, and I
would say part of the reason for that, going back
to what I was saying earlier, is that, you know,
it's not just the scale of these structures, but it's
because these companies are trying to build out so many
more of them around the world, and when they get
located in these communities, once one is established, they usually

(18:09):
try to build more around it to cluster them, right,
because there are usually beneficial reasons why they've located one there
in the first place. And so then you have these
increasingly, you know, greater demands being placed on the
power grids and the water infrastructure of these communities, and
eventually it reaches the point where many of them
start to break and start to say, like, wait, this

(18:30):
doesn't make sense anymore. You're threatening our own access to
electricity or to water or whatever. And so, you know,
that is why I, you know, I've been paying so
much more attention to this issue because I think it's
something that we're all going to be hearing much more about.
And I think that the earlier that we do start
to take notice of these problems and start to push
back on them, you know, hopefully the earlier we

(18:53):
can start to curtail these effects and make sure that
you know, these companies can't just get away with doing
whatever they want.

Speaker 2 (18:59):
Have we seen any situations where they have actually stopped
people accessing power or water?

Speaker 3 (19:06):
Yeah, So in a few places we have seen like
the grids actually be threatened. So for example, in Ireland
now the grid operator in the winter often has to
issue these amber alerts to basically say to people like,
reduce your energy consumption or we're going to start doing
rolling blackouts. There were concerns. There's a story that Karen

(19:28):
Hao wrote in The Atlantic about what was going on
in Phoenix, and one of the biggest concerns there,
of course, is, you know, Phoenix has a lot of desert. Yeah,
Phoenix experiences drought a lot. And so as more and
more data centers are being built there because the energy
tends to be cheaper and tends to have
a higher kind of percentage of renewables, is

(19:50):
my understanding that they're going there regardless of the water impacts,
and so there are growing concerns about people's ability to, you know,
access water when these data centers need so much, despite the
drought conditions that so many people are facing. And
we know that, you know, governments tend to turn off
water to the populations before the businesses

(20:10):
on these sorts of things.

Speaker 2 (20:11):
Yeah, it's truly awful like that. It's why, every time
I think of stuff like that, I start getting a
bit crazy, because it's just like, you'd think that they
would say, well, actually, what we want to do is
turn it off for the business. No, no, no, we
must make sure the businesses can keep generating AI Garfields.
If we don't have Garfields with AK forty sevens, well

(20:34):
what use is being able to drink water?

Speaker 3 (20:37):
Exactly. One of the wild things too, that I was
learning about in Ireland in particular, is that so you know,
like I was saying, there's twenty one percent of all
the electricity now in that country is going to data centers.
It's projected to be about a third by twenty thirty.
And so for a while, the government there, or at
least in certain jurisdictions of the country, kind of stopped

(20:59):
connecting data centers to the grid, stopped making new grid connections,
because you know, they were like, there's not enough power
to supply these infrastructures. And so what these data centers
started doing instead was to build local methane gas generation
facilities, to bring in the gas themselves to power
the data center, because they couldn't connect directly to the grid.

(21:20):
And then, of course, you know, you can imagine
all the additional emissions that that creates with this kind
of, like, local gas generation infrastructure, and that's one of
the things that's contributing to Ireland, you know, not
being able to meet its climate targets, or those not
really being in reach. One of the local TDs, or
members of Parliament, who I spoke to there basically told

(21:41):
me that, like, there is renewable energy being added to
the grid in Ireland, but the problem that we face
is that the energy demand is increasing so rapidly
that you know, the renewables just go to powering the
data centers and we can't turn the fossil fuels off.
And we're seeing a ton of stories about that in
the United States as well, where yes, renewables are being
added to the grid, but the fossil fuels are not

(22:02):
being turned off next to it, and in some cases
fossil fuel infrastructure is actually being reactivated. Or there was
a story, and I can't remember if it was Bloomberg or
the Financial Times recently that basically said the United States
is investing in new fossil fuel infrastructure at the fastest
rate it has in years, which is.

Speaker 2 (22:18):
Like Northern Virginia, the data center alley. That's a very
depressing thing. I want to die. My question is, why
don't we, I know this is perhaps a little bit
blunt force, but why are we not making them pay
to upgrade the infrastructure? Why is the government not
just being like, you want this shit? Go and build
it for us, give us the money, we'll do it.

(22:40):
Is that happening? I know we've seen this with nuclear power plants, but.

Speaker 3 (22:45):
In some places that is happening, actually. So in The Dalles,
for example, in Oregon, you know where I think maybe
most people probably heard the story of that because Google,
you know, was trying to hold back its water use
data and whatnot, and eventually had to share it.
But in that case, you know, they have made an
agreement with the local council in order to upgrade their

(23:07):
water infrastructure so that they should have more water available. Again,
that doesn't mean that the community isn't still concerned about
water access and what's going to.

Speaker 2 (23:17):
Happen. It's finite. It's not an unlimited amount of it.
More infrastructure isn't going to help if we run
out of it.

Speaker 3 (23:24):
Yeah, that's a pretty fundamental issue, right.

Speaker 2 (23:28):
How do you feel about the nuclear power side? Because
I'm kind of fifty-fifty. I like nuclear power,
I think, but
it seems like they're privatizing it, which is not solving
the problem.

Speaker 3 (23:40):
Yeah, I would say I'm probably more on the skeptical
side of nuclear power, and that's for a few reasons.
Where nuclear power exists, I think it doesn't make any
sense to turn it off, right because that's going to
be far better than any kind of fossil fuel generation
that we're doing, and that should be kind of one
of the last things that we actually target for replacement
with renewables or whatnot. Like, I don't totally agree with, say,

(24:02):
Germany turning off its nuclear energy and going back to coal.
I think that that is a mistake. But I'm more
skeptical of investing in nuclear at this point and treating
it as you know, like a climate response, because we
know that nuclear energy is not only very expensive, but
takes so much time to like set up a new

(24:23):
nuclear plant, And it feels like at this point, when
we've seen the cost of installing solar and wind energy
declined so much in recent years, that it feels much
more kind of not just cost effective, but much more
rapid to just invest it in, you know, building out
large gale renewables instead.

Speaker 2 (24:41):
Renewables don't generate power quite as fast.

Speaker 3 (24:44):
Well, possibly. You know, you can set up
the battery storage facilities and things like that to store
things for the times when it's not generating. Yeah, that's
kind of the way that I see it.

Speaker 2 (25:08):
It's so funny. I think something's just come to me
with this as well that really pisses me off, which
is, I know, unusual for me. You
just mentioned battery storage. What if they put all the
money into investing in stuff like that? What if all
of this money could go into inventing that? That would
be my... Casey brought this up recently. It's like
one of the very clear places money could go that
would be very good. Like, if we had massive battery

(25:32):
storage for power, this would solve many problems and actually
probably create new things we could do; we could
genuinely do amazing things in the world.
Even describing it now, I feel more excited about this
than generative AI. But it almost feels like they're kind
of lazy, that they don't want to solve the actual
problems to get to the point, that they just want

(25:52):
to build more and keep doing the bullshit they've been
doing for years.

Speaker 3 (25:56):
I feel like part of it is profitability as well, right? Like,
when you think about investing in, like, so-called tech
or generative AI or what have you, whenever we have
investors thinking about tech businesses, we're thinking about these kind
of really rapid takeoffs in the amount of money that
they're going to make, that the chance for these, like,
really significant payoffs is there if the company really works.

(26:16):
And so you often have these companies trading at multiples
that are far above say, what a traditional company would
trade at when it goes public. Right Whereas if you
think of like a more traditional type of company, the
possibilities, the chances that you're going to get this
massive payoff are far lower, and so there's less of
an incentive to put your money into that type of place,
when say, some sort of tech business is going to

(26:38):
have you know, this much greater chance of having this
huge payoff. And so I think that that is like
one place where you know, obviously we live in a
capitalist system where our incentives are kind of misaligned, And
it's one of the things that I find quite silly,
where like, you know, the United States is having this
big feud with China now and very concerned about the
you know, the ability for Chinese tech companies and whatnot

(27:01):
to compete with US companies on the global stage automakers
and things as well. But one of the reasons that
solar energy is so cheap and one of the reasons
that we've had this like significant expansion in evs. You know,
we often point to Elon Musk for example, is like,
you know, the one who deserves all the credit for that.
But the Chinese have really been successful in bringing down

(27:21):
the costs of those types of technologies solar panels and
batteries and things like that in particular. And yeah, you know,
I think that we should be trying to like build
on that rather than just trying to like exclude the
cheaper stuff from our markets.

Speaker 2 (27:35):
Well, I fundamentally disagree, Paris, because I don't want any Chinese
companies in here stealing my data, tracking Americans, using that
to monetize them somehow and manipulating them. That's
for American businesses. We keep all surveillance capitalism in the
US of A. It just pisses me off as well,
because, look, I'm not getting into geopolitics, but it

(27:58):
feels like some of the AI boom is even driven
by that Sinophobia, the sense that if we don't build it,
the Chinese will build their AI and their garfields will
be even bustier than ours. And it's just frustrating because
I don't know about working with the Chinese, I'm not
going to get into that. But also it feels like
a dumber, stupider world. We have another country that's

(28:18):
building things fast, probably in ways that we might not
want to do labor-wise, I don't know. But nevertheless,
it feels like all of this rapid expansion,
all of this shit we're supposedly building, doesn't actually seem
to be innovation. It just is capitalist sprawl.

Speaker 3 (28:39):
Yeah, I definitely agree. Right, it feels like Silicon Valley
left innovation behind quite a while ago. Like I think
that you can very genuinely say that in the early
days of the Internet there was innovation going on, right,
whether it was in you know, software development, but also
in hardware development too, right, you know, the emergence of
the mobile phone, and you know, we can even say
the iPad and those sorts of things, but it feels

(29:01):
like now, you know, those types of developments, those innovations
have matured, and it does feel like the industry is
you know, kind of sort of trying to grope for
whatever might come next, but really failing in doing that
because you know, their profits and their whole businesses are
tied up in what is currently successful at the moment,

(29:22):
and I think a lot of them don't want to
disrupt what is working for them and want to protect
these what are now basically legacy businesses that they have
built up and that they're now kind of milking for cash, right,
And so I don't think that we should be looking
to the Googles or Apples or Amazons of the world
for innovation and for the path forward for what is

(29:42):
going to come next. And I think that even like
if you think back to like the Internet era and
things that came before, you know, often what a lot
of people have observed. You know, Mariana Mazzucato has this
great book, I think it's called The Entrepreneurial State, that
goes into this as well. How a lot of the
things that these companies were able to launch and able
to make so much money from were ultimately things that

(30:03):
were being developed in the public sector and that they
were able to privatize and make a lot of money
off of.

Speaker 1 (30:09):
Right.

Speaker 3 (30:10):
Obviously, the Internet was a public innovation before it was privatized,
and all these companies could use it and commercialize it
and whatnot. Right, And I feel like something has broken
down there as well, where so much public research seems
to be focused on a very early stage and like
what is the commercialized potential of this, rather than just saying,
forget about commercialization for a while and let's just like

(30:30):
work on these things and see if it goes anywhere.
And it just feels like in general, like you know,
our deep commitment to capitalism in our society has broken down.
Whereas before, like it used to be measured a bit
because there needed to be like certain deliverables for people
and you know, a different kind of an expectation. But
at this point we've just gone so full tilt on,

(30:51):
like whatever makes profit is what we need to do,
that we've lost the things that ultimately contribute to that
in the long term rather than just focusing on the
short term.

Speaker 2 (30:59):
And it makes me think about, like, giving things time
to build, because generative AI as an idea, you have
to wonder, if they'd left it alone for another five
to ten years without doing this, whether it might have
actually been good, if there was potential for this. Because
I think, with large language models, the on-device
stuff I think is cool. But nevertheless we're not

(31:19):
talking about that. But it's kind of, instead of
investing in public things or nonprofits that actually build the
building blocks that make innovation happen, we've allowed companies like
Qualcomm to vacuum up various codecs and standards to the
point that most of them are owned by one company.
And we don't really have anyone

(31:40):
in power anymore who understands what the fuck's going on.
So we pile all that cash into something that kind
of looks like the future. Because I think about the iPhone.
I was talking to someone about Jim Covello and the
generative AI paper from Goldman earlier, and he was saying
how one of the big things that made the smartphone
revolution happen, one of the things that they knew would
bring it in when this happened, was

(32:03):
the ability to have device-level GPS, and of course
the chips that support it, and then someone would need to
build the software layer, which is where Apple came in,
and then Android to some extent. And we don't have
that building block. We have one thing. We have generative AI.
We have transformer-based models, and we're going to put
all the money into that in the hopes that it works. No
one's even thinking of what the next devices might look like.
I can't stop thinking about batteries now, because that really
is it. If we had a battery that could
power, I don't know, I'm being a bit wanky here,
a city. Actually I'm not being wanky. This is less
wanky than what Sam Altman says every day. That,
or incredibly small batteries which are powerful, that would
enable all sorts of things. Edge AI is exciting, but they've

(32:47):
been working on it for around ten years. None of
the people in power seem to actually be aware of
how good things are built, which I guess explains the
data center expansion, because if you think of it like
a dumb fuck, if you're like, huh, how do we
make money? We got those data centers right, Well, if
we build more of them. Yeah, what if they're really big,

(33:09):
then the money will come out. I'm just, are you worried?
I'm a bit worried about the entire tech economy at
this point, let alone the environment.

Speaker 3 (33:18):
I'm worried about it all, you know, I'm less worried
about the tech economy in the sense of like will
they be profitable enough, will they make their money? Like
I don't really care about that. That's not like it's
something that's important to me, and I know that's the
same for you. But it's like I do worry about
where it's going, because there are broader, like, societal impacts
to everything that they do. Right on the one hand,
because of like the expectation that we need to adopt

(33:39):
so many of these things, but also because our governments
so much are supporting and you know, willing to push
out and not regulate effectively whatever it is that they
do until it's too late. And I feel like, you
know what you're talking about with generative AI even you know,
and I'm not going to claim to be, like, you know,
the most knowledgeable person in the world about the technical
angle of that, but when you look at what, say,

(34:00):
open AI was trying to do, and what all these
other companies have kind of chased after is they were
trying to build like the general foundation model that could
do virtually everything right, and you speak to AI researchers
or you know, I've spoken to some AI researchers who say, like, yes,
that is very energy intensive, that is going to take
a lot of power, energy data in order to.

Speaker 2 (34:21):
Make it work.

Speaker 3 (34:21):
But you can like train very tailored, much smaller models
that are not nearly as computationally or energy intensive as
the direction that these companies have chosen to go in.
But we are not doing that, you know. And I
think for a few reasons, mainly because on the one hand,
there is like the expectation of scale or the desire
of scale that comes with these general foundation models. I

(34:43):
think that there is the other side of it where
it's very beneficial to the tech companies that exist, you know,
the cloud giants, because if you're competing on this scale
of general foundation models that need so much computation in
order to train that, it's very difficult for smaller companies
to, like, compete on that level. But then I think it
also plays into that ideological angle of it that I

(35:04):
was talking about earlier, where you have these people who
are leading the tech industry who fundamentally believe that we
need to build like you know, AI with human intelligence
and eventually merge our brains with machines. And so if
we're not building these massive general models, then we're not
getting closer in their minds to achieving this, which I
think is ultimately something that's never going to get built

(35:26):
because I don't think it makes sense. Actually, I think
it's just science fiction, but I think that is like
part of what's driving these people too.

Speaker 2 (35:33):
So to finish us off, So you've got two episodes
out right, now, what have we got to look forward
to with the rest of the series?

Speaker 3 (35:39):
Yeah, absolutely, you know. The first one looked into
like what these data centers are, where they're coming from.
The second one looked into what this community opposition that
we're seeing looks like. The third one that will you
know come out soon, looks at the climate impacts of
generative AI and the way that generative AI is helping
to fuel this further construction of hyper scale data centers.
And the fourth one looks into the broader impacts of

(36:02):
this, whether we can think about a different way of
approaching this problem, and these ideological, you know, issues and
predispositions that these tech billionaires
are wound up with, that they are trying to push
on the rest of the world, and how that relates
to this anti-democratic politics that so many of them
are adopting.

Speaker 2 (36:21):
Well, Paris, it's been such a pleasure having you on. Where
can people find you?

Speaker 3 (36:26):
Absolutely, you know, they can find the podcast Tech Won't
Save Us on any podcast app where they like to listen.
I'm on, you know, most of the social media, the
tech-based ones at least, where you can find me
at parismarx.

Speaker 2 (36:36):
Thank you so much, Paris, thank you for
joining. You've been listening to Better Offline. I'm the most
punished man alive, Ed Zitron. You'll now get exactly the same
message as this, but slightly different, and you'll get mad
at me, and you'll email me because it's too samey.
Anyway, you're gonna hear it now. Thank you for

(37:00):
listening to Better Offline. The editor and composer of the
Better Offline theme song is Mattosowski. You can check out
more of his music and audio projects at mattosowski dot com,
M A T T O S O W S K I
dot com. You can email me at ez at better
offline dot com or visit better offline dot com to
find more podcast links and of course my newsletter. I

(37:21):
also really recommend you go to chat dot wheresyoured
dot at to visit the discord, and go to
r slash Better Offline to check out our reddit. Thank you
so much for listening.

Speaker 1 (37:32):
Better Offline is a production of cool Zone Media. For
more from cool Zone Media, visit our website cool Zonemedia
dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.
