Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
You're listening to Comedy Central.
Speaker 2 (00:07):
From the most trusted journalists at Comedy Central.
Speaker 1 (00:10):
It's America's only source for news. This is the Daily
Show with your host, Jon Stewart!
(00:55):
We've got a good one tonight, unbelievable. I'll be speaking to
the head of the FTC, Lina Khan. It's
going to be unbelievable.
Speaker 3 (01:02):
But first, as many of you are aware, the news
has been pretty bleak recently, the past two to three hundred
years. Listen, but this weekend there was one story that
was so disturbing, so dark, even the news couldn't handle it.
Speaker 2 (01:24):
In our editorial discussions this morning, we were asked not
to show the image from this video because of its
violent and disturbing nature.
Speaker 3 (01:31):
Video which we are intentionally choosing not to show you.
Speaker 1 (01:34):
We're not going to show because of how disturbing it is.
Speaker 4 (01:37):
I was extremely disturbed to see.
Speaker 1 (01:39):
This horrible, horrible, violent imagery, violent and dehumanizing imagery. We're
only going to show you a clip of this briefly.
All right, that's enough, let's take it down. I didn't
get to see it at all. It's got to be devastating.
Speaker 3 (01:59):
News channels show images from Ukraine, from Gaza, from natural disasters.
Speaker 1 (02:03):
They get through them dispassionately.
Speaker 3 (02:06):
I can't imagine how devastating this footage must be. Former
President Donald Trump shared a video, this one on his
Truth Social account, featuring an image of President Joe Biden
hogtied on the back of a pickup truck. That's what
(02:39):
was so disturbing and dehumanizing. You wouldn't show it on television.
And an airbrushed Biden decal on the back of a truck.
Are you the same networks that show reruns of 9/11
every year?
Speaker 1 (02:55):
I mean, I don't think it's great.
Speaker 3 (02:56):
That Trump is posting things like this, but it's not
like people really think Joe Biden was tied up in
the back of the truck.
Speaker 2 (03:04):
It's a doctored image, but it's plastered on the tailgate
of the pickup truck, so if you're driving behind it,
it would appear as if Joe Biden were actually restrained
on the vehicle's flatbed.
Speaker 3 (03:25):
If you think that's really Joe Biden tied up on
the back of the pickup truck, I don't know that
you have the mental acuity to be operating a motor vehicle,
But if you do think that I should also probably
explain to you that trucks also don't actually have testicles.
Speaker 1 (03:45):
It's just.
Speaker 3 (03:48):
A novelty item, and.
Speaker 1 (03:52):
It's not as low.
Speaker 3 (03:56):
It's like when an F-150 and
a Silverado love each other very much, they get one
of these.
Speaker 1 (04:05):
It's not what is going on now.
Speaker 3 (04:13):
There is technology out there in the world that really
does blur the line between reality and tailgate art, but
that's mostly AI generated: your fake Joe Biden robocall
that tells New Hampshire voters not to vote, your Chicago
mayoral candidate glorifying police brutality, your Donald Trump dropping by
the neighborhood for a stupek. Look how comfortable he seems.
(04:45):
And as AI gets better and better, it's only going
to make it more difficult to separate fact from fiction,
which could be terrifying. Luckily, the people in charge of
AI have told us that, just like with the Internet
and social media, it's actually going to make everything much...
Speaker 1 (05:01):
Much better. This has the potential to make life much better.
I think it's honestly a lay up.
Speaker 5 (05:06):
I hate to sound like a utopian tech bro here, but the
increase in quality of life that AI can deliver is extraordinary.
AI is the most profound technology humanity is working on,
more profound than fire or electricity.
Speaker 1 (05:22):
Yeah, sucking fire, that's right, you are, and me.
Speaker 3 (05:33):
Your art me fire, I'm sorry, Ready to turn that up,
suck a mother.
Speaker 1 (05:40):
Fire and hope? Whoa? What are you giggling at?
Speaker 3 (05:46):
Electricity? I mean, listen, I'm sure AI is good, but
like fire good? How so? They can help us solve
very hard scientific problems that humans are not capable of
solving themselves.
Speaker 6 (06:05):
Addressing climate change will not be particularly difficult for a
system like that. The potential for AI to help scientists cure, prevent,
and manage all diseases in this century.
Speaker 1 (06:16):
I completely trust you.
Speaker 3 (06:22):
And your enormously wide eyes and very human cadence. But
benefit of a doubt this can cure diseases and solve
climate change.
Speaker 1 (06:36):
What are we using it for now? Jarvis knows when
to make me breakfast? Your toast is ready? All right?
Are you out of your mind? See, here's the thing:
toast I can make. I can make toast. It might
(06:59):
be the only technology we have that works pretty much
every time.
Speaker 3 (07:04):
I'll tell you what, why don't you get to work
on curing the diseases and the climate change and we'll.
Speaker 1 (07:09):
Hold down the fort on toast.
Speaker 3 (07:12):
Of course now, as a society, we have
been through technological advances before, and they all have promised
the utopian life without drudgery.
Speaker 1 (07:24):
And the reality is they come for our jobs.
Speaker 3 (07:27):
So I want your assurance that AI isn't removing the
human from the loop.
Speaker 1 (07:35):
This is not about replacing the human in the loop.
In fact, it's about empowering the human. It's an assistant.
It's an assistant. What we're all getting assistance. It's an assistant.
Speaker 3 (07:50):
AI works for you night and day, tirelessly, and all you
had to do was remember their birthday.
Speaker 1 (07:57):
That's all you had to do. But I get it.
It's an assistant.
Speaker 3 (08:02):
It's about productivity and that's good for all of us. Yes,
although they do let the real truth slip out now.
Speaker 1 (08:11):
And again, there will be overall displacement in the labor market.
You can get the same work done with fewer people.
That's just the nature of productivity. That doesn't sound good.
Same work done with fewer people. Not a math guy,
but I think fewer means less.
Speaker 3 (08:29):
Yes, so AI can cure diseases and solve climate change,
but that's not exactly what companies are going to be
using it for?
Speaker 1 (08:37):
Are they?
Speaker 3 (08:38):
So?
Speaker 1 (08:39):
This is like productivity without the tax of more people,
without the tax of more people?
Speaker 3 (08:49):
Is the people tax formerly referred to as employees? But
you know, the promise of AI versus the reality of AI,
it's not quite crystal clear in my mind yet how that's going
to work out for workers. Do you have anyone who
wants to lay this out more bluntly, perhaps while auditioning
to be a Bond villain from his mountaintop lair.
Speaker 6 (09:10):
Left completely to the market and to their own devices,
These are fundamentally labor replacing tools.
Speaker 1 (09:23):
Did that guy just call us tools but he's actually
warning us?
Speaker 3 (09:29):
Is there anyone who might say the same thing as
this fella but looks at losing employees as a feature
of AI and not a bug.
Speaker 2 (09:40):
The CEO of the company laid off ninety percent of its
customer support staff. They're arguing that AI is kind
of the reason. Why did you do this? It seemed
a little brutal.
Speaker 1 (09:54):
It's smart if you think like AI. It's brutal.
Speaker 2 (09:57):
If you think like a human.
Speaker 3 (10:09):
"It's smart if you think like AI. It's brutal if you think like a human."
It's not the catchiest ad slogan I've ever heard. So,
while we wait for this thing to cure our diseases
and solve climate change, it's replacing us in the workforce.
Not in the future, but now. So what exactly are
(10:30):
we supposed to be doing for work?
Speaker 7 (10:32):
I think we'll need new types of jobs to help
us embed AI and maintain AI in the workplace.
Speaker 1 (10:40):
Prompt engineers. They're basically people who learn how to use
AI systems and in effect, how to program them. Who
would have thought that there'll be a prompt engineer, right right?
Prompt engineer?
Speaker 3 (10:54):
I think you mean types question guy. And by the way,
if there's any job that can be easily replaced by AI,
it's types question guy. This is some shit you got
going here. AI models have hoovered up the entire sum
of human experience that we've accumulated over thousands of years,
(11:16):
and now we just hand it off to be their
prompt engineers. And by the way, you're not fooling anybody
by adding the word engineer. You're not the types of
question guy, you're the vice president of question input.
Speaker 1 (11:32):
It's true. It's like calling a janitor a doctor
of mopping.
Speaker 3 (11:39):
Like, this whole AI thing is a bait and switch.
You're acting like you're helping us. Oh, AI, it's supposed
to be my assistant, but now I'm making AI
toast. I'm Jarvis. Guess what? No, you'll listen to me.
(12:01):
I got news for you, AI. I'm not Siri, you're Siri.
Speaker 1 (12:11):
Siri. While I have your attention, let me ask you
a question.
Speaker 3 (12:15):
Sure, John, but first, could you run and fetch me
some lynsium cadmium.
Speaker 1 (12:19):
Yeah, sure, that's not a problem.
Speaker 2 (12:21):
Mother.
Speaker 1 (12:28):
I didn't want to have to do this Ai, but
it's pretty clear.
Speaker 3 (12:32):
With technology this powerful, like nuclear power and atomic weapons,
I'm gonna have to place a little call to my
good pals in the United States government, perhaps even the
House of Representatives or the Senate, and they're about to
open up a can of... What's AI?
Speaker 7 (12:45):
Now, do you understand what AI does?
Speaker 1 (12:51):
Understandings what's going on?
Speaker 2 (12:54):
Very frankly, it's new terrain, uncharted territory. Do we
have the knowledge set here to do it?
Speaker 3 (13:00):
No?
Speaker 1 (13:01):
The short answer is no. The long answer is hell no.
Speaker 3 (13:10):
And the longest answer is H to the E, to
the L to the L or to the no.
Speaker 1 (13:19):
Hell. I don't even know how to use an answer.
Speaker 3 (13:21):
And we'll say to do to do.
Speaker 1 (13:32):
But I'm not against progress.
Speaker 3 (13:34):
But let's look to our history to see how we've
dealt with previous economic disruptions.
Speaker 2 (13:39):
We can retrain workers from one generation and create jobs
for the next.
Speaker 1 (13:43):
Retrained workers who do lose their jobs for even better
jobs in the future, retrain in order to be productive workers.
Upskill America, to help workers of all ages train
and retrain workers for new jobs.
Speaker 8 (13:57):
Give me a break. Anybody who can throw coal into
a furnace can learn how to program, for God's
Speaker 1 (14:04):
Sake, and I'll fight every one of you.
Speaker 3 (14:13):
Jackholes who say different. But that's the game. Whether
it's globalization or industrialization or now artificial intelligence, the way
of life that you are accustomed to is no match
for the promise of more profits and new markets. Which
sounds brutal if you're a human, But at least those
(14:38):
other disruptions took place over a century or decades. AI
is going to be ready to take over by Thursday.
And once that happens, what the is there left for
the rest of us to do?
Speaker 1 (14:52):
That's not a terrible thing.
Speaker 3 (14:53):
AI freeing us up to think about things at a
higher level, it's gonna help.
Speaker 1 (14:58):
It's going to give us our time back. We'll be
able to express ourselves in new creative ways. You know,
he's right. I've been thinking about this all wrong.
Speaker 3 (15:06):
It's not joblessness, it's self actualizing me time.
Speaker 1 (15:13):
I'll live the artist's life. It'll give me.
Speaker 3 (15:15):
More time to explore my passions. You know, I'm an
aging suburban dad. I'll learn to play the drums. You know, music,
Ta ta tinky ta, music is what makes us human.
(15:47):
When we come back, Lina Khan will be joining us.
Don't go away.
Speaker 1 (16:10):
My guest tonight runs one of the main government
Speaker 3 (16:14):
Agencies responsible for enforcing antitrust law and protecting consumers in America.
Please welcome to the program, Federal Trade Commission Chair Lina Khan.
Speaker 1 (16:37):
It's lovely to see you. You run the Federal Trade Commission.
Speaker 3 (16:44):
That's right, the whole shebang, and you are in charge
of, it's, it's protecting Americans from monopolistic, uh, company practices,
but also dealing with pricing and things like that, protecting consumers,
that's right.
Speaker 7 (16:59):
I mean, the short of it is, we want to
make sure that the American public is not getting bullied
or coerced in the marketplace or tricked, and so we
enforce the nation's anti trust and consumer protection laws.
Speaker 1 (17:10):
And how is well?
Speaker 3 (17:11):
Please, now, we just want to make clear, right, that
you were not bullied or tricked into applauding. Now, I
don't want to be accused of monopolistic How much pressure
do these companies exert on the Federal Trade Commission?
In other words, how much do they fight whatever regulation
(17:35):
you're trying to put into place to keep them from
becoming monopolies or from these types of business practices.
Speaker 7 (17:41):
Well, look, monopolies are not fans of us enforcing the anti
monopoly laws.
Speaker 5 (17:46):
And so that type of pushback is baked in. But
we have a fantastic team.
Speaker 7 (17:52):
We're a small agency, but we're mighty, and we play
to our strengths being entrepreneurial, being strategic, and getting real
wins for the American people.
Speaker 3 (18:01):
What are the companies... So these are separate things, monopolies.
The way I always viewed it was, oh, that's only
one company. But don't we have oligopolies in this country?
Speaker 1 (18:11):
Aren't there?
Speaker 3 (18:11):
Industries? Consolidation has made it. For instance, the entertainment industry
is controlled by like six companies. Is that considered not
a monopoly?
Speaker 1 (18:20):
But a problem.
Speaker 7 (18:22):
Yeah, Look, we've really focused on how are companies behaving?
Are they behaving in ways that suggest they can harm
their customers, harm their suppliers, harm their workers and get
away with it. And that type of too big to
care type approach is really what ends up signaling that
a company has monopoly power because they can start mistreating
(18:44):
you but they know you're stuck.
Speaker 3 (18:46):
And what would be the metrics of that? How would
you judge that? Because I know you've sued Amazon, that's right,
and that's for those practices.
Speaker 7 (19:00):
The lawsuit does allege that Amazon is a monopoly,
that they've maintained that monopoly through illegal practices.
Speaker 5 (19:07):
And look, there are a variety of ways.
Speaker 7 (19:08):
That you can show a company is a monopoly and
has monopoly power. One is you can try to figure
out what's the exact boundary of the market, what's the
market share? But again, the most direct way is to
look at how is the company behaving. And as we
lay out in our complaint, Amazon is now able to
get away with harming its customers. So, just to give
(19:28):
you a few examples, over the last few years, they've
littered their search results page with junk ads, ads that
internally executives realize are irrelevant and unhelpful to consumers, but
they can just do it, and it makes them, you know,
billions of dollars. They've also been steadily hiking
the fees that small businesses have to pay to sell
(19:50):
through Amazon. And so now some small businesses have to
pay one out of every two dollars to Amazon. It's
basically a fifty percent monopoly tax.
Speaker 1 (20:00):
Wow.
Speaker 7 (20:00):
And so those are just some of the behaviors that
we point to to note that this company has monopoly power.
Speaker 3 (20:06):
Is there anything about the company's leader that also suggests that? Like,
for instance, if you were to go from being like
sort of a nerdy dude who sold books out of
a garage into let's say, a jacked lex luthor type,
does that also suggest either monopolistic practices or some type
(20:29):
of injections?
Speaker 7 (20:33):
You know, we haven't tried to make those arguments in court,
but it would be interesting to see how a judge
would respond, I.
Speaker 3 (20:40):
Think quite favorably. How many lawyers do you have? For instance,
So what are you up against? So you've got government lawyers.
I'm assuming you've got a pretty good cadre, But like
let's say you're going after Amazon, how many lawyers.
Speaker 1 (20:53):
Are they in.
Speaker 7 (20:54):
I mean, you know, if they have monopoly money, they
can buy as many lawyers as they want. I mean,
the FTC is around twelve hundred employees. But when
we're going up against some of these monopolistic companies, they
can outmatch us, outgun us sometimes one to ten, just
if you're looking at lawyers, if you're adding paralegals and.
Speaker 3 (21:13):
Support, if you're just looking at lawyers, they outnumber you
ten to one.
Speaker 5 (21:16):
Sometimes they can.
Speaker 7 (21:17):
Yeah, I mean, we have lawsuits against a whole bunch
of big companies and just in terms of sheer resources
that they can pour into the litigation, we're pretty outgunned,
but not outmatched, right, And this is where it comes
to playing to your strengths being entrepreneurial.
Speaker 1 (21:34):
So this isn't about just getting a fine.
Speaker 3 (21:35):
This isn't about going after Amazon and saying so because
this is what the SEC does. The SEC, I think
is overmatched as a government agency. You don't
Speaker 1 (21:43):
Have to comment on that, but just nod your head.
Speaker 3 (21:46):
Utterly overmatched. So they go after groups and then they
can't really prove it in court. So then they're like,
how about this. You give us a cut of your
profit and we'll all be done here. How do you
handle that with Amazon?
Speaker 1 (21:58):
It's not just about a fine, that's right.
Speaker 7 (22:00):
I think, look, over the last couple of decades,
we've seen how businesses can treat fines just as a
cost of doing business right, And we need to make
sure that we're actually deterring illegal behavior, and so that
can mean naming individual executives.
Speaker 1 (22:16):
Ooh, oh snap! You just did not go there. I like that. So have you had success with this?
I like that. So have you had success with this?
Speaker 5 (22:27):
We have had success with this.
Speaker 7 (22:29):
I mean, we had a lawsuit against Martin Shkreli a
couple of years ago.
Speaker 3 (22:37):
All of a sudden it turned into a pro wrestling match here.
Let's go on. And he went to jail. Do you,
do you have to refer things to the DOJ or
do you have an enforcement arm?
Speaker 7 (22:47):
So you're right, we don't have criminal authority. But the
remedy we were able to get against Martin Shkreli was
to effectively ban him from doing business in the pharmaceutical industry.
Speaker 3 (23:02):
I imagine that the practice that he did in the
pharmaceutical industry, which was taking a life saving drug and
like, jacking the price
Speaker 1 (23:08):
Up, I don't know how many thousands of percent. I mean,
he did something crazy, right.
Speaker 3 (23:12):
How do you keep that as a normal practice in
the pharmaceutical industry? I mean, are they colluding as a
group to keep prices high? Why are we having so
much trouble with them and prescription drug prices?
Speaker 7 (23:27):
I mean, look, there are a whole set of reasons why. Yep,
for too many Americans, drugs are unaffordable, right. I mean
I hear weekly monthly about American families who are having
to ration life saving.
Speaker 1 (23:38):
Drugs, absolutely, and shortages of those drugs.
Speaker 7 (23:41):
Shortages of those drugs, and there can be all sorts
of tricks in monopolistic behavior that are leading to that.
Just to give you one example: inhalers. They've been
around for decades, but they still cost hundreds of dollars.
So our staff took a close look, and we realized
that some of the patents that had been listed
for these inhalers were improper, they were bogus, and so
(24:04):
we sent hundreds of warning letters around these patents, and
in the last few weeks we've seen companies delist these patents,
and three out of the four major manufacturers have now
said within a couple of months, they're going to cap
how much Americans pay to just thirty five dollars.
Speaker 3 (24:22):
So this is their game, right? So, you being entrepreneurial,
their game is: we're going to see how far we
can push this and get away with it and do
these different things in the hopes that we don't run
up against an entrepreneurial or crafty FTC.
Speaker 1 (24:41):
Are they waiting you out?
Speaker 5 (24:43):
Look, it's possible.
Speaker 7 (24:44):
But that's why you need to think about tactics that
are going to be around deterrence.
Speaker 5 (24:49):
And so one big area of focus.
Speaker 7 (24:50):
For us is understanding what is the root cause of
these problems? Right, Let's understand who is the mafia boss
here rather than just going after.
Speaker 3 (25:00):
The foot soldiers, right. And I think there's
probably a biblical sin in there that's probably the root
cause of the whole thing. But I want to talk
about the tech companies, because they are the new oligarchs,
it would seem. They are the companies that, and you
see this especially in Europe, where they are fined considerable
(25:21):
amounts of money for monopolistic practices. Or Apple just had
to pay an enormous fine. Microsoft has always been found
guilty of certain monopolistic practices.
Speaker 1 (25:31):
How do you handle enforcement for.
Speaker 3 (25:35):
These new, incredibly consolidated and enormous oligarchies.
Speaker 7 (25:42):
So we have a lawsuit against Amazon, we have another
one against Facebook.
Speaker 1 (25:47):
What is the one against Facebook?
Speaker 7 (25:49):
So that one was filed before I arrived at the agency.
But basically it alleges that Facebook, when it was watching
the transition from desktop to mobile, realized it really
couldn't survive in mobile, and so it ended up
buying out Instagram and WhatsApp. And the lawsuit alleges
that those acquisitions were anti competitive, that they violated the
anti trust laws, that instead of competing organically, Facebook instead
(26:13):
bought its way to maintaining its monopoly.
Speaker 3 (26:15):
Now why, why is that considered illegal? Wouldn't
they say, well, that's a sign of our success. We're
so successful, we have extra money, and with that extra money,
we make bets on certain companies and we turn those
into successes.
Speaker 7 (26:27):
So look, one key tenet of the anti monopoly laws
is that you can't go out and buy one of
your biggest competitors.
Speaker 1 (26:35):
Or you're not allowed to do that.
Speaker 5 (26:37):
You're not allowed to do that. In fact, can I.
Speaker 3 (26:39):
Tell you something crazy. So I had put in an
offer for Last Week Tonight. It had come out.
Speaker 1 (26:45):
Now it wasn't I. I want to tell you so.
Speaker 3 (26:50):
Because it's Oliver, I offered it to him in balloons.
Is that what British people use? Obviously didn't take it.
But you have to make the decision then of whether
or not they are cornering the market. They used to
call it cornering the market, but couldn't you say, like Apple, Microsoft,
(27:12):
they are kind of working together to corner markets now.
Speaker 7 (27:17):
So look, we are investigating to understand whether some of
the investments and partnerships that they're entering into right now
in the AI space may in fact be giving them
undue influence or giving them special privileges. If we get
any hint that there is actual collusion happening in the marketplace,
(27:37):
we take that extraordinarily seriously and won't hesitate to take action.
One trend that we're especially concerned about is the way
that algorithms may be facilitating price fixing. And so if
you have a whole bunch of competitors in a market,
be it hotels, be it casinos, and they all decide
they're going to outsource their pricing decision to the same algorithm,
(28:01):
they may in effect be fixing their prices, even if
they're not getting in the back room and making secret deals.
Speaker 3 (28:07):
That would be like if a hotel says, oh,
you can get us on Expedia, or you can get
us on Kayak, or you can get us on but
all those companies are using the same algorithm. Would that
mean that it flattens those prices and you are not
getting the competitive advantage that you might get from those
ten to fifteen apps that are searching for the cheapest
hotel rooms.
Speaker 1 (28:28):
Is that the idea that's right.
Speaker 7 (28:29):
You may collectively see inflated prices because all of these
companies are using the same algorithm, they're inputting the same data,
and that algorithm is in effect allowing them to collectively
raise their prices.
Speaker 5 (28:42):
So Americans are having to pay more.
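The dynamic described above, in which competitors all delegate price-setting to one shared algorithm, can be sketched in a few lines of Python. This is a hypothetical toy model: the hotel names, the demand curve, and the dollar figures are invented for illustration only and are not drawn from any real case or pricing product.

```python
# Toy sketch of the algorithmic price-fixing concern described above.
# All names and numbers here are hypothetical.

def demand(price):
    """Toy linear demand curve: fewer rooms sold as the price rises."""
    return max(0, 300 - price)

def shared_pricing_algorithm(demand_at):
    """Pick the revenue-maximizing price from a grid of candidates.

    When every competitor delegates pricing to this same function,
    fed the same market data, they all land on the same elevated
    price; no back-room meeting is required.
    """
    candidate_prices = range(50, 301, 10)
    return max(candidate_prices, key=lambda p: p * demand_at(p))

def competitive_price(marginal_cost=60):
    """Head-to-head competition: rivals undercut each other by $10
    until the price reaches marginal cost."""
    price = 300
    while price - 10 >= marginal_cost:
        price -= 10  # a rival undercuts
    return price

hotels = ["HotelA", "HotelB", "HotelC"]  # hypothetical competitors
delegated = {h: shared_pricing_algorithm(demand) for h in hotels}

print(delegated)            # every hotel posts the identical, higher price
print(competitive_price())  # versus what undercutting drives the price toward
```

In the sketch, the identical prices emerge without any seller ever communicating with another, which is the point being made in the interview: the coordination happens through the shared algorithm rather than in a back room.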
Speaker 1 (28:44):
And it's not just paying more.
Speaker 3 (28:45):
I mean you could look at a company like Walmart,
where you would say, okay, they came into areas
and they dominated all the competition. They didn't buy up
the mom and pop shops, but because they had access
to cheap labor and overseas goods and those types of things,
they could undersell them and put them all out of business.
And even at that moment, they might not raise their prices,
but boy could they and boy could they exert their
(29:08):
influence on supply chains, And boy could they depress wages
and make sure that people, even if they're working long hours.
Speaker 1 (29:15):
Still have to have social assistance. Is that something that
you could go after?
Speaker 7 (29:20):
Look, monopolies harm Americans in a whole bunch of ways.
You're absolutely right that it's not just higher prices. It
can be lower wages. It can be suppliers getting muscled
out of the market or seeing their own payments drop.
It can also be shortages. I mean, we've seen over
the last two years baby formula, baby formula.
Speaker 5 (29:41):
IV bags, Adderall, Adderall.
Speaker 1 (29:46):
Basic forums.
Speaker 3 (29:50):
I see the audience has no use for baby formula
but has an interesting predilection.
Speaker 1 (30:00):
What do you do in that instance?
Speaker 7 (30:02):
So, look, we want to understand are there dominant players
here that are using their muscle to coerce in ways
that are contributing to shortages. We've also seen, historically, when you
concentrate production, that concentrates risk, and so a single disaster,
a single contamination, a single shock, can lead the entire
(30:23):
supply to be wiped out.
Speaker 5 (30:25):
I mean, the short of it is, don't put all
your eggs.
Speaker 3 (30:27):
In one basket, and then you guys are the ones
that have to separate the eggs. It's curious to me
that the government wouldn't have other methods of working with
these corporations to ask them to curb their excesses in
exchange for what they get, which is the stability of
the American system.
Speaker 7 (30:48):
So look, we have a whole bunch of policies and
laws in place that are actually designed to ensure our
markets are more competitive and not as subject.
Speaker 1 (30:57):
To these... without actually stifling innovation. That's the balance.
Speaker 7 (31:01):
But forty years ago, under President Reagan, we radically veered
off course and undertook a much more hands off approach,
and now we're living with the consequences of those decisions.
Speaker 3 (31:12):
Is industry more consolidated today? I mean, my gut would tell
me it is more consolidated. You have larger companies that
swallow up in the pursuit of growth, swallow up and consolidate.
Speaker 1 (31:24):
It feels that way to me. Do you have the
metrics that suggest that that's actually the case on the whole?
Speaker 7 (31:30):
Yes, I mean, you always want to do a market
by market analysis. But if you look at airlines, if
you look at telecom, if you look at meat packers,
if you look at huge parts of our economy across.
Speaker 5 (31:42):
The board, you've seen huge waves of
Speaker 1 (31:44):
Mergers. Less competitive.
Speaker 7 (31:45):
You go from dozens of companies just to a
very small number. And again, that hurts Americans and American
communities in all sorts of ways and even leads to,
for example, planes falling apart in the sky.
Speaker 1 (31:58):
Wait, what.
Speaker 3 (32:02):
I thought that was just... I always thought that was
all just DEI. Are you telling me... This gets us
to our final point? So now they're saying this new algorithm,
this new kind of machine learning model called AI, that's
(32:24):
going to transform every aspect of American life, in the
American economy. It's already being consolidated. Apple has bought thirty
AI companies, Microsoft has bought, Google has bought. They
all buy AI startups and put them behind their paywall.
And they're already having an arms race to see who
(32:46):
will be either the monopoly, or this will be an
oligopoly.
Speaker 1 (32:51):
I got to tell you.
Speaker 3 (32:52):
I wanted to have you on a podcast, and Apple
asked us not to do it, to have you. They
literally said, please don't talk to her, having nothing
to do with what you do for a living.
Speaker 1 (33:05):
I think they just... I don't think they cared for you. That's what happened. They wouldn't.
It's what happened. They wouldn't.
Speaker 3 (33:15):
They didn't. They wouldn't let us do even that dumb
thing we just did in the first act on AI. Like,
what is that sensitivity? Why are they so afraid to
even have these conversations out in the public sphere.
Speaker 7 (33:28):
I think it just shows one of the dangers of
what happens when you concentrate so much power and so
much decision making in a small number of companies. I mean,
going back all the way to the founding, there was
a recognition that in the same way that you need
the Constitution to create checks and balances in our political sphere,
you also needed the anti trust and anti monopoly laws
(33:50):
to safeguard against concentrations of economic power, because you don't
want an autocrat of trade in the same way that
you don't want a monarch.
Speaker 3 (33:59):
But then it took... I mean, it wasn't until
the Sherman Act in, what, eighteen ninety something. I mean,
why, when did they first decide? Was it the beginning
of industrialization, when they finally decided, like, oh, we should probably
Speaker 1 (34:11):
Put a halt to this. That's right.
Speaker 7 (34:12):
You'd initially had some state level laws, but the first
federal antitrust law was the eighteen ninety Sherman Act, and
it was absolutely a response to the Industrial Revolution and
a lot of the power that it had concentrated.
Speaker 3 (34:25):
Can we just hold on for one second? Can
you take the camera real quick? I want to take
a single real quick if I can, I don't know
which one. Let me take this one. Now, that Sherman
thing didn't just come out of nowhere. I think
(34:45):
I might have learned that in like ninth grade?
Speaker 1 (34:48):
It stuck. Has that been updated since eighteen ninety? So we
Speaker 7 (34:55):
Had some follow on laws in nineteen fourteen, another follow
on in the nineteen fifties, and then since then it's been
Speaker 5 (35:02):
A bit more sparse.
Speaker 7 (35:04):
So for the most part, our lawsuits are still based
on those laws going back a century.
Speaker 1 (35:09):
What would you posit?
Speaker 3 (35:10):
What would you put forth to control this new AI
technology that is looming?
Speaker 1 (35:17):
And I'm not talking about censorship.
Speaker 3 (35:18):
I'm not talking about government deciding you can't say that
or you can't print that. I'm talking about in terms
of business practices these few companies controlling the entire mechanism.
Speaker 7 (35:27):
Look, the first thing we need to do is be
clear eyed that there's no AI exemption from the laws
on the books.
Speaker 5 (35:33):
We see sometimes.
Speaker 7 (35:34):
Businesses try to dazzle enforcers by saying, oh, these
technologies are so new, they're so different, let's just take.
Speaker 5 (35:42):
A hands off approach.
Speaker 7 (35:43):
And that's basically what ended up happening with Web two
point oh. And now we're reeling from the consequences, and.
Speaker 1 (35:50):
So we need to make sure... All of the Web
two point oh is...?
Speaker 7 (35:54):
You know, the rise of social media, and you know,
in the early two thousands, the initial set of
companies that ended up innovating but ultimately becoming monopolistic, ultimately
adopting business models that are premised on endlessly surveilling.
Speaker 3 (36:08):
People and so and hoovering up data and creating algorithms
that are clearly harmful not just to children, but to
political discourse.
Speaker 1 (36:19):
And it's, it's pretty wild how they're able to
do that.
Speaker 3 (36:22):
And they every now and again they get called in
front of Congress and Mark Zuckerberg, you know, do I
goes like.
Speaker 1 (36:31):
Like and subscribe! You know, I don't know. But, are you,
are you?
Speaker 3 (36:38):
Are you optimistic that we will be able to catch
up to this in time before something truly catastrophic happens
through AI.
Speaker 7 (36:46):
Well, look, there's no inevitable outcome here. We are the
decision makers and so we need to use the policy
tools and levers that we have to make sure that
these technologies are proceeding on a trajectory that benefits Americans
and we're not subjected to all of the risks and harms.
Speaker 1 (37:03):
Boy, I wish you could stay forever, because it's incredible.
Thank you so much, FTC Chair Lina Khan, everybody,
for talking about what they do.
Speaker 3 (37:38):
That?
Speaker 1 (37:39):
To go chelping tonight?
Speaker 3 (37:40):
But before we go, we're gonna check in with your
host for the rest of this week, Desi Lydic, who's
gonna be doing the heavy lifting.
Speaker 1 (37:50):
What are you excited about this week?
Speaker 4 (37:52):
Well, I don't mean to brag, John, but I've got
a perfect NCAA bracket Women's and men's.
Speaker 1 (37:59):
Perfect, perfect, perfect, perfect. It's incredible. How's that even possible?
Speaker 8 (38:03):
Yeah?
Speaker 4 (38:03):
Well, here's the key strategy that I've honed over many seasons. See,
I fill out the bracket after the games are played.
Speaker 5 (38:11):
That way, I know who wins.
Speaker 1 (38:15):
And what will you be doing on the show this week?
Speaker 4 (38:18):
Well, John, I mean that's impossible to say until after
the shows have happened. See it's a strategy that works
across the board.
Speaker 1 (38:24):
I can't wait to watch. Desi Lydic, everybody! God bless
you all.
Speaker 6 (38:33):
Enjoy the day.
Speaker 8 (38:34):
And I'm coming down to do that Easter egg roll
just a minute. Thank you all, Oh very very much.
Thanks everybody. And by the way, say hello to the Easter bunnies.
Come on up, bunnies, get up here so they can
see you. Come on, get in here, pretty big bunny.
Speaker 3 (38:51):
Huh.
Speaker 8 (38:54):
Explore more shows from the Daily Show podcast universe by
searching The Daily Show wherever you get your podcasts.
Speaker 1 (39:00):
Watch The Daily Show weeknights at eleven.
Speaker 6 (39:02):
Ten Central on Comedy Central and stream full episodes anytime on
Paramount Plus.
Speaker 1 (39:12):
This has been a Comedy Central podcast