Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
You're listening to Comedy Central.
Speaker 2 (00:07):
From the most trusted journalists at Comedy Central, it's America's
only source for news.
Speaker 3 (00:14):
This is The Daily Show with your.
Speaker 4 (00:16):
Host, Jon Stewart. Hey, everybody. Welcome to The Daily Show.
Speaker 5 (00:51):
My name is Jon Stewart, and I am risen from
COVID hell.
Speaker 6 (01:00):
First timer. First timer did not care for it.
Speaker 3 (01:03):
I do also want to welcome in all of our
viewers who are probably joining us from X after watching
an amazing and surprisingly life affirming conversation.
Speaker 6 (01:20):
Between Donald Trump and Elon Musk.
Speaker 3 (01:22):
You know, when they started quoting their favorite Maya Angelou
passages to each other. My interpretation: the caged bird is
singing for Bitcoin. We've got a good show for you tonight.
Mark Cuban is going to be joining us.
Speaker 6 (01:42):
You know, we mention all this.
Speaker 3 (01:47):
On this program occasionally. We do make fun of Donald
Trump occasionally, with the ribbing and the joshing and
the pulling the pants down and the pointing. But he's
(02:09):
in pain right now, multiple sources tell The Washington Post
Trump has grown increasingly upset about Harris's surging poll numbers.
Trump is quote complaining relentlessly, posting multiple times on social media,
clearly frustrated with Biden's decision to step aside, saying quote,
now we have to start all over again. Not fair. Jesus.
(02:36):
A month ago, he was basically already the president. He
had cheated death, started a new ear accessory trend. Back then,
people thought his VP.
Speaker 1 (02:49):
Selection was a smart choice.
Speaker 3 (02:53):
He had it all in the bag and it was
taken away. It was a perfect ten on the beam. Nailed
the dismount, he was walking to the podium to get his medal,
Romania files an inquiry at the.
Speaker 6 (03:03):
Last minute, right, at the last minute, and they're just stealing
it from him.
Speaker 3 (03:10):
And by the way, Romania, file all you want, you're
not getting a medal back. Oh, I'm sorry, we have
an inquiry. Yeah, good luck. But now, instead of enjoying
the fruits of six years of Biden attacks, Trump's gotta
(03:33):
start all over again, and the audience has to literally
sit through him getting up to speed.
Speaker 6 (03:40):
There are numerous ways of saying her name. You can
say Kamala.
Speaker 7 (03:44):
You can say Kamala, Kamala, Kamala, Hey Kamala.
Speaker 8 (03:48):
Trump misspelled Harris's first name as Kamabla.
Speaker 3 (04:00):
I get Kamala, I get Kamala, Kamabla. Judges, are we
taking Kamabla? I hope the Romanians don't have a problem
with that. But you know what, I guess what Trump
(04:23):
calls her isn't as important as figuring out what she is.
I don't know is she Indian or is she black?
Speaker 7 (04:30):
Yes, she was Indian all the way, and then all
of a sudden, she made a turn and she went
she became a black.
Speaker 3 (04:42):
What am I gonna do with all my Indian ethnic
slurs that I was gonna use? Mostly involved turmeric. She
made a turn into black. He talks about it like
she wandered into the wrong neighborhood. She went driving on
(05:06):
the Upper West Side, and then boom.
Speaker 6 (05:07):
She's in Harlem. Boom. Made a turn.
Speaker 3 (05:14):
You know what, Donald, you're clearly struggling. Let's get some
issue-oriented ideas flowing here. You know what we're gonna do?
Come on, my brother, I'm gonna help you out. Here's
what we're gonna do. We're gonna do, we're gonna do
some... apparently I'm in a musical about gambling all of
a sudden. Yeah, all right, here we go. I got
(05:38):
my pen, I got my pad, I got my advisor.
Forget the biographical stuff. Now let's focus on the issues.
Speaker 9 (05:43):
I saw it yesterday on ABC which they said, oh,
the crowd was so big, and I've spoken to the
biggest crowds. Nobody's spoken to crowds bigger than me.
Speaker 3 (05:59):
Okay, okay, that's one of those mom-and-pop issues
for the single-issue crowd-size voter. I'd move on.
But oh, you've got more.
Speaker 9 (06:14):
I had one hundred and seven thousand people in New Jersey.
You didn't report it. I'm so glad you asked. What
did she have yesterday? Two thousand people. We had in
Harrisburg twenty, twenty-five thousand people, and twenty thousand people
couldn't get in. We had so many nobody ever mentions
that when she gets fifteen hundred people, they said, oh
(06:35):
the crowd was so big, I have ten times, twenty times,
thirty times the crowd size.
Speaker 6 (06:45):
I had an infinity crowd.
Speaker 3 (06:50):
One guy?
Speaker 6 (06:50):
You had one guy named Jeff.
Speaker 3 (06:58):
All right, very clear, everybody: she has nobody. Can we
move on?
Speaker 1 (07:06):
He wrote:
Speaker 10 (07:06):
Has anyone noticed that Kamala cheated at the airport? There
was nobody at the plane, and she A.I.'d it and
showed a massive crowd of so-called followers, but they
didn't exist. He goes on to say she's a cheater.
She had nobody waiting, and the crowd looked like ten
thousand people.
Speaker 3 (07:21):
Oh my god. Now, all right, for those of you
at home who are saying, like, oh, it sounds like he's
losing his mind: just because there's video and photographic evidence
that Kamala Harris's crowd was real doesn't mean that it
was real. And then you might say, oh, well, Jon,
(07:45):
I was actually there. I was in the crowd. And
have you considered you're not real? Have you considered that...
what is this? Donald Trump doesn't need the fake news
media and their AI crowd shots to win this thing,
because he's got inside information on Kamala Harris from someone
she used to date.
Speaker 1 (08:05):
Well.
Speaker 6 (08:05):
I know Willie Brown very well. In fact, I went
down in a helicopter with him. We thought maybe this
is the end.
Speaker 3 (08:11):
We were in a helicopter going.
Speaker 7 (08:13):
To a certain location together and there was an emergency landing,
but he told me terrible things about her.
Speaker 6 (08:34):
You were in a helicopter.
Speaker 3 (08:37):
With former San Francisco mayor Willie Brown, who famously dated
Kamala Harris, and while the helicopter was going down as
you were plunging.
Speaker 6 (08:56):
To your imminent death.
Speaker 5 (09:00):
Former San Francisco Mayor Willie Brown turns to you and says,
this might not mean anything to you now, but...
(09:22):
do you.
Speaker 11 (09:22):
Remember that lady I was going out with, the prosecutor?
Well, before we die.
Speaker 6 (09:34):
I just want you to know: she's the worst.
Speaker 3 (09:40):
I do not want to meet my maker without giving
you that piece of information. If you survive, you may
need it.
Speaker 12 (09:50):
Oh my god, I.
Speaker 3 (09:54):
Gotta tell you. I'm sure a moment like that was
seared not only in the memory of Donald Trump, but
also into the memory of former Mayor Willie Brown.
Speaker 6 (10:04):
To be clear, you have never been on a helicopter
with Donald Trump.
Speaker 3 (10:14):
And he made a mistake.
Speaker 9 (10:15):
Thought it was.
Speaker 3 (10:17):
What what.
Speaker 6 (10:24):
That is so dumb that I'm sure that is not
what happened.
Speaker 3 (10:30):
What are the chances Trump is just mixing up his
black people.
Speaker 13 (10:35):
It seems that the African American politician in question was
not Kamala Harris's ex, former San Francisco Mayor Willie Brown,
but rather this man, Nate Holden, a former Los Angeles
City Council member who says he had a bumpy ride
with Trump in nineteen ninety.
Speaker 6 (10:57):
Oh my god. Do you know what this means?
Speaker 3 (11:03):
Nate Holden, former Los Angeles City Council member, told Donald
Trump as their.
Speaker 6 (11:09):
Helicopter was going down.
Speaker 3 (11:15):
Bad things about Kamala Harris that I guess Willie Brown
had told him if they knew each other. That is
the only explanation, right.
Speaker 13 (11:30):
Holden saying, quote, Willie is the short black guy
living in San Francisco.
Speaker 12 (11:34):
I'm a tall black guy living in Los Angeles.
Speaker 3 (11:36):
I guess we all look alike. Hey, Donald Trump is
not racist. He just meets a lot of people on
death helicopters and he needs some mnemonic device help. If
the chopper goes down, that's not Willie Brown.
Speaker 6 (11:59):
He had a little device. Here's one.
Speaker 3 (12:06):
If the flight's not going great, you're probably riding with me.
Speaker 6 (12:14):
Look, people.
Speaker 3 (12:17):
They pulled the candidate Trump was crushing. It's hard. You
think you could write a new hour in a month?
It's not easy. He's trying. He's trying out some good
catastrophizing on Harris.
Speaker 9 (12:30):
If Harris wins this election, you will quickly have a crash.
Like in nineteen twenty nine, we could end up in
World War three.
Speaker 6 (12:36):
The suburbs will be overrun.
Speaker 3 (12:39):
Boom, that's what I'm talking about. Stock market crash, World
War Three, suburbs destroyed. It's fresh, it's new. We haven't
heard... what was that? Oh, I'm sorry.
Speaker 9 (12:49):
If Biden got it, you'll have a stock market crash
the likes of nineteen twenty nine or worse, a very
real risk of World War Three. They're going to, in
my opinion, destroy suburbia.
Speaker 3 (13:04):
This is just a remix, Dude. You can't just find
and replace Biden with Kamala. That's lazy apocalypsing. Look, man,
if you want us to genuinely fear your opponent as
the existential threat you'd like to make them out to be,
you're gonna have to do better than boilerplate cut and paste. Shit.
You're better than this. Donald.
Speaker 9 (13:23):
Joe Biden is a failed president. She was a failed
vice president, the worst president in the history, the worst
vice president in history.
Speaker 7 (13:31):
He is incompetent, she's incompetent.
Speaker 9 (13:33):
Everything he's touched has been bad, Everything she's touched has
turned to bad things. He can't talk, she can't talk,
and in many ways he's worse than Bernie.
Speaker 6 (13:41):
She's worse than Bernie. Low IQ. He's a low IQ individual.
Speaker 7 (13:45):
She happens to be really a low IQ individual.
Speaker 1 (13:48):
Seriously, he goes, she has a very low IQ.
Speaker 3 (13:53):
This is bullshit, man. This is like when Elton John
changed like three words and then pretended Candle in the Wind
was always about Diana. It wasn't.
Speaker 6 (14:04):
Very disrespectful to Marilyn.
Speaker 3 (14:14):
Too soon.
Speaker 6 (14:22):
Here's the problem.
Speaker 3 (14:24):
Even when Trump does figure out how to come at Kamala,
it's not really landing, because most of the time the
bad stuff he's saying about her applies even more to him.
Speaker 9 (14:32):
If Kamala will lie to you so brazenly about Joe
Biden's mental incapacity, then she will lie to you about anything.
Speaker 6 (14:40):
She can never ever be trusted.
Speaker 3 (14:42):
Yes, Donald Trump is telling America not to elect a liar.
Donald Trump is saying that, about a liar.
I mean, for God's sake, he's like the Michael Jordan
of lying, or, as Trump would say it, the Willie
(15:03):
Brown of lying and confusion. Look, I had to say it.
Speaker 14 (15:10):
I don't think Trump has it in him to go
after Kamala Harris. He's been fighting Joe Biden for six years;
it's all he knows. He misses the fight so much
he was still workshopping nicknames for Joe Biden.
Speaker 3 (15:23):
This weekend. What do you like better?
Speaker 9 (15:25):
It doesn't matter anymore, But what do you like better?
Speaker 1 (15:27):
Crooked Joe or sleepy Joe?
Speaker 3 (15:28):
Sleepy Joe? Crooked Joe. This is sad. It's like seeing
an old man talking to an empty spot on the
bench and then you realize that's where his wife used
to sit. He would give up.
Speaker 6 (15:51):
Everything for just one more moment with Crooked Joe.
Speaker 9 (15:57):
I hear he's gonna make a comeback at the
Democrat convention. He's gonna walk into the room and he's
gonna say, I want my presidency back, I want another chance.
Speaker 6 (16:07):
To debate Trump.
Speaker 1 (16:08):
I want another chance. He's not coming back.
Speaker 3 (16:15):
He's not coming back.
Speaker 5 (16:16):
Donald.
Speaker 6 (16:17):
Hey, you know how I know he's.
Speaker 3 (16:20):
Not coming back. We have a camera on him. That's him.
He's just sitting there at the beach, having an Arnold Palmer.
You can hear him sighing over the waves. Does this
look like a man marshaling his forces to take back
(16:41):
the nomination or filming a Corona commercial? He's finding his beach.
It's over. There's only one way. Donald, meet me at
(17:04):
camera one. Hello, friend. May I call you Donald? I
get it. You wanted to run against Joe Biden. Just
(17:26):
two old dudes going toe to toe, a fungal last hurrah,
Rocky Twelve.
Speaker 6 (17:32):
It's not there.
Speaker 3 (17:34):
Now You've got to run against someone who appears healthy
and youthful and happy, her vigor standing as a stark
counterpoint to whatever front butt thing you have going on.
And it's pretty clear that Biden isn't going to do
what needs to be done to stop this steal. I know,
(18:01):
you love stopping steals, right? Feeling me? Kamala Harris accepts the
nomination next Thursday night, which means it may be time
to get the gang together, storm the convention hall on
August twenty-second, this time on behalf of Joe Biden.
(18:23):
All you need is thousands of supporters who have not
yet been sent to jail for being part of
the last
Speaker 6 (18:29):
Coup. Or got sent to jail so.
Speaker 3 (18:32):
Early in the process, they're already out. If only there
was a sign of the righteousness of this cause.
Speaker 8 (18:42):
A federal judge ruling the Department of Justice must return
the spear and fur helmet belonging to QAnon Shaman Jacob Chansley.
Speaker 6 (18:50):
Shaman don fur helmet! We ride at dawn for Biden! When
we come back, Mark Cuban is here. Don't go away.
Speaker 3 (19:02):
We'll be right back. Welcome back to The Daily Show. My
(19:23):
guest tonight: an entrepreneur, owner of the NBA's
Dallas Mavericks, co-founder of the Cost Plus Drugs company.
Speaker 15 (19:30):
Please welcome Mark Cuban. Sir, welcome. Thank you. You are...
you're fair.
Speaker 6 (19:52):
Security.
Speaker 1 (19:55):
I didn't hear what you said.
Speaker 3 (19:56):
This is a... no, this is a Knicks crowd. They love that.
Speaker 6 (20:03):
Now, are people in New York?
Speaker 3 (20:05):
Are they? Because of the history between the Mavericks and
the Knicks generally, with the trades where you fleeced us
to a certain extent... uh, do you find there's a
kindness that is, uh...
Speaker 1 (20:15):
Yeah, yeah, it's extended to me. Literally, I
like to walk in New York, right, and just today
walking down the
Speaker 3 (20:22):
street. Yeah. Cuban, we love you. And it's.
Speaker 1 (20:24):
Crazy. Literally, great basketball fans here. I get all kinds
of love.
Speaker 3 (20:28):
And that's what you get in New York. That's what
they shout at you.
Speaker 1 (20:31):
Yeah, but that's what I get. And now it's more
thanks for JB, right? But yeah, that's what I get.
Speaker 3 (20:36):
Well, Jalen... But now, did you have any idea, when
Jalen Brunson was there, and I'm sorry to go down
this road, but I'm a Knicks fan and this is
just... you're gonna have to sit through it. Jalen Brunson
was not... he started in the playoffs when, uh, yeah,
when Luka got hurt. Did you have any idea that
he would become this All-NBA phenomenon? He's undersized.
His footwork is so phenomenal.
Speaker 1 (20:58):
No, no idea. I mean, I mean he was talented,
but he was picked in the second round. If everybody knew,
he would have been a top five pick. I mean,
if you redraft that draft, other than Luka, he is
a top three or five pick.
Speaker 3 (21:10):
That's amazing.
Speaker 1 (21:10):
It's crazy. Yeah, but more credit to him. He worked
on it.
Speaker 3 (21:13):
Yeah, and he just seems like a phenomenal guy.
And then he decided to take a contract for less money
than he could have made.
Speaker 1 (21:19):
So let's talk politics.
Speaker 3 (21:26):
By the way. Now, you are in this interesting position
in your career where you've sort of... you are now,
even though I think your leanings are, probably you'd consider,
more independent, more libertarian, you are the left's favorite billionaire,
because, and I can't... I don't know if it's because
(21:50):
there's a certain mellowing that occurs as you get older,
or if this new sort of tech bro phenomenon is
so dystopian in its formulation.
Speaker 1 (22:03):
Yeah, I mean this is all who I've always been.
I haven't been like the rich guy trying to act
like a rich guy. My friends are still my high
school buddies, my college buddies, my rugby buddies. But watching
what's happened in Silicon Valley is insane, right right. It's
not so much a support thing. It's more like a
takeover thing, trying to put themselves in a position to
have as much control as possible. They want Trump to
(22:24):
be the CEO of the United States of America, and
they want to be the board of directors that makes
him listen to them.
Speaker 3 (22:30):
It's not good. What is the ethos? Because it
seems like in the old days of innovation there was
a certain amount of, we're innovating: the Internet, we're making things.
Now it seems much more about sort of this social
engineering and transhumanism, and we are going to join with
computers and together eight of us are going to run
(22:53):
everything. Dominating, right? Is that the ethos you see?
Speaker 1 (22:58):
Yeah, I think... yeah, you just said it. Yeah,
they've gotten to the point now where they feel like
they should control the world, right, and that there should
be a CEO in charge of everything, because they
have a good photo app, because of riches, right? You know,
it's just like you get to that point sometimes where
(23:18):
I think they've lost the connection to the real world.
Speaker 3 (23:22):
Is it boredom? Like is there a certain extent like
if you're like a Bezos or one of those guys,
you just you've sold so many books that you're just like,
I'm going to live on Mars.
Speaker 1 (23:31):
It's just, I think it's more, what's their next act, right?
We invented this, we did this, we created that.
What can we do next? Somebody wants to go to Mars? Well,
what can we do here back on Earth? Well, let's
I mean look at Elon right, Elon and being one
of those powerful people. He's trying to be the most
influential man in the world. It sounds like a commercial,
(23:53):
but literally that's what Twitter has given him.
Speaker 3 (23:55):
I've got to say I think he might be that
because I don't even think he's trying to. When you
when you talk about somebody who is setting up satellite
links for war zones and also controlling discourse in the
most important platform, the most.
Speaker 1 (24:10):
Powerful, because Twitter is in almost every country, right,
and so Twitter gives him the ability to connect to
the prime minister, the head of every country in the
world, that's right. And that person, whoever is in
charge of that country, has an interest in what happens
on Twitter. And what happens on Twitter, because of the
control of the algorithms, him being the biggest user, is all
dependent on Elon Musk. He literally, wherever his thumb wants
(24:33):
to go, he gets to push it, hard, certainly.
Speaker 3 (24:35):
I mean, he's transparent about where he wants things to go.
I think he's very clear that civil war is inevitable
and that white people are under concerning right, it's it's
you know, it'll be like civil wars inevitable. And then
he'll write underneath there, you know, kind of an undersavment
on there. But uh, I can't I can't decide whether
(24:59):
or not it's better to know exactly where he stands
and know where he's going to be put the thumb on,
because he's clearly a very bright guy. Yes, and he
has a media empire that has the largest reach and
most influence of anything on the face of the earth,
and there's no question he's going to leverage it in
this election, no question.
Speaker 1 (25:17):
But the crazy part is he has more impact globally
than he does domestically in my opinion, right, because when
you go on X you see a preponderance of right
leaning people. You don't see a lot.
Speaker 3 (25:28):
They're all over my For You page. I've never clicked on
any of these.
Speaker 1 (25:31):
Well, that's the whole thing. That's the way algorithms work, right? What? Yes.
Speaker 6 (25:36):
They do the opposite of what I want. Yes, but
somebody tells them. When you write an algorithm...
Speaker 1 (25:41):
I haven't written a lot, it's been a while. But
when you write one, you get to set the parameters
of what you want to see happen. And he certainly
has done that to the things he likes. But it's
different on other platforms. And the good news is, what,
twenty percent of adults in the United States are on Twitter?
So I mean there's eighty percent who aren't there.
Speaker 3 (25:58):
But isn't this a certain amount of, uh, tech
bro malpractice? That there is this incredible, uh, need in
the marketplace for something that is slightly less biased or,
you know, toxic, when it comes through there. And like,
they came out with Threads, and you're on it for
two seconds and you're like, I think I need a nap.
Speaker 1 (26:17):
No, I like Threads. Threads is getting better. Try it.
Speaker 3 (26:20):
No, uh, here's something that doesn't sell: no, it's
getting better.
Speaker 6 (26:26):
That may be the worst sales pitch ever.
Speaker 3 (26:29):
Okay. For any of these... but see, you do
disrupt industries. See, that's why I would have thought,
and I think you've said this, that Trump appealed to
you at first, because there is a certain outsider.
And look, we both know our government, there is a
status quo, and there is a capture by lobbies and
(26:49):
by big businesses that write this legislation and end up
gaining advantage that needs to be disrupted.
Speaker 1 (26:56):
Correct.
Speaker 3 (26:57):
When did it occur to you that he didn't necessarily
want to drain it. He wanted to have the deed
to the swamp signed over to.
Speaker 1 (27:04):
Him. About the third time I talked to him, right?
It was... he wasn't about changing. I mean, the conversations
I would have with him, I'm like, there was a
time when... Are these phone conversations? Yes.
Speaker 3 (27:17):
Is it zoom? No?
Speaker 6 (27:18):
It wasn't zoom.
Speaker 1 (27:19):
Right, that was pre-Zoom. Actually, does he FaceTime? No,
he didn't FaceTime, right. But like we were talking about
this one debate for CNBC that he wasn't going to
be at, and I'm.
Speaker 6 (27:28):
Like, don' going much?
Speaker 3 (27:29):
Not going?
Speaker 1 (27:30):
And I'm like, Donald, why don't you go to a
local small business and sit there at the table and
just show off your business chops, right, and show people
your business? He goes, Mark, Donald Trump and Mark Cuban
don't go to people's houses and have dinner. Are you
kidding me? That's who he is. Right when we talked
about what's he going to do with the ground game,
(27:51):
I got all these religious people who are going to
do their work.
Speaker 3 (27:53):
For me. Jesus. So he, in his mind... So I
think this is very interesting, because, and maybe you know
this too, he runs a family business, so he is
in essence a monarch. It's a dictatorship. And maybe there's
not as much malevolence to his actions as oh, this
America can be a subsidiary of the Trump organization because
(28:18):
this is how I run it. And they might say, well,
we have checks and balances and division of government, and
he just thinks himself, yeah, no, we're gonna get out
of that.
Speaker 1 (28:25):
Yeah, that's the sense I get. That's what it is. Yeah,
this is my country, right, everybody else is bad? Donald good?
Speaker 3 (28:31):
Okay, and so Donald good. So whoever thinks Donald good also.
Speaker 1 (28:35):
Comes along for the ride, right? I mean,
he just brought hate and anger to politics, and that
is a sales pitch.
Speaker 3 (28:43):
When you talk to him, is that a part of his
general conversation, or do you think that is a strategic
demagoguing, that he wants to get that emotion?
Speaker 1 (28:53):
That wasn't what we talked about. But I think that's...
Donald is a sales rep. He's a salesperson. He's going
to follow what works and whatever. He's going to try
all kinds of different things. He's going to talk to
all kinds of different people and he'll try things out
and if it works, it's going to.
Speaker 3 (29:08):
He's going to do more of it. Do you see
him on his heels?
Speaker 9 (29:10):
Now?
Speaker 3 (29:11):
When was the last time that you sort of had
these counseling sessions.
Speaker 1 (29:15):
No, there weren't. I talked to him probably twenty nineteen.
Speaker 3 (29:19):
No.
Speaker 1 (29:19):
I talked to him during the pandemic because I was
trying to help him with different things. Look, he's still
the president of the United States. It's still our country, right,
So I tried to help him with PPE and a
lot of different things, a lot of medical care type stuff.
Speaker 3 (29:31):
Sure. We got... who suggested the bleach? Is that you? Is
that... everything's going great, everything's working. Cuban, he says,
have you tried drinking Liquid-Plumr? I did not say
drink, I said inject. All
Speaker 6 (29:47):
right. I heard. So.
Speaker 3 (29:52):
All this is going on, you've soured. So what is
your relationship now with this tech world? And how does
AI fit into that? And how do you remain bullish
on those innovations when they so clearly are working to
avoid any kind of regulation of these new innovations.
Speaker 1 (30:12):
Okay, two things. One, they're there because they're rich, not
because they're tech bros, or because they just happen to
make their money in tech. I don't think that's really
applicable. The AI side: you know, I've been in technology
for a long time, and you can always look at
a new tech, PCs, networks, the Internet, streaming, whatever, and say, okay,
in five years, this is what's going to happen, right?
(30:32):
You have a good sense. With AI, you can't do that,
with large language models. We have no idea whether it's
going to zig or zag or what the impact is
going to be. And that's the good news and the
bad news. The good news is, we're dominating right now
globally, the United States is, in
terms of the quality and the
impact of the AI and the advancements that we're introducing
(30:54):
in AI, the research that we're doing, we are, without
question, the leader, and that's really important from a defense perspective, military,
et cetera. And also you know, from a business perspective,
it's going to have a big impact on this country.
I personally think it's generally positive, but there's a lot
of uncertainty to come.
Speaker 3 (31:11):
And so, you know, what gives you the hope
that it's generally positive? Because, as a counterpoint, we
heard the same thing about social media, and we heard
the same thing about all these different innovations of the connectivity.
And yet every time I turn on Congress, Zuckerberg is
up there like, look, I'm really sad. I didn't know
(31:32):
it was going to kill all your daughters.
Speaker 1 (31:35):
Like, no. Remember, it's still just a short window. Social
media, you know, has really only been prominent the last six
years, and I think we'll learn and we'll evolve, and
the same thing will happen with AI. There's going to
be points in time where it's up, right, and people
are using it. But I think over time, particularly with
gen Z, right, gen Z is a different beast. You know,
boomers are idiots. I mean, we
(31:58):
went from sex, drugs, and rock and roll to Fox News.
I mean, it doesn't get any worse than that, right?
And they're trying to... we haven't done that. And they're
trying to define regulations, right? And that's hard, right, that's
really, really hard. And so I think gen Z has a
better understanding and a better feel for AI and where
it's going and would maybe be able to come up
(32:20):
with better uses, better implementations, and better regulation.
Speaker 3 (32:23):
Does it concern you, the implementation time frame? So
when you think about the industrial revolution, right, and you
think about the disruption or globalization, the disruption to the workforce,
the way that labor can travel and labor cannot travel
but capital can, right, and all these different things that
were kind of a race to the bottom for American
workers to a large extent. But all those changes took
(32:44):
place over sometimes a century, sometimes decades. The changes in
AI will disrupt, right? So when you've got something that
disrupts to maybe even a larger extent than globalization did,
to maybe a larger extent than the Industrial Revolution did,
and it's going to happen by Thursday, in what world
are humans in any way capable and set to withstand
(33:09):
that disruption.
Speaker 1 (33:10):
I think we'll be able to withstand it. But I
think it's going to be very disruptive. And the problem
is it's going to happen anyways. And you know, somebody here,
your son at Duke right, can say I've got this
great idea, I'm going to implement it with an open source,
large language model and I'm going to take it in.
Speaker 3 (33:26):
That's so weird he did say that to me, right.
Speaker 1 (33:31):
But gen Z is different, right? Gen Z, I think,
looks at humanity differently, is kinder. Like, I've got
three kids, fifteen, eighteen, and twenty-one, right, and
they're just nicer, right? They're not like we were.
Speaker 3 (33:45):
So are you trying to say, like, are we weathering
the last gasp of this kind of more
misanthropic moment in history? So in your mind, whatever happens,
this is going to be a more misanthropic decade that
will be ameliorated by this younger generation.
Speaker 1 (34:06):
Right, I hope so, because the regulatory capture, the way
we've always done politics right now, is everybody's chasing power,
and nothing will give you more power than military and AI.
And I think the algorithm, I mean, we talked going
back to algorithms again, right, driven by AI. That's the
most powerful element in the world right now because everybody
(34:28):
just gets whatever they're seeing reinforced. And if you want
to influence somebody, just manipulate the algorithm and you'll get
their attention.
Speaker 3 (34:35):
And so... but I think... so what's the remedy for
that, if there's no one working on a pushback? If pushing
back on that is considered...
Speaker 1 (34:42):
Censorship. It's just one of those
things where you've got to go through it.
Speaker 3 (34:46):
It's an evolution of a new media model.
Speaker 1 (34:49):
Just an evolution of technology, right media, right, because if
we don't do it, the Chinese and the Russians will
because the only thing that holds AI back is processing power,
electricity and ingenuity, right, and I think our ingenuity wins.
I'm still a big believer in American exceptionalism. I
still believe that we've got the best technologists in the world,
and I think that's why we have to open that
(35:10):
door for AI.
Speaker 3 (35:11):
So ultimately it becomes a question of the world is
going to be carved up in the way that it's
always been somewhat carved up in terms of its resources.
The question is is it carved up by the Western
world or is it carved up by somebody else, a
different world? And do they set up a different system.
And I'm assuming that Russia and China see a unique
vulnerability in the West's ascension in this moment that's been
(35:34):
the world order since nineteen forty-five.
Speaker 1 (35:36):
Everybody looks at it, right, and looks at it and
says, he who controls AI, right?
And so, But we've done a good job of limiting processors.
The New Semiconductor Act will help us quite a bit,
and we'll bring things. You know, we were already doing
most of those things here.
Speaker 3 (35:54):
Right, So how do you resist the ring? Right? So,
like Lord of the Rings, the ring of like it's
the one thing. Boy, when you get the ring, you
just don't want to let it go. How do you
resist that? Because you've got the money, You've got the influence.
You could be that guy. You could be setting those
things up and doing all that. But you're just trying
to get us like better generic aspirin.
Speaker 6 (36:15):
Like what is happening?
Speaker 3 (36:20):
No, No, I tell you that.
Speaker 1 (36:22):
I know what I know, and I know what I
can do. I know what I'm good at.
Speaker 3 (36:25):
Okay, And you're not tempted by the ring that's in front.
Speaker 1 (36:31):
Of it, because I think there's a different ring, right,
Because yeah, AI could be the end all be all technologically,
but that doesn't play to my strengths, and the ups
and downs and ins and outs are just not me.
But you want to talk about pharmacy, what could be
better than fixing the healthcare system in the United States of
America and making it affordable?
Speaker 3 (36:50):
But that is your thing?
Speaker 1 (36:53):
There's the path there, there is.
Speaker 3 (36:55):
I imagine when you get in that position, at that height,
you can't help but hear the siren call of you
could run this whole thing. But maybe a little bit,
maybe a little bit, but you know, just I hate
to use the cliches, but the way I was raised,
I've got three kids, right, and I don't want to
miss that, you know, I don't want to be ninety
(37:18):
five and look back and say I was president, but
I didn't get to know my kids at all. Right,
you know, I'd rather say I fixed healthcare and everybody's healthier, and
everybody's got a better world to live in. And my
kids and I have friends, were close. You know, they
bring over the grandkids and the kids' kids, and that's
just more important to me. Right. And do you have
your eye on other industries right now where you can
(37:39):
do sort of the same thing.
Speaker 1 (37:41):
It's this pharmacy where, you know, Cost Plus Drugs dot com,
I'm gonna get that sales pitch in there. Cost Plus Drugs
dot com is literally in the process of having a significant
impact on the drug market. Right. We are pushing generic
drug prices down, and now we're right around the corner from.
Speaker 3 (37:56):
Well, you're negotiating prices in a way that hasn't been
been done privately.
Speaker 1 (37:59):
Right, So when you go prior to us, there was
no transparency whatsoever, right, and so nobody knew what the
price of any medication was, whether you're an employer paying.
Speaker 3 (38:07):
For you, and it's just run by these boards.
Speaker 1 (38:09):
Yeah, the these pharmacy benefit managers are dictating prices left
and right. They're basically stealing money from employers and employees.
And so we walked in there and said, what's the
one missing piece transparency? So when you go to costplus
Drugs dot com, you put in the name of the
medication you might take. Let's just say tadalafil, right,
I know you don't know what that drug is.
Speaker 3 (38:28):
I'm so hopped up on I have no idea. Do
you know what it is?
Speaker 1 (38:33):
I don't. Generic Cialis.
Speaker 3 (38:47):
As I said before, I am so hot.
Speaker 1 (38:52):
When you go to Cost Plus Drugs dot com and
you put in tadalafil or whatever it may be, first
thing we do is we show you our cost. Then
we show you our markup, which is always fifteen percent,
and everybody gets the same price because we're mail order
to start, and we're starting to partner with pharmacies now. There's
a shipping fee and then there's a fee for the
pharmacists to review everything. And when you do it that way,
this is legal. Of course it's legal.
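The pricing model described here is simple enough to sketch: published cost, a flat fifteen percent markup, a shipping fee, and a pharmacist review fee, with no hidden spread. The sketch below is illustrative only; the dollar amounts and the `shipping_fee` and `pharmacist_fee` values are placeholder assumptions, not actual Cost Plus Drugs figures.

```python
# Sketch of the transparent pricing model described above:
# published cost + flat 15% markup + shipping fee + pharmacist review fee.
# All dollar amounts are illustrative placeholders, not real
# Cost Plus Drugs figures.

def transparent_price(cost, shipping_fee=5.00, pharmacist_fee=5.00):
    """Itemize a quote the way the model describes: every component visible."""
    markup = round(cost * 0.15, 2)  # the flat fifteen percent markup
    return {
        "cost": cost,
        "markup": markup,
        "shipping": shipping_fee,
        "pharmacist_review": pharmacist_fee,
        "total": round(cost + markup + shipping_fee + pharmacist_fee, 2),
    }

# A drug that costs the pharmacy $20 is quoted at $33, fully itemized.
print(transparent_price(20.00))
```

Because every line item is published, there is no room for the undisclosed spread a traditional middleman takes between what the employer pays and what the drug costs.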
Speaker 6 (39:13):
Yeah, it's good old American capitalism.
Speaker 1 (39:15):
But let me just tell you the impact. There's a
drug called imatinib for chemotherapy that, when we started, if
you just walked into a big chain pharmacy, the price
was going to be two thousand dollars. You go to
Cost Plus Drugs dot com, it's under thirty. There's a
drug, droxidopa, right. I had a friend who was in
a horrific car crash and he needed this drug, droxidopa,
and lost his insurance. It was going to be thirty
thousand dollars every three months. I'm like, let me just
check the site. We can get it for sixty-four dollars
a month, and the price has gone down since, all
because we were transparent.
Speaker 3 (39:52):
But like, weren't there dudes like Martin Shkreli in jail
for shit like that, Like when you jack prices up
like that? And why can't the United States government negotiate
in terms of if you're the largest customer to any industry,
it's criminal that you wouldn't use any leverage to make
those things more available.
Speaker 1 (40:11):
The problem was there's this thing called pharmacy benefit managers, right,
and they're basically responsible for doing the negotiating with to
a certain extent, Medicare, but with all the large employers,
if you're one of those big companies that cover one
hundred and fifty million employees across the country. That's who
you negotiate with. And the first rule when they negotiate,
(40:32):
they say, is you can't talk about this. It's like
fight club. You cannot say what your price is. You
can't say what we're doing in our negotiation. And they
got so big doing that that nobody ever questioned them.
We come along, and actually Martin Shkreli plays a little
part in this whole thing because when he got thrown
in jail, I was talking to Alex Oshmyanski, my partner,
(40:52):
and it's like, if this dude can just jack up
the price, it is not an efficient market. That means
nobody knows what the real cost is. If we publish
our price, boom, the whole world's going to change. And
as it turns out, the FTC just came out with
this report criticizing the PBMs. They used our pricing data.
The smartest thing we did was so now.
Speaker 3 (41:11):
So this brings up, so the FTC is the Federal Trade Commission,
and boy, there's nothing the tech world hates.
Speaker 6 (41:17):
More than the FTC. So how does
that square?
Speaker 1 (41:21):
Well, you know, like any agency, they do some things right
and some things wrong. So, but in this case with the PBMs,
they're crushing them and it's justified.
Speaker 3 (41:30):
Now is it something that can't be done throughout the
healthcare because one of the difficulties with healthcare is the
contingencies of you can't really comparison shop. When you have
a heart attack, you're basically saying, drive me to the
closest hospital and take care of it. But those prices
you're talking about, you could get heart attack treatment at
this hospital it's one hundred and fifty thousand dollars, but
you go up the street and it's twelve thousand.
Speaker 1 (41:50):
And it's all about who knows what's happening and who's
paying. When you, you know, God forbid, have a
heart attack and you go there, and let's just say
it's going through your employer, right, your employer has no
idea what they're paying.
on drugs first, and now we're just getting it approved today,
we're going to publish all contracts. Never before has it
been done where for my companies. We're saying, if you
(42:12):
want to do business with us, if this hospital system
wants to work with my companies whatever it may be,
We're going to publish them and put them online for
anybody to see all of our pricing.
Speaker 3 (42:22):
But so, all right, I think that's fantastic.
Speaker 6 (42:25):
But I'm curious.
Speaker 3 (42:27):
Why is there such pushback on this idea of applying
those same kinds of competition and things to our healthcare system.
You know, we talk about how we have a privatized
healthcare system and it's the best in the world, but very
clearly it doesn't function like a free market.
Speaker 1 (42:44):
No, it's not in any way at all.
Speaker 3 (42:45):
So what is so terrible about getting everybody healthcare? Like?
Why is that such?
Speaker 1 (42:51):
But these companies, these PBMs and the big insurance companies
they call them the BUCAs, the largest insurance companies, right,
they are so big, Like, like I keep on saying,
big employers cover one hundred and fifty million people, right,
And the CEO of this big company doesn't know much
about healthcare and their health care costs, and so they
just say to them, Okay, we're going to write you
a check for a rebate, even though it's your sickest
(43:13):
employees that are paying for that rebate, right, they just
don't know.
Speaker 3 (43:16):
And it's so interesting because it's such a non-villainous,
you know, nobody ever talks about, like, big prescription benefit managers, right?
Like, that's a good tell. It's always like big
oil is going to come down, or big tobacco or
big pharma, and it's really like the PBM middle manager.
Speaker 1 (43:33):
Yeah, that's what it is, right, and you cut them out, right.
There's no reason for the big ones that control ninety
percent of the prescriptions that are filled, there's no reason
for them to exist. There are others that are called
pass-through PBMs, right, that show you all your claims,
show you all your data, show you all your pricing,
that do it for a fraction of the price. Right,
So there's an opportunity for disruption.
Speaker 6 (43:54):
I'd like to see that. Now, what's that? What
else you have your eye on?
Speaker 1 (44:00):
Healthcare? Healthcare? It's gonna be healthcare healthcare.
Speaker 3 (44:02):
I'm with that. I'm with that too. And it
might be, you know, with that money, if you could
help the Knicks get, okay, forget it, it's all fine.
Thank you very much for coming by. Always a fascinating conversation.
Check out Cost Plus Drugs dot com. Mark Cuban, where.
Speaker 1 (44:19):
Do you take that?
Speaker 6 (44:20):
B No, that's.
Speaker 3 (44:38):
Now, that's our show for tonight. Before we go, we're gonna check
in with your host for the rest of the week. Taking over The
Daily Show desk is gonna be John MOI. What are you covering this week?
Speaker 12 (44:52):
Oh John, I'll be recapping all the inspiring athletes of
the Olympics, Simone Biles, Katie Ledecky and of course, of
course that Australian breakdancing lady.
Speaker 3 (45:04):
Why? Why? You know, the thing about her? Her
dancing didn't seem, and I say this with respect, good.
It didn't. It didn't seem so good, Oh John.
Speaker 12 (45:20):
She was inspiringly terrible, inspiringly terrible, because I can never
do what Simone Biles does.
Speaker 5 (45:27):
But this all.
Speaker 3 (45:32):
This, Yes, I can do this this and quite well
I might have.
Speaker 4 (45:36):
Yeah, all this stuff, John, I could do this.
Speaker 15 (45:40):
All this?
Speaker 3 (45:41):
What what how you do this all day? I can
all day.
Speaker 6 (45:48):
I could have done it all day before COVID.
Speaker 1 (45:50):
But now I'm tired.
Speaker 6 (45:53):
Twenty twenty eight, here I come.
Speaker 2 (45:58):
Explore more shows from The Daily Show podcast universe by searching
The Daily Show wherever you get your podcasts. Watch The
Daily Show weeknights at eleven, ten Central, on Comedy Central,
and stream full episodes anytime on Paramount
Speaker 6 (46:11):
Plus. Paramount Podcasts