
February 28, 2025 33 mins
ICYMI: Hour Two of ‘Later, with Mo’Kelly’ Presents – A look at everything from Amazon Alexa getting an AI upgrade to Gmail switching from SMS to QR codes, and more, on ‘Tech Thursday’ with regular guest contributor (author, podcast host, and technology pundit) Marsha Collier…PLUS – Thoughts on “humanity achieving the Singularity within the next 12 months” AND the latest robotaxi news - on KFI AM 640…Live everywhere on the iHeartRadio app

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Later with Mo'Kelly on Demand from
KFI AM six forty.

Speaker 2 (00:10):
And what you've just been listening to is the ultimate
in recorded sound. It will make all conventional disc and
cassette systems obsolete. It's dustproof, scratch proof, digitally recorded read
by a laser and it's called the compact disc. And
that's it. The biggest revolution in the recording industry since
the invention of the long playing gramophone record. But this

(00:33):
is no ordinary disc, just twelve centimeters in diameter. The
music is recorded onto it digitally and there's no needle
being dragged through a groove. That information is being read
by a laser light.

Speaker 3 (00:48):
KFI AM six forty is Later with Mo'Kelly, Tech Thursday.
As a matter of fact, we're live everywhere on the
iHeartRadio app. Marsha Collier joins me in studio. Marsha, have
we been prone to hyperbole with these two technological conventions?

Speaker 4 (01:01):
Over the years.

Speaker 5 (01:02):
I think, I think some of them, I mentioned this last week, some of them were just on the edge of ridiculous. And I forget what major... remember that pin that came out? It was an AI pin that you were supposed to talk to, and it was supposed to... It was a pin that you wore, and nobody wanted it. No,

(01:23):
some major company just bought it for a quack a
billion dollars.

Speaker 3 (01:27):
I mean, I guess maybe they wanted the technology for some other application.

Speaker 5 (01:32):
That's it. And all this ridiculous tech that we see, that's where it is. By the way, you were talking about Gene Hackman. Yes, Gene Hackman as Lex Luthor. Oh my goodness, thank you. I never expected that performance out of him. It was so the opposite of everything that Gene Hackman ever, ever did. And with, you know, Miss Teschmacher,

(01:56):
Miss Teschmacher. Yeah, I mean, it was perfect. He was amazing.

Speaker 3 (02:03):
You think about that cast, even Ned Beatty. That cast was so underappreciated, I think, for its time, because the movie came out at a time where superheroes, Superman... I don't think the movie was taken as seriously back then as it would be now.

Speaker 5 (02:23):
And I remember sitting and watching that, because I was DC crazy as I was growing up, and I still have the early Lois Lanes and stuff like that, and I loved it so much. I wanted to grow up and be Lois Lane. And when I saw Christopher Reeve flying with Lois Lane... Oh my goodness, it was magical.

Speaker 3 (02:47):
That first Superman movie with the John Williams score. Yes. I don't think we have the superhero industry today without that movie. Yes, you could say Batman nineteen eighty nine, yeah, was big, but I think it goes back to Superman.

Speaker 5 (03:08):
Absolutely. You know who it goes back to, also, Alexa.
Go ahead, well, this is the tech segment, right, So
we're talking about technology now, and a star of many
lives is Alexa. And I hope your Alexa didn't do

(03:29):
anything when I said that. I have personally unplugged my Alexa. I did not want it listening all the time. And it is listening all the time, all the time. And it seems that, you know, it's getting better. The Seattle company, obviously,

(03:49):
Amazon is going to install even better AI products and
services that it will sell to businesses and other organizations.

Speaker 4 (04:00):
I mean our data along with it. I have a feeling, I.

Speaker 5 (04:03):
Have a feeling that that might roll along with it.
Amazon executives say Alexa will now identify who is speaking
and know the person's preferences, such as favorite sports teams, musicians, and foods. They demonstrated how a device powered by Alexa Plus could suggest a restaurant, book a reservation on OpenTable,

(04:26):
and order an Uber, and send a calendar invitation.
But we know that Google can already do that.

Speaker 3 (04:33):
That's right. But yeah, you ask it; it's polite in that way. It doesn't just do it without, you know, any pushing.

Speaker 6 (04:42):
You know.

Speaker 5 (04:43):
No, no, you spend forty dollars a head for six people, you know. But so it's coming into your house, if you continue, and I assume it's going to be free to Prime users, and it will be nineteen ninety nine a month for those who... what? Yes.

Speaker 4 (05:05):
What it's one hundred dollars a year.

Speaker 5 (05:08):
Yeah, yeah, you know, I read an article today, though, and people don't want to use AI. I mean, you can take all this noise, and every story I pick up... And that's why I picked the stories I had today, because it's not AI-centric. It's because people don't want to spread their private details. We have too many security

(05:30):
cameras at my house. I want them out.

Speaker 4 (05:32):
But there's something for me.

Speaker 3 (05:33):
It's... there's a difference between help and assuming control. And there are certain things that I don't want to relinquish to a computer entity.

Speaker 5 (05:44):
Well, how about that little Amazon drone that's supposed to be in your house, that, when you leave, gets up? It's a little drone, about four inches. It pulls out of its little station, and it has already mapped out your house, and it patrols your house while you're gone. It's a Ring device.

Speaker 4 (06:03):
My house is not that big, thank goodness.

Speaker 5 (06:05):
But think about it. You've got open mail. Well, it goes through your library: what books does this man read? Let's catalog that for Amazon.

Speaker 3 (06:14):
Well, it catalogs where the windows and doors are and
the soft points in your security system.

Speaker 5 (06:18):
So we can't even go there. Let's... I wouldn't buy it. A real quick thing: I mean, yeah, I could tell you more, but make your own decision on that. If you really want Alexa to have in-depth conversations with you, I

Speaker 3 (06:35):
Don't know see that. I don't want that. I want
there to be a clear delineation. I don't want it
in the context of an actual person. I want it to help me with my life. I don't want it

Speaker 4 (06:47):
To assume the role of a person. It's a difference.

Speaker 5 (06:51):
And they're also including AI systems built by the startup Anthropic. Anthropic...

Speaker 3 (07:00):
And for those who don't know, the word anthropomorphic is
to be human.

Speaker 5 (07:04):
Like. Right, yeah, I'd like Rosie. Remember, from The Jetsons?

Speaker 4 (07:09):
We're basically there with the exception of the physical.

Speaker 5 (07:12):
But why are the ones we have now so scary looking?
Every robot they build is scary looking. The ones in
the cartoons are.

Speaker 3 (07:18):
Better, because they're trying to get the... there's an actual answer to that: because they're trying to get the facial expressions right, and they cannot make the face look human.

Speaker 5 (07:26):
And that's what I'm saying, make it a cartoon.

Speaker 4 (07:30):
Just make it just a box.

Speaker 3 (07:31):
I don't need to have a face with two eyes and that can pull its cheeks into a smile.

Speaker 4 (07:36):
I don't need any of that.

Speaker 5 (07:37):
That's, that's weird. This is all weird, beyond, beyond everything. One last thing before we go: do you ever worry that you'd, like, drop dead alone in the house or somewhere and nobody would know that you did, just

(07:58):
in case they could resuscitate you, or your heart stopped beating and you fell down? And only since I turned fifty... right, seriously, for those of us over forty-five, I have to include myself. When you have that kind of fear, it's a real fear. Like, we don't even

(08:19):
know what happened to Gene. I mean, who knows. Google today received clearance from the US FDA for a loss of pulse detection feature that will be in the Pixel Watch three.

Speaker 3 (08:37):
Okay, I'll probably buy that, for reasons having nothing to do with this. But is that loss of pulse detection for
the benefit of the wearer or the benefit of the
person who finds you? Because the moment that there's no pulse,
I'm quite sure there's a fading consciousness at that point
as well.

Speaker 5 (08:53):
Well, okay. If your heart stops beating from an event like a primary cardiac arrest, respiratory or circulatory failure, overdose, or poisoning, it will automatically prompt a call to
emergency services.

Speaker 4 (09:06):
Whoa, okay, so, like crash detection.

Speaker 5 (09:09):
Exactly. Well, they've done a great job with crash detection, fall detection, irregular heart rhythm notifications, and the ECG app and Safety Check. I mean, loss of pulse detection is another step, and the fact that it has been cleared by the FDA, I think, is important.

Speaker 3 (09:27):
I think that's something I would look into. KFI AM six forty. We're live everywhere on the iHeartRadio app. Yes, we'll miss you, but we still love you, Gene Hackman.

Speaker 1 (09:40):
You're listening to Later with Mo'Kelly on demand from
KFI AM six forty.

Speaker 2 (09:47):
Why buy just a

Speaker 7 (09:47):
Video game from atalrator intelegrasion invest in the Wonder computers
in nineteen eighties for under three hundred dollars, the Comodol
twenty unlike games that has a real computer keyboard for
a Comodore Pick twenty, the whole family can and computing
at home plays great games too under three hundred dollars.
The under computer of the nineteen eighties, the Commodore VIC

(10:08):
twenty. Coming soon, Commodore brings you Gorf, the wonder arcade game, and Omega Race, in home versions. Commodore.

Speaker 3 (10:15):
I had an Intellivision, to tell you the truth. KFI AM six forty. It's Tech Thursday with Marsha Collier. We're live everywhere on the iHeartRadio app. What were you going to say, Marsha?

Speaker 5 (10:24):
I was. I was there throwing in my quarters at the arcade. I was just like... Pac-Man.

Speaker 3 (10:29):
It was a big deal to be able to have
video games at home. And that was right at the
beginning, with the Atari, the Intellivision. You know, that
was the closest thing we had to home.

Speaker 5 (10:41):
Remember Wolfenstein? Of course I do. Leisure Suit Larry.

Speaker 4 (10:45):
Yeah, look that's my that's my era era. Yes, those
were early.

Speaker 3 (10:49):
I actually worked in an arcade when I was seventeen, eighteen years old. It was Aladdin's Castle in the Del Amo Mall. I thought I was in heaven, working in an arcade. I could play, when I wasn't working, play all the arcade games as much as

Speaker 4 (11:04):
I wanted, all day long.

Speaker 3 (11:08):
I had a literal key to open up the games
and play as much as I wanted. There was no
better job in the world.

Speaker 5 (11:15):
That's great. As a matter of fact, I had a
business back in the olden days, and I did marketing
for different shopping centers, and one of those shopping centers
had an arcade in it. So when I came in
to meet with management, the nice guy who owned the
arcade would give my daughter a silo of quarters and
let her loose, and I knew she was taken

(11:36):
care of. I have a pinball machine at home.

Speaker 4 (11:39):
I love you.

Speaker 5 (11:42):
It's an Old Chicago. It's an old, old machine.

Speaker 3 (11:45):
No, I love playing pinball. I'm old enough to say.
I remember arcade games when they were only one quarter,
just one quarter, right, not the fifty cents, not the dollar,
but one quarter.

Speaker 5 (11:56):
And when that shiny ball came down, came down the chute, and then you'd take it, pull it back, and you thought you were really going to control things by the way you pulled it back.

Speaker 3 (12:08):
I was never very good at pinball. To tilt the
machine was a moment of shame. Well, yeah, yeah, I
was just never very good at it.

Speaker 5 (12:18):
I was never strong enough.

Speaker 3 (12:20):
Yes, oh no, I got very angry and, you know, bumped it around. But that's not why we're here. Why we are here is for you, Marsha Collier.

Speaker 5 (12:27):
Okay, so, so, so, back to technology, because I know that's why you all are here. I know they'll get back to the good stuff later. You've used antivirus software?

Speaker 4 (12:38):
Absolutely, over the years: McAfee, Malwarebytes, you know, that kind of thing. Avast? Avast. Yes. Did you know? Did I know?

Speaker 6 (12:48):
What?

Speaker 5 (12:50):
Did you know?

Speaker 6 (12:50):
That?

Speaker 5 (12:51):
The FTC found the company collected information on religious beliefs, health concerns, political leanings, financial status, and location without user consent, and sold it through a subsidiary.

Speaker 3 (13:03):
I did not know, but that's my assumption with any company,
because data is currency.

Speaker 5 (13:09):
Well, that's now. But this was then; this was, you know, twenty fourteen. So it seems

Speaker 4 (13:19):
That they got a lot of people's data.

Speaker 5 (13:21):
The FTC didn't like the idea at all, and they
reached a sixteen point five million dollar settlement with the
Federal Trade Commission, which I don't think is enough.

Speaker 3 (13:31):
Oh no, not for what they for the privacy violations
and the money that they've made subsequent to that.

Speaker 4 (13:39):
Of course not.

Speaker 5 (13:40):
And that number includes compensation for users who bought Avast software between twenty fourteen and January twenty twenty. Now, I'm one of those people. I always got a weird feeling about Avast. It always wanted more: it wanted more installations, it wanted more permissions.

Speaker 4 (14:00):
Yeah we got this, Yeah.

Speaker 5 (14:01):
Got that. And I said, no, no, all I want is this, so I dropped them. But customers can start to claim refunds from the settlement now. According to the FTC, if you're eligible for a refund from Avast, you will be notified via email between February twenty fourth and

(14:23):
March seventh, and the notice will include a really important claim ID, so print it out; even if you don't print it, don't delete it.

Speaker 4 (14:32):
Print it out.

Speaker 5 (14:33):
You'll need this to complete the Avast settlement claim form online, and they'll have phone numbers if you need extra help. You can do it. You have to file
your claim by June fifth, twenty twenty five. And it
just really pisses me off. And I wish we'd get
more money out of some of these big company pockets.

Speaker 4 (14:52):
It's not going to happen, but it's nice to have dreams.

Speaker 5 (14:56):
But on another note: two-factor authentication. Yes. Do you apply that everywhere you go?

Speaker 3 (15:05):
As many places as possible. A lot of things now, like even with Instagram, you don't even have the option. You have to have some sort of two-factor authentication. Exactly.

Speaker 5 (15:13):
And Google, it blows me away, is going away from two-factor authentication with SMS codes. And SMS codes, for those who don't know it, that's texting: Google or somebody else will text you a six-digit number or something,

(15:34):
and you have to go back to the screen and type it in, or tap and it copies over, right? Especially if it's on your phone and you don't want... It's like, huh. Yeah.

Speaker 3 (15:45):
My phone is in the other room. You say, I
got to go get the phone just to log into
my email.

Speaker 5 (15:50):
Right. But yeah, they have to do it. But that's the source of phishing. People have been phished a lot of times. The word phished means they've been defrauded via email, or data has been stolen from a marketplace or something, and it's on the black market. People will

(16:13):
buy these numbers and they will try to log into different services, and you'll all of a sudden get an SMS saying, to log in, just use these numbers. They're trying to find out which numbers are active. So it's been bad. So, what Google is going to do now:

(16:35):
They've been starting to prompt with passkeys. And I was going to do a whole thing on passkeys, because, frankly, what's a passkey? There are a billion ways to do a passkey, but I don't have time for it all. Let's just say there's a thing you can plug into your phone. There's numbers. The passkey I

(16:55):
use on some apps is my fingerprint. Fingerprint online, because that goes to Google? Yeah. But they're replacing this with QR codes, and yeah, they say, you take a picture, you scan the... Let me just tell you, it's a lot easier for Android users: just go to the Play Store, download Google Authenticator. There are a billion authentication apps, so

(17:22):
do not be fooled; accept only the original Google Authenticator. And when you open that, it gives you incredible options. I have...

Speaker 3 (17:34):
Let me jump in there. I have Google Authenticator. My only problem with it is it's not easy to figure out how

Speaker 5 (17:42):
To use it, exactly, because there's a blank page. When you press the plus in the lower right-hand corner, it says scan a QR code or enter a setup key.
You will start seeing QR codes and setup keys more
and more. They're just going to start doing this, and
I highly recommend that you use this free app and

(18:05):
be sure to download just Google Authenticator.
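
(A quick aside on what that setup key actually is: it's the shared secret behind the rotating six-digit codes, and authenticator apps in the Google Authenticator mold generate those codes with the standard TOTP algorithm, RFC 6238. Below is a minimal Python sketch of that math; the base32 setup key in the example is invented for illustration, not a real account secret.)

import base64
import hashlib
import hmac
import struct
import time

def totp(setup_key: str, digits: int = 6, period: int = 30) -> str:
    """Return the current one-time code for a base32 'setup key' (RFC 6238)."""
    # Normalize and decode the base32 secret; spacing and case don't matter.
    secret = base64.b32decode(setup_key.replace(" ", "").upper())
    # Count 30-second steps since the Unix epoch.
    counter = int(time.time()) // period
    # HMAC-SHA1 over the counter, then dynamic truncation per RFC 4226.
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Hypothetical setup key, in the same base32 format a site shows next to its QR code.
print(totp("JBSWY3DPEHPK3PXP"))

(The QR code a site shows you is essentially this same secret, plus a label, packed into an otpauth:// URI, which is why scanning the code and typing the setup key configure the exact same thing.)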

Speaker 4 (18:09):
It's one of those things.

Speaker 3 (18:10):
It'll take you a while to get used to it,
but is it actually quicker once you master it?

Speaker 5 (18:16):
It is quicker. And QR codes... I'd been using them forever because I thought they were cool, and then I stopped using them when I realized they could be phishing.
And that's the thing. You have to be careful.

Speaker 3 (18:27):
You always have to stay one step ahead of the
would be thieves because they're always looking for a way
to intercept your information.

Speaker 5 (18:34):
Yeah, if you're looking at a sign and it has
a QR code on it, double check to be sure
it's not another sticker stuck over the real QR code,
because that's another way for them to pull data from
your phone. And that's kind of why I'm here. I
want to give you information to protect you and make
you happy to enjoy the tech we have.

Speaker 3 (18:53):
You gave a lot of information, but also I'm quite
sure you inspired some questions about that information.

Speaker 4 (18:58):
How can they further follow up with you?

Speaker 5 (19:00):
If you go to my website Marsha Collier dot com
m A R S H A C O L L
I E R dot com and go to the contact page,
there's a contact form. I don't save anything. Why why
do I not save anything? Because I don't want a list,
I do not want your information. I do not want
the responsibility of any of that nonsense, so I don't
save it. Contact me there and I will answer your questions.

(19:23):
And by the way, thank you to the people who
have sent me compliments. It really made me made me
feel good. You guys are great.

Speaker 4 (19:31):
Stefan Wolfenstein, Oh, that's right.

Speaker 5 (19:42):
He's got the... gosh, it's so long since I've heard that. You're walking through the halls now. Are we going to turn a corner soon? Or are we not going to see any Nazis?

Speaker 3 (19:52):
How do you know about Wolfenstein? And you're not even forty yet? You have to... This is my early eighties. When you said Aladdin's Castle, I almost lost... I just lost it. I was like, that was my place. We were from the same area.

Speaker 5 (20:09):
Aladdin's Castle was... find an arcade and all of us go out and play. Oh, a live remote.

Speaker 4 (20:14):
Oh yeah, I'm trying to think now.

Speaker 3 (20:17):
Tony Sorrentino has his own personal arcade in his house. It has like twelve stand-up arcade games. One of the engineers here... he's got like a fighter jet one even. Yes, he does. You sit down in it.

Speaker 4 (20:29):
Yes? Yeah.

Speaker 5 (20:31):
I also have a nineteen thirties Mills slot machine. It's a nickel slot machine. And if you want to play the slot machine, if you want to keep the winnings, you use your own money. If you just want to play it and have fun, I'll give you rolls of nickels.

Speaker 3 (20:46):
I learned something new about Marsha Collier every single week.

Speaker 4 (20:49):
I did not know that.

Speaker 5 (20:50):
Oh, I have a jukebox too.

Speaker 3 (20:52):
I think I knew about the jukebox. I didn't know about the pinball machine. I didn't know about the nickel slot machine. Yeah, I've got a roulette table too. No, but I did used to have craps.

Speaker 4 (21:05):
Okay... craps. Marsha Collier, I'll see you soon.

Speaker 5 (21:12):
Yeah, I'll see you soon. Just don't play double six no,
or is that double eight whatever that is in the corner.

Speaker 4 (21:19):
I don't play craps.

Speaker 5 (21:21):
Yeah, well you get six to five if you do
it on the.

Speaker 4 (21:27):
Is that a hint? Is that like, look at the clock? Fine, Stefan,
Bye bye.

Speaker 1 (21:33):
You're listening to Later with Mo'Kelly on demand from KFI AM six forty.

Speaker 3 (21:39):
As we were talking about tech more generally and AI specifically,
maybe you're familiar or not familiar with the idea of singularity.
That's the moment in which machine intelligence surpasses human intelligence.
And the CEO of Anthropic says, we're on that threshold
and we could be reaching the moment of singularity in

(22:03):
about twelve more months or so. And scientists have been
debating this for many years, and there's a question of
whether human intelligence and its known limitations can be surpassed
by machine intelligence, and most people believe that that might
be either unlimited or much less limited than human intelligence

(22:28):
and what that means for society, and without getting too
deep into it all, when we get to quantum computing,
there's no telling where our scientific discoveries, our medicinal discoveries
will go. And that's what scientists are trying. Computer scientists
are trying to hurry up and get us to quantum computing,

(22:48):
which and I'm not a computer scientist, but I understand
that quantum computing would put everything we have now to
shame and be able to make millions and millions of
calculations per second in a way that traditional machines today
like you and I have simply cannot based on the
binary system. I am not as much of a fatalist.

(23:11):
I'm not as much one who's predicting armageddon when it
comes to AI. I do believe that we have a
lot of growing pains that we've yet to go through
with AI because we're kind of fumbling around in the dark.
I don't believe though, we're gonna end up with Skynet.
I don't believe that we're going to have terminators running around.

(23:34):
I do believe that there will be some unintended consequences
of it, but not to the point of armageddon going
to undo the world or somehow the machines are going
to take over and eradicate all humans. Maybe I'm wrong,
maybe we're only ten years from that, but I just
don't get the sense that. Well, let me put it

(23:54):
this way: science fiction never happens that fast. We still don't have the flying car. There are a lot of things in The Jetsons we still don't have. We still have only gone to the Moon; we haven't gone
to another planet. We're not as far along scientifically as
I think that we would want to be, or we
fancy ourselves. Uh oh, Twala has come into the studio.

(24:16):
That means here comes a conspiracy theory. Go ahead, no
conspiracy theories. I just welcome the singularity. I welcome the
idea of.

Speaker 8 (24:26):
Technology, AI, and the advancements that it's making to surpass humanity and, and... become self-aware, because therein, my friend, is going to be when everything changes. Everything changes when our technology is thinking and challenging us

(24:52):
in just everything. Oh no, Mo, I don't know if you want to go to that website to look up this news story, you might want to go over here. And now you're having an argument with your AI program that's automatically installed in your computer.

Speaker 3 (25:04):
I believe they're already there. I don't think we need... There are a couple of things that we're debating here: self-awareness of computers. I believe, since they are closer to on par with human intelligence at this point, they probably have reached a level of self-awareness, or computer self-awareness, whatever.

Speaker 4 (25:23):
Self awareness.

Speaker 3 (25:24):
Yes, I'm not going to get into the whole soul and ego discussion of what self-awareness means, but I'm quite sure there is a level of awareness at this point. As far as the Turing test, in other words, would it fool a human? I'm quite sure that it would fool most humans at this point. The best AI out there

(25:46):
programming probably could fool us. The stuff that we don't have... I'm not talking about the stuff that's on your phone. I'm talking about the military-level stuff that is being used for other projects and not for general public consumption.

Speaker 8 (26:00):
I believe in our lifetime we will see a Supreme
Court case wherein AI argues for personhood. I believe that wholeheartedly.
I believe that we are moving so fast and without
any forethought for what it is we are creating, what

(26:22):
it is we're programming, that we are going to come
to that point where someone will try a case wherein
they believe, be it a Google program or an election
pro or whatever, they believe it has gained actual personage
by way of awareness.

Speaker 3 (26:46):
Mark, where are you on this? Have we reached machine
level consciousness?

Speaker 6 (26:52):
We're at the point where I think I need to reread Isaac Asimov's I, Robot. And this is no good
for anybody. You heard Marsha reiterate what we've been saying
here over and over again. Nobody wants this. The people
who are pushing it are the people who stand to
make money on it, because that means they don't have
to pay human beings enough to live indoors. Nobody wants this, well.

Speaker 3 (27:13):
The people who want it, to your point, there is a monetary application. There is a monetary, you know, carrot for all this. They want it, but not the general
public in the way that you're talking about.

Speaker 6 (27:25):
The people who want it just don't want to have
to pay a human being. That's it. And just the
fact that we can do something is never a good
reason to do it. I don't know what achieving a
singularity will do because the other thing that occurs to me,
also as an ex-philosophy degree holder, is that we're kind of passing each other going in opposite directions. Humans

(27:47):
on the way down, computers on the way up.

Speaker 4 (27:50):
Agreed.

Speaker 6 (27:50):
I mean education is getting more unaffordable and out of
reach for most people, especially liberal arts education, which makes
us more human.

Speaker 4 (27:59):
KFI AM six forty, live everywhere on the iHeartRadio app. That's test my car.

Speaker 1 (28:03):
You're listening to Later with Mo'Kelly on demand from KFI AM six forty. KFI

Speaker 4 (28:09):
AM six forty. It's Later with Mo'Kelly. We're live everywhere on the iHeartRadio app, with Twala Sharp.

Speaker 3 (28:15):
This paid promotional moment is brought to you by Waymo.

Speaker 8 (28:18):
Look, this is in no way, shape, fashion, or form a paid mention. This is merely a defense of the future of technology, the singularity on four wheels.

Speaker 4 (28:35):
What have you?

Speaker 8 (28:37):
We talk about Waymo all the time. Most of the time it is me defending it against these ridiculous claims and bogus reports that you all come up with, slandering and besmirching the good name of Waymo. Okay, you never want

(28:57):
to highlight the times when a Waymo is driving as you would yourself, correctly. Just last night, downtown LA, uh huh, two cars were on the sides of a Waymo. The light goes green. One car tries to overtake in the bus lane. No doubt... you're not supposed to be

(29:18):
driving in the bus lane. First of all, you're a car, you're not a bus. Tries to get, get slick, speed into the bus lane, swerve over, while another car, thinking, I'm gonna go around the Waymo, tries to go around, almost into oncoming traffic, where they try to merge. The Waymo's like, what are these fools doing? The Waymo slows down, goes around, waits for the cars to settle,

(29:39):
and then just goes slightly around them and goes about its business. Didn't honk, didn't give either driver the finger or get cussing. Mo, it has no fingers. Cussing? They could put some type of graphic on the window. They don't want to do that, because they already know the Waymo would be having to give the finger

Speaker 4 (29:56):
All day long. That's not what they want.

Speaker 8 (29:58):
They want the Waymo to stay safe, stay driving the way it's supposed to, and I believe that this is what Waymo is all about. And if you ask CEO Dmitri Dolgov... Dmitri, Dmitri, you know, Dmitri... Look, driving, and I'm quoting Dmitri on this, driving can sometimes be

(30:20):
nerve-racking, but in cases like this, where Waymo has provided a clear-cut case as to why it should be the future of robotaxis and driving around... sometimes, artificial intelligence prevents collisions every single time.

Speaker 3 (30:39):
Are you sure you're not paid by them? I am absolutely positive. You know what, if this was an Elon

Speaker 8 (30:49):
Cyberbus... if this was, if they were still in business, the other, I can't even remember the other name of the competition. They've gone away. Cruise. Yeah, they've gone away. Right, well, they're about to go back to the drawing board to catch up with Waymo. I just want to make sure that we're all clear that this is actually the way it's going to be. You, Mo, you

(31:10):
were behind a Waymo recently. Don't try to bury the fact that you were recently in traffic with the Waymo, where you witnessed nothing but purely good driving.

Speaker 3 (31:21):
Okay, So there were days that John Wayne Gacy didn't
kill people, doesn't mean that it was.

Speaker 8 (31:26):
You cannot tie it to mass murder. Oh, that's not fair at all.

Speaker 4 (31:35):
You got that right. It's not fair.

Speaker 3 (31:36):
As a matter of fact, I'm gonna call the Law
Brothers at eight hundred to to two to to two.
The Law Brothers have a billboard specifically to address those
grievances of people with autonomous vehicle crashes. In fact, the
billboard says, I have a picture of it: autonomous vehicle crash?

Speaker 4 (31:57):
Question mark. Get Way Mo' with the Law Brothers. Did you say he did that? Way Mo', way more money?

Speaker 3 (32:04):
Exactly. Waymo... Get Way Mo' with the Law Brothers, eight hundred, two two two, two two two two.

Speaker 4 (32:11):
That's a real company? Wait, Waymo, you should be suing.

Speaker 6 (32:14):
Yeah, let's just take this to its logical extension here, Mo,
you need to be their spokesperson.

Speaker 8 (32:18):
Are you getting paid? Do they, like, throw Mo in there by... by some chance?

Speaker 4 (32:24):
It might, it might have been just a coincidence. I'm not a paid spokesman, but I would endorse that. I'd endorse the hell out of that.

Speaker 3 (32:32):
Get Way Mo' with the Law Brothers, eight hundred, two two two, two two two two.

Speaker 4 (32:37):
That phone is going to blow up now that's gonna
blow up.

Speaker 6 (32:41):
Now it's gonna blow up. When you say that you haven't been paid by Waymo, are you saying the check's in the mail?

Speaker 3 (32:50):
No, they pay with other stuff. I think it's like strippers or something. It's not, like, monetary. It's like, it's like plugola. It's not payola... I got

Speaker 1 (32:59):
This is.

Speaker 8 (33:01):
I cannot believe. I cannot believe what's happening right now.

Speaker 3 (33:06):
This is, this is absolutely horrific, that you two would stoop to such, such new lows. Yes, look at the time. KFI AM six forty, live everywhere on the iHeartRadio app.

Speaker 5 (33:18):
More stimulating talk, no log in required. KFI AM, KOST HD2 Los Angeles, Orange County, everywhere the
