
February 11, 2025 50 mins

Meet Kari Byron, amazing television host, actor, and artist. You may know her from Mythbusters or White Rabbit Project, but I know her as my good friend. It was an absolutely grand time catching up with her and I hope you enJOY!



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The Craig Ferguson Pants on Fire Tour is on sale now.
It's a new show, it's new material, but I'm afraid
it's still only me, Craig Ferguson on my own, standing
on a stage telling comedy words. Come and see me,
buy tickets, bring your loved ones, or don't come and
see me. Don't buy tickets and don't bring your loved ones.

(00:21):
I'm not your dad. You come or don't come, but
you should at least know what's happening, and it is.
The tour kicks off late September and goes through the
end of the year and beyond. Tickets are available at
the Craig Ferguson Show dot com slash tour. They're available
at the Craig Ferguson show dot com slash tour or
at your local outlet in your region. My name is

(00:45):
Craig Ferguson. The name of this podcast is Joy. I
talk to interesting people about what brings them happiness.
Welcome to Joy the Podcast. My guest today is a
friend of mine full disclosure, A good friend of mine
and a good person and very clever and very interesting,

(01:06):
and she is a MythBuster. So that works. Her name
is Kari Byron, and you're welcome, and enjoy, Kari.

Speaker 2 (01:23):
I want to ask you a question. Yes, this is
a question.

Speaker 1 (01:26):
Although I've known you for many years, I have never
asked you this question, unless maybe I have asked you
this question.

Speaker 2 (01:31):
But we'll find out.

Speaker 1 (01:33):
Are you in any way related to Lord Byron?

Speaker 3 (01:37):
Well, according to my father, yes, but my father was
one of those big fish kind of guys that tells
really great stories. So sometimes I'm just going to go
with yes. I have never fully fully investigated it. But
my dad used to have this big leather bound Lord
Byron book that he'd read in front of the fireplace, and

(01:59):
like five am when I get up early, I'd curl
up on his lap and he would read me poetry.
And I didn't understand any of it until I was older,
but he felt this. I think it just gave him
a sense of importance. So like on his deathbed, I
read him Lord Byron poetry.

Speaker 1 (02:15):
Oh wow, that's rather lovely. And I'm sorry to hear
that your father's passed. But what a lovely thing to do?
Was he able to hear you? Did he enjoy it?

Speaker 2 (02:25):
And stuff?

Speaker 3 (02:27):
I feel like he did. He wasn't really in a
state of it.

Speaker 4 (02:31):
So yeah, you know.

Speaker 1 (02:33):
I gave my mother on her deathbed, I gave her
a photograph of Tony Curtis. Tony Curtis had been on
the Late Night Show and that was my mom's favorite.
And then she was dying and I told Tony Curtis
my mom was in the hospital.

Speaker 2 (02:47):
She's very second.

Speaker 1 (02:47):
I was going to leave after the show, and he went, oh,
and he signed a photograph for and I gave it
to my mom. But I think I don't think she
saw it, but I like to think that she knew
that it was a picture of Tony Curtis signed by
Tony Curtis. Anyway, listen, it's a rather poetic name
for someone who I think of as being a science

(03:08):
You're very sciencey. You've come to my haunted
houses in Scotland, and you're like, ah, there's no such
thing as ghosts, and you're very sciencey about it.

Speaker 2 (03:17):
But of course there are such a thing as ghosts. Obviously.

Speaker 3 (03:20):
I wanted to be haunted so badly at your house,
Like I stayed awake, I walked around in the dark.

Speaker 4 (03:28):
I was like, bring it, because I would love to
be haunted. That would be so cool.

Speaker 1 (03:32):
Yeah, that's the thing, they don't do it if you want it. Also,
I think there were a couple of Scotsmen at the
parties who'd be very happy to be haunting you in the
middle of the night if you were walking
around the castle. I watched a couple of those gentlemen
waltz you around the dance floor.

Speaker 3 (03:45):
But you do some real fun dancing at your parties.
I have never done that kind of dancing. It's,
it's like Scottish line dancing or a ceilidh, is it?

Speaker 1 (03:54):
It's reels. It's it's reels before Instagram. Now let me
let me ask you this because you are one of
a select gang that throughout the course of this podcast
I will talk to all of them, even probably including
Jamie, who is MythBusters royalty. You were the only, for
most of the time anyway, you were the only woman

(04:16):
on the team, right.

Speaker 4 (04:18):
Well, we started out with Scottie Chapman.

Speaker 3 (04:20):
Yes, she is just an amazing welder and very very cool.
I was just the longest running and consistently the first one.
I was helping Jamie out behind the scenes when I
was just interning, so like the very first episode, I
was there.

Speaker 2 (04:35):
But you were interning, right, That's what it was.

Speaker 1 (04:37):
It was just like because because you I feel like
maybe I'm making this up, but I feel like I
think of you as you were a sculptress or a sculptor, right,
that was your thing, that's where you were going to do.

Speaker 2 (04:50):
You still do that?

Speaker 4 (04:51):
Yeah, I mean some of them are behind me.

Speaker 2 (04:53):
I was going to say, are those ones behind you yours?

Speaker 3 (04:55):
Yeah, I mean I've always done it. I thought when
I was in San Francisco that I was going to
become this like really cool sculptor and my life was
going to be arty, and my creepy little sculptures didn't
really sell that well. I couldn't make a living out
of it, so I thought, oh my god, how could
I still be a sculptor and do this kind of thing?
But like it paid, So then I started trying to

(05:18):
put together a portfolio so that I could maybe go work.
It's you know, work on Star Wars because Lucas Ranch
was so close to us, and I wanted to be
like Jamie and Adam and Tory when I grew up.
I wanted to do what they did. They were model makers. Yeah,
but I ended up using those sculptures that I had
made artistically to impress Jamie, and he's like, Okay, you know,

(05:39):
maybe I'll give you a job.

Speaker 4 (05:40):
I won't pay you, but you can come work here.

Speaker 1 (05:43):
Yeah.

Speaker 2 (05:43):
That sounds very much like Jamie. He's he's for.

Speaker 1 (05:47):
He's a very singular human being. I've always liked very
much what he did on camera. I always got very
a sense of like, there were people like you and
Adam and Tory and Grant, God rest him, who were very
comfortable on camera and kind of loved it and were
very natural. And Jamie I always get a sense that
he was kind of rather grudgingly doing it because he

(06:09):
felt like someone. I felt like someone had told him
he had to do it and say you.

Speaker 2 (06:13):
went, yeah, right, I don't like it. Or am I
making that up? Was that a character?

Speaker 4 (06:19):
No? No, there is no character like Jamie is Jamie.

Speaker 3 (06:23):
When I occasionally go visit his shop and he narrates
the gearing of everything he's working on, like it's the
same stuff, like he's working for some DARPA think tank
or something at this point on some stuff I probably
can't talk about, but I just spent an hour
listening to him explain projects. But he did MythBusters because
they gave him an opportunity to do really cool, weird,

(06:45):
wild stuff.

Speaker 1 (06:46):
Right, there was a lot of explosions. You guys blew
up a lot of stuff, a lot.

Speaker 3 (06:50):
No, honestly, that wasn't his favorite. That was like one
of my favorites.

Speaker 2 (06:53):
You seem to be very.

Speaker 1 (06:54):
Drawn to explosions. Do you still get your
explosion fix?

Speaker 2 (06:57):
Now? How do you get your Semtex fix these days?

Speaker 4 (07:02):
Well?

Speaker 3 (07:03):
I do black powder artwork, so I'm still, yeah, I'll
send you some. Give me your address, I'll send you
some powder.

Speaker 2 (07:12):
It's not going to work well that you can't said.

Speaker 3 (07:14):
That it is all. It's already it's already been set off.
So I take canvases, cover them in a watercolor paper,
and then I I do these controlled explosions. They're not
very big because I'm doing it in my backyard.

Speaker 2 (07:33):
Oh my god.

Speaker 3 (07:37):
Yeah, don't tell anybody anybody, but I have I have
a storage of really old dirty black powder, like not
the clean kind, because they changed the recipe, but like
the old stuff that they would put in cannons. So
I have this old black powder. And when I was
on MythBusters, I learned how much I loved the detritus
that was left behind when we do explosions, and so

(07:58):
I'm like.

Speaker 4 (07:58):
I wonder if I could make that into artwork.

Speaker 3 (08:00):
So I started exploding things on paper and using clay
to mask things off, and I made these kind of
really cool explorations in What's Left Behind, which is basically
at that point just a charcoal painting. And then I
discovered there's other artists that do this. There's like a
really famous Chinese artist that does this much bigger and
better than I do. But it's it's just another way

(08:22):
I can have an expression of art well, still blown, but.

Speaker 2 (08:26):
It's got to be it's got to be abstracts.

Speaker 1 (08:29):
Like, it's not like you set something off and suddenly
there's a horse or something like that.

Speaker 4 (08:33):
Right Actruly, I do a lot of abstract stuff.

Speaker 3 (08:37):
All of my abstract stuff has a theme of connectivity,
humanity and quantum physics. But then every now and then,
like I will do pieces where I take this polymer
clay and then I will extrude it into these long,
skinny pieces and I will put it on a paper.
I did one recently for my friend who had lost

(08:58):
her dog, and so I did a portrait of her
dog using this clay to kind of mask off an
area that would create negative space, and then I exploded
black powder behind it, and it makes this really cool
charcoal painting. But it's almost like it's like when you
look at the negative space is white and then.

Speaker 2 (09:16):
The background the explosions.

Speaker 3 (09:20):
Yeah, but it creates this sort of chaos because like
the little the grain that I use, I use a
really big grain because I like how it shoots around
the page and makes these little like squiggly marks, and
then parts of it kind of explosive.

Speaker 2 (09:34):
Do you have to use special paper you blow up
your paper?

Speaker 3 (09:37):
Well, I've I've done a lot of experimenting, as you
know I like to do, and I've figured out, you know,
it's if you contain the black powder, it creates an
explosion and that's when things get destroyed and there's just chaos.
But if you if it's open air, it just goes
and it just leaves charcoal behind. I mean, mind you,

(10:01):
My daughter hates it because the smell of sulfur lofting
up into her room.

Speaker 4 (10:08):
She's like, oh, it smells like smoky farts. I hate it.

Speaker 1 (10:13):
Well, I do quite like the smell of smoky for
or I was going to say sulfur, and then I
said smoky farts. But I say, I think I meant
smoky farts. What I'm saying is that that's that's kind
of my smell is smoky.

Speaker 2 (10:25):
I like to go smoky. I remember actually.

Speaker 1 (10:28):
There was an episode of MythBusters where you guys
did collect your farts in a jar?

Speaker 4 (10:33):
There was we did a lot of farts are funny.

Speaker 2 (10:37):
They are I totally agree.

Speaker 1 (10:39):
I always killed but they but they I do remember
because I remember seeing you doing sit ups and Adam
doing sit ups and catcows and stuff like that to
try and collect to see how much was so much
gas a human creates in a day or something.

Speaker 2 (10:54):
I don't know. It was some it was fantastic. I
remember that.

Speaker 3 (10:57):
There was a lot of things that I think I
worked on them. And do girls fart?

Speaker 2 (11:02):
I mean wow, that made it as far as the show.

Speaker 4 (11:06):
Girls, I mean yeah, it was a bunch of mini myths.

Speaker 3 (11:10):
And that one they rigged up a pair of hydrogen
sulfide detecting panties.

Speaker 4 (11:19):
Hey, when Jamie says panties, it freaks me out.

Speaker 2 (11:23):
Yes, I'm in London right now, so you can say pants, mean.

Speaker 4 (11:27):
I can say pants.

Speaker 1 (11:28):
Yeah, yeah, well, pants, and then then it will be underpants.
But if you said it's a whole thing, you know
how it is. Tell me that though, when before you
were in because I know I met you when you
were doing MythBusters, because I was such a fan of
the show. And then you guys came down and we
started kind of hanging out and stuff like that. But

(11:49):
before your life in MythBusters, I mean you were
straight out of college, right.

Speaker 4 (11:56):
No ish?

Speaker 3 (11:57):
I So after college, I packed a backpack and I
traveled around.

Speaker 4 (12:02):
The world for about a year. No way did she
get country to country.

Speaker 3 (12:06):
I started going west from California, so I left San
Francisco and yeah, I almost said that year, Okay, this
is going to date me up.

Speaker 4 (12:16):
In nineteen ninety eight, I left and.

Speaker 3 (12:19):
I went to Rarotonga and Fiji and New Zealand, Australia,
worked my way up through Bali, through Southeast Asia, Japan, India, Nepal.

Speaker 4 (12:30):
Worked my way over to Europe and down to Egypt.

Speaker 3 (12:33):
A couple of times, I just I just kind of
kept moving. And I started out the trip with another girl,
and then a couple of countries in we split off
and did our own things.

Speaker 4 (12:42):
So I was traveling alone a lot and making friends
along the way and lazies.

Speaker 2 (12:46):
You have a daughter. Would you, would you let your
daughter do that?

Speaker 1 (12:49):
Then?

Speaker 3 (12:50):
I feel like at that age she would be an adult.
And it was the most transformative educational experience I ever had.
I feel like my entire life has been shaped around
the mission to build bridges and connect with people and
just feel that humanity. And I learned so much about

(13:11):
myself in the world. And you know, when I left college,
I was still in that sort of like crazy phase,
and I feel like it grounded me in a really
interesting way.

Speaker 2 (13:19):
And where did you go to college? What did you study?

Speaker 3 (13:22):
San Francisco State. I wanted to go for film, sculpture.
I just wanted something arty at the time, which is how
I kind of ended up there.

Speaker 2 (13:30):
Did that come from your parents, then? Were you an arty family?

Speaker 4 (13:33):
No, I mean the Byrons.

Speaker 1 (13:36):
Well, I mean I mean, but Byron was first of all,
He's Scottish and I think he was a bit of
a bit of a drunk actually.

Speaker 2 (13:45):
Byron, Oh, yeah, he was.

Speaker 4 (13:47):
He was wild.

Speaker 3 (13:49):
I mean, I am not entirely proud of some of
the things from that side. His father brought, uh, venereal diseases
to, like, the Cook Islands.

Speaker 4 (14:00):
I think that I think there's been a lot of
a lot of mayhem.

Speaker 2 (14:03):
But he would he.

Speaker 1 (14:05):
Would definitely be canceled now, Byron. There's no way he
would have survived social media, no way. There's so many
people, you know, that would not have survived that.

Speaker 2 (14:15):
Uh.

Speaker 1 (14:16):
Although I don't know, you canceled a drunk poet, but
probably you don't drunk put with the title. Well that's it,
You're canceled. I don't care.

Speaker 2 (14:27):
I don't think it would matter. But the fact I have.

Speaker 3 (14:31):
A lot of bastard children. He had one child that
he claimed. Now this is this is sort of the
the ancestor that I tell my kid about because I
was so proud. So he had a daughter, Ada Lovelace,
who she was really sick when she was young, and
her mother was not into Lord Byron. So she she

(14:51):
stole her away and tried to break the poetry out
of her with math.

Speaker 4 (14:56):
She made her study.

Speaker 1 (14:57):
Math, so kind of like pray the gay away, but with
math. That's crazy.

Speaker 4 (15:01):
Yeah, she's just like, I'm going to get all the
art out of you. I'm going to, I'm going to
train you in math.

Speaker 3 (15:06):
So when this woman grew up, she hooked up with
this guy Charles Babbage, who was and they created He
had created a counting machine like a giant calculator, and
she used that machine as an application to create rugs
with pictures on them. She was sort of the grandmother

(15:26):
of computer science, because she's the one that figured out
with all those ones and zeros what you can create. So
we wouldn't have, I don't think, computers today if it
wasn't for that connection between art and science.

Speaker 4 (15:40):
So she's the revolutionary.

Speaker 2 (15:42):
Yeah, that is fascinating.

Speaker 1 (15:44):
It was also, I mean, the idea that math would
push the art out of someone is such a ridiculous notion.
You think of Leonardo da Vinci, who is pretty much
a mathematician who did art, or an artist-mathematician, and a
pretty good one as well, and probably gay.

Speaker 4 (16:04):
I've heard that, I've.

Speaker 2 (16:04):
Heard, Yeah, I think so. I think.

Speaker 1 (16:06):
I think back then the sexuality in medieval Italy, Middle
Ages and Netlie it was like, you know whatever, you know, male,
it's a fluid, Hey, what's the coming ago? It doesn't matter,
everything is fine. So you leave there and you go
with me here. How did you end up, you just went
to MythBusters? This was an intern job, that was because

(16:26):
because of the connection with Industrial Light and Magic, right,
was that the thing you wanted at the time?

Speaker 3 (16:33):
Uh, MythBusters wasn't there. So I just got a job
for Jamie.

Speaker 1 (16:37):
Oh, it was just Jamie. You were just working for Jamie.
How did you meet Jamie, for goodness' sake? I mean,
I can't imagine Jamie's a person that you would meet socially.
How did you, how the hell did you bump into Jamie?

Speaker 4 (16:47):
Yeah, it wasn't socially.

Speaker 2 (16:49):
No, I can't imagine.

Speaker 3 (16:51):
So he had sort of a general manager of his
shop that was teaching a sculpting for special effects class
to college kids.

Speaker 4 (16:59):
And one of my friends was at that class.

Speaker 3 (17:01):
And he was like, oh, Kari would love this, like
it's this incredible workshop. Like you should go hit him
up and see if you can get an internship because
this is what you've been trying to do. And I
heard he gets everybody their first job. So I put
together a portfolio and he, you know, introduced me to
Jamie and the next day I was back. I mean,

(17:22):
I had a job as a receptionist and I called
in sick. I just started showing up at Jamie's shop,
like just in case Jamie was going to throw me
out because you know, at first he wasn't really that
impressed with me. I have to say, he was just.

Speaker 2 (17:36):
Like some guy to impress, you know, I mean, it.

Speaker 4 (17:38):
Takes a minute.

Speaker 2 (17:39):
It takes a minute.

Speaker 1 (17:47):
What happened though with MythBusters, it's such an odd
thing because it was such a strange. I always felt
very connected to MythBusters. I always thought that the
late night show that I did and MythBusters were kind of
out of the same thing. They were anomalies, They shouldn't
really have happened, they shouldn't really have been successful.

Speaker 2 (18:05):
But they both were by people who kind of like.

Speaker 1 (18:08):
Drifted into it for kind of sideways and the idea
of you know, when you became you know, visible and
famous and stuff like that. Because this sounds like to
me that that was something that you were cultivating or
even wanted to be part of.

Speaker 3 (18:23):
So I know, you said I looked like I was
comfortable on camera, but I was not. I was a
shy person, right that pretends to be extroverted. So, I
mean it was hard at first.

Speaker 1 (18:36):
Yeah, I think it first, but it kind of it
gets easier right then, you get it didn't get easier
over time.

Speaker 2 (18:41):
I mean, how long did you guys do that show for?
Ten years? Yeah?

Speaker 4 (18:44):
Over ten years?

Speaker 1 (18:45):
Yeah, I mean it's yeah, it becomes a thing. And
then Grant coming in. He came in in season two
or something else, season three, is that right?

Speaker 4 (18:56):
Don't remember what season?

Speaker 3 (18:57):
Seasons are difficult for me because it was the Wild West
of networks, and you know, we made more than three
hundred and fifty episodes, but like the first season of
MythBusters was like three episodes, and then the second was
like ten, and then they realized our contracts only needed
to be renegotiated per season, so all of a sudden,
the season would be thirty six episodes.

Speaker 2 (19:17):
Oh man, that's awful. That's awful.

Speaker 1 (19:22):
Do you still what I mean, what does it look
like now, post MythBusters? Well, I mean we lost Grant,
which was such a terrible, terrible shock, I mean, just awful,
and everyone seems to be doing rather well lately, you know,
I mean you now have that you're like you're the
science thing that you do?

Speaker 2 (19:41):
What is that?

Speaker 3 (19:43):
So? You know, after MythBusters, Tory and I tried and
tried to come up with other shows that we might
do the same thing or give us this great experience,
and you know, nothing took off past the season, but
you know, we were friends and we want to just keep.

Speaker 4 (19:59):
Working. You know, if you want a scoop right now.

Speaker 3 (20:03):
The reason that I'm setting up the studio and it
looks so blank right now is that Tory and I
might be in pre-production, putting together a podcast.

Speaker 1 (20:11):
Oh that's a great idea. I mean, of course that's
the way to do it now as well. I mean,
I mean, obviously you can have this podcast. I'm I'm
going to do this until my contracts up, and now
I'm not doing it anymore.

Speaker 4 (20:21):
Then, Yeah, I would I like.

Speaker 2 (20:24):
To quit things. I just like to Ah, it's enough
of that to try your career.

Speaker 1 (20:28):
Yeah, my god, it's enough of that.

Speaker 2 (20:32):
We can try something else now.

Speaker 1 (20:33):
But so, what would you do in the podcast if,
hypothetically, you and the great Salvatore Belleci were going to
do a podcast together? And I'll have to get him
on, or else you'd be mad at me. And also
I would like to talk to them. But what would
you do in your podcast if you were going to
do a podcast with Tory?

Speaker 3 (20:52):
So we had always said with you know, the real
show was never what you guys saw the real show
was the hilarity behind the scenes because we were all friends.
I mean, we were pranking each other. We were talking
about weird stuff that we saw.

Speaker 4 (21:06):
We're like, oh my god, did you see that thing
about the zombie ants?

Speaker 3 (21:09):
You know, we would spend all this time with the
weirdness of trying to produce the show. You know, hey,
can you give us a cow stomach? With all the
ventricles attached? Like, there's so many behind the scenes stories,
plus all of the weird intricacies of the conversations that
we would have. So it's basically going to be like,
what's going on now? What things interest us? Relate it

(21:30):
possibly to something back on MythBusters, because that's our shared
history and we have so much of it. And then
you know, maybe talk to some experts about things like cryogenics,
Like let's go talk to a professor that can tell us,
if it's possible.

Speaker 2 (21:45):
Do you know anything about cryogenics.

Speaker 1 (21:47):
Look, I'm sixty-two years old though, Kari, so I
know I'm kind of much more interested in cryogenics than
I used to be when I was I'm.

Speaker 3 (21:54):
Going to be honest with you, it's like trying to
make a cow out of hamburgers.

Speaker 2 (22:00):
Yeah, it's not going to work.

Speaker 1 (22:02):
It's not going to work, all right, Well back to
philosophy then, so what about the singularity? What about putting
the human personality into the machine, the deus ex machina.
Is that going to happen? Can you get inside the machine?
Can humans live as.

Speaker 3 (22:24):
I am not an expert. I am somebody who likes
to talk to experts. I think that we're not going
to be able to do that because the mysteries of
all the electricity that happens to us and why it happens.
I mean, we're just these electrified meat puppets that you know,
it's a mystery why we exist.

Speaker 4 (22:42):
I don't think that that's going to happen.

Speaker 3 (22:44):
But I am incredibly impressed by what AI is doing.

Speaker 4 (22:50):
And I have a lot of friends.

Speaker 3 (22:50):
I live in the Silicon Valley and I have a
lot of friends in DC and all of the policy
people. I know, I know people in AI that are
you know, the luminaries, and they are hopeful, they are fearful,
and they are astounded by how quickly it's developing. And
now AI through the Internet. It's not just AI talking
to us, it's AI talking to each other. So there's

(23:13):
just this whole undercurrent of robots talking to each other,
and it's the thinking is getting.

Speaker 1 (23:19):
Human. Can we, can we, can we conduct surveillance
on these robots? As they talk to each other, presumably
they're not keeping secrets.

Speaker 3 (23:28):
No, I don't think they're hiding it. I don't think
we would fully understand it, but... I mean, DeepSeek
is the latest AI to come out, and really,
really, yeah, I was really interested in this, because it
shows you its thinking and how it comes to its conclusions,
which is it's a little different and it's so human
in the way that it relates. I think that that's

(23:50):
what's so interesting to me is that people are trying
to make it have humanity so that it can connect
with you better, so that you can feed the beast,
so you can give it more material.

Speaker 4 (24:00):
It's it's it's.

Speaker 3 (24:01):
Wildly interesting, but it also only can feed off of
what we give it, so right, I do think that
humans in their creativity, if you don't put it online,
it doesn't have it.

Speaker 1 (24:14):
Yeah, or just take the battery out. I mean, I
always thought that about Terminators.

Speaker 2 (24:18):
Like, just don't plug it in, for God's sake, kill the battery.

Speaker 1 (24:22):
Just get the battery over there, or pour some water
on it every time you like, drop it in the toilet.

Speaker 2 (24:28):
That's said, that will take care of it.

Speaker 4 (24:30):
But the just flush the terminator.

Speaker 2 (24:32):
Flushinators, take it in the.

Speaker 4 (24:34):
Toilet, flush it like an old high school movie.

Speaker 2 (24:36):
Look look, terminator. What's that that looks like?

Speaker 1 (24:38):
I think, John Connor's in the toilet, Terminator, and then
a swirly and you're done.

Speaker 2 (24:45):
I think that.

Speaker 1 (24:47):
I think that there's so much talk about it though,
the AI. I mean, every conversation I have with
everybody right now, at some point, if you talk to
anyone for more than ten minutes, you're talking to them
about AI.

Speaker 2 (24:58):
Once you get past you know, the like or whatever.

Speaker 1 (25:02):
And I'm fascinated because most people I talk to are
quite afraid of it. But these are the people who
don't use it. People who use it are not afraid
of it at all. I talked to, I mean,
a surgeon who was like, Oh, this is the greatest
leap for surgery. This is amazing, this start, I mean,

(25:22):
this is just it's changing the game. We're going to
be able to save so many more lives with this
kind of technology.

Speaker 2 (25:28):
I mean, he's really gung ho about it. He loved it.

Speaker 3 (25:31):
I mean, it's like any technology that little, you know,
iPhone in my hand revolutionized my life. I have so
much more connectivity. But it's also something that sucks me
into the light. You know.

Speaker 2 (25:41):
Yeah, it's time.

Speaker 3 (25:43):
But I mean, that's, if you're talking to people who are afraid,
I'm thinking that you're mostly talking to adults who came
up without it.

Speaker 1 (25:51):
I really have two children, and one of them already
is an adult. So yeah, I talked to one child
pretty much that's it. He's fourteen, so I mean he's
not afraid of that at all. Yeah, of course, right.

Speaker 3 (26:05):
But, so, one of the things you were asking, post-MythBusters:
I am currently the director of the National STEM Festival,
which is basically the nation's science fair, right, and a
lot of these kids are using AI machine learning in
such interesting ways to try to come up with solutions
to big challenges around the world. I mean everything from uh,

(26:27):
there's this kid Tyler in Connecticut who's come up with
a cost effective test that can find iodine deficiencies, and
all of the data that comes from that is going
to actually create a base for us to understand why

(26:49):
millions of people have an iodine deficiency. And he's come
up with a two-dollar saliva-based colorimetric test
that can you know, you can send it in and
get results quickly and find out if you have an
iodine deficiency.

Speaker 1 (27:05):
Well, why would that be important, having an
iodine deficiency?

Speaker 2 (27:09):
What would that do to you if you have an
iodine deficiency?

Speaker 3 (27:11):
There are all sorts of, like, neurological and physical issues
that come from any sort of deficiency. I mean, some
of these kids are using it for like last year,
there was this one girl who used pictures of eyes
and the machine learned to be able to detect if
you had anemia. Because anemia is a real problem. It

(27:33):
can create all kinds of problems, especially if you're sick
and old. And her grandmother had anemia and it was undiagnosed.
So she came up with a way for machine learning
to predict anemia through pictures of your eyes.

Speaker 2 (27:44):
I mean, that's pretty cool.

Speaker 1 (27:46):
So non invasive kind of testing like that, I mean,
and presumably then if you can do it from an
eye scan, you can do it from a computer. Because
like I'm in Britain right now and they have socialized
medicine here, and the NHS is struggling, trying to
get people into offices and having the right amount of people.
So I guess if you can test people using machinery

(28:09):
so that, you know, you can scan them. The idea
of having a medical done by a robot,

Speaker 2 (28:14):
I quite like that idea, especially the prostate exam.

Speaker 1 (28:17):
I feel like that would probably be if there was
a way to do that. Well, it is done digitally,
I suppose, but it's a different type of digit. But if
there was a way to do it that it was,
you know, a little less I don't know, if it
was a little more camera ish, I'd be up for that.

Speaker 3 (28:36):
Okay, I'll put it out to all these genius kids and
see if somebody can come up with something.

Speaker 1 (28:39):
Get your your kiddie think tank going and say, look,
I'm feeling like testing is the key.

Speaker 2 (28:46):
Early detection is the key on all of these things, right,
So I'm.

Speaker 1 (28:50):
Not just talking about that though, I'm sort of like
any illness early detection, right, you know, if you know early,
you can do something about it. I feel like the
development of vaccines seems to kind of go hand in
hand with that as well. These kind of weird new
vaccines that they're developing, That people are talking about vaccines
for cancerous conditions now and stuff like that. You know,

(29:12):
I think that's incredible to me that that kind of
thinking is going on. Is that Do you see a
lot of that in the young people that you talk to,
because obviously if you're doing that, that they are embracing
the AI in a way that older people just don't
or can't.

Speaker 2 (29:29):
Yeah.

Speaker 3 (29:30):
I mean I feel like they're just using it as
a tool like any other tool. I mean, they're coming
up with apps that find early Parkinson's markers or I
mean, even in twenty sixteen, when I did the White
House Science Fair, there was a girl who had figured
out a way to diagnose cancer early using computer modeling.

Speaker 4 (29:51):
I mean, they've been doing this for a while.

Speaker 3 (29:52):
And if you think of it as a tool rather
than the evil robot that's coming for your information, it's
it's less scary. Yeah.

Speaker 1 (30:02):
I wonder about all that, kind of, the robot
wanting your information now. I mean, do you worry
about that? Do you worry about data harvesting in
your own life?

Speaker 2 (30:10):
You do?

Speaker 1 (30:11):
You have firewalls put up? I mean, you're computer savvy,
you know how to do all that.

Speaker 3 (30:15):
I do my best, but honestly, I think it has
it all anyway, because I'm also somebody who shops online.

Speaker 4 (30:21):
Yeah, I love that.

Speaker 2 (30:23):
Yeah, I totally agree.

Speaker 1 (30:25):
I mean, if one, if one piece of the machinery
has it, it all has it? You go in a
store to buy a pack of gum or something, They'll say,
can we have your email address? Like you have it?
You have it, Just let me scan my card and
it will come up. You already have all that information.
It's interesting to me, though, because I think of you.
Last time I saw you in San Francisco when you came.

(30:45):
I was doing a show and you came and you
had turned up in one of those self driving cabs,
which I still haven't done yet.

Speaker 2 (30:55):
Is that freaky? Are you kind of okay with it?

Speaker 4 (30:59):
Okay?

Speaker 3 (31:01):
I granted I have been here in a beta testing
city for Waymo, for Cruise, for Zoox, for years now,
so I was in some of the early prototypes of
Waymo way back in the day, and I was part
of the test program. I signed up immediately to try
these because I'm a super curious person and I just

(31:23):
I wanted to know more. So I interviewed and talked
to all kinds of people that made the cars, and
we're creating them. And you know, they hire hackers and
they hire all sorts of people to try to to
muck it up, to see if they can actually throw
the cars off track. But the last time I got
into a rideshare vehicle, I had a driver that had

(31:45):
a movie going on over here, was talking on
a phone here, had me in the back seat, was
driving a little bit crazy. Right.

Speaker 4 (31:54):
Yeah, I'm not going to say that that's the rule,
but you.

Speaker 2 (31:57):
It happens when you look at me.

Speaker 3 (31:59):
Yeah, when you look at the robot car, it's got
over sixty cameras, lidar, radar. It has a three
sixty view going on that you can actually see, which,
by the way, they only put that up for you
to see.

Speaker 4 (32:13):
The car doesn't need it.

Speaker 3 (32:13):
It just wants to show you, Like I can see
that cat running out into the street. I can see
that bicyclist over there. It's not going to hit anything
if it can help it, Like it's it's paying so
much more attention. It doesn't have to worry about fog
or rain or night, like a lot of people can't
see at night as well. The robot can, like,
it's got the instrumentation, it has to follow the laws.

Speaker 2 (32:35):
It's for example, like say like.

Speaker 1 (32:41):
A person wanders out into the road, and the option
the car has is to either hit the person or
drive off the road and hurt you.

Speaker 2 (32:51):
What does it choose?

Speaker 3 (32:53):
Okay, it doesn't have the morals to make those decisions.
But I don't know that I can answer this with
any expertise, because I think you have to ask them
on this one, right, Okay. I do know that it's
going to try to avoid conflict altogether and stop. I
don't think that there is a well I can relate
to exactly.

Speaker 1 (33:15):
I was like, oh no, it's fine, I'm fine, I'm
fine if it's okay.

Speaker 3 (33:19):
I mean, what would you do if your choice was
to run into this person or that person? Like you're
going through the calculations of how to get out of it.
I'm pretty sure that the robot has it because it's
AI machine learning constantly and it's been training in places
like San Francisco is a place to drive roads all
over There's there's there's things everywhere like yeah, people on

(33:42):
the streets. I mean, you couldn't throw it in a
more chaotic situation, so it has learned, like, how to
maybe go this way and avoid everything.

Speaker 1 (33:52):
I rented a Tesla truck in Phoenix a couple of
weeks ago. I was doing a show there and and
Tomas and I rented a Tesla truck and we
put it in self drive to take us around.

Speaker 2 (34:04):
And it was freaky, and.

Speaker 1 (34:06):
At one point I I intervened because I thought it
was going to hit a car, and it was. There
was some debate between Tomas and I as to whether
it was going to hit the car.

Speaker 2 (34:15):
I thought it was going to hit the car.

Speaker 1 (34:17):
But then I thought I should have given it the
chance to not hit the car, but I just didn't.
But I want, I mean, you're right about you know,
the distractedness of drivers, like you know, human drivers, people
texting people on their phones, people you know, doing stuff.

Speaker 2 (34:36):
It probably is a better driver.

Speaker 1 (34:38):
Right, And but where am I going to get if
I'm in a taxi but it doesn't have an actual driver?
where do, where do I get my racist opinions from?
I used to be able to get my racist opinions from
a cab driver?

Speaker 2 (34:50):
Where will I get them?

Speaker 3 (34:52):
If only there's still the Internet in your hand, you
can TikTok that if.

Speaker 2 (34:57):
You like, I'll be able to get my ras from
other places.

Speaker 4 (35:02):
Yeah, there's plenty. There's plenty, there's.

Speaker 1 (35:03):
Been, there is plenty. But I feel like it is
an interesting thing. How do you use it in your life?

Speaker 2 (35:10):
Now? Would you use it for art? Do you use
it for art? Is that?

Speaker 1 (35:15):
Well? Like you, if you're creating a sculpture, would you
take the would you would you bring in an it?
Would you get AI to help you conceptualize it or
or or maybe even make models for you?

Speaker 4 (35:29):
No?

Speaker 3 (35:29):
I mean for me, all our work is for me
to go through some sort of journey of curiosity. So
I don't need any I don't need any assistant. But
what I do use AI for is when I don't
understand something. And okay, so National STEM Festival right right?
These kids, we find kids from every state and territory.
We try to find a representation from all over the country.

(35:50):
So it's going to be the best of the best kids. Okay,
some of their projects, when I'm when I'm reading the champions,
I don't understand them. They are so complex. I have
literally taken the text from one of their projects and
put it through AI and said, can you explain this
as if I am a twelve year old. Oh wow,

(36:14):
it will explain the science of their project to me
so that I can understand it.

Speaker 4 (36:19):
And they are, you know, seventh through twelfth graders.

Speaker 3 (36:23):
So I'm currently using AI to help me be a
better person, be a smarter person.

Speaker 2 (36:30):
I very much relate to that. I get to this thing.

Speaker 1 (36:40):
Recently, I listened to audio books all the time on
my phone. I find it very like i'm driving or
if I'm can't sleep. It's like it because you can
like and be off and you can listen, and it's
like because I have who I am. I listened to
this book, a biography of Socrates, because I thought I
should know about Socrates more than I do, and it
was a biography about Socrates, and I thought, that

(37:02):
is the best biography.

Speaker 2 (37:04):
I'm so helpful.

Speaker 1 (37:05):
And then I saw it was, you know, a biography
for young people. It was for it was for teens,
and I was like, ah, jeez, But it helped, you know,
I understood it a bit more. They simplified it a little.
And maybe I'm not as smart as I thought I was.

Speaker 3 (37:22):
Well, I mean, on MythBusters when we were making the show,
I think the reason it was so successful is because, like,
I am not a scientist, so I'd have to hear
the complicated story and then be able to relate it
in a simple way. So I would I would hear
it until I could understand, you know, a scientific principle

(37:42):
and so that I could explain it. So we used
to say, if you want to explain something to a
thirty five year old, explain it like they're twelve. And
if you want to explain it to a twelve year old,
explain it like they're thirty five, because we always underestimate
young people and overestimate ourselves.

Speaker 2 (37:58):
It's very poignant and very true.

Speaker 1 (38:00):
And I wonder, given that, the interaction that you're having
currently with young people and you know, the does it
make you because you hear so much derision heaped on
the young right now about their terrible generation and with
their being having pronouns and what this and all that,
and people from my generation and younger just heap so

(38:24):
much hate on the young. Does it make you optimistic
or are you? Are you more optimistic dealing with kids
or are we right? Are they all a bunch of
fundamentalist douchebags?

Speaker 3 (38:39):
I think any group of people that you brush with
these grand broad strokes, It's going to be easy and
reductive to come up with your ideas.

Speaker 2 (38:50):
Madam. That's true. Actually, you're right, it is.

Speaker 1 (38:53):
You know, sweeping generalizations are the basis of all fascism,
I suppose.

Speaker 4 (38:58):
Are Let me tell you this.

Speaker 3 (39:00):
If you need a moment of hope, if you need
a moment to feel like we're gonna be okay, really
listen and and and hear what these kids are doing
and saying. They are solution seekers and innovators, and they
are looking to problems instead of like, oh my god,
this is crushing, this is awful. How am I going

(39:20):
to wake up tomorrow? They're like, huh, what can I
do about this? What can I do about this this
particular problem myself? And it's it's beautiful to see the
hope that they have, and I think hope begets
hope, and all of a sudden, everybody's feeling a little
bit better about their neighbor. I mean, we were just
as a species, we were created to be tribal. You know.

Speaker 4 (39:43):
I I'm at my high school.

Speaker 3 (39:45):
thinks the high school over there is terrible, and they
think we're terrible.

Speaker 4 (39:51):
Exactly.

Speaker 3 (39:52):
And the Internet has created just bigger tribes, and so
it's become a little it's become a little divisive and angry.
And if you fall into that and stop realizing that
we're all of the same tribe, that we all really
want the same things. We want to know where our
next meal is coming from, we want to know that
our family is safe and healthy, and we want a

(40:15):
roof over our heads. Anything beyond that doesn't matter, and
we all want that that we're all just the same.
And you start looking, like Mister Rogers said, for the helpers
and start looking towards these kids that are trying to
solve the biggest problems of the world, it just it
makes you feel good and it makes you feel like
you want to be one of the helpers. And for me,

(40:37):
since I'm not one of these brilliant scientist kids, I
am going to amplify their stories. I am helping get
them on their local news stations so that they can
create hope in these little pockets all over the country.
I want to elevate them on a national stage, which
is why this is the National STEM Festival, so that

(40:59):
politicians and industry leaders and everybody can support them and
hear their stories and maybe take that hope back to.

Speaker 4 (41:05):
Where they are and create more hope.

Speaker 3 (41:08):
I really feel like this small seed, this STEM festival
that we're creating is going to have this incredible ripple
effect of just creating more and more positivity. And who
knows if this kid in New York who's learning how
to gamify some sort of issue over here meets
a kid in Kansas, and the two of them come

(41:29):
together in the future and we've created this little community
and all of a sudden, they're curing cancer. I just
I feel like there's ways that you're going to look
at the youth and it's going to make you feel
good about the world. Turn off the news and start
looking to the kids.

Speaker 1 (41:45):
That's very encouraging. It's very encouraging. And I think I
agree with you. I think that the human mind has,
for the most part, a magnifying quality. What I think
it does is if you if you look at something, anything,
you look at, it magnifies it. It's part of the
process that I think we go through. So if we

(42:06):
look at the problem, we magnify the problem. If we
look at the potential solution, we magnify the potential solution.

Speaker 2 (42:14):
And I think it's an emotive thing.

Speaker 1 (42:16):
Like if I concentrate on what and this is where
I think social media is misleading because the algorithm will
give you what it thinks you want and actually maybe
that's what you wanted a thought process to go. But
now like if you if it's thinking, oh, I want
to look at fights, I want to look at street fights,

(42:38):
I want to look at more street fights. I'm going
to more and more street fights, and the algorithm will
do that, Whereas what you need is you need to
stop looking at that and you need to look at
how people don't do that. And I think what's very
encouraging about what you're doing is the fact that you're
concentrating on the solution, you know, I which is interesting

(43:00):
because if you take an engineering mind, an engineering mind
looks at the problem. It concentrates on the problem and
we'll find the solution from the problem.

Speaker 2 (43:10):
But I think an artist's mind.

Speaker 1 (43:12):
Looks at the solution and does the solution. It doesn't
look at the problem. Does is just going straight to
the solution. Let's have the solution. And I think they're
both essential. I think these different different ways of looking
at the world does duality. I think it's essential. It's
very helpful. I think that people like you are doing that.

(43:32):
It's great that you became famous and science-y, even
although you were just an intern trying to get a job
out of it. It was very, I love...

Speaker 4 (43:41):
About the world. To me, it really did.

Speaker 3 (43:43):
It kind of showed me how science and art are
so similar. And I think I became just a super
STEM advocate just because I saw how positive an impact
that MythBusters could have on kids, and I think kind
of led them into these careers. And I was just like,
Oh my gosh, how do we harness that, how do
we make, you know, MythBusters-y inspiration happen with lots of

(44:05):
kids and then just you know, one thing led to
the next.

Speaker 4 (44:07):
And the way that this.

Speaker 3 (44:11):
Festival came about is, I was interviewing people at South by
Southwest, and I was interviewing the Secretary of Education, and
I'm like, oh, I've got this opportunity.

Speaker 4 (44:19):
I'm going to corner this guy.

Speaker 3 (44:21):
So I was like, listen, I hosted the White House
Science Fair and it hasn't existed for a.

Speaker 4 (44:26):
Decade, and I want it back. How do we make
that come back?

Speaker 3 (44:29):
Tell me you're going to bring it back and it
kind of.

Speaker 4 (44:33):
I got him to offer on camera.

Speaker 3 (44:37):
That you know, if I would build it, he would
let me take the keys. And so this this entire
thing has just sort of sprouted into this is the
second year of it, and want to star science, science, Technology, engineering,
and math. I like to say science fair, but but

(44:58):
STEM Festivals seemed like it was more encompassing for a
lot of kids, so we went with that. But you know,
I did my research. I went and found all of
the people that worked on the first science fair that
I had hosted with Bill and I, and they said
that their one regret was that they tied it to
an administration so that it became partisan and that makes

(45:18):
it go away when another administration comes in. So we decided,
my business partner Jenny Bucos and I that we were
going to take that aspect away, take the politics out
of it. And so we work in conjunction with all
the government agencies, and they are there and they are
part of it, but nobody owns it. It's completely on
its own. We get all of the funding from the

(45:41):
STEM community, so we actually get businesses. Everyone from Autodesk
to Nokia to GM two, Indy Car are all helping
US sponsor the festival because they're, you know, creating the
workforce of the future with these kids. So we get
the whole STEM community to support these kids. We separate

(46:01):
it from government to make it bipartisan, and it's really
just all about supporting brilliance, supporting excellence, and supporting these kids.
So it's it's not just focusing on the kids, but
the entire STEM community is showing up.

Speaker 2 (46:17):
I think it's great. I mean that it's for you.

Speaker 1 (46:21):
I mean because I know you, and I know you
are not putting on You are relentlessly positive, which is
which is very interesting to me. Do you never do
you ever get discouraged? Do you ever feel like, oh,
fuck this, I can't I kind of listen it. I mean,
because look, you give me five minutes with a politician,
I can I lose the will eleft any of them

(46:42):
any no matter what stripe they have, they're just foul people.

Speaker 2 (46:48):
They're awful, all.

Speaker 4 (46:51):
The headline grabbers. So I gotta say so.

Speaker 3 (46:53):
One of the things that was always great about MythBusters
is the audience was very broad, and it was eight
to eighty. It was men and women, it was blue,
it was red. We did so many things that just
people communally enjoyed. So you know, it didn't matter where
I was or who I was talking to, Like I
could walk up to the bluest of the blue or
the reddest of the red and build some bridges because

(47:15):
I do like creating that community of we're all more
alike than different.

Speaker 4 (47:19):
And so when.

Speaker 3 (47:22):
We as the company that I am founding with Jenny Bucos,
which is EXPLR, which is an educational platform, we conceived
of this great idea we wanted to make STEM Week.

Speaker 2 (47:34):
So we probably with math Yes, yes, I love it.

Speaker 3 (47:40):
We want National STEM Week to happen that is just
celebrating STEM and all things that are STEM. So we
partnered up with COSI, which is like the number one
science museum out in Ohio, and we authored this bill
that went onto the floor, that was introduced by Republicans
to the floor and is now we're getting both Republican

(48:03):
and Democratic support to create the STEM Week Act to
make enough.

Speaker 2 (48:07):
Congratulations. I think you're the only one that's crossing the
floor right now. That's amazing.

Speaker 3 (48:12):
We're trying, we're trying, We're doing our best, but it's
it's going places because you know, it's not necessarily the
headline grabbers, it's, it's the people that are doing the
work that are really really these they were, they were lovely,
And I know I was walking into offices all over
the Capitol Building of different parties, and one thing you
can get behind is smart kids. And I thinking it's

(48:36):
a unifying event here.

Speaker 2 (48:37):
Yeah, yeah, it's.

Speaker 1 (48:38):
It's also it's it doesn't seem to me to contain
a very an obvious polemic right away. I mean, who
can be against kids saving Earth? I think that's okay,
But I think that's also the plot of Space Jam.
Oh no, it's cartoon characters saving Earth. But it's very similar. Well, listen,

(49:00):
We should have had more time. Well, we do have more time
because we're friends, and I'll talk to you again, but
we don't have any more time right now for this.
But I'll be out on the West Coast soon. You're
gonna be there for a while.

Speaker 3 (49:10):
Yeah, here, So I will come to your show and
if you could bring your wife, because I will.

Speaker 2 (49:16):
I will bring my wife.

Speaker 1 (49:18):
To be honest, when Megan goes somewhere, it's because she
wants to go there. I don't bring her anywhere. She

Speaker 2 (49:25):
goes where she wants to go.

Speaker 4 (49:26):
I realize this.

Speaker 2 (49:27):
Yeah, it's I think we know.

Speaker 1 (49:29):
The power dynamic in my house is fairly obvious, and I'm
fine with it.

Speaker 2 (49:34):
I'm very happy with it.

Speaker 1 (49:36):
All right, Well, tons of love to you, Thank you
so much for being on, and i'll speak to you
very soon.

Speaker 2 (49:42):
I think i'm out there in a couple of weeks.

Speaker 3 (49:44):
I'll give you a show so fantastic. The only reason
I'm doing this podcast so I can just you.

Speaker 1 (49:49):
Know, doing chat and all right, you swear a lot
less in podcast land, though.

Speaker 4 (49:55):
I'm doing that on purpose.

Speaker 2 (49:56):
Good for you. I'm very impressed.

Speaker 4 (49:58):
I'm a pirate offic I know, I know.

Speaker 2 (50:02):
It's okay. That's okay. You got some of it's going
to be for you. You've got to have your own thing,
all right, get the fuck out of here. All right,
we're done. What do you