
December 30, 2024 • 37 mins

Why are our brains so wired for love? Could you fall head over heels for a bot? Might your romantic partner be more satisfied with a 5% better version of you? How does an AI bot plug right into your deep neural circuitry, and what are the pros and cons? And what will it mean when humans you love don’t have to die, but can live on in your phone forever? Join Eagleman for a deep dive into relationships, their AI future, and what it all means for our species.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hi, this is David Eagleman. I want to wish you
a very happy holidays. We're going to take a break
for a couple of weeks, and then we're back in
January with new episodes on emotion, intelligence, time perception, smell
and taste, brain-computer interfaces, and much more. In the meantime,
we're going to replay one of our favorite episodes from
the past year, and I'll look forward to seeing you

(00:21):
in January. What is the future of AI relationships? Could
it be the case that your relationship partner would be
more satisfied with a virtual version of you that behaves
five percent better than you do? Could you fall in

(00:45):
love with a bot? How does an AI bot plug
right into our deep neural circuitry, and what are
the pros and the cons of that? And what will
it mean when humans you love don't have to die
but can live on in your phone forever? Welcome to

(01:06):
Inner Cosmos with me David Eagleman. I'm a neuroscientist and
an author at Stanford, and in these episodes I examine
the intersection of our brains and our lives. Today's episode

(01:29):
is about relationships. Why are our brains so wired for relationships?
Why do we want love so much? And will AI
be able to serve as a key to that lock,
and what does that mean for us as humans? So
one of the things that's becoming increasingly popular among young

(01:50):
men is having an AI girlfriend. You get to choose
or set up a beautiful avatar.

Speaker 2 (01:58):
And what do I mean by beautiful? That's up to you.
You can choose.

Speaker 1 (02:01):
Any model that you want, with any sort of features
that appeal maximally to you. But that's just what she
looks like. The important part is the conversation. You start
talking with her, and typically this is just text chat,
but the technology is evolving into the upgrade of video chat,
where you see the avatar's mouth moving while she speaks

(02:24):
to you. Now, typically the free or entry price gets
you an avatar friend who lives on your phone and
checks in on you and says nice things to you
and is available anytime that you want to chat.

Speaker 2 (02:38):
And for a premium.

Speaker 1 (02:39):
Subscription price you can upgrade to a steamier relationship, and
here she'll text suggestive photos and she'll say things that
you might only expect from pillow whispers. So the concern
that people have expressed is whether this is going to
impact the next generation of males. Now, as a side note,

(03:01):
let me say that I suspect this will have whatever
influence it has on both genders, on males and females,
also straight and gay.

Speaker 2 (03:10):
But I do suspect that.

Speaker 1 (03:11):
Males will be the majority demographic simply because males tend
to be more visually driven than females. So for the
conversation here, I'm going to talk about it the way
that it's mostly discussed in the media and in academic circles,
which is straight males getting girlfriends this way. But keep
in mind this is a more generalized issue. Now, the

(03:35):
question is what will this mean for all future generations?
Because within an AI relationship, you don't have to go out
and confront all the difficulty of a real flesh and
blood relationship. Real relationships get snippy, people get angry.

Speaker 2 (03:54):
In real relationships.

Speaker 1 (03:56):
Your partner might develop a crush on someone else and
leave you or hook up with someone else and you
find out later, Or your partner might develop an illness,
or she might get a job somewhere else and have
to move and then you're stuck in a lonely long
distance relationship for years or whatever. Relationships are full of challenges,

(04:18):
the majority of which can get circumvented with a nice
algorithm that is just content to listen to you all
the time and remember everything you say and give you
one hundred percent attention and always be nice. So my
wife sometimes jokes with me about wanting to build the
five percent better David. She has, mostly as a joke,

(04:41):
talked about this issue of what if she could have
an AI avatar of me that is never distracted with work,
or never looks at my cell phone when it dings
in the middle of a conversation we're having, or never
wakes up from a weird dream and has a funny morning,
or never argues over some misunderstanding that's later understood to

(05:03):
be stupid and meaningless. And she tells me that she
wants five percent better David to always tell her she's right,
even in those rare cases when she's wrong. The key
is that five percent better David never gets busy or
occasionally snarky or forgets some occasion. Instead, he represents all

(05:24):
the best of me. And I'll just note that it's
very kind of her to label this five percent better David,
because she could say like ninety percent better and she'd
be justified.

Speaker 3 (05:35):
Why?

Speaker 1 (05:36):
It's because we are all very imperfect in relationships. As
I've talked about on other episodes, we are each living
on our own planet in the sense that we carry
our own internal models of the world, and as much
as we work to understand one another's viewpoints and motivations
and intentions, we're.

Speaker 2 (05:57):
Not always that good at it.

Speaker 1 (05:58):
Because we assume that other people are seeing the world
in the same way that we do, they have the
same methods for sense making, they gather meaning in the
same way that we do, and we assume they generally
hold or should hold, the same opinions about everything that
we do. And this is because the brain is locked

(06:20):
in silence and darkness and has no meaningful direct access
to the outside world, and so it gathers up all
its information through its narrow windows of the senses, and
it builds its internal model from this very thin trajectory
of space and time that it walks along. And this
is why everyone is so different on the inside, and

(06:43):
therefore why relationships are always full of misunderstanding and often conflict.
So relationships are inherently tough. And the question is when
would it be a good thing if you could have
an artificial partner who represents all the best of what

(07:04):
a person can be. So a lot of people will
immediately say no to this idea, but it's worth noting
that we're all striving to be the five percent better
versions of ourselves. We don't want to be snarky or
angry or distracted when a loved one is talking to us.
It's not like we get some extra pleasure out of
doing that, and it's not like the relationship gets some

(07:26):
extra boost or closeness from that having happened. So presumably
this is all part of why AI relationships have become
a thing, a possibility.

Speaker 2 (07:37):
That we talk about nowadays.

Speaker 1 (07:39):
In Japan, many young men apparently already prefer to have
relationships with their digital assistants or avatars or holographic girlfriends
instead of dealing with the complexity of real life relationships,
and according to research, gen z is more readily redisposed

(08:00):
to seek out relationships with AI generated avatars first because
they're comfortable using the technology in this way compared to
previous generations, and also they're participating less often in traditional
social activities like regular family dinners or attending religious services
or playing sports. And the question is, if AI relationships

(08:26):
were to catch on broadly, what will this mean for society?
Will kids actually stop going on dates because they can
find better relationships online?

Speaker 2 (08:38):
And this is a real question because there.

Speaker 1 (08:41):
Are many startups currently blossoming to create chatbot driven connections.

Speaker 2 (08:47):
I'll give you one example.

Speaker 1 (08:49):
There's a twenty three year old influencer with almost two
million Snapchat followers. Her name is Caryn Marjorie, and earlier
this year in May, she released CarynAI, which is
an immersive AI experience featuring videos of Marjorie that she
says provide a quote virtual girlfriend for those who are

(09:10):
willing to pony up one dollar per minute.

Speaker 2 (09:12):
Now.

Speaker 1 (09:12):
This is what's known as a companion chatbot, and she
tweeted that quote CarynAI is the first step in
the right direction to cure loneliness.

Speaker 2 (09:24):
The tweet continues. Quote. Men are told to suppress their.

Speaker 1 (09:27):
Emotions and not talk about issues they're having.

Speaker 2 (09:30):
I vow to fix this with CarynAI.

Speaker 1 (09:35):
She says she's worked with leading psychologists to seamlessly include
the right therapies to quote undo trauma, rebuild physical and
emotional confidence, and rebuild what has been taken away by
the pandemic end quote. And by the way, as a
side note, I think AI psychologists are going to be
a truly important part of the clinical landscape by next

(09:58):
year because you can have a therapist that you can
talk to twenty four seven, and the therapist never gets.

Speaker 2 (10:04):
Distracted or flustered and.

Speaker 1 (10:05):
Only cares about you and has a perfect memory for
everything you've ever said, which is better than anybody else
in real life. So back to AI girlfriends or boyfriends.
The same idea applies here, which is that they are
completely devoted to you and always in a good mood

(10:26):
and only have you in mind. So what are AI
relationships going to mean? Well, I think this is going
to be a research question that sociologists and psychologists will
study for the coming decades and centuries. The initial studies
are suggesting that people, mostly Gen Zers, are moving closer

(10:47):
to the technology to avoid the unpleasant realities of human relationships.
All the tough stuff. Is that detrimental? Well it could
be if it makes your human relationships harder, because
maybe every time you guys have an argument in real life,
your partner thinks, well, forget it, I'm bagging this. I'm

(11:08):
going back to my comfort zone. So the concern, as
you can probably guess, is that the rise of AI
driven relationships could exacerbate loneliness because they seem to be
a meal, but they provide no calories.

Speaker 2 (11:23):
And I'll come back to that in a moment.

Speaker 1 (11:25):
In other words, AI generated avatars could interfere with the
relationships that young people are just learning to foster, because
the AI relationship might breed dissatisfaction with flawed humans. And
this applies not only to lovers, but even to friends.
It might be easier to have AI friends who aren't

(11:48):
busy when you need them and can give you one
hundred percent of their attention whenever you need it. And
let me throw in a different potential problem with AI relationships,
So give me one second to take this tangent here.
I was thinking the other day about the Fermi paradox.

Speaker 2 (12:04):
The Fermi paradox is.

Speaker 1 (12:06):
Given the size of the observable cosmos, with over one
hundred billion galaxies, and each of them with one hundred
billion stars, and each of those surrounded by some number
of planets, what is the reason that we have not
heard from any other alien species yet? And one of
the proposals that's always been there is that maybe as

(12:30):
civilizations become more technically advanced, they end up killing themselves,
and this is why we haven't heard from other smart civilizations,
because they.

Speaker 2 (12:40):
Are already gone.

Speaker 1 (12:42):
And every time I've seen this proposal, it's always in
the form of warfare, things like nuclear bombs.

Speaker 2 (12:48):
They end up wiping themselves out.

Speaker 1 (12:51):
So civilizations become smart and it's not long before they disappear.
So in thinking about AI relationships, it struck me as
a possibility that if we had really, really great relationships
with avatars, perhaps that would cause the birth rate of
the species to collapse. I don't know if this has

(13:12):
been proposed as a possible answer to the Fermi paradox,
but maybe this should be included, not civilization's disappearing because
of bad things, but instead from having too much of
a good thing, which could fool and eventually overwrite or
mandate for reproduction. Okay, so no one knows what the

(13:50):
long term effects will be of these AI relationships, but
I don't actually think the situation is as dire as
some of these arguments suggest that it is. And I'll
make two arguments to this end. The first revolves around
human touch. We are deeply wired to care about touch.

(14:11):
I'm going to do a whole episode on touch in
the near future, but the bottom line is that touch
helps us to connect with others, to feel safe and secure,
to regulate our emotions. When you get touched, your brain
releases oxytocin, which is a hormone that has calming effects
and bonding effects, and oxytocin helps to reduce stress and anxiety.

(14:32):
It can even boost your immune system. So we need
touch to feel connected and loved, and a lack of
touch leads to loneliness and depression and anxiety. So we're
deeply programmed for touch and also things like smell, and
so it would presumably be quite lonely if all you

(14:53):
had was the five percent better partner on a screen
and you're just exchanging text messages or just an avatar
you can look at on your phone, or maybe even
in the near future you'll have a three D avatar
projection in your living room, but you won't have the
hand squeeze and the hug and other forms of physical intimacy. Now,

(15:16):
I assume people are working on AI robots that can
provide touch, even something simple like touching your shoulder or
laying a hand on your hand, and I can't imagine
that it's going to be too hard to do, and
it'll probably be not that great at first, but after
a few tech cycles you can imagine it could get
pretty good. But in any case, at the moment, if

(15:39):
you have a girlfriend who just lives in several square
inches in your phone screen, you're going to be missing
out on this fundamentally needed aspect of human communication that
our brains seek. So the depth to which our brains
are wired for touch suggests to me that the reach
of AI partners into our lives is going to be limited, because,

(16:02):
at least as it's currently devised, their algorithmic reach never actually
contacts our skin, and so that will continue to
be sought.

Speaker 2 (16:14):
Now.

Speaker 1 (16:14):
The second point to raise about whether AI partners can
displace real human partners is that there's a sense in
which fake partners have always been around. Just look at
a book, look at a movie, look at any TV show.
You have beautiful Hollywood actors and actresses, and they have

(16:35):
flawless skin and perfectly coiffed hair and no hair where
they shouldn't, and they have glittering white teeth. They are
the epitome of health, and they always say the right thing,
and you get to be the protagonist and enjoy experiencing
that relationship. You find the partner and lose the partner,

(16:55):
and then in act five you regain the relationship with
an epic kiss. This kind of fake relationship in books
and movies isn't exactly the same as an AI relationship,
but it has some similarities.

Speaker 2 (17:08):
They both represent a.

Speaker 1 (17:10):
Platonic ideal, a perfect relationship with someone who always says
the right thing. We never see a love interest in
the movie who is distracted or angry, or interested in
someone else, or just really busy with work, too busy
to spend time with you when you need them. You

(17:30):
never see a love interest in the movies who wastes
a lot of time taking selfies and trying to build
a meaningless reputation on TikTok. People have no meaningful foibles
in a good love story in a book or on television.

Speaker 2 (17:45):
Now.

Speaker 1 (17:46):
I've often wondered if we, in a sense get cursed
by the fairy tales we're surrounded with when we're looking
for actual love.

Speaker 2 (17:54):
But I don't know.

Speaker 1 (17:55):
Perhaps those fairy tales help us get past all the difficult
stuff in a relationship. They get us to ignore the
imperfect things because we believe so strongly in the possibility
of a perfect relationship.

Speaker 2 (18:09):
So think about it this way.

Speaker 1 (18:10):
Say you were a space alien who had never watched
or read a love story, and you had no concept
of that, and the question is, when you met someone,
would you think, Wow, They seem to have very different
opinions than I do.

Speaker 2 (18:24):
They think like this, and I think like that.

Speaker 1 (18:26):
And they also spend some fraction of their time getting
snippy at me or staring at their cell phone or whatever.
So there's no way this can work.

Speaker 2 (18:34):
I don't know.

Speaker 1 (18:35):
I'm just speculating here, but I do wonder if seeing
lots of models of love stories gives us the tools
to view things in a more optimistic light, and that
actually gives us a chance.

Speaker 2 (18:48):
To make the relationship work.

Speaker 1 (18:50):
In other words, it provides some aspirational glue where otherwise
things would just fall apart. Now, the counter argument, of course,
is that all these fantasies set you up with false
expectations about love and relationships, which makes it harder to
keep the relationship together once you see some degree of

(19:11):
realism and disenchantment sets in. In any case, even if
we do get cursed by these fairy tales in some way,
it's still the case that there's nothing new about fantasy relationships. Now,
maybe you argue this is different because instead of the
Julia Roberts movie that everyone watches, it's now something that

(19:33):
is bespoke just for you. It's a one on one relationship,
and maybe that's an important difference. But just keep in
mind the way that we humans enjoy literature is by
living inside the story. You are essentially having a one
on one relationship with Julia Roberts. So perhaps it's not
the privacy of the relationship, but instead, the meaningful difference

(19:58):
with an AI relationship.

Speaker 2 (20:00):
Is the bi directional nature of it.

Speaker 1 (20:03):
Instead of watching a movie where you're simply hearing other
characters say lines and Julia Roberts responds, you are now
the one coming up with the lines. You are deciding
what to say. So maybe this makes a difference. I
suspect it enhances the degree of the fantasy.

Speaker 2 (20:22):
So we have yet to see whether.

Speaker 1 (20:25):
AI will meaningfully replace people's pursuits of other humans because
it is touchless and smell-less, and it's not clear what
the impact is of holding fantasy relationships, because we already
do that with book characters and movie stars. So this
is going to require many years of real world data

(20:45):
to get a real bead on the impact here. Okay, Now,
whatever you think about AI companions, I have noticed in
conversations with my friends, especially those who are married, a
question that floats quickly to the surface. Is it cheating
to have a relationship on your phone with a non
real person? And there are different levels, of course of

(21:06):
what an AI relationship could be. What if it's just
an app like Replika that checks in with you like
a friend who cares about you, and you can just
chat innocently with it. This is the free version of
the app, Okay, so that's one level. But what if
you go in for the paid version, where the conversation
with the avatar becomes more spicy? And what if the

(21:27):
cartoon like avatar is highly attractive and dressed provocatively and
is extremely suggestive in what she says. So I've informally
surveyed several married friends about this, and it seems clear
that opinions are all over the spectrum. Some wives and
husbands feel fine about having their partner have an AI relationship,

(21:50):
on the side, and others said no way. Now, for
those who said no way, this is presumably because the
issue plugs into very deep circuitry in their brain.
It's interpreted as a threat to the relationship, and we
are hardwired to fight against that. From an evolutionary perspective,
what you want is for your mate to stick around

(22:13):
and provide resources and child rearing, and anything that represents
a threat to that is to be fought against. Now,
the part that seems interesting here is that an AI
avatar would not represent a direct threat in this evolutionary sense.
You can't go and impregnate or be impregnated by AI.

(22:35):
But nonetheless your attention might be stolen away to some degree,
possibly to a large degree, and beyond an evolutionary threat.
A big part of what people get out of a
relationship is the love and the attention that we all crave.
So many people feel that they just don't want the
AI bot to steal away even a fraction of that.

(22:58):
A partner only has so much love and attention
to give in a day, and you don't want half
of it getting siphoned off to someone or something else.
This shares some similarities to the situation of a person
having an ex that they still talk with, and if
a person talks very intimately with their ex, a spouse

(23:20):
might feel like she or he doesn't really love that. Now,
when it comes to the ex, if you were making
the evolutionary argument, you could argue that the fear is
that on a lonely night, in the middle of a conflict,
your partner might make a bad choice and slip back
into a physical relationship, and so that relationship with the

(23:40):
ex feels like a threat. But obviously the carnal cheating
can't happen with the AI bot, and yet the fear
is still there.

Speaker 2 (23:49):
So that indicates one of two things.

Speaker 1 (23:51):
Either our evolutionarily programmed deep fears simply can't make that distinction,
or it doesn't have to do with a future threat
of physical infidelity, but instead it's just this issue about
somebody else having that emotional intimacy with your partner, which
steals away attentional resources from you. Now this gets more

(24:15):
interesting when we start thinking about having physical robots that
can play a role in your life.

Speaker 2 (24:20):
Now, this is probably not going to happen in the.

Speaker 1 (24:22):
Next few years, but fast forward a century and certainly
everyone's going to face this scenario.

Speaker 2 (24:29):
Your partner can buy not.

Speaker 1 (24:31):
Just a mechanical device or blow up doll, but can
now have a convincing and attentive physical partner. So the
question is what if your spouse can get not only
the emotional intimacy but also the physical intimacy. So the
people I surveyed about this who found the AI bot
online a threat seem to find this idea of an

(24:53):
AI physical robot even a larger threat. Now, the interesting
thing is that I can point out that there's
a sense in which none of this is different from
what their spouse might do anyway, in terms of finding
adult content on the Internet and cheating in that way.
But I think people have a reaction to internet surfing

(25:14):
for the same reasons as they have the reaction to
the AI bot, which is simply that there is less
time and intimacy and attention toward them. This certainly won't

(25:43):
apply to everyone, but the very general impression I've had
from talking with people at different stages of marriage is
that at the beginning of a relationship, people have a
stronger reaction against AI relationships.

Speaker 2 (25:57):
They don't want their partner to be distracted.

Speaker 1 (26:00):
But people who have been in a relationship for a
long time and have kids will sometimes see this as
a way to get their spouse out of their hair,
and they can be happy for the spouse because it
addresses their spouse's needs and slakes their attention. In other words,
they love their spouse as a partner, and they see
this as a way for their partner to fill in

(26:23):
needs for intimacy and attention in a way that's innocent
and has no meaningful health risks like STDs. So what's
become clear to me is that there's no single answer
for how a spouse feels or should feel about AI relationships.
Some people are against it, some people think it's a
great idea, and many people are still somewhere in between

(26:46):
or still making up their minds. Now, I want to
switch gears from what this means to the partner back
to what it means to the brain of the person who.

Speaker 2 (26:57):
Is receiving the intimacy. So let's recall the movie Her.

Speaker 1 (27:02):
It's about this guy named Theodore who's played by Joaquin Phoenix,
and his marriage ends and he's left heartbroken and he
becomes intrigued by a new app. It's actually an operating
system in which he can launch this program, and he
meets Samantha, who's just a voice played by Scarlett Johansson,

(27:23):
and Samantha is sensitive and playful, and this ends up
becoming a good friendship, but soon it deepens into love
and he has this relationship with an AI bot, and
this relationship means everything to him. Now the film has
an incredible ending because in the final act he comes

(27:44):
to understand that she has been having this relationship with
hundreds of thousands of other men, all at the same time,
because she is computational and operates at a totally different
timescale and can process what appears to be intimate conversation
at a rate millions of times faster than our poorer brains,

(28:08):
and so she's maintaining this intimacy with hundreds of thousands
of others.

Speaker 2 (28:13):
And the movie opened up the question how should Theodore
feel about that?

Speaker 1 (28:20):
Is the intimacy real if it's shared with a city
full of other men? Is the relationship real if she
lives on a timescale many millions of times.

Speaker 2 (28:30):
Faster than yours? Does it matter?

Speaker 1 (28:32):
Should he still feel the titillation of her saying something
sweet and kind to him just when he needs it.
These were the questions launched by that particular movie. So
I'm going to suggest a direction here that I don't
believe anyone is thinking about, certainly not in Silicon Valley,
where everything is about leveraging the power of AI to

(28:52):
scale a product to millions or billions of people. What
I'm thinking about instead is from the point of view
of neuroscience, and the goal is not scaling, but instead
focusing on the life of an individual and the specific
details of what has shaped his or her brain. So
I'm going to tell you my idea, but first I'm

(29:14):
going to start far away and we'll come back around
to this. So I spun off a company from my
lab called Neosensory some years ago, and one of our
inventions is a wristband to replace hearing aids. Because the
wristband listens in real time for high frequency parts
of speech, and it vibrates to tell you, oh, there

(29:34):
was an S, oh, I just heard a T, oh,
that was a K. And so it clarifies what's happening
at the high frequencies, and that helps people with age-related
hearing loss to understand what word was just said.
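
To make that concrete, here is a minimal sketch of the general idea only. Neosensory's actual product uses a large trained neural network, as described below, so treat this as a toy high-frequency energy detector with purely illustrative parameter values.

```python
# Toy illustration only -- NOT Neosensory's actual algorithm, which is a
# neural network trained on thousands of hours of audio. The sketch just
# monitors the high-frequency band of incoming audio frames and fires a
# vibration cue when that energy spikes, as it would for sounds like S, T, or K.
import numpy as np

SAMPLE_RATE = 16_000          # samples per second (assumed)
FRAME_SIZE = 512              # ~32 ms of audio per frame
HIGH_FREQ_CUTOFF_HZ = 4_000   # the "high frequency" band being monitored
ENERGY_THRESHOLD = 0.01       # tuning parameter, purely illustrative

def high_band_fraction(frame: np.ndarray) -> float:
    """Fraction of the frame's spectral energy above the cutoff frequency."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    return spectrum[freqs >= HIGH_FREQ_CUTOFF_HZ].sum() / (spectrum.sum() + 1e-12)

def process_stream(frames, vibrate) -> None:
    """Vibrate whenever a frame's high-frequency energy crosses the threshold."""
    for frame in frames:
        if high_band_fraction(frame) > ENERGY_THRESHOLD:
            vibrate()  # on a real device this would drive the wristband motors

if __name__ == "__main__":
    # A broadband "s"-like burst triggers the cue; a low 200 Hz tone does not.
    rng = np.random.default_rng(0)
    s_like = rng.normal(size=FRAME_SIZE)
    low_tone = np.sin(2 * np.pi * 200 * np.arange(FRAME_SIZE) / SAMPLE_RATE)
    process_stream([s_like, low_tone], vibrate=lambda: print("buzz"))
```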

Speaker 2 (29:47):
Now, to make the wristband

Speaker 1 (29:49):
Good at detecting these high frequency parts of speech. We
needed to train a massive neural network with six thousand
hours of audiobooks. But it turns out that people with
high frequency hearing loss have a difficult time understanding, for example,
children because their voices are higher frequency, and there are

(30:13):
no audio books read by children. So we had no
way to train the neural network with any kind of
massive data from children's voices. So here's what we did,
led by one of our engineers, Yong Yee, We had my.

Speaker 2 (30:28):
Eight year old daughter read forty.

Speaker 1 (30:30):
Five seconds of text into a microphone and then with
some keystrokes, Yong Yee turned that into her voice reading
six thousand hours worth of books, and then we trained
up the neural network on that corpus. And we did
the same thing for my eleven year old boy. So
now I can listen to any book read by my children,

(30:53):
as though they took the tens of hours to sit
down and read the book in the studio to me.
So this technology which exists now which allows you to
capture the cadence and prosity of a voice, this gave
us a really straightforward.

Speaker 2 (31:10):
Solution to a problem.
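
For readers who want to see the shape of that workflow, here is a hedged sketch. The episode does not say which text-to-speech tooling was used, so the cloning-and-synthesis step is passed in as a function, and every name below is illustrative rather than the actual pipeline.

```python
# Hedged sketch of the workflow described above: clone a voice from a short
# reference recording, then render a large text corpus in that voice so it can
# be used as training audio. Nothing here is the actual Neosensory pipeline;
# the synthesis function stands in for whatever voice-cloning TTS is available.
from pathlib import Path
from typing import Callable

# A "cloned voice" is modeled as any function that turns (text, reference
# sample) into WAV bytes. The ~45-second sample conditions the voice.
SynthesizeFn = Callable[[str, Path], bytes]

def build_synthetic_corpus(
    reference_sample: Path,   # ~45 s of the target speaker reading aloud
    books: list[Path],        # plain-text books to be "read" in that voice
    out_dir: Path,
    synthesize: SynthesizeFn,
) -> None:
    """Render each book as synthetic speech in the cloned voice."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for book in books:
        text = book.read_text(encoding="utf-8")
        audio = synthesize(text, reference_sample)
        (out_dir / f"{book.stem}.wav").write_bytes(audio)
        # The resulting hours of audio become training data for a speech model
        # that otherwise has no recordings of, e.g., children's voices.
```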

Speaker 1 (31:12):
But this technology has also led to many legal and
ethical questions, for example, about celebrity voices, like can you
use John Lennon's voice to sing you a personalized song
to get you to sleep? There are all kinds of
legal battles blossoming as people try to figure out the
rules around this, But I'm not going to talk about
that today, because my interest is in finding this single voice,

(31:36):
or maybe a small handful of voices that have

Speaker 2 (31:38):
Meaning to your brain uniquely. Here's what I mean. When
we did this project with my kids'.

Speaker 1 (31:46):
Voices, that got me thinking because my father passed away
three and a half years ago, and he was a
major influence in my life and I miss him. So
I went through my old videos and found some short
clips of him speaking, and I wondered if there was
enough there that I could actually make an AI bot

(32:06):
out of his voice, so I could hear him speak
whenever I wanted to. And it made me wonder about
the degree to which that's a healthy thing. But I
decided there was nothing bad about it. What a pleasure
to be able to hear my dad's voice for the
rest of my life and to have that trigger my
fond memories of him.

Speaker 2 (32:27):
Wouldn't it be cool?

Speaker 1 (32:28):
To have him read audio books to me in the
way that he read to me when I was a child,
And I thought about what it would be like for
my mother if I programmed a sentence to her in
his voice, like I love you and I'm thinking about you.

Speaker 3 (32:44):
I love you and I'm thinking about you.

Speaker 1 (32:47):
And when I turn ninety years old, wouldn't it be
amazing to hear him say Happy birthday, David, just like
he did when I was a kid.

Speaker 3 (32:55):
Happy ninetieth birthday, David. I hope this orbit is the
best one.

Speaker 1 (32:59):
Yet. Or at New Year's Eves into the future, for
the rest of my life, he can wish me the best,
even though he will have.

Speaker 2 (33:06):
Been gone from the planet for a long time.

Speaker 3 (33:09):
Happy New Year. I can't believe it's already twenty fifty three.

Speaker 1 (33:13):
And what I realized as I was reaching my arms
down into this is how powerful this technology is going
to be, because it will be so compelling. I'm not
talking here about the issue of using somebody's voice to
fake an ATM transaction, or fake a kidnapping, or any
of the AI concerns that.

Speaker 2 (33:30):
People have expressed.

Speaker 1 (33:32):
Instead, what I'm talking about is the unbelievably compelling way
that an AI voice could mean something to you emotionally.
After all, I grew up my entire life from the
moment I was born hearing my father's voice. It's so
embedded in my neural circuitry that a voice with exactly

(33:54):
that cadence and prosody would have enormous emotional sway on me.
And again, I'm not talking about all the bad things
that could be done with that. Instead, because this episode
is about relationships, I'm talking about what it would be
like and how I could leverage the intimate nature of
that relationship. For example, let's say that I wanted to

(34:17):
get myself to stop doing something. I don't drink, but
let's say I did, and that I wanted to stop drinking.
So imagine in the near future, I build an app
that tracks my GPS location, and when it sees I'm
about to walk into a bar, it launches my father's
voice in my ear, telling me, Hey, David, don't do this, Hey.

Speaker 3 (34:39):
David, don't do this. I believe in you. I believe
that you have the strength to resist this.
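
To make that imagined app concrete, here is a speculative sketch of the trigger logic, assuming some platform-specific way to read a GPS fix and play audio (both passed in as functions). The coordinates, radius, and clip name are placeholders, and how the voice clip gets synthesized is out of scope.

```python
# Speculative sketch of the app imagined above: poll the phone's location and,
# when it enters a geofence around a place you are trying to avoid, play a
# pre-generated clip in a loved one's cloned voice. All values are placeholders.
import math
import time

BAR_LAT, BAR_LON = 37.4419, -122.1430        # hypothetical location to avoid
GEOFENCE_RADIUS_M = 50                        # trigger distance in meters
ENCOURAGEMENT_CLIP = "dad_dont_do_this.wav"   # pre-synthesized voice clip

def distance_m(lat1, lon1, lat2, lon2) -> float:
    """Approximate great-circle distance in meters (haversine formula)."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def monitor(get_location, play_audio, poll_seconds=10) -> None:
    """Poll location; play the clip once each time the geofence is entered."""
    inside = False
    while True:
        lat, lon = get_location()
        near = distance_m(lat, lon, BAR_LAT, BAR_LON) <= GEOFENCE_RADIUS_M
        if near and not inside:
            play_audio(ENCOURAGEMENT_CLIP)    # "Hey David, don't do this..."
        inside = near
        time.sleep(poll_seconds)
```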

Speaker 1 (34:45):
I think that would be extraordinarily compelling. This would be
a technique to plug into a relationship that already exists
deep in my neural networks, and it could leverage that
for good. So this is a way that we
can right now take a loved voice and extend your parent,

(35:08):
let's say, past what Homo sapiens can normally do. They
can live on beyond their passing away to keep playing
a role in your life. Now, there's a sense in
which you might say, well, there's nothing new here. If
your parents wrote you a letter, you might find that
years after they've passed, And the invention of writing is

(35:31):
a way of lasting well past your death and reaching
out to people at great distances and across great time chasms.
But what is new is that I can get my
father to talk about things that simply didn't exist when
he was alive. Maybe twenty years from now, I'll look
up the Wikipedia page about room temperature superconductivity, and I'll

(35:54):
get to listen to it in his voice, like he's
teaching me something the way he used to do when
I was a little kid. So the part that is
new is not the reach of a human but instead
the emotional component to all of this, that is the
overlay of a loved one's voice onto any possible scenario
in the future.

Speaker 2 (36:15):
And the reason this all matters is because that voice
has pathways deep into the forest of your neurons. So
let's wrap up.

Speaker 1 (36:25):
Many companies are launching AI relationship bots, and many researchers
are exploring what this all means. But I don't really
think we know the answers yet. It's likely to take
a whole generation before we know what the effect is.
Are people discovering a beautiful technique to address.

Speaker 2 (36:43):
The loneliness crisis here and they have.

Speaker 1 (36:46):
Someone to turn to in the middle of the night
who says something caring to them and always has their
best interest in mind. Or are we entering an era
that exacerbates the loneliness crisis and at worst fills our
belly with empty calories and counteracts our reproductive mandates, like

(37:07):
a perfect drug that spells the.

Speaker 2 (37:09):
End of the species? Only time will tell.

Speaker 1 (37:17):
Go to Eagleman dot com slash podcast for more information
and to find further reading. Send me an email at
podcast at eagleman dot com with questions or discussion, and
I'm making monthly episodes in which I address those. Until
next time, I'm David Eagleman, and this is Inner Cosmos.