Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Sleepwalkers is a production of I Heart Radio and Unusual Productions.
Every day I go to work, I use my car.
Every time I talk to somebody who's more than a
few feet away, I use my phone. So I'm
using certain technologies that extend me beyond my physical normal range. Yes,
so maybe I'm a cyborg. That's Sebastian Thrun speaking. He
(00:28):
founded Google X and he's appeared on Sleepwalkers a
few times. As far as we know, he's not a cyborg.
If someone took all technology away from me, like refrigeration
and everything, would I survive? I have no clue.
I guess I'd be catching deer here in my neighborhood. But yes,
I would be pretty miserable. You said something which I
found quite interesting. You said, I would love to directly
(00:51):
interface my brain to all the computers in the world,
so I could be truly superhuman. I would know everything,
every name, every phone number. If there was a button
you could press to do that right now, would you
have any hesitation pressing it? I would press it in
a microsecond. Why? I'm for sure the product of data.
I went to college, I read probably four or five
(01:13):
books in that time, and I built a scientific career
on the shoulders of others. That data empowered me to
be who I am. If you took that data away from
me and said, Sebastian, I'm going to put you into
China in the seventies, you're gonna work in the fields,
I would be less of a Sebastian than I am today,
because I would probably know how to plow a field,
but that would be pretty much it. To Sebastian's regret, there's
(01:33):
been a hard limit on how much data he can
actually consume. His hardware, his physical body, has held him back. Unfortunately,
the human I/O, the input/output with the ears
and eyes and smell and so on, voice, is still
very inefficient. If I could accelerate the reading of all
the books into my brain, oh my god, that would
be so awesome. We talked in the last episode about
(01:56):
brain computer interfaces in the medical world, helping disabled patients
like Jan Scheuermann move prosthetic limbs with their minds. Today,
we look at transhumanism, becoming cyborgs. What happens when
we merge with computers not just to restore function, but
to upgrade and enhance ourselves. And as we head into
this new future, who might get left behind? I'm Oz Woloshyn,
(02:20):
this is Sleepwalkers. So Kara, Sebastian was talking about cyborgs
and whether we're already cyborgs because we're so reliant
on our technology, our cars, and especially our phones, which
(02:43):
become kind of like a second brain, albeit a very
distracting one. Yeah. I think what he was saying is that
if technology is required to keep us alive, that merger
between humans and machines has already happened. We're already there. That's right.
But let's be real, that's not really what people think
about when they hear the word cyborgs. They think about
more extreme cases like Sebastian having his brain directly connected
(03:05):
to all the information in the world ever, you know,
and there are already some people working on that. There's
this guy named Elon Musk that you might have heard of,
and actually one of his new companies is called Neuralink,
and what he's trying to do is create a mesh that
can be inserted in the brain that connects the brain
to a computer. And I think part of his goal
(03:26):
is to make humans competitive enough to take on machines
if and when they outsmart us. Well, this idea
of humans and machines merging to create something superior is
of great interest throughout Silicon Valley, and one person who
kind of wrote the book on this transhumanism is Yuval
Noah Harari. He is a historian and a
(03:47):
futurist and the author of Homo Deus, a book that explores
the future of our humanity, and his writing is a
major source of inspiration, not just in Silicon Valley but
for this podcast. So I was rather excited when he
agreed to have a conversation with us. Every generation in
history thinks that they are on the verge of the apocalypse,
and usually they are wrong. But as a historian,
(04:09):
I have the sense that we are on the verge
of the most important revolution since the very beginning of life.
Yuval believes that right now we're actually looking at
the end of history as we know it, the dawning
of a new era. Something has changed. I wanted to
know what. We are really deciphering the underlying rules of the
(04:33):
game of life and now acquiring the ability to change
these rules. I mean, previously, the expectations of the apocalypse
were directed outwards towards the gods, towards some external entity
that will come and intervene and change everything. Now we
don't expect an external entity to do it. We expect
(04:56):
ourselves to do it. And when you look really calmly
and objectively at the advances that science has been making,
it doesn't sound so far fetched as Yuval would
have it. We no longer look to the heavens for answers,
because we ourselves are becoming gods, and as technology improves,
we're getting better and better at finding answers by looking inwards. Literally.
(05:21):
One of the biggest misunderstandings about the whole AI revolution
is that many people see it as a revolution coming
out of computer science, but actually it comes equally from
the life sciences, from biology, from brain science. It's not
enough that computers are becoming smarter. It's also essential that
(05:43):
we view humans as algorithms. If our feelings are not
the product of some kind of extremely complicated algorithm in
the brain, then no matter how smart computers will become,
many things won't happen. For something like self driving cars
to navigate a street full of pedestrians, the car must
(06:06):
be able to understand human behavior and human feelings. And
if you think that human feelings are the result of
some spiritual soul or something which can never be deciphered,
then you can't have self driving cars. In order to
create so many of the technologies we talk about on Sleepwalkers,
(06:27):
from self driving cars to targeted ads to parole algorithms,
we have to assume that human behavior can be modeled,
that our habits, routines, even our personalities could be expressed
as mathematical formulas. But even if you don't take things
that far, Yuval's key point about the transformative potential
of combining computer science and biology is shared by people
(06:50):
in the business of building the future. You may remember
Arati Prabhakar from earlier in the season. Arati ran the
Defense Advanced Research Projects Agency, DARPA. We're living in
a time in which the biological sciences are converging with
information technology. At DARPA we are always looking for these
(07:12):
areas where we see these seeds of technological surprise, and
today I think this intersection of biology with technology is
one of the most fertile seed beds for surprise. And
when these advances are coming from the ability to combine biology
with computer science, it may be hard for us to
predict what surprises might bloom. Yuval argues that advances
(07:34):
in AI and gene editing will lead to new forms
of life. We are now gaining the ability to break
or bend or change the rules of life. We are
about to create the first inorganic life forms after four
billion years of evolution. We can't really imagine what the
new entities will look like, because even the wildest dreams
(07:56):
are still subject to natural selection and organic biochemistry. So
according to Yuval, Kara, it's rather difficult to
say what might come next because it's somehow out of
our frame of reference. It's almost like saying what aliens
look like. And I don't know if you ever watch movies,
but in movies, aliens and cyborgs tend to look remarkably consistent.
(08:17):
I actually watched Men in Black last week, and it
is alien central casting. You know, a lot of representations
of cyborgs tend to have this element of cliche. You know,
there's a pitiable robot, the robot that's healed by love.
You can think of WALL-E, C-3PO, the
Tin Man from The Wizard of Oz. The Wizard of Oz was
my favorite movie. It's kind of a fantasy film. We've
actually talked about real life people who may meet the
(08:40):
qualification of cyborg. Already on this show, we spoke about
Jan Scheuermann in the last episode using a robotic arm
with her mind, but the research didn't actually end there.
In a subsequent experiment, Jan flew a simulated plane with
her brain waves. Remember, this research was being funded by
DARPA, and it raised big questions for Arati Prabhakar,
(09:00):
who was running the agency at the time, about the
ethics of transhumanism. When Jan went from moving a prosthetic
limb to controlling a flight simulator, that was the moment
that it became visible that these technologies that allow you
to restore function by harnessing motor control signals also mean
(09:22):
that now we have a way for a human brain
to engage with the world in a completely different modality.
It was a very eye-opening moment, I think, for
all of us. What did you feel in that moment?
It's creepy, it's powerful. It makes you realize that as
hard as the technology is, it might be the easiest
(09:44):
part of figuring out how we harness these capabilities for
good in the future. As we've said, this technology remains
firmly in research labs at least for now. Andy Schwartz
is the neurobiologist at the University of Pittsburgh who
led the project with Jan, and he's very clear that
his mission was to restore function, not enhance it. The
(10:08):
idea was, can we help paralyzed people coming back from
war zones regain function? So I think that was very
good motivation for the kind of research we are doing.
And despite the huge potential of Andy's research, he's quick
to point out that the brain computer interface he built
is much less efficient than our existing connection to the world.
At present, our bodies are the best we've got right now.
(10:30):
If you wanted to control the computer with brain activity,
it would be ten times slower than it would be if
you were typing with your own fingers. Things move so
fast that we've gotten accustomed to the idea that if
you see it once, it's just going to become pervasive.
That's actually not true of a medically invasive procedure that
involves putting implants on the surface of the brain. We're
(10:54):
going to need all of that time to figure out
where we want to go with these technologies.
So it's a perfect moment to ask where do we
want to go? For many in Silicon Valley, the principle
of a more direct interface between humans and computers is enticing.
But here's Sebastian again. At this point, I don't think
(11:15):
customers would love having surgery to get an implant in
their brain just to be a little smarter. But there are
other ways to approach the problem. You may remember Google Glass.
Sebastian led that project when he ran Google X, and
in fact, when I first met him in twenty twelve,
he was wearing a pair. We built Google Glass because
I wanted a camera right next to your eye, and
(11:36):
I wanted a speaker right next to your ear,
so the computer could perceive the same sensation, the same
stuff you see every day. I think these technologies become
closer and closer to us. When I first met Sebastian,
he was wearing these weird Google Glasses, Kara. He kind
of looked like a cyborg, and he told me that
these glasses might solve the problem of people constantly having
(11:58):
to look down at their phones because the information could
be at eye level. But that really hasn't solved the
problem of tech neck. Tech neck: women and men are now
getting this, like, very, you know, unsavory under-neck fat
from looking down at their phones. Yeah, and then there's
turtle posture, which is people hunching forward because they're sitting
(12:19):
at their computers so much so we don't know if
these are going to have an impact on us evolutionarily speaking,
but I imagine they might. Or our thumbs, if you think
about it. Well, our bodies are already being changed by
our interaction with our phones. Absolutely. There's actually, really quickly,
there's this fake video that was going around called Lookout,
where this guy invented a product where your phone basically
(12:39):
had the same camera it does, except it would face out,
so you never had to look away from your phone.
You could just look at your life through your phone,
which is essentially what everybody is already doing. Well, and
that was kind of the insight behind Google Glass:
what if you could overlay kind of augmented reality over
what you saw? And there's something very promising and cool about it,
but it wasn't commercially viable because you had to wear
(13:01):
something on your face that made you look like a cyborg.
But for consumers, the benefits of a computer at
that proximity just weren't enough to make up for the
inconvenience of wearing something on their face. There is one
group of people who are willing to put up with
inconvenient technology. You know, people like Jan who had her
skull opened so that she could move an arm again. Right
as Andy said, this wasn't about letting her fly planes
(13:24):
with her mind. It was about restoring function. When we
come back after the break, we look at a much
less invasive technology that hopes to be as miraculous as
Jan Scheuermann's robotic arm. Okay, let me... do you want
to test, to test it with the tambourine or whatever? Okay,
(13:56):
that's Noè Socha. He's an award-winning Italian blues guitarist
who lives in New York. When Noè came to play
for Julian and me in the studio earlier this year,
he asked that I meet him at the bus stop.
I went to Berklee College of Music in Boston to
study jazz guitar performance. And Noè's the real deal. While
he was at Berklee, he was given the Jimi Hendrix
(14:17):
Award for being the school's leading guitarist, as well as
the Billboard magazine endowed scholarship. And now he plays
a consistent roster of shows around New York. But he's
working to reach a digital fan base as well. Every
day Noè posts a video to help build his audience.
This is how I found him. What you wouldn't know
just listening to this podcast is that Noè also
(14:37):
happens to be blind. I was born three months early,
so they put me in the incubator, but there was
too much oxygen. In his infancy, Noè suffered from retinopathy
of prematurity, known as ROP. He needed surgery
to reattach his retinas and now he has very limited vision,
totally blind, like blind. Technically, I guess it's called light
(15:00):
and shades or something. I mean, I can see colors,
I can read really large print only out of one eye,
but my vision field is small, so I still use
the cane and I still read braille. This is why
Noè asked me to pick him up at the
bus stop. Getting to and from new destinations is a
huge pain in the ass. But here's the thing. There
are some companies that are trying to fill the gap
and make smart glasses for people with visual impairments. Their
(15:23):
hope is that with the right technology, blindness could become
a thing of the past. When we were researching for
this podcast, Julian came across a new technology company. Yeah,
after a few months, I had a pretty AI focused
search history, right, so my Instagram ad started to reflect that,
which is convenient, you know, AI helping us make a
podcast about AI. Yeah, super considerate, right? So I
(15:45):
got an ad. eSight presented itself, and eSight
bills themselves as creators of the world's most advanced
sight-enhancing glasses for the visually impaired. Now, medical tech
can be pretty expensive, but eSight recently lowered their
prices for a pair of these glasses.
What do they call that? A price slash?
(16:06):
All these glasses must go. So it's still pretty steep,
but eSight has payment plans, and health insurance can
help people out as well. Yeah, but even so, I
mean I'd rather spend money on guitars, you know, like
that's what I do in my life. Like, why would
I spend money on something that is still so experimental?
Still, Noè was curious to test out eSight's technology,
(16:27):
so I reached out, explained Noè's condition, and we got
an appointment. Yeah, don't do anything crazy when you put
them on, don't start jumping around like a crazy person. Yes,
if it doesn't work, just make it up. I'm, I'm doing.
The eSight glasses basically look like Cyclops from X-Men,
(16:51):
except without the blinding laser beam that shoots out from them.
The glasses capture high quality video with a camera above
your nose, and then project the video to a high
definition screen in front of each eye. The footage is
enhanced by software designed especially for people with vision impairment.
You can also zoom, change focus, and do things like
boost contrast or go to grayscale. An eSight
(17:14):
rep named Nigel helped Noè get fitted with a pair,
but it can be a little hard to get used to.
I mean, I am trying to see. Isn't it like
right here where I should be looking like? Yes, it's
it's, it should be right in front of your pupil. Um, yeah,
it's a little it's a little high. Let me see
if I can just adjust that. Yeah, there you go.
That's better? Is better? Yeah? That seems like I get
(17:36):
more details out of things, but I don't really know
what I'm looking at. Nigel also asked Noè to
identify me. Let's, let's see if you can recognize characters.
I see, I see it. Do you know what I
look like? No. Like, do you have a sense
of what I look like? I mean from what I
saw when I was close to you, you have like
light hair, like light brown or something like that.
(17:57):
That's right, And uh yeah, I mean it's cool. I
just don't know, you know, we have to see how
useful it would actually be, Like, I mean, I don't
think I would use it to recognize people there,
because I would have no way of I don't know
what anybody's face looks like anyway, you know. Because
he has been blind for his entire life, Noè doesn't
have a point of reference for the details in people's faces.
(18:19):
Imagine that you have no idea what any of the
people in your life really look like. You learn to
identify them in other ways, like their voices. So Noè
is ambivalent about the prospect of seeing people clearly, but
he would like it to be easier to get around.
Recognizing people's faces would not be one of my priorities,
but reading where the bus stop is or when it's
(18:40):
coming will be a priority. I mean, it's cool, it's interesting.
I would just have to see how it actually practically
will be because I mean here's the thing, Like, when
you're outside, it's not that you have time to stop
and figure stuff out, especially when you're walking on the
sidewalks of New York. Like, I'd rather just use BlindSquare.
Noè first told me about BlindSquare when I picked
him up from the bus. It's just the most popular
(19:01):
iOS navigation app for blind people. It describes the environment
you're in, the intersection you're crossing, and announces points of interest,
all using GPS. Noè's point in mentioning BlindSquare to
Nigel is that it is a tool that he can
work with within his set of limitations. Now, he spent
his whole life navigating the world one way, his way,
(19:21):
and eSight would actually change that, maybe more than
he would want. I've never really seen a subway map anyway,
so I would have to get the hang of how
it works first, and then because your brain needs to
recognize it and then know what it's looking at, that's hard
for human beings to comprehend. I think, yeah, it's like
imagining a new color, right? Yeah. So Julian, eSight worked,
(19:45):
but it wasn't perfect for Noè. And I think
it's important to note that eSight is actually really a
cool technology, whether or not it worked for him. Nigel
mentioned that many elderly people who suffer from macular degeneration
losing their sight due to old age, have been able
to see their grandchildren for the first time, which is
really cool. I mean, it does restore sight for a
lot of people who have seen before. Noè said the
sacrifices he makes to get around to his gigs as
(20:07):
a blind person aren't so bad, but there is one
part of his career that's less easy to navigate. Like
many musicians, Noè uses his social media channels as
his primary way of connecting to his current fans and
hopefully reaching future ones. The problem is it's pretty hard
to reach your fans on Instagram if you're blind. Instagram
(20:27):
needs to listen to me right now. Uh, no. Basically,
the issue is this: editing videos and putting in the part
that you want is not accessible on Instagram. You know,
thankfully the Apple app is accessible. So I edit on
the iPhone app, because there are like sliders and it
tells you like current position, zero seconds of like two
minutes and thirty seconds, and so basically you slide with
(20:49):
your finger up and down. If it wasn't for that,
I wouldn't be able to post on Instagram. I would
only be able to post the first minute, which is
not what I always want to post. On the songs,
sometimes I post some other part, you know. But Instagram
hasn't implemented that in the app because, you know, blind people aren't
supposed to be on Instagram. I've never thought about this.
Digital technology is almost exclusively visual. I don't think about
(21:11):
how easy it is for me to use an iPhone
every day. I just use it. And when you have
these technologies like iPhones being designed by people in Silicon
Valley who fit pretty healthily within the norm, it's not
really doing Noè any favors. And when we focus
on the next big thing in tech, we can leap
frog past uses for these existing technologies. So we say, okay, well,
when you're starting the next big unicorn, we want to do
(21:33):
these miraculous, gigantic ideas. But you don't think about subway maps.
You don't think about things like BlindSquare unless you
actually need that app. And I think, we're not
anti-technology or anti-unicorn. Those things are important, you know. Noè
actually said something really funny, which is that, like,
he would love a self driving car. Of course he
would. So we all benefit from these moonshot innovations, but
(21:56):
it's really important I think that everyone thinks about who
is forgotten. When we innovate too quickly, we don't all
benefit in the same ways. We began this episode debating
what defines a cyborg and asking how close we are
(22:17):
to being transhuman. These are important philosophical questions, and they
may even become practical questions in our lifetime. But as
we wrestle with them, it's important to remember that a
huge number of people still don't have access to technologies
that most of us take for granted to expand our
horizons and what we can do. There are billions of
people in the world who don't have smartphones, and millions
(22:40):
of people in the US, like Noè, who can't take
full advantage of them. Bryony Cole is tackling this problem
head on. After working at Microsoft, she founded The Future
of Sex, and as part of her work, she produces
sex tech hackathons around the world. I think it's so
important because no one's talking about it. We all
(23:00):
got here by having sex. Sexual identity is so core to
who we are, and the concentration of people innovating in
this field is quite small. So Kara, I was a
bit nervous to talk about sex in this episode because
I don't want people to just think we're being sensational for
the sake of it. But it is an area where
we can see the tangible consequences of what happens when
(23:23):
one group designs for the rest. Like with Noè,
I don't think it's sensational at all. It's like with
Gillian in episode one, when people programming ads for Facebook
do not understand different outcomes for pregnancy. They don't know
that an outcome for pregnancy could be still birth, and
that's how Gillian's ads were making her life worse because
of the people programming them, men by and large. By
(23:45):
and large. Now, it takes these edge cases like Gillian
being haunted by targeted ads for us to notice that,
more often than not, we don't have a counterfactual.
We can't make a comparison to a version of technology
that wasn't built by Silicon Valley. And that is exactly
why Bryony Cole organizes events to design new sex tech.
(24:06):
While hackathons sound really geeky, and typically you're going to
attract just people involved in technology, we're really careful in
structuring the hackathon, inviting other people in from other classes,
from other ethnicities, from other genders to allow us to
build different sorts of products. What was the surprise that
we didn't design for was people with disabilities that showed up.
(24:30):
And with these new groups of people came new kinds
of products. One team came up with a voice activated
vibrator for people in wheelchairs. Another team in Singapore, a
deaf man and a blind man built a sex and intimacy education
platform about bringing a woman back to their dorm room
and really not knowing how to read intimacy cues because
(24:53):
they've never been taught. These people just showed up
and said, thank you, we feel invisible. Technology very quickly
becomes part of our homeostasis. We take for granted the
fact that we have a supercomputer as our constant companion,
and that we can summon the entire corpus of human
knowledge with the flick of a finger. But as we
(25:14):
constantly focus on the next frontier, what's new, it's easy
to forget that many people don't get to share in
the fruits of what we've already built. Other populations that
also resonate with the hackathons are people in remote areas or
rural populations that have trouble finding a mate, and of
course women, which is predominantly who turns up to these hackathons.
(25:37):
It's people that typically don't have access to this, providing
input into how we build it. Bryony's hackathons allow people
who feel invisible to create products directly for themselves, people
who may even be invisible on the campuses of Silicon Valley.
For Bryony, access begins at the design phase, and including
(25:58):
new voices can create new products and technological surprises that
otherwise might not exist. So perhaps our hackathons could serve
as a model for tech innovation more broadly. I think
where I'd most like to see it go is to
sort of the trickle down effect across the world in
terms of how that technology reaches other populations because it
is so concentrated in these like Silicon Valley types. It's
(26:22):
the access that I'm more excited about. We actually have
incredible technology available to us, and yet the uses
right now are so clunky, you know, and we
just need to figure out how we're going to use
it and who we're going to invite in. Bryony paints
the picture of a world where the benefits of god-like
technology spread to all people. But as we've talked about
(26:44):
before on Sleepwalkers, there's a very real possibility that technology
will be co opted by the rich and powerful. So
we have to ask who gets to become a cyborg?
And what if the new gods don't want to share
their divinity? Here's Yuval again. One disturbing thought
is that there is no us, There is no we.
(27:05):
Different groups of humans have a very different future, and
they should prepare themselves in a different way. It could
lead to the creation of a new caste system with
immense differences in wealth and power, much greater than we
ever saw before in history. It could even lead to
a speciation, Homo sapiens splitting into different species with different capabilities.
(27:30):
So the descendants of Elon Musk will be a different
species than the descendants of people who are now living
in some favela in São Paulo. So it's a big
question always what is the future of humanity? What should
we do? What is our future? What is the future
of our humanity? What are we sleepwalking towards? It's tempting to
(27:53):
look for the dramatic ways technology is changing us. Might
we be able to physically merge with computers? Will computer
vision help the blind see, or the marginalized experience intimacy?
They're big questions, but there's one that's even more urgent.
How is the AI revolution already changing the way we
think about and perceive the world. What are the algorithms
(28:15):
we interact with every day doing to us, and how
are they changing our society? More on those questions with
Yuval Noah Harari when we come back. Until
(28:35):
I was twenty one, I didn't know I was gay.
And I look back at the time when I was
I don't know, fifteen or seventeen, and I just can't
understand how I missed it. I mean, it should have
been so obvious, But the fact is I didn't know
an extremely important thing about myself, which an AI could
have discovered within, you know, like two minutes. For Yuval
(28:58):
Noah Harari, AI could have made a big
difference to his early life, and that got him thinking.
When we talk about AI, we tend to greatly exaggerate
the potential abilities, but at the same time we also
tend to exaggerate the abilities of humans. People say that
AI is not going to take over our lives because
(29:20):
it's very imperfect and it won't be able to know
us perfectly. But what people forget is that humans often
have a very poor understanding of themselves, of their desires,
of their emotions, of their mental states. For AI to
take over your life, it doesn't need to know you perfectly.
(29:42):
It just needs to know you better than you know yourself.
And that's not very difficult because we often don't know
the most important things about ourselves. So let's say you
could turn back the clock to being fifteen, would you
have wanted to live in a world where there were
sufficient sensors to monitor your eyes, your eye movement, your breathing,
(30:04):
you know, while you're going about your daily life, and
then to interpret that and say to you, Yuval,
more likely than not, you're gay? That's a very good question,
which will become a very practical question in a few years.
In the way that I grew up and developed, it
would have been a very bad idea. I wouldn't like
(30:25):
to receive this kind of insight from a machine.
I'm not sure how I would have dealt with it
when I was fifteen, you know, in Israel, in the
nineteen eighties, and maybe partly it was, you know, a
defense mechanism. In the future, it depends where
you live. Brunei has instituted a death penalty for gay people,
at least for people engaged in homosexual sex. So if
(30:48):
I'm a teenager in Brunei, I don't want to
be told by the computer that I'm gay, because the
computer will then be able to tell that to the
police and to the authorities as well. Looking to the future,
say ten twenty years, the danger is if I still
don't know that I'm gay, but the government and Coca
Cola and Amazon and Google, they already know it.
(31:10):
I'm at a huge disadvantage. So it could be something
as frightening as the secret police coming and taking me
to a concentration camp. But it could also be something
like Coca Cola knowing that I'm gay, they want to
sell me a new drink, and they choose the advertisement
with the shirtless guy and not the advertisement with the
(31:32):
girl in the bikini, and the next morning I go
and I buy this soft drink and I don't even
know why, and they have this huge advantage over me
and can manipulate me in all kinds of ways. What
Yuval suggests is that once we become reducible to data,
we become predictable to algorithms, and once we're predictable, we
can be manipulated. We talked in the last episode about
(31:57):
how AI is helping us decode life's fundamental mysteries: brain waves,
health outcomes based on genetics, time of death. But the
next frontier could be our very behavior. Here's Arati again.
We have tools to evaluate vast volumes of data that
we have previously collected or that we've always collected. We
(32:18):
have new sources of data. Think about everything from fitbits
and those kinds of measurements that you can make on individuals,
to the volume of data that people are spewing out
into the online environment every day. The implications of spewing
this data onto the internet are where we began the series.
And knowing that your data is being used to build
(32:40):
an accurate model of you might make you pause about
putting it out there. But it's a combination of that
data with advances in AI that's allowing us to start
to see into the future. Data plus that deep knowledge
now allows you to form hypotheses and to design experiments
that allow us now to start a journey of building
(33:02):
better models of these complex systems. This is at the
core of the revolution that we're seeing. But I think
the next wave, after starting to understand biology, is about
a transformation in the social sciences. Of course, in some ways,
that transformation in the social sciences is already here. It's
what Yuval was talking about in terms of computers
(33:24):
that understand us better than we understand ourselves. And we've
already seen the real world consequences altering the course of
history with Cambridge Analytica, Brexit, and the 2016 presidential election.
But Yuval believes that was just the beginning, and this
revolution in biology and computer science will shake the very
foundations of how we live. The ideology of humanism basically
(33:50):
says that the human experience is the ultimate source of
authority and meaning in the universe. If you look at politics,
then if originally political authority came from the gods, now
political authority comes from humanity. The idea is that the
voter is always right. You look at economics, the motto,
the slogan, is the customer is always right. If the
The slogan is the customer is always right. If the
customers want something, then however irrational and illogical it is,
this is what the entire system is geared to provide.
And you have the same idea in ethics, Why is
it wrong to murder? Not because some God said so,
(34:32):
but because it hurts other people? So we already view
ourselves in this sense as kind of divine entities that
provide meaning and authority to the world. The big question
is what happens once some algorithm can decipher and manipulate
human feelings and experiences, then they can no longer be
(34:54):
the source of authority if it's so easy to hack and
manipulate them. And this is part of the crisis which
we are already beginning to see today. This crisis is
the intersection, the culmination of many of the ideas that
we've spoken about on this first season of Sleepwalkers, and
it's why it's so urgently important not to ignore the
(35:15):
changes going on around us, and to wake up. If
we refuse to see it, if we just hold on
to this liberal belief that humans are free agents, we
have free will, nobody will ever be able to understand me,
nobody will ever be able to manipulate me. If you
(35:35):
really believe that, then you are not open to the
danger and you won't be able to reinvent the system
in a better way. Not everyone agrees that we've reached
the point Yuval describes. According to Arati, there's a
world of mystery that we've barely begun to penetrate. I
think we are so far from having a full understanding
(36:00):
of any of these systems. The fact that we are
making this rapid accelerated progress, I think sometimes leads to
a hyperbolic sense that we're going to know everything and
it'll all get reduced to a bunch of algorithms. And
I don't think we're anywhere near that. And I'm not
even sure that's the endpoint, not the end point in
a sense that that will never happen, or there may be
(36:20):
an endpoint beyond that. My own view is that what
it means to be human is so much richer than
these mechanistic components that we're talking about. You know, if
you said to me, when do you think we will
fully map and be able to predict behavioral systems, my
answer might be never. The incredible depth of these systems,
(36:40):
how messy and organic they are, means we've got a
long way to go before we've understood everything. The uniqueness
of our humanity lives to fight another day. But as
Yuval said, for AI to profoundly influence us, it
doesn't need a perfect understanding. It just needs to know
a little bit more than we do, and that's become reality.
(37:01):
So what should we do on the individual level. It's
more urgent than ever to get to know yourself better
because you have competition. Once there is somebody out there,
a system out there or an algorithm out there that knows
you better than you know yourself, the game is up.
You can do something about it, not just by withholding data,
(37:23):
but above all by improving your own understanding of yourself.
The better you understand yourself, the more difficult it is
to manipulate you. Know thyself. Yuval references the
ancient Greek maxim, and last episode Siddhartha called our
attention to the Hippocratic oath. In the midst of the
(37:43):
incredible upheavals of modernity, we're being urged to remember the
wisdom of the ancients. But how do we get to
know ourselves better? I meditate, some people go to therapy.
Whatever works for you, do it, and do it quickly.
Get to know yourself better, because this is your best
protection against being hacked. If you're an engineer, then one
(38:06):
of the best things you can do for humanity is
build AI sidekicks that serve individuals and not corporations or governments.
AI systems that yes, they monitor you and they analyze you,
and they hack you, but they serve you. They reveal
what they find to you, and they work like
(38:27):
an antivirus, just like your computer has an antivirus, so
your brain also needs an antivirus. And, uh, and
this is something that AI can do for
us, to protect us against other malevolent AIs. Of course,
one of the problems we've talked about on the series
is that the vast majority of talented engineers work for
the very corporations Yuval says we need to be protected from.
(38:50):
But there is something compelling about using technology as a
tool to protect ourselves from other technology. And Bryony Cole
also sees avenues for technology to help us get
to know ourselves better. We can explore in virtual worlds
without shame. It's pushing us to reveal ourselves, even things
we didn't even know about ourselves. If we think about, Wow,
(39:13):
this dark and wacky world of sexuality that we haven't
even explored ourselves because we've been too afraid of what
we might discover. You put technology in there where you're
suddenly able to create any world that you want, These
totally fantastical edges of our minds that we can go
to thanks to technology. And according to Bryony, this is
(39:36):
not just about seeking thrills. Our sexuality is actually driven
by something much more profound, the core of our humanity.
We want to connect, we want to belong, we want
to feel like we're part of something. That's sort of
like the core part of that, right down to our sexuality.
What is the future of sex? The answer has nothing
to do with technology, and it always has to do
(39:58):
with being human. We can take great strides personally to
get to know ourselves better, but we also have to
recognize the limits of what we can achieve as individuals.
To create a technological future that is fair and positive,
we need governance and policy. Here's Yuval again. We need
to regulate things like the ownership of data and the
(40:19):
immense power, the divine powers of creation, of being able
to engineer and create life. This should be a
major political issue, of who owns these kinds of abilities.
This is not something you can do by yourself. So
here my best recommendation is to join an organization. Fifty
people in an organization can do far, far more than
(40:43):
five hundred individual activists. So whatever cause seems to you
the most important, join a relevant organization and do it
this week. The AI revolution isn't far off in the future, Kara,
as Yuval says, it's here with us, and we have
(41:04):
this personal and political responsibility to make sure that the
future is a future we want to live in. Yeah,
I think that's actually a good place to leave Season
one of Sleepwalkers. Don't, you're breaking my heart. Yeah. It's
been incredible to hear you say that, Kara. But also
it's been incredible to report on, you know, to meet
people like Noè and Gillian Brockell or Glenn Rodriguez,
(41:27):
and to understand how this new technology is affecting people's lives.
It's been a real privilege to hear those stories and
also to get access to some of the people who
are building the technologies we live by, people like Yasmin
Green at Google's Jigsaw, Nathaniel Gleicher, Facebook's head of cybersecurity,
and Ben Singleton, director of analytics at the NYPD. We
got to hear firsthand what these leaders in the world
(41:50):
of technology are thinking about and how hard they wrestle
with the ethics of their creations. And there's so many
more areas that AI is transforming and that we're going
to look at in season two, things like money
and climate change and some of the problems that I've
noticed even since we started doing this podcast, like what
happens if your boss is an algorithm. Well, that's why
we're doing a season two of Sleepwalkers, So please join us.
(42:12):
And in the meantime, we'll be keeping our Instagram, Twitter
and website updated. That's at Sleepwalker's Podcast on Instagram and
at Sleepwalker's Pod on Twitter. And if you have any stories,
suggestions or criticisms, send us an email at Sleepwalkers Press
at iHeart media dot com. That's Sleepwalkers P R
E S S at iHeart media dot com. Well, that's
(42:33):
all from us. I'm Oz Woloshyn, and we'll see you next time.
Sleepwalkers is a production of I Heart Radio and Unusual Productions.
For the latest AI news, live interviews, and behind the
(42:56):
scenes footage, find us on Instagram at Sleepwalkers Podcast and at
Sleepwalkers Podcast dot com. Special thanks this episode to the Forward,
the digital news and culture website, and Jonathan and Cody
from the SEO agency who make it all work here in
New York City. And thanks to Noè Socha, that's
S O C H A, for his involvement in this episode.
(43:16):
If you'd like to hear more of Noè's music, you
can find him on Facebook, YouTube, and Instagram at Simple
Blues Boy. Thanks also to Ellen Ullman, author of Life
in Code, and Gary Marcus of Robust dot Ai, who
gave generous interviews which helped shape our thinking for this series.
Sleepwalkers is hosted by me, Oz Woloshyn, and co-hosted by
Kara Price, and produced by Julian Weller, with help
(43:38):
from Jacopo Penzo and Taylor Chicoine. Mixing by Tristan McNeil
and Julian Weller. Our story editor is Matthew Riddle. Recording
assistance this episode from Joe and Luna and Sabrina Boden.
Sleepwalkers is executive produced by me, Oz Woloshyn, and Mangesh Hattikudur.
For more podcasts from I Heart Radio, visit the I
Heart Radio app, Apple Podcasts, or wherever you listen to
(43:59):
your favorite shows.