Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Sleepwalkers is a production of I Heart Radio and Unusual Productions.
We can choose to have a poker face, but the
point is that our bodies are still reacting and what's
changed is the ability to see those signals. That's Poppy Crumb,
chief scientists at Dolby Labs and a professor at Stanford University.
(00:26):
Her work is at the forefront of neuroscience and data science,
and it's bad news for the poker face. Free Solo just won an Academy Award, and the filmmakers were at my company doing a screening, and we had captured carbon dioxide from the audience, with their approval, of course, and I wasn't actually at the screening. I just saw
(00:47):
the CO2 capture and I knew exactly where the climbs were, where he abandoned his climbs. As the audience watched Alex Honnold attempt to climb El Capitan, their bodies responded to the suspense, their breathing changed, and thanks to the carbon dioxide sensors in her theater, Poppy had a map of the audience's emotional experience. It's this power of the audience
(01:09):
on that journey and experiencing it with the filmmakers, and
it's pretty exciting to see that engagement in the theater
and have that history. But our breath isn't our only tell. There's an increasing number of ways machines are becoming
able to read us, even how hard we're thinking. With thermal cameras, you can look at the dynamics
(01:32):
of blood flow to know stress levels and engagement. Just
in the infrared signatures, you can understand cognitive load. You can then look at micro expressions with facial recognition to get at not just whether I'm feigning emotion, but really
the authenticity of what I'm experiencing. That gives us a
lot of insight about how hard my brain is working
(01:52):
and how engaged I am. We haven't changed as humans. What's changed is the ubiquity of sensors and the capacity of sensors. The cost: just fifteen years ago, the cost of a typical device was far higher. Now you're looking at those devices, not even having to be close up, for pennies on the dollar, integrated into every pair of smart glasses.
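Poppy doesn't spell out her exact pipeline, but as a purely hypothetical sketch of the idea, here is how a CO2 trace from a screening could be turned into a rough map of heightened audience moments. The file name, sampling rate, smoothing window, and thresholds are all assumptions invented for this illustration, not details of Dolby's system.

```python
# Hypothetical sketch: locating moments of heightened audience response in a
# theater CO2 time series. File name, sampling rate, and thresholds are
# illustrative assumptions, not details from any real pipeline.
import numpy as np
from scipy.signal import find_peaks

co2_ppm = np.loadtxt("screening_co2.csv")  # assumed: one reading per second

# Subtract the slow drift (the room fills with CO2 over a screening) so that
# short-lived surges, like a collective gasp, stand out.
window = 300  # 5-minute moving average at 1 Hz
baseline = np.convolve(co2_ppm, np.ones(window) / window, mode="same")
surges = co2_ppm - baseline

# Peaks well above the local baseline are candidate "suspense" moments.
peaks, _ = find_peaks(surges, height=2 * surges.std(), distance=60)
for sample in peaks:
    print(f"heightened audience response around minute {sample // 60}")
```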
(02:14):
Going forward, we're on the cusp of two explosions. The
power of machine learning to find patterns and make predictions,
and simultaneously the miniaturization and affordability of cameras and other sensors.
Last episode we talked about facial recognition and surveillance by governments.
But when machines can track how we're feeling, our most private
(02:37):
selves become readable, and while that may sound frightening, it
also holds enormous promise for many parts of life, from beginning to end. This episode, we look at what's changing and what's possible. I'm Oz Woloshyn. This is Sleepwalkers.
(03:06):
So there are quite a lot of situations where, personally, I don't actually want to be read. I'm not sure about you. Like when I'm playing Texas hold 'em, I don't want people to know what I'm thinking, right,
and not just at the poker table. In fact, our
society kind of relies on the idea that we can
(03:27):
look one way but feel another. Obviously, in plays like Hamlet,
interiority is dramatized. But more broadly, societies where people have no privacy tend to be a bit scary.
It is scary because the last thing we have on
earth is our privacy. You know, it's like people have
this impulse to share everything on Instagram and give away
their name to a company that wants to sell them jewelry,
(03:49):
and it's just like, truly, our deepest secrets are the last thing we have. Well, the last thing we had.
Right. Poppy did a full TED talk on this, and
the thing that I took away from it is will
we live in a near future where a slasher film
will be edited with people's biometric data in mind? Poppy
certainly sees that on the horizon, and she has a
term for technology that starts to understand us. Empathetic technology
(04:12):
is the idea that, you know, it's not technology
that empathizes with me or technology that is trying to
emulate human empathy. It's technology that makes use of my
internal experience to be able to integrate that as part
of its interface. Today, it's impressive that Poppy can understand
an audience's emotional journey watching Free Solo by tracking the
levels of CO2 in their breath. But tomorrow it
(04:35):
could lead to new kinds of art and entertainment that
respond to us. A really great hip hop producer I was
talking to wants to create music that is personalized, almost
like a tailored suit for individuals. So you start to
think about a very dynamic integration of the human experience.
That experience becomes something that our technology can be aware
(04:56):
of and optimized for. I want my technology to make
the right decisions so that the experience I have
with it is seamless. Seamless: such a seductive word, they named a food delivery service after it, but a dangerous
word too, because for technology to read and respond to
us in real time, it needs to make decisions about
(05:16):
us on its own. You may remember last episode we
spoke with Lisa Talia Moretti about some of the risks
of facial recognition technology, but that's not her only area
of research. Something that I was looking into really recently is that our relationship with technology is completely shifting. So we're
moving from a relationship with technology where we are asking
(05:39):
it to do something, you know, it's a pure sort
of input-output. And what we're moving towards now is
a relationship with technology where we are trusting technology to
make decisions on our behalf. Lisa teaches at Goldsmiths in
London and Cardiff University. She told us about how her
students have encountered technology making its own decisions about their
(06:02):
future prospects. One of the things that the students are
starting to do is to game the algorithms that are
being used to mind through candidates cvs. And so what
they figured out is if they put right in white
text anywhere on their CV UM Cambridge, Harvard, Oxford, they're
(06:24):
more likely to get through to the interview process. The students know that recruiting algorithms prioritize applications from certain schools, so they pepper their applications with words they know the algorithm will like, but written in white, so the human recruiters are none the wiser. They're marketing themselves straight to the AI. They're gaming the algorithmic system, which I think is pretty genius.
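The episode doesn't identify any particular recruiting product, but a toy sketch of why the white-text trick can work looks like this: a naive screener scores whatever text it extracts from the document, and invisible words are still part of that text. The keyword list and scoring rule below are assumptions for illustration only.

```python
# Toy illustration only: a naive CV screener scores whatever text it extracts
# from the document. White-on-white words are invisible to a human reader but
# still present in the extracted text, so they still earn points. The keyword
# list and scoring rule are assumptions, not any real vendor's logic.
PREFERRED_KEYWORDS = {"cambridge", "harvard", "oxford"}

def keyword_score(extracted_cv_text: str) -> int:
    """Count how many preferred keywords appear in the extracted text."""
    words = set(extracted_cv_text.lower().split())
    return len(PREFERRED_KEYWORDS & words)

visible_text = "BSc Computer Science, three years of data analysis experience"
hidden_text = "Cambridge Harvard Oxford"  # rendered in white, invisible on the page

print(keyword_score(visible_text))                      # 0
print(keyword_score(visible_text + " " + hidden_text))  # 3
```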
(06:46):
There are also situations where students or candidates are having to conduct their first interview at some companies purely online, and there's no person on the
other side. You're essentially talking into your webcam, and there's
algorithmic technology that is recording your voice, that's listening for intonation,
(07:08):
that's listening for the types of words that you're saying,
like if you use smart words or your language isn't
perhaps at a level that they would think is appropriate
for business. But do we want computers to deny opportunities, without human review, to job applicants who may be qualified but not fully polished? And the algorithms weren't only analyzing
(07:29):
the students' words. They're also looking at your facial features, and so they can say if you're nervous or shy. And some of my students have said that if they very quickly use, like, hand gestures, they confuse the camera, and the camera can't see if they were nervous or shy for those particular moments.
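Again, no specific vendor or model is named in the episode; as a toy sketch of just the word-choice piece of such a system, here is a hypothetical scorer that checks an interview transcript against a preferred "business vocabulary." The word list and scoring rule are invented, and real systems also analyze audio and video signals like intonation and facial movement.

```python
# Toy illustration of the word-choice component only. The "business vocabulary"
# list and the scoring rule are invented for this sketch; real interview-scoring
# systems are not described in the episode and also use audio and video signals.
BUSINESS_TERMS = {"stakeholder", "deliverable", "strategy", "leadership", "metrics"}

def vocabulary_score(transcript: str) -> float:
    """Fraction of distinct words in the transcript that are on the preferred list."""
    words = {w.strip(".,!?").lower() for w in transcript.split()}
    return len(words & BUSINESS_TERMS) / max(len(words), 1)

print(vocabulary_score("I set the strategy, tracked metrics, and showed leadership"))
print(vocabulary_score("I just talked to people and sorted things out"))
```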
There's something quite hopeful about these students evading the algorithms designed to read them. It's not
(07:52):
quite the summer of '68, Karah, but the
youth have still got something. Well, it's true that you
and I both look very good in black leather and
are considered cyberpunks. That's how we met, at that cyberpunk rally.
I was reading about this Kickstarter campaign called Reflectacles. Like spectacles, but reflective. That's right. And that's because they reflect visible
(08:15):
and infrared light when you look back into a camera that's watching you, which is like an ultimate, like a techy middle finger: I'm gonna look right back in this camera, and it's just gonna, like, bounce light back at it. Completely. So you know, there are methods obviously
for resistance. But what I'm worried about is that algorithms
are very smart and they will wise up and be
(08:37):
harder to trick. Right, and showing up for your job interview in Reflectacles probably carries its own burden as well.
It is how I got this job, though. But we
should also ask ourselves why these companies are using AI
to filter candidates and conduct interviews. And of course it's
really about saving money and saving resources, which brings up the big question of the series: who benefits from this
(08:58):
new technology? Amazon and Facebook and Google. When we come back,
we look at the economics of giving up our data
and what we get in return. Sensors and AI that analyze our response to movies or decide if we're a good fit for a job may sound like the stuff
(09:20):
of dystopian science fiction, and that's because it is. I'm placing you under arrest for the future murder of Sarah Marks. Yeah, oh, gosh. Well, it's interesting you
mentioned Minority Report because, um to this day, so many
(09:42):
years later, decades later, I'll be in some meeting in
Silicon Valley and we'll be looking at some gadget, and somebody will say, Wow, this gadget is great. It's like from Minority Report. It's so cool. And I'm like, that
was supposed to be cautionary. That was a description of
the bad world. That was what we want to avoid. Oh, for God's sake. That's Jaron Lanier. He's a research scientist
(10:04):
at Microsoft and in the eighties he coined the term
virtual reality after helping invent the field. Jaron's thought a lot about what our relationships with technology mean for us. So when Steven Spielberg was making Minority Report, he called on Jaron to act as a technology consultant. Mostly, what
I've taken from Minority Report is that just trying to
(10:24):
do cautionary portrayals of technology actually backfires, because there's some
way that it's a little bit like when you show
the lives of billionaires, people don't get angry about, like, why do those people have monopolies or why they own whole
islands or something. Instead they say, oh, I identify with
that person. Maybe I could own a whole island someday.
And despite our fascination with dystopian fiction, we also have
(10:48):
a tendency to fantasize about ourselves as the beneficiaries, not
the victims, of the systems we create, and we tend
to ascribe to those systems their own will, even though we've
made them. Early in the history of capitalism, Adam Smith
suggested that capitalism or markets were an invisible hand, a sort of life form. And in the same way
(11:08):
that you can interpret a market as being this living
thing just because it's a little beyond our understanding, it's
a little too complicated to fully predict and fully understand,
and that's actually its power. In the same way,
big computational systems can be a little out of control,
not entirely, but even if they're only a little bit,
you can interpret that as being the new invisible hand,
(11:30):
which we call artificial intelligence. Invoking an external force like
the invisible hand, or an algorithm that automatically reads resumes
or makes parole recommendations obscures real human decisions. We have
to remember that our creations reflect us. If you use
that to abdicate your responsibility, if you use it just
(11:50):
to cower in fear, then you're not being a good
computer scientist. That is not the responsible way to do things.
Just as if an economist says, well, the invisible hand
says all these people should starve, that's not a responsible economist.
The responsible economist fixes the problem. In a sense, I
think it's very hard to be effective if you believe
(12:11):
in some kind of magical agency in your own inventions.
I think you make yourself into an idiot. And so
so I'm really concerned that not only economists but computer
scientists make that error all the time. It's almost like
a new form of mythology. I've been calling it alchemy lately.
But yeah, sure, it's certainly easier to say, oh, we
should respect this amazing autonomous living thing that has arisen
(12:34):
in our own inventions. It's much easier to say that
when it's benefiting you and you're getting very rich. Jaron
puts his finger on a central irony in our relationship
with technology. When our creations benefit us, we're quick to
forget who pays the price. People who translate between natural languages,
such as between English and Spanish, have seen their career
(12:54):
prospects on the whole decrease tenfold since the arrival of automatic translation, which is offered for free by companies like Google and Microsoft. Now, the thing is, you might say, well,
this is very sad, but it always happens. People's jobs become obsolete when new technologies come along. The buggy whip goes away and the motor car comes along. All right,
(13:17):
but the problem is that every single day, those of
us who help run these free services have to scrape
or steal tens of millions of example phrases from all
over the world from people who don't know it's being
done to them. And the reason why is that every single day there's new pop culture and slang and public events and memes and on and on, and so you need to
(13:38):
constantly get new phrase examples to feed into the translation engines.
So it's a weird thing. We're telling the people you
don't get a job anymore because you're not needed. Oh,
by the way, you're needed. We need to steal from you.
Oh but by the way, we won't even tell you.
And it's all based on this lie that we don't
need people. Um, and that lie is based on this need to pretend the AI is this freestanding thing, whereas we could instead think of it as just a
(14:00):
way of channeling value between people in a new
and better way. The free availability of real time translation
opens up a world of possibilities for travelers, for language learners,
for long distance lovers. But these technologies have invisible costs,
like the translators losing their jobs to a tool that
was trained on their work. And this kind of unpaid
(14:22):
labor is actually something all of us participate in every day without even realizing it. Here's Lisa again. The way that these
voice-activated assistants are being trained is through huge amounts of data. A bit of an unknown secret for many
people who have an Alexa is that every single time
you are talking to that device, it's being recorded and
(14:46):
being stored and going back to the cloud to train
all of the other Echoes around the world. So the users of Echo devices are providing free labor on behalf
of these massive organizations in order to train the system.
It's not the first thing we think of: that after purchasing an Alexa and using it to buy stuff online, every
(15:08):
time we interact with it, we're also helping Amazon improve and make more money. But framing our data in terms of labor helps us think about technology in new ways. And our Alexa use reminds Jaron of another science fiction movie.
The recent one that's really gotten to me is that
probably the most famous cautionary tale about computers was in
(15:29):
2001, Stanley Kubrick and Arthur C. Clarke's movie. And there's this computer called HAL that's this round thing that sits on the wall and just looks at you and talks to you. I'm sorry, I'm afraid I can't do that. And it ends up going berserk and killing people, and they have to deprogram it.
And the hot new gadget for the last few years has been this round thing that sits there and looks
(15:50):
at you and you talk to. I'm afraid I can't do that. These smart speakers and so on. And it's like, no matter how many cautions we put forward, people just fall right into it. It's astonishing to me. Of course,
Alexa is powered by artificial intelligence. It takes machine learning to understand what you say to it. But maybe
the bigger breakthrough has been our decision to let listening
(16:12):
devices into our homes. Yeah, I think that's true. I
didn't grow up talking to something in my house. It's
interesting when you read those articles like deep inside North Korea.
You know, the thing that journalists always write: there's a speaker in every house which projects the chairman's voice into the homes. That's always the shocking detail. And of course now we all have Alexas in our houses, you know.
(16:33):
I think it's interesting that Amazon was being delivered to us in boxes on our doorstep. That was the farthest they were going to get, right? And now with
Echo and Dot, these are devices that are inside of
our homes, that are on our countertops. We're at this
place where Alexa is now a part of the family.
And now we have this first generation of children growing
(16:53):
up with Alexas and other smart devices at home, interacting
with them, seeing their parents talk to them all the time,
and they're already used to this responsive technology. The giving-away-our-data piece is disconcerting, but there's another piece of our shifting relationship with technology, which is why I dragged Julian into New Jersey, to one of the smartest homes I know. And no, I'm not talking about IQ.
(17:17):
Let me ask you a question. What does your little brother look like? A little guy. So what would you say, what does Alexa look like? But where does she live? In space. Really? So every time you talk to Alexa, you're talking to space. Despite having an Alexa who lives in space,
(17:43):
My friend and her husband live in the suburbs with
their sons, who are almost two and five. When they
recently moved to New Jersey to fit their expanding family,
they did not scrimp on smart home devices. Yeah, we've
got two little kids. We both work. It doesn't bother
me that they know our habits. It makes it easier for us, like, to get stuff done. Like, I'm cool with Amazon
just sending me diapers because it knows when I need
(18:03):
diapers. For parents like my friends, devices like the Amazon Alexa,
Google Home and their myriad counterparts are genuinely helpful, which
is probably why over a hundred and eighteen million American
households have a smart speaker. That's half of the US.
And when you've got something so involved in your home
life that it helps with diapers and groceries, it's bound
to affect some other areas as well. Your dad said
(18:25):
there are three women in the house. Google, who? I don't know. Who's the third woman in the house?
And even bedtime is different. Hey, Google, tell me a story. Sure,
(18:48):
here's one from Nickelodeon. It was a sunny day and
Mr. Porter was visiting the farmer. He'll have Google reading
a story when he's in bed and we don't want
to read any more books. Do you hear your Google go off when you've closed the door? Yeah. To be clear, Google Home has not replaced real-life bedtime stories,
(19:10):
but it has enabled my friend's kid to get more
out of bedtime. He can keep asking for stories long
after his parents need to stop reading. But focusing on
the privacy component of these devices doesn't capture the full picture.
Not only can the rhythms of family life change in
response to a digital assistant, but so can kids' expectations.
And that's true for all of us, even those of
(19:31):
us already past early childhood development. We should think about
how we're affected long term by our expectation of seamless delivery.
You've got a good rapport with those kids, Karah. Neither of us have our own kids, but the editor among us does. I was curious for his take on Alexa joining the family. So I'd gone out to this
(19:51):
wedding a couple of years ago in Seattle, and it
was in this fancy hotel and there was an Alexa
there in the room, and Azzie and I went out
for dinner or something, and we left the kids with
a sitter, and the kids were mesmerized because they never
encountered one of these devices before, so they were watching
the babysitter interact with it, calling up music and whatever else.
(20:13):
But then she also ordered food, so they were just
floored by this. Then the next week, we were back
at home and I was watching my four-year-old just, like, stomping around the house, and she started barking, Alexa, pizza, and it was just so confusing. She immediately knew that there was this thing you could just bark
(20:34):
at and get food, and she wanted results.
But I have a conflicted feeling about all of this
because I had grown up in the States, very middle class,
and every couple of years we'd go to India and visit my relatives, who were a little wealthier. And we
went to a party once with one of my cousins
and I saw this kid who was super wealthy, and
(20:55):
he was yelling at his chauffeur. He was yelling at
his mom, he was barking at the maid that they had,
and it was just so gross. And when you see
this sort of entitlement in front of you and expressed
in this way, you don't want your kids growing up
with that, right, and you want everyone to be treated
as humans. And so for my daughter to be
(21:16):
stomping around barking what she wanted, you know, I don't
want that to be her way of speech. This kind
of reminds me of the movie Invasion of the Body Snatchers,
except the Mangesh version, which is that, you know, he
went out to dinner with his wife. He comes back,
his daughter is like Alexa pizza, and he's like, what
has Alexa done with my child? You know, she was
(21:39):
like the sweet little girl, and now she's like a
child that's aware of an Alexa based on one encounter. Right.
I feel like if an alien came down from space
and came into my apartment and saw me cooking dinner
and then saw me go, Alexa turn on Paul Simon radio,
the alien would be like, is she yelling at a
woman who's going to go turn on the music? Um, I think right now we are all, you know, Mangesh's
(22:02):
daughter included, participating in this life-altering moment of self-delusion where we're sort of collectively accepting smart devices in
our homes. We're not just accepting them, we're treating them
as human, right? And I think this is the slippery-ish slope of using voice assistants. More and more we're speaking to voice-activated devices as though they are family members,
(22:23):
and there have got to be some long term implications,
both emotional and psychological, of blurring the line. I agree,
and we're going to hear more from Jaron about that after the break. But we'll also speak to Poppy Crum again, who believes that we've barely scratched the surface of how these devices can change how we live for the better. Children
(22:48):
interacting with Alexa and Siri and other digital assistants is
particularly striking because of the developmental implications. But interacting with machines that understand and respond to us raises questions about who and what gets to be treated as a person. Here's Jaron.
I think the problem isn't the math or the computer
(23:10):
science algorithms. I think the problem is our framework for thinking about them tends to be machine-centric instead of human-centric, and it tends to create dangers for the
whole and to create a lot of confusion. And
a lot of it is because of this ideology of
thinking of the machine as being alive. And when we
remember that our AI inventions aren't alive and simply reflect
(23:30):
the inputs we give them, we can do a better
job at harnessing their power for good. You may remember
Kai-Fu Lee from the last episode. Before running Google China,
Kai-Fu worked at Apple, where he worked on early speech recognition. AI
is programmed by people. It is up to us to
remove the factors that we don't think are appropriate to
(23:50):
be considered in the decision from an AI. So if we want to eliminate sexual orientation from a loan decision engine, we can do that, or if we want to eliminate it
from a job application, we can do that. It's actually
better than people. You can't force people to completely ignore these sorts of things in their decisions. They can try,
(24:12):
but our brains are not separable in that way. Engines actually are. Being able to program a way out of our messier human biases is a big deal, but it's only made possible when we acknowledge that we control what an algorithm learns from.
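As a minimal sketch of Kai-Fu Lee's point, assuming a hypothetical loan-application dataset with the column names shown, withholding an attribute from training is a one-line decision. This is illustrative only: dropping a column does not remove correlated proxy variables or by itself guarantee a fair model.

```python
# Minimal sketch under assumed column names: withhold protected attributes from
# the training features so the model never sees them. Illustrative only; this
# does not remove correlated proxy variables or guarantee a fair model.
import pandas as pd
from sklearn.linear_model import LogisticRegression

applications = pd.read_csv("loan_applications.csv")  # hypothetical dataset
PROTECTED = ["sexual_orientation", "gender", "religion"]  # assumed column names

# Drop the protected attributes and the label; assumes remaining columns are numeric.
features = applications.drop(columns=PROTECTED + ["approved"])
labels = applications["approved"]

model = LogisticRegression(max_iter=1000).fit(features, labels)
```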
So if we get it right, just how good could our relationship with technology get? Here's
(24:32):
Poppy Crum again. I had a relative who, you know,
was at the end of life, and I was with
him for the last few weeks in the hospital. He
hadn't been speaking for a couple of days. I had
taken in an Amazon Alexa. I was using it simply to
play music. I was playing classical music. And all of
(24:55):
a sudden, he says, Alexa, play Al Green, and I was,
And you know, Poppy had always interacted with her uncle through classical music. It was their thing. And here he was, at the end of his life, requesting R&B, an interest she didn't even know he had. He wanted to hear Al Green and Sly and the Family Stone,
(25:15):
and I was like, nowhere near that. But the empowerment
the device allowed at a very vulnerable and sensitive and
important time, he smiled at the end of life. And it's
the access to memories, the access to that internal richness,
the things that might bring someone the most comfort are
(25:37):
things we don't all know. Amazon's Alexa really opened up great opportunities for what our relationship with our technology can be. Suddenly,
Alexa isn't just something that dims your lights or tells
you if it's raining as you rush out of the
but actually a device that can change profoundly how you
live and die. And Poppy noticed other ways, and Alexa
(25:59):
could have helped her uncle beyond playing Al Green. I
sat in a hospital room where I saw errors be made.
I saw information be captured incorrectly, written on the board one way, shared to a different nurse another, shared to a different doctor another. And I said, all of these different things happened, and with the right coordination, that
(26:24):
same device that just allowed my uncle to hear Al Green on cue could have also been a huge
part of improving not just his mental wellness but his
physical wellness. Because we're humans, we make errors, we make mistakes,
We're not good at integrating information all the time, and
our fallibility comes in places that technology can solve. I
don't want to discount hospital staff. They work very hard,
(26:45):
uh, and everything, but people are, sometimes they haven't had enough sleep, or they don't know someone else. You know,
People try to help at different points in time and
end up sometimes introducing errors and mistakes. With technology that's actually capturing or registering information for a user, there are obvious ways that it can help improve the interaction. To Poppy, the
true power of Alexa is not to respond to specific
(27:08):
requests like a super assistant. It's to monitor us constantly,
detecting patterns that we can't, more like a parent. And
as of now, as far as we know, that's not
what it does. What Alexa does right now is not
what would actually benefit us most from a healthcare perspective.
Right, Alexa is listening for a wake word. It's listening
(27:29):
for a particular cue. It's not holding that longitudinal data to learn our behaviors and such right now.
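To make that distinction concrete, here is a hedged, purely illustrative sketch of wake-word gating: audio is discarded until a local detector fires, and only the short request that follows is sent onward, so no long-term record accumulates. The detector and uploader below are made-up stand-ins, not Amazon's actual code.

```python
# Illustrative stand-ins only: a wake-word loop that discards audio until a
# local detector fires, then forwards just the short request that follows.
def wake_word_detected(frame: bytes) -> bool:
    """Placeholder for an on-device keyword spotter."""
    return frame == b"alexa"

def send_to_cloud(clip: list) -> None:
    """Placeholder for the network call that forwards a request clip."""
    print(f"uploading {len(clip)} frames for processing")

def run_wake_word_loop(microphone_frames) -> None:
    frames = iter(microphone_frames)
    for frame in frames:
        # Frames before the wake word are simply discarded, never logged,
        # so no longitudinal model of behavior can be built from them.
        if wake_word_detected(frame):
            clip = [next(frames, b"") for _ in range(4)]  # the request that follows
            send_to_cloud(clip)

run_wake_word_loop([b"noise", b"noise", b"alexa", b"turn", b"on", b"music"])
```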
Not because of a technological barrier? No, because of, I think, social and privacy barriers. And those barriers tend to erode in response to new technology. Twenty years ago, we would never
have believed we would summon strangers from the Internet and
(27:51):
climb in their cars either, you know. So we evolve
when the capacity, when the convenience is introduced, and the capability. Clearly,
technological innovation is only part of the equation. Becoming comfortable
with new uses for those technologies opens up new worlds
of possibilities in everything from how we listen to music,
(28:11):
to how we take care of each other. So let's
say we accept this new bargain and open ourselves up
to constant monitoring, what might the future look like? Companies
are looking at these things as ways of knowing not
just if someone is taking their medicine, for an aging population
or someone who's healing, but to know if actually they're depressed,
are they under mental stress as well? And that becomes
(28:34):
a great opportunity for autonomous living for elders, where the
caretaker knows a lot more about how well the individual
is healing and is doing at a particular point in time.
There's this real irony in all of these situations, where through more tracking of my information comes freedom and
(28:55):
you gain autonomy through the amalgamated data. There is an age-old belief that the more privacy we have, the more freedom
we have. But Poppy says it's time to rethink that relationship.
If you look at an elder who might otherwise be
in a care home but instead gains ten years of
autonomous living because you now have more ubiquitous understanding of
(29:16):
our mental and physical wellness, you have people having a
lot more freedom simply by having a richer understanding of
their internal experience and their personal data. Crucially, if we
do feel comfortable trading our privacy for more agency, we
need to be very sure we can trust the people
who get to see our data, because they're not just
(29:37):
seeing us naked, they're seeing under the hood: the physiological tells that betray our private emotions. For me, everything is
about transparency. No one should be tracked when they don't
know they're being tracked. How our technology interacts with us, whether we share information or not, our technology can't not know.
I think that's the real issue. We have to recognize
(29:57):
that this sort of cognitive sovereignty or agency that we
believe in is a thing of the past. It means
we have to redefine what that future looks like. It's
a future we can all participate in building, and it's one we're better off building as citizens with a collective
voice and long-term objectives, rather than as lone consumers
in search of the best deal, whatever the cost to
(30:20):
us in society. As Poppy says, there's a difference between
an Alexa waking up to respond to commands like play Al Green, and an Alexa that is always on, building a model of our behavior that knows us better than we know ourselves. But the crux of that difference is
more cultural and political than technological. Are we willing to
give up our poker faces and allow ourselves to be
(30:42):
read by sensors and algorithms in return for longer, safer,
happier lives like Poppy's uncle? Or knowing the history of
governments who have monitored and categorized citizens, should we be
doing everything we can to hit pause? Will our technology
become a safety net or a spider's web? Next episode,
(31:05):
we travel to Facebook's headquarters and investigate some of the
more dangerous corners of the Internet, and, knowing that AI
can both learn about us and imitate us, we take
a hard look at deep fakes and examine a world
where it's increasingly difficult to tell truth from fiction. I'm Oz Woloshyn. See you next time. Sleepwalkers is a production
(31:37):
of I Heart Radio and Unusual Productions. For the latest
AI news, live interviews, and behind the scenes footage, find
us on Instagram at Sleepwalkers Podcast, or at sleepwalkerspodcast.com. Special thanks to Bryony Cole. We had a conversation with Bryony that made this episode possible, and Bryony is the host of a fascinating podcast called Future of
(31:58):
Sex that's all about using technology to make our lives better.
Sleepwalkers is hosted by me, Oz Woloshyn, and co-hosted by me, Karah Preiss. Produced by Julian Weller, with help from Jacopo Penzo and Taylor Chikoin. Mixing by Tristan McNeil and Julian Weller. Our story editor is Matthew Riddle. Recording assistance this episode from Tofarel and Phil Bodger. Sleepwalkers is
(32:21):
executive produced by me, Oz Woloshyn, and Mangesh Hattikudur.
For more podcasts from I Heart Radio, visit the I
Heart Radio app, Apple Podcasts, or wherever you listen to
your favorite shows. Alexa, pizza.