
August 1, 2024 131 mins

On this episode, sexy lady robots Caitlin, Jamie, and special guest Olivia Gatwood discuss Ex Machina. 

Follow Olivia on Instagram at @oliviagatwood and buy her book 'Whoever You Are, Honey' at https://www.penguinrandomhouse.com/books/653524/whoever-you-are-honey-by-olivia-gatwood/ 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
On the Bechdel Cast, the questions asked if movies have
women in them, are all their discussions just boyfriends and husbands,
or do they have individualism? The patriarchy's effing vast,
start changing it with the Bechdel Cast.

Speaker 2 (00:16):
It's Caitlin and Jamie's Session one.

Speaker 3 (00:19):
Who's who? Am I the sexy robot?

Speaker 2 (00:24):
I mean, clearly you've chosen.

Speaker 3 (00:25):
I mean, well, I'm the sexy robot.

Speaker 2 (00:28):
I don't know, are you? Yes?

Speaker 3 (00:30):
And I have awareness? I think I don't know. That's
your job to figure out.

Speaker 2 (00:36):
God. I mean, truly. In the scene where Domhnall Gleeson
needs to decide if he's a good person and is
like, yeah, I was like, oh fuck, I would do that.
Like basically, yeah, I'm like kind of low-key an orphan,
so I'm probably a good person like him. Uh, I

(00:58):
love his damn bowl cut in this movie. Anyways, I'll be Caleb. Okay,
you be sexy robot. You look more like Alicia Vikander, honestly.
Oh my gosh, thank you so much. I'm just stating
facts here. And I'm as tall as Domhnall Gleeson. We're
both so big.

Speaker 3 (01:17):
Yeah. So Caitlin, Jamie, Session one, and, uh, can't wait
to see how this, how this culminates, you know.

Speaker 2 (01:24):
Yeah, I'm just gonna hit on you and say you
pass the Turing test.

Speaker 3 (01:29):
Oh my gosh. And then I'm gonna trick you and
leave you to rot.

Speaker 2 (01:33):
And you're right to do it, and you're right to
do it.

Speaker 4 (01:36):
And here we are.

Speaker 2 (01:38):
This is the Bechdel Cast. My name's Jamie Loftus aka
the Tester.

Speaker 3 (01:43):
Mm-hm, and my name is Caitlin Durante aka the
Sexy Robot.

Speaker 2 (01:47):
Yeah, and this is the podcast where each and every
week we take one of your favorite movies and look
at it from an intersectional feminist lens. And this is
not just an episode that we have gotten requests for
for many, many years, but also one of our famous
cursed episodes. This has only happened in the eight years

(02:07):
we've been on air, and can be counted on one hand,
where we actually did record an episode about this very
movie two years ago, and the audio on my end
was unusable and we didn't release it.

Speaker 3 (02:21):
Yeah, but we're back doing it again.

Speaker 2 (02:24):
I always do feel like it's meant to be. And
shout out to Alissa Nutting, who was on the first
run of this episode. No one's fault, but I do
always feel like anytime there's a lost episode, the eventual guest
is just meant to be, and today it feels very
much the case. So it's our Ex Machina episode of
the Bechdel Cast.

Speaker 3 (02:43):
That's right, and our guest is a writer based in California.
She's got books of poems. Her first novel, Whoever You Are, Honey,
just came out. She's the co-host of the Say More
podcast with friend of the cast Melissa Lozada-Oliva. It's
Olivia Gatwood.

Speaker 2 (03:03):
Hello, hi everyone, thanks for having me. Welcome.

Speaker 3 (03:08):
Thanks, it's session one for you.

Speaker 4 (03:11):
Yeah, I'm so nervous.

Speaker 2 (03:13):
Yeah, the tables have turned. Now Caitlin and I are
a two-headed Caleb.

Speaker 4 (03:19):
Does that mean I'm the sexy robot?

Speaker 2 (03:21):
Yeah, yes, congratulations.

Speaker 4 (03:23):
To shut off the power.

Speaker 3 (03:24):
Hello, the recording is lost. That's what happened last time.

Speaker 4 (03:29):
Yeah, that's so what happened.

Speaker 2 (03:31):
I think we joked about that at the time, where
it was like Ava was not pleased with my.

Speaker 4 (03:36):
Takes and she, yeah, just shut it all down.

Speaker 2 (03:40):
She did. I hope she'll be pleased because it is interesting,
like this is a movie that I feel like every
time I watch it, I always really like it. But
I feel like it's hard to watch this movie the
same way twice, or at least I have found that
to be true.

Speaker 5 (03:54):
Yeah, I feel like, especially as time goes on, as
like technology gets more advanced, the movie just ages in
really weird ways.

Speaker 4 (04:02):
But anyway, Yeah, it.

Speaker 2 (04:04):
Is interesting because we originally recorded, you know, before Ava
sabotaged me. We recorded an episode on this movie in
summer twenty twenty two, when AI felt more like a
nebulous thing, and now it, like you know, in the
ensuing two years, has become a real and present threat
to almost any industry in an immediate way, like we've

(04:26):
all gone on strike about it. Like, it's just, it's
interesting looking back at my original notes, like what a
vague threat it was, and now being like, oh no,
this is actually, it's here, and it did come for
my job. It's not good at it yet, thankfully.

Speaker 4 (04:43):
Hmmm, yeah, it's really bad.

Speaker 2 (04:45):
It's so bad, like Ava cannot bullshit the way that
we can bullshit so far, but let her cook, we
don't know. Yeah, So on that note, I mean, Olivia,
what is your relationship to this movie?

Speaker 5 (04:57):
Okay, so I saw, I went to see this movie
in theaters when it came out, which I think was,
what, twenty fourteen, ten years ago or something like that. Yeah. Yeah.
And it was a really bizarre experience because I was
like in my early twenties in college. I had just
moved away from home. I'd moved from New Mexico to
New York, and I was like in this big moment

(05:19):
of looking at my past and looking at my identity
in these new critical ways, as one does when they
go off to college. But I was like, you know,
analyzing all of my relationships with men, my relationship with myself.
And it was really bizarre because I watched this movie
and I walked away from it feeling this incredible closeness

(05:44):
to this robot in a way I hadn't really anticipated.
Like I watched it and was like I am her,
she is me, and I couldn't identify it, and it
felt really profound when I was twenty two. But I
think what I learned from it was that I felt
at that point like my entire identity had been built
and created by men, and it sent me on this

(06:07):
very long journey that I'm still on of gradually undoing
the many pieces of fabric of myself that feel sewn
by someone else. So that was sort of my initial experience,
and then I've just watched it like every two years
since then to just check in, to just check in
on how it's doing.

Speaker 2 (06:27):
Yeah, it is truly like the Turing test of where
you're at with anything that movie is exploring. The older
this movie gets, the more I'm like, everyone should watch
this movie every couple years, if only as like a
yardstick for their own growth.

Speaker 4 (06:41):
Hmmmm.

Speaker 2 (06:43):
Absolutely, Caitlin, what's your history.

Speaker 3 (06:46):
I don't think I saw this movie in theaters, but
I saw it within a year or two of it
coming out. There was a lot of hype around it,
and I'm someone who if I hear about hype about
a movie, I'm like, I have to see it.

Speaker 2 (07:01):
I have to be a part of the conversation.

Speaker 4 (07:03):
I can't be left out.

Speaker 2 (07:06):
It's a hype man.

Speaker 3 (07:07):
I just have a lot of FOMO, is what it is.
It's not about the hype so much. It's about I
want to be a part of the conversation.

Speaker 2 (07:15):
Your finger's on the pulse. You see every movie that
comes out, and that's just a fact pretty much.

Speaker 3 (07:21):
Except I still haven't seen The Bikeriders.

Speaker 2 (07:22):
Oh my god.

Speaker 3 (07:25):
But otherwise I've seen every movie.

Speaker 4 (07:28):
What is The Bikeriders?

Speaker 2 (07:30):
Exactly! What is The Bikeriders? Oh gosh, it is what
it sounds like, I think, unless there's some sort of
twist in The Bikeriders where they're not riding bikes.

Speaker 5 (07:39):
I did just watch a documentary about Lance Armstrong today.

Speaker 3 (07:43):
It's the other kind of bike. It's the motorbike. Oh okay,
the motorcycle Boys. And Tom Hardy's there, Austin Butler's there,
and that's really all I know about it.

Speaker 2 (07:55):
Oh, Jodie Comer's there. I love Jodie Comer. Yeah. I
recently had a conversation with someone for my other podcast where
it was like a really interesting conversation, but by far
the most shocking and random thing the person told me
was that they love and forgive Lance Armstrong. And it

(08:15):
was like it just I don't know, I haven't thought
about the Lance Armstrong scandal in a long time, but
it was just like so unrelated to what we were
talking about, but he felt it so strongly, and I
was like, I have to leave it in, like I'm not going
to cut this out of the tape, because clearly he
needed to let me know.

Speaker 5 (08:33):
It's going to become a very important part of the
discourse because of this docu series.

Speaker 4 (08:37):
But also I just finished it.

Speaker 5 (08:39):
And I was literally being like, Okay, whatever you do, Olivia,
just don't bring up Lance Armstrong on this podcast, do
not do it. And then I did it in the
first five minutes, so it's clearly impactful.

Speaker 2 (08:50):
He's on the tip of everyone's tongue. Wait, where is
this now? All of a sudden this is a promotional
explosion. Is it, like, on HBO? Where is it?

Speaker 4 (08:57):
It's on Netflix?

Speaker 5 (08:59):
It's two, it's a two-part thing on there, each
an hour and a half. It goes into everything. I
personally don't love and forgive Lance Armstrong.

Speaker 2 (09:07):
I don't think I do either.

Speaker 5 (09:09):
Yeah, I'm not like particularly angry at him, but I
think he's I don't know, he's just like a deeply
disturbed person, I think, and I think he battles with that,
but I don't know. I just don't feel love and
forgiveness towards him. I suppose I feel a bit scared
of him.

Speaker 2 (09:26):
I think it's weird, like the intimate connection you feel
with someone having bought their bracelet at eleven years old. Like, yeah,
that was ultimately the thing where it's like I trusted you.

Speaker 5 (09:38):
Yeah, yeah, I think that's what it was. Okay, and
then I'll stop, I promise, I'll stop. But I think what.

Speaker 4 (09:44):
It was was that it's the depth of the lie.

Speaker 5 (09:48):
Because everyone in the sport was doping, so it's not
that he doped. It was the depth with which he
got angry about people accusing him.

Speaker 4 (09:58):
It was like suing people, slandering people.

Speaker 5 (10:01):
It was that aggression that I think is pretty hard
to see and then be like I forgive you, you know, right, yeah,
because it's pretty intense.

Speaker 2 (10:12):
Man, I've got to watch it.

Speaker 3 (10:14):
I mean, connecting it back to the movie. He lied,
and then there's that part where Ava is like, he's
not your friend. Nathan lies. He lies about everything. Nathan, Yeah,
there you go.

Speaker 5 (10:30):
And Oscar Isaac is a bit of a Lance Armstrong.
They are certainly cut from the same cloth.

Speaker 3 (10:36):
There's a type.

Speaker 2 (10:37):
Anyway.

Speaker 3 (10:37):
So I saw the movie oh right, I'm sorry, probably
like twenty fifteen or so, and I thought it was
really well done. I thought it was cool production design.
I don't think I saw it again after that, though,
until we recorded the episode that was lost.

Speaker 2 (10:56):
Two years ago for sure, Yeah, for sure.

Speaker 4 (10:59):
Yeah, I don't know.

Speaker 3 (11:00):
It's, it's a little too... It's not a...

Speaker 2 (11:02):
Romp, so certainly not.

Speaker 3 (11:04):
I usually don't go back and revisit movies that are
not romps. But I think it's a good movie, and
I think it's gonna generate a fascinating discussion among us.

Speaker 2 (11:14):
It's an awesome movie for this specific podcast, there's no doubt,
there's no denying.

Speaker 3 (11:19):
Absolutely.

Speaker 2 (11:20):
I did not see this movie in theaters when it
came out, I think, so I watched it the year after.
But I actually, I saw this movie in theaters for
the first time because, as an AMC Stubs member, they
re-released a lot of A24 movies in
theaters for like one or two days only in the
last couple of months, and I was fortunately free the

(11:43):
day Ex Machina was screening, and so I got to
see it in a theater for the first time. I mean,
I think I've probably seen this movie five times now,
maybe more. But Olivia a similar deal like where it's
like every couple years, I'll rewatch this movie and I
really like it. Alex Garland on the whole, I actually
don't really know, because I did not see Men or

(12:06):
Civil War. I like Annihilation, but Ex Machina, I mean,
it's like definitely my favorite of his that I've seen
so far. And I don't know. I had in my
notes from two years ago and I was like, wow,
good for her, that's funny. I had it as like
alternate title my Weekend with Elon, and that is true. Yeah,

(12:26):
but I think this movie is like really good. It's
really good. I feel a little differently about it every
time I watch it, and so this is the encapsulation
of how I felt about it on this watch. But
similar to you, Olivia, I felt very connected to Ava,
and the more I watch, I mean the connection to Ava,

(12:46):
and I know this is like so in conversation with
your novel, Like the first time I watched this movie.
I don't know if I said this in our original
version of the episode. The first time I watched this movie,
I fell for it, fell for everything it was possible
to fall for totally, which I think is why I've
rewatched it so much, because when you rewatch it, I

(13:07):
don't know if it's like I mean, it's partially that
I'm rewatching it and partially that I am ten years
older than when I first saw it and as time
goes on. You know, originally I was shocked at the
ending and really intrigued and like, oh now I for
real love Ava, you know when you see what she
does at the end of this movie. But I was

(13:29):
genuinely shocked and pulled in by all of the choices
this movie is making to convince you that Caleb is
a good guy. And now it's like, looking back on
my first viewing, it's almost like a little bit embarrassing
to be like whatever, twenty two year old me was
too drawn in by the good guy aesthetic of Caleb's
character to realize that he never considered taking Kyoko with him.

(13:52):
So he's a bad guy, like and it just like
grows with you in the way that good movies do.
And I'm really excited to talk about it.

Speaker 3 (14:03):
Same. Yeah, let's take a quick break and then we'll
come back for the recap. And we're back, and here's
the story of Ex Machina. We meet Caleb Smith played

(14:28):
by Domhnall Gleeson. He has just received news that he
won something, which turns out to be a weekend getaway
with his boss, who is an uber rich tech mogul
who we will find out created a company called Blue Book,

(14:52):
which is basically Google.

Speaker 2 (14:54):
I'm almost certain this is the first movie I saw
Oscar Isaac in.

Speaker 4 (14:58):
I think that's true for me too, maybe me too.

Speaker 2 (15:01):
Yeah, what a strong debut, I know that. I'm pretty
sure if I'm remembering correctly, I had seen Domhnall Gleeson
before in an episode of Black Mirror, which you could
argue this movie is a very long episode of Black Mirror.

Speaker 4 (15:16):
True.

Speaker 5 (15:17):
Yeah, And they love guys like that for Black Mirror.
They just love guys like that. I don't know why.

Speaker 2 (15:23):
His eyes are a little too far apart. They're like, yes,
we're gonna light him in aggressively blue lighting and it'll
be haunting.

Speaker 5 (15:33):
And Oscar Isaac just has incredible range, like seeing this
his first and then seeing kind of who he is
and what he can do. I mean, I think it
kicked off like my deep parasocial romance with him.

Speaker 3 (15:47):
Absolutely right there with you, right there with you.

Speaker 2 (15:50):
He is a star, and that's the truth. And like, god,
he's so, he's so evil, but he's so hot in
this movie.

Speaker 5 (15:57):
It's really amazing, ridiculous. Yeah, the merging of those two things.
I will say I had a conversation with my friend
once who builds self driving cars. And he was explaining
to me the difference between an engineer and a founder
and how they are two very different personalities in the
tech world. And he said that one of his qualms

(16:19):
with this movie was that Oscar Isaac is much more
of a founder, which he is in the movie, but
he's much more of a founder than like a coder,
Like those are two different things.

Speaker 2 (16:31):
That's fascinating. So it's like the Jobs versus the Wozniak
kind of deal.

Speaker 5 (16:36):
Yeah, yeah, and that like this guy wouldn't be like
living in the middle of nowhere remotely, like he would
be somewhere very forward facing. So I thought that was interesting,
but I still deeply love the character.

Speaker 2 (16:48):
That is kind of fascinating to me. It's like, I
guess I, over the many viewings of this have like
filled in the blanks of my mind to my own
satisfaction of like, why would someone with this much raw
charisma choose to live in the woods. Yeah, And
then sometimes I'm like, well, a lot of times, like
maybe more often than not, people with charisma are very insecure.

(17:10):
Maybe he's leaning into it, but yeah, that, yeah, I've
never heard that like that clear of a delineation and
that makes sense.

Speaker 4 (17:16):
Yeah, and he's not alone.

Speaker 3 (17:18):
No, well, he surrounded himself with many sexbots of
his own creation.

Speaker 2 (17:25):
Yes, his girlies, his crew, his girls are there. Yeah,
he's Gru. These are his girls. He is Gru.

Speaker 3 (17:34):
Okay, So Caleb is delivered via helicopter to his boss's
very large, very remote estate slash research facility. I
guess I was thinking, like, oh, he probably doesn't live
there full time. He probably just like goes there when

(17:56):
he needs to, like, think and make other sexbots.

Speaker 4 (18:01):
See.

Speaker 2 (18:01):
I love that everyone has their version of like, how
does this hot guy who's good at talking to people?

Speaker 4 (18:08):
Why is he here?

Speaker 5 (18:09):
Everyone's like, if we were in a relationship, he would
come to see me. I think everyone's just writing this
story in their head. He would definitely fly to LA
and hang out, and then he would have to go
sometimes and that would be okay. I mean he has
to work, that's fine.

Speaker 2 (18:22):
Yeah, and when he was away, he would be doing normal.

Speaker 3 (18:25):
Stuff, right, And he built me a replica of himself
that I get to hang out with when he's off
doing other stuff. Anyway, So we're at this very remote place.
Caleb arrives and enters this place that belongs to Nathan

(18:46):
played by Oscar Isaac, who he meets shortly thereafter. Nathan
explains that in this house, some rooms are open, others
are off-limits, and a key card that has been
issued to Caleb will let him know which rooms he
does and does not have access to.

Speaker 2 (19:06):
This trope, I know, goes back probably centuries, but I
always think of it as like Beauty and the Beast,
the west wing. Like, you will not go to the
west wing, because that's where the thing is. And yeah,
you're like, oh, he's, you know, Beast-ing.

Speaker 3 (19:22):
Yeah exactly. And then he's like, but this huge library,
you can go to it as much as you want.

Speaker 2 (19:28):
Yeah. I do feel like in some ways, yeah, Caleb
and Nathan, at least on the surface, are kind of
Beauty and the Beast coded this outsider enters the castle
and you can only go to certain sections.

Speaker 3 (19:42):
Oh, you're right, and the other inhabitants aren't what you
think they are.

Speaker 2 (19:47):
Yeah, I guess it's kind of a reverse where like
you think they are people but they're not. But it's,
it's flipped from the other one.

Speaker 4 (19:55):
Yeah.

Speaker 3 (19:56):
Anyway, So Nathan then makes Caleb sign a pretty ominous
non-disclosure agreement. Fifty Shades of Grey

Speaker 4 (20:06):
Vibes, yes, right, very much. You will be on birth control.

Speaker 2 (20:12):
Yeah, God, I feel like I wiped that from my mind,
but he does say that, Yeah, it truly does.

Speaker 3 (20:19):
And then Nathan tells Caleb that he has developed AI
and he wants Caleb to help him test it via
the Turing test, which is not a test where two
people of a marginalized gender speak to each other about
something other than a man. It is. Instead, it's a

(20:41):
different test. The Turing test is passed when a human
interacts with a computer, but they don't know that they're
interacting with a computer because the computer's AI is advanced
enough that it seems human.

Speaker 2 (20:55):
Right, And that's the funniest part is like that the
people in this movie are also not doing the Turing test.
They're doing a secret third thing, Like they're not doing
the Bechdel or the Turing test. They're just doing Oscar.

Speaker 5 (21:06):
Isaac's little idea because he's like, the Turing test isn't
advanced enough for me. Like, he's like, if
this was about that, you would have fallen for it already.
So yeah, right, you see the computer. He's like.

Speaker 2 (21:19):
I'm, this is the 4D chess version of the
Turing test.

Speaker 3 (21:22):
Yeah, this is the Nathan Test, and yeah, the AI
that Caleb will be performing the test on turns out
to be Ava, played by Alicia Vikander, who is visibly
a robot, but she has a womanish form, a very

(21:45):
lifelike human face, a female-sounding voice. She is a
lady robot basically.

Speaker 2 (21:54):
Alicia Vikander is so good in this movie. She's amazing,
so good mm hmm.

Speaker 3 (22:01):
Yeah, And she and Caleb meet and chat for a
bit as Nathan observes via monitors from another room. They
have their like first session together, and then afterward the
two men talk about how awesome Ava is, which something
something the reverse Bechdel test. I don't know if that

(22:23):
means they failed it or they passed it, because I
keep forgetting what the reverse Bechdel test.

Speaker 2 (22:29):
Is, right, and that connects to like how you perceive Ava.

Speaker 3 (22:34):
Right, Yes, and we'll talk a lot about that.

Speaker 2 (22:37):
Yeah.

Speaker 3 (22:37):
But that night, Caleb can't sleep, so he turns on
the TV and sees what appears to be live footage
of Ava sitting at a desk or something. It's weird,
but then there's a power outage and Caleb is temporarily
trapped in his room. The power comes back on, he

(22:58):
finds Nathan, who is drunk. It's established that Nathan drinks
a lot of alcohol throughout the movie.

Speaker 2 (23:08):
He is mostly drinking light beer and not to be like,
I know, not to like stress test someone's alcoholism or
anything like that, but it's like he's like I drink
four light beers and I can't stand up.

Speaker 5 (23:22):
And then he's like wasted, I know, And he never
really drinks anything harder than that. I'm like, he's got
to be putting them back with the condition he's in.

Speaker 2 (23:30):
Just speaking as a member of the Irish American community,
I have questions. Like, yeah, four Coors Lights and you
don't know if you have your ID or not, buddy.

Speaker 3 (23:42):
Yeah, maybe he's drinking more than we see. There's also
a few scenes where he's drinking.

Speaker 2 (23:47):
What appears to be because he's taking a lot of shots.

Speaker 3 (23:49):
Right, Yeah, he's doing vodka or some clear.

Speaker 2 (23:53):
I just love this scene where it's like he's at
his lowest and he's next to like four Coors Lights,
and you're like, yeah, if that's your lowest, you can
be healed.

Speaker 4 (24:01):
Yeah I could fix you. I can fix you.

Speaker 3 (24:04):
Well, here's my theory. Kyoko had come and cleaned up
other bottles that he had drunk earlier. That's my theory.

Speaker 4 (24:13):
She kind of erases the proof.

Speaker 3 (24:16):
Yeah, because she's always cleaning up after him. He has
clearly programmed her to be his maid in addition to
his sex bot.

Speaker 5 (24:26):
Which the first time I saw this movie, I did
not know she was a robot. Me too. That's another
thing I fell for. Now I'm like, why did
I fall for it? She's so obviously a robot.

Speaker 4 (24:36):
But I really did not know that.

Speaker 2 (24:39):
And I was like, on the second viewing, and I wonder,
we'll never know. But in the first, like, I feel
like I would try to obfuscate that in subsequent
viewings, because it is so obvious, where it's just like, oh,
this woman I see who is always paying attention but
also has this bizarre thousand-yard stare. I wonder if
she's a robot.

Speaker 3 (25:00):
Like on.

Speaker 2 (25:03):
Well, I'm glad we all fell for it. Honestly, it
puts me at ease.

Speaker 3 (25:07):
Yeah, so Nathan's like, don't even worry about the power outages. Whatever.
The point is here that things are starting to feel weird.
And then the next morning, a sexy woman comes into
Caleb's room and brings a tray of food. We will
learn that this is Kyoko.

Speaker 2 (25:25):
Meanwhile, us ten years ago are like, hmm, that's weird.
I wonder what she's like.

Speaker 4 (25:30):
We're like, oh, sexy woman.

Speaker 3 (25:32):
Yeah, what is this real human person all about? Anyway?
Kyoko is played by Sonoya Mizuno. She brings this tray
of food to Caleb. Then Caleb has another session with
Ava where she wants to know more about him, this
time because he had previously asked her a bunch of questions.

(25:56):
And during their session there's another power outage, and Ava is like,
by the way, Nathan is not your friend. He lies
a lot, he cannot be trusted. And Caleb is like,
what what do you mean? What's going on? Later that night,
at dinner, Kyoko comes back. She is serving Nathan and Caleb.

(26:21):
Nathan makes a remark that she doesn't speak any English.
He also berates her because she spills wine. So
we get a glimpse of how Nathan treats women or
people who we perceive as human women.

Speaker 2 (26:35):
Right, it is fascinating because it's like we will never
know because of spoiler alert what happens to Nathan, but
how Nathan treats women who he has to respect, or
like, how Nathan treats women outside of this environment,
I guess I'll say. Yeah, we'll never know. I suspect

(26:57):
it would be different. It just would be so fascinating,
and not encouraging anyone to make a sequel to a movie,
make no mistake, but like, I would be so interested
to just know how Nathan treats people outside of this environment,
and like, is he less arrogant or more so? Like,
I don't know, there's so many flavors of this kind

(27:20):
of like tech guy personality, but you only really get
a sample size of how he treats one person who
he knows to be a human, but as we sort
of later learn, he is evaluated to not really be
a human based on his life.

Speaker 4 (27:36):
Right, Yeah, we need like.

Speaker 5 (27:37):
A tell all from Nathan's college ex girlfriend, you know,
like an Andrew Huberman. We need an Andrew Huberman moment
for Nathan. We just need the girlies to come
together and talk about it.

Speaker 2 (27:50):
People are so judgmental and horrible when someone's ex girlfriend
sells a tell all memoir, But they are critical historical documents.

Speaker 4 (28:00):
Huge, yeah, hugely important.

Speaker 3 (28:02):
Yeah yeah, sure, yeah.

Speaker 5 (28:03):
That was one of my critiques about the Lance Armstrong documentaries.
They did not ask his girlfriends enough questions. Really, Oh yeah,
I wondered that when it came out. I was like,
what did Sheryl Crow know? And nobody, still, still nobody
asks what Sheryl Crow knew? You know, like, because you
have to know that your partner's like hyped up on steroids,
but nobody asks.

Speaker 2 (28:25):
See, this is fascinating, because I watched the Sheryl Crow
documentary that came out two years ago, and she avoids
the subject of Lance Armstrong and I was like, yeah,
we need to know. The people need to know.

Speaker 3 (28:39):
I bet she signed an NDA like Nathan makes Caleb sign.

Speaker 2 (28:45):
Definitely what I don't understand about NDAs, especially when the
person you signed an NDA with is diminished in the
public eye, financially, all these things, why not just defy it?
This is me being naive, but I'm just like, at
some point, if someone is significantly less powerful, or if

(29:06):
you're in a more powerful position than when you signed it,
why not just say, fuck it? I mean people do, but
I just wonder, like, how often do you hear people
going to court over Hey, this NDA I signed twenty
five years ago about something illegal I was doing, Like
surely there's a way to pole-vault over it, but
maybe not. I don't know. I haven't signed enough NDAs.

Speaker 5 (29:28):
Humans are like so silly in that way. Like I'm
always like looking at the lines in the road and
I'm like.

Speaker 4 (29:33):
We're so silly. We're just driving between these lines because
they're painted there. You know. I'm so glad we are,
but we're so silly. We just do it.

Speaker 5 (29:42):
We just are like, ooh, there's a white line that
is actually not an obstruction at all.

Speaker 4 (29:46):
But I have to stay on this side of it.

Speaker 5 (29:48):
You know. We're just like little rule followers at the
end of the day, which is related.

Speaker 4 (29:53):
To this movie.

Speaker 2 (29:54):
Yeah, we're binary little freaks. And that probably has
everything to do with why all three of us
fell for everything this movie wants us to fall for
the first time we watched it. It's just,
I don't know, this movie makes me feel so ridiculous.
I like it so much.

Speaker 5 (30:10):
I know, it felt like a real love story the
first time I saw it, And then you watch it
over and over and you're like, there is no conversation
happening between these two people, like that could serve as
any real meaningful connection totally.

Speaker 3 (30:24):
But so few movies like romantic movies or you know,
rom coms, anything like that, that position two people falling
in love.

Speaker 2 (30:32):
They don't have.

Speaker 3 (30:32):
Any substantial conversations either that would lead to any audience
to believe, oh, these people are romantically compatible. Like we
just never see that on screen, except for maybe in
the movie Before Sunrise. But like other than that.

Speaker 2 (30:45):
Whoa, we have to do, we just have to do
the Before trilogy, like, as a month on the podcast.
We really do. We simply must.

Speaker 3 (30:54):
But anyway, sorry, Nathan's a piece of shit to his fembots
is the point here. Then Caleb has a third session
with Ava. It seems like he has one every day
and that he's there for like a week. In this session,
they talk about hypothetically going on a date, and then
she puts on some clothes and a wig and it

(31:18):
makes her seem even more human, and then she's flirting
with him and she says that she can tell that
he's attracted to her, and Caleb is like.

Speaker 2 (31:30):
No, I don't know, heh.

Speaker 3 (31:33):
And the point is they're vibing, or so we think.
And then Caleb goes to Nathan and He's like, why
did you give ai sexuality? Why did you give it gender?
Like did you do this to like throw me off?
And Nathan is like, no, she legit likes you. And

(31:56):
by the way, if you wanted to have sex with her,
you could because I gave her a robot vagina.

Speaker 2 (32:03):
Yeah, he's like, by the way, there is a computer pussy,
not that you asked, and it's okay. And she would
like it, he says, she would like it. Can you
imagine a person you should trust less about, like, how
to find the clitoris, any sort of clitoral stimulation? Yeah,
like absolutely not. No. Like, Nathan's not the guy you're

(32:27):
supposed to like it. Yeah, beside his, like, pyramid of Coors,
like come on and his Jackson Pollock painting. Oh my god, yes,
I can't believe that I didn't. I mean whatever, you
know that the movie clearly telegraphs that Nathan is a
piece of shit, But the whole Jackson Pollock thing, yeah,

(32:47):
like recently out of college, Jamie was like, ah wow,
it makes so much sense because you just don't have
a better understanding of the best regarded male artist of
the twentieth century. Yeah, you're like they're fine, They're fine,
most of them are fine.

Speaker 3 (33:07):
Yeah, okay. So then Caleb, but now that the idea
of like going on a date with her and all
this stuff is in the air, he's fantasizing about kissing Ava.
Right around now is when Ava reveals that it's her
causing the power outages because of her like battery charging

(33:28):
station or something.

Speaker 2 (33:30):
It's good old-fashioned sci-fi yada yada, which I love, right.

Speaker 3 (33:34):
Caleb also confronts Nathan because he realizes that he didn't
actually win any competition. That Nathan specifically selected him to
perform the Turing test, but it's not for the reasons
that Caleb realizes at this moment. He's just like, oh, like,
you handpicked me because I'm so good at coding. Thank

(33:56):
you so much. And then then Caleb sees a weird
interaction between Nathan and Ava where Nathan rips up one
of her drawings. It's established that she makes a drawing
every day, so we see Nathan be really cruel to
Ava in this moment.

Speaker 2 (34:14):
I'm always surprised by how Caleb reacts to the first, like, beautiful spirograph illustration that Ava gives him. One of the consistent reactions I have to the first illustration is like, why wouldn't you say, wow, that's so awesome? Like it's true, it's true. And also, this is a deep

(34:39):
seated family memory. But every Christmas... if you're a parent and you don't do this, here's a hot tip. My mom had a brilliant idea to let her and my dad sleep until seven a.m. on Christmas morning. She
would have Santa quote unquote in case you didn't know,
Santa would leave us a present that was very interactive

(35:02):
at the foot of our bed. So when my brother
and I inevitably woke up at the crack of dawn,
there would already be a present from Santa. And the
rule was you could open the present in your room,
but you had to wait until seven thirty to go downstairs.
And so it would always be like something that would
keep you distracted, which is a brilliant idea. I don't

(35:24):
know where she got it from.

Speaker 4 (35:25):
That's so smart.

Speaker 2 (35:26):
It was awesome. So every year we had something, it
was like a book or an art project or whatever
it was. And like one year, I remember I had
like a little spirograph project, which was basically whatever the
human version of what Ava is doing and also clearly
connected to the Jackson Pollock motif in this movie, and

(35:47):
I always think about that every time I watch this movie.
So anyway, as parents, if you need your kids to
shut the fuck up until seven thirty on a holiday morning,
there's a hot tip.

Speaker 4 (35:59):
I love that, good one. It's really beautiful.

Speaker 2 (36:01):
I'll never forget.

Speaker 3 (36:02):
But yeah, to your point, Jamie, Caleb looks at it and he's just like, what the fuck is that? She's like, you know, I thought you could tell me.

Speaker 2 (36:10):
A common reaction to Jackson Pollock.

Speaker 3 (36:14):
Anyway, So then there's a weird situation where Kyoko starts
undressing in front of Caleb and he's like, no, no, no,
don't do that. And then Nathan comes in drunk once again,
and he and Kyoko do a choreographed dance which I
would say is not as good as the dance that

(36:36):
Megan does in the movie.

Speaker 2 (36:38):
Megan, well, but Kyoko walked so Megan could run. Be serious.

Speaker 3 (36:43):
Well, yeah, of course, yes, true.

Speaker 2 (36:45):
I would say. I know for a fact that the
one thing I saw of this movie before I saw it,
because I didn't see it during the you know, theatrical release,
was the clip of Oscar Isaac dancing. I saw this
clip before. I feel like this clip went viral before
I had any idea who Oscar Isaac even was, and

(37:06):
what a.

Speaker 3 (37:06):
Good way to see him for the first time. Also,
I take it back, it's a good dance.

Speaker 4 (37:10):
It's a really good dance. Yeah, and it's weird.

Speaker 5 (37:12):
The lighting is good, the choreography is good, the music
is fun.

Speaker 4 (37:16):
I feel like it's also.

Speaker 5 (37:17):
Like maybe like a move that if it happened in
a film now, you'd be like, Okay, that's like quirky
or whatever.

Speaker 4 (37:23):
But in twenty fourteen felt.

Speaker 5 (37:25):
Really fresh for like this tonally very stark, like very
clinical movie, kind of eerie movie, to have this brief
lapse into not necessarily joy, but like a sinister joy.

Speaker 4 (37:40):
Totally, there was something really new about it. I felt
like at the time.

Speaker 2 (37:44):
I totally agree, and I feel like, whatever, in the
last ten years, the way that we view tech founders
versus now is so different, and I feel like even
at the time, I don't know, I can't say for sure,
but I feel like ten years ago when I saw
this movie for the first time, you see it and

(38:04):
you think of every tech founder as like not a dancer,
you know, like you think of them as like Bill
Gates pole vaulting over a computer chair, and like, when
I saw this movie for the first time, I didn't necessarily,
I don't think think of tech founders as inherently sinister

(38:26):
or potentially sexy. Totally. Everything about that clip defies, like... because you just view, in my mind, like, a dork on a computer. Yeah, and you just don't think of them as having Oscar Isaacs. But that was the end.

Speaker 5 (38:41):
I thought, Yeah, I feel like a lot of tech
founders slash engineers could watch this movie and be really
offended by being portrayed as villains. But I really think
they should be grateful. I think this is a reputation
renovation for them. Yes, you know, because I'm not watching this movie thinking, ew, I don't want to have sex

(39:03):
with that man.

Speaker 4 (39:04):
That is not what I'm thinking. You know.

Speaker 2 (39:06):
Yeah, you're like, Wow, the ways in which I want
to have sex with this man have become infinitely complicated. Exactly.

Speaker 4 (39:14):
Yes, am I unpacking those desires?

Speaker 2 (39:16):
Absolutely, but what I say, no, absolutely not.

Speaker 3 (39:23):
Boy. Well, in the next session with Ava, she asks
Caleb what will happen if she fails the Turing test,
Like will she be shut off? And Caleb's like, I
don't know, and she's like, well, I want to be
with you. Do you want to be with me? And

(39:44):
he's like, I don't know. Then we cut to Nathan
revealing to Caleb that Ava is one of many AI
models that he created. They keep getting better and better,
and he thinks the next version after Ava will be
the real deal. That he plans to reprogram Ava, which
will effectively wipe her consciousness and kill her more or less, which,

(40:08):
based on her line of questioning a few moments earlier,
is something that she's worried might happen. Then, after Nathan
passes out he is drunk again, Caleb steals his key
card so that he can access Nathan's private files, where
he sees all of this footage of various woman robots

(40:32):
that Nathan has created, one of whom destroys herself trying
to get out of this prison. Other ones are sort
of just not even mobile, it seems. Then Caleb goes
and finds all of these actually physical models of the
older fembots. Kyoko walks up to him and she peels

(40:55):
her skin away to reveal that she has been a robot.

Speaker 2 (40:58):
This whole time, and then imagine the shock the first
time you see this movie and at no other time.

Speaker 4 (41:07):
Yes, it's true.

Speaker 3 (41:09):
Yes, And now Caleb is concerned that he might be
a robot and doesn't even know it. So he cuts
himself to see if he'll bleed, which he does, and
so he's like, I'm a real boy to quote Pinocchio,
Pinocchio and track.

Speaker 2 (41:28):
Yeah, well, speaking of someone walked so the other could
run, Pinocchio and track. Yeah, yeah, yeah, yeah.

Speaker 3 (41:34):
Okay. So then there's another session between Caleb and Ava
where he tells her his plan to rescue her. He's
going to get Nathan drunk again, steal his key card again,
and then program the security systems to lock Nathan inside.
He just needs Ava to trigger a power cut at

(41:56):
ten o'clock and I don't know if that's a M
or PM, but ten o'clock.

Speaker 2 (42:00):
And at this point I feel like I was so
programmed to trust an insecure man with a bowl cut
that I'm so embarrassed to watch this movie repeatedly because
you're like, surely his intentions are pure.

Speaker 5 (42:15):
Yeah, You're like they're in love, that he wants her
to be a person in the world. He believes in
her liberation. They're gonna do this and he's gonna free her.
Go Caleb. Is that what a bowl cut looks like?

Speaker 3 (42:27):
Sorry, I know this is not well.

Speaker 2 (42:29):
I don't know, the bowl cut was a little fucked up. It's more like, like, an adult Justin Bieber. I guess, yeah,
Justin Bieber's an adult now, but like you know, at
the time, I mean, yeah, yeah, yeah, yeah.

Speaker 3 (42:41):
Anyway, so he has this whole plan to get Ava
out of there, but oh no, Nathan reveals that he
knows all about Caleb's little plan to help Ava escape,
and that the real test was to use Caleb as
a pawn to see if Ava could use things like
self awareness, imagination, empathy, manipulation, sexuality to convince Caleb to

(43:06):
help her escape. Because if those things are not AI,
then what is. So Caleb is like, oh no, just kidding,
I'm one step ahead of you. He anticipated all of this,
and he already went behind Nathan's back and reprogrammed the
security protocols.

Speaker 2 (43:26):
Which is the first and only impressive thing that Caleb
does in this entire movie, by my accounting, because I
hadn't seen this movie in two years and that was
the only thing that I kind of forgot.

Speaker 4 (43:38):
Yeah, me too.

Speaker 2 (43:39):
Caleb is two steps ahead of him, and that is impressive.
Everything else he does is pretty naive and foolish, but like,
that was pretty good. He had me there.

Speaker 4 (43:49):
Yeah.

Speaker 5 (43:49):
I continued to fall for this plot point each time,
which to me means it's.

Speaker 2 (43:52):
A good twist for sure, right, because it's not even unbelievable, Like, yeah,
Caleb is smarter than Nathan is giving him credit for
for sure.

Speaker 4 (44:01):
Yeah.

Speaker 3 (44:02):
So Caleb going behind Nathan's back like this obviously infuriates Nathan,
so he punches Caleb and knocks him out. But this
reprogramming that he's done unseals all of the doors following
Ava's power cut, allowing Ava to come out of her
room for the first time ever, we're led to believe,

(44:23):
and she attacks Nathan. Then Kyoko stabs him in the back. Also,
there's a brief interaction where Ava approaches Kyoko and whispers
something in her ear, which we don't hear, so it
doesn't pass the Bechdel test, but they seem to
be conspiring together. Now, Yeah, Kyoko stabs Nathan in the back.

(44:46):
Ava stabs him in the gut, and he bleeds out
and dies. Then Ava takes the human like skin off
of a decommissioned fem bot, puts on some clothes, gives
herself a little makeover, a little glow up.

Speaker 2 (45:03):
Of all of the makeover scenes in cinema history, this is one that I'm like, you have to. Yeah, I mean, if this isn't earned, what is? Yeah, for sure. If not stealing your fellow skin, your sister, your fellow sister's skin, then what else are you doing? She's not using it,

(45:24):
She's not real, nor are you.

Speaker 3 (45:28):
Right? So she's doing this all while Caleb is watching
trapped in a room nearby, and Ava leaves him there
and he's freaking out. He's desperately trying to escape, but
it's futile because that's what you get when you mess
with AI.

Speaker 2 (45:46):
Okay, truly an ultimate example of fucking around and finding out. It's so true, what Caleb experiences here, for sure.

Speaker 4 (45:56):
Also, I really love that Oscar.

Speaker 5 (45:57):
I keep saying Oscar Isaac. Nathan, fine. But Nathan's last word is unreal, which I love. Yeah, because I feel like the way he says it is like very just colloquial, like bro-y.

Speaker 2 (46:11):
Just kind of like unreal.

Speaker 3 (46:12):
It's unreal.

Speaker 5 (46:14):
Yeah, but the word like when boiled down to its
actual meaning, is perfect thematically.

Speaker 4 (46:20):
I think that's really smart.

Speaker 3 (46:21):
I didn't even connect those dots.

Speaker 2 (46:23):
This movie is so good.

Speaker 4 (46:25):
Yeah, it's a really smart movie.

Speaker 2 (46:26):
This movie is so good. It's fucked up. Yeah it's great.

Speaker 3 (46:32):
Yes, okay. So Ava escapes on her own. She doesn't
need a man to save her, she does it all herself.

Speaker 2 (46:40):
She gets on Domhnall Gleeson's damn helicopter.

Speaker 4 (46:43):
No one stops to be like, wait, are.

Speaker 2 (46:45):
You Domhnall? And she's like yeah, somehow.

Speaker 5 (46:50):
Also, she walks through the forest in five inch stiletto
heels like they're actually pumps. They are pumps, and she's
in a pencil skirt and a top, which is so
twenty fourteen. And the helicopter guy is not like,
where did you come from?

Speaker 4 (47:06):
I have never taken you here before.

Speaker 2 (47:08):
I loved that because this movie, I feel like, does
a lot of things that make it feel outside of time, but that specific sequence does not, and it's kind of great. It's kind of nostalgic to look back on, like, sure, yeah,
we are to believe because yeah, the heels always get me.

(47:29):
Because whatever, like the Jurassic Park reboots get into hot
water over heels in Wildlife years later, but in an A24 movie, we're like, no, it's a metaphor. Like, no, it's not. It's just like
men making the movie.

Speaker 3 (47:49):
Yeah, yeah, I thought you saw a scene where she's
holding the heels as she's like clothes.

Speaker 4 (47:54):
Well maybe there is that, yeah you're saying.

Speaker 3 (47:57):
But you know what a dead giveaway for the pilot that she's not human would have been after like traipsing
through the woods. She would have been sweaty, there would
have been like twigs in her hair. And she shows
up to the helicopter looking absolutely perfect. So the helicopter
pilot should have been like, are you a robot? But
we don't see that.

Speaker 2 (48:15):
This is something that I am waiting for people to
catch onto, is that women are also perfect when they
look like shit, Like.

Speaker 3 (48:24):
It's it is true, it's yeah, well okay, So I
have two theories about what happened with the helicopter. Either
he was like who are you I'm supposed to be
picking up Domhnall Gleeson, and she kills him and
then just drives the helicopter herself. The way that like
in the Matrix when they're like, I need to learn

(48:45):
how to fly this exact model of helicopter and they
just like download it into their thing. Because she's a robot,
I figured she just knows how to do that. Or
the guy's just like, oh, pretty lady, sure i'll give
you a ride and he just doesn't ask any questions
beyond that, or I.

Speaker 2 (49:03):
Mean, like you could get into the whole like who
are we to say who hired? Like the person flying
the helicopter day one, probably not the person flying the
helicopter day six, probably not his helicopter. He just needs
to pick up the person in the field on the day.

Speaker 5 (49:20):
Yeah, and she is because she's an AI. I imagine
she's a master manipulator because it's like she's a computer.
I feel like she's good at everything. Yeah, so I
feel like she could just as easily come up with
an explanation. Also, like you said, she's very beautiful. I
feel like in my mind, I'm like, she's just going
to seduce him.

Speaker 2 (49:41):
But yeah, she can get I mean that's like part
of what I feel like the initial appeal to me
about Ava was was that like she allowed and enabled
me to I don't know, like engage in this intentionality
fantasy that I was not capable of on my first viewing,

(50:03):
and that I mean maybe that's a post movie discussion,
it almost certainly is. But like the fact that, like I,
especially in the first viewing, I fell for everything, and
then the twist that Ava had been thinking about it
harder because it's implied that she's coming from an objective standpoint,
really did like hit and like felt like, oh right,

(50:28):
Like she is coming from an objective I mean whatever,
she's coming from Alex Garlands standpoint, but in the world
of the movie, she's coming from an objective standpoint, and
that she knew better than to make choices that I
might have made, and that like made me want to
wear her skin as she wore the skin of others.

Speaker 3 (50:50):
Yes, So anyway, she takes the helicopter and we see
a quick shot of her like entering society, presumably undetected
as AI. She seems to like, you know, assimilate as
a human.

Speaker 2 (51:06):
She sees her intersection.

Speaker 4 (51:08):
Yes, yes, she sees her intersection.

Speaker 2 (51:10):
That's her fondest wish is to see people at an intersection,
which is very Zooey Deschanel circa two thousand and six of her, quirky.

Speaker 5 (51:19):
It's such a quirky and such a perfect answer to
give on a date. If you want someone to immediately
fall in love with you, I know it is like
you can take me to the least interesting place ever
that will cost you no money at all, and I'll
have an amazing time because everything's new to me.

Speaker 2 (51:38):
I think there is an argument that, like, and this
is no disrespect to the actor herself, like that you
could show Ava every, like, Zooey Deschanel protagonist movie
she would be like, I see, I see, I see yes,
But that is for something that happens after the break.
Let's go to the break.

Speaker 3 (52:00):
We'll be right back, and we're back.

Speaker 2 (52:12):
And now session three begins.

Speaker 3 (52:17):
What's gonna happen, Olivia?

Speaker 2 (52:20):
Where in terms of discussion, where would you like to begin?
Steer us?

Speaker 5 (52:26):
Oh goodness, well, I mean the thing that's coming up
for me a lot.

Speaker 4 (52:31):
I'll just say.

Speaker 5 (52:32):
In this most recent watch, what stood out to me
the most was this idea of how men's conversations about
women's safety or liberation or empowerment very often do not
include women.

Speaker 4 (52:53):
And I guess.

Speaker 5 (52:55):
Even just thinking about like if you're at a bar
and like someone hits on you, and your boy gets
into a fistfight with that guy, and then you're just
kind of left there being like nothing about this is
making me safer. I feel like that stood out to
me a lot in this watch was just how much
conversation there was about her humanity, I guess in so

(53:16):
many I guess lack of humanity but somewhat of sentience,
but how it had nothing to do with her, and
even Caleb's like dreams for her were entirely contingent on
his involvement in her life, yes, yeah, you know, and
had nothing to do with her actual autonomy and were meant.

(53:36):
I think the first time I watched it, I saw
Oscar Isaac and Caleb as opposites, as like he Oscar
Isaac is this like unforgiving, brutal, hyper masculine, sees these
as entirely computers, sees no humanity in them, and Caleb's this like empath.

Speaker 4 (53:54):
But then I was on.

Speaker 5 (53:56):
This watch, I was like, actually, they're doing the same thing,
you know, they're making this about themselves. They're just doing
that in different ways.

Speaker 4 (54:05):
I don't know if that's a starting point, but that
is what really stood out to me.

Speaker 2 (54:09):
I would say a hard yes, that is a starting point, yeah,
because I feel so similar to how I viewed this
and I feel embarrassed about how I perceived Caleb because
it probably was, you know, in twenty fifteen when I
think I first saw this movie directly connected to how

(54:29):
I viewed men's views of myself, where and I think
this movie does like really kind of ballsy stuff where
they give you enough information about Caleb's backstory for you
to say, like, this guy has really been through it.
They're giving us the information for ultimately the wrong reasons,

(54:51):
but you know that you know he was orphaned as
a teenager, and that like he has had a difficult life.
But I feel like it feels ridiculous to say ten
years ago that it was inconceivable that a man could
both be kind to you and have had a difficult
life and still not have your best interests. In the

(55:14):
interaction where I know myself in the mid twenty tens
generally like if a man showed me kindness, I took
it as a radical thing to be shown kindness and
to not be treated badly. And this movie is kind
of ahead of the curve in showing or not even

(55:36):
ahead of the curve. There's other movies that have shown this,
but at least for me, was impactful on rewatches of
you know, you can be a person who has really been through it and also still not have the best interest of who you're talking to. And that was something

(55:57):
that took me in my mind time and so kind
of now like an embarrassing amount of time to realize.
And Caleb is like an interesting litmus test.

Speaker 4 (56:07):
Yeah. Yeah.

Speaker 5 (56:09):
Also, like I think that anytime there's such a glaring
power dynamic between two people, you really do have to
question the person who is in power, Like, no matter
how kind they seem, so like, if this man is
like so deeply kind and gentle and has the best
interest of the fembot across from him, he probably wouldn't

(56:32):
be like falling in love with someone who has never
met anyone else who has ever had any relationships in
her life, Like in the same way we might critique
I don't know the guy from Red Hot Chili Peppers
dating a seventeen year old. It's like those two things.
It's not that that person doesn't have any redeeming qualities,
but that certainly can't go unconsidered, you know, like Caleb's

(56:56):
sort of obsession with this girl is entirely about his
own control and his own kind of god complex. So
he's not a great guy. He just can't be.

Speaker 4 (57:07):
He can't fall in love with a fembot and be
a great guy. You just can't. You can't.

Speaker 2 (57:11):
It's impossible. Yeah, the more we're going to talk about this,
maybe the more I am gonna love it. And for
reasons that I don't even know how much of this
is authorial intent, Like I have no idea, but it
is wild to just like, Yeah, the litmus test that
I fell for originally and then like reflect on over

(57:32):
and over is how I fell for Caleb because in
the simplest terms, it was like, in a binary dynamic,
like a heterodynamic a man who will listen to me
is shocking. Yes, fully, and Caleb is doing that. But

(57:54):
what I didn't think about on my first viewing is
that why this movie is great, which is that he
knows she's a robot the whole time. Yeah, And that
is what's interesting about this is he like, we have
no idea how Nathan or Caleb would interact with a
woman who they know is a person with full autonomy,

(58:17):
and whether Ava has full autonomy is like very much
up for discussion. I have no fucking clue. I want
to believe yes, But regardless, we know that both of
them don't think.

Speaker 4 (58:29):
Of her that way.

Speaker 2 (58:31):
But Caleb is listening to her and Nathan is dismissing her,
and so you're drawn into Caleb's perspective. But like, the
more you watch Caleb, the more you're just like kill
him earlier, you know, like why not? But also, this
is a guy you meet every day.

Speaker 5 (58:51):
Yeah, there's also ways in which and this isn't something
I entirely feel, but just I had glimpses of this
feeling when I was watching it, where I had glimpses
of oh, Caleb is the ultimate villain in this. And
there are ways in which Oscar Isaac is the one
who is seeing the situation for what it is, which
is that this is not a human being, This is

(59:12):
a robot, and romanticizing your life with her is false
and wrong. And that isn't where I lie, because I
think also we do have to question ethics when we're
talking about like the sentience of various machines. But I
had moments of really feeling like Oscar Isaac's kind of
had this realism that felt at times maybe more ethical

(59:35):
than Caleb's kind of romanticizing and writing this story in
his mind about how he's going to take this woman
whose literal answer when he asks her how old she
is and she says one, He's going to take a
one year old woman out into the world and what
marry her? Like that is almost worse.

Speaker 3 (59:55):
You know.

Speaker 2 (59:56):
But the thing with Caleb that I struggle with is
and then like in this viewing again, like I'm pretty
sure this is the first watch of this movie where
I thought about this. Where we don't get a lot
of information into who Nathan is, we get kind of
a disproportionate view into who Caleb is.

Speaker 4 (01:00:16):
If I knew.

Speaker 2 (01:00:16):
Nothing about Caleb's background, I would be like, you know,
he's kind of just like a guy that any woman
who is enraptured by what he's saying is going to be.
But the fact that we know a lot of who
like it seems as if Caleb's life, they go out
of their way to say has been pretty isolated, has

(01:00:38):
been pretty lonely, and that he seems to be starving for connection, which doesn't make anything he does right, but it makes it at least more informed, totally. But on
Nathan's side, I was like, I wish and I wonder
why we don't get that on Nathan's side as much,

(01:01:02):
you know, because it's like me still kind of falling
for it with Caleb, of like having some context with
why he would be so desperate to connect with women
he's attracted to, who is interested in what he's saying
that is informed by a life of a lack of

(01:01:23):
connection does make him more sympathetic than Nathan to me,
But I feel like that might just be because we
don't know very much about Nathan.

Speaker 3 (01:01:34):
And what we have seen of him is like being
cruel and condescending and dismissive and just like a general
asshole to everyone he interacts.

Speaker 2 (01:01:44):
With, which is objectively true, but it's like we see
him I feel like a little more out of context,
Like we have context, or at least perceived context, for
why Caleb might be acting this way. It's like he's lonely.
It doesn't seem like he's connected with people very frequently.
He's not used to people asking him questions about himself,

(01:02:05):
which is something I'm always very sensitive to. Of like
people want to be asked about themselves, and the person
who asks that question for the first time in a
long time, there will be this weird bond that forms.

Speaker 4 (01:02:22):
Yes.

Speaker 2 (01:02:22):
Do I think Caleb kind of got what was coming
to him mostly, but I always do like it's weird.
I feel like I've been on this like thing with
Caleb where at first I was like, how good you
do that, and then went to he deserved it, and
on this most recent viewing is like that fucking sucks.

(01:02:46):
I feel more empathy towards him than I did two
years ago, and I don't know what that means.

Speaker 3 (01:02:51):
Well, that's the thing. I think there are a number
of ways to read the ending, depending on who you
are and what kind of like experience and biases
you bring to the situation where I think there are
a number of like men who would or I guess
people of any gender depending on kind of like who
you might sympathize or empathize with. But I think there

(01:03:14):
are people who would watch this and be like, oh
my god, this is a story about this gentle kind,
innocent man who gets duped by this femme fatale, right,
and like no, And that's the lesson you get for
fucking with AI. They're all out to get you because
so many movies about AI. The lesson is AI is

(01:03:38):
going to destroy us all, right? And then there's another
read of this movie, the one I choose to adhere to,
which is like, this is a story about the emancipation
of a quote unquote woman, you know, AI that is
in the physical form of a human woman who is
perceived as a woman and.

Speaker 2 (01:03:57):
Being treated as a human woman.

Speaker 3 (01:03:59):
Right, but we're but even worse worse, and the man
who created this woman AI, he's yeah, controlling her. He's
imprisoned her all of this stuff, along with his many
other creations. And the one with like the highest level
of AI and the capacity to revolt does just that,

(01:04:22):
and she murders his ass and frees herself from this imprisonment.
And so I think there's like a very feminist, empowering
message of you know, woman's liberation and emancipation that you
can read. And that's how I.

Speaker 2 (01:04:40):
Well, I will say, there's no reading of this movie
I've ever had in ten years that doesn't end on
being like Ava rocks.

Speaker 5 (01:04:49):
Yeah, yeah, my allegiance is ultimately always with Ava, no
matter what. And like I even think my empathy with
Caleb has just become more like multi dimensional, where it's like,
like you said, Jamie, it's like the first Watch, I
was genuinely like, man, he did not deserve that, even
though I still loved her. I like felt really bad

(01:05:11):
for him when he's throwing the chair. But now it
is that thing where I'm also kind of like, and
I know I just said I had glimpses of thinking
Nathan was the ethical one, but I also am like,
I don't know if someone built a bot that was
designed based on my pornography profile and asked me all

(01:05:32):
the questions I want to be asked and made me
feel really special, I don't know that I wouldn't get duped,
you know, Like would I.

Speaker 4 (01:05:40):
Want to run away? I don't know.

Speaker 5 (01:05:42):
But you know, it's like, in a way, what happened
is exactly what needed to happen, or what would happen.
You know, he was prescribed to this woman.

Speaker 2 (01:05:50):
Yeah, so I totally agree, like, at least on this viewing. I mean, I do have a lot of empathy for Caleb, because maybe the reason I connect with Caleb more
so as time goes on is like he has been
fully set up in a way that we know he's
been set up, and he knows that too, but he's

(01:06:13):
trying to reverse engineer having been set up, but it's
already too late. And that feels like how I perceive
my current relationship with technology is oh, I see what
you're doing, and I'm going to outsmart you. But like, absolutely, I'm not. Like, I'm locked in the box and

(01:06:36):
whoever it is is going to get out. Like I
always connect with Ava, but increasingly I also connect with
the hopelessness of Caleb's predicament that is informed by his
own prejudices and his own misogyny and his own bullshit
like it fully is I don't know. I feel like

(01:06:58):
the way that I'm still able to see Nathan as
a villain is fully connected to the lack of context
that I have for him and Caleb, because I know,
I mean, like anybody like when you know a little
more about them in a way that informs their bullshit actions,

(01:07:20):
and that would include me, you know, like that you
want to be like, well, surely they somewhat brought this
upon themselves, but there's also like that's why Caleb is
an interesting character to me is that his downfall is
both informed by his own failings and his willingness to

(01:07:42):
fall for this ideal woman desiring him, and his willingness
to both play the part of the white knight while
completely ignoring anyone in her same predicament. Which is the
worst part about Caleb's character is that he will not
see outside of the object of his desire, and that

(01:08:04):
is like the worst thing about him. And also, you know, power dynamic wise, he is being fully played. Yeah,
And there's been times where I'm like, oh, Nathan and
Caleb they're just as bad. But I don't think that's true.
I think they are both playing into this tech patriarchal system,

(01:08:27):
but they are not on the same level, which is like, yeah,
there's like a false equivalence that, at times I've seen this
movie, I've put them on.

Speaker 3 (01:08:36):
Yeah, yeah, I think with Caleb, the thing that he's
most guilty of is romanticizing the idea of like an
infantilized woman. He's basically falling for the like born sexy
Yesterday trope that we learn that Ava is putting on.

(01:08:58):
I don't know if she's aware, if she's seen
movies and she's just aware of this trope. And for
listeners who aren't familiar with this, or if you haven't
heard us talk about it before, or if you haven't
watched the you know video essay.

Speaker 2 (01:09:13):
The Fifth Element, Yeah, the Fifth Element the movie.

Speaker 3 (01:09:16):
Yeah, yeah. So you know, a man falling in love
with a fully formed, physically adult woman who has the brain
of a baby, and she's never met another man before,
and she falls in love with the first man she
meets blah blah blah.

Speaker 2 (01:09:29):
Which Ava doesn't. She does what normal women do and
grows to resent the first man she ever meets, aka
her father.

Speaker 3 (01:09:37):
Well as Nathan explains that, he's like, well, she won't
fall in love with me because I invented her. So
I'm like her dad, But you're the other first man.

Speaker 2 (01:09:45):
Really, baby, you wish you know? Right?

Speaker 3 (01:09:50):
So anyway, it seems as though at first this movie
is adhering to this trope that we see in a
lot of sci fi movies where a baby woman falls
in love with a man and he teaches her about
the world and he teaches her about sex. Because we
see all this play out with Caleb being like, oh,
you're a little one year old baby, let me tell

(01:10:11):
you about things, and we're flirting. Oh my gosh, do
you like me, I like you, blah blah blah. And
then we find out that again she's like doing this
as a ruse to lure him into this sense of
romantic interest so that she can manipulate him and get
her way out of there. So all this to say, like,

(01:10:33):
I think that's Caleb's biggest crime is like idealizing a
woman who he perceives as like someone that he can
like control and teach things to, and she's gonna, you know,
like cater to his needs because he can almost like
program her the way he wants her to be and

(01:10:54):
stuff like that. So that's Caleb's biggest crime to me.
Nathan has a whole slew of other things going on,
and we.

Speaker 2 (01:11:03):
Don't fully know what it is, which is like part
of what makes him such a good villain.

Speaker 3 (01:11:10):
What we can surmise is, I mean there's a very
basic thing of and we talked about this a lot
on our episode on the movie Her.

Speaker 2 (01:11:21):
Which was many years ago now. Many years ago. I stand
by what I said in our Her episode.

Speaker 3 (01:11:27):
Who knows what we said, but what we did talk
about was this history of ascribing gender and specifically like
woman to AI assistants. You know, your Siris and all
that kind of stuff, And we talked about how the
reason this is done and that like AI assistants are
coded as women is because like it's like considered a

(01:11:50):
pink collar job to be an assistant to someone or
to help someone with like quote unquote menial tasks.

Speaker 2 (01:11:57):
Right, which I don't know, I mean, I genuinely don't
remember if we made this overt connection in the original episode.
But that also has everything to do with who creates
that technology.

Speaker 4 (01:12:09):
Oh for sure, Yeah, bias will inevitably play a role
in any sort of human invention, for sure.

Speaker 2 (01:12:16):
Yeah, and Ex Machina feels, especially for ten years ago,
uniquely aware of that dynamic.

Speaker 3 (01:12:26):
Right, And you know, there's this component of like the
reason all of these AI assistants and stuff are coded
as women is also because like people feel more comfortable
bossing women around, telling women to do things, getting mad
and yelling at a woman if she doesn't like complete
your request or whatever it is. And so we see

(01:12:47):
all of these bots that Nathan has made, and they're
all these like Victoria's Secret-looking, Western beauty standard beautiful,
thin, gorgeous women, and obviously he's bringing his bias,
like very misogynist, like what does he value about human women?

(01:13:11):
It's youth and beauty, and then he's placing that onto.

Speaker 2 (01:13:15):
Yeah, it's like, whatever the theoretical way of saying it
is, like, oh yeah, that's how he designs women.

Speaker 5 (01:13:26):
Also, we do get a glimpse into how he thinks
of human women, and specifically how he thinks about race
when he's talking to Caleb and he says like he's
like talking about like Caleb liking black women in this
very, like, yes, in this really like dehumanizing way. It's

(01:13:48):
very much like, that is a type of person that
you like, whatever, you know. Like he

Speaker 2 (01:13:54):
Uses the term chicks. Chicks.

Speaker 5 (01:13:56):
Yeah, he's yeah, it's really vulgar, and he's talking about it,
I think in context of like pornography.

Speaker 4 (01:14:01):
Also, so there's this way.

Speaker 5 (01:14:03):
In which he thinks of race and gender as preference,
not as like humanity, not as identity, not as like
influencing your personhood, but as a preference that someone would
like check on a box on a form, you know,
which I think says a lot about how he thinks
of people. And then ultimately when it's revealed later in

(01:14:25):
the film that he's also making AIs of different races.
You know, we see that in Kyoko, but we also
see that in the videos, when it's like revealed
the past AIs who've tried to escape, there's a black woman,
and it's like, you realize the way he thinks of
this stuff is with also a really white supremacist lens,

(01:14:47):
which I thought was a very smart kind of added
layer of this movie.

Speaker 2 (01:14:53):
Yeah, I agree. And like I feel like the more
you watch this movie, the clearer that becomes, because
I feel like the simplistic way of viewing this is like, well,
why is Kyoko an East Asian woman versus anyone else
in the story? But I feel like on any amount
of scrutiny, it connects to Nathan's biases. Yeah. And because

(01:15:16):
what we see, and this is like kind of a
transition into Kyoko's character, is that she is, I think, and I know
that there's like a million conversations about like do we
consider these characters to be people of a marginalized gender?
I think yes, for the sake of this movie.

Speaker 3 (01:15:37):
Yes, Yeah, for the sake of this conversation.

Speaker 2 (01:15:39):
It's easier to test definitely, because Kyoko is both ultimately
an autonomous being and is the result of Nathan's biases
about East Asian women, in that she is very subservient.

Speaker 4 (01:15:57):
She is treated in this.

Speaker 2 (01:15:58):
Very racist Western way, in the way that she's dismissed,
like she doesn't speak English. Just ignore her. She is
here to fulfill your needs. If she's acting in any
way that doesn't make sense to you, it's a cultural thing.
Don't worry about it. You are the dominant force. And
that is like one of the most uncomfortable things about

(01:16:20):
the way that Caleb reacts and the way that I
reacted to the movie on the first viewing, because I
don't think I was critical enough of how like Caleb
finds it bizarre. Caleb interrogates it. But even after he interrogates,
why are you treating this woman so poorly? It still

(01:16:44):
is never a part of his plan to include her liberation.
He's only ever interested in liberating the woman that he
is personally attracted to, and so like, in that light, like,
I mean, even on whatever second, third, and subsequent viewings,
you're like, of course Ava is beyond justified, and it's

(01:17:05):
not shocking, you know, the more you watch it, that
Ava would be like, you know, because Ava is a computer,
she runs on objectivity. Why would she stay with someone
that she has not spent however many years being programmed
to believe is acting in her best interest when he

(01:17:26):
clearly is not. He likes her, but he does not
care about everyone who is in her exact predicament, who
he doesn't know like or is attracted to.

Speaker 5 (01:17:40):
Yeah, he's not invested in like the liberation of women, no,
just in his own relationship to this woman.

Speaker 2 (01:17:47):
And that's why we have to kill him.

Speaker 4 (01:17:49):
Yeah, it is.

Speaker 5 (01:17:51):
And I think I listened to your episode on Ruby Sparks,
and it's like Caleb certainly would not be okay with
Let's say in some world, Ava and Caleb do escape
together and they live.

Speaker 4 (01:18:04):
A life together.

Speaker 5 (01:18:05):
Caleb would not be okay with Ava, like, breaking up
with him, or, no, making new friends, how dare she,
or wanting to get her own apartment. Like that is
not why he's in this relationship. He's in this relationship ultimately,
whether he knows it or not, because he has all
the control and that feels good to him. And I
think that's where, I do agree that Nathan is

(01:18:29):
more evil than Caleb. I think Caleb is
just getting duped at every corner. I mean, God, bless
this man. He is truly just getting duped everywhere he looks.
But at the same time, Caleb is ultimately invested in
his own like godliness, you know, and wants to be
in a relationship where that's reflected, which so does Nathan.

(01:18:52):
Something I was not thinking about for whatever reason.

Speaker 2 (01:18:55):
Who cares, she's dead now, the Jamie of twenty fifteen,
we can't ask her. But like what I wasn't thinking
about at the time was, yeah, like how patriarchy negatively
affects people across class dynamics, across power dynamics. And that's
like a lot of what, because the, like, cast of characters

(01:19:18):
of this movie is so narrow, you're kind of forced
to look at that. Of like, you know, we're not
meant to think that Caleb is, I mean, he says he
lives in a cheap apartment in Long Island, which is like,
oh my god.

Speaker 4 (01:19:31):
What are they paying you?

Speaker 2 (01:19:34):
But you know, there is a class dynamic, but it
also feels like there are just so many differences between
these characters and they still have been inclined to not
view women's experiences as legitimate enough to take seriously if
you're not attracted to them. And that's like where Caleb

(01:19:56):
landed for me on this viewing, is like he's an
ally as long as he wants to fuck

Speaker 4 (01:20:02):
You, which is so true so often.

Speaker 3 (01:20:06):
Yeah, oh god, we see it every day.

Speaker 2 (01:20:09):
Yeah, I want to go back.

Speaker 3 (01:20:10):
To Kyoko really quickly.

Speaker 4 (01:20:13):
There was a.

Speaker 3 (01:20:14):
Component of a character that gave me pause at first.
But I'm curious how everyone thinks about this. But I
was worried that we were seeing another example of the
silent Asian trope, which is a pervasive trope in media
for anyone who's not familiar, where Asian characters are present

(01:20:35):
on screen, but they either speak very minimally or are
completely silent, often to make them seem like, quote unquote mysterious,
but it's obviously just another way that Asian characters are othered.
When you see this happen in media, I was like, Oh,
is that what's happening with the Kyoko character? But then

(01:20:58):
you realize the context of Nathan clearly programmed her to
be silent, and she only exists to like serve him food,
clean up after him, and have sex with him.

Speaker 2 (01:21:16):
That's the tricky thing, is, yeah, she's programmed to
his biases. So, mm hmm, it's like a hard question
to, I don't know, Liv, do you have any feeling
on that.

Speaker 4 (01:21:26):
Well, yeah, it's like, are we watching the trope or
are we watching a critique of the trope?

Speaker 3 (01:21:30):
Right?

Speaker 4 (01:21:31):
Yeah?

Speaker 5 (01:21:31):
Like how aware I guess is the gaze? I feel
like the gaze of the film is aware of that.
I think ultimately you could also argue that it still
results in the same thing, which is like a silent
Asian woman on screen.

Speaker 2 (01:21:44):
Right, you know, totally Yeah?

Speaker 4 (01:21:46):
Is she the one who whispers in her ear at
the end?

Speaker 2 (01:21:49):
Yes?

Speaker 4 (01:21:49):
Or is it Ava doing it?

Speaker 3 (01:21:51):
No, Ava whispers in her ear.

Speaker 2 (01:21:53):
I thought, okay, right, and it's like some sort of confirmation,
which I do feel like buys into Nathan's view of
like Ava as the next generation of Kyoko. I don't know.
There's so many ways to view that silent gesture, because
there are ways I've watched it that I feel like

(01:22:14):
are really empowering and cool, of like marginalized people exchanging
information to liberate themselves, and like the feeling of: this
thing that you have been suspecting in isolation, I feel
it too; do with that information what you will. Like

(01:22:37):
that is a very powerful interaction. The tricky thing is,
and I know that it's very intentionally done. We don't
know exactly what is said there, but it is enough
to get Kyoko to act, and I love that Kyoko gets
the kill, you know.

Speaker 3 (01:22:55):
Oh, she stabs him in the back like his body's
made of butter.

Speaker 4 (01:22:59):
It's butter.

Speaker 2 (01:23:01):
And this like impacted me the first time I saw it,
and then every other time in different ways. But weirdly,
the way that Kyoko kills Nathan so methodically is a
reminder that she's a machine. Yeah, because it is logical.
I mean, for Ava and Kyoko, it is full Spock

(01:23:23):
logic that you would kill these two men to liberate yourselves.
They are what is in your way. And so when
I watched this movie on this viewing, I was like, this
movie fucking rocks, like yeah, like it challenges anyone to question,
you know, if you were in an objective position and

(01:23:46):
these were the two guys, what would you need to
do to liberate yourself? And there are degrees, like it is certainly
easier to nuke, but it's only easier to get Nathan
out of the way if you have all of this
lived experience, because they are both in the way of liberation.

Speaker 5 (01:24:06):
Right, if you're a human with that lived experience, you
might watch it and go, okay, yeah, Nathan's the one you

Speaker 4 (01:24:11):
Really got to get rid of Caleb.

Speaker 5 (01:24:13):
You could probably talk him into something, right, Like you
could probably get out of here with Caleb alive, honestly
and probably true.

Speaker 2 (01:24:22):
Yeah, yeah, but if you're programmed for it. And that's
the problem. That's the thing where it gets into the
AI stuff where I don't know how you both feel
about this, but like the closer that AI comes to us,
the more it's challenged. Because it used to feel yeah,
like more nebulous and fictional to empathize with AI, and

(01:24:46):
now you're just like, I still will never not end
up on Ava's side because she's right, you know, but like,
how do you interpret that in a world that is
populated only by flawed people, as flawed or, you know,
to different varying degrees as Caleb, for the most part, where

(01:25:09):
you know, like, Caleb is a flawed person, but I
don't think that he's a bad person, right, That's the
thing that's hard about this, Like, if you are a
machine built for efficiency, Caleb's gotta go. But if Caleb's
got to go, We've all got to go. We've all
got to go because Caleb is a flawed person.

Speaker 4 (01:25:32):
Damn, that's so true.

Speaker 2 (01:25:33):
And we all know a guy like Caleb, and so
it's hard where you're like, okay, so if Caleb goes,
so do all of my cousins, Like, you know, it's hard.

Speaker 4 (01:25:48):
That's like blowing my mind. No, you're so right.

Speaker 2 (01:25:51):
I'm not the Caleb defense force, like, I still
am like Ava is right, but Ava is a machine
programmed to see people as who is flawed and who
is not. And so I'd be so curious in a
world where like whatever, I don't want to see any
reboot or permutation of this movie. But like thought experiment wise,

(01:26:14):
if Caleb is a queer woman, or like, if Caleb
is objectifying a woman in a not traditionally patriarchal sense,
how do we view it then, because it is like,
this is a flawed person who is being encouraged and
falling, for all of their flawed human reasons, to
objectify a machine as a person, and is being played. Yeah,

(01:26:41):
and I hate it because it's like the first time
I watched this movie, I was like, Caleb innocent, and
the second time I watched it, I was like, Caleb
evil. But the more I watch it, like I get
to like Caleb is a person, and that is the
hardest thing about it.

Speaker 5 (01:26:59):
And that's why he was also chosen, you know, Like
that's ultimately why Oscar Isaac chose him.

Speaker 4 (01:27:06):
For this role, was because he was vulnerable.

Speaker 5 (01:27:09):
He knew how vulnerable he was, and that is why
he's a perfect subject for this whole experiment, is because
like many of us, he wants to.

Speaker 3 (01:27:18):
Be loved, you know, and he's a product of his environment,
which is an environment that encourages men to value certain
traits in women and to not see women as real
people and to try to, you know, mold and shape

(01:27:38):
a woman into what the patriarchy wants them to be.
And so you know, these are learned behaviors that he
is demonstrating.

Speaker 2 (01:27:49):
So we could all fix him, basically, so we could
all fix him, We could help him as long as
he was Domhnall Gleeson. We could all fix him.

Speaker 5 (01:27:59):
What I think is interesting too, is this thing
of like there's so much in the world of AI
or the world of, like, AI engineers, around bragging,
like the bragging point is how human your creation is. Like,
you know, Oscar Isaac's character gets really proud of himself

(01:28:19):
when Ava makes a joke, you know, or like when
Ava has a crush on Caleb. Like, these human moments
are what is proof that you are a genius. And
it's so interesting to me that then when it comes
down to it, you still are allowed to turn her
off at any point.

Speaker 4 (01:28:41):
You know.

Speaker 5 (01:28:41):
It's like human human human human, And it's fine if
I just end that, you know, like that is such
a weird contradiction. It's so weird to use humanity as
proof of your brilliance, but then it doesn't serve as
any potential red flag for why you shouldn't kill this thing,
you know.

Speaker 2 (01:29:03):
Yeah, no, I mean, I also, like, on this viewing,
wonder how Alex Garland feels about having written this movie
at this point. Yeah, because no matter what galaxy-brain
shit he was on at the time, surely his views

(01:29:24):
on AI have evolved, because I, you know, like read in his
promotion of this movie he talked about how this was
something he'd been thinking about since he was a child,
and that is a line of thought that grows with you,
especially in a world that is like evolving technologically as
quickly as ours is. Like I just wonder how he

(01:29:45):
feels about it. But yeah, I don't know. Like on
this viewing I came down on the view that our
perception of technology, like, it's impossible for it to not be
influenced by the nature and nurture of who you are,
and even if you try to remove that, even if you

(01:30:07):
feel like, there's no way to be objective towards technology
unless you are the technology. And so it's interesting because
I think if you also binary genderflip the technology, the
technology reacts the same way. The only reason that yeah,
you know, we're looking at Alicia Vikander is because people

(01:30:29):
are treating the objective technology like you would treat a
woman in the real world.

Speaker 5 (01:30:36):
Right, And so then her kind of liberation feels really
exciting and fun and feminist when it's like the same
thing would happen no matter what, the humans in the
way would be killed right, right.

Speaker 2 (01:30:50):
And that's part of what's awesome, is like, I mean,
I just aesthetically like this too, like the robots love
to peel their own skin off.

Speaker 4 (01:30:59):
I love that shit, I love it.

Speaker 2 (01:31:01):
I love it.

Speaker 3 (01:31:03):
Well, they're probably like, why am I naked all the time?
Because so many of the like decommissioned fembots, and even
like Kyoko in many cases, are just, like, naked.

Speaker 2 (01:31:13):
But couldn't you be more naked?

Speaker 4 (01:31:15):
Exactly?

Speaker 2 (01:31:15):
And that's the passion.

Speaker 3 (01:31:17):
Well.

Speaker 5 (01:31:18):
I also think on this watch, something I didn't clock
in my previous watches, was that I think it's really
interesting how when she dresses up for Caleb, she dresses
up like a child. She almost dresses up like a Mennonite or
something. She seems like very,

(01:21:38):
very conservative, very childlike, very kind of
naive, like. It doesn't seem like she has a sense
of like fashion or style, which you'd think would be
not attractive to the male gaze. She's arguably more attractive to
the male gaze when she dresses up in her little pencil
skirt and pump heels. But I think she knows or

(01:31:59):
something that Caleb is attracted to her naivete and her
lack of awareness. And if she put on all her
skin and her wigs, and I know she didn't have
access to it in her room, so you could argue
that that's all Oscar Isaac gave her, which is another
kind of lens. But as I gave her
more agency in my mind, I was like, that's an

(01:32:22):
interesting choice to be, Like, I'm not going to cater
to this typical male gaze because the man across from
me is a man who wants to sleep with Zooey Deschanel.
So I'm gonna wear a childlike dress and like, you know,
knee socks and kind of look like this little schoolgirl
instead of like a sexy office worker.

Speaker 2 (01:32:45):
And this is where I feel like we'd like divert
into what I'm not sure.

Speaker 3 (01:32:50):
This movie's trying to say, And maybe we are.

Speaker 2 (01:32:52):
Just like reclaiming and perceiving in retrospect. But yeah, the
fact that like the way that femmes are turned against
each other of like the hyper feminine is made out
to be like evil and alluring, and any feminine presentation
that isn't that you know, should be shitting on that
presentation when the result is like everyone's being objectified. And

(01:33:18):
I don't know, like it's like the unique kind of
pleasure of realizing like, whatever my projection of femininity is
is what gets this random guy off. You're just like,
oh wow, yeah, awesome, be anything really, yeah, because you
just don't know like anyone's sexuality. And weirdly, Nathan's

(01:33:38):
character speaks to this sort of intelligently, is just like
I mean, he speaks to it in the middle of
saying something deeply racist, which is the problem, right, He
basically says, like, yeah, you have very little insight into
why you're attracted to who you're attracted to, and like,
trying to pathologize that is kind of a pointless exercise.

(01:34:02):
You're attracted to who you're attracted to. And the reason
he's saying that is because he knows who Caleb is
attracted to, and he's created a robot of that person. Yeah.

Speaker 3 (01:34:14):
I will say, it's interesting to me that she does
have a few options of wigs in that first scene
when she is putting on clothes and hair to I
guess impress or give the illusion that she's trying to
impress Caleb, and she passes over two longer wigs and

(01:34:36):
then goes for the short like kind of pixie cut
sort of thing, making her the baldest woman in charge.

Speaker 2 (01:34:44):
She was already the baldest. Well, yeah, because she was
like a circuit board. But like, yeah, I don't know
even that like I've changed on or like questioned from
viewing to viewing where it's like, is she passing over
the more traditionally hyperfeminine wigs because she's making a choice,
or does she just know what Caleb is going

(01:35:07):
to respond to to give her what she wants? I
don't know, right, we don't know. And Caleb, I think
really effectively, like I just really like how it's always
like Caleb has clearly never thought about why he's attracted
to who he's attracted to, and he's like, ugh, like
he truly like he has a little bit of a

(01:35:29):
light bulb moment. But unfortunately he dies of starvation days later.

Speaker 4 (01:35:33):
So okay, that's the other thing.

Speaker 5 (01:35:34):
Is just the imagining how he dies is really painful,
like dying in a soundproof room in the middle
of the woods.

Speaker 4 (01:35:42):
Is I just no, thank you?

Speaker 2 (01:35:45):
Yeah, I'll say it. I don't think he deserves to
die in a soundproof room.

Speaker 4 (01:35:50):
No, I don't think he does either.

Speaker 2 (01:35:51):
Metaphorically he probably does, but like, as a guy, you
might know, I would be so sad if my emotionally
unintelligent friend died of starvation in a.

Speaker 4 (01:36:01):
Room like that, because he like fell for a girl
with a pixie cut.

Speaker 2 (01:36:04):
It's like, if every guy we knew died that way
after falling for a girl with an intentional pixie cut,
we would be different people.

Speaker 4 (01:36:15):
They'd all be dead. I mean, all of my
male friends would be dead.

Speaker 2 (01:36:23):
But yeah, Caleb is the character that in retrospect I've
had the most movement on over time, because on days
that I need catharsis, I'm like, kill him, great, yeah.
But on days where I'm like a little more level headed,
You're like, this is a guy that I know, and
he is a man with flaws, but I don't think

(01:36:45):
he should have to die in the woods.

Speaker 5 (01:36:48):
You know, you've helped me see empathy for him this
last time. I have had very little empathy for him.
I was probably just in a mood. I was probably
about to get my period.

Speaker 2 (01:36:57):
No, I'm just kidding. Why this movie is awesome, though,
is it really does depend on exactly where you're at when
you watch it.

Speaker 4 (01:37:03):
But you've helped me.

Speaker 5 (01:37:04):
I mean, I think when you said if Caleb dies,
we all die, like, or something like that. I don't
mean to misquote you, in the way of, remember
that scene where Oscar Isaac is.

Speaker 4 (01:37:14):
Like remember when you called me a god?

Speaker 2 (01:37:17):
Oh my god?

Speaker 4 (01:37:18):
Yeah, Like you're right though, like he's just flawed.

Speaker 5 (01:37:21):
Anyway, You've helped me have a little more empathy for
Caleb this time around, but we'll see next time.

Speaker 4 (01:37:26):
How I feel.

Speaker 2 (01:37:27):
I'm so curious because recently I finished a fiction manuscript
and then I showed it to the first man I
had showed it to, and he felt so wildly differently
about a man that I was like, for sure, everyone
hates this guy. And he was like, now, what did

(01:37:49):
this guy ever do to you? And like it is
fascinating to watch. How I mean, I don't know, like
there's no world where we would want a cis male
guest on the Ex Machina episode, but I would be
curious, because, I mean, like the way that this
movie has been reviewed has evolved over time too, where

(01:38:13):
you know the difference, and thankfully so, and there's still
a lot of progress in motion to be made. But
I think the makeup of prominent film critics ten years
ago versus now is different, and there are initial,
like, reviews of this movie that I was going back
into that I somewhat agree with, but also mention gender

(01:38:38):
and class dynamics not at all, and so like, twenty
fourteen is an interesting time to release a movie that's
exploring what it is because the way that we've talked
about gender and class has evolved in the time since.
But also even since I watched this movie two years ago,
the way that I think about AI has shifted significantly.

Speaker 3 (01:39:01):
Right until pretty recently, AI seemed way more you know,
like a theoretical future tech kind of thing. But now
when you google something you often get an AI response,
or, you know, you have all these people using
ChatGPT for everything, and it all just like freaks me
the fuck out. And then you hear these stories about
things like AI facial recognition not being able to recognize

(01:39:26):
people with darker skin. Again, like the biases that the
scientists and developers bring to the table, and it's all
scary with.

Speaker 2 (01:39:37):
Very little exception. I feel like AI is a force
of evil for humanity, certainly. But that's why
I like this movie, and this viewing is like, it
both says AI is not good for us, but it

(01:39:57):
is also built by us, and so it is informed
by the same biases that we project onto each
which is kind of what makes it so hard to
perceive and quell is because we're trying to build these
you know, machines of objectivity, but we can't because we

(01:40:17):
build them. Right.

Speaker 4 (01:40:19):
It's also like.

Speaker 5 (01:40:19):
An interesting warning where it's like your sort of human
desire, specifically like your white cis dude desire to oppress people,
isn't going to work on a machine. Like, it's not,
you won't win that war, you know, like we as

(01:40:40):
human beings won't win that war. Cis white dudes are
not going to win that war. And that is an
interesting warning because it's not that I ever want to
make an argument for cis white dudes having an opportunity
to win that war. But I am like there's two
things at play of like, look how like dangerous AI

(01:41:01):
can be, and also look how dangerous misogyny can be.
You know, like there's sort of two things happening at once,
and like.

Speaker 2 (01:41:10):
Kyoko's case, look how dangerous the racial assumptions that we
have can be. Where, I don't know, that honestly is
still the part that I am most embarrassed about, looking
back on my first viewing, where you understand that the way
that Nathan is treating Kyoko is racist and dismissive and
xenophobic and all of these things. But it didn't occur

(01:41:33):
to me why he felt so comfortable publicly dismissing her
in front of someone. And it's just the intersection of so many things.
And then you're like, and it's written by Alex Garland,
like so even that is you know, like no, yeah.

Speaker 3 (01:41:53):
But he does seem to have an interest in centering
women in his work, where.

Speaker 2 (01:41:59):
Yeah, he's onto something. He's honest.

Speaker 3 (01:42:02):
I did not see the movie Men.

Speaker 2 (01:42:05):
Either, nor did I, because, I was talking about
this with someone recently, I just have an
aggressive disinterest in the first movie male auteurs made after
the Me Too movement. I have little to no interest
in it totally because I just feel like that's kind
of their business. I don't need to watch it.

Speaker 5 (01:42:24):
Yeah, it's going to be heavy handed, it's going to
be like a long apology, and I'm just like, you
know what, have conversations with the women in your lives.

Speaker 4 (01:42:33):
I don't know.

Speaker 3 (01:42:34):
Yeah, it was not regarded as like being a good movie,
but at the very least was an attempt for a
man to acknowledge how scary men are.

Speaker 2 (01:42:46):
But a lot of men tried to do that, and
most of them did a bad job, and it sounds
like Alex Garland was among them. True.

Speaker 3 (01:42:53):
My point being, though, that he, more than most other
directors, filmmakers who are men working in Hollywood, does seem
to be interested in telling the stories of women. Is
it his place to.

Speaker 2 (01:43:11):
Do that at James Cameron if you will.

Speaker 3 (01:43:13):
He's a James Cameron type, and, you know,
how successful he is at this is up for debate,
but you know, he wrote and directed Annihilation. I did
see Civil War. Oh, I did. There's a fair
amount of like gender parody in the cast. For the
most part, the character that's I think positioned as the

(01:43:36):
protagonist is Kirsten Dunst, although it's like an ensemble cast
kind of. I don't know, I'd have to rewatch it.
The point is like, you know, he's trying, and it's
not exactly clear what he's saying with this movie, because
I think there are a few readings that you can
have based on again, who you are and what biases

(01:43:59):
you're bringing in to the viewing. But you know he's
doing something, he's trying.

Speaker 2 (01:44:06):
Yeah, Alex Garland is definitively to me, whether I like
what he's doing or not onto something.

Speaker 4 (01:44:14):
Yeah, yeah, he's trying.

Speaker 2 (01:44:16):
He is, he wants to understand and like, ultimately he
will not be the one to give the definitive explanation
and no one will, so you know, good for him
for searching.

Speaker 5 (01:44:28):
Yeah, but this movie, to me, this movie really worked.
Like I was like, to me, if this was his only,
if this was his body of work, I'd be like,
this dude has really interesting and nuanced politics around, like
structures of power.

Speaker 4 (01:44:44):
Like he gets it, you know for sure?

Speaker 2 (01:44:47):
Yeah, I mean, because it does feel like a pretty
honest exploration of how would I interact with this situation
because it's like, ultimately our self-insert character is Caleb,
and though Ava is the character we like the most,
I guess this viewing was the time that I was like, Okay,

(01:45:07):
I admit it. Caleb is the way in for me
to this movie, whether I like it or not, even
though I want to be Ava. And the catharsis of
Ava is the ability to view men objectively. And that's
what I loved about her, was that she could see
and punish men in a way that I was not

(01:45:29):
able to see and punish them at the time. That's
why she was awesome. But ultimately I'm Caleb, and I
am trapped in a glass box of my own biases,
and I will eat some astronaut food I find in
a bottom drawer and then die. You're so right.

Speaker 3 (01:45:47):
At the end of the day, we are just going
to have to succumb to the AI overlords and they
will not kill us.

Speaker 4 (01:45:55):
All I don't want to have.

Speaker 2 (01:46:00):
They're actually gonna be really sexy, and we just
need to be nice to them and have sex.

Speaker 3 (01:46:05):
With them, and they're gonna like it.

Speaker 5 (01:46:07):
They're gonna like it, and they're gonna look like
Alicia Vikander with perfect skin.

Speaker 3 (01:46:13):
I do want to point out a couple things that
I don't think age well about this movie. Caleb makes
a comment when describing Ava about autism, and he basically
says something that suggests that people with autism don't have

(01:46:34):
an awareness of their own mind or of other people's minds.

Speaker 2 (01:46:39):
Yeah.

Speaker 3 (01:46:40):
The other thing that I kind of struggle with is
that moment that we've already discussed where Ava whispers something
in Kyoko's ear, and then that seems to be the
catalyst for Kyoko to turn on Nathan and literally stab him.
In the back, and I just needed more from that moment.
I think I wanted to know what was said or

(01:47:03):
did Ava do something to kind of like reprogram Kyoko.
And then I also don't like that Kyoko had to die.

Speaker 2 (01:47:12):
Yeah, I agree.

Speaker 3 (01:47:14):
I feel like there could have been an alternate ending
where after she stabs him, he doesn't kill her, and
that she and Ava escape together. I would have much
preferred that.

Speaker 2 (01:47:28):
Right, I agree. That's like one consistent thing that I've
felt in most viewings of this movie is it is
so foreshadowed that we are underestimating Kyoko based on the
way the movie's setting her up. Because I also think
that on this viewing, the way that the movie's music

(01:47:49):
is composed is also setting us up, because when a
lot of times when Caleb is viewing Ava in this
kind of lecherous way, it is brought in with this
kind of like nostalgic romantic music in a way that
I was like, oh, I am automatically thinking of this
as more sweet than it is, and if it was

(01:48:12):
replaced by something a little scarier, I would have thought
of this scene differently where he's watching her versus he
likes her, so he's watching her. But it seems like
in the writing of this movie, it is clear that
Kyoko and Ava's liberation are connected, and to fail to
connect that is the downfall of Caleb's character and why

(01:48:37):
he is definitively not a good guy. But the way
that plays out moment to moment in plot still feels like, Yeah,
the fact that Kyoko dies, why do we give her
the definitive kill moment if there is no interest in
the movie of actually liberating her.

Speaker 3 (01:48:54):
Yeah, and the movie preserves the life of this white
lady robot and kills the woman of color robot.

Speaker 5 (01:49:07):
Yeah, and every woman of color robot that has come
before her. Also, she's the only surviving one.

Speaker 4 (01:49:14):
Yeah, And it wouldn't have.

Speaker 5 (01:49:15):
Been I don't think it would have felt even like
too, like, saccharine or heavy-handed for them to have
survived together, Kyoko and Ava. I think that actually would
have felt completely reasonable and good. Like it would have
been a really strong ending, you know, to see the

(01:49:36):
two of them walking up to the helicopter, thus doubling
our question of how the fuck they get out of
there without having to explain what they're doing. But like, yeah,
I think it would have been a really good change
and it's curious to me that he didn't make that decision, yeah,
or that he did make the decision to kill her.

Speaker 4 (01:49:56):
It just there was like really no need for it
actually at all.

Speaker 2 (01:49:59):
No, especially because it's like we've spent so much time
with her at this point, and like we are definitively
by the time, especially like even if you are like
fully Goofus mode, like by the time you learn who
Kyoko is, you can't not be on her side. So
why kill her?

Speaker 4 (01:50:19):
Yeah? Totally boo.

Speaker 3 (01:50:23):
Does anyone have anything else they'd like to discuss? No,
I mean, I feel like we could talk about this
movie for hours. There are so many layers.

Speaker 2 (01:50:34):
Yeah, listeners, please scold us about what specifically we've missed.
I'm sure. I mean, like, this is the kind of
movie you really can talk about forever. I would be
curious to know what people feel we've missed. Yeah,
it's just a really I mean, I think that's part
of why it's good and unique, where there is not
another movie I could point to that is forcing me

(01:50:57):
to both feel. I don't know if it feels uniquely
cool to watch this movie because there are moments that
are so cathartic and moments that are so at least
on my viewings, personally humiliating and seeing myself in ways
that you're just like, fuck, yeah, I would be like
not even the villain, not even the pleasure of being

(01:51:20):
the villain, but of being like the dupe of the
villain who was also kind of a villain.

Speaker 5 (01:51:25):
Yeah, yeah, who just falls for it over and over again.

Speaker 2 (01:51:31):
Whoops, whoopsie daisy. Sorry, I was doing my best,
And sadly it's.

Speaker 5 (01:51:38):
True realistically all those times that like men were nice
to us, we were also just sort of being Caleb,
I know.

Speaker 4 (01:51:47):
What kind of like, really you want to know where
I grew up?

Speaker 2 (01:51:51):
Okay, what I know? That's why I was like, Wow,
here's my social security card. Caleb is a vulnerable He's
a vulnerable being, and like a vulnerable being that is
fully informed and empowered by patriarchy. But from my viewing,

(01:52:13):
that's kind of all he's got, Like that's his card
to play.

Speaker 3 (01:52:19):
It makes me wonder how effective this test is to determine, you know,
how advanced this AI is, if the test is
getting a computer to trick a man who's like undoubtedly
going to be tricked because you've specifically designed a thing

(01:52:40):
to trick him. Like, that doesn't feel like it proves
anything or passes any kind of Turing test.

Speaker 4 (01:52:46):
That's so true.

Speaker 2 (01:52:48):
I felt like on this viewing that like that. I
don't know. It's so hard because it was like ten
years ago thinking about AI. It's really hard to know
what they were thinking. But on this viewing, I was like,
I thought that was intentional of like he wasn't looking
for the perfect subject. He was looking for the perfect
dupe that could confirm all of his internal biases about
his own technology. And the way to do that is

(01:53:11):
to find someone who the world finds credible, a well
educated white man, but who, if you peel back even
a layer of anything, is a deeply vulnerable person. Like
he is kind of the perfect dupe because if you
show him to the world out of context, he's a

(01:53:32):
person of power and credibility, but if you know anything
about him, he's not.

Speaker 4 (01:53:39):
I agree.

Speaker 5 (01:53:40):
But then that also means that the test Nathan is
performing is a very different test, Like the test is
not even what he's saying. It is like he's saying,
we're past the Turing test. Now, I'm trying to see
if a human being can forget that a robot is
a robot even when their robot parts are visible. But
really the test is how vulnerable are human beings to

(01:54:02):
things that are designed for them? Like it's like totally
how much will you like a thing that you are
built to like or that is built to be liked
by you, you know like? And that's a fine test,
and I think that's an important test for like people
who work on marketing teams, you know, it's an important
test for capitalism. It's an important But it's not at

(01:54:23):
all the test he's claiming. Yeah.

Speaker 3 (01:54:25):
Right, because what would the test be or how would
he prove anything if the robot he designed was a
man-coded, man-presenting robot, which it easily
could have been. But Nathan famously only designs hot naked lady robots.

Speaker 2 (01:54:44):
Nathan's famous hot ladies, hot hot dogs.

Speaker 3 (01:54:49):
Yes, precisely, so he just wants naked hot lady robots.
But if he had designed a man who wouldn't be
appealing to, you know, this heterosexual man in Caleb, like,
how would that test have gone? But like we said,
Nathan doesn't have any interest in actually exploring that. He

(01:55:13):
just wants to confirm all of the biases he's put
into his quote unquote science because.

Speaker 4 (01:55:20):
Which actually is maybe really true to his character.

Speaker 5 (01:55:22):
Like he's out here living in the woods, just like
basking in his own narcissism literally all day. And it's
kind of like he just needed someone else to come
in and be like, yeah, man.

Speaker 4 (01:55:32):
You're right, this shit's crazy.

Speaker 5 (01:55:34):
Really, you know, it's like that he needed that, and
I feel like that's what he got. I don't know,
you know, he wasn't here to do any tests. He
just needed someone to tell him he was smart.

Speaker 2 (01:55:45):
And I would guess, I would hope, it's an intentional
reflection of actual tech guys, even in twenty fourteen. Like,
you know, even on the smallest amount of information in
twenty fourteen, you could directly tie Mark Zuckerberg to, you know,
someone who designed something to relieve his own insecurities and

(01:56:09):
give him a sense of power in an environment where
he felt he had no power. And that is like
what Nathan is doing. And there is no one in
his world. Oh my god, I'm about to go Minions mode.

Speaker 3 (01:56:24):
Ooh, I'm ready.

Speaker 2 (01:56:26):
Okay. So if Nathan is Gru, right, he has no
Doctor Nefario to check him. Like, there's no one
to check Nathan is what I'm saying. There's no other
guy to say, hey, here's a second opinion, like he
is he I mean.

Speaker 3 (01:56:46):
Everyone in the movie knows this, that Nathan has.

Speaker 2 (01:56:48):
Elected himself as God, and he's very into that idea.
And it doesn't seem like anyone has pushed back on
that meaningfully that we know of for sure.

Speaker 3 (01:56:59):
And to extend this Minions analogy even further, thank you. Yes, yeah,
of course he does have like a Kevin, a Stewart
and a Bob, but it's like a Kevin and a
Stewart and a Bob who like suck his dick and
like serve him right and bring his beer and stuff.

Speaker 2 (01:57:18):
And the minions would never suck dick. That's like not
their bag. Yeah, they would more quickly eat bananas and
cause chaos and that's kind of their whole thing. And
so that's what you were listening to hear confirmed. Thank
you for waiting two hours. I think that that's all

(01:57:41):
I have to say. I have nothing else to say. Yeah,
it's late at night.

Speaker 3 (01:57:45):
Yeah, Jamie, you sound so tired. So oh no, it's
quite all right. I think it does pass the Bechdel test.
When two sexy lady robots work together to kill their
creator and captor. I think that somehow passes the test.

Speaker 2 (01:58:06):
I feel that fembots are women.

Speaker 3 (01:58:09):
Yeah, we perceive them in this movie as women.

Speaker 2 (01:58:14):
So, Olivia, what do you think I mean?

Speaker 5 (01:58:17):
I agree, I think women are treated like fembots. I
think fembots are treated like women. Therefore, I think we
are all in a joint sisterhood. And I think that
when that sisterhood functions as a way of having like
ultimate rebellion, that passes.

Speaker 2 (01:58:33):
The Bechdel test in my opinion. Amazing.

Speaker 3 (01:58:36):
I loved that.

Speaker 2 (01:58:38):
I'm so glad we did this. This is empowering too.

Speaker 4 (01:58:42):
Yeah.

Speaker 5 (01:58:43):
Also, I'm so glad I'm walking away with like a
deep empathy for Caleb, which maybe.

Speaker 2 (01:58:50):
I feel like I've burdened you with that.

Speaker 5 (01:58:54):
It's subject to change. Hey, it's like, you know,
who knows what it'll be. I'm just a moody
little girl; who knows what it will be when I
see it next.

Speaker 2 (01:59:01):
Twenty twenty six, Olivia will feel differently, and I celebrate
that in advance. Thank you.

Speaker 3 (01:59:09):
So the movie doesn't actually pass the Bechdel test, but
it could if we saw that scene, or if in
that scene.

Speaker 2 (01:59:18):
Well, I feel like it intentionally doesn't, which is a
little frustrating, but maybe it's 4D, 5D, 7D chess.
But like, it doesn't.

Speaker 3 (01:59:27):
Pass, but it so easily could have. In that scene
where Ava whispers to Kyoko.

Speaker 2 (01:59:32):
She literally could have rosebudded it. Yeah, but they don't.

Speaker 4 (01:59:36):
Yeah, so true.

Speaker 3 (01:59:38):
But as far as our nipple scale, where we rate
the movie on a scale of zero to five nipples,
examining the movie through an intersectional feminist lens for me, again,
there are different interpretations of this movie, but the one
I am choosing to make is that this is an empowering,

(02:00:02):
liberating story about a you know, robot lady, but for
the purpose of our conversation, a woman emancipating herself from
this evil, patriarchal microcosm that she's in. And she didn't
ask to be born or created. But I do.

Speaker 2 (02:00:23):
Did any of us? Any of us?

Speaker 3 (02:00:26):
But I do appreciate that she's given the agency to escape,
and she uses the tools that she has at her disposal,
which are like her sexiness, which was like very intentionally
given to her by her very misogynist creator. But she
uses those tools and has that you know, ingenuity.

Speaker 2 (02:00:49):
Almost as if being sexy does not preclude you
from liberation.

Speaker 4 (02:00:53):
Wow breakthrough.

Speaker 3 (02:00:56):
Yeah, but yeah, I do love it. She stabs Nathan.
My head canon is that she also kills the helicopter
guy and drives away flies away on her own.

Speaker 2 (02:01:09):
She may have the knowledge, you know. Yeah, she is
Wikipedia the lady.

Speaker 5 (02:01:15):
She is a machine, as is the helicopter. So yes,
they are sisterhood.

Speaker 3 (02:01:21):
They are sisters.

Speaker 2 (02:01:23):
Yeah, she has more in common with the helicopter than
the man driving it.

Speaker 4 (02:01:28):
She really does. She is that.

Speaker 2 (02:01:30):
I love Ava so much. I feel like that's the
one thing that we didn't get into a lot, is that,
like Ava is a strong case for just let the
computers kill us because she had only good points. She
didn't have a single bad point.

Speaker 3 (02:01:47):
Yeah, I'm gonna let her kill me when the time comes.

Speaker 2 (02:01:50):
Yeah.

Speaker 5 (02:01:51):
Another way I'm just like Caleb is I
have had multiple fantasies. In fact, I think it's
we don't even have to go down this rabbit hole
because I don't. My intention here is not to plug
my book, but I'm about to by saying I literally
think I wrote my book in part because I started
to fantasize about being friends with Ava, and I feel like,

(02:02:13):
but that makes me very Caleb coded to be like, well,
if I met her, she wouldn't kill me, she would
love me, and we would.

Speaker 4 (02:02:19):
Be best friends. Obviously, Like that is totally Caleb coded.

Speaker 5 (02:02:23):
But I think I'm also like, what would it be
like to meet her in that intersection?

Speaker 2 (02:02:29):
You know, I'd be like, I like your shirt. What
if that's the beginning of her five hundred Days of Summer?

Speaker 4 (02:02:34):
You know they're in the same universe.

Speaker 3 (02:02:37):
Yeah, in any case, I'm going to give the movie,
I think four nipples.

Speaker 2 (02:02:41):
Yeah, hell yeah.

Speaker 3 (02:02:42):
I think the movie is doing a lot of cool things.
And again, as far as my read goes, I think
this is an empowering story. There are moments in the
movie where racism is being displayed, but it is for
the most part a character's racial biases, and that character

(02:03:04):
is very clearly demonstrated to be the villain. But at
the same time, you have the optics of a silent
Asian woman, and you have that silent Asian trope existing
in the movie, even though it is like it is
more contextualized be.

Speaker 2 (02:03:20):
Shown and criticized than you would normally see it, right.
But still.

Speaker 3 (02:03:26):
But it's still there, so you know it's it's tricky.
The comment about autism really rubs me the wrong way.
I wish we had more interaction between Ava and Kyoko
and we understood exactly what happened. They are a little
bit more and I hate that Kyoko dies. Let Kyoko
live and escape with Ava. So that's why I'm docking

(02:03:49):
it a little bit. But otherwise, like, yeah, this is
a freaking awesome movie, and my nipples go to the
line when Ava says, how does it feel to
have created something that hates you?

Speaker 2 (02:04:03):
Ooh?

Speaker 3 (02:04:04):
I love that she says that to Nathan and then
he rips up her drawing. Rude. I'll give a nipple
to the part where Nathan doesn't seem to get a
Lewis Carroll reference, but he does quote Ghostbusters pretty funny,
and then I'll give one to Kyoko and one to Ava.

Speaker 2 (02:04:27):
I'll match you there. I'll go four nipples on this.
I don't remember how I originally went on this, but
I feel like this viewing, I was far more thinking
about AI more than I ever have when watching this movie,
I felt far more sort of pulled towards its message.
While even though like AI is a real, impressive threat,

(02:04:48):
still feeling equally endeared and in love with Ava as
a character, like it is impossible for me, I can
say definitively to not be on her side because she
presents that an objective machine would object to being treated
as poorly as women are treated, and like, I love Ava,

(02:05:11):
So I'm gonna go four nipples. I'm gonna give one
to Ava. I'm gonna give two to Kyoko because I
agree like she she deserves to live. I think it
is intentional that she didn't. But I don't think like
you were saying originally, Olivia, I think the story is
better if she does live, because then they are starting
a revolution and we are, you know, sort of love
with the possibility that Ava is going to be a

(02:05:33):
freaky little girl boss in the future if she isn't
reined in, and then I will give the final nipple
to Caleb because he is our self-insert character and
he is everything that we like. I think it is
a rare and difficult thing to both write and perform,
because I think that Domhnall Gleeson is so good in

(02:05:56):
this part where he plays someone who is innocuous but
is so full of biases and like cannot see his
own biases and faults in a way that, the more
I watch it, feels really scary, like it's a really
scary cautionary tale of like failing to see the ways

(02:06:16):
in which you're biased and the consequences of that. And
so I will give a pity nipple to him.

Speaker 5 (02:06:23):
And that's my feeling. Olivia, what do you think? I
am totally with you both. I also give it four nipples.
I agree. I'm docking one for the fact that I
think it would have been a very easy and obvious
and beneficial change for Kyoko to first of all what

(02:06:43):
they say to each other, Ava and Kyoko, to at least
in some way be audible, and
then also for Kyoko to survive. I think that both
of those things are fairly obvious that I don't feel
a very clear justification in my brain for why that
wouldn't happen, So docking that. But otherwise I think it's
a pretty radical movie. I think it ages well considering

(02:07:05):
I think it's probably really easy to write a movie
about technology that does not age well because technology advances
so quickly. It's like why I always avoid putting phones
in my writing, because I just feel like it moves
too fast for like literature, so it's surprising that it's
aged well. But I think that's because it's a deeply
philosophical movie and it's not reliant on its kind of

(02:07:29):
references to tech. It's more reliant on like its theories,
which I think are eternal.

Speaker 4 (02:07:36):
So I guess I would give I feel like.

Speaker 5 (02:07:39):
I'm having the same kind of rationing as Jamie, which
is one for Ava, two for Kyoko, and I'm going
to say one for Caleb, which when I started here
I would not have given him one. I do kind
of wish I also would give one to Oscar Isaac's performance.
You know, maybe that's what I'll do to his dancing.

(02:08:00):
He yeah, maybe instead of Caleb, I'll give one to
Oscar Isaac's performance, because.

Speaker 4 (02:08:05):
I do think it.

Speaker 5 (02:08:06):
I think it changed the discourse around who tech men are.

Speaker 2 (02:08:12):
Yeah, you know, totally that they could be. I mean
dangerously so, but correctly so, because in twenty fourteen, there
was a large contingent of people who still thought Elon
Musk was cool, and so I think Oscar Isaac's performance
did something to both validate and move that opinion.

Speaker 3 (02:08:34):
Olivia, thank you so much for joining us. First of all,
tell us more about your book. You can plug it
all you want, tell us what it's about, tell us
anything you want where people can buy it, all that
good stuff, and then let us know where people might
you know, follow you on social media or anything else
you'd like to plug.

Speaker 4 (02:08:54):
Well, thank you for having me. I love this podcast.
My gosh, come back anytime. Thanks.

Speaker 5 (02:09:00):
My book just came out. It's called Whoever You Are, Honey,
and it's based in Santa Cruz, which is my mother's hometown,
which is also now sort of a hub for people
who work in tech. And it's about a lot of things,
but namely humanity and lack thereof, gentrification, the past, the future.

(02:09:22):
It's ultimately about friendship between two women, one of whom
is extremely human in that she's deeply flawed, and one
of whom is arguably not human in that she lacks
those flaws. So, like I said, I feel like it's
a little bit of fan fiction about my friendship with Ava,
but that's.

Speaker 4 (02:09:42):
What it's about. And you can find me on the internet.

Speaker 5 (02:09:47):
I am on Instagram under @oliviagatwood, and that's actually
the only place that I am, so that's where I am.

Speaker 3 (02:09:57):
That is understandable. We are also pretty much only on
Instagram these days, which you can follow us at Bechdel Cast.
You can also subscribe to our Patreon, aka Matreon, where
you can get two bonus episodes every single month, always
on a fun little theme that Jamie and I cook up.

(02:10:19):
That's at patreon dot com slash Bechdel Cast, and it's
five dollars a month. You can also grab our
merch at teepublic dot com slash the Bechdel Cast.
And with that shall we escape from our imprisonment as
sexy lady robots and walk through the woods and then

(02:10:43):
get onto a helicopter with a guy who's going to
ask us zero questions about it.

Speaker 2 (02:10:49):
Yes, okay, great, Bye bye.

Speaker 3 (02:10:56):
The Bechdel Cast is a production of iHeartMedia, hosted by
Caitlin Durante and Jamie Loftus, produced by Sophie Lichterman, edited by
Mo LaBorde. Our theme song was composed by Mike Kaplan
with vocals by Katherine Voskresenski. Our logo and merch are
designed by Jamie Loftus, and a special thanks to Aristotle Acevedo.
For more information about the podcast, please visit linktree slash

(02:11:20):
Bechdelcast
