
April 30, 2024 47 mins

Even if you’ve never seen Lenna Forsén’s image, you know it, because an image of the Swedish former model went on to be one of the most important images in internet history.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this is There Are No Girls on the Internet. Welcome to There Are No Girls on the Internet, where we explore the intersection of technology, identity, and social media.

Speaker 2 (00:24):
Now.

Speaker 1 (00:24):
Even if you've never seen the iconic image of former model Lenna Forsén, if you've ever shared a meme on the Internet or texted a picture to a friend, in some ways you have her picture to thank for it. I joined my friends Samantha and Annie over at the podcast Stuff Mom Never Told You to dig into the Lenna image and why this month Lenna is finally retiring

(00:46):
as the First Lady of the Internet.

Speaker 3 (00:53):
Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. Today, we are once again thrilled to be joined by the fabulous, the fantastic Bridget Todd. Welcome, Bridget.

Speaker 2 (01:14):
Thank you so much for having me.

Speaker 1 (01:15):
It's always such a joy when I get to start
my week talking to you all.

Speaker 4 (01:20):
Yes, it feels like you have an extra glow. Maybe it's because you've been, like, soaking in all of the sun on the beautiful beaches abroad. I've been stalking you on Instagram and I'm like, how is this woman always traveling? And I miss it, and I'm sad that I'm not.

Speaker 2 (01:35):
I'm just kidding. Oh you all should have come.

Speaker 1 (01:37):
We actually, so, I was in Mazatlán, Mexico for the eclipse, to be in the path of totality. We actually, it was one of those trips where we'd invited all of our friends.

Speaker 2 (01:48):
We were like, we're gonna get a big house on
the beach. It's gonna be amazing.

Speaker 1 (01:50):
And then all of our friends were in, and then one by one by one by one, I'm.

Speaker 2 (01:55):
Just there alone, essentially. No, I enjoyed it.

Speaker 1 (02:01):
I enjoyed it. I love Mexico. It is one of my favorite places. It was my first time in Mazatlán. Ten out of ten, completely recommend it.

Speaker 3 (02:09):
Okay, did you see the eclipse in totality?

Speaker 2 (02:13):
I saw the eclipse in totality.

Speaker 1 (02:14):
It was my first time ever being in the path
of totality of an eclipse.

Speaker 2 (02:19):
Have either of you ever experienced this?

Speaker 3 (02:21):
No? Yes? Once?

Speaker 2 (02:23):
What were your thoughts? Annie, I am dying to know.

Speaker 3 (02:29):
Oh my gosh. If this was a different podcast, we
would go into a whole separate thing because I had
like a relationship issue that was happening on this day
and kind of a drama situation. So a lot of
times when I look back at it, a lot of
the pictures I took, I was like, oh, wow, we
were fighting, but it was also a work event that

(02:54):
I was at, so there was that layer. But it was beautiful. It was so cool. It sounds silly, but I love space, like, the stars are, like, my favorite thing. So it was really, really cool to see. It was not quite what I expected, because the glasses, Samantha and I were joking about this recently,

(03:16):
but the glasses feel so funny, because you're, like, looking around like they're not working, and then you find the spot you're supposed to look up at. I was just, like, very, very happy to see it. Honestly, all that drama I was dealing with aside, I remember thinking, this is really cool that I get to see this, and I'm really happy that I get to see this.

Speaker 2 (03:37):
Yes, that was what I remember too.

Speaker 1 (03:41):
I burst into tears, and the next day I woke up in the middle of the night in a panic, because I was worried that I would forget what it looked like being in totality like that. Like, I had never seen it, I'd never seen anything like it. Anybody who listens to There Are No Girls on the Internet is probably so sick of me talking

(04:01):
about this eclipse, and I am fully making seeing this total eclipse, like, my personality. But yeah, I'm already planning where I will go for the next one. So I guess I will see y'all in, I think, what is it, Spain?

Speaker 2 (04:18):
So that's the one.

Speaker 1 (04:20):
The next one that you can see, you can go to Spain and see it, I think, in twenty twenty six. Okay, but then the one that you're referring to, Sam, and don't quote me on any of this.

Speaker 2 (04:31):
But that's supposed to be like the big one, the
big one that we will.

Speaker 1 (04:34):
Probably be able to see in our lifetimes, and I
think it's in parts of the African continent.

Speaker 2 (04:41):
I want to say, Morocco. Don't quote me on that either.

Speaker 4 (04:44):
Yeah, they did say at one point, like, you would be able to view it in the US. That's the next time you'll be able to see it. I don't know if it's, like, the actual, like, the totality, as you say, but, like, I don't know, because I know nothing about this. The only date I knew of was for this past eclipse.

Speaker 1 (05:02):
Earlier this month, we had done so much planning, including, like, looking at farmers' almanacs to see what the weather and cloud coverage is like this time of year. And that's how we settled on Mazatlán, Mexico, because it was the place that is closest to us on the East Coast in the United States that was most likely to not have cloud coverage in April. Because you could see

(05:23):
it from Vermont and Upstate New York and Texas, but
a lot of those places in April might be cloudy.
And so I have friends who were in Vermont and
upstate New York who were like, Oh, We're just.

Speaker 2 (05:32):
Gonna see it from our house, and I'm like, oh,
will you.

Speaker 1 (05:35):
Then, on the morning of the eclipse in Mazatlán, Mexico, we'd
been there for a week. Every single day it's like
a beautiful, cloudless, blue sky day. So I wake up
on April eighth, the day of the eclipse, and it's cloudy,
the first cloudy day we've had in Mexico for the
entire week we have been there. Luckily, during the eclipse time,
the clouds did part, so we did get to see it.

(05:57):
But there would have been a lot of feelings had we not been able to see it.

Speaker 3 (06:01):
It's a lot of pressure to put on a trip
like that. Honestly, it's pressure to put on the eclipse.

Speaker 4 (06:07):
I mean, it exists.

Speaker 2 (06:08):
It's not their fault. So you can't contact, like, the manager of

Speaker 4 (06:15):
The sky to be like, actually, we didn't get a good view. And we were like, okay, for y'all who are Christians here, tell this to God. Truly, did we get canceled?

Speaker 1 (06:27):
We were like getting a little superstitious, like the things
that we were doing to try to like ensure good sky.

Speaker 2 (06:34):
It was getting a little, a little out there. We'll just leave it at that.

Speaker 4 (06:38):
She brought a shaman in, like, we're going in.

Speaker 3 (06:42):
Oh my gosh, Bridget, I want to ask so many questions.

Speaker 2 (06:45):
About this later.

Speaker 3 (06:50):
Oh well, I'm very glad that you got to see it. It is, it is amazing, like, truly. And over on the other podcast I'm on, Savor, we did an episode on, like, weird companies making money off of the eclipse with their products, and I have heard from so many people about the foods they made for the eclipse, and it's brought me so much joy. So oh, oh yeah,

(07:14):
like totali-tea, like.

Speaker 2 (07:16):
Tea, Oh yeah, that's good.

Speaker 3 (07:18):
Oh my gosh, so many things like this. So I feel like we have some years to brainstorm things like this.

Speaker 4 (07:26):
Next giant celebration, but keep that.

Speaker 3 (07:29):
In the back of your head, you know.

Speaker 1 (07:31):
Oh, next time, definitely doing an eclipse-themed food party or dinner party or something.

Speaker 2 (07:36):
I love that.

Speaker 3 (07:37):
Yes, there's so many puns. I will hold myself back for now, but I have to say I am very, very excited to talk about the topic you brought today, Bridget, because it is a thing that I love, like, the history of something I think a lot of people don't question the history of, and it's fascinating and I didn't know about it. So can

(07:59):
you tell us what we're discussing today?

Speaker 1 (08:01):
I feel the exact same way. And today we are
talking about the Lenna image.

Speaker 2 (08:06):
Is this something that either of you had ever heard of?

Speaker 1 (08:08):
I did not know, not known. So even if you're listening and you're like, what is the Lenna image?

Speaker 2 (08:14):
I've never heard of this image. I've never seen this image.
Even if you.

Speaker 1 (08:17):
Don't know the story and you don't feel like you've
ever seen this image before, you kind of do know
this image because, as Linda Kinsler puts it in a
really meaty piece for Wired that I'll be referencing a
few times in this conversation, she writes, whether or not
you know her face, you've used the technology it helps
create practically every photo you've ever taken, every website you've
ever visited, every meme you've ever shared. Owes some small

(08:41):
debt to Lenna. And it really is exactly as you were saying, Annie, one of those stories that is foundational to the Internet and technology that you don't necessarily think of, don't necessarily think of how it came to be. And especially, I think it's one of those stories that says a lot about technology. You know, here on Sminty, we've

(09:02):
had plenty of conversations about this. I've had many conversations about this on There Are No Girls on the Internet, about how things like misogyny can sort of be baked into the foundation of technology. And I think that is one of the reasons why tech is so often perpetuating misogyny, not because it's some sort of an unfortunate bug, but because this misogyny can be sort of

(09:24):
foundational in some ways. And I think this image really is a good example of what I mean. And I think, especially as we're having conversations about the rise of things like nudify apps and AI-generated adult content creators, we're seeing what is kind of becoming a marketplace that is men making money off of the bodies and/or labor

(09:46):
of women without their consent, certainly without their compensation. And I think this situation with the Lenna image, where this image of a woman went on to create this entire field of technology without her consent, perhaps really tells us something about where we're headed in twenty twenty four.

Speaker 3 (10:03):
Yes, absolutely, especially when you consider where it comes from, which I know we'll talk about. But also, yeah, these conversations we're having now about, like, actors perhaps not giving their consent to being used in certain ways. And honestly, it applies to all of us, if you've posted an image online, right, not consenting to an image getting used in a certain way.

(10:25):
But so much about this history is fascinating because it
feels so standardized, which is odd. Can you tell us
about that?

Speaker 2 (10:34):
Totally?

Speaker 1 (10:35):
So for folks who don't know, the Lenna image is literally an image of this woman, Lenna Forsén. She is a woman from Sweden who in the seventies was a model. So this kind of sensual image of her
a model. So this kind of sensual image of her
wearing a tan hat with a purple feather flowing down
her bare back, staring kind of seductively over one shoulder.

(10:55):
That image of her was published in Playboy in nineteen
seventy two.

Speaker 2 (10:59):
She was essentially a playmate.

Speaker 1 (11:01):
That image would go on to become what's called a
standard test image.

Speaker 2 (11:04):
So big caveat here. I am not an engineer.

Speaker 1 (11:09):
If I say something and you're an engineer listening and you're like, that's not totally correct, remember, I am not an engineer. But here is a definition of what a standard test image is that I found on Kaggle dot com, which is like a developer community site.
They say a standard test image is a digital image
file used across different institutions to test image processing and
image compression algorithms by using the same standard test images,

(11:32):
different labs are able to compare results both visually and quantitatively.
The images are in many cases chosen to represent natural
or typical images that a class of processing techniques would
need to deal with. Other test images are chosen because
they present a range of challenges to image reconstruction algorithms,
such as the reproduction of fine detail and textures, sharp

(11:52):
transitions and edges, and uniform regions. So basically, to put that in layman's terms, a standard test image is like a test image that tests to make sure that the technology is working as

Speaker 2 (12:04):
It should be, or like rendering the way that it
should be.

Speaker 1 (12:07):
Lenna's image is not the only common standard test image.
There's also one that is like a bunch of different
colored jelly beans on a table. There's another one that's
called peppers that's just a bunch of different colored like
red and green peppers, like jalapeño peppers. So this is
just a thing that becomes a way for technologists to
test that the image generating technology is working correctly.
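To make the "compare results both visually and quantitatively" idea from that Kaggle definition concrete, here is a minimal sketch, not from the episode: re-encode a shared test image as JPEG at a few quality levels and measure PSNR against the original, so two labs running the same test image get directly comparable numbers. The file name test_image.png and the quality levels are placeholder assumptions; Pillow and NumPy are assumed to be installed.

```python
# Minimal sketch: how a shared standard test image supports quantitative
# comparison of image compression. Assumes Pillow and NumPy are installed;
# "test_image.png" is a placeholder for whichever shared test image a lab
# uses (peppers, jelly beans, and so on).
import io

import numpy as np
from PIL import Image


def psnr(reference: np.ndarray, degraded: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit images; higher means closer."""
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)


original = Image.open("test_image.png").convert("RGB")
original_pixels = np.asarray(original)

# Re-encode the same image at several JPEG quality levels; any lab running
# this on the same test image gets the same numbers, so results compare.
for quality in (90, 50, 10):
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    degraded_pixels = np.asarray(Image.open(buffer).convert("RGB"))
    print(f"JPEG quality {quality}: PSNR = {psnr(original_pixels, degraded_pixels):.2f} dB")
```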

Speaker 3 (12:29):
I do think this is very interesting for a lot
of reasons. But if you have like jelly beans and peppers,
those are things to be consumed, and then when you're
thinking about where they got this image from, Lenna, like,
how did this happen? How did this image become this
standard testing thing?

Speaker 2 (12:50):
So this is actually a pretty interesting story.

Speaker 1 (12:52):
The story of how Lenna's Playboy picture becomes this standard test image that is everywhere and very ubiquitous starts with an electrical and computer engineer, Alexander Sawchuk. According to the newsletter of the Institute of Electrical and Electronics Engineers, or the I triple E, as I have found out it's sometimes called.

Speaker 2 (13:10):
I was talking to somebody about this and I was like, oh, the I E E E.

Speaker 1 (13:15):
And they were like, it's just the I triple E.

Speaker 3 (13:18):
I want to say, I E E E.

Speaker 5 (13:22):
Actually, the I triple E, exactly.

Speaker 2 (13:26):
So it's the summer of nineteen seventy three.

Speaker 1 (13:28):
Alexander Sawchuk was an assistant professor of electrical engineering at the University of Southern California, working alongside a grad student and the SIPI lab manager. As the story goes, he's, like, frantically searching around the lab for a good image to scan for a colleague's conference paper. He had just sort of gotten bored with their usual stock test images, because they mostly had come from, like, nineteen sixties TV

(13:50):
standards and were just a little bit dull. He wanted something glossy and sort of, like, fresh and dynamic, but he also wanted to use a human face specifically. Just then, as the story goes, somebody happens to walk in holding the most recent issue of Playboy magazine.

Speaker 2 (14:06):
Why this person was.

Speaker 1 (14:07):
Bringing Playboy magazine into his workplace, I cannot tell you.

Speaker 4 (14:13):
Just coming into your institute? Okay, cool.

Speaker 1 (14:15):
Yeah, Like, I mean, I do think that that sort
of like gives you a sense of like the dynamics
that we're dealing with, right, that somebody just happens to
walk in with the most recent.

Speaker 2 (14:25):
Playboy under their arm.

Speaker 1 (14:27):
Right. The engineers tore away the top third of the centerfold so they could wrap it around the drum of their Muirhead wirephoto scanner, which they had outfitted with analog-to-digital converters, one for each of the red, green, and blue channels, and an HP twenty one hundred minicomputer.
So all of that to say is that they effectively

(14:47):
cropped this image so that you can't see the model's bare body in the image. So it's just a picture of her from the shoulders up, looking over her shoulder. It's still, like, quite a seductive photo, but the full photo has her, like, bare booty in it.

Speaker 2 (15:01):
She's wearing, I think, like.

Speaker 1 (15:02):
A feather boa and, like, thigh-high stockings, looking over her shoulder. So back in the seventies and eighties, this
image was really sort of like used in very limited cases.
You could only really see it on dot org domains.
It was pretty limited to like engineers. Then in July
of nineteen ninety one, the image was featured on the
cover of Optical Engineering, alongside that other test image of

(15:25):
the different colored peppers. Funny enough, I took a look
at that cover. It's all black and white. So I'm like, oh,
I think they're trying to demonstrate that, like these images
had all these different dynamic colors, but both of them
are rendered in black and white, kind of rendering that meaningless.
So this is when Playboy gets wind of this, and
they are not happy because it's basically copyright infringement, which

(15:48):
this is not related to the story, but I always
have to add this whenever it comes up: how litigious
Hugh Hefner and Playboy were. I always think this is
very rich because, as y'all probably know, Hugh Hefner made
an entire lucrative industry off of images of Marilyn Monroe
that she did for a calendar company, for which she
was only paid fifty dollars. Many years after that photo shoot,

(16:12):
Hugh Hefner bought those photographs from the calendar company and republished them without Marilyn Monroe's consent or permission in nineteen fifty three.
That was the first ever issue of Playboy. Hugh Hefner
paid five hundred dollars. She got fifty dollars. Right, So,
whenever I read about how litigious Playboy is, which they're
very litigious, I always have to chuckle at that. Oh, like,

(16:32):
you don't want somebody profiting off of your intellectual property,
but had no problem profiting off of a woman's body
without compensating.

Speaker 2 (16:42):
Her fairly, or even getting her consent.

Speaker 4 (16:44):
Interesting, this is like par for the course for him.

Speaker 2 (16:47):
Oh my goodness, don't even get me started with Hugh Hefner.
We will be here all day.

Speaker 4 (16:54):
The things that came out after he died, which I'm like, Wow,
he had a pretty good, like powerful handle on people
not talking until he died.

Speaker 2 (17:03):
Oh my gosh.

Speaker 1 (17:03):
I was listening to an episode of Celebrity Memoir Book Club where they read a lot of ex-Playmate and ex-Playboy Bunny memoirs. Some of the things that they
write about, I'm like, oh my god. Like even even
Lenna in an interview, she talks about how in the seventies,
after this photo shoot, she was invited to go to
the Playboy mansion and the quote is something like, they

(17:26):
made it clear in the invites that I would have
to spend time with Hugh Hefner while he was in
his dressing robe, and I said, no, thanks, I mean.

Speaker 5 (17:37):
She already knew. She was like, yeah, yeah.

Speaker 1 (17:52):
So Playboy threatens to sue these engineers, and at this point the engineers, it sounds like, had grown so fond of using this image that they fought back. Eventually Playboy backed down because, as a Playboy vice president put it, quote, we decided we should exploit this because it is a phenomenon. So yeah, by his own words, like, oh, let's exploit this.

Speaker 2 (18:13):
Yeah.

Speaker 1 (18:13):
No, talk about the fact that this is two groups of men fighting over who owns this image of a woman, in one case being used in a manner that is completely without her consent or control. It just, it,

Speaker 2 (18:26):
Already from the beginning.

Speaker 1 (18:27):
It just feels to me like men fighting over how
they can use a woman's representation that I think is
so foundational to some of the conversations we're having about
technology like AI right here in twenty twenty.

Speaker 3 (18:39):
Absolutely. And she did become pretty foundational, right?

Speaker 1 (18:46):
Oh, absolutely. So this is when the image of Lenna really becomes super popular. The whole drama about the cover catapults this image into what you might think of as, like, early Internet virality or popularity. This was in nineteen eighty five. The use of the photo in electronic imaging has been described as clearly one of the most important events in its history. It is truly hard to overstate how

(19:08):
ubiquitous this one image is in technology. There is this
fascinating interactive piece by Jennifer Ding at the Pudding. The
piece is so cool. It's like one of those interactive
pieces that has a timeline.

Speaker 2 (19:19):
Definitely check it out.

Speaker 1 (19:20):
But in that piece, Ding actually includes a freeze frame
of the show Silicon Valley on HBO, where in the
background there is a poster with the Lenna image on
the wall. Right, So this image is also included in
scientific journals just all over the place. Ding found that
within the dot edu world, so like websites related to education,

(19:42):
the Lenna image continues to appear in homework questions, class slides, and to be hosted on educational and research sites, ensuring that it will be passed down to new generations of engineers. So this became so popular that Lenna herself is often called the First Lady of the Internet.

Speaker 3 (19:59):
Wow. I kind of love her taking that picture having no idea that this is what would happen. Which, yeah, I mean, I guess that speaks to the next question: why did this image take off the way that it did?

Speaker 2 (20:12):
Well, if you asked David C.

Speaker 1 (20:13):
Munson, who is the editor in chief of the i
E or the Tripoli Transactions on image processing, he said
that the image happened to meet all of these requirements
for a good test image because of its detail, it's
flat regions, shading and texture. But even he will not
leave out the obvious fact that it's also a picture

(20:34):
of, like, a seductive, sexy young woman. Duh, right? Like, that's definitely part of it. He says: the Lenna image is a picture of an attractive woman. It is not surprising to me that the mostly male image processing research community gravitated toward an image that they found attractive. And so
I do think there's something about these highly male dominated

(20:56):
spaces where it's not just that there's a lot of men, it's, like, their worldviews, their interests, their perspectives, their biases that are really taking up a lot of space in these spaces. I just think that men feel like these spaces are theirs, and that they are free to decorate those spaces with the pretty women that they

(21:17):
feel like they should be able to use without their consent or compensation. I just think, like Annie, you mentioned earlier that the other test images are these things that you consume, right, like peppers or jelly beans. There's another famous one of a baboon that has different colors on its face.

Speaker 2 (21:34):
It's interesting to me that it's these.

Speaker 1 (21:35):
Things that are not human, things that are like animal
or that you consume that like throwing a sexy young
woman into that mix. I don't think maybe seem like
a huge departure for these guys.

Speaker 3 (21:46):
Yeah, and again, when we think about things in the
realm of AI or even I know I've complained about
this many times. But in the worlds of fandom or gaming, it's like, you can come into our
world on our terms and you wear what we want
you to wear. You are here because we let you
be here in this male dominated space, but you're gonna

(22:07):
do what we want. It's not up to you, and that's the only way that you can be in this world. But that being said, there has been some pushback recently, ish, right, Bridget?

Speaker 2 (22:23):
Yeah, So one thing about what you just said.

Speaker 1 (22:26):
When I was researching for this episode, some of the different engineers who had contributed to this image's popularity were quoted, when they actually met the actual, real Lenna at a conference that she was invited to, as saying things like, I can't believe she's a real person.

Speaker 2 (22:42):
And part of me was like, you didn't even see
her as a real human.

Speaker 1 (22:45):
They just saw her as something that they had gotten, an image, a picture that they had been consuming for decades. And they had so removed her from being a real, breathing human that meeting her in real life was like, they were.

Speaker 2 (22:58):
Surprised that she was real.

Speaker 1 (23:00):
I think that really speaks to the sort of fandom
element that you were talking about.

Speaker 2 (23:03):
This idea that like.

Speaker 1 (23:04):
You can come in if you are a fantasy, and in some ways not even a real human.

Speaker 2 (23:11):
You know what I'm saying. You're like, do I ever?

Speaker 3 (23:16):
Yeah, like, don't say anything that I don't like, like,
keep quiet and look the way I like, then you
can be here. But oh you're a real person. Oh no,
I don't want to hear it at all.

Speaker 1 (23:29):
Yeah, so you're exactly right, Annie. All of this happened,
but it was not without pushback. Around like the twenty tens,
people started publicly asking whether or not this image of
a woman from Playboy should be so foundational to technology,
especially in education settings, you know, given conversations about the

(23:49):
need for more women in these spaces and how to
make these spaces more inclusive and more diverse.

Speaker 2 (23:54):
That's really around when you start.

Speaker 1 (23:55):
Hearing, like, people in public being like, wait a minute, maybe this isn't so cool. In twenty fifteen, Maddie Zug, who was then a student at the Thomas Jefferson High School for Science and Technology right here in the DC area, and who I should say now is a product safety engineer at Apple who focuses on preventing tech-enabled abuse and stalking and harassment on Apple platforms, so, like, go Maddie. Maddie sounds like she was cool in high school and

(24:17):
is cool now. So Maddie wrote this op-ed basically asking the question of, like, should I, as a high school student at a STEM high school, be given an image from Playboy as part of my education in technology and STEM? She writes: I first saw a picture of Playboy magazine's Miss November nineteen seventy two a year ago.

(24:39):
As a junior at TJ, my artificial intelligence teacher told
our class to search Google for Lena Söderberg, not the
full image, though, and use her picture to test our
latest coding assignment. At the time, I was sixteen and
struggling to believe that I belonged in a male dominated
computer science class. I tried to tune out the boy's
sexual comments. Why is an advanced science, Technology, Engineering, and

(25:01):
Mathematics school using a Playboy centerfold in its classrooms? Her
piece ends by saying it's time for TJ to say hello to inclusive computer science education and say goodbye to Lenna.
So Maddie was not the only person who was like,
maybe this image shouldn't be the thing that all of
our education is centered around. In that piece for Wired I mentioned, they talked to several women in technology who

(25:23):
had very similar stories. This one is actually pretty funny.
Deanna Needell, a math professor at UCLA, had similar memories
from college. So in twenty thirteen, she and a colleague
staged a quiet protest. They acquired the rights to a
headshot of the male model Fabio Lanzoni and used that
for their imaging research. So they kind of like turned

(25:43):
it around, like, oh, you're gonna use a sexy woman, Well,
we'll use a sexy man.

Speaker 2 (25:47):
What do you think about that?

Speaker 3 (25:50):
I love it.

Speaker 1 (25:51):
So in that piece, they actually track down and speak
to the real Lenna, who also called for her image
to be retired. She says, I retired from modeling a
long time ago. It is time I retired from tech, too.
We can make a simple change today that creates a
lasting change for tomorrow.

Speaker 2 (26:08):
Let's commit to losing me.

Speaker 1 (26:11):
And there's actually some news on that front, because as
of April first of this year, the I triple E officially retired the use of the Lenna image and announced they will no longer be using that image in their publications.
Ars Technica points out that this is kind of a
really big deal that will likely have a ripple effect
in the space. Because the journal has been so historically
important for computer imaging development, it'll likely set a precedent

(26:35):
removing this image from common use. In an email, a
spokesperson for the I Triple E recommended wider sensitivity about
the issue, writing: In order to raise awareness of and increase author compliance with this new policy, program committee members and reviewers should look for inclusion of this image, and if present, should ask authors to replace the Lenna image with an alternative.

Speaker 3 (26:56):
Yeah, I love that from Lenna herself, like, let's commit to losing me. That's such a great line. It speaks volumes, as you've been saying, Bridget, to our attitude towards women on the Internet and towards consent on the Internet. And so when we're thinking

(27:17):
about this, which was foundational, what do you think about the legacy of this image?

Speaker 1 (27:27):
Yeah, I love that question. You know, when I was reading about how this image came to be, I was imagining a very different time, right? It's the seventies. People aren't
necessarily having a lot of public, loud conversations about the
power dynamics of who's in the room and who's not
in the room.

Speaker 2 (27:44):
Where a lot of this technology is getting built.

Speaker 1 (27:46):
And it really made me think of, like, Wow, the seventies,
that probably was such a different time. But here in
twenty twenty four, we are having those conversations. Loud voices are publicly having those conversations. There are women and people
of color, and trans folks and queer folks and all
kinds of folks who are building and making the technology
that shapes our world today. And so in twenty twenty four,

(28:08):
it almost feels like we are pretending that we're still in this nineteen seventies, we-didn't-really-know, who-could-have-foreseen-it world, when in fact we're not really
in that world. People are asking the questions, people are
raising the alarm, and I guess I don't think it
should be several decades after AI technology becomes ubiquitous for

(28:30):
people to start asking the question about how traditionally marginalized
people like women are being used and represented and perhaps
exploited without their consent in these spaces. I think it
provides a really interesting precedent for what's going on here
in twenty twenty four and.

Speaker 2 (28:45):
Jennifer Ding put it really well.

Speaker 1 (28:47):
She writes: to me, the crux of the Lena story is how little power we have over our data and

Speaker 2 (28:53):
How it is used and abused.

Speaker 1 (28:55):
That threat seems disproportionately higher for women, who are overrepresented
in Internet content but underrepresented in Internet company leadership and
decision making. Given this reality, engineering and product decisions will
continue to consciously and unconsciously exclude our needs and concerns.

Speaker 2 (29:12):
Right, And so I really agree with that.

Speaker 1 (29:14):
That this Lenna story really is a story about power dynamics and who is represented in technology and who just sort of has their needs exploited or erased.

Speaker 2 (29:24):
Right, Like.

Speaker 1 (29:26):
Men wanting to consume the bodies of women is like
foundational to the Internet. It's like why we have the
Internet the way that we have it. And I think
we know that now. It's like an objective fact about
the Internet and technology. I don't think we can keep making technology and not be honest about that, because if

(29:47):
we're not being honest about that, we can never fix that,
we can never question that, we can never have that
be a dynamic that we stop perpetuating with technology.

Speaker 3 (29:56):
Yeah, and I think, going back to the point about being in a classroom setting and being shown explicitly, like, this is how women are viewed in this space. That this is what built a lot of what we use today, and that we're still talking about it, is telling in itself. And especially when we're seeing that perpetuated in all of these tech spaces, where it still feels in a lot

(30:19):
of ways, even though women and marginalized people have built those spaces, like, you're the guest here, and you're only here because we're opening our gates a little bit to let you in, but otherwise, yes, get out.

Speaker 1 (30:35):
And I just think that's a dynamic we need to
be questioning in twenty twenty four. And I think, like, something about the use of this image, its ubiquity in education spaces, I find so telling. But also even
if you're not studying to be an engineer or something,
I think there is a dynamic that says that if
you are a person who is traditionally marginalized, you're not

(30:57):
a decision maker, you're not a power holder, you're not
doing or making anything that anybody needs to care about.
And the entire dynamic is that we use you in fact, right,
So Ding actually points this out on her piece, she says,
while social norms are changing toward non consensual data collection
and data exploitation, digital norms seem to be moving in
the opposite direction. Advancements in machine learning algorithms and data

(31:19):
storage capabilities are only making data misuse easier, whether the
outcome is revenge porn, targeted ads, surveillance, or discriminatory AI.
If we want a world where our data can retire
when it's outlived its time or when it's directly harming
our lives, we must create the tools and policies that
empower data subjects to have a say in what happens

(31:39):
to their data, including allowing their data to die. And
so I think, even if you're not somebody who is a techie, this should concern you, this dynamic that just
says we consume, we exploit, we make money from you,
and you don't get to have a.

Speaker 2 (31:53):
Say about it.

Speaker 1 (31:54):
That's the dynamic that I think this Lenna image really
did usher in without really even necessarily meaning to.

Speaker 4 (32:10):
I think there's a big conversation here on like the
power of capitalism within the tech industry and what makes money.
I can't help but think, like, with the Lenna image, the fact that this toxicity was used to make more profit and more power within this industry. It took forty to fifty years for us to even have a conversation

(32:31):
about, like, let's change it, let's retire it. But the fact that it took that much pushback, because they didn't care enough and they wanted to build on this toxicity because they knew it could make money, is the most concerning thing to me. And then the powers that be saying, yeah, yeah, we're definitely going to control this, and then just going after an app instead of the root of the problem, seems like the biggest

(32:51):
part of the conversation because even in the AI world,
with new apps coming through, new programs coming through, and
they're all competing with each other, they don't want to
let go of the toxicity. But that's what's making the money,
which is really really concerning.

Speaker 1 (33:06):
Yeah. And I mean, if there was one 'so what' of why I wanted to have this conversation, Sam, that is exactly it: it is about money. It is about capitalism. It is about making money off of
people's own exploitation and selling that exploitation back to them
to make more money. And it's just a really toxic
dynamic that I believe is harming us and making the

(33:30):
people who have created that dynamic rich all the while
they get to be like, Oh, it's not a big
deal for you.

Speaker 2 (33:36):
Actually, this is going to be really good for you.
This is going to be convenient for you.

Speaker 1 (33:39):
And I don't know. Like, I woke up this morning when I was trying to decide what I wanted to talk to you all about today, and one of the ideas that I had that I scrapped was just this feeling that being on the internet just doesn't feel fun anymore. Anytime I go on a website, anytime I google something just to find out information, it feels like a scam. It feels like exploitation. I

(34:02):
feel like I am one click away from somebody getting.

Speaker 2 (34:05):
My Social Security number.

Speaker 1 (34:06):
It feels like AI generated garbage. And I just think
we have hit the wall of that feeling. I can't
imagine that I'm alone in this. I think the feeling
of showing up online today in twenty twenty four
feels exhausting, and I think part of it is because
it feels like we are being bled dry by people

(34:27):
that we have already made rich from our own exploitation.

Speaker 2 (34:31):
Do you all feel that way?

Speaker 4 (34:32):
Oh, absolutely. I think, because getting on TikTok, the first opening video, I'm sure you've seen it, is that content manager who's like, I'm here for the safety of TikTok.
Have you seen this?

Speaker 2 (34:42):
I have not.

Speaker 4 (34:43):
She's been there. She is for safety in something, like, she has a very specific safety title. Yes, Susie someone. She is very white and she's very redheaded. I was like, okay, so we've played into the xenophobia. She's like, look, I'm a white person. I'm gonna help you out here.

Speaker 2 (34:58):
Don't worry about it, don't worry.

Speaker 4 (34:59):
But that's the first thing that I'm seeing. So, like, you know, urging TikTok users to talk to the government because they voted this in and this is real bad and all this and whatnot. And I'm just like, all right, is it gonna go away next? This is now my attitude, because also I'm very tired. But also, I just got an email saying that AT&T, yes,

(35:20):
had a breach, that, oh, they have your stuff. But good news, since you don't have a bill with us, we don't.

Speaker 2 (35:26):
We didn't.

Speaker 4 (35:26):
You didn't get any pertinent information. But I literally think every month I have been getting an email that says some of my information has been breached, and
it's nothing that I have done. It is literally everything
from my insurance, my dental insurance, my healthcare provider, my internet,
which I'm like, what the hell, my phone subscription, my

(35:50):
cell phone. Which, I'm like, I'm starting to get back to that, I think I want a landline, I'm gonna have this moment, y'all. With each of those things popping up, I'm like, I have to use that information in order for me to have healthcare.
So y'all let my healthcare information go out, and they have my Social Security number. There's nothing I can do
about that. As many times as I can change my password,

(36:10):
the next email I'm getting is telling me that I've
got a data breach of my information. So what is
the point? Like, at this point, the only way is
to rewrite my identity and to never get online again,
which would be really hard for my job.

Speaker 1 (36:26):
Yes, Like, if you have a phone in your life,
if you vote, if you drive, these things that we
are required to do to participate in public life should
not just be avenues for somebody to make money and
scam us.

Speaker 2 (36:42):
But yet it feels that way.

Speaker 1 (36:44):
And you know what, Sam, I have actually not seen
the TikTok that you're referring to, because I have not
opened my TikTok app in days because it's starting to
feel like QVC.

Speaker 2 (36:54):
And I cannot take it anymore.

Speaker 1 (36:55):
Like, whatever happened to spaces on the Internet that were supposed to feel like safety or exploration or fun or community or connection? I hope that somebody out there listening is like, Bridget, you're old and unhip, we have those spaces, here's where they are, tell me about them. I want to know about them. But I think we really got to get back, like,

(37:17):
to those principles of the Internet feeling like something other
than being taken for a ride.

Speaker 2 (37:23):
On which you are the chump.

Speaker 4 (37:25):
Right. And I will say, a lot of people have felt like Discord and Reddit have been, like, a haven. But we already know Reddit has got its problems, and then I think there's a new lawsuit with Discord, with its problems and its terms of service changing as well.

Speaker 2 (37:38):
I'm like, what? It's totally happening.

Speaker 4 (37:40):
So there's literally no one protecting the individual, like, there's no protection for us at all. But they want us to stay, they want to take away things from us, which is like the least of our worries, or they're just like, sorry, you can't sue us.

Speaker 1 (37:57):
Yeah, I think everybody is feeling that, but I think it is particularly dangerous for people who are traditionally marginalized, because, yeah, it's just the expectation that,

Speaker 2 (38:08):
Oh, it's totally fine.

Speaker 1 (38:10):
If people make apps that non-consensually undress women using AI, why wouldn't they be able to advertise on Facebook or Instagram or Twitter?

Speaker 2 (38:20):
They got to make money. That's a business.

Speaker 1 (38:21):
Like, how easy it is to erase the human beings at the heart of this dynamic, erase their concerns, erase their needs, erase their harm, because men gotta make money off of it.

Speaker 2 (38:33):
Right, I'm sick of it, right?

Speaker 4 (38:35):
Or it's tradition, literally, like, yeah, this image has always been here, we need to teach it as historical now. It was definitely not exploiting somebody or taking advantage of somebody or using humiliating content, because she wasn't humiliated, I don't think. But, like, the idea of it being forever and ever and ever, of, like, your seductive

(38:55):
picture being used by IT people, which is a whole different conversation in this. Yeah, I.

Speaker 2 (39:01):
Mean so Lenna, the real life Lenna.

Speaker 1 (39:03):
And again there's a really interesting Wired article that has
an interview with her.

Speaker 2 (39:07):
She doesn't feel like she was exploited.

Speaker 1 (39:08):
She's actually really proud of that image, even as she
recognizes that it's like time for it to be retired. However,
she does wish that she had been fairly compensated for
what would go on to be her like non consensual
contributions to tech when she took that image. There's no way that, as a young Playboy Playmate in nineteen seventy one or whatever, you would have a

(39:29):
sense of, like, well, if this goes on to make me the First Lady of the Internet, I better have compensation and protections. No way, right? So in
that Wired piece they say it makes sense that she
would feel this way. Unlike so many women in tech,
Lenna has at least been acknowledged, even feted, for her contribution.
She did that work, and then people started using that
photo in this neat new way, and now she has

(39:51):
this kind of immortality woven into the design of the machine.
This is from Marie Hicks, a historian of technology and
the author of Programmed Inequality.

Speaker 2 (40:00):
All of this happened for a reason.

Speaker 1 (40:01):
Hicks writes, if they hadn't used a Playboy centerfold, they
almost certainly would have used another picture of a pretty
white woman. The Playboy thing gets our attention, but really
what it's about is this world building that's gone on
in computing from the beginning. It's about building worlds for
certain people and not for others.

Speaker 4 (40:17):
I find it interesting, too, that they invited her to the conference. Like, I'm wondering what the purpose was, because it obviously wasn't to ask her questions about tech and how she did this thing, because they did not even consider her human, as we know. It was just literally to ogle her in real life.

Speaker 2 (40:38):
Yeah, I was thinking about why they did that.

Speaker 1 (40:41):
I don't know. Part of me wonders if it
was like an attempt to be like, oh, we need
to acknowledge the way that this woman's image was so
foundational to our technology, but then like not really doing it,
like still sort of treating her as like a booth
babe or something like.

Speaker 4 (40:57):
I don't know, right. I just find all of that interesting, this level of, again, not what she was doing this for. She came in with whatever her ambitions were in being this model and whatnot, and then all of a sudden being told, you're being used as an example for computers, like, especially images

(41:20):
for computers. And not only will you see this, but your grandkids will also, like, if she has children, any of those things, and your family members, forever.

Speaker 1 (41:29):
And, like, who would have ever thought that that's how that image would go on to be used in history? And I really think this is where
we are today, and this is like why I wanted
to talk about this is that I think, like the idea,
the concept of images being shared online, the way we
understand that in twenty twenty four, the fact that this
image of Lenna became so foundational to that concept without

(41:51):
her consent, you know, perhaps without, like, proper compensation for the way that she actually was foundational to that, and building out this entire universe around it that is mostly controlled and protected and profited off of by men, and nobody stopping to ask about the ramifications of that until decades later.
I just think it really establishes like a concerning precedent

(42:14):
for where we're going right now with AI in twenty
twenty four.

Speaker 2 (42:17):
And it doesn't have to.

Speaker 1 (42:18):
We can learn from what we did with that Lenna image if we ask the right questions, if we center the right perspectives and the right voices. And so, yeah,
I don't want to wait until twenty forty to be like, oh,
should we have been talking about the ways that women
and girls and other marginalized people are being exploited and
used to make technology companies money.

Speaker 2 (42:39):
I don't want to ask that question when it's too.

Speaker 4 (42:40):
Late, Right, And here's like the big conversation is, shouldn't
we also recognize that big companies and big tech companies
and big companies that are developing are purposely leaving out
marginalized people because they like the old ways and that
it's only making a certain amount of people money.

Speaker 2 (42:58):
Yes, exactly. I would argue that's exactly what's going on. I mean, in twenty twenty four, there are so

Speaker 1 (43:03):
Many loud, thoughtful voices from women and people of color
who are really talking about AI in some interesting and
thoughtful ways. So they exist, they are out there. This is the tale as old as time when it comes to technology. It is not that they are not there. It is that they are being, whether intentionally or unintentionally, marginalized,

(43:23):
sidelined, silenced, pushed aside to make room for voices who are just repeating the status quo, who are just saying, like, well, I'm trying to get rich, so who cares how this harms somebody, who cares about whether or not this goes on to exploit. And I think it's a little bit of a complicated cultural dynamic

(43:45):
and cultural shift that I think.

Speaker 2 (43:46):
That we really got to break.

Speaker 3 (43:48):
Yeah. Yeah, And it's really sad, going back to your
point Bridget of like the Internet not being a place
of joy anymore, because so many times it was marginalized
people who made those spaces because they couldn't find them
anywhere else, and then these companies come in and are like, Okay,
well we can make money, and then it doesn't become

(44:12):
a joyous space anymore. It becomes a very toxic, a
toxic place, and so like hearing this story and seeing
how so much of what we use still is based
on something that was, a guy walked in with a Playboy magazine, like, it's bad when that doesn't

(44:37):
feel so out of place in what we're talking about in our current time.

Speaker 1 (44:42):
Yeah, and again, I mean, I opened up our conversation with this, and I guess I'll close with it too. Believe me, when I say this, people think I sound alarmist or extreme, but I mean it the way that I mean it.

Speaker 2 (44:53):
I think that these things are features, not bugs.

Speaker 1 (44:56):
I think we got to be honest about the ways
that things like misogyny and exploitation, particularly when it comes to marginalized people, have been foundational to technology and the
Internet from the very beginning. I love the Internet. I
love technology. It is why I do the work that
I do.

Speaker 2 (45:12):
But I think that until we are honest about the fact that these things are features and not bugs, we will
never get anywhere.

Speaker 1 (45:19):
And so I think that it really starts with having
honest conversations about where we started so that we can
get to a place that actually feels a
little bit better for everybody.

Speaker 3 (45:28):
Yes, yes. Well, thank you so much as always, Bridget. Every time you come on, I'm like, we could talk for hours about this and this and this.

Speaker 1 (45:39):
Invite me back for an episode, just dragging Hugh Hefner.

Speaker 2 (45:42):
Yeah, I'm all for it.

Speaker 4 (45:44):
I think we need to do this, because I've been thinking for a minute about going back into the magazine world and jumping into, like, all of that.

Speaker 1 (45:52):
Don't even get me started. I mean, like, this is, like, spoiler alert: I totally had this wrong. For so long in my life, I was like, oh, Hugh Hefner was a champion for free speech and civil rights and.

Speaker 2 (46:02):
Blah blah blah. Then I grew.

Speaker 1 (46:04):
Up and learned, and I was like, actually, he wasn't such a good guy.

Speaker 4 (46:08):
Right, I mean we really fed into the but read
the articles so so good.

Speaker 3 (46:12):
Oh my god, yes, yes, oh yes, please come back
for that, Bridget. In the meantime, where can the good
listeners find you?

Speaker 1 (46:25):
Well, you can listen to my podcast, There Are No Girls on the Internet. You can follow me. I'm not really on social media that much anymore, but you can try to find me there. I'm on Instagram at Bridget Marie in DC. I am on Bluesky at Bridget Todd, on Threads at Bridget Marie in DC, sometimes on TikTok.

Speaker 2 (46:42):
I'm easy to find. Google me, you'll find me. Yes, google me.

Speaker 4 (46:47):
That's a flex.

Speaker 3 (46:48):
It's true, though. Our listeners are smart. They can find you. And listeners, if you would like to contact us, you can. Our email is stephaniea momstuff at iHeartMedia dot com. You can find us on Twitter at momstuff podcast, on TikTok and Instagram at Stuff Mom Never Told You. We're also on YouTube. We have a Tee Public store and a book you can get wherever you get your books. Thanks as always to our super producer Christina, our executive producer Maya, and our contributor Joey. Thank you, and thanks to you for listening. Stuff Mom Never Told You is a production of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.