
July 28, 2023 56 mins

On this episode of The Circuit, Bloomberg’s Emily Chang is joined by three women who have raised the alarm on big tech’s harms: Ifeoma Ozoma, Timnit Gebru, and Safiya Noble. They discuss their experiences speaking out and the risks of what they see as tech’s newest emerging threat: artificial intelligence. 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
I'm Emily Chang, and this is the Circuit. For as
long as I've been covering tech, I've covered the resistance
to it, the critics, and the whistleblowers who've turned up
the volume. Doing this is incredibly risky because these companies
are just so powerful they basically make you take a

(00:24):
vow of silence to work there. It's called signing a
non disclosure agreement. In this episode of the Circuit, I
meet Ifeoma Ozoma, a former Pinterest public policy and social
impact manager who essentially blew the whistle on the system itself.
More recently, she left Silicon Valley for a quieter life
on a farm in Santa Fe, where we find her now.

Speaker 2 (00:46):
So this is.

Speaker 3 (00:47):
Where the goats live and the chickens and the rabbits,
And yeah, I couldn't imagine doing all of this living
in the Bay Area, but I built this.

Speaker 1 (00:59):
We're joined by two women who've also spoken out about tech.
Timnit Gebru, founder of the Distributed Artificial Intelligence Research Institute
and a former Google ethical AI researcher. She says she
was fired; Google says she resigned. UCLA professor Safiya Noble
wrote the book Algorithms of Oppression: How Search Engines Reinforce Racism. Together,

(01:22):
you might say the three of them make up a
small part of a growing tech resistance. Here's my conversation
with Ifeoma Ozoma, Timnit Gebru, and Safiya Noble.

Speaker 4 (01:35):
At what point did you realize I need to speak up,
like enough is enough?

Speaker 5 (01:41):
I saw her on Twitter rab.

Speaker 6 (01:45):
And we actually spoke before, before I spoke out.

Speaker 3 (01:49):
Yeah, So, after Pinterest pushed me out, I knew at
some point that I was going to talk about my story.
I didn't realize it was going to be so soon.
But Pinterest, along with many other companies who posted statements
after George Floyd's murder about how.

Speaker 6 (02:10):
They stood with their black employees.

Speaker 3 (02:14):
And what was particularly galling for me was I knew
the team who would have drafted that statement, and so
I felt I had to say something.

Speaker 4 (02:24):
Yeah. And Timnit, didn't you call Ifeoma the
day before it all went down?

Speaker 7 (02:29):
I was just, at that point, I had been also
tweeting about, like, you know, whistleblower protection and being
silenced and all of that and, you know, whatever.
I tweeted about whistleblower protection, and she was like, hey,
I'm working on this thing, which ended up being the
Tech Worker Handbook, right. And so she was describing to

(02:51):
me the handbook that she was working on, and I
was like, you know, I feel like they're really harassing me.
She's like, you know, you gotta take notes and all
of that. The next day, you know, I found out
I was disconnected. I resigned, apparently. By then, I probably,
like you, had tried many other avenues. During the Black

(03:12):
Lives Matter movement, we organized an entire thing for the
research organization, giving them five points to act on after.

Speaker 5 (03:20):
Taking so many people's input and all that.

Speaker 7 (03:24):
Like, that was a few months before I got fired, Right,
So what gave you the courage to speak up?

Speaker 3 (03:30):
I don't know that anything gave me the courage. The
way I've thought about speaking up always is what are
you afraid of? And what would keep you from speaking up?
And for me personally, the only thing that I've ever
been afraid of was my mom passing away, and that happened,
So honestly, what else do I have to lose?

Speaker 2 (03:55):
How about you?

Speaker 8 (03:55):
Two?

Speaker 2 (03:56):
Yeah?

Speaker 7 (03:56):
I think that for me, it's like, what am I
willing to lose? Am I in a position in my life,
in my career right now,

Speaker 5 (04:03):
To lose that.

Speaker 7 (04:04):
And I was actually compared to a lot of people
who have to think about putting food on the table kids,
you know, they can't. I have been in situations in
my life like, for instance, before I got my citizenship,
when I had political asylum, I would never go to
a protest like I wasn't about to risk that. So
I was no longer in that position, and I tried

(04:27):
not speaking up, and it wasn't like things were getting better. So,
you know, so I really was at a place in
my life where I think, you know, if I was
never going to get a job in those kinds of
places anymore, I was okay with that.

Speaker 6 (04:40):
I think you made a great point.

Speaker 3 (04:42):
We have a lot of privilege sitting here, and I
think it's important to acknowledge that I wrote shortly after
coming forward about pinterest that if I had kids, if
I had family who were on my health insurance, I
never would have even hired an attorney while I was there,

(05:02):
because I knew from the point I hired a lawyer
what the end would be and I couldn't have risked
that if I had people who depended on me. So
there is a level of privilege in even being able
to say something.

Speaker 4 (05:16):
Safiya, you've been critiquing the tech industry for years. What
was it like on the outside watching their stories unfold?

Speaker 8 (05:23):
I mean, I had so much respect and felt like,
and still feel like, my work is in service of
making it possible for you all to do what you've
done in your careers, and also to be the person
who holds up the studies and the research and says,
you know, these things are happening, these things are real,

(05:44):
and these are the people who pay the price, as
well as millions of other people who look like us,
who pay the price for the kinds of terrible.

Speaker 2 (05:54):
Worker practices, labor practices, but also.

Speaker 8 (05:56):
Product development that comes out of the tech industry. You know,
I remember the night that Timnit was being let
go from Google. I'm going to put that generously. People
were watching it happen and unfold because you were live
tweeting what was happening, and I was watching, and I
remember like sliding into your DMs and I was like,

(06:17):
hey girl, are you okay?

Speaker 2 (06:19):
And are you going to be all right? And we
got you?

Speaker 6 (06:21):
What do we need to do?

Speaker 2 (06:22):
What do you need?

Speaker 8 (06:23):
Do you need a fellowship like we've got to make
sure that you can still pay your bills. I don't
think I fully thought through that you were leaving Google,
so you probably have money, I mean, but at least
you weren't going to be like out next month. But
you know, but that had been you know, my orientation
in my own career, which was many of us who
were taking risks in critiquing the tech industry were untenured,

(06:49):
vulnerable graduate students, you know, people for whom there would have
been a high price to pay if there had been retaliation.
So I identified so much with that. And then when
Ifeoma and I met, I remember we met in
person at a meeting. And then in the meeting, I
remember they said, everybody who's like energized in your work,

(07:13):
you know, go to this side of the room, and
everyone who's like really circling the drain and struggling, go
to that side of the room. And she and I
went fighting for last place, you know, on our line,
and that was where most of the black women were,
in fact, on that side. And I remember that what
we started talking, and I was so attuned to the

(07:35):
price you were paying for the work you were doing,
and I was like, you know, I identify. I identify with
both of you, and you're not alone. You're not going
to have to go through these things, like we're in
this together.

Speaker 7 (07:47):
So Ifeoma was working in the background. I
feel like so many of them were working in the
background on my behalf, Like you were finding lawyers, you know,
writing letters and everything.

Speaker 4 (07:58):
So there's a network.

Speaker 6 (07:59):
Was this network?

Speaker 4 (08:00):
If you will, Can you characterize what it's like being
inside these companies and trying to raise these issues and
the struggle of.

Speaker 1 (08:08):
That, like are they trying to squash you?

Speaker 4 (08:11):
Does it feel incredibly lonely? Oh?

Speaker 6 (08:14):
Yeah?

Speaker 3 (08:15):
My situation is unique in that, not that what I experienced
is very much unique, but in that it was
written about. There's one particular situation where the company had
been called to task for years on elevating plantations

(08:35):
as wedding venues. Plantations: you would not elevate any other
place where people were tortured and murdered as a wedding venue.
And a civil rights organization came to us and said, hey, we
want to work with you, but if you don't make
this policy change, we are going to go to the

(08:55):
press and talk about it. I brought it to the company,
laid out all of the options that we had for
changing the policy, did it in what I thought was
a very diplomatic way, and the company was praised for
it for taking my recommendations in the end, and in
my performance review after fifty plus articles came out praising

(09:19):
the company, the kind of PR boost you can't pay for,
I got a negative performance review for the work that
I did on.

Speaker 6 (09:28):
That because I was biased, is.

Speaker 3 (09:31):
What my white male manager said in my performance review
for it.

Speaker 2 (09:36):
You ended the promotion of plantations.

Speaker 3 (09:39):
That's that's correct, and you were criticized and I was
criticized for it, specifically for being biased, is what he said.
As if there's an unbiased position to take, Apparently there's
a pro plantation stance that could have been taken that
I guess I should have taken. I mean, I got

(09:59):
to the point where my sister would call me and
she'd say, I just heard you on NPR, Like, aren't
they afraid that one day you're going to be in
a live interview and you're going to just say I'm
being treated like trash at this company.

Speaker 6 (10:14):
Here's what's going on.

Speaker 3 (10:16):
And there was this dissonance that they believed that my
work was good enough to be on all things considered
to be the one who's out there talking about the company.
Because my work was good, my face was the right
face for it. But I wasn't worth being paid and
respected the way that others were.

Speaker 4 (10:37):
Timnit, what was it like agitating for the stuff that you
were agitating or advocating for inside Google?

Speaker 7 (10:43):
So, like two months after I joined, I would say,
was the Google walkout?

Speaker 6 (10:47):
Yeah?

Speaker 7 (10:47):
And I sent this email being like I am so
glad this is happening because I've only been here for
two months and this is ridiculous, you know. And ever
since then, I think I didn't really make friends
with the higher ups, you know, and I think
the way I tried to do it at Google

(11:08):
is me and Meg were always a duo, and so
my co-lead Meg was in trouble when I got in,
and yeah, so I always went to her HR meetings
with her and they're like, what are you doing?

Speaker 5 (11:21):
Like I'm with her, you know.

Speaker 7 (11:23):
And I'm like, I'm complaining about all the issues she's
facing when they ask me, like, what should be better.
And they're like, but what about you? And
I said, look, that's going to be me in like
two years. So we tried to go as a team
and back each other up every time we complained about
whatever it is, like gender discrimination, racial discrimination, et cetera.

Speaker 5 (11:47):
And they didn't ding.

Speaker 7 (11:49):
Me on my performance review because my manager was trying
to protect us. But it always felt like they were
waiting for something, because you know, we always get that
policing and it was really exhausting, and so I felt
like they found that one thing, you know,
with my paper.

Speaker 4 (12:07):
Safiya, when it comes to sort of the broader state
of tech workers.

Speaker 2 (12:11):
Is history repeating itself? I think it's the same history.
It's just a new industry.

Speaker 8 (12:17):
There's this way of critiquing people who have something to
say about technology, calling us Luddites, right.

Speaker 2 (12:24):
I mean, this is like a you know.

Speaker 8 (12:25):
A group of people, of workers, who understood that new
technological inventions were going to reshape the labor force and
were actually going to take away the creativity, the creative
labor, in fact, that the Luddites did.

Speaker 2 (12:40):
You know.

Speaker 8 (12:40):
I find that to be so similar today that people
who are saying, listen, so many of these products that
are being brought to bear in the marketplace are not
only harmful, but they they steal our creative imagination, they
steal our human agency. They are you know, these are
dangerous to humanity. We get kind of I think marked

(13:01):
as some type of not innovative, not forward thinking.

Speaker 2 (13:06):
You know, Luddites, and I think, in fact, that's
a compliment.

Speaker 8 (13:11):
I mean we are actually people who have, like many
people who are trying to say some technological innovations are
actually going to undermine workers, the quality of work, the
nature of work, the creative expression and joy that also
can be tied to work. They are going to flatten

(13:31):
our human experiences in so many ways, rob people of
opportunity, foreclose opportunity. Those kinds of criticisms many people
have made for centuries when we have seen industries extract
to the point of absurdity and leave workers holding the bag,
or leave workers unable, in the case of tech workers,

(13:53):
unable to live in the Bay Area, unable to afford
to raise children. I mean, I can't
even tell you how many women I've met who feel
that they have to choose between working in certain industries
or having a family. I mean, these are ludicrous ideas,
especially in an industry that is making more money than
any other industry on the planet.

Speaker 4 (14:13):
And this is global too, right? I mean, aren't
there workers propping up tech around the world? Yeah?

Speaker 7 (14:18):
I mean, we had an event at DAIR called Stochastic
Parrots Day, which is about a paper that I got
fired for, and Safiya and I were actually on a
panel there with somebody whose name was Richard, and
he was one of the workers who was labeling the
outputs of OpenAI models like Chat

(14:39):
GPT or these text-to-image models, and he was talking
about how traumatic it was, and he was detailing for
us the PTSD that comes with it, and how much
worse it is when it's actually synthetic data, right, like
generated content, because you go home wondering if this is actually,
if this happens, if people do this in the real world,

(15:01):
because you can just generate any kind of content you want, right, So,
people like him were getting paid like a
dollar something an hour, and workers like that are left
dealing with the trauma that they.

Speaker 5 (15:14):
Experienced and they have no resources to do that.

Speaker 7 (15:16):
So it's similar to the extraction industry. It is global
and it preys on vulnerable people.

Speaker 4 (15:24):
Open AI was accused of using workers in Kenya, paying
them less than two dollars an hour to make chat
GPT less toxic. I asked the CTO about this, and
she said, we chose this particular contractor because of their
known safety standards, and since then they've stopped using them.
This is difficult work and we recognize that, and we

(15:45):
have mental and wellness standards.

Speaker 7 (15:48):
This is a known tactic to offload to a third-party
contractor so that you are not the one who is
being held accountable.

Speaker 8 (15:56):
So yeah, they borrowed that right from the fashion industry
and other industries. So there are many industries that
have done that. The tech industry has pulled a page
right out of those playbooks. And I teach my students
at UCLA about all.

Speaker 2 (16:11):
Kinds of different social movements.

Speaker 8 (16:13):
One of the things that people feel about what's happening
with tech is that it's just kind of it's totalizing,
it's a fait accompli. These are the technologies that we have.
They're here now, there's nothing we can do. We have
to live with them, make the best of it. And
of course, people who live during the era of the
Transatlantic slave trade, people who lived in the Americas during

(16:34):
the time of the period of enslavement also got up
every day and got their kids ready for school. They
completely normalized the experience of enslavement. And so this is
I think an important thing to invoke is that people
build societies that they think are normal when they are
the beneficiaries of them for the most part.

Speaker 2 (16:55):
And also people who are oppressed.

Speaker 8 (16:57):
Inside those systems come to accommodate in some ways because
it is totalizing and overwhelming to live under those kinds
of oppressive systems.

Speaker 2 (17:08):
And there are.

Speaker 8 (17:08):
Also people who seek to abolish who resist and are
like absolutely not, no way, not in my lifetime, not
in my children's, my grandchildren's.

Speaker 2 (17:19):
I think we are, in fact.

Speaker 8 (17:20):
All three of us the descendants of people who refused
in one way or another.

Speaker 3 (17:24):
And I think it's important to say we're not anti technology.
My entire career was at Google, Facebook, Pinterest. I'm working
on issues within the industry, but we're pro human dignity,
and currently in the industry, unfortunately, human dignity is not

(17:46):
even on the list of issues.

Speaker 6 (17:49):
It's not there.

Speaker 4 (17:50):
Algorithms are a complete mystery to most of us on
the outside, a black box. You know, we know they're
recommending videos or targeting ads, but how are algorithms influencing
us in ways that we are completely unaware of.

Speaker 7 (18:05):
You know, sometimes I don't even know myself right when
you're on social media and you're super angry and you're
responding to everything versus not, you don't realize that that's
because the algorithms that they're using maximize for engagement of
any kind, and when you're angry, you're engaging with the

(18:28):
site more. So I feel, personally, even as someone in
this field, I'm still learning about all the different ways
in which my day to day life is being influenced
by a bunch of decisions that companies are making.

Speaker 4 (18:42):
Decisions like to release ChatGPT.

Speaker 3 (18:45):
Yes, I mean, so ChatGPT absolutely is important
and we should discuss it. But even that, using that,
is a choice. Currently, using social media, the companies that
I worked for, using their product is a choice. But
where algorithms are particularly dangerous is where we don't have

(19:06):
a choice. There are algorithms that determine what kind of
care you're going to get if you're black and you
have kidney disease. There are algorithms that determine like, for instance,
when I went to purchase this house, I could have
ended up in a situation where I wasn't able to
get a mortgage depending on the algorithm that the mortgage

(19:31):
company was using. And that's where we really have to
be careful because it's not consumer choice. It's not that
consumers are ignorant, and so the responsibility is theirs. The
responsibility really lies with the companies and with regulating agencies
who don't understand or just don't care.

Speaker 4 (19:53):
So when ChatGPT comes out and there is so
much buzz so fast, what goes through your minds?

Speaker 7 (20:01):
I roll my eyes so hard that they completely roll to the back
of my head, you know. Like, I wrote a paper
about large language models like two years ago, and most
people didn't know what it was. I'm not surprised at
the rate, I would say, of deployment, but I am

(20:21):
surprised at the constant daily hype, the free PR that's
being given to some of these companies, completely uncritical, right, of
their claims. Like, it's disheartening, personally speaking.

Speaker 4 (20:37):
What are your biggest concerns about AI now that
it's not in a lab at Google anymore, it is
out and everybody knows about it.

Speaker 6 (20:46):
What worries you most?

Speaker 8 (20:49):
I think the thing that keeps me up at night
is the amount of data that's collected on us every
single day just by having a data collection device in
our bag or our pocket, the phone, and all the
kinds of things that we're interacting with, and the predictions
that get made then from that data that's collected about
us that will foreclose opportunity. Cathy O'Neil talks about this

(21:14):
in her book Weapons of Math Destruction that these types
of prediction, predictive algorithms and AI make things better for
people who are already doing great and things worse for
people who are not faring well. And that is probably
the thing that stresses me out the most, is
that those types of AI that get embedded in every
industry as again a tool of alleged efficiency, are using

(21:37):
data sets and also data collected about us to over
determine what our futures will be.

Speaker 5 (21:45):
Timnit,

Speaker 4 (21:45):
When you left Google, it sort of triggered this wave
of agitation within you. Now we're hearing that some Google
employees begged for Bard to not be released.

Speaker 1 (21:58):
They said it was a pathological liar.

Speaker 6 (22:00):
Is the tension building, you know?

Speaker 7 (22:03):
Unfortunately, I would say that there was a moment that
we had where people were speaking up, and now it's
like backlash all the way down, and I feel like
they're making the calculation that they don't even need to
pretend anymore.

Speaker 4 (22:17):
The tension is it building in like a progressive way,
like a way that leads to progress, or.

Speaker 5 (22:23):
So I think that we need regulation.

Speaker 7 (22:26):
So perhaps that kind of momentum is building. More people
are thinking about regulation. But in terms of worker protection
and workers inside these organizations pushing back, they're facing so
much backlash with a lot of the layoffs of these
ethics teams at Microsoft, at Google, at Twitch.

Speaker 5 (22:44):
I'm sure there's more too.

Speaker 9 (22:46):
I mean, let's not even. I'm
not a computer scientist, I'm not an academic, but I
know policy, and here's where policymakers are really dropping
the ball.

Speaker 3 (23:00):
We had an opportunity to move forward on something like
healthcare and disentangling health care from employment. That's one piece
that would allow people to speak up, because if your
access to healthcare, if your family's access to healthcare isn't
dependent on your employer, then you can talk about harms

(23:21):
that you're seeing in the technology that's being created. If
policymakers were willing to step up and actually regulate some
of these things before they're an issue, or as they're
an issue, then we wouldn't be in this place where
you have lawmakers in Utah who think that it's a

(23:42):
solution to ban kids from using social media and disconnect
them from communities that are actually life giving and life
affirming for people who are in marginalized communities.

Speaker 2 (23:55):
Right, or just pushing the onus of.

Speaker 8 (24:00):
Like control from being harmed to the individual. The thing
that people forget is that so much of the R
and D for these kinds of technologies is offloaded already.
It's the riskiest part of the business, and it's offloaded
to the public. The public pays for the most dangerous
dimensions of what the companies do because most of those

(24:22):
companies have partnerships with academic labs and they get NSF
National Science Foundation funding and other kinds of government funding
to do that experiment. So here the public pays for
the experiments and then never benefits from the profitability of
any of these companies. And then they regulate downstream and
say it's on you to figure out whether it's safe.

Speaker 7 (24:45):
I mean, Silicon Valley got most of its money from
the government, right, during World War II and after that. Right,
like Elon Musk, the amount of California taxpayer money
that he got, and they talk about, you know, pulling
yourself up by your bootstraps.

Speaker 4 (25:00):
There's this perception that artificial intelligence is magic, but what's
really happening.

Speaker 3 (25:07):
There are human beings like the one who you mentioned
who are tagging things. They're human beings who are ensuring
that we, those of us in polite society, don't have
to see the murders that are being captured, the beheadings
that are being played live on Facebook or on YouTube

(25:28):
or anywhere else.

Speaker 6 (25:29):
Someone is seeing that it's just not us, and.

Speaker 5 (25:32):
They don't think it's magic.

Speaker 7 (25:33):
The people who are doing the data labeling, the people
who are doing content moderation, and a lot of the
people whose work is being stolen to train these models,
like the artists whose images have been taken without any
sort of attribution or payment, or the writers who you know,

(25:56):
I was just at a conference listening to artists and
writers talk about some of the issues that they see
with these generative AI models, right, and what the future
holds for them. And they don't see it as magic.
They see it as plagiarism and theft. So you know,
and I want to say that in my field, there
is a concerted effort to make it sound like magic.

(26:16):
It's this obfuscation that happens on purpose so that you
don't see where the data is coming from, who it's
being stolen from, because they would have to compensate everybody,
how the workers are being exploited. And then something's, something's
in the cloud, right? It's not. The cloud is data
centers taking lots of water. I mean, we were just
hearing about the water usage of ChatGPT. You know, when

(26:39):
you really make it visible, it's not magic. You know,
it's not sentience or magic or anything like that.

Speaker 3 (26:46):
Oh, there's nothing wrong with convenience. We all appreciate convenience,
but I don't want my convenience at the expense of
another human being.

Speaker 4 (26:55):
What's your long-term view about how many jobs this
will make disappear and create? Because, you know, we're talking about
more not-so-great jobs, right, the tagging for ChatGPT. Like,
how do you see this playing out?

Speaker 7 (27:10):
So I'm very worried about like the centralization of power
that this type of technology is enabling. So if you
whether you believe their claims or not, the claims of
these companies or not right, what they're striving for is
one model that does everything for you. You you want to

(27:30):
go to a doctor, talk to a chatbot, a lawyer,
talk to their chatbots. So what they're planning on having
is they're like the super renter where everybody else builds.

Speaker 5 (27:41):
On top of them.

Speaker 7 (27:42):
They have APIs on top of one or two or
three companies for everything. So this is not the world
I want to live in.

Speaker 1 (27:49):
So what does society even look like?

Speaker 2 (27:50):
In this world?

Speaker 1 (27:51):
Is there like a massive wealth gap? Is there an underclass?

Speaker 5 (27:54):
I mean?

Speaker 4 (27:55):
Or is everyone happy because we're on universal basic income?

Speaker 7 (27:57):
Not the last one because I mean they're not, but
they could do that now. Like it's just like according
to OpenAI, you know, utopia is just around
the corner any day now, and we're all gonna be happy.
I don't think that we're all gonna like work less.
I think that's not what's going to happen. But I
think that the work, certain people's work, I feel like
it's gonna get degraded, degraded. So a lot of artists

(28:20):
are very worried about losing their jobs, and not because
the image generator thing is going to be as good
as an artist, but someone's gonna think that it's good
enough or something like that, and so they won't hire one.

Speaker 5 (28:30):
I think that the content moderation.

Speaker 7 (28:34):
Requirements are going to be exploding with the amount of
synthetic media that we have on the Internet, and so
I feel like we'll have a lot more content moderation
kind of jobs.

Speaker 5 (28:45):
But then we know what those jobs are like. So
I'm not looking, you know, I'm not optimistic.

Speaker 4 (28:51):
There's a lot of doom mongering about the capabilities of AI.

Speaker 1 (28:55):
For the people who are scared, what do you want
them to know?

Speaker 2 (29:00):
They're not wrong. That's right, they're not wrong.

Speaker 8 (29:06):
Well, I mean, I would say that degradation is a
really helpful word in this, because people have a sense
that something is off and something is going to be
lost with these technologies. I will tell you right now
in classrooms around the country, and I'm seeing this already
in our university system. Some people think something like generative

(29:31):
AI helps level the playing field.

Speaker 2 (29:33):
So there's a lot of mythologies.

Speaker 8 (29:34):
Right if you're not a great writer and you're in
the sciences or you're a math student, you don't have
to worry about that when you're in those English classes now,
because you've got OpenAI, which is going to help you
write those essays.

Speaker 2 (29:46):
What is lost?

Speaker 8 (29:47):
Imagine not learning the things that you learn when you
put yourself in an environment to learn something. So I
think we will have a transformation in the field of
education that will not be what we want.

Speaker 2 (30:00):
I don't think it will help us think critically.

Speaker 8 (30:02):
A lot of us are doing experiments where we're
looking up certain types of things where we are subject
matter experts and we know a lot about that, and
we're asking OpenAI and it's giving us terrible answers.

Speaker 2 (30:13):
That look good enough.

Speaker 8 (30:16):
And so the loss of expertise, the loss of knowledge,
those kinds of things are very important in societies that
we have reliable information that we can all act upon
as voters, as parents, as residents, as people who
live in communities.

Speaker 4 (30:34):
Fake it till you make it, and it's making it
right now.

Speaker 2 (30:36):
It's predicting the wrong things.

Speaker 3 (30:38):
One of the things that I think is important here
misinformation and disinformation have been a focus of mine, is the
shrinking of context, and that's one of my main concerns
with ChatGPT and AI bots like that, because

Speaker 4 (30:54):
We're losing, we're not gaining that.

Speaker 3 (30:55):
Yeah, because it's one thing, I think to present a
bad answer. It's another thing to present it as though
it's the only answer, and without showing your sources or
anything else, so that if you're the person on the
receiving end of that answer, you can say, Okay, well
this came from this source. I can look that source

(31:16):
up and make my own determination.

Speaker 4 (31:18):
You know, a bunch of people signed this letter, including
Elon Musk, calling for a pause on the development of AI.

Speaker 2 (31:24):
What was your take on that?

Speaker 7 (31:25):
I mean, we had to write a statement responding to
it because it really confused the conversation. A lot of
people are like, great, that's what you've been asking for,
this is what you want, and we had to put
a stake in the ground and be like, no, we
are different from those people.

Speaker 6 (31:41):
We called it unhinged.

Speaker 7 (31:43):
Yeah, yeah, it was completely unhinged because some of the
things that all three of us are discussing right now
are not sci fi scenarios. We're talking about people losing
opportunities today. We're talking about people not getting a mortgage,
people not getting care, people being sentenced, you know, have

(32:04):
really high sentences because of some mundane, not sci fi
algorithms that are right now currently deployed, and then we're
currently talking about worker exploitation, mundane, not sci-fi, you know.
And in this pause letter they're talking about it as if
it's magic, it's this magical thing. It's it's we've created

(32:26):
something too powerful.

Speaker 5 (32:27):
We don't even know how it's gonna Oh.

Speaker 8 (32:29):
No, they just want to slow down so they can
catch up. That is, like, my take on it
is that not only are they trying to
position it as magic, but they're doing that as a tactic
so they can catch up because they're behind.

Speaker 7 (32:42):
Well, Elon Musk started this Truth
GPT thing now, right, right after he signed saying stop.
It's like, oh, I have.

Speaker 6 (32:50):
My own things.

Speaker 8 (32:51):
You know, what they mean is slow down so we can
catch up. Safiya, to use your term, algorithms of oppression:
algorithms supercharged on AI.

Speaker 4 (33:00):
Given all the bias and misrepresentation and misinformation issues that
already exist, how much damage could we do?

Speaker 6 (33:06):
What could the fallout be?

Speaker 2 (33:08):
We are already doing a lot of damage.

Speaker 8 (33:10):
We have products in this industry, in the tech industry,
that are made up and launched out to the public,
no oversight, no one checking for whether these are going
to do damage until the kinds of communities of researchers
and technologists that we're part of, and community

(33:31):
organizers and journalists start exposing the evidence of harm.

Speaker 4 (33:36):
AI is so much more than chatbots. What are all
the other ways that AI is developing? I mean, we're
talking about surveillance and facial recognition and all of these
other things that are very.

Speaker 3 (33:46):
Consequential surveillance in ways that people can't opt out of
and people can't defend themselves from. There are people who
have been arrested because their face came up in a
police database where it wasn't actually their face.

Speaker 6 (34:03):
There's no way.

Speaker 3 (34:04):
To opt out of being included in this database. And
there are no guardrails for both the police departments that
are buying this technology using our dollars, our taxpayer dollars,
and then the companies that are scraping this content from
the Internet. There are no guardrails anywhere, and actual people

(34:28):
are paying the price.

Speaker 4 (34:29):
And by the way, you've got a handful of companies
that are based in Silicon Valley deciding what the next
tech platform will be.

Speaker 1 (34:37):
Where most of the people who work there are white men.

Speaker 4 (34:40):
Let's be honest, are you thinking to yourselves, like, here
we are, here we go again.

Speaker 8 (34:46):
And the consequences of that are really mundane in ways,
like maybe this app can help me track my period
better than I could in my diary, so I use that,
and now law enforcement has a back door to come and
arrest me because it might look like I'm planning to
have an abortion from some type of analysis of my period.

Speaker 5 (35:08):
tracker.

Speaker 8 (35:09):
So that seems so sci fi, but that's actually happening today, right,
in very mundane kinds of ways. We do these things in
education where students have to use these learning management systems
and upload papers and give comments and do all of
these things in these tech systems, and then algorithms are

(35:30):
run on them that predict which students are going to
be the most successful. In fact, financial aid officers and
admissions officers now in universities are people who need
to be skilled in CRM databases because what they're doing
is they're looking to see who's most likely to matriculate
in four years.

Speaker 2 (35:46):
Well, you know who that is.

Speaker 8 (35:47):
It's not somebody like me who had to work thirty
hours a week to get through undergrad, whose parents couldn't
pay for them to go to college.

Speaker 2 (35:56):
I would definitely fail the.

Speaker 8 (35:57):
Statistical model, right, of who the most ideal kind of
university student is. That seems so banal. These kinds of things are
really they're not part of the doomsday machine, but they
actually are the kinds of things that, even if you
don't think you're ever going to get entrapped in a
mugshot database, you care about whether your kids get an

(36:18):
opportunity to go to school. When the Trump administration announced
that they were going to adopt a new software that
was going to predict who the next mass shooter was,
I thought to myself, this will be the moment that
white men care about these conversations because it's going to
be them that's going to be in the database, because
they're the profile of the mass shooter. And you're telling

(36:40):
me you're going to apply an algorithm to determine who
the next mass shooter is.

Speaker 2 (36:45):
You've got to be kidding me. There couldn't be more
snake oil to be sold. So everybody's in on this.

Speaker 8 (36:51):
Everybody needs to be concerned because everybody could potentially be harmed.

Speaker 4 (36:55):
The issues that you're raising sound like what a lot
of the tech leaders say they care about. Reid Hoffman
and Sam Altman and Satya Nadella and Sundar Pichai and
even Elon Musk who said we want AI that benefits society.

Speaker 1 (37:11):
Where do you diverge?

Speaker 2 (37:13):
I mean, I don't think we say what they say.
I don't think.

Speaker 8 (37:17):
I mean I would say, I think there are a
lot of types of AI that should be abolished, that
should not be in the marketplace, that are making claims
that are unfounded and are actually proven to be harmful.
And I'm not convinced in fact that AI can be
used for good.

Speaker 2 (37:36):
I'm not one of those people.

Speaker 8 (37:38):
I also think that they have adopted the language of
the people who have been raising the criticisms and have
incorporated those criticisms and then put themselves in charge of
solving the very problems that they're being critiqued for.

Speaker 2 (37:52):
So this is where you know.

Speaker 8 (37:55):
Timnit, you know, I always laugh with her
because I remember the day that Google announced that it
was forming like an ethical.

Speaker 2 (38:04):
AI team, and I was like, am I being punked?

Speaker 8 (38:08):
Like there's just no way in the world that the
people who have been denying that their products are harmful
for a long time are now in charge of addressing
that there are some harms. It's profitable for them to
look socially responsible. Companies have been trying to position

(38:29):
themselves as socially responsible at the point that consumers start
to be concerned about the lack of social responsibility of
those companies.

Speaker 2 (38:37):
But I don't think that we're actually saying the same
thing at all.

Speaker 4 (38:40):
What has the communication been like with tech workers on the
inside of Google and Apple and other companies where there
has been organizing?

Speaker 7 (38:48):
It seems like everybody we've been communicating with has been
getting fired. We have, you know, back channels, and we have
some kind of channels of organizing. But my sense is
that right now a lot of people are scared because
of the layoffs, and a lot of people are Yeah,
So that's that's been my sense recently.

Speaker 4 (39:06):
Should people speak up, walk out, leave if they see injustice?
Is there any way to communicate your concerns internally effectively.

Speaker 3 (39:17):
I'm glad you asked that because my view is and
always has been, no one should make themselves a martyr.

Speaker 6 (39:23):
I don't believe in martyrdom.

Speaker 3 (39:25):
I think if you feel the conviction, if you are
able to because of your familial situation, your financial situation,
to speak up, then you should. But we live in
a capitalist society. Capitalism all has us all by the throat,
and so if you need money to shelter, to have shelter,

(39:50):
to have healthcare, to have food, to care for your family,
I don't want anyone to risk all of that because
of harms that are continuing. But for the people who
are and the people who are standing up, we have
to do everything that we can to protect and support
them and do what we're able to on the outside

(40:13):
now to ensure that their work isn't in vain.

Speaker 4 (40:19):
You helped get the Silenced No More Act passed in California.
You've got the attention of Governor Gavin Newsom.

Speaker 6 (40:26):
Yep.

Speaker 4 (40:27):
This means that companies can't use NDAs to prevent employees
from speaking out about certain things.

Speaker 6 (40:33):
Right.

Speaker 5 (40:34):
Is that enough protection?

Speaker 3 (40:35):
I mean it's protection for forty million people for both
non-disclosure and non-disparagement agreements. So it's more than
existed before. But the work is continuing. A similar bill
was passed in Washington State. My hope if we ever
get a functioning Congress would be to have federal
protections because as we know, workforces are distributed and so

(40:59):
there are people working all over the country right now,
who are on teams where their manager may be protected
in California at the headquarters of a tech company, but
the people who are actually being harmed may be in
Texas or Louisiana or New York and not have those protections.

Speaker 4 (41:18):
You also created the Tech Worker Handbook. What can
workers find

Speaker 5 (41:22):
There?

Speaker 3 (41:23):
Resources on how to seek legal help, resources on how
to speak with reporters. One of the things that I've
talked about a lot is one of the advantages and
privileges that I had when I came forward is I
had experienced speaking with people like you. It's not easy
to sit down and talk about a traumatic experience you've had,

(41:44):
and it's not easy to sit down and answer questions
without training. That's training that I got on the inside
of these companies speaking on behalf of the companies, and
of course I was going to use that to speak
on my behalf and on the behalf of other workers.

Speaker 1 (42:01):
The circuit continues after this quick break.

Speaker 4 (42:09):
So is regulation the only way to keep us in check?

Speaker 3 (42:14):
It's the most important way because it's the way that
impacts the largest number of people, and we want everyone
operating under the same framework and protections. We may be
shareholders of a company, but we're small fry. I mean
the banks, the folks who are really making the decisions

(42:36):
because they own one to three percent of a company,
They could step in, and they should step in because
long term their investments are impacted by the harms that
these companies are perpetrating.

Speaker 2 (42:49):
Yeah, I would say, just to build on that too.

Speaker 8 (42:52):
Workers should be looking at their pensions and where their
investments and their 401(k)s are. We remember there
were so many people, I remember, just two decades ago
who were organizing around making sure that they were not
investing in prisons or in military weapons and other kinds
of things. They did not want to see their pensions
and 401(k)s invested in. So those are other

(43:13):
ways that everyday people can also be asking the kinds
of questions of the companies where they work and how
their assets are being managed.

Speaker 4 (43:21):
If you had one year to run these companies, what
would you do.

Speaker 3 (43:26):
I like my life here raising my goats and chickens,
so I wouldn't. But I am happy to continue the
work that I'm doing now. I mean there's so much harm,
like I said, that's been done that before you figure
out the plan for what's next.

Speaker 6 (43:44):
There needs to be repair work that's done.

Speaker 2 (43:46):
So what does that look.

Speaker 3 (43:47):
Like actually paying people for the harms that they've endured,
whether that's consumers who have been harmed by privacy violations
or the workers who have been harmed through wage theft
and actual physical harm to themselves.

Speaker 8 (44:04):
Let's say Google's search algorithm is populating some kind
of disinformation about you, and you can't get that off
of the search engine and it's precluding you from employment.

Speaker 2 (44:18):
You need repair from that.

Speaker 8 (44:19):
You can't just submit online a request for Google to
take that down. They have hundreds of thousands of takedown
requests that are never going to be attended to.

Speaker 2 (44:30):
So you need to have these spaces, these.

Speaker 8 (44:32):
Intermediaries where people can go and say I have been harmed,
this product is faulty. There needs to be liability both
for the company, but also I need to be able
to litigate or have some type of repair.

Speaker 2 (44:43):
We don't really have.

Speaker 8 (44:44):
The laws on the books in action so that people can
in fact get the remedies that they need through courts.
We also don't have a judiciary that is properly educated
yet on how

Speaker 2 (44:55):
To resolve some of these things.

Speaker 8 (44:57):
So we need human and civil rights laws that give
people the kinds of protections that they need. And really,
I think Congress has not caught up with
the digital age and the kinds of harms and civil
rights violations.

Speaker 2 (45:09):
So this is also another place of enforcement.

Speaker 4 (45:10):
So civil rights for the digital age, which you've absolutely
been calling for.

Speaker 5 (45:14):
What about you, Timnit?

Speaker 4 (45:15):
If you could go back to Google and you were
in charge for a year, what would you do.

Speaker 7 (45:19):
I wouldn't put out a chatbot version of a
search engine, you know. I think OpenAI can do that,
because that's the only thing really people expect from them.
Search is not their bread and butter, and so I
think that, in my opinion, Google is just trying to act
like they're also innovative or.

Speaker 5 (45:38):
Something like that.

Speaker 6 (45:39):
I don't know.

Speaker 7 (45:39):
They wanted to be in the hype cycle too, I suppose,
and they just put something out there, I would say, Okay,
this is our bread and butter. Search is our bread
and butter. You know, we have to maintain the integrity
of the information ecosystem. As a researcher. Also, I think
a lot of researchers in computer science can.

Speaker 5 (45:58):
Do something, can choose to do something different.

Speaker 7 (46:00):
There was just a professor at University of Chicago who
created a tool called Glaze that artists can use
so that when they put out their work, it
can't be used to mimic their styles. So it fools
tools like DALL-E and some of the other text-to-image
generators. So, because, you know, when artists are in

(46:22):
these lawsuits, it takes years for the courts to decide,
they don't have time to wait for this kind of stuff.
So tech workers could actually choose to use their skills
to help them protect their art. A lot of us
talk about the tech industry, but it's not just the
tech industry. It's the whole ethos of the entire computer
science engineering, you know, machine learning or whatever research community

(46:45):
as well that's led to this. It's not just you know,
the corporations.

Speaker 3 (46:50):
So policymakers too, who are bought and paid for by
the companies that they're supposedly legislating against and meant to
be regulating.

Speaker 6 (47:01):
When when you look at.

Speaker 3 (47:04):
The spending that Google and others do in campaigns that
they make sure is fifty to fifty on either side,
they're funding absolutely everyone. Who is it that we can
turn to as a champion when they're all accepting money
from these companies?

Speaker 5 (47:22):
So are we stuck?

Speaker 1 (47:24):
I mean, is there a glimmer of hope?

Speaker 8 (47:26):
Like, yeah, absolutely, we know so many
people who are glimmers of hope.

Speaker 2 (47:33):
And you know, it's interesting because ten years ago it
was very hard to have this conversation.

Speaker 8 (47:39):
So we know that a lot of envelopes are being
pushed and that we're even in a space to bring
these conversations up and have them understood. So of course
there's lots of hope. I mean, this is why I think,
you know, scholars and journalists and tech workers and artists,
we're able to make those conversations legible to everyday people.

(48:00):
And to me, that is incredibly hopeful.

Speaker 4 (48:03):
If the CEOs of Google and Microsoft and Amazon and Apple, Facebook,
were in this room right now,

Speaker 5 (48:11):
What would you say to them?

Speaker 6 (48:14):
Nothing?

Speaker 3 (48:14):
Nothing that would be super effective if their GCs
weren't also here, because some of these CEOs may want
to do things differently, but they have legal counsel who's
ensuring they don't. And so what I have found, even
when I've been in conversations with CEOs and decision makers

(48:35):
is they are looking to the people who really make
the decisions, and those are the ones who are mitigating
for risk. And right now, it's not a risk to
harm workers, it's not a risk to harm consumers. It's
not more of a risk than harming shareholders. And so
we have to speak to the people who are actually
the decision makers. And it's not always the CEOs.

Speaker 8 (48:58):
Well, I was going to say, I mean, they are, of course,
but also the fines that they pay when they
violate the law, and you know, the court cases that
they settle.

Speaker 2 (49:09):
It's like pocket.

Speaker 6 (49:10):
Lint to these companies.

Speaker 2 (49:12):
It's not their money.

Speaker 8 (49:13):
So for sure we know that CEOs, if they were
personally liable in any way for what happens in those companies,
well, it would be slim pickings for who would want
to be the CEO of most of these companies. I
think that if American tech companies in particular really wanted
to seal their legacy and reputations, they would take these

(49:35):
conversations to heart and say, all right, you know, we
are the so called innovators. And this is how they
frame themselves as being the most imaginative, the best innovators,
the best thinkers. It's strange how they can't solve these problems.
Will you stay in kind of a race to the
bottom to extract profits at all costs, or could you
innovate a different kind of labor and cooperative model. Could

(49:58):
you share in the profits with your employees. Could you
think of different kinds of economic models. I mean, they
actually could, and that could be transformative, but I don't
think that there's the will. And I think, you know,
maybe if they were thinking like good ancestors, like what's
the legacy that they leave for three or four or

(50:19):
five generations from now, maybe they would make different kinds
of choices.

Speaker 3 (50:24):
I also want to add really quickly, I think some
of the responsibility is on your industry as well, because
we have living hagiographies that are done about these people.
Like we were watching the harms, we're sometimes talking about
the harm, sometimes honestly, often not very honestly. But then

(50:45):
they put out something that's new and shiny, and almost
every outlet runs to cover it and runs to cover
it using the talking points that the comms and pr
folks from the companies supply, and so there's a lot
of responsibility there in the way that things are covered
and therefore the way that people understand them.

Speaker 6 (51:08):
Yeah, fair point.

Speaker 4 (51:09):
It matters what stories are told, how the stories are told,
who's telling the stories, whose voices are lifted, So thank
you for lifting your voices with us today. It sounds
in many ways like this is an impossible problem to solve.

Speaker 2 (51:24):
Is it impossible?

Speaker 3 (51:25):
No. I mean, we're providing solutions there, and
we are following the work of many, many, many other
people who are not represented here, who are never asked
to be on camera, who are never asked for a quote,
and you just have to look at the work that
they're doing, and the work that communities are asking for

(51:48):
and that people are stepping up and saying is important
and necessary.

Speaker 4 (51:52):
If things don't change, what does the future look like?

Speaker 7 (51:56):
I think, you know, I'm worried about the fundamental remaking
of society the same way it happened with cars. For instance,
I was just reading this book called The Road to
Nowhere by Paris Marx, talking about how in the beginning,
you know, in like the nineteen twenties, nineteen tens, it
was not okay when cars killed people. People protested, people,

(52:18):
you know, out in the streets. And now it's completely
normalized and we have a completely different sort of society
than the one without individual cars. You know, we don't
have public transportation in so many places. Talk about Silicon Valley,
like, you fix that. Yeah. And so what I'm worried
about is a similar type of remaking of society with

(52:39):
some of these tools right that are out.

Speaker 5 (52:41):
There, where by the time we.

Speaker 7 (52:44):
Regulate things, the society has already been remade, you know,
in ways that we weren't you know, that we don't
want and then normalized.

Speaker 8 (52:53):
Yeah, we are barreling toward a type of global inequality
that will be very hard to come back from, and
that will also have incredible consequences, social consequences. You know.
I think of us and so many other women like
us and people like us as like the canaries and
the coal mine. You know, we're we're like chirping, we're trying,

(53:13):
and then we're gonna go silent.

Speaker 2 (53:15):
And what that.

Speaker 8 (53:16):
Means is we're dead because the fumes have taken us over.
And that that part seems so tragic and unnecessary when
we know that there's enough evidence to pivot in other directions.

Speaker 4 (53:34):
If things do change for the better, what could the
future look like?

Speaker 3 (53:39):
Oh man, I was just looking outside at the goats
and thinking that my dream, Like what I want for
my kids, what I want for everyone is work to
not define your life. I want you to be able
to make a living that enables you to do what

(54:00):
brings you joy. If that's milking goats, if that's taking
eggs from your chickens, like, that's what you should be doing.
Whatever joy looks like for everyone, and whatever joy looks like,
specifically for black women, is what I want us to
be able to get because when black women are taken

(54:22):
care of, everyone is, and so that's that's the future
that I want.

Speaker 8 (54:27):
Yeah, I mean, I feel like there's so much there's
so much wealth in the world.

Speaker 2 (54:32):
Really everybody could have a high quality of life.

Speaker 8 (54:35):
Yeah, you know, like, let's spread a high quality of
life for everyone.

Speaker 2 (54:39):
I want.

Speaker 8 (54:39):
I don't just want to save incredible world, a joyous
world like you're talking about. I'm not sure if I
would do the animals, but I'm just saying, like my
urban city version of that, I want that for everybody's kids,
you know, I want that for everybody's generations.

Speaker 2 (54:52):
For everyone, for everyone. Thank you. That was amazing.

Speaker 1 (55:05):
Pinterest said in a statement that it has quote introduced
pay transparency tools for employees, and taken steps to continually
monitor employee pay to ensure that we're achieving equal pay
for comparable work. We've increased the percentage of women in
leadership and continued to set goals for increasing diversity. The
company said it's committed to ensuring that every employee feels

(55:25):
quote empowered to raise any concerns about their work experience.
Google said in a statement that it seeks to develop
quote AI in a way that maximizes the positive benefits
to society while addressing the big challenges guided by our
AI principles and that quote large language models have known limitations,
which is why we launched Bard as an experiment and

(55:47):
develop safeguards to provide people with a positive experience. Thanks
so much for listening to this episode of The Circuit.
I'm Emily Chang. You can follow me on Twitter and
on Instagram at Emily Chang TV. You can watch new
episodes of the Circuit on Bloomberg Television or on demand
by downloading the Bloomberg app to your Smart TV, and

(56:09):
please let us know what you think by leaving a review.
I'm your host and executive producer. Our senior producer is
Lauren Ellis. Our associate producer is Lizzie Phillip. Our editor
is Sarah Lives. Thanks so much for listening.