Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
I'll tell you a thing that'll fuck you up.
Speaker 2 (00:02):
I just watched Three's Company because I like it and I
watch it. If you watch it again, you can't watch
it again, because you're like, oh my god, it's all
about date rape and homophobia, literally. And you're like, even
Marjorie Taylor Greene would be like, no, that's enough.
Speaker 3 (00:21):
This is Hello Isaac, my podcast about the idea of
success and how failure affects it. I'm Isaac Mizrahi, and
in this episode I talk to journalist Kara Swisher, host
of the podcasts On with Kara Swisher and Pivot.
Speaker 1 (00:37):
Hello, Isaac, this is Kara Swisher. My mother is thrilled
I'm talking to you because she was in fashion and
I am... fashion? What's the way to say it?
Speaker 3 (00:47):
I have no taste.
Speaker 1 (00:48):
I've been wearing the same thing since I was twelve.
It's like Garanimals. That's how I dress. And I
hope you can help me.
Speaker 3 (00:54):
I'm very exhilarated this morning. I've had a lot of
caffeine because my guest, Kara Swisher is probably like the
smartest and most informed person in the world. She kind
of lives in the center of everything right now. In DC.
She knows everything about tech and politics and where we
(01:18):
are as a culture, more than a lot of people
talking about a lot of different subjects, like boy, she
is really onto something and she has a huge fan
base and a huge kind of a popularity at the moment.
Because she is so outspoken, I have to say I
want to match her, like, you know, word for word.
(01:39):
So here we go, Kara Swisher. Hello, Hello, how you doing.
I'm okay. I have to say, we've never met. I
mean never, never, never, ever, And usually people I talk
to on my podcast I've known for some time or
at least I've met. I'm a little scared because you
(01:59):
because you're coming to the world from a kind of
a journalistic perspective, like you research stuff, you know stuff,
which is like something I respect so much,
and no matter what I do, I'm just not, you know,
I went to art school. By the way, Well you
research that.
Speaker 1 (02:17):
You know all about fashion.
Speaker 3 (02:18):
Right, this is true. Well, I guess so, and entertainment. No,
I mean, you know, seriously, I googled you and you're
roughly my age, darling. We're, yes, born in the nineteen sixties,
the early nineteen sixties. Yes, indeed. And I would refer
to you as kind of like an elder statesman, right?
Speaker 1 (02:35):
Oh, I guess, yeah, no, only because of what.
Speaker 3 (02:37):
You write about. Yeah, you know, I mean, there really
aren't that many people who are older than you who
have a grasp of this, right.
Speaker 2 (02:44):
Yes, well no, I started covering the Internet when I
was in my twenties, right, so that's about twenty-five,
thirty years ago. Actually it was my early thirties, and so
I was the young person on the staff. So that's
why they assigned it to me, because it was
new. I mean, people use the Internet every day now,
but it was not around
Speaker 1 (03:03):
in the nineteen nineties.
Speaker 3 (03:05):
How does it feel to be this kind of person
who has been with it almost since its own kind
of inception?
Speaker 2 (03:12):
Seriously, that's a complicated question. I just finished a memoir
about it. It's called Burn Book, so that'll give you
an idea of how I feel about what's happened.
Speaker 3 (03:20):
God.
Speaker 2 (03:21):
But also, the subtitle is A Tech Love Story,
because I love tech, but I hate what they've done
to the place, if you want to think about it
like that. And so I've been with these people, you know,
the ones that you think of as masters of the universe, Elon
or Mark Zuckerberg or the Google guys or Jeff Bezos
since the beginning before they were rich, when they were startups,
and so I've known them a long time, so I've
(03:42):
watched their journeys, their individual and together journeys.
Speaker 3 (03:46):
What do you hate that they've done to it as
a genre? Oh, as a genre.
Speaker 2 (03:50):
They took all the money and thought that they were
the geniuses when actually the Internet was paid for by
the American taxpayer and by the government, and then they
proceeded to trash the government, without whom they wouldn't exist,
really, in lots of ways, including someone like Elon Musk,
who's gotten loans from the government to save Tesla.
Speaker 1 (04:09):
He's got contracts with.
Speaker 2 (04:10):
The government, and so when they complain about it, you're
sort of like, wow, you've done rather well because of
the American taxpayer, and so you know, they really do
think that they're saving the world, when in fact they've
done huge amounts of damage through their own greed and
a need to grow, grow, grow, and control. So that's
how I feel about it.
Speaker 3 (04:29):
This is an amazing topic that I really want to
get back to because that's sort of the meat of
what I want to talk to you about but I
want to start by getting a little history about you,
you know, first of all, like where are you from?
Speaker 1 (04:41):
New York?
Speaker 2 (04:42):
The New York area, right, the Tri-State area,
local Tri-State area, darling.
Speaker 3 (04:48):
And where were you educated?
Speaker 1 (04:50):
I was born in Philadelphia. My dad was in the Navy.
Speaker 2 (04:53):
We moved to Long Island, Roslyn Harbor, and then
my dad died and we stayed there in Long Island
for a little bit. And I went to a private
school called Portledge, I don't know if you know it,
it's in Locust Valley. And then we moved to Princeton,
New Jersey when my mom remarried, and I went to
a school called Princeton Day School.
Speaker 1 (05:13):
So a lot of private schools. And then we were
in New York a lot.
Speaker 2 (05:16):
My mom had an apartment there, so we spent a
lot of time in Manhattan and Princeton essentially.
Speaker 3 (05:22):
And where did you go to university?
Speaker 1 (05:23):
I did Georgetown, the Foreign Service School. And I wanted to
be a spy.
Speaker 2 (05:27):
I wanted to go into the military, and I was
gay and so not allowed at the time, if you
remember don't ask, don't tell. And thank you,
Bill Clinton, for that.
Speaker 3 (05:35):
That was great, fantastic, thanks. Right, exactly. So what
did you major in in college?
Speaker 1 (05:42):
Propaganda, disinformation.
Speaker 2 (05:45):
Yeah?
Speaker 3 (05:45):
Wow. Oh God, I love you even more now. So
you were like a lesbian, darling. You were a lesbian,
not just sexually but also just spiritually. You were the
prototypical fabulous lesbian.
Speaker 1 (05:58):
Yes, yes, I knew since I was four.
Speaker 3 (06:01):
Great, me too. Oh no, no, no, when they say nature
or nurture, darling, it's so not about nurture.
I was born dreaming about men.
Speaker 2 (06:09):
Well, interestingly, I had lots of boyfriends in
high school. I had a four-year high school boyfriend
that I went out with, and I didn't mind men.
Speaker 1 (06:18):
I didn't mind it, I just preferred women. So I mean,
they were fine.
Speaker 3 (06:21):
Well, but I mean as a kind of like spiritual
lesbian wanting this job as spy, that's to me hilarious.
Do you think it prepared you for your career as
a journalist.
Speaker 2 (06:34):
A hundred percent. Well, because first of all, the Internet's become a
propaganda machine, and so I really did understand it.
Speaker 1 (06:39):
I had studied Hitler and Mussolini and China.
Speaker 2 (06:43):
Propaganda has been used since the beginning of time,
including by the US. And so that was my
area of study because I was very interested in how
you could shift the mentality of a population, including in
terrible ways, and so, you know, I was a student
of history and you could watch it happen over and
over again in, you know, perceptions of people, including
gay people. I don't know if you remember reading Vito
(07:05):
Russo's The Celluloid Closet, and it was the same thing
I read that and the movie which was narrated by
Lily Tomlin. It was all about how you make people
think a group of people are a certain way, when
they can be a little bit that way but they
certainly aren't that way, and you could create a sort
of hate scheme. And so I studied that, and
that was my area, particularly around hate.
Speaker 3 (07:26):
Wow, that's a very heavy perspective to be coming from. And
do you remember what it was that took you into
the kind of the tide of your career? What was
your break?
Speaker 1 (07:39):
Well, I did want to go in the military. I did.
Speaker 2 (07:41):
I really very much did, and I very much wanted
to go, you know, serve the country, whether in the
I was going to be an analyst for the CIA.
I thought that was my career path essentially, and so
you know, sort of like you know, Homeland, but less crazy,
I guess than the main character. And I wanted to
do that. And then I wanted to be in the military.
I could have been a military intelligence and had a
(08:03):
great regard for that, and couldn't. Just couldn't.
Speaker 3 (08:06):
That is so shocking to me, because I never wanted
to serve in the military, ever, and not because,
you know, my dad was in the military. I don't
want to break my nose. Oh I see. So it
was like a family way of thinking,
like you are actually serving your country.
Speaker 1 (08:22):
Yeah, yeah, that was just the way. That was just
the way.
Speaker 2 (08:24):
But I wanted to go into that service because, you
know, I do believe America's an incredibly problematic place,
but it's also great if you have the right people running it.
And I'd met so many good public officials and so
many good public servants. I had that experience, though of
course there are tons of terrible ones, especially right now. And
so I just did want to do that. But then
reporting was very similar, with the idea of ferreting out
(08:46):
the truth. So that's what I was doing, really, including
on the campus of Georgetown University. There was a lot
of, again, anti-gay stuff by the priests, who were
actually having some relationships with men.
Speaker 1 (09:00):
Yeah, and it was just like, what the heck is
this? What the heck?
Speaker 2 (09:04):
I was just like, how could they do this? And
so I was always sort of irritated by people who
said one thing publicly and lived another way privately.
Speaker 3 (09:11):
And were you writing about it or something in university?
Is that what happened, and that's what led you to
begin your career?
Speaker 2 (09:18):
Yeah, I was a columnist for the paper, and then
I got a job at the Washington Post. I had
been covering a particular thing the Post also covered and
they got it wrong. And I called the editor and
started yelling at them because I loved the Washington Post,
and they hired me.
Speaker 1 (09:32):
I went down there. I argued with them, and they said,
we'll give you a job because you're so.
Speaker 2 (09:36):
irritating. A job as what? A stringer who covered
Georgetown University.
Speaker 3 (09:41):
Yeah.
Speaker 2 (09:41):
So, and that got me into Columbia journalism school. And so,
you know, everything's a step, you know that,
you do one thing to the next. Absolutely. I worked
at the Post and then worked my way up, really,
essentially from the mailroom. I worked in the mailroom
at the Washington Post, which is kind of funny.
Speaker 3 (09:57):
And here's a question. This might sound slightly,
I don't know what, like sexist or something.
Speaker 1 (10:02):
Then go ahead.
Speaker 3 (10:03):
I need to, and I'm just going to go for it.
Like do you think as a gay woman, Yeah, it
was like slightly easier for you to get ahead than
it was for a non gay woman. For a straight woman.
Speaker 2 (10:13):
Yeah, you know, there's an old joke. I don't remember
who said it, but you know: why don't lesbians
hate men? They don't have to
sleep with them. And I have brothers who I love,
and I've always had good relationships with men, and I
think in many ways, and I don't think all men,
but many men want to get along with women.
Speaker 1 (10:30):
It's just very difficult.
Speaker 2 (10:33):
You know, there's a lot of tension between straight men
and straight women, and I didn't have it, you know
what I mean. And every now and then they think
of me as a guy, which is kind of ridiculous.
I was at a thing with a bunch of venture
capitalists and they started talking about a woman in a
really objectified way, and they're like, what do you think, Kara,
and I go, still a feminist, sorry, like I don't
(10:53):
talk about women like.
Speaker 1 (10:54):
That, you know.
Speaker 2 (10:55):
I mean maybe a little bit, but not not a
group of people with a big haha. So you know,
I think they wanted to get along with me and
welcomed it.
Speaker 1 (11:03):
Yeah, I do think. Yeah, there's a lot of lesbians.
Speaker 3 (11:05):
Yeah. No, I mean I mean just the fact that
you called them up and started screaming at them and
went like, hey, wait a minute, you.
Speaker 2 (11:12):
Know, yeah, yeah, they were comfortable with me in my
attitude toward them. I wouldn't say it was mail either.
I just think I didn't have anything that I wanted
from them in particular, and so I think it made
the relationship pretty easy. I was very clear: I'm
very ambitious as a reporter, and I'm going to get
what I want.
Speaker 3 (11:29):
This is important. I think that's an important trait for
a journalist. I think, yes, right? God, I can't take
that away from you. You can't say, oh, I'm just
going to sit in the back here. As a journalist, you
have to want to find out the truth.
Speaker 2 (11:42):
So many journalists, as you might know, aren't very curious,
I find. Sometimes, you know, they sort of take everything
hook, line, and sinker, or they decide to be snarky.
Speaker 1 (11:51):
That's the other, you know, the opposite side.
Speaker 2 (11:53):
I try to be fair. That's what I try to do,
and I don't assume culpability at the beginning.
Speaker 1 (11:59):
I just would like to know what happened.
Speaker 3 (12:01):
Right. And is there anything about that that scares you
a little bit? Like, are you afraid someone's going
to fire you and you're going to end up penniless
in the street or something?
Speaker 1 (12:09):
No. Like, I'm going to be penniless? I'm
highly educated.
Speaker 3 (12:15):
But you know, I know, I have terrible, terrible fantasies of that.
Speaker 1 (12:20):
Content, not at all.
Speaker 3 (12:22):
Congrats. How do you do that?
Speaker 2 (12:25):
Well, you work in an industry that's, you know, like
Hollywood entertainment. It's based on loathing and insecurity, and the
next minute you're not good enough. And so I don't operate
like that. Someone was asking me, they're like, what
are you scared of? I said, scary things. That's about it.
Speaker 3 (12:39):
Like, what's scary to you?
Speaker 1 (12:40):
Someone pointing a gun at me? I guess I would
be scared, you know, something like this.
Speaker 2 (12:44):
But I think people overwhelmingly limit themselves in a
lot of ways. I don't have that good-girl thing.
I think women do it in a lot of ways,
where they're the good girl, like they have to be.
They don't want to be loud. And I'm like, you know,
not loud, but I'm like, I'd like that, please. And
I've written several books, but one book I want to
I've written several books, but one book I want to
write is a series of three books. One is called
(13:06):
No Is a Complete Sentence, the second one is called Yes,
I'll Take That, and the third is called Maybe I'll
Call You Back. Because I think when you have power,
if you exercise it, people do tend to respond to it,
both men and women.
Speaker 3 (13:20):
Getting back to this conversation of being like a homosexual
woman, and a woman, and whatever the opposite of homosexual is.
You know, you think about that, right? Like, you
are able to do that. You know, we almost expect it,
like we expect a gay woman to be quite loud
and quite like that.
(13:40):
And there's so much... Listen, I'm going back to the original statement,
I feel weird, you know, sexualizing it this way or
something, right? But like, if you think about a
woman in Congress, or a woman in the Senate, or
a woman like Hillary Clinton, whoever you
want to think of, you know, they can't do that.
(14:02):
We don't want them to do that, to some crazy, weird,
fucked up extent. You can't behave like that.
Speaker 1 (14:08):
Well, you remember Barack Obama's, really, I think it was
a loathsome statement, "she's likable enough." You know, you
remember that? I do.
Speaker 3 (14:16):
It was terrible, it was, right? Let's get right into all of it.
Speaker 2 (14:19):
He and I had a back and forth
on a bunch of stuff. You know, he's so used
to being constantly liked. I was like, just a second, sorry.
You know, that was a good interview.
Speaker 1 (14:32):
I went later.
Speaker 2 (14:33):
My ex-wife worked for him, and at the end of
the term you go in and take a picture,
and so she's like, can you just come with us
with the kids? And I had done a very testy
interview with him. It's a long story, but it was
quite testy. And I walked into the Oval Office to
take the picture, you know, with the family, and
he's prepared to be super friendly, and he looked at
me and he's like, how did you get in here?
Speaker 1 (14:52):
And I was like, good to see you too.
Speaker 2 (14:54):
I was like, I don't want to be here either.
You know, it's just lesbian drama here.
Speaker 3 (14:59):
Jesus. Jesus. All right, so let's try to
start digging into this, okay, just a little bit, in
terms of the content of your work, yep, and what
you've been observing about the Internet and social media and
everything, that is so much more in depth than any
of us who actually use this stuff would ever delve
(15:19):
into. Like, darling, can you tell me, like, what
we're essentially missing about the whole thing? What should we
know that we don't know?
Speaker 2 (15:28):
I think, one, people didn't realize how much can be digitized, right?
And my whole premise when I started covering it
was: everything that can be digitized will be digitized.
And if you start from that premise, you can go
very far: medicine, law, automation, robotics.
Speaker 1 (15:43):
You know, it just goes on and on and on
and on.
Speaker 2 (15:45):
And I don't think people were prepared, just like they
weren't prepared when we went from plows to mechanized farming,
or when we went from sort of handicrafts to factory manufacturing.
We are in that period, and we've been in it
for quite a while, and it's slowly been working its
way through all aspects of society, and improving while humans
don't, right? And I think people don't recognize how much
(16:08):
power we've given over to, one, the digital overlords, and
secondly, digital itself, how much we've become dependent on it.
Speaker 3 (16:15):
Does it work? Because it doesn't work.
Speaker 1 (16:16):
Work sometimes in some cases a.
Speaker 3 (16:18):
In a lot of cases, for me, it really doesn't. It's so
random that I categorize it as something that doesn't happen.
Speaker 2 (16:24):
But think about just basic things. You used to have maps.
You're old enough to remember maps right when you got
in a car. You don't have those anymore. You don't
have payphones. I once took two of my kids
walking in Los Angeles, which is an unusual thing to do,
and we ran into a payphone that was sitting there,
an old payphone, and my son said, what is that?
And I said,
Speaker 1 (16:42):
Oh, it's a payphone. We went up to it.
Speaker 2 (16:44):
You put money in it and you talked into it.
And he looked at it and he thought, he went, eh,
that's dirty. And I was like, it was, yeah, you're
right, now that I think about it. So, I mean,
I think everything has been changed irreparably, for the good
and bad. And some of it's good, the workplace obviously, and
then the pandemic just accelerated it: shopping, food, fashion too, because
(17:05):
of trends. Right now, things that used to be
somewhat of a gatekeeper situation, where a certain small number of
people made decisions, now that's changed drastically.
Speaker 3 (17:23):
So what are you most afraid of about the Internet?
What scares you about either the current situation
of it or the future of it?
Speaker 2 (17:32):
You know, everyone talks about AI taking over and killing us.
I'm not scared about that. I'm scared about people doing that.
I'm always scared about human beings and unaccountable power, unaccountable
unelected power, with people that have so much money they
can do anything they want. That scares me, and with
people that are unqualified to make the decisions for our society.
(17:54):
And you know, you could laugh all you want at
elected officials, and you should, especially this week. They're
a ridiculous bunch of clowns, but they were elected. Okay, let's
just start with that. You know, someone like Elon Musk
or Jeff Bezos or any of these people were not
elected and they're making major decisions. You know, a small
group of people is making decisions, and I find
that problematic.
Speaker 3 (18:15):
Yeah, it's very scary, I have to say, especially since
they only have their own agendas at the forefront
of that decision-making, right? And you know, there are
a great deal of blessings about the Internet and about
social media. But can we focus for a minute on
social media? Because I feel like, of the Internet, that
(18:36):
is probably the thing that's the cancer. The
cancer, exactly. Okay, would you refer to it as a cancer?
Do you?
Speaker 2 (18:43):
Well, I think... I did
an interview with Marc Benioff,
actually, who I like very much. There's a lot of people
I like, you know, there's a lot of really interesting
people. I always liked Steve Jobs.
Speaker 2 (18:52):
I thought he was a visionary. Spent a lot of
time with him, did a lot of interviews. But yes,
I think social media is It didn't have to it
could have been a tool. It became a weapon. It's
been weaponized, it's been politicized, it's been weaponized essentially by
all kinds of people. And it's easy by it also
plays into our base instincts. It crawls down our brainstem
(19:16):
and goes to our real base instincts around how we
speak to people. I just did a great interview with a
guy at MIT who has, I think it's the Center
for Constructive Communication, which is kind of an irony
right now. And you know, it's really that the things
you do on the internet you wouldn't do face to face.
And now the stuff you do on the internet is
(19:37):
spilling back into the real world. So hence
Marjorie Taylor Greene: she's online, and then she behaves
the same way offline, right? And so, you know, there
was a really interesting thing that happened this week, when
this guy who's absolutely virulently anti-gay was elected to
the speakership of the House. And a reporter asked
a normal question. He's also an election denier, and not
(19:59):
just any election denier. He's like the chief election denier,
I know. And so the reporter asked a question about that,
and he had a group of people behind him.
Speaker 1 (20:09):
Behind him, they all jeered, especially that old lady, you know.
Speaker 3 (20:12):
That old lady. I thought, I know, I'm an old lady.
Speaker 1 (20:15):
I was like, we're having a pickleball match
and whoever ends gets killed in the end, you know,
like you.
Speaker 2 (20:21):
Know, whatever happens with her. But they were jeering, and
I thought, that looks like online.
Speaker 1 (20:25):
This is a physical manifestation of how people behave online,
which was really interesting. So it's shifted back now.
Speaker 2 (20:31):
But in real life, most people are decent. They do
not, you know, dunk on you or say some dumb
Speaker 3 (20:38):
Meme or you think I'm.
Speaker 1 (20:42):
I'm talking about But it's bleeding. It's bleeding the other way.
Speaker 2 (20:46):
So it went from a relatively decent conversation to a
really ridiculous conversation and now that's bleeding back. So more
people feel emboldened to act performatively, and that's what it
is: performance.
Speaker 3 (20:57):
Really. Do you feel, and maybe you can answer the question,
okay, about this idea of regulating such a thing. Are
there regulations?
Speaker 1 (21:07):
Not a lot.
Speaker 3 (21:08):
But okay, tell me everything you know.
Speaker 2 (21:09):
Well, I happen to live in Washington right now, long story,
because of family stuff, and I think it's the right
place to be, because one of the things that's happened
in the entire history of the internet: now the top
ten richest people in the world are
tech people, except for Bernard Arnault, and the Saudis
are also in there, but it's mostly tech people. The
top ten richest companies, same thing, all tech companies except
(21:33):
for his company and the Saudis' Aramco. So that
means they're the richest and most powerful people on the
planet, by far. These are trillion-dollar companies, multi-trillion-dollar
companies, never happened before. I think Apple and Microsoft
are up there in that regard. Google's close. Facebook is slower,
but nonetheless they're the most powerful companies in the world.
How many pieces of legislation do you think there are
(21:55):
regarding this particular industry? There are hundreds on finance, hundreds on fashion,
hundreds on pharmaceuticals, et cetera, et cetera.
Speaker 1 (22:03):
How many do you think there are?
Speaker 3 (22:04):
You're scaring me now. I have zero idea. Zero is what
I thought.
Speaker 2 (22:08):
And there is one law, but it allows them complete
immunity from litigation.
Speaker 1 (22:14):
I know, right? They could walk down Fifth Avenue and shoot someone.
Speaker 3 (22:18):
So you're talking about laws that need to be enacted, right?
Except how do you ever get any of that done?
It's basically such a scary, scary situation, between these
people who run these companies, well, they've outpaced our government
and the world's governments, because it's not just our government
that has to regulate.
Speaker 1 (22:38):
The US has to, like, these are US companies.
Speaker 2 (22:41):
For the most part, let's leave China out of it
because it's a whole different situation. But they ought to
be regulated. They ought to be regulated. They're not in
our purview. They don't behave internationally.
Speaker 3 (22:50):
All right.
Speaker 2 (22:52):
But there's a woman in Europe named Margrethe Vestager who
has passed all kinds of laws. She's a badass. They
hate her and I love her, my favorite person on
the planet. They made a whole series, a Danish series,
about her, and she's... Oh.
Speaker 3 (23:05):
I love that series. Is it called Borgen? Yes, that's
her. Love, that's my favorite series.
Speaker 1 (23:10):
Badass.
Speaker 3 (23:10):
I love her and I love that actor. That actor
who plays her is so, so good. Although, the men in
that show, could I just make an aside: the
beauty of the men in that show, if
you just want to see some beautiful Danish men. Okay, sorry,
go on, back to her virtues.
Speaker 2 (23:26):
She's amazing, she's astonishing. She's at the EU, and she
passed all kinds of laws. But it doesn't matter, because
it has to happen in this country, a lot of
it. Now, California has passed some laws under Kamala Harris,
Gavin Newsom, and others, but there's been no national privacy legislation.
There's been no antitrust legislation. There's been no algorithmic
transparency legislation. There's been no data legislation. They are not liable.
(23:49):
They can do anything they want. There is some stuff
happening now around AI, and it's bipartisan, and there are several
people trying, but they have yet to pass it. I'm going
to the White House on Monday because President Biden is
initiating an AI executive order around this. Let's see what
he says. But it shouldn't be an executive order. Congress
should be able to pass the privacy legislation, you know.
Speaker 3 (24:10):
And what's even scarier than what you're saying, or
what I'm reading into what you're saying, is that, you
know, they kind of regulate themselves a little bit. You know,
they take things down if they feel that's correct. That
is basically fascism. That is fascism.
Speaker 2 (24:28):
Well, it's something. I don't know if it's fascism. I mean,
taking Donald Trump down off of Twitter was probably a
good idea, the same thing, but it was made by
two people, right? And that of course gets into First
Amendment issues. But these are private companies making decisions for
all of society. And Tim Cook at Apple has done
a lot around ads and tried to regulate some of it,
but he doesn't want to be the chief regulator. He's
(24:49):
a very nice man. He's a very wise person, I think,
but he shouldn't be the chief regulator in the United States.
And you know, you have these senators that have lobbying
money pushed against them that is so massive you can't believe.
Speaker 1 (25:01):
I mean, Amy Klobuchar, who is, I think,
terrific.
Speaker 3 (25:06):
I really like her.
Speaker 1 (25:07):
She's a badass, but she's tried.
Speaker 2 (25:08):
She used to call me. She'd be like, this month
I'm going to get it passed, Kara. I'm like, good
fucking luck, Senator Klobuchar. And it didn't happen.
Speaker 1 (25:16):
And then she called me. She goes, okay, now this time,
and I'm like, uh, you give me a call.
Speaker 3 (25:20):
Why, why doesn't she have more pull in Washington?
Speaker 1 (25:24):
She does. She's a very effective legislator. She's someone
who, unlike say Jim Jordan, actually passes legislation.
Speaker 3 (25:31):
Right. But talk to me for a minute about this
idea of, you know, fighters, right? Like people who will
actually give a shit enough and don't go along just
accumulating followers and likes, you know, posts about Barbie,
you know what I mean. What about the people who
can't be bought?
Speaker 1 (25:52):
Well, they all can be bought. It's not bought, exactly, you know.
Speaker 2 (25:54):
Like Amy Klobuchar had one hundred and ten million dollars of
lobbying against several of her bills, and they didn't pass.
And by the way, it got scuttled by both Democrats
and Republicans, because they were worried about their individual constituencies.
Speaker 3 (26:06):
Well, maybe I see this as a parallel. It's like
the arts. You know, there's not so much critical writing anymore.
There's not a whole lot of critical writing in the
mainstream anymore, where there used to be critics who would
come out and say, I hate this, this is a
terrible thing, don't go see this. That no
longer exists so much. How do you
(26:28):
reassure me that we're not all going to end up
like, you know, WALL-E? Did you see the movie
WALL-E? Yeah, it's the most brilliant thing. That brilliant. But
people should go, well, you need to see that, because
I feel like we're all going to be floating in
some kind of a capsule with a screen, you know,
obese people with the screen, talking about this, right? And so,
can you reassure me that that is not going to happen?
Speaker 2 (26:50):
No, I can't, because this stuff is addictive. We left
out the addiction part. These are like cigarettes, these
are like liquor. You know, maybe there's an Ozempic
for this, but right now it doesn't exist, because it
is addictive.
Speaker 1 (27:01):
It's a casino. By the way, I think those drugs
are really interesting. I think it's a really interesting trend.
As much as people make fun of it, it's actually
an important issue. Technologists are super involved in it, and
this is an area I think is important. But that aside,
it's addictive and it's necessary. It's both addictive and necessary
because you can't do your job without it.
Speaker 2 (27:21):
And so we have decided to marry digital, right, We've
decided this is their marriage we're going to have, and
we can't get out of it. We really literally can't.
And in some ways some of it's incredibly powerful. What
AI could do around healthcare, around savings, cost savings, around
climate change, It's massive. But that's what it was like
at the beginning of the internet. What the internet could
(27:41):
do for world peace, how it could have shown our commonality, linking us together. Instead it degenerated into what is happening now, and then again spills over into the world.
So every one of these technologies is a tool. Look, a knife is a tool and a weapon, right, depending on how you deploy it. And so it's been too much
(28:02):
weaponized and too little used as a tool. That said,
the possibilities are endless, like endless and endless and endless,
and you have to sort of start to get leaders
that lean into that versus leaning into the quick buck, something I always say. I like the new level of entrepreneurs coming out. There's a lot of different ones coming out.
(28:22):
But years ago, when someone was doing something I didn't like, it might have been Zuckerberg, one of them.
Speaker 1 (28:28):
And I said two things. I said, One, you're so poor.
All you have is money.
Speaker 2 (28:33):
And then the second thing I said is, you either are going to do something about income inequality, because the money you have is obscene, obscene, and you didn't make it just because you were so smart. You gamed the system. Like, at some level you're smart. The second part is it has a network effect essentially. And I said, you either have to do something about income inequality or you need to armor plate your Tesla, right, and
(28:56):
because that's.
Speaker 1 (28:56):
Where it's going.
Speaker 2 (28:57):
And then I thought, oh my god, no, they like armor plating their Teslas, right, that's what they want.
Speaker 3 (29:02):
Actually, yes, so yeah, because I feel like that's the mentality,
like the deeper the bunker, the more armor plating you
have on your car, the better off you are. And
that's what everybody's like.
Speaker 1 (29:14):
The cashmere, these worlds. Do you watch Succession?
Speaker 3 (29:18):
Excuse me?
Speaker 1 (29:19):
I did the podcast. I did their podcast, I know.
Speaker 3 (29:21):
I listened to a few episodes and it was absolutely
it was great.
Speaker 2 (29:25):
But one of the things we talked about with the guy who created it, Jesse Armstrong, was what he got right about it. Since I spent so much time with billionaires, I know them quite well, is their worlds get smaller and smaller and smaller and more comfortable. Cashmere, right? Yes, I call it the cashmere prison.
Speaker 1 (29:42):
And so they go from their plane to their car
to their house totally private.
Speaker 3 (29:48):
And then their medical facilities, right, they.
Speaker 2 (29:50):
Never and they have no sense of self awareness, mark injuries.
And who's a very famous technologist, Uh A long time ago.
I would say he's been slacking a little bit. But
he just wrote a piece saying you're either with AI
or against it, and you're either with us because us
regular people. He literally put this in a thing, and
I thought, I've met a regular people in a decade,
(30:12):
except if they're coming to bring you your morning whatever the.
Speaker 1 (30:15):
Fuck you drink.
Speaker 3 (30:15):
Also, what an antiquated word to use, right?
Speaker 2 (30:18):
It was so fascinating. I was like, you're not regular.
You weren't regular to start with, and now you really
aren't regular.
Speaker 3 (30:25):
Speaking of Succession, okay, speaking of what we were talking about,
like arts criticism and the entertainment business merging in this
kind of sickening way with the news industry, you know,
can you talk me off the ledge a little bit
about that? I mean, so far you've only increased my anxiety.
Speaker 1 (30:46):
All right, Okay? Good? No, I cannot.
Speaker 2 (30:48):
Actually I don't think it's Look, everything's getting impacted by
digital Like look, the New York Times made a grievous
mistake with the bombing of the hospital.
Speaker 1 (30:57):
They put a headline on, and they're very influential. That was.
Speaker 2 (31:03):
Inaccurate at best, right problematic in every way because they
were moving at the speed of social media, and social
media was so full of bullshit and lies and everybody
was using it for propaganda that even they got affected.
And they're the sort of should be the last line
of defense.
Speaker 1 (31:19):
And so I think the.
Speaker 2 (31:21):
Idea of look over here and the entertainmentization has been
happening for a very long time.
Speaker 1 (31:25):
It's not a new phenomenon.
Speaker 2 (31:27):
I think the internet and social media just accelerates and
the pandemic accelerated it further because we were so reliant on
these things, whether it was shopping or entertainment. The other
thing that's happening in Hollywood, and you do a lot
of different shows, is the economics have changed rather drastically,
and people in Hollywood have not paid attention to this.
Like I have spent so much time with Hollywood people
(31:48):
and I cover them, and I keep saying, economics are changing.
You better get ready for what's happening. Same thing happened
in newspapers and music. It's coming for you too. And
they seem to want to live in the old world.
When I listened to them talk about, you know, residuals,
I'm like, that's kind of done. You have to now
think of a new way of getting compensated. And by
(32:09):
the way, the power is all in the hands of
these especially like a Netflix, because they have all the data.
You don't have all the data, and everything is data now,
and the news is the same way. There's been an
explosion in amazing journalistic enterprises. I've created a lot that
are much better than the bigger ones, and they're more accurate.
They don't need as much money to operate. They can
(32:29):
make a lot of money and yet also not depend
on the larger ecosystem essentially, So in that way you're
seeing a lot of innovation.
Speaker 3 (32:37):
That's a modern phenomenon if you ask me. Because if
you were a journalist in the nineteen fifties, sixties, seventies, yeah,
like it wouldn't necessarily be this platform for you to
stand upon and say what you thought. That's the job
of like, you know, a critic, So that's the job
of someone else, but not necessarily a journalist, right.
Speaker 2 (32:57):
Because you had to be attached to a big ship.
And I left a long time ago, and I'm just
as powerful. And it's interesting. It's an interesting phenomenon. I
think it's probably the same in fashion, right you can suddenly.
Speaker 3 (33:08):
I don't know, Honestly, I don't know. I don't really.
Speaker 1 (33:11):
You have to be next to a big company.
Speaker 3 (33:13):
No, yes you do, I guess so, yes, yes, But
I feel like fashion in that way is kind of
over in so many ways. I feel like movies are
kind of you know what I mean. I used to
want to really go out to see movies, but now
they don't shoot movies as much as they shoot like
all those special effects that they add in later, right,
you know what I mean. So it's like, well, why
(33:33):
should I go to the movies and sit there and
pay however much money, when I could probably get as good an experience at home. There's no print of anything anymore, you know what I mean.
Speaker 2 (33:44):
It's because the home experience has gotten better as the movie experience has gotten worse, right, right. So, you know, a bad consumer premise: I pay all this money to go out for shitty seats, shitty food, shitty sound, exactly.
Speaker 3 (33:57):
But wait, wait a minute, because I was kind of
using that as a parallel to this kind of journalism
that goes on today. Listen, the Washington Post today, as
much as we love it, is not the Washington Post.
Speaker 1 (34:10):
Of, oh, that critic who was amazing, Kay Graham.
Speaker 3 (34:13):
But you know what I mean, it's like, you know,
you are fighting this fight by yourself and it's sort of.
Speaker 1 (34:18):
The but it's not. But that's different.
Speaker 2 (34:20):
But I think there's like, look, the Business of Fashion is a very small little thing happening now.
Speaker 1 (34:24):
I think she's at Puck.
Speaker 2 (34:26):
But I think there's a lot of really innovative entrepreneurial
journalistic activities happening all over the place, and I think
they're good.
Speaker 1 (34:33):
I think they're really good.
Speaker 2 (34:34):
And the economics work for a smaller group of people,
and that's what you're going to see throughout because like fashion,
people can now go to social media or TikTok, especially music, to get discovered. And you know, I did an interesting interview with John Legend when they did the strike. I said, oh, this is a problem, because TikTok's now entertainment. I don't think you realize how much it's entertainment.
Speaker 3 (34:53):
Of course, he.
Speaker 2 (34:54):
Said there's a lot of bad people on it, right,
And I said, you know, John, out of one hundred
people seven them are good. They never would have been
discovered before. So you're screwed because Seth, you can now
find them amid the thousands, because it's one constant tryout.
It's the same thing with fashion, it's the same thing
with everything journalism. And so I think that's exciting.
Speaker 1 (35:14):
I do. I personally think it's exciting.
Speaker 3 (35:16):
I think it's exciting, but it's also so scary, Kara, like, because there isn't like a trained bunch of people looking at it. Like, you go on Yelp. A lot of people believe Yelp so much more than they believe you or me.
Speaker 2 (35:32):
Tastemakers. You're talking about tastemakers.
Speaker 3 (35:35):
Gatekeepers, tastemakers, or, you know, it's like the way people
politicize news. You know, you can't even talk about an
insurrection that happened on January sixth without some crazy person
in Congress right saying that you're politicizing the news when
I'm not really, I'm just reporting on what happened.
Speaker 1 (35:54):
But it's your choice to react to them, right? Why do you react?
Speaker 3 (35:56):
What do you mean?
Speaker 2 (35:57):
Because I think what they're doing is, it's all performative, and we're in an episode of Network, right? You remember Network, the movie, and the wonderful theater production. Bryan Cranston was amazing in that. We are in Network. Remember, everyone laughed, like Sybil the Soothsayer and all this stuff, and anger, you know, being a thing. We are in that, that's where we are, and everyone laughed at the time.
(36:19):
It was a satire, but you know he was aggressive.
Speaker 3 (36:22):
Who laughed at that?
Speaker 1 (36:23):
I think a lot of people.
Speaker 2 (36:25):
It was, but I think a lot of people at
the time were like, look at this, It'll never happen.
Speaker 1 (36:29):
And that's precisely what happened. Actually, well, that's.
Speaker 3 (36:33):
What scares me about the moment that we're living through,
especially right now, you know, like where we think something
won't happen, but in fact it is happening. You know,
it is happening, darling, and there's real.
Speaker 1 (36:43):
Power to these people. And I would agree. I think Trump,
of course, is the biggest version of that.
Speaker 3 (36:48):
Like we can trace it all back to twenty sixteen,
right right, but it was.
Speaker 2 (36:51):
before that, because he was, you know, he was
on his show. I watched every episode of that show.
I was one of the two people who watched it.
Speaker 3 (36:58):
I was on it. Were you? Yeah, I hated it. No, it was like they worked for me in this one episode. That woman, what's her name? She couldn't pronounce my, you know... that fabulous... she's cold, yes, yeah. And she said, oh, I'm going to work for Isaac MASSARAHI. And I was like, darling, if you can't pronounce my name, you can't work for me. It was like that was
(37:19):
an episode and it got a lot of... And I was so mad at my business partner at the time for hooking me up with that. I was so mad. We were at a Yankee game in October, right, and we had to leave because I had to go tape this fucking thing. And I was so mad at her. And then of course, you know, everybody picked up on something about my name and about, yeah. So we were talking about, I'm sorry,
(37:48):
the nascans or the well, here's the thing.
Speaker 2 (37:50):
Trump used Twitter, particularly the way JFK might have used
television or FDR used radio. I mean, I think every
era has someone who's good at it. Now, he really was. He doesn't continue to be, but he was the troll in chief, right.
Speaker 1 (38:07):
He's really good at manipulating the media and going around
it with his own voice. Now you may despise him,
but it was. It was a brilliant move on his
part because he had an image that had been carefully
crafted through the Apprentice. I always thought he was a
poor person's version of a rich person, right, you know
what I mean, like an idea of like.
Speaker 3 (38:26):
You know, he doesn't care about like having a seat
on the board at the library, right, Yes, he cares
about having Yes, he wishes, well, he needs someone to
fill that.
Speaker 2 (38:35):
For him, right, And so I thought he used it
rather effectively. I wrote a lot in the New York
Times about this. I kept saying what he's doing. I
wrote a column in the Times, and I won't tell
you the date where I said, after the election, if
he loses, what if he got on there and said
it was a fraud, it was stolen. And then he
kept saying it and repeating it for a month afterwards.
(38:55):
And then he pushes his supporters to stop the process physically.
I wrote this in twenty nineteen. Mm hm, that's exactly
what happened, because you see, you could see how he
was using it, and he couldn't have done it without
these digital tools. He absolutely couldn't have done it, you know.
And it will come to bite him in different ways
because he's in a court of law now, which is
a very different place and has not been affected in
(39:18):
the same way. But he definitely took the power of
the Internet and used it for his own devices.
Speaker 3 (39:23):
Yes he did. I am still suffering from this. What would you call it? Trump derangement syndrome. It's a relative of Stockholm syndrome. I just, I'm so
afraid that he's not going to have to pay for
his actions, you.
Speaker 2 (39:39):
Know, because he's become a narrative for you versus you know,
Trump's been He's been here before, you know, whether he's
Huey Long or Mussolini, these people existed.
Speaker 1 (39:48):
Look, yeah, Hitler did not need Instagram.
Speaker 3 (39:50):
It's making me feel great, keep going.
Speaker 1 (39:52):
But I'm just saying Hitler did not need Instagram.
Speaker 2 (39:55):
Well, it's just the question of at some point all
these things, everyone will use them and therefore they are
rendered less effective, right and so or people get used
to them.
Speaker 1 (40:04):
I have four kids. I have four kids.
Speaker 2 (40:06):
They are not as affected by this stuff as we are,
right because they know how to use it, they understand
what's happening. They aren't quite as easy to manipulate, just
like we weren't as easy to manipulate with television. We all lost our minds over television. It was the idiot box, all that. It was idiotic. We knew it wasn't the case, right,
and then television got better.
Speaker 1 (40:25):
It did. It's great.
Speaker 2 (40:26):
It's not just good. Television is an art form now.
It really truly is. Not all of it, not all of it.
Speaker 3 (40:31):
Succession is a lot of it, a lot of it,
so much, and it's weird. Speaking of addiction and stuff,
I just watched this incredible movie that I recommend very
highly called Cassandro, and it was so beautiful and I
watched it on Amazon Prime and something about it. It
was beautifully crafted enough so that I could actually sit
through two hours without needing to go, you know, next episode,
(40:54):
skip intro, you know what I mean? Whereas like now I need to go, like, Gilmore Girls, skip intro. "If you're out on the road..." skip intro, like, next episode.
Speaker 1 (41:04):
Right, that's... I'll tell you a thing that'll fuck you up.
Speaker 2 (41:08):
I just watched Three's Company, because I like it, and I
watch it. If you watch it again, you can't watch
it again because you're like, oh my god, it's all
about date rape and homophobia.
Speaker 1 (41:19):
Literally, and you're like, even Marjorie Taylor Green would be like, no,
that's enough.
Speaker 3 (41:24):
Wait a minute. What about just the story of Suzanne Somers and how she asked for more money, at least as much as John Ritter. And they were like, oh, okay, you're a blonde, we can recast you. And then they did. They did. Chris... I mean, Chrissy. It's just Chrissy exactly.
Speaker 2 (41:38):
Chrissy was her name. And the other woman came in who looked like Chrissy.
Speaker 3 (41:42):
Right, Well, just that story alone, you know, don't go
back and watch it. No, go back and watch it.
You have to, all right, I will try, But I'm saying, like,
back to that thing about how you know I don't
even want to watch a movie, Like the length of
a movie has become irrelevant.
Speaker 1 (41:57):
You know, you saw Oppenheimer, didn't you.
Speaker 3 (41:59):
I just no, I did not. I saw Barbie and
I literally disliked it so much.
Speaker 1 (42:04):
No, I did not like I loved it.
Speaker 3 (42:06):
I really did not like it. I didn't. I felt
for a million different reasons. I didn't feel like there
was enough of a story or a movie as much
as a bunch of little ideas, just in terms of
something that holds my attention. And then you know, it's
like when I was a thirty something, you know, they
asked me to be at the Miss America pageant to
(42:28):
be and I was like, you know, I can't do that,
and they were like, oh, please, please, can we just
talk to you for the documentary. I was like, you
really don't want my perspective on this, because I think it's disgusting, you know. They were like, we really do, we really do. It was like, okay. And I sat there and
I told them how disgusting I thought it was, and
how I really have absolutely no humor about that subject.
And that's kind of the way I feel about Barbie,
(42:50):
and that's kind of the way I feel about the
kind of evil empire of Mattel and how that's all
it was, was just more grabbing and more branding and more money. And bless them, bless Greta Gerwig. We can bless those people, and yes, yes, yes, but it didn't personally work for me. And I guess, I don't know. What an excellent review, I'm sorry. I will see Oppenheimer. Do you think I should?
Speaker 1 (43:10):
It's too long?
Speaker 2 (43:11):
It's actually... I think nobody wants to tell Chris Nolan. Like, I'm doing an interview with him, and I'm going to say, look, I feel like nobody told you you needed to be edited. You're a brilliant guy, but nobody understood your last movie, and someone should say, you know what, hey, hey lady.
Speaker 1 (43:28):
You need a little cutting, like a little judicious cutting.
But it was it was a good movie. It was
a fantastic movie.
Speaker 3 (43:34):
I'm going to definitely see it the minute it starts streaming,
you know, because I'm so lazy and I'm not going
to the movies. But wait a minute, I have a
question to kind of wrap this up a little bit
because we're kind of almost finished. But you know what,
I remember when I started in fashion and entertainment, some
journalists asked me this question, like what do you see
for the future. And I thought about it and I said, well,
for fashions, well for fashion and entertainment, just for all
(43:57):
of it. Okay, let's talk about fashion, right yeah, And
I said, I think it's going to be these two incredible,
like separate factions. There's going to be this incredibly, incredibly simple,
simple thing where it's just minimal to such an extent
that you don't even notice it. And then the opposite
(44:18):
of that, it's going to be like you know, Hunger games,
the Hunger Games exactly right. And that was my answer
because I thought about it, and I'm not wrong. I mean,
that is kind of thirty years later, that's kind of
what's taken place. What if I asked you that about
the Internet? Could you see what the future of this all is, the Internet and social media?
Speaker 1 (44:38):
Well, I think.
Speaker 2 (44:38):
Social media is destroying itself, right, I think it's becoming,
especially for young people. I think they're not interested, including
you know, TikTok is not social media, it's entertainment. That's
how I look at TikTok. I think it's addictive, but
I think it's addictive in the way good entertainment is, right,
like you just kind of want more. It's like, uh, Doritos, I guess, and some of it's quite good.
Speaker 1 (44:57):
I mean, I don't know if you use it. I
use it a bit.
Speaker 3 (45:00):
I use it a little. I use Instagram a lot.
And in the past sort of six years or seven
or eight years, it's like, you know, war guns and
then puppies and you know, Britney spears. You know what
I mean, and it makes me feel a little crazy.
And what I worry about is people not feeling crazy
when they scroll from people being hostages to, you know,
(45:23):
puppies.
Speaker 2 (45:24):
Oh there was no way that wasn't going to enter
the picture over in Instagram. But Instagram has been relatively
benign compared to, like, a Twitter or Facebook.
Speaker 3 (45:31):
Well, Twitter is just cock. God it's always been cock.
Speaker 1 (45:34):
God, it has. Now it's really lost its mind, because its owner is also problematic in a lot of ways.
That's right, But I think that there's going to be
new ways.
Speaker 2 (45:43):
I think I really do believe in young people. And
maybe it's because I have kids, but I do think
they're very judiciously watching this stuff. I do think there's
a lot of technology that's so promising around healthcare, cancer,
climate change, weight gain, solving some of these issues around transportation. I ride these driverless cars around San Francisco.
(46:06):
I know some of them are problematic, but it's heading
in the right direction. It really truly is.
Speaker 3 (46:10):
It just is. I love you.
Speaker 1 (46:13):
Know, cars, trucks, safety.
Speaker 2 (46:16):
I do think we're going to solve cancer in the next fifty years using digital means. I just was with Jennifer Doudna, who won the Nobel Prize for CRISPR gene editing and things like that, which is, you know, some of it could
Speaker 1 (46:28):
Go a real ugly way.
Speaker 2 (46:29):
She was talking about that. But some of it could
go an amazing way. And so it just depends on
where we are. So I see a lot of promise
at the same time, you saw what happened with this Israel-Hamas war, and I'll stay away from the war itself. What's happening online is insane, just insane, and
it's so problematic, getting back to propaganda and misinformation, and
(46:51):
it's setting people against each other in this highly reductive way.
Speaker 1 (46:55):
Pick one, pick a side, pick a side.
Speaker 3 (46:56):
Can we talk about me and my husband for a minute? Go ahead, but.
Speaker 1 (46:59):
It's a complex issue.
Speaker 2 (47:00):
You can have two ideas in your head at the
same time, and for some reason people can't. People can
say Hamas is a terrorist organization, they did a heinous massacre, and Netanyahu is a problematic criminal.
Speaker 1 (47:14):
He really is.
Speaker 3 (47:15):
That's correct, and.
Speaker 1 (47:16):
The brutality against the people of Gaza is problematic. We
have to figure it out.
Speaker 3 (47:21):
And so people don't seem to be able to hold those ideas. Darling, it becomes nothing but pure anti-Semitism, and it is, one hundred percent.
Speaker 1 (47:30):
Listen, my wife is Jewish. I get it.
Speaker 2 (47:32):
It really is, and people can't do it. And there was something at GW. I live in Washington, D.C., and they were putting all this stuff on walls.
Speaker 3 (47:40):
How cruel. I don't think you... the cruelty in tearing down posters of people, of people who are
Speaker 1 (47:47):
kidnapped. Online moving into... it's cruelty.
Speaker 2 (47:50):
It's impulse control, it's an inability to take a frigging moment and understand another human being. And that's my worry. That's the fear, that it becomes impossible to see them as a person, right, because everything is a reaction, and a reaction performative, and that's worrisome.
Speaker 1 (48:10):
That is worrisome.
Speaker 3 (48:11):
Usually I ask this particular next question much earlier in
the interview. But because you're such a like a fighter,
and I don't necessarily see your career trajectory or your
life even as linear in this way, it took me
a minute, but was there a setback that you went
through in your life that taught you something like a failure?
Speaker 1 (48:31):
You know, my podcasts... one of them is called Pivot.
Speaker 2 (48:34):
You know, I don't see things as failures, I see them as opportunities. That's a good thing I've gotten from tech people, and that's not stupid. Like, a lot of them are like, it wasn't a failure.
Speaker 1 (48:42):
I'm like, oh that didn't that didn't work. What can
I do?
Speaker 2 (48:46):
I tend to try to see things from a different perspective than other people. Like, I tend to just go, okay,
then what am I going to do next? I'm not
particularly scared about next. I'll tell you one very short story.
I was meeting with a woman who was a very
powerful tech executive and she came to me and she said,
you know, I'm leaving this company. I want to be
a CEO. I've been offered this, this or this right.
(49:10):
And I said, uh huh. She goes, what do you
think I should do? And I said, well, what do
you want to do? And she said, well this is interesting.
I said no, no, no, lady, you sound like a
lady who's gotten a marriage offer and that's your only choice, right.
I was like, if you were offered in a restaurant chicken, beef, or fish, and you want duck, fucking order duck. Like,
(49:31):
get duck, get... get the duck. And she was like, well,
what if the duck's unavailable.
Speaker 1 (49:37):
I said, Oh, the duck's available. It's back there. That's
fucking back there. They have the duck.
Speaker 2 (49:42):
They may not have the duck, then go to another restaurant.
I was like, why are you picking the choices that
are in front of you versus everything else? And so I've conducted my life that way. And the
second way is I leave things all the time. I'm
very willing to leave. And I was at a very
powerful place and someone said, well why are you leaving?
And I said, don't take this the wrong way, but
I don't want to talk to you anymore.
Speaker 3 (50:03):
How could you say that to them?
Speaker 1 (50:04):
I didn't hate them. I was like, I just am done.
Speaker 3 (50:07):
I can't deal with you.
Speaker 1 (50:08):
Done talking to you. And it's no insult, but that's
me and it's not me, it's you.
Speaker 2 (50:12):
I'm done having a conversation with you, and I want
to talk to someone else. And I think people don't
do that, especially lucky people like you and I. Right,
we are educated, but both of us are white. We
have a lot of advantages, and we tend to focus
on our negatives versus our advantages, and we are able
to get up and get out, and so many people can't.
So many people are trapped in prisons.
Speaker 3 (50:34):
And one of the things that I loved about this
talk was we never once talked about how difficult it
is to be homosexual, or the setbacks, or bullying, anything, because that exists, but we talked about the actual advantages.
I think it's better, so much better, absolutely better. I mean,
(51:00):
I have to tell you, I don't know why you
had kids. You, you didn't have to have kids. I love kids. Kidding, I'm kidding, I'm kidding. I like some kids. I like some kids. My kids.
Speaker 1 (51:12):
I'm sixty. I have a four year old and a
Speaker 2 (51:13):
Two year old.
Speaker 3 (51:14):
I'm too old, really, my husband is fifty two, so
I think we're too old.
Speaker 1 (51:19):
No, No, kids are the best.
Speaker 3 (51:22):
All right, maybe we'll get by the way.
Speaker 2 (51:24):
Let me just make an observation, lesbians should raise all
the kids. I have the most manly sons who are
also very sensitive, but they're not too sensitive.
Speaker 1 (51:34):
They're great, they're great men.
Speaker 3 (51:36):
Well, it's like that incredible Michelle Wolf joke. I don't know if you watched her. No? I love her. Excuse me, you must. You're welcome, go home and watch that. I will, I will, love. And she says, like, I don't know why lesbians like to go,
Here's some solutions, you know. And then we're all friends
with gay men who go you're ugly, you know, which
is just the most hilarious. It's a fucking no, I know,
(51:57):
but you know, as a joke, it's a punchline.
Speaker 1 (51:59):
I love a gay man. I live in the Castro in San Francisco. It's my favorite place.
Speaker 3 (52:02):
Of course.
Speaker 1 (52:02):
Whenever I feel bad, I come back here and I'm like, ah,
the naked guys, I'm so happy they're here, all right.
Speaker 3 (52:08):
Final question, which I actually think is a good pivot
to the last question.
Speaker 1 (52:12):
Pivot.
Speaker 3 (52:12):
You know, I'm obsessed with obits. I wake up every
day and I read the obits and I think about
what mine is going to be?
Speaker 1 (52:18):
What is yours?
Speaker 3 (52:19):
Tell me. Mine is... mine is not what it would be if I died tomorrow. I need another, like, fifteen
years to actually prove what I want my obit to
be about. What do you want your obit to say?
Speaker 1 (52:29):
Oh, that's a good one. She did what she wanted.
Speaker 3 (52:33):
She ordered the duck for what she ordered the duck.
Speaker 1 (52:36):
She ordered the fucking duck.
Speaker 3 (52:37):
The fucking duck, fucking duck. That is so inspiring to me.
I cannot even tell you. I'm going through all this stuff.
I think the world, astrologically... I'm not sure if you believe in astrology, but a little bit. The world is going, interesting, freaking crazy.
Speaker 1 (52:51):
It's a simulation.
Speaker 2 (52:52):
Do you know that a lot of tech people think
a lot of tech people, including Elon Musk, a lot
of people think we're in an actual simulation. So essentially teenagers from a future society are playing a video game right now, and we are the video game, and so it's not real. And that's how come it's so crazy, they say. That's
Speaker 3 (53:10):
their premise, why we are crazy, exactly.
Speaker 1 (53:13):
This is just a simulation.
Speaker 3 (53:15):
Well, what do you want to promote on this podcast?
Speaker 2 (53:17):
You know, I have a lot of things I do,
but I have podcasts. You should listen to them.
Speaker 3 (53:21):
They're fuck okay, I can't wait. I listened to some
of them.
Speaker 1 (53:24):
Obviously, Pivot is really funny.
Speaker 3 (53:26):
You can't avoid Kara Swisher.
Speaker 2 (53:28):
I think soon. I have another thing about to be announced. I can't say yet, but soon. I really try very
hard to treat people fairly, and at the same time,
I think you should listen to my interviews with some
of the most powerful people on earth, because while I
don't let them off the hook, I really do ask
the questions that need asking of these people and hold
them to account.
Speaker 1 (53:46):
I'm also not unfair to them.
Speaker 2 (53:47):
I don't think there's any plus in attacking people unnecessarily, unless they deserve it, and I ask them tough questions.
Speaker 1 (53:54):
I believe smart people like smart questions.
Speaker 2 (53:58):
Right. Someone said, why did Steve Jobs keep coming back to talk to you? Why do all these people that you are very tough on? I said, because they're smart.
Because they're smart people, and they're not scared of tough questions.
Speaker 1 (54:08):
And I believe that. I do believe that. And I think the lesser people are the ones that run from you.
Speaker 3 (54:13):
Amazing. It's great. Keep on doing this, darling, all right?
Speaker 1 (54:16):
I shall I'll try.
Speaker 3 (54:18):
I want to have dinner. Let's have dinner, all right,
that would be great.
Speaker 1 (54:22):
You call me anytime, you can. I'll let you, baby,
I'll let you, baby. Oh okay, no, no, no,
you would love my kids.
Speaker 3 (54:29):
They're hysterical. I can't wait to meet them.
Speaker 2 (54:31):
One thing that's interesting about my kids is, you know,
all my friends have various they and thems and different genders.
Speaker 1 (54:37):
Which I think is great. Like I don't.
Speaker 2 (54:39):
Sometimes I'm like, what do you care what people want
to call them? If you want to call yourself Shirley,
let them call themselves Shirley. Although that's not the name I
would choose. But I have the most cisgendered children in America.
Speaker 3 (54:50):
Yeah.
Speaker 1 (54:50):
I had an argument with a right wing person, JD. Vance.
He was like, liberals don't believe in the future. I
was like, let me explain something to you. I'm growing
all the straight people. It's you people that are growing
all the gay people. I literally.
Speaker 2 (55:03):
Am like, you're welcome. And by the way, I have
double the amount of children he does. Why can't he
keep up?
Speaker 1 (55:08):
That's all I.
Speaker 3 (55:08):
Nothing genders like transgression, like religion.
Speaker 1 (55:13):
Yes, right, exactly. So I'm always like, I am serving
the society.
Speaker 3 (55:16):
You're doing your part.
Speaker 2 (55:18):
I'm doing my part to grow great men and great
women for the future. You need to keep up, all
you right wing people, because you're very slow.
Speaker 3 (55:26):
No, darling, hopefully they will be creating fabulous, like, transgender
kids. That's correct. Which is weirdly always the case, is it
not? Always, always. Incredibly, incredibly welcome.
Speaker 1 (55:35):
It's my favorite thing ever. Anyway, thank you, you can come anytime.
Speaker 3 (55:39):
All right, let's do it. Come to New York and
give me a call.
Speaker 1 (55:42):
I'll have my son cook for us. He's an amazing cook.
All right, all right, thanks Isaac.
Speaker 3 (55:46):
Well, I am extremely surprised right now after talking to
Kara Swisher in such depth. I was not expecting what
I got, which was a great, big inspiration towards the end.
(56:09):
I mean, so much information, so much kind of insight
I was expecting, but I wasn't expecting that story at
the end where she was telling that woman to order
the duck. And look, at this point, I want a
T shirt that says that. I want a T shirt
that says order the duck. I want it to be
(56:30):
in like gold glitter, and I want it to be
like ironically oversized, so I feel really good when I
wear it. Order the Duck. Darlings, if you enjoyed this episode,
do me a favor and tell someone, Tell a friend,
tell your mother, tell your cousin, tell everyone you know. Okay,
(56:51):
and be sure to rate the show. I love rating stuff.
Go on and rate and review the show on Apple
Podcasts so more people can hear it. It makes such
a gigantic difference and like it takes a second, so
go on and do it. And if you want more
fun content videos and posts of all kinds, follow the
(57:12):
show on Instagram and TikTok at Hello Isaac Podcast. And
by the way, check me out on Instagram and TikTok at
I Am Isaac Mizrahi. This is Isaac Mizrahi, thank you,
I love you, and I never thought I'd say this,
but goodbye, Isaac. Hello Isaac is produced by Imagine Audio
(57:37):
Awfully Nice, and I AM Entertainment for iHeartMedia. The series
is hosted by me, Isaac Mizrahi. Hello Isaac is produced
by Robin Gelfenbein. The senior producers are Jesse Burton and
John Assanti. It is executive produced by Ron Howard, Brian Grazer,
Carl Welker, and Nathan Kloke at Imagine Audio. Production management
(58:00):
from Katie Hodges. Sound design and mixing by Cedric Wilson.
Original music composed by Ben Wilson. A special thanks to
Neil Phelps and Sarah Katmak at I AM Entertainment.