Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Now, really... Dan Dan, Dan Dan. I love our theme song.
Can I just say that I love the song you're ruining?
You're singing... Wow, I may be ruining it, but I
love it. So hello and welcome, everybody, to another episode
of Really No Really. I am internationally beloved comedy superstar
(00:26):
and television star, hope for the ladies, hope for the laughs,
Jason Alexander. How dare you. With me today
as my partner and friend, or my friend and partner,
or maybe just my partner, Peter Tilden. Say hello, Peter.
Hi. Good to see... That was exciting.
That was a real greeting. Can I just tell the
(00:47):
audience that I'm looking at you, I'm looking at you
on a camera, and you basically look like you took
beets, ate a bowl of beets, and rubbed them
onto your face. I think you have a hundred-and-six-degree
fever today. You look terrible, and I
mean that in a good way. How do you feel?
I feel like if you don't start this and move
it on, because I feel terrible, I'm gonna get
(01:07):
in my car, as sick as I feel, drive to
your house, and strangle you with my bare red hands.
And that's why you're my partner. And maybe I don't know
what it is, but you don't look good. So you
know what, we'll fake it. And that is a segue
to what we're talking about today. And I'm going to
tell you, carte blanche, our topic today freaks me the
(01:29):
hell out, and I think it's only going to get worse.
I don't think I'm going to feel better about this
by the end of the episode, because today we are
talking about the new world and technology of deep fakes. Now,
the first time I saw this, I don't know about
you, Peter. The first time I saw it, there was
a Tom Cruise video online. When I saw it, I said, really? No,
really? That stuff... So it turns out, of course, it's
(01:51):
not Tom Cruise. And then the one that kind of
made me crazy, and you sent it to me, and
I hadn't known this was out there. You know the
scene in Pulp Fiction where John Travolta and Samuel Jackson,
you know, they go into the house and they're gonna
blow these guys away, and one guy's hiding out in
the bathroom and comes out and shoots at them and
misses everything. And that's why Samuel Jackson thinks there's been
(02:11):
a miracle. They made the actor in the bathroom
Jerry Seinfeld, and they did a soundtrack as if it's
an episode of Seinfeld, with that bum-ba-dum-bum
bass riff. And it's like, it is,
it's Jerry Seinfeld, and I'm going, what
the hell is happening? What is going on? So it
freaks me out, because immediately, immediately, I go to: oh,
(02:35):
who's playing around with my face? What are they making
me do? What are they making me say? What? Because,
you know, it's not like
you have to do a deep, deep dive to find me
doing ridiculous things in front of a camera. And
by the way, if they did it at a
company for their sales meeting, or a
kid did it and put it out there on YouTube,
what are you gonna do? You're
(02:57):
gonna go after him? But meanwhile it's out there,
and it could get a million views. And the technology is
becoming more and more available and is getting better and better.
Can they just start making anyone do anything? Okay? Yeah, yes, yes, Jase,
they can. And then they'll take the fake little
Jason and put him in a fake world like the
metaverse that Mark Zuckerberg is trying to create. He's
struggling with it, but it'll happen, if not from him then from somebody. And
(03:19):
because those places sell real estate, and they have comedy
clubs and they have strip clubs. Say what now? Say what? Yeah,
we'll get to it. And how do you make it
rain in the metaverse? I don't know that. Well, we'll
find out later. But Jason, let's answer all
of your questions, especially the big really? no, really: can
anybody make a deep fake now? Does the technology exist
(03:39):
where you can make a deep fake at home of
somebody, and it's undetectable? This guy is the top guy
when it comes to deep fakes. He's considered the father
of digital forensics. Jason, let's meet him and talk to him. Now,
let's welcome to Really No Really Professor Hany Farid,
who is the dean of the UC Berkeley
School of Information and, as I said, considered the father
(04:01):
of digital forensics. Who better to ask about deep fakes
impacting us now and in the future, when we're trying
to figure out what's truth and what's reality? So go
for it, Jay. So we're here with... what is your, what's
your favorite title, Hany? Is it Professor? Is it
Sir? Captain? One of the great things about
being an academic is you hold many titles. Professor is
(04:22):
just fine, Hany is just fine. You know, one
of the great things about being an
academic is we get to educate the next generation of
young minds, which is always inspiring, being on campus with
all these very bright kids, but also really pushing the
forefront, in my case, of where technology is intersecting and
sometimes colliding with societies and with democracies. And that's
(04:47):
really exciting, being on a campus and trying to think
through some of these issues without the corporate pressures of
profits and so on and so forth. You know what,
all I asked him is what he
wanted to be called, and already we're in over my head. Dude,
he's a professor, he's talking about academics, and I just
wanted to know: is it Hany? Is it Doctor Farid? Well,
(05:08):
how do you pronounce your last name? Hany
Farid, and Hany is fine. All right, that's all
I wanted to know. Don't get ahead of me, Hany,
you make me look stupid. Based on that lead-in, as
a professor, treat this as a remedial class that
you're teaching. Yeah, somehow we got in, somehow we
(05:29):
registered for the class, but we're slow. I'm just a theater major,
so really slow it down for me. So, just for
remedial purposes for the audience, would you, in your own
way, tell us: what is a deep fake? What are
we talking about? So what you should understand about deep
fakes is that it's an umbrella term for many, many
(05:50):
different things. At its core, what it's getting at
is automatically synthesized media: it can be audio, can be images,
or can be video. And by automatically synthesized, I mean
think about historically how we have manipulated audio, images and videos.
You make a recording, you cut and you splice, you
(06:11):
add words, you subtract words; with images, you put them into Photoshop,
manipulate an image, take one person's head off, put another
person's head on. And essentially what deep fakes have done
is automate that process using advances in machine learning and
artificial intelligence. And the reason why that's interesting and important
is because when you automate something, you democratize access to
(06:34):
what has historically been state-sponsored actors and Hollywood studios who
could do very sophisticated editing, and you've now put it
in the hands of the everyperson, who can alter audio,
alter images, alter video, and of course now distribute that
to the world instantaneously through social media. So think of
deep fakes as just an automatic way of manipulating media.
(06:59):
So immediately what I think is: this isn't good. This
is not a good thing. Is it a good thing?
I mean, let's set aside its nefarious uses for
a moment, but is it something that people
really need? Does it have a real purpose
that's meaningful? Yeah, that's the right question to ask. So
(07:21):
the people developing these technologies are not, in fact nefarious.
Most of them are on campuses like where I am,
around the world, and they have been working on these
technologies in fact for decades in various forms in the
fields of computer graphics and then the fields of computer vision,
with the primary purpose of Hollywood studios and special effects.
(07:41):
That's mostly where this technology is coming from: I
want to film one performer and I want to make
them look younger or older. So instead of doing the
makeup and the special effects, you can do this all
in post with some deep fake technology. Or, I want
to swap out one face for another face to make
it a different person. Imagine customized movies or customized TV
(08:02):
shows where you can mix and match your personalities that
you have attached to each of the characters. There are
some technologies that are primarily being developed for nefarious purposes,
to commit fraud, to commit crimes. That's not this. It's
just that once you peel away those Hollywood special effect applications,
what's left is some pretty scary things. And I assume
(08:25):
we're gonna be talking a little bit about that. A little
film trivia from my career: if you want to see
a great example of it, it's very quick. It's in
the movie I did, and I know this
is going to create a rush to get it,
The Adventures of Rocky and Bullwinkle, where I played Boris
Badenov. And there's a scene where I am perched
on a water tower, and they never told me I
(08:48):
was going to be perched on a water tower. And
I have a tremendous fear of heights, and I'm now
eighty feet off the ground, shackled to the water tower.
I can't fall, but the stunt coordinator is about to
leave me there so he can get a shot, and he
says to me, Hey, you okay? And I went, no,
I'm not. I'm truly not okay. And if you walk away,
(09:08):
there's going to be an incident. There's going to be
at least wet pants, if nothing else. So this is
not going to work. And I actually couldn't do it.
I guess if I didn't have to act, I
probably would have sat there and screamed like a, like a,
you know, a baby. But I couldn't act,
so my stunt double went up and they had to
put my face in the film. They had to put
(09:30):
my face on my stunt double, so I was deep
faked onto another guy. Yeah. Hany, let me ask you this.
I started with the premise of our really? no, really?:
I can't believe where we are, that the deep fakes are
so good we're possibly at the point of no return.
And then you just made me feel better, because if
they couldn't do De Niro in The Irishman where it looked real,
are we still behind the curve? You look at De Niro
(09:52):
and you go, what's wrong with his face? It was
not like, boy, a young De Niro, how'd they
do that? And that was a Hollywood studio. They didn't
do a very good job. So here's what I can
tell you. When you're thinking about these technologies, you always
have to look at the rate of change:
how are things evolving, and how quickly are they evolving?
And I can tell you, in the twenty-some
(10:14):
odd years that I've been working in this space, I've
never seen a technology develop so quickly. I mean, really
every three to four months you see rapid advances in
the quality and the sophistication. I think there was a
seventeen-year-old in Rhode Island who ran a candidate.
Right, there was a deep fake, and he ran the
candidate and it got registered. So now what happened? Really, the
(10:37):
election is coming close. Hold on, let's tell
that story. This is a great story. The
kid is in Florida. He's a high school student. Every
story like this is from Florida. So this is a
high school kid, bored on his Christmas break, who creates a
fake profile on LinkedIn of a candidate for US Congress
(10:58):
in Rhode Island, gets that person the blue check mark
on Twitter, so he's registered, you know, authenticated, and gets
the Board of Elections approving him as a candidate. The
whole thing is fiction. Anybody out there would be out
of their mind not to hire this kid. I mean,
this kid is awesome. But here's the real threat of
deep fakes: when you enter this world where any audio recording,
(11:21):
any image, any video recording can be faked,
where nothing has to be real... And not only is
this technology advancing too quickly for my taste, seventeen-year-old
kids are advancing too quickly for my taste. A seventeen-year-old
kid on a school break decides to create
a deep fake candidate for a congressional seat in Rhode Island.
(11:42):
I was setting bonfires in the backyard with a magnifying glass.
That was my big thrill. My mother, by the way,
was still clipping my gloves to my jacket so I
wouldn't lose them in the snow. Right. Yeah, seventeen-year-olds
creating congressional candidates, for God's sake. And by the way,
the kid said that he was bored during break
and did it in about twenty minutes.
(12:03):
He did it because he wanted to see if
they'd validate it, because we don't want phonies. Since you're the
guy when it comes to deep fakes, does anybody ever come to
you for help to determine whether something is a deep fake?
On a weekly basis, I get an email from somebody,
many of them are politicians around the world, who have
typically something around a sex tape, and they're saying it's
(12:24):
not me, it's a deep fake, prove it. And I'm like,
I'm pretty sure it's probably you. By the way, I
love that you said pretty. Typically a sex tape. What
you're saying is they're all sex tapes, typically. You mean like,
Professor Farid, you know, because somebody made him say boopy,
you know, here's a kiss, a guy wrestling a grizzly.
(12:47):
This could end his career. I don't think so. I
love how you said that. Relatively. Most. Relatively most, yeah, wow.
But this is, so, this is the so-called liar's
dividend: when the world can be manipulated, well, then
everybody can just decide on their own set of facts.
When I was growing up, I always
(13:08):
got smacked. My parents could always tell when I lied to them.
I only wish I could have used the liar's
dividend. Mom, liar's dividend! It wasn't me! Did your mother
do the thing? Look, look at me in the face,
look me in the face, especially grab my face. She
would make me look. Look, grab my face. Professor, I
gotta ask you: even when people find out they've been
(13:28):
deep faked, it's stunning. And tell Jason, to me, it's
not necessarily good news. Right. Many, many years ago,
when Kerry was running for president, before deep fakes were
even on our radar screen, we were seeing Photoshop manipulation.
Somebody had created a fake image of Kerry standing next
to Jane Fonda at an anti-war rally, and it was
(13:49):
a fake image. It was probably one of the first
widely distributed fake images on the Internet. And NPR was
interviewing somebody after the election, and this guy said:
I couldn't vote for John Kerry.
Why not? I couldn't get that picture of him and
Jane Fonda out of my head. The interviewer said, you
know the photo is fake? And he said, I know,
(14:10):
but I couldn't get it out of my head. And
that tells you something about the power of visual imagery:
even if you can debunk it,
it's very hard for people to undo those things. And
by the way, there's evidence in the
academic literature that when you try to undo a fake, it
actually makes it worse. It's called the boomerang effect. So
(14:31):
when you try to correct the record, people remember that
there's a controversy, but they don't remember the sign of
the controversy, and it ends up reinforcing the false narrative
in the first place. So setting the record straight online
is virtually impossible. Whoa. Wow. Hany, thank you so much.
(14:53):
Oh my god, this is horrifying. I want to be
light about this, but this is extremely concerning. It
is concerning. So okay, I'll make it lighter, I'll
take you to a lighter place. Now, I got a
quiz for you. So here are things, and you won't
know what's true or false. Here are things that you've
thought were true your whole life that may not be true,
that will rock your world. You ready? Okay, but some
of them are true is what you're telling me. They
(15:14):
can be true, they can be false. Some of them
may be true. I'll tell you, I want a leg up. Oh,
you're a really good listener. You know what I like
about you? You're a listener. You're one of the listeners. And
my mother would say, here we go. Shaved hair
grows back thicker, true or false? You've heard that forever.
Let me tell you, first of all, you just
(15:35):
stomp on my heart going into the quiz. It is false.
You can shave until the cows
come home. And by the way, that's false too: cows don't
come home. But no, it does not make it thicker. All right:
the Great Wall of China is visible from space. Everything's visible
from space. They can see the head
of a pin with these satellites. No, but not
without magnification. I'm saying, you could see
(15:58):
me if I was standing on the International Space Station?
Would you be able to see
the Great Wall of China? I would, but I have
very good eyesight. Tell me I'm wrong. Go ahead, tell
me I'm wrong. You're wrong, you can't see it. No, you're wrong.
Then take me up to the International Space Station and
we'll look out the window, and you'll see who's right.
Deep fake me there. There you go: swimming after eating will
(16:21):
give you major cramps. It depends what you've eaten, according
to Billy Crystal. Billy Crystal said his aunt, or somebody,
his uncle, used to be the arbiter
of, you know, based on what you ate, how long
you had to wait. What did you have? I had a tuna
fish sandwich. Twenty minutes. What did
you have? Spaghetti and meatballs. One hour. What did you
have? Cheesecake. Tomorrow. Yeah, you're never swimming again. And by
(16:43):
the way, think about this. My mother used to do that.
My family would do that. You can't go in right
away because you'll get cramps. You ever
look up death by cramps in a swimming pool?
First of all, there are eight thousand people standing next to you.
It's three feet deep. How are you going down?
And what's happening, you're just lying there and nobody's saying anything?
You're cramping up so bad? Yeah, I mean, old people
(17:04):
would be dropping dead in Florida every twenty minutes
because of the cramps. It's like, wow, eighty people
in the pool and I didn't see Sylvester lying next
to me for an hour. It's impossible. It's impossible.
All right: a penny dropped from the top of
the Empire State Building will kill you. Yeah?
(17:25):
I think the terminal velocity is pretty high. Yeah, it could.
False: it can't penetrate concrete or asphalt. It won't even cause serious
damage to a person, and even at the speed of sound
it still wouldn't damage flesh. It would sting a little. How
about that? That is news. That's a really? no, really. I
did not know. There you go, because you've
heard it your entire life. Yeah: hair and fingernails
continue to grow after you're dead. False. In fact, the skin recedes.
(17:47):
And that's why it looks like they're growing. Oh, yes. All right,
true or false: deep fakes are interviewing for remote jobs.
Deep fakes are interviewing for... Could the technology be that
good already? I would say there's probably truth in that,
in that perhaps they are not using video deep fakes
(18:10):
but audio deep fakes, actually
interviewing for jobs. And you know what they're doing?
They're trying to get okayed and be hired so they
can get into IT companies, etc., and have access and passwords, etcetera.
So the FBI is saying, be careful of deep fake interviews.
How about that? Let me ask you a question. When
I turn on the beautify filter on my Zoom, does
(18:30):
that qualify as a deep fake? Because I don't look
that good. I was gonna say, if they've seen the
real you... Oh yeah, oh yeah. I'm the Cabinet
of Doctor Caligari. I guess you've got an H in
the middle of your forehead; it looks like a third eye.
Of course, absolutely. Oh look, it's producer David Guggenheim, aka Googleheim,
with either a clarification, a correction, or an explanation. Googleheim:
(18:54):
Pronounced "Caligari." The Cabinet of Doctor Caligari refers to a
silent horror film from nineteen twenty, considered to be a
classic of expressionist cinema in post-World War One Germany.
The theme, which is about the abuse of authority, features an
insane hypnotist who uses a somnambulist, or sleepwalker, for murder.
(19:16):
Spoiler alert: all of the flashbacks are actually delusions of
inmates of the asylum, where Caligari is the director. Much
thanks, Googleheim. Now go away. So Jason, we're hearing
about Bruce Willis giving a deep fake company the rights
(19:37):
to his image, which, by the way, wasn't true.
Not true. James Earl Jones
signing away his voice, the rights to his voiceover?
Not true. He just gave the rights to the Darth
Vader voice to Lucasfilm, so when he's gone, he's ninety-one,
they can continue to do Darth Vader. You know,
I'll tell you where I first became
aware of this area. Peter, you may remember this. There was
(19:58):
an ad, I think for a vacuum cleaner, that repurposed
a Fred Astaire movie where he's dancing on the ceiling
of the room and all that stuff, and put him
in the commercial dancing with this vacuum cleaner. And
the first thing that came into my head was: would
he have acquiesced to this ad? And does he profit
(20:18):
from his image being used in this? Yeah, he doesn't
have to acquiesce because he's dead, just like you're going
to be one day, and your grandchildren are gonna run through
the money. And if we may, will you take a
short trip with me into the not-so-distant
future. This is a time when all of the
grandchildren of all the stars of the show have run
through their money, so they're able to do that Seinfeld reunion.
And why not? Your grandkids have already sold your image
(20:40):
to check cashing, bail bonds, failed crypto, hair growth, laxatives,
standing bathtubs, snoring cures, and copper bracelets. This
is an easy sell, and for a lot of money.
And in this episode, in the first episode of the
new Seinfeld, because the heirs have folded, you're sitting
(21:03):
in Moe's, entirely encased in a leather S&M outfit,
talking to the others. Or maybe you're a sure goo.
It actually doesn't matter, because your heirs don't care if
they got a check, which is why there's also a movie,
a Saturday morning cartoon show, a ride at Universal. You
maybe should say something here. Well, I have so many problems
(21:24):
with your premise. First of all, you're a man of research.
You're a man that really does a deep dive on things.
It's Monk's Cafe. Monk's, Monk's. Moe's, I think Moe's is
the Simpsons. It's Monk's Cafe, that's number one. Number two...
number two, Jay... there's no number two, forget number two. Your
(21:45):
grandkids are selling you, that's a moot point. They're selling you
because it's easy and they don't have to actually... Yes,
but bigger than that, you know what burns my buns,
honestly? That I had to buy a fake Seinfeld jingle.
They can make a deep fake Jerry, no repercussions. That's
fine, and maybe make him do horrible things. But
heaven forbid I use... they are coming, they're coming
after me, right? Why can't you use the Seinfeld theme?
(22:07):
You're absolutely right. If we used the actual
Seinfeld theme, we're gonna pay some money. But Fred
Astaire probably danced with the vacuum cleaner and nobody made
a dime. Oh look, Googleheim's back. Jay, somebody's gonna address
the Fred Astaire no-money thing. Yes, Googleheim, don't let me down.
Fred Astaire, born Frederick Austerlitz, was an actor, singer and choreographer,
(22:27):
considered by many to be the best dancer ever to
appear in film. Upon his death at age eighty eight,
he left his wife, Robyn, forty-six years his junior
and a retired jockey with over two hundred and forty-seven
wins to her name. The guardianship of his image
and the amount paid for the Dirt Devil ads was
(22:48):
never revealed. Other stars used in necro-marketing, among others,
are John Wayne for Coors beer, Bruce Lee for Johnnie Walker,
and Kurt Cobain for Doc Martens. So, Jay, I
get that you're really worried about the deep fakes,
(23:09):
probably because they're going to do you and not me.
And I get the issue too: it's the blurring of
reality, what's real, what's fake, what's truth. But the
bigger issue for me would be a fake world,
an entire fake world that you're immersed in, with an
Oculus headset, for hours a day, that someone else has
created just for their monetization. And yet we're not in
reality, or is that our reality? And how close are
(23:30):
they to making that something so attractive that we're going to
spend time there, and maybe more time than we do here?
So that's my bigger concern: the fake worlds that
people are trying to create, you know, like the metaverse. Well,
I know Doctor Strange, that's the multiverse. The multiverse. I
get the metaverse and the multiverse confused. Yeah, it
gives me a headache. What is the difference? Do
(23:50):
you understand the difference between them? Kind of, but
we'll talk about that in a moment. No, I don't.
So we're going to talk with this writer, like I said,
who immersed herself in the metaverse. Joanna Stern happens to
be a journalist at the Wall Street Journal. She's amazing.
She specializes in tech. She was a reviewer and editor
at The Verge way, way back, so she knows tech.
(24:11):
She does amazing videos, hilarious. She takes phones and puts
them underwater. She snorkels underwater.
She interviews the titans of the industry in an elevator.
Wait a minute, wait a minute, back up a step. What
do you mean she takes... I haven't seen this video.
She takes the phone and takes it underwater? If someone says
it's waterproof, Joanna will test it underwater. You
(24:34):
made it sound like she's, you know, holding transatlantic meetings
at fifteen feet. That is the video. The video is great.
And her elevator interviews: she started doing this elevator stuff
because, I think, she was in an
elevator with a tech titan one day and had a
certain amount of time to ask him questions. So the
Wall Street Journal started doing interviews in elevators,
which is very funny, and I know this is going
(24:55):
to be triggering: she's also an Emmy Award winner for
her documentary. Oh, for God's sake, they give
them out like Chiclets. I mean, everybody has an Emmy.
Do you have an Emmy? Sure you do. Don't even
lie to me. Yeah, calm down. It was an honor
(25:16):
to just be nominated. Yeah, yeah, yeah, I'm vibrating with honor.
So without further ado... No, wait, she won an Emmy. What
did she win an Emmy for? For her documentary on
death and technology. Jesus, God, clearly I was going down
the wrong path. I was trying
(25:38):
to make people chuckle. If I'd known to go death
and, you know, death and technology, I'd be swimming in metal.
You know what would be really sad? I just flashed on,
down the road, that scenario that I proposed
of them doing a Seinfeld with a deep fake Jason.
Yeah. Anyhow, let's go to Joanna.
(26:04):
It's a fake Emmy on top of everything else, on
top of that. Oh boy, get over it and don't
be bitter. Say hi to Joanna, will you? Hey, Joanna,
it's Jason. Nice, thanks for being here. And so please,
I'm a moron, please explain the metaverse in the simplest
way possible to someone who has never been there, has
(26:26):
no experience or frame of reference for comparison, i.e., me.
The metaverse is a virtual world where you, or a
version of you, which is likely an avatar, we'll talk
about that later, is living in this virtual space. And
you're doing everyday things in this virtual space. So you're working,
(26:47):
you're hanging out, you're eating, we can talk about that, it's
a little bit more complicated. But you are doing things
in this virtual space, which consists of all different types
of spaces. Okay. And by the way, Jason, there's nightlife there. Supposedly
there's nightclubs, there's comedy clubs, there's all kinds of stuff there.
So you're absolutely right. And I think it was
(27:09):
Bill Gates who said that in two or three years, the
majority of meetings are going to happen in the metaverse.
And Joanna, what do you think? Is that
going to happen? Okay, so I think we
have to ask ourselves, again, back to: what is the metaverse?
And how I answered it is, it's this vision that really
(27:29):
has been popularized now by Mark Zuckerberg, who is running
a company called Meta, formerly known as Facebook. And in
the metaverse, in this vision, we put on our virtual
reality headsets or augmented reality headsets, and we are an avatar.
So we have customized that cartoonish looking thing, and we
go to various places. We go to a comedy club,
(27:51):
we go to the office to meet with people, we
go to a movie, we go to play games, crazy
games right where we can swat at things and everything
feels really real to us. Metaverse meetings is the idea
that we put on these headsets and we meet
with other avatars, and this could get really super cool
(28:12):
in the future: holograms. You put on your headset, and
sitting across from you at your desk, in your real office...
Right now, Peter, you're in your office. You put
on your headset and Jason pops up as a hologram.
He's digital. It really looks like him, but you
know it's not really him. And so that sounds really cool.
(28:32):
But the technology is not there yet at all. I mean,
if we were doing this meeting right now in the metaverse,
I would say that it would probably take you, no offense,
probably forty-five minutes to get there, like just to
sign into things and get the headset on and get
the accounts ready. I think you're being kind;
with Jason it may take much longer. Look, I
(28:53):
was gonna say, it takes me forty-five minutes now,
and I need help from the A/V crew. So,
I gotta know, what was your experience, both
physically and emotionally, of spending twenty-four hours in the metaverse?
So when you try it, and I can't wait for
you to try it, you're gonna be kind of blown
(29:15):
away at how real it can feel. And I mean
that in the sense that everything looks like a cartoon,
so that doesn't feel real, but the presence of people
in your space feels real. So right now we're talking remotely, right,
I don't feel like you're in my house. I don't
feel like you're sitting across from me. And so we
have different body language, and we are communicating differently. When
(29:38):
you put this on, you have these physical, really virtual
objects that are in your space, right, because this is
a three-D environment, you're wearing that headset, so you
kind of feel like somebody's there. Like, you remember the
Seinfeld episode with the close talker? You don't want someone
being so close to you, and you feel that when
you're avatars. You're like, whoa. You know, you're an avatar,
(29:59):
and avatars will come in a little too close
to me, invading my space. And I think one
of the funniest things about the twenty-four hours in
the metaverse piece was I went to
a comedy club. Okay, there's an app called Altspace,
and people can build different spaces. So I went to
a comedy club in Altspace. And, I don't know,
my avatar, I've customized it. I got nice brown hair.
(30:20):
I put on a shirt in the app. I don't know,
it's like a red shirt. I don't
think I look like much of anything. But I'm
standing in the front row, I guess, of this virtual club,
and the comedian on stage starts to kind of hit
on me, right, and say, like, whoa, I'm trying
to impress the lady in the front. And I'm thinking, wait,
(30:42):
that's me, and I start feeling nervous,
like if you were at a real show. And so
this is what I'm talking about: it can feel
real. As weird as it is, this world of cartoons
and cartoon stages and avatars, you kind of start feeling like, wait,
am I really in this space? Isn't it amazing?
(31:04):
the great part of that you just said to me
is that even in the metaverse, people can be close
to you and annoying you. That's what I love, annoyance
in the middle and with my luck, the avatar in
front of me has bad breath. By the way, we
should figure out how to do smelling. That should be
what we should do. So, Joanna.
You said that the best part of being in the
(31:25):
metaverse was working out. Really really, I think working out
is going to be a huge driving point of virtual reality,
and that is is that people get bored when they're
working out at home. You're staring at your cluttered space.
You've got your home gym, you've put on the Peloton app,
but you're kind of stuck in this space,
(31:47):
right And maybe this was a little bit of an
outgrowth of COVID, But why not do a workout class
with an instructor on Machu Picchu, which is what you
can do in VR right now? And it's actually great. Wow,
somehow I feel like my avatar, no matter how hard
my avatar works out, is still going to have love
handles and man boobs. But let me ask you, does
it really when you're doing it? Do you really feel
(32:09):
like you are on Machu Picchu? Visually, yes. Like, you know,
there are certain things, like my dog comes and, like,
licks my leg, and I'm like, oh yeah, right, not
in Machu Picchu, unless I brought the dog with me.
Aside from Machu Picchu, I've got to know: after spending
twenty-four hours there, is there any long-lasting impact
of virtual reality that you still carry with you over
the next couple days. After that, I had some just
(32:31):
sort of eye and headache issues because I
had been staring at that screen or in the headset
for so long. So there there were some short term effects.
And I do not suggest that anyone do this at home.
Or at your local hotel. I do not suggest it.
I totally get that because you know on the a
(32:53):
couple of times that I've done virtual reality games, after
about twenty minutes in the headset, I'm a little nauseous.
You know it. It's a false depth of field. It's
just there's something very jarring to your system about it.
No doubt we are going to evolve into people
that handle it perfectly, but it is disruptive
(33:15):
to at least my system. So what I say is,
it's hard enough to get people to the gym to
work out. Now you want me to buy a headset.
You want me to pay to be in the metaverse.
You want me, and I'm going to match you, Peter:
there's no way. And the thing is, on my head,
I can't wear a hat when I work out because
I sweat like a pig. I'm going to work out
with an Oculus headset in the gym, jumping up and
(33:36):
down like I'm with a close talker in the gym?
I don't think, I don't think people are signing on
enough right away. Right. Well, let me ask a question,
what if the workout and I'm serious about this because
I know they have it. I mean, even the Wii
has it, where you're playing, again, you're playing ping pong,
you're playing tennis. If that were your workout or like
a boxing workout. Right at this point in my life,
(33:58):
I'm not going to a boxing gym and hitting a
bag and sparring, right. But if I'm if I'm boxing
with an avatar and there's no impact and I can't
get hit, would I do it? Would it be intriguing?
I don't know. I think it'll work just like joining
a gym: you put the money down, you do it
for five days, you tell people how great it is. The
second week, you do it for two days, and you're
(34:19):
a little bit busy. By the fifth week, you're going,
anybody see my Oculus headset? Marco? Headset?
So yeah, yeah, I'm eating an avatar pizza. Before we
let you go, I know she has a question
for you, Jason. Oh, I'm sorry, this is a George question,
but it's really a Jason-adjacent question. How big is your wallet?
And have you put all of these things into your phone?
(34:42):
I'm assuming you have an iPhone. Do you use the
Apple wallet? And is your Apple wallet as big now
as George's wallet. I have never had a George wallet.
And for those of you who are not necessarily
Seinfeld aficionados, George had a wallet that was comically overstuffed
(35:05):
and to the point where in trying to close it
one time, it basically exploded. It spontaneously combusted. I am
of the belief that if it's electronically stored, it's electronically
accessed by somebody other than me. So I don't, I
don't tend to put that kind of information into my
phone ever. Well, okay, so before we give
(35:28):
our conclusion on what you think, I think Hany Farid,
who we talked to earlier, who's the professor,
we should listen to his take. I thought before we
do our conclusions, it'd be good to hear an expert,
a deepfake expert's take, sure, on the metaverse. And remember,
Facebook is now Meta because they're heavily invested in this thing.
(35:50):
Doctor, Professor, sir, Hany, take it away. Here's my, on
a bad day, here's my impression of Mark Zuckerberg. Let's
destroy society and democracy and then get everybody to strap
a virtual reality headset to their face. Eighteen hours a
day and monetize every aspect of human behavior. Well, when
(36:10):
you say it that way, it doesn't sound so good. Jeez, he's all in.
So, Jay, aside from a fake universe and the deepfakes,
you've grown up, we've grown up with fakes all our lives,
like fake news. Right. Fake news has been around since
(36:32):
the beginning of time. My mom bought counterfeit stuff, fake items.
There's fake art; great documentary about that called Made You
Look: A True Story. Pharmaceutical fraud like Theranos. And then
you got dating bots, so you don't know if you're
on a dating site what you're talking to. In my world,
when I did radio, we used to do radio stunts
which were faking... hold on, hold on. Producer Laurie Crime
(36:55):
is waving, flapping her hands around, which could mean only
one thing: we have a surprise mystery guest on
the line. Do we guess? Ready? Yeah. For me? Okay, good morning,
say hi to Jason. Good afternoon, Hello Jason, how are
you doing? Very well, thank you. I'm trying to place
the voice. So far, this is going to set the
(37:15):
world on fire. It's the former Governor Sch... my god, really
no really! Now wait, wait, wait, wait, really, that's a
good name for the show. Yes, really, no really, it
is me. Well, if it is really, no really, you, sir,
I am delighted to be chatting with you. Have you
watched Seinfeld, Governor? Yes, I have. It's the greatest show
(37:37):
ever. It is now streaming. I watch it all the time. Well,
I'm delighted, thank you, thank you. And I you know,
I had a beautiful vantage point into your backyard not
long ago. I know somebody who lives above you on
the hill, and I saw I think the goat, right,
there's a goat. Maybe, yes, we are getting livestock. I
think I need to put a glass house in there for people
(37:59):
to look at your reality. Call it the North Glass House.
I like it. So, Governor, Governor, thank you.
I'm sorry we had so much technical trouble, but thank
you for holding and being patient. No problem. I just
know that I do not have Google Chrome. Yeah, well,
listen, not to get into any of the fabulous technology, Jason,
(38:28):
Oh my god, Governor. Thank you, by the way, Jason. Yeah,
this is how easy it is to do a deep fake.
This is my buddy Phil Hayes, who is a voiceover talent.
Really. And Phil fooled really no really. So wait, so
what happened Phil? Are you there? Yeah? Yeah? How you
guys doing all right? Phil? So Phil does the best
conversational Schwarzenegger ever. And we heard that. And in two
(38:48):
thousand and three, when Gray Davis was being recalled, I
said to Phil, call and see if you can get him on the
phone and punk him. Hahaha. And Phil gets him in
the limo. Gray Davis thinks it's really Schwarzenegger calling to
avert the recall. And I said to Phil, just go
a minute or two and then reveal who you are.
They're twenty minutes in. They're planning dinner, the couple's getting together, exercising,
(39:11):
and they hang up after like thirty minutes. And I
called Phil and said, what did you do? And
he goes, I got caught up in it. I got
caught up. So I then had to call back. I
had to call Governor Gray Davis back, and he ate
me up, you know, up one side and down
the other. And I said, I ended with, so I'm
assuming I can't use this on the air. Before you go,
(39:32):
just say goodbye as Schwarzenegger, to show everybody what a great
conversational Schwarzenegger you do. Sign off and say goodbye,
and then I'll give your credits. I'd like to thank
you so much, Peter, and Jason there, for
being so, so good at this. Best show ever on
the, on the iPod. You know what, we're
using that as our close. He is, by the way,
(39:54):
he's Batman and tweeting it to Investor and Sonic the Hedgehog.
But the best is, he was in True Justice with
Steven Seagal. I got to ask him about that one day.
Unbelievable. I'm watching your face. Let me
tell you. I mean, you know, I'm right on that
tight rope of going okay, I know what our subject
matter is today, but I also know that we do
know people that have a relationship with Schwarzenegger. And I thought,
(40:16):
did you pull in something? And then I'm thinking, how
does he how does he relate to this topic? But
I have to tell you, I hedged the bet too.
Let's go with it. And I saw your face. You
were starting to go, starting to be impressed, much like
Governor Schwarzenegger fooled Gray Davis back in the day. And that is
the thing, how believable and how willing we are to
(40:37):
accept these things as possibilities. I remember... I wonder how
many times you've been deepfaked. Google him. Do you
know how many times they've done a Jason or George
Costanza deepfake? Do we know that? I don't have an
answer on that. I do. They did deepfake the
Seinfeld cast onto the opening montage of Friends. Oh come on,
so, so here's the question, Jason. Who do you think,
(40:59):
whose body did they deepfake your face onto of
the Friends cast? David, don't be stupid, I'm Jennifer Aniston.
Come on, who else would I be? No, you know
who they would make me? If they, if they
kept it gender specific, they probably made me Matt Perry.
That would be my guess, Matt Perry, or maybe Schwimmer,
but Schwimmer would be an easier Kramer. No, Matt LeBlanc,
(41:22):
I'm LeBlanc. You're LeBlanc? How you doing? Can you guess
who got onto Jennifer Aniston's face, the coveted Jennifer Aniston's face. Well,
if it's, if it's not Julia... is it Julia? Should
I go there first? No? No. She went, she went
on, on Monica. She got Courteney. And it has to
be Estelle Harris. You're, you're not far off. Estelle actually
(41:48):
made it onto Phoebe. Wayne Knight was on, uh, sure,
it was on Jennifer. Why wouldn't he be? Hello, Newman! Hey, hey,
before we get your conclusion, which I know up to
now has been very, very negative. You actually said in
the beginning of the show that you had a real
concern and that it freaks you out, and it's only
(42:10):
it's only going to get worse here. So keeping that
in mind, let's take a look at the, let's take,
let's take a look at the clip. We just took
a second, through the magic of pausing, and we all
just watched the Seinfeld characters deepfaked onto the Friends, the Friends cast, right?
(42:32):
I look so good. And look, I got so much hair.
I am so pretty as Matt LeBlanc. I, you know what,
I'm into this technology. And we're out. Good night, everybody.
That's all it took. I'm a convert. I'm a believer.
(42:55):
I couldn't believe me if I tried. It was... we
did a whole segment on the potential for disaster, how
nefarious it is. Well, I figured, much like everything
else in my life, people would use my image to
create havoc. I didn't know they were going to create
beauty and enhance me. Absolutely. I'm so glad we got
(43:21):
and I'm making you happy. Now really, no really, everybody,
we want to thank digital forensics expert Professor Hany Farid.
Hany Farid is also a senior advisor to the Counter
Extremism Project, which has a fascinating software tool that allows
(43:42):
companies to quickly find and eliminate extremist content. For more
information, go to counterextremism dot com. Also, Wall Street
Journal tech columnist and Emmy winner, really? Really? Everybody wins
an Emmy. Everybody but me. Emmy winner Joanna Stern,
on Instagram and Twitter at JoannaStern, and
of course Phil Hayes, our deepfaked Schwarzenegger. He's an actor,
(44:05):
a voiceover artist, and a stand-up comic. He's got hundreds
of TV shows and movies to his credit. You can
check him out at IMDb and of course you can
follow Really No Really on Instagram and TikTok at Really
No Really podcast. For questions, you can message us on
Instagram and Really No Really podcasts. Really No Really is
a production of iHeartRadio and Blaise Entertainment, and most of all,
(44:26):
thank you for listening. We release new episodes every Tuesday,
so follow us on the iHeartRadio app, Apple Podcasts, or
wherever you get your podcasts.