Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Brought to you by Toyota. Let's go places. Welcome to
Forward Thinking. Hey there, and welcome to Forward Thinking, the podcast that looks at the future and says, "Then I saw her face, now I'm a believer." I'm Jonathan Strickland,
(00:20):
I'm Lauren Vogelbaum, and I'm Joe McCormick. So I saw
something absolutely delightful the other day on the internet. You
shared it with us, yes, and then you thought this
could be a good blog post, and I said, to
heck with that, that could be a great podcast. That
is right, That is what you said, because you know
a good podcast when you see one, which is odd
(00:42):
because it's an auditory kind of thing that we do.
But anyway, well, that's a pun that would have worked if we'd already announced what today's podcast was about. We're about to get to it. Um. So I found it through Alexis Madrigal's Five Intriguing Things newsletter, which
if you're not subscribed to that, you should check it out.
It's excellent. Um. The link was to a project of
(01:05):
a Korean artist group that I believe is made up of two people, Shin Sung Bak and Kim Yong hun, and the project was called Cat or Human. Okay. So for
those who are listening who have not had the benefit
of seeing this particular project, how would
(01:25):
you describe what Cat or Human is? First of all, it's not horrifying. No, it wasn't like a taste test. No.
It was also not one of those animal-human hybrids that I keep hearing about from Stuff They Don't Want You to Know. No Dr. Moreau, none of that. No,
(01:48):
it is merely a collection of images from two different algorithms, marking where those two algorithms
went wrong. One of them was an algorithm that was
created to recognize human faces and say that's a human
face just by looking at a picture. Another was an
algorithm that was created to recognize cat faces by looking
(02:11):
at a picture and saying, hey, that's a cat face.
Of course, what's great is when they mess up, specifically
when the cat recognizing algorithm recognizes a human face as
a cat face, right, and when the human recognizing algorithm
recognizes a cat face as a human face. So they
collected images from both of these sets of mistakes and said,
(02:34):
according to these various algorithms, these people are cats and
these cats are people. That's exactly right. It's kind of
like a weird yearbook. Yeah, so that is what they
did, from what I can tell. You should check it out on their website. The cat people, you know, look like cat people. I agree. I mean, I
(02:54):
guess a little bit. They mostly look like people. Yeah,
they may have some features that are slightly feline. When
we say cat people, we mean people who may have
certain feline features. We don't mean people who have twenty
seven cats. No, not that thing, and also not the
thing from like Doctor Who, not those cat people either. Um, maybe slightly cat-like eyes. Yeah, we're not talking Sleepwalkers.
(03:16):
They tend to be very happy looking people with like
big grins, Cheshire people, which is funny because cats are so evil. But anyway, the selected cats I'm looking at now, the cats that were identified as human, all look really sour. They're not smiling cats.
I don't know. So Grumpy Cat might have
(03:37):
ended up being classified as human? Grumpy Cat is not one of the ones I've seen. Instead, these are usually cats that are kind of wide-eyed and looking straight on at the camera with this kind of "what?" expression. So the reason why we even bring this up, as amazing as this particular art project is, it
does have to do with the technology of the future,
(03:59):
we promise. Yeah, we wanted to talk about facial recognition technology.
What that has to do with artificial intelligence? What is
it actually doing, how is it doing that, and what
could that possibly mean to us? Not to mention some
interesting looks at the future and apparently at least one
segment that will scare the pants off me, but I
don't know what it is yet. Yeah. So we're
(04:22):
going to go on a journey of discovery, and some of us are going to be discovering right along with you. So to jump into this: facial recognition
is part of artificial intelligence. When we talk about AI, and we've said this several times on this show, AI is actually a very broad term, right? We often think of
it in terms of strong AI, some sort of computer
(04:42):
that can quote unquote think like a person. But in reality,
artificial intelligence incorporates lots of different elements of intelligence, right, right,
And one of those elements would be making sense of
the kind of data that makes sense to animals and humans but not to computers. Right. So, like visual data:
I can look at Joe and I can recognize,
(05:04):
one, that's a person. Two, I can recognize that it's a male person. Three, I can recognize that it is a person I already know, that that person's name is Joe. That he's not wearing a Batman costume right now? Right, exactly, not right now. That you're not a cat. Yeah, there are all
these things. And I can also have all these other
(05:25):
associations with Joe that, you know, factor into who he is and what he is, and all of this is very natural to me. But it's all stuff we have to teach computers. Right, right. And it's very natural to humans. I mean, you could probably
recognize Joe from across the room even if he were
turned around, due to you knowing him well enough to
know like his gait and posture and like if you
(05:47):
catch kind of a weird angle from the side of
his face, you'd probably still identify that as Joe. But a computer doesn't have that natural capacity. Yeah, if I meet Joe for the first time and he looks me right in the eyes, we shake hands, and then the next time I see him he's in profile, I'm still able to recognize him. But for computers that's not a trivial task. Right. Yeah, because
(06:09):
they're relying on a way of analyzing visual data
and breaking that down into data points. That makes sense.
So it is a big part of artificial intelligence, and
not just facial recognition, but object recognition in general. Now,
to get into what facial recognition is doing, there are
a couple of things we have to factor in. One, it
(06:29):
has to be able to recognize: what is a face?
What are the elements that make up a typical face?
And that ends up being sort of the baseline, all right.
Beyond that, when we talk about facial recognition, we're usually
referring to two different things that are common to biometrics,
whether it's facial recognition, fingerprint technology, hand scans, whatever it
may be. And that's verification and identification. And these are
(06:54):
two related but different concepts. So verification might be something
you could think of with the metaphor of a key. Yes,
verification would be something like if we needed to have
a biometric system to let us into the office, so
we might have it might be a two step verification process.
Right there, there's one step where we might use a
(07:15):
badge or a PIN or something else to identify and say, this is who I claim to be. Now, at that point we would then submit whatever biometric it's
asking for, whether it's a fingerprint scan or maybe you're
just looking into a camera and facial recognition software is
taking a look at who you are. Well, when you
created that profile for the system, when you created your PIN,
(07:37):
when you had your fingerprint scanned or whatever it may be, that's when it scanned you, took those data points, and associated them with your identity. So
what it's looking for is a match of the person
who is standing at the doorway with the profile that it already has set up
(07:58):
in the system. Right, Really, this is trying to get
a computer to do what a human would already do,
looking through the peephole of a door. So someone knocks on your door, says, hello, it's Joe, and Jonathan comes to the door and looks through the peephole and sees, that's not Joe, that's a werewolf. Then I'm denied access.
(08:18):
But at least you have some other questions you need
to answer before I will open that door. You know,
you might have had a rough night. Yeah, yeah, you might have let your beard go a little bit. Or, you know, Jonathan could also look through the peephole and go, that's Joe, he's wearing a jaunty hat, but it's definitely still Joe, and let
(08:40):
you in. I mean, depending on his feelings about jaunty hats. Right. Yeah, do you take your hat off if you come inside? That's Joe in clown makeup, but I still recognize him. I'm definitely not letting him in anyway. Yeah. But so that's the
sort of artificial intelligence angle on it. It's just trying
to recreate what happens in your brain and when you
(09:00):
look through the peephole at somebody before you let them in.
And these are the kind of systems, by the way,
that can cause problems for people, even when they
have a legitimate profile set up. I did an interview
with Dr John Padfield for How Stuff Works where we
talked about biometrics in general, and he talked about this
is this is related. It's not facial recognition, but it's related.
(09:21):
He talked about working at a job where he had
to put in a PIN and have a hand scan
before he could come into the office, and he normally
wears a class ring, and one day he left the
ring off, and so when he went to have his
hand scanned, his hand had a different readout than it
normally does because he didn't have this enormous lump from
(09:42):
a class ring on his finger, and it would not
let him into the building. He had to wait until
an actual human came and verified that he was who
he claimed to be before he was let in. Same
sort of thing can happen with facial recognition software. Okay,
but there's another thing that facial recognition software might need
to do, and that's, as you said, identification. So unlike looking through a peephole and saying, oh, okay, is
(10:04):
the person standing there who they said they were through
the door, It's more like looking at a person and saying,
who's that? Yeah, do I recognize this person at all?
So verification makes it easy for the system because it's
just looking for a match of who you claim to
be and who you are. Identification is more tricky. It's
saying who are you without any of that who you
(10:24):
claim to be" stuff. Right, right. Well, I mean, you might have a set of vague profiles set up. But yeah, so this would be something like
a system where the police might use an identification system
in order to figure out the identity of someone who
has committed a crime and try and match that against
their entire database of all the photographs that they have stored.
(10:48):
This is particularly tricky because again, you can run into
those situations where the image you get during whatever it
was that happened may not be the same sort of quality,
and condition as the image that you originally
have in your database. But we'll talk more about that.
But yeah, I was going to say that's exactly right,
because one of the things you might be thinking now
(11:09):
is identification is harder for a lot of reasons. And
one of the reasons verification is easier is you typically
when you're verifying someone's identity, can assume that they're being
cooperative with the scan, that they're looking directly into the camera,
that you've set up a lighting situation that's conducive to
measuring certain bits of their face. Exactly right. They
(11:30):
want to be verified, and they're getting in the best
possible scenario for the system to do that. With identification, you don't know. They might be willing to let you scan them,
They might not be willing, they might not be aware.
You can't depend on them trying to get in position
and make it easy for the computer or the hardware.
But all of this will make more sense once we have explained how this facial recognition technology works. So what
(11:54):
are these computer programs looking for now? Basically, what they're
doing is breaking down faces into a collection of landmarks,
right, of identity points I should say. So, depending on
the system, they break down the face in a different
way and they tend to get obviously more high resolution
(12:15):
as the technology evolves. But what they do is they
look for points like the corners of the eyes, the
width of the nose, the length of the nose, the
relationship between the nose and the mouth, the relationship between
the eyes and the mouth. All of these are little
data points, nodal points that the system can look at.
And then it's all the relationships between these points that
(12:38):
make up the identity. Right, So it's not just oh,
there's Joe, Joe has two eyes. That's not quite specific enough.
It's talking specifically about things like the distance of the eyes from one another, the distance between the inner part of one eye and the other, or the triangle that the eyes make with the nose.
(13:01):
Imagine that, but multiplied by a dozen or a hundred, and you've got all the different little points that make up the landscape, the fingerprint if you will, of Joe's face.
So when the system looks at another face, it measures those data points and compares them against what's in the database already,
(13:23):
And if there's a match that reaches a certain threshold,
then the system says that's Joe. Now, if it doesn't
reach that threshold, it may say it's someone else, or
it may say I do not know who this person is.
I don't have a record of this person, so I can't help you. And it's all
based on statistical probability, right. So we've talked about this before. IBM's Watson is the example
(13:45):
I love to use: Watson on Jeopardy would only give an answer if it met a certain threshold of certainty, and if it was below that, then Watson wouldn't guess. That was a calculated risk on the part of IBM's team, saying, we don't want it to guess incorrectly. Right.
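[Editor's note: for the technically curious, here is a minimal sketch, in Python, of the kind of landmark-and-threshold matching just described. The landmark names, the normalization, and both threshold values are invented for illustration; real systems use far more nodal points and far more sophisticated statistics.]

```python
import math

# Each face is reduced to a handful of landmark coordinates (x, y).
# These five points are hypothetical stand-ins for a real system's
# dozens of nodal points.
LANDMARKS = ["left_eye", "right_eye", "nose_tip", "mouth_left", "mouth_right"]

def signature(face):
    """Turn raw landmarks into pairwise distances, normalized by the
    inter-eye distance so the signature doesn't depend on image scale."""
    eye_span = math.dist(face["left_eye"], face["right_eye"])
    pairs = [(a, b) for i, a in enumerate(LANDMARKS) for b in LANDMARKS[i + 1:]]
    return [math.dist(face[a], face[b]) / eye_span for a, b in pairs]

def similarity(sig_a, sig_b):
    """Crude similarity score in (0, 1]; 1.0 means identical signatures."""
    err = math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))
    return 1.0 / (1.0 + err)

def verify(probe_face, enrolled_face, threshold=0.92):
    """Verification (1:1): is the person at the door who they claim to be?"""
    return similarity(signature(probe_face), signature(enrolled_face)) >= threshold

def identify(probe_face, database, threshold=0.75):
    """Identification (1:N): like Watson, abstain rather than guess badly.
    `database` maps names to precomputed signatures."""
    sig = signature(probe_face)
    best_name, best_score = None, 0.0
    for name, enrolled_sig in database.items():
        score = similarity(sig, enrolled_sig)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None  # "I can't help you"
```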
(14:06):
Some of these programs, for example, if they're not super sure who a person is, will spit out a series of guesses, like, for human identification. Yeah, yeah. In fact, usually if you're talking specifically about identification, not verification, and you're looking at a person and you're trying
(14:27):
to mine an enormous database. Let's say that there are thousands of people within this database, because in law enforcement databases there can be many thousands of people. Then you usually have a couple of different passes at this, where one pass will narrow it down to a certain group of people that fit the general scan
(14:49):
that goes on, and then they'll do a more in-depth comparison to see which ones are the most likely to be the same as the target, the one that was captured in whatever image or video you're analyzing. So they start looking at things like the depth of your eye sockets, the width of
(15:10):
your nose, your cheekbone shape and height, your jawline.
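[Editor's note: here is a sketch of the two-pass search just described, reusing the similarity() helper from the sketch above. The field names, the coarse feature, and the cutoffs are all hypothetical.]

```python
def coarse_pass(probe, database, tolerance=0.15):
    """Pass 1: cheaply prune the database, keeping only records whose
    rough features (say, face width-to-height ratio) are anywhere close."""
    return [rec for rec in database
            if abs(rec["face_ratio"] - probe["face_ratio"]) <= tolerance]

def fine_pass(probe, candidates, top_n=5):
    """Pass 2: score the survivors on the full landmark signature
    (eye-socket depth, nose width, cheekbone shape, jawline...) and
    return the most likely matches, best first, for a human to review."""
    scored = [(similarity(probe["signature"], rec["signature"]), rec["name"])
              for rec in candidates]
    scored.sort(reverse=True)
    return scored[:top_n]
```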
When I read about this kind of stuff,
it makes me think of the character creation process of
something like Skyrim, where you get into the minute details
of what your character looks like. Well, all those things are the things that facial recognition software actually
looks at to say this is the person, because it's
(15:31):
what makes you look like an individual human being. Aren't
there some cat people in Skyrim? Yes, there are. Yeah, it all comes back. The Khajiit, I believe they're called. Head shape comes into play too, like the distance
from the eyes to the ears, the distance from the
center of the nose to the ears, any facial hair
that you may or may not have, your hairline, stuff
(15:53):
like that, and that's one of the things that also
can make it tricky for verification. For example, if
you have a distinct hairstyle and then you change it,
then you may encounter a system that no longer really
recognizes you as easily and it may mean that you
have to go and update that profile with whatever you know,
however you've changed. And of course there are other things
(16:14):
that can cause some issues too, but robust systems usually
will depend upon multiple measurements. They're not looking for
just one thing. Like if you're looking at a very
simple facial recognition technology, where it's maybe something that's built
into a digital camera, and all it's supposed to do
is just detect that there is a face. Period. Yeah,
it's usually really simple stuff. It's just looking
(16:37):
for shapes that correspond to what it thinks of as faces,
which means they can sometimes detect faces where there are
no faces. I've seen this happen in nature photography, where
a person will hold up a camera at a tree
that has an interesting pattern on it and it will
detect it as a face. And you're thinking, this isn't Game of Thrones.
(17:01):
The tree does not have red eyes. This is not one of those trees that the Starks talk about all the time. So there are times
where that kind of technology can rely on a very
simplified view of what a face is. But for things
that are really about verification and identification, they usually depend
upon multiple inputs to make sure because I mean, clearly,
(17:24):
if you're talking about verifying someone's identity to give them
access to something that could be really, you know, classified information or top-secret-type stuff or just sensitive materials, then you want to be absolutely
sure that that is who they say they are. So
some of them will even go so far as to
(17:44):
use multiple cameras to get more of a three dimensional
look at a person, because we know, if you're using
a single camera with a single lens, it's a flat image.
You know, when you see a regular movie or a regular picture, a regular photograph, you don't get any real sense of depth. I mean, you can see
contours and stuff, and you can tell if something is
closer or further away, and if something has been lit
(18:06):
really well, it might look three dimensional to you and
your brain might even play a little bit
of a trick, but it's still a two D image, right,
And so if you're using a verification system or any
identification system that's reliant on a single camera, then that
makes it a little more complicated too, because it's looking
at three-dimensional objects but representing them as a two-
(18:27):
dimensional image. So that has some elements of
difficulty there. If you have multiple cameras, then you can
get around that. You can rely on multiple inputs to
figure out, you know, how sure are you that
this person is who you think they are? Okay, but
we know facial recognition is not perfect yet. So what
(18:47):
are some of the challenges that are really facing facial
recognition software today? Well, facial recognition faces many problems, Joe,
thank you for throwing it to me like that. Uh, well,
we talked about the lighting issue. That's a really good example.
So if I go to create my profile, uh for
(19:08):
the company I work for. Let's say that I'm working
for, you know, MI6. I've somehow gained clearance to work in Britain's Superspy Academy. And I don't know how I've managed to get there, but I don't really doubt that you have. I think that's pretty likely. Yes, this accent I'm putting on is actually not my real accent. But at any rate, let's say that
(19:31):
I need to go and fill out my profile, and
so they put me in a well lit room, they
take my photograph from several angles. All of that is
factored in so that I can get into Superspy Academy
whenever I need to. And then I walk up to Superspy Academy. Unfortunately, maintenance hasn't changed out a lightbulb that really needs changing. It's gotten very dim, it's
(19:51):
not giving nearly enough light, and that dim light
is now casting odd shadows on me. And that might
be enough to throw off a facial recognition system. I
mean that you're ultimately talking about detecting light that's reflecting
off of a person. And if the lighting is different enough from whatever your base photograph is, whatever your base
(20:12):
amount of data is for the system, then it may not return enough information, or at least not good enough quality information for the system to
be able to identify you. This is true whether it's
a security camera that's on the wall, whether it's something
that's actually set up to be a verification system. I mean,
it doesn't really matter what the context is. It's
(20:34):
a real problem. Yeah. How about something even simpler: the
way you're holding your head? Yeah, yeah, greater than fifteen
degrees of tilt can really throw a wrench in facial recognition.
But I've seen this in action, actually, when I've looked up pictures of faces that were supposed to be blurred. We'll get into that in a minute. But a lot of the time, a program that's supposed to detect
(20:56):
a face and blur it will miss a face that's diagonal. Right. Yeah. I mean, if you're looking
for a specific series of shapes, and that is what defines a face, and those shapes are in
an orientation that's different from what you would expect, then
you can understand why a system would overlook that and
(21:17):
not recognize it as a face. Yeah. So my suggestion is,
if you're a criminal trying to avoid the law, just
do the limbo all the time. Just lean over. My
suggestion is don't. Crime doesn't pay. That's my suggestion. But Joe, you know, we understand that you're nefarious now. Now that you've let on, I am gonna have to train my superspy techniques on you. So just
(21:39):
fair warning. But yeah, we said the same thing about single-camera systems having problems detecting depth. That means you might be able to fool such a system by holding up a photograph. Let's say that you have captured a superspy and gotten their PIN, and you go into Superspy Academy and you type in the PIN. Then you hold up
(22:00):
a picture of said spy's face to the camera. And
it's just a single lens camera. It's not able to
detect depth. It may think that that flat image is
in fact the actual person's face. Yeah, then you gain entrance. Yeah. How about this problem: your face isn't
quite as unchanging as say your fingerprint or your retina. Right,
(22:23):
you could gain weight or lose weight, which might change
certain aspects of your face. Yeah, you could grow facial
hair, or shave facial hair, or change your hairstyle. Like we
mentioned before, you might suffer an injury and you might
have some scarring, or you may end up having plastic
surgery or some other thing that alters your face in
a way that would be enough to throw it off
(22:44):
from facial recognition software being able to verify
or identify it. Right, It makes it harder. It means that, uh,
these systems have fewer variables they can work with. They
need to look for data points that aren't going to
be changing when these types of things happen to a
person's face, and that really limits what you can look at.
And I mean, even heavy eye makeup could change the shape of your face enough to throw something off. Yeah,
(23:06):
there are some elements of your face that are less likely to change unless you've had something radically done, like reconstructive surgery: the relative distance between your eyes and your nose, the curvature of your nose, and, again unless you've had some sort of plastic surgery, the shape of your jaw.
These are things that are less easy to change
without some extreme intervention. And so those are things that
(23:30):
a lot of facial recognition software packages rely heavily on.
Or it may be that it's the relationship of marks that you
have on your face, like a mole or freckle, things
like that, things that aren't likely to change a lot
unless again you go in to have them removed, right,
you could apply makeup to cover them up. That also
could be enough to fool them. So these are all
(23:51):
things that are taken into consideration by people who design these systems. Yeah. Yeah, there's a
news story that came out in August that I wanted to mention here: a fourteen-year-old case against an alleged child sex abuser and kidnapper, a terrible guy all around, came back into court when the FBI caught their suspect thanks to their facial recognition software
(24:13):
and two passport-style photos of the guy, who had
not changed his face in the past fourteen years. Um.
I mean the dude put in like a visa application
to the U.S. Embassy in Nepal, and the software
was like, hey, hey, you guys, this is totally that guy. Yeah.
So it was a really great example of how this kind of technology can work towards catching criminals,
(24:36):
which is terrific, but kind of under this one very specific circumstance in which a very good photograph was obtained of the suspect. And there are lots of ways in which it can mess up. That's right.
For all the times it works really well, there
are lots of times when facial recognition software goes wrong,
and often just goes wrong in a hilarious way. Yeah.
(24:58):
Sometimes, you know, you see where the drawbacks or the limitations of the technology are in incredibly profound ways, you might say. Like the Japanese cigarette vending machines. I remember hearing this story
when it first broke. I was listening to, I think it was probably CNET's Buzz Out Loud at the time,
(25:20):
which is no longer a show, but I was a
big fan and they covered this story when it first happened. Now,
these vending machines were famous because they had cameras built
into them that were supposed to be able to detect
what the age was of the person who came up
to the vending machine. So they're looking for specific features
things that would indicate someone who has reached a certain
(25:42):
level of maturity, and that was supposed to prevent these
machines from selling cigarettes to underage people without checking an
ID. Sure. Yeah, yeah, because I mean, we used to have these cigarette machines all over the place
in the United States. They aren't really anywhere that I
know of at this point. Yeah, yeah, you can. Yeah,
you can find them in those kind of places where
presumably someone's ID has already been checked before they've entered
(26:04):
the premises. So the idea would be, you would walk up, the camera would detect whether or not you're old enough, and then you could buy cigarettes. And what people discovered was that if you just walked up and held up a picture of a person, an adult person... Yeah, the example I saw
was a picture of Bruce Willis, like Die
(26:27):
Hard 4-era Bruce Willis. If you held up a
picture of Bruce Willis, it would totally sell you cigarettes
and probably you know, say that it was a big
fan of your work. What about like a really baby
faced adult? Probably not, because again, it was looking for specific kinds of features that would indicate adultness. They found out that you didn't have to have a
(26:47):
very large picture for this to work. You could hold up something as small as a three-inch picture, we're talking about like a seven-and-a-half-centimeter picture, of an adult. They did discover that
if you tried to hold up a picture as small
as one inch, that did not fly. But, you know, something like a three-inch picture, you could just carry that around and put that in
(27:08):
front of the camera and that was enough for you
to be able to purchase cigarettes from it. So that
shows you that this is not an infallible technology, especially
in that particular implementation. I got another funny one for you. Okay? Google Street View. Yeah. So, first of all, Google Street View did not always blur out faces. In fact, the reason why Google Street View would blur out faces
(27:31):
was due to the kind of uproar people had when
Street View first became a thing. Because, well, Google is made by engineers. Like, engineers power Google and all of
Google's projects, and engineers are really good at figuring out
how do we create this amazing experience, but not necessarily
the best at thinking of all the implications of said
(27:52):
amazing experience. So sure, like, you know, it's terrific that
we have a picture of this storefront of a doctor's office,
for example. But maybe the people who happened to be
in front of that doctor's office that day don't want
to be permanently associated with it, right, Or you don't
want your picture on the Internet walking out of the
liquor store or a strip joint, or maybe you do.
(28:13):
I don't know, but there are a lot of cases where someone would say, yeah, even if I wasn't going to said establishment and I just happened to be walking by it at the time,
Now it appears that I am giving my custom to
something that I do not approve of, and I would
much rather not be associated forever and ever with this
(28:35):
particular location. Right, But of course you can't just manually
blur all the faces. There are too many. You know, it would take ages, because you've got these fleets of Google Street View cars that are going
through different neighborhoods and documenting them. So Google created an
algorithm that would detect faces and then blur them automatically
(28:56):
to hilarious results. Well, it is, as I believe you said in the notes here, a bit overcautious, maybe. So it ends up blurring some wonderful things, like the Colonel's face on the KFC billboard. Right. There are murals where you see this gorgeous artwork, except there's this amazing blurred part of it, or sometimes multiple blurs. I've
(29:18):
seen images on the Internet where it blurred animal faces.
So they're like some ponies by the road and it
blurred the pony faces. Look, those ponies deserve just as
much privacy as you and I. Can you tell a
horse face from a human face? Hey, art project: Horse or Human? So many jokes that I'm not gonna do. Also,
I've seen it blurring statues. Yeah, you might have a
(29:41):
Buddha by the road and it's got a blurred face
like it's a criminal on America's Most Wanted. It also
does miss some faces though, which cracks me up, on the flip side of it. Like, I've seen a dude sitting in a spa whose mirror image is not blurred, and it's like, well,
thanks, Google. Well, that's his evil twin version
(30:03):
right in the mirror world. Yeah. I've also seen some
images that were at least alleged to be from Google
Street View that missed people's faces, often at an angle,
people who had their heads turned to the side, right,
So again, it is not infallible, and so, you know, it may be that Street View ends
up being one of those things where some folks at
(30:24):
Google occasionally have to go in and manually blur stuff.
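[Editor's note: if you're curious what a detect-and-blur pipeline looks like, here is a minimal sketch using OpenCV's stock Haar cascade face detector. The file names and tuning parameters are placeholders, and, true to the discussion above, a simple detector like this misses tilted faces and can fire on face-like patterns that aren't faces at all.]

```python
import cv2

# OpenCV ships with a pre-trained frontal-face Haar cascade -- a simple,
# shape-based detector much like the "is there a face at all?" technology
# described earlier.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("street_scene.jpg")            # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)    # detection runs on grayscale

# detectMultiScale finds upright, frontal faces; tilt your head much more
# than fifteen degrees (or do the limbo) and it tends to miss you.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    roi = img[y:y + h, x:x + w]
    # Heavy Gaussian blur over each detected face region.
    img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)

cv2.imwrite("street_scene_blurred.jpg", img)
```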
But the hope is that, yeah, the hope is that
the algorithm catches the vast majority. And I'm sure that
the company would rather err on the side of caution
and have some hilarious unintentional blurring going on than miss
(30:46):
one, or have to do it all by hand, which would just be ridiculous. And it does rely on users to send in feedback, like if you find something that isn't blurred but should be, or the other way around. On an unrelated note, I love
the communities that have taken Google Street View as an
opportunity to do art projects along streets that normally wouldn't
happen there. I remember there was one street where there
(31:07):
was an entire almost like a parade, like a story
that happens as you go down the street, where people
had created these tableaux all the way down, ranging from, you know, families posing together like a nineteen-fifties-style sitcom to LARPers in all-out combat. It was amazing, and it was just down one street.
(31:30):
I think. I'm sure I've seen Google Street View capture
a lot of the infamous horsehead mask. Yeah. The question then,
is, with the horsehead mask, does it blur it? Well, I guess we'll have to ask our listeners to find instances of the horsehead mask blurred and unblurred.
Maybe we can crack this mystery. But now, you know,
(31:51):
this also leads into the fact that... the blurring going on with Street View shows that there are some concerns with face recognition. And it's not just... are you looking up the horsehead mask, Joe? We're doing a show right now. The time for research is over. You're... all right.
(32:18):
The other cool thing, well, not cool thing. The other
thing that we have to keep in mind is the
fact that there are some concerns about facial recognition. You know, in this case, Google is using facial recognition to meet concerns, but there are concerns about the technology all
by itself. Right. So when we talk about officials and
organizations and governments using facial recognition technology, that sounds really
(32:40):
good when it's like, oh, they used it to catch
somebody who was really bad, like a dangerous violent offender. Okay,
But at the same time you have to think, well,
should we be concerned about privacy? Yeah, And the answer
to that is yes, we should be concerned about privacy. Absolutely.
We should be concerned about privacy all the time, basically. Yeah.
In the United States, more than half, I think twenty-
(33:03):
six states plus the District of Columbia, allow law enforcement to use facial recognition technology in the course of their duties as peace officers. And I think there's something like twelve that do not, or maybe it's eleven that do not and then twelve that have some form of facial recognition
(33:26):
technology but are not currently using it. That kind of thing.
At any rate, you also have this enormous database, not just of all the different criminals who have been processed and had mug shots taken, that kind of thing. I mean, mug shots really are an early form of facial recognition that was all manual, right? It wasn't algorithms, it was eyeballs rather than algorithms. Well,
(33:52):
that's fair. Yeah, well, you have to sort through
the pictures. But at any rate, that was a long process. Now, this technology can rely on
those mugshots, but it can also you know, some of
these law enforcement agencies might be tapping into databases for
people who have, I don't know, received a driver's license or an ID, a state-issued ID, where
(34:13):
they've got an image on file that is tagged with
their identity because it is part of ascertaining that person's identity.
It's a way of verifying that person's identity. Yeah,
and it also has their address and lots of other stuff,
other vitals, things like that. Yeah, that starts to maybe
become a problem when you realize. Okay, so all kinds
(34:35):
of people are being visually scrutinized and perhaps in a
very broad way, all considered as potential suspects in identifying faces
without even really being at the scene of or associated
with the crime in any way. Yeah, it's a worry that it could quickly turn into a
true surveillance state, right? Not just that we
(34:57):
have this capability of looking at people in the context of a crime that's been committed,
and trying to figure out who was there, what were
their roles in the crime? Uh, and how can we
get in touch with them so that we can either
you know, detain them if we suspect that they were
part of it, or ask them questions if they were
not part of it but were present.
(35:19):
And it's also a question of letting the public
know whether these systems are in place and whether the
law enforcement agencies have access to them. There was a
story in Ohio where there was a big backlash of
citizens who were unaware that these systems were in
place and could be used, and the law enforcement response was,
(35:41):
these systems are in place, we have our own policies
in place to make sure that misuse does not happen,
because if it does happen, it will be treated with severity.
That sort of thing to try and let people know
we are relying on these things so that we can
help pursue criminals, but we're not going to
tolerate any kind of misuse of the system, which I
(36:03):
think is important. I mean, they have to prove that that is in fact their policy, right? Right. Just
saying it isn't quite enough. No, But you know, it's
one of those things that is clearly, you know, a big concern. Oh yeah. And it's not only on the state level. The FBI,
the Federal Bureau of Investigation here in the United States,
(36:23):
has an increasingly robust system called the Next Generation Identification
System, or NGI, which includes, and I quote,
an image searching capability of photographs associated with criminal identities
paired with this increasingly advanced facial recognition software that they're building out. It just rolled out this
(36:44):
Interstate Photo System, or IPS, yesterday, September fifteenth. Really timely news. Yeah. I mean, it's been
working on this for years. I think that they
had announced their intention to do it back in two
thousand six or two thousand eight, something around there,
and that the contract for developing all of the technology
(37:07):
was awarded at that time to Lockheed Martin Transportation and
Security Solutions. Not a big surprise. Lockheed has a long
history when it comes to surveillance. Look at the history
of Area fifty one for more, right, right, yeah, um. So,
so this system basically pools data from, as far
as I can tell, all levels of law enforcement and
(37:28):
criminal justice and distributes that data back out to those
agencies once the FBI has collected it. So in this case,
you have a kind of national system where you might, you know, otherwise have to rely upon a regional system.
Like a state might have access to all criminals that
have been processed in that state, but if it's a
criminal from another state that has done something, you may
(37:51):
not be able to link the identities easily, you would
have to have interstate cooperation, which is incredibly complex in the law enforcement world. Oh yeah. And
that's what this system is intending to do, at the very least on the surface. There are a bunch of organizations, like, for example, the Electronic Frontier Foundation,
(38:12):
that are calling for greater disclosure of, and interoffice checks and balances on, the privacy and security and scope of this entire project, because it's obviously concerning.
Well, yeah, I mean, it's one of those things where right now, if this is specifically
(38:33):
limited to people who have a criminal background, then for
a lot of people they're just going to say, oh, well,
then that's fine. But the implication here is obviously,
well one, if you've been processed, that doesn't necessarily mean
that you're actually a criminal. Yeah, you may have been processed, but it was because you had been mistakenly identified as a criminal in the system. Or, I mean,
(38:56):
you know, and this is treading on slightly opinionated territory here, apologies, guys, but you know,
if you were maybe processed for participating in a non
violent protest or something. That's a great example. You could have been protesting something and then rounded up
as part of an effort to clear out the streets,
say after a curfew or something along those lines. I mean,
(39:19):
at the time we're recording this, obviously, the events in Ferguson are brought to mind in that case,
and you could easily see where people would be very
concerned about the potential misuse of the system, or even
just how the system could inaccurately identify somebody, and
that could very much impact their life in a negative way.
(39:42):
I mean, it's understandable why this is a frightening thing
and it needs to have a responsible and accountable administration. Right.
It's really easy, I think, especially for the general, perhaps
less informed public's imagination, to jump to something like Minority Report as an example of how wrong it could go. Yeah. Yeah,
(40:03):
Like, the example that I put here was Minority Report, and I was thinking specifically about how, in Minority Report, John Anderton walks around and is immediately identified, and then
the world responds to his identity. Uh. And we think
about this in the context of the Internet of Things,
about how our environments will adapt to us, which
(40:23):
is a great part of the Internet of Things. It's really
this magical kind of world. But it also means that
it needs to know who you are in order for
this to work. And if it knows who you are,
then maybe other people know who you are and they
may use it for good or for ill. And right now,
as it stands, we don't have like an incredible database
that has everybody's picture and everybody's identity. There are
(40:48):
giant databases of that regionally, but it's not all linked
together yet. But I see that there's a note here
that's going to fill me with terror. Yeah. So this
new system the FBI is setting up will be able
to collect photos and other biometrics from civil sources. Um,
and what does that mean? That's a terrific question, y'all.
(41:10):
I'm not entirely sure. Civil sources could include, I suppose, businesses that had taken fingerprints and photographs, if
those businesses choose to disclose that data to the FBI.
Could it include the Department of Motor Vehicles because that
(41:31):
would be everyone's driver's license. You know, that is a terrific question that I do not personally know the answer to. But you know, they'll also have the capacity to collect and retain images, certainly from crime scene security cameras, for later use.
So you know, even if you were just walking by
and would not have normally been called in, it's going
(41:54):
to be in their files. I have a quote here from their Privacy Impact Assessment for the Next Generation Identification Interstate Photo System, which was written on June ninth of two thousand eight, actually, so a minute ago, kind of explaining what this civil business is a little bit about. So parse this, if you will:
(42:17):
Authorized non criminal justice agencies and entities will be permitted
to submit civil photographs along with civil fingerprint submissions that
were collected for non criminal purposes. These photos may either
be provided to the submitting agency by the individual or
taken directly by the submitting agency. Civil photos will supplement
the biographical information and narrative physical descriptions that are already
(42:40):
provided under existing practices. Yeah, that sounds a little ominous.
I mean, that doesn't... it doesn't say a lot about privacy, actually. It's a lot of words that don't actually describe much. It sounds to me like, if you have this information, we can totally ask for it, you can totally give it
(43:00):
to us, and then we'll have this enormous database. See
the issue that we bring up here, the reason why
this privacy issue is really important. And you know, again
there's always the argument of, if you just behave, if you don't do wrong, then... But the problem is, who defines what is wrong? So if, right now,
"what is wrong" means, you know, disobeying the law,
(43:22):
and the law is reasonable and the law reflects what
society feels is important, then that's one thing. But if
the law ends up changing where let's say, you aren't
allowed to associate yourself with a certain political
group or a certain ideology, and then they're using these
sort of systems to identify people who do associate with
(43:43):
those things, that becomes a real problem. I think the
whole "if you haven't done anything wrong, you don't have anything to worry about" argument against privacy is nonsense. False. Anyone who says that, say to them, okay,
can I read all your email now? I mean you're
not a criminal, are you? Yeah? I mean Obviously, people
(44:06):
have reasons for wanting privacy other than wanting to conceal
the fact that they've committed crimes. People, you want to
have some things that are kept private. You don't want
people knowing more about you than you want, then you
give them permission to to know. I mean, that's just
a sort of basic fact about human psychology. That's perfectly normal,
and we all feel it. I don't know anybody who
(44:27):
would be like, oh, yeah, you can just read all
my emails here, my text messages. Sure. And also, I mean,
you know, like I know that this is a pretty
common thing to bring up in a discussion like this, but you know, go back to the nineteen fifties,
the Cold War and the Red Scare, and the amount
of, like, weird blacklisting and stuff that happened
(44:48):
in courts, like legally, in courts, at the time. And of course now most people are like, yeah, no, that was our bad, our bad, sorry, guys. But yeah, you know, you could see similar rumblings post-9/11 too.
So that's the thing: you know, we see these dark sides of human
(45:08):
nature come up over and over again, given the right
set of circumstances, and uh, you know, when you get
a technology that enables a much more efficient means of
sorting people in that way, then of course, I mean,
obviously there are some concerns there. So again the big
concern here, keeping in mind, technology itself is neither good
(45:29):
nor evil. It's how we employ it. So it's one
of those things that if we are responsible, if we
hold the people who have the keys to this technology
responsible and accountable, then we have a much better chance
of making sure it's being used in an appropriate way. Right.
And another thing about technology is I mean you can't
stop it. That's also true. We're not going to stop it.
(45:51):
It's going to happen. So it's better to set in
place standards for how it should be used than to say, no, no, no, we can't go there. Somebody's going to go there. It's
better to do it, right. Yeah. So here's the thing.
Now that we have these concerns, I'm starting to wonder,
are there ways to make my face undetectable? Um? Yeah,
(46:13):
I mean basically yes, yeah, I mean apart from just
like sticking knives in it and stuff. Yeah. No, I don't necessarily recommend that, Joe. Lots of
people are working on that kind of thing. There's an artist by the name of Adam Harvey
who's gotten a lot of press for his ideas and
suggestions via his website, which is CV Dazzle. Dazzle,
(46:35):
of course, being an old term for camouflage that was
used, I believe, starting around World War One.
He suggests not wearing makeup that enhances your facial features.
So cut out that terrific eyeliner, Joe. Partially
obscuring the bridge of your nose, partially obscuring your eyes,
(46:55):
creating asymmetry on the two sides of your face. Um,
using makeup or hairstyling or accessories to create un-face-like and un-head-like shapes. Yeah, yeah, something like that. Sure. And furthermore, being careful not to really overdo it, which would make you more conspicuous to human eyes. If I
(47:17):
walk out and I've got, you know, harlequin style makeup
so that facial recognition software can't tell it's me, chances
are someone else is going to look at me and say,
all right, well now, I'm even more alert to your
presence than I would have been otherwise. Sure, sure, And
you know, certain things like masks are illegal
to wear around on the street in certain cities or
(47:38):
certain municipalities. Atlanta, Atlanta is one of those. I didn't
know that. I mean, I've never tried to just wear
a mask out. When Anonymous were holding
various protests and they were getting together in public. You know,
the symbol of Anonymous is the Guy Fawkes mask. And
there were quite a few places and I think Atlanta
(47:58):
was one of them where they were not allowed to
wear those masks. It was against the local laws. So
yeah, there are cases of that. And lots of other folks are interested in designing technology to counteract this sort of thing. Back in twenty thirteen, Japan's National Institute
(48:20):
of Informatics designed this pair of goggles that was fitted
with eleven near-infrared LEDs, whose light is invisible to the human eye, basically, but would confuse the heck out
of infrared sensitive cameras, which are often used in order
to capture images in low light situations, right exactly. Also,
it would mean that you wouldn't be able to buy cigarettes.
(48:40):
Of course, part of the problem is that as we come up with these techniques to disguise our faces,
I mean, whether or not we actually want to go
to those lengths. At the same time, visual identification technology
is probably just going to continue getting better and better.
I mean, it can get a lot better than where it is today. One of the things I think is going to improve, obviously, is wider diffusion of high-
(49:03):
resolution imaging, because if it's trying to get your face
from a low res image, which a lot of security
cameras and things like that are fairly low resolution, it's
going to have more trouble. If it can get a better shot to work with, that's obviously more data. Well, and also, if it can rely on more than one camera angle, then that helps too, because you get that
(49:24):
three-dimensional look. Yeah. And so, as camera technology becomes cheaper and more widespread... And one thing: just storage. I mean, you know, how much data can you store on these things? It's cheaper and easier to do more.
I can see that happening. Another thing is whole body recognition.
Y'all ever thought about this? Yeah, yeah, I've heard.
(49:46):
I mean, like I said earlier in the podcast about
how Jonathan might be able to personally recognize you from
just your stance. Yeah, exactly. So maybe the way your body is shaped, your height, the length of your shoulders, your build, your posture, the way you gesture. There was a study published in the
(50:06):
journal Psychological Science where a team led by the UT
Dallas researcher Alison Rice reported that seeing the whole body
is actually a pretty crucial factor in identifying a person,
especially when the facial data is ambiguous. So
they did a few experiments. They asked college students to
look at images of people who were somewhat difficult to identify.
(50:29):
So they were either two different people who looked similar or the same, or the same person in two different pictures where their faces looked fairly different, and they judged these by having a computer algorithm look for the ones that were giving it trouble. So the students
could judge identity pretty accurately when they were allowed to
(50:50):
look at the person's face and whole body, but not
when they saw only the faces. And the funny thing
was the students seemed to think they were relying on
faces to determine identity, but the researchers used eye tracking
software to show that they were actually looking at the
rest of the body basically for supplemental data when the
faces were somewhat ambiguous. And so the way this applies
(51:13):
to the software and hardware we're talking about is that,
obviously there's more data to account for. If you can take a picture of somebody's whole body, that gives
you a lot more to work with than just the face. Yeah, yeah, yeah,
I mean, obviously, with anything involving biometrics, the ultimate answer here is that the more data you have,
(51:35):
the more sure you can be of the result. That's
exactly right. And one of the things is another advantage
that's going to be given to the people identifying you
in the future is that we're all giving up tons of
data right now. Yeah, I'm sorry, I was posting on Facebook. What are you talking about? We upload pictures of
ourselves and our friends and our family members and our
(51:56):
cats to Facebook and Google and et cetera basically all
the time. Don't we? Yes. Yeah, I mean, isn't it funny? We, like, complain about privacy and then we do this. Yeah, yeah. We're like, check out this great vacation I had! Look how awesome I look! Yeah, totally. The projected number of
photographs that the FBI is supposed to have in this
new database of theirs by next year, I think it's
(52:18):
something like fifty to fifty two million, and the number
of photographs that are on Facebook is like seven or
eight times that, easily. Yeah, yeah, it's a lot
of photographs. Well, and keep in mind that we're talking
about facial recognition here, but there are also image-matching algorithms. Right,
So if you end up being able to tap into
(52:40):
something like let's say that you get back door access
to the behind the scenes of one of these big
social networking sites where you can do a quick image
matching search of a picture you have versus
all the images that are actually contained within that social network,
(53:00):
most of which are tagged in some way or at
least are associated with a specific person, then you suddenly
have an incredibly powerful surveillance tool. Yeah, and there's
certainly concern, you know, from activist groups, among many other human people, that, similar to how the NSA has definitely collected data from
(53:21):
these private organizations and businesses, the FBI, for example, might also start tapping into that kind of thing.
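[Editor's note: image matching doesn't even need face-specific smarts. Here is a minimal sketch of one common technique, a perceptual "average hash," that can flag near-duplicate photos across a tagged collection. The file names, the tagged-photo dictionary, and the distance threshold are all invented for illustration.]

```python
from PIL import Image

def average_hash(path, size=8):
    """Shrink to an 8x8 grayscale thumbnail, then record which pixels are
    brighter than the mean. Similar photos yield similar bit patterns."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Hypothetical stand-in for a social network's tagged photos.
tagged_photos = {"vacation_joe.jpg": "Joe", "cat_lauren.jpg": "Lauren"}

probe = average_hash("surveillance_still.jpg")   # placeholder file name
for filename, person in tagged_photos.items():
    if hamming(probe, average_hash(filename)) <= 10:  # threshold for demo only
        print(f"Possible match: {person} (via {filename})")
```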
So again, just not to scare everybody out there, not
to say, you know, never share anything ever with anyone. Rather,
again I'd like to reiterate that the answer here
(53:44):
is that we hold people accountable, and that we demand transparency for the use of these kinds of technologies, so that they are used responsibly. Because people are people,
you know. And while we often will think of
these organizations as their own kind of sentient entities that
operate completely independently of the human experience,
(54:07):
they're made up of humans. So we got to hold
them accountable and make sure that they're being responsible. And
there are a lot of ways that this technology is
going to be incredible and really help us out, like
in that Internet of Things scenario, but the way that
we get there is that we make sure it's being
used correctly. Yeah. So, you know, educate yourself. Well, I'll try to remember to do a blog post
(54:28):
or something with some links for further reading, but you know,
get out there and be aware of what your local
and state and national governments are doing with your image. Yes,
it's important. Well, I hope you guys learned something
from this podcast. If you have any suggestions for future episodes, maybe a question that came up during this episode, or just something else you always wanted to know about, like what is this going to
(54:49):
be like in the future, you should let us know.
You can follow us on Twitter; our handle is FWThinking. That's the same over at Google Plus. If you go to Facebook and type in the little search term FW Thinking, we will pop right up. Come and be
our friend. Come and follow us. Come be part of
the conversation. Because you know we're heading to the future.
We don't want to leave you guys behind. So that
wraps up this episode and we'll talk to you again
(55:11):
really soon. For more on this topic and the future of technology, visit FWThinking.com. Brought to you by Toyota. Let's go places.