
July 14, 2023 29 mins

Calling all tech lovers! Amazing Wildlife is joined by the head of San Diego Zoo Wildlife Alliance's Conservation Technology Lab, scientist Ian Ingram. He shares with Rick and Marco how the latest technology is being used to help wildlife, and how we hope to implement it in the Asian Rainforest Conservation Hub. The trio discusses the use of artificial intelligence (AI), how cameras know what they should or shouldn't record in native habitats, and how sensor systems can detect things that people cannot. Ian also describes how, in the not-too-distant future, scientists might use drones and four-legged robots to set cameras.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Hi, I'm Rick Schwartz.

Speaker 2 (00:06):
What's up, world? I'm Marco Wendt.

Speaker 1 (00:08):
Welcome to Amazing Wildlife, where we explore unique stories of
wildlife from around the world and uncover fascinating animal facts.
This podcast is a production of iHeartRadio's Ruby Studios and
San Diego Zoo Wildlife Alliance, an international nonprofit conservation organization
which oversees the San Diego Zoo and Safari Park.

Speaker 3 (00:26):
All right, Rick, so we wrapped up our last episode
telling everyone that you learned about something called conservation technology,
and I see that we have an appointment at the
San Diego Zoo Wildlife Alliance Beckman Center. That's the main
building where a lot of our conservation scientists work. This
episode was going to be about our Asian Rainforest Hub.
So how does conservation technology fit into all of this?

Speaker 1 (00:49):
Well, Marco, all I can say is be prepared for
a whole different side of conservation that often happens behind
the scenes, or at least if not behind the scenes,
it can often go unnoticed. And you're right. We started
planning this episode around our Asian Rainforest Hub, and this
includes our projects and partners that are focused on the work
needed to maintain sustainable habitats for tigers and orangutans, and

(01:11):
one of my favorites, binturongs, and sun bears and hundreds
of other species that inhabit the region.

Speaker 2 (01:16):
Oh man, that sounds really interesting. Now, I know a
lot of our conservation scientists

Speaker 3 (01:21):
do amazing work, like studying and preserving genetic materials in
the Frozen Zoo as an example, and definitely a whole
lot more. But we're talking about conservation technology. So is
this about radio collars and, like, trail cameras?

Speaker 1 (01:36):
Oh yeah, yeah, radio collars, trail cams, sometimes called camera traps.
These are both pieces of conservation technology. But get this,
I found out there is much more to it now
than just camera traps and collars. With the rapid growth
of technology, from algorithms to artificial intelligence, these are all
things we get at a sort of consumer level, and
conservation technology is also rapidly advancing along with these technologies.

Speaker 2 (01:59):
Oh man, that's super fascinating.

Speaker 3 (02:00):
I mean, seriously, I can't wait to discuss what else
there is and how it's all going to be used
in our Asian rainforest hub to help with these conservation efforts.

Speaker 1 (02:08):
And that's the interesting twist to the story, Marco. Oh yeah? Well,
as I was digging around to find out more about
our work in the Asian Rainforest Hub, I spent some
time with our conservation technology team and found out that
some of the latest technology has been deployed here in
the Southwest Hub and more recently in the Amazonian Hub.
The success of some of these technologies in the Amazonian

(02:29):
Rainforest now has our conservation team very excited to deploy
it in the similar habitat of the Asian Rainforest Hub.

Speaker 3 (02:36):
Oh wow. So when do we get to talk to
someone from the conservation technology team?

Speaker 1 (02:40):
Well, I say we head over to the Beckman Center
now and go have a conversation.

Speaker 2 (02:44):
Oh man, that's a great idea. You know, I'm sure
the guests know.

Speaker 3 (02:46):
Now, we're an eighteen hundred acre conservation park, and there's a
portion that's inaccessible to guests, but it's a unique area
called the Beckman Center where all of our conservation scientists
and researchers do all this incredible work.

Speaker 2 (02:57):
So yeah, I'm super pumped, let's go.

Speaker 3 (02:59):
Rick.

Speaker 4 (03:01):
I am Ian Ingram. I am a conservation technology scientist
here at the San Diego Zoo Wildlife Alliance, and I
lead the Conservation Technology Lab.

Speaker 1 (03:11):
And what does conservation technology mean? For the average person,
what would that mean?

Speaker 4 (03:17):
I mean, technology is a pretty broad term. Everything
humans build is technology. We're really mostly focused on the
use of computers, embedded computers and devices like that, so
digital electronics applied to the conservation problem.

Speaker 3 (03:32):
And actually with this location, we're in a unique spot
here at the Safari Park. What's the name of this
building and what's so important about this location?

Speaker 4 (03:38):
We're in the Beckman building, which houses at least the
majority of the conservation science and wildlife health team. So
there are numerous scientists and researchers of all sorts of
stripes working here in conservation genetics, disease investigations, recovery ecology,
population sustainability.

Speaker 1 (03:56):
I mean, there are actually far more.

Speaker 4 (03:57):
But so there's, you know, a very broad swath of
folks who are tackling our conservation goals with different tools.

Speaker 3 (04:08):
I'm curious too, because admittedly I was doing a little,
like, stalking of you on the internet, and I found
some interesting facts about you: that you have a background
in art and in robotics, right? Can you share
a little bit about some of your past history?

Speaker 4 (04:20):
Sure, I'm trained as a roboticist, specifically in underwater robots.
My goal when I was a kid was to find
the Loch Ness Monster, and at the time, it seemed as
if the best way to go about that was to
use underwater robots. So I studied that, and then I
segued into looking for the giant squid. I
was advised that maybe looking for an animal that may
or may not exist might be a bad career move.

(04:43):
I took that advice, and so I segued to the
giant squid, which definitely exists but was still kind of
mystical in a way. At the time, nobody had really seen them alive.
They'd always been washed ashore or found moribund, floating on
the surface. And I mean, to summarize, the arc of
my career has sort of been about initially looking for

(05:03):
really large animals that may or may not exist, to
working with very small animals that definitely exist, and working
with animals that might not exist for much longer if
we don't help them.

Speaker 3 (05:14):
Yeah, no kidding. We always talk about, you know,
especially when you're a child, what sparks
your interest in conservation.

Speaker 2 (05:20):
I picture you, you know, in love, because

Speaker 3 (05:22):
someone actually hinted to me that you're in love with
the Loch Ness Monster too, so it's interesting that that's where it
all came from. And now you're knee-deep in, like,
really essential conservation projects with real-life animals.

Speaker 2 (05:31):
Not to say that the Loch Ness Monster doesn't exist.

Speaker 3 (05:33):
Or does it? You know, I'm not confirming or denying it,
but now you're doing some really wicked work out there.

Speaker 2 (05:38):
Can you talk about the projects that you're involved in
at the moment?

Speaker 4 (05:42):
We're doing a number of different things. A lot of
them relate to the application of machine learning, you know,
artificial intelligence, to processing data from sensor systems. So that
can mean image data basically photos that are coming back
from camera traps and similar devices, to video data from
similar sorts of camera devices, to audio data coming from

(06:04):
audio recorders, and also movement data that comes from accelerometers,
which, for people who aren't familiar with them, are
tiny little sensors that measure acceleration, which is to say, movement,
and you've got them in your phone, and your phone
actually is using them to learn things about you too,
and we're applying some of that same technology to elephants

(06:26):
and similar species. So the machine learning aspect makes processing
what are essentially massive data sets at this point much
more efficient, in collaboration with humans still, who check that
the ML isn't totally misleading us.

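As a side note for readers who want to picture the accelerometer idea in practice, here is a minimal, hypothetical Python sketch of turning raw movement samples into summary features a machine-learning classifier could label; the window size, feature choices, and behavior labels are illustrative assumptions, not the lab's actual pipeline.

```python
import numpy as np

def window_features(samples, window=50):
    """Split a stream of (x, y, z) accelerometer samples into fixed-size
    windows and compute simple summary features for each window."""
    feats = []
    for start in range(0, len(samples) - window + 1, window):
        w = np.asarray(samples[start:start + window])        # shape (window, 3)
        magnitude = np.linalg.norm(w, axis=1)                 # overall motion per sample
        feats.append([
            magnitude.mean(),                                  # average activity level
            magnitude.std(),                                   # how variable the movement is
            w[:, 0].mean(), w[:, 1].mean(), w[:, 2].mean(),    # rough posture per axis
        ])
    return np.asarray(feats)

# A classifier (for example scikit-learn's RandomForestClassifier) could then be
# trained on labeled windows to predict behaviors such as "walking", "feeding",
# or "resting" -- the labels here are hypothetical examples.
```
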
Speaker 1 (06:40):
Well, I have a couple of questions about that. To
start off with, why do we need, as you say,
ML or machine learning to help with this? For instance,
you mentioned camera traps, or cameras that are set out
in the wild by humans to take pictures when wildlife
walks by. So I guess my question is how does
that work? How does the camera know what to take

Speaker 2 (06:57):
A picture of?

Speaker 1 (06:57):
And then I guess additionally, why do we need a
computer to go through that set of pictures or data?
Why can't we just look at it and say, okay,
there's a leopard, or there's a monkey, and so on?

Speaker 4 (07:06):
You absolutely can do that, and that's how it was done,
and it still is done to a large extent. People
look at the photos and identify what they are. On Zooniverse,
for instance, which is a partner that we work with.
The fact is that the bulk of images coming in
is so large that that's actually prohibitive at this point,
and a lot of times something goes wrong and the

(07:26):
camera trap captures something that isn't even real data, like
just grass blowing in the wind. And I'll get back
to that in a moment, since you asked how they work.
So the ML, and by ML I mean machine learning algorithm, can
very quickly look at those images and throw out the
ones that don't have animals in them at all, meaning
that the citizen scientists who contribute on Zooniverse have that
many fewer false positives, as we call them, to look

(07:47):
at, which are images that don't contain any animals. And it can
also identify which animals are in there, and that speeds
things up greatly. So, as an example, a data set
comparable to one that took us six months to
have citizen scientists label doing it the old way only
took us three weeks. And that's when the ML goes
through first and says what everything is, and then a

(08:09):
person goes back and looks at it on Zooniverse and says, yeah,
that's right, that's right, that's right, and then, that's not right,
and then it gets thrown back through the soup. The
camera traps are triggered by a passive infrared sensor. So
this is a sensor that's looking for the movement of
an animal in front, but also a warm animal, an
animal that's warmer than the background, which actually gets into

(08:31):
a totally other thing that we're interested in doing, because
a lot of animals aren't warm blooded and they don't
trigger the camera traps particularly well, so ectotherms like reptiles
of various sorts and amphibians. And so if you're using
a camera trap in that context, there's some hacks you
can use to try to create that signal that the
PIR, the passive infrared sensor, will trigger on. But what
you can definitely do is apply another kind of machine

(08:54):
learning sort of paradigm which is called edge AI, which is
the use of AI right on the device itself. So we
have another project called ScrubCam where, instead of being an
off-the-shelf camera trap that triggers with the PIR,
the ScrubCam is using an edge AI machine learning model, constantly
looking at what it's seeing and identifying it with the

(09:15):
AI and then triggering only when it sees what it
needs to, or actually really just recording what it sees
when it needs to.

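To give a rough feel for the edge AI idea Ian describes, here is a small, hypothetical Python sketch of a camera loop that runs a detector on every frame and records only when a species of interest shows up with enough confidence; the species list, threshold, and the camera, detect, and record_clip helpers are illustrative stand-ins, not the actual ScrubCam code.

```python
# Minimal sketch of an edge-AI camera trigger, assuming a detect() function
# that wraps an on-device model and returns (label, confidence) pairs for
# one frame. All names and numbers here are illustrative assumptions.

TARGET_SPECIES = {"leopard", "orangutan", "sun bear"}   # hypothetical watch list
CONFIDENCE_THRESHOLD = 0.6

def should_record(detections):
    """Return True if any target species is detected with enough confidence."""
    return any(label in TARGET_SPECIES and conf >= CONFIDENCE_THRESHOLD
               for label, conf in detections)

def camera_loop(camera, detect, record_clip):
    """Continuously look at frames and only save footage worth keeping."""
    while True:
        frame = camera.read()
        detections = detect(frame)           # runs entirely on the device
        if should_record(detections):
            record_clip(camera, seconds=30)   # hypothetical recording helper
```
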
Speaker 3 (09:22):
Whoa, a camera trap that knows when it should or
should not record.

Speaker 2 (09:26):
Wow, that's pretty wild.

Speaker 3 (09:27):
I mean, it sounds like a lot of tech and
computer science to me. But this might be a good
time to ask: why are these images so important?
Why can't a researcher just go out in the
wild and look for footprints from the animals, or maybe
even look into their droppings? Why is having this AI,
or artificial intelligence, out there so essential?

Speaker 4 (09:48):
Well, there are two answers to that question. One is that you
can't take action if you don't know what the problem is,
and you don't know the extent of the problem. So
we can't just have sort of anecdotal ideas of whether
a given animal is reducing in population size or actually improving.
We have to do population studies and that's a big
part of what we're applying those kinds of techniques to

(10:10):
the camera traps and things like that. The other part
of your question is, well, there are two factors to that.
You were talking about a person going out and looking
for tracks or spoor, and A, that's a lot of labor.
That's a lot of people that have to be there
to equal the efficacy of a large array of camera traps.
There's actually a lot of different facets to this, because

(10:31):
there's also the fact that people themselves disturb the habitat
when they go in there, and a lot of what
we're trying to do is to gain this data in
the least invasive possible way, and just, as one
of my colleagues called it, the ball of smells that
a human being represents is the problem. I mean, you
leave a little bit of that on the camera trap
when you deploy it, which actually gets into another thing,
which is stuff we're experimenting with where we would use

(10:52):
robots to deploy the sensors too, so humans wouldn't even
have to enter the habitat being studied. And then there
is the fact that animals don't show up when people
are around, and animals don't always go to the places
where people are. And then there's the fact that that
people don't detect a lot of things, So we have
the potential to use sensors that can detect things that

(11:13):
people can't detect. When it comes to the acoustic side
of things, and we have an acoustic recorder out there,
that often means recording into the ultrasound. So what defines
ultrasound is sound that humans can't hear. It's higher frequency,
higher pitch than humans can hear. And a lot of
animals are vocalizing in that range. So bats and rodents,
for instance, are vocalizing in that range. And on the

(11:33):
other side of the frequency spectrum is the infrasound, the
sound that's too low for humans to hear, and elephants
are using that too, So some of our sensor systems
are recording in those places and humans wouldn't even know
there was something happening there to begin with.

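For readers curious about the recording side, the reason ultrasound needs special recorders comes down to the Nyquist rule: a recorder can only represent frequencies up to half its sample rate. Here is a tiny Python sketch with illustrative numbers; the specific call frequency and sample rates are assumptions for the example, not details from the episode.

```python
def max_recoverable_hz(sample_rate_hz):
    """Nyquist limit: the highest frequency a recorder can represent."""
    return sample_rate_hz / 2

# Humans hear up to roughly 20 kHz; many bats call well above that.
example_bat_call = 45_000   # Hz, an illustrative echolocation frequency

for rate in (44_100, 96_000, 192_000, 256_000):   # common recorder settings
    ok = max_recoverable_hz(rate) >= example_bat_call
    print(f"{rate} Hz sampling -> captures a {example_bat_call} Hz call: {ok}")
```
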
Speaker 3 (11:47):
That's really interesting for me because, you know, in our
nocturnal episode I referenced, you know, the realities behind
the realities, and you're mentioning, you know, different
acoustics and sounds in a jungle as an example.
And in the past I've worked with cassowaries, you know,
very unique vocalizations that humans can't pick up. So it
gets me really excited thinking that we have technology now
that can immerse itself in a habitat where wildlife isn't

(12:07):
necessarily going to react to, say, maybe a human giving
a cough in the middle of the jungle or sloughing
off some skin cells that a mammal might pick up.
So now we might be able to, like, be there
observing behavior without physically being there, and see some
really unique stuff that we probably never would have
seen. Or, like you mentioned, technology can pick up sounds
or maybe even some visuals that we just can't pick up,
and we'll know even more about a certain species, which

(12:30):
kind of makes me think about how there are really unique environments
that are really hard to get to. I mean,
everything from, like, the Arctic or the polar regions, right,
with frigid temperatures, or my favorite, the rainforest,
you know. But even then, that presents challenges of
its own, right? I mean, can you talk a little
bit about maybe some projects we're trying to focus on
in our Amazonian rainforest habitat, with really unique species
and, like, thick, thick jungle that's kind of almost
near impossible to get to and access data from, right?

Speaker 4 (12:56):
Yeah. I mean, one of the maybe more prosaic aspects of
making any of these things is powering the devices.
They have to have enough power to run, so they either
have to be low power and run on just batteries
like you would have in a consumer device, or they might
use solar power. But in the Amazonian rainforest, it is
not easy to use solar power because the trees are

(13:17):
working against you. So you can put a solar panel
up higher and then run the power down,
which isn't really that easy to do, but that's a
major concern working in that kind of habitat. To the
question of what we're doing there, we do a lot
of different things across the organization in the Amazon, mostly
the Peruvian Amazon. A lot of the projects the

(13:38):
Conservation Technology Lab is involved in are connected to camera
trap arrays that are deployed, paralleled by acoustic
recorder arrays, so we're getting images and video and
sound of different animals that are there. It's probably worth
bringing up that there's this whole idea that that which
you see in a camera trap is different than that
which you'll hear, because a lot of cryptic species will

(14:00):
never show up in a camera trap, either because they're
not on the forest floor or they're not big enough, but
they often will be making noises that you'll pick up
with the audio recorders. And so that's why having those
two different paradigms of sensing is so important.

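Ian's point about powering devices under a rainforest canopy is ultimately a simple energy budget; here is a hypothetical back-of-the-envelope Python sketch (all numbers are made up for illustration, not real hardware specs) of estimating how long a battery-only device might run.

```python
def runtime_days(battery_mah, avg_current_ma):
    """Rough runtime estimate: battery capacity divided by average draw."""
    hours = battery_mah / avg_current_ma
    return hours / 24

# Illustrative numbers only -- real values depend on the hardware and duty cycle.
battery_mah = 20_000          # e.g., a large lithium pack
idle_ma = 5                   # device mostly asleep
active_ma = 400               # camera and model running
duty_cycle_active = 0.05      # active 5% of the time

avg_ma = duty_cycle_active * active_ma + (1 - duty_cycle_active) * idle_ma
print(f"Average draw: {avg_ma:.1f} mA")
print(f"Estimated runtime: {runtime_days(battery_mah, avg_ma):.0f} days")
```
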
Speaker 1 (14:14):
Now, I like that you brought that up. I think
that's important to remember too, that for as many camera
traps as are deployed and the thousands and thousands of images that
come in from them, it is a very narrow window.
You have to have the camera trap pointed in the
right direction, right at the right level for a particular species,
and hope that they walk on the right side of
the tree where your camera is and not the left
side of the tree where your camera's not pointing. So

(14:36):
that's a really good point, that it's a very limited
window in which you're getting data. Obviously, we can use our
best guesses by understanding trails and how the environment is
used by the species. But that does then bring us
right back to what you said about the acoustics needing
that audible side of things, which is picked up differently,
it travels differently through the forest, et cetera. We were
talking a little bit beforehand too about all this work

(14:57):
that is being done in the Amazon for us, and
how what we have learned, what's developed over time,
is now going to be something we can apply to
the Asian rainforest as well, because the challenges there aren't dissimilar.
What would you say has been, in your time doing
this work, the most interesting thing you have learned from
something going right or wrong out in the field?

Speaker 4 (15:17):
I mean, I guess the first thing I'd say, and
this is just something that I know after many decades
of working as an engineer, is that things go wrong,
and you have to know that they're going to go wrong,
and you have to test. It's pretty much the cornerstone
of making something work that you just test it a lot.
We in the Conservation Technology Lab have this internal mnemonic
that we use, BORA BORA. The B stands for bench,

(15:41):
the O is the outdoor learning lab, which is
just this teaching lab that's right outside this building, the
Beckman building, the R is the reserve, the Biodiversity Reserve,
which is this eight hundred acres of land that's immediately
adjacent to the Safari Park, so just a little further afield.
And then the A is afield, like our remote locations
in Svalbard or in the Asian rainforest or in

(16:02):
other places. And so we work as hard as we
can to make something work on the bench, which is
the first part, and then we take it to the
outdoor learning lab which is just outside the door, and
immediately realize that we forgot something.

Speaker 2 (16:15):
That's why it's called the learning lab, right? Yeah.

Speaker 4 (16:17):
For us, it's the learning-what-we've-forgotten-and-what-we-were-wrong-about lab.
And then once we get that, we take it to Sagebrush,
which is our sensor network in the Biodiversity Reserve.
The BRUSH stands for Biodiversity Reserve Ubiquitous Sensing Habitat.
And then we prove it out there, and we do that
with projects that we're doing there with cougars and rattlesnakes,
which are very interesting species that are local to our
Southwest area. And then

(16:40):
if it passes that test, then it goes further afield.
And the reason it's BORA BORA instead of just BORA
is that engineering, like many things, is iterative, and you
end up going back to the drawing board like Wile E.
Coyote and having to start the whole process over again.
So, you know, there are stories of things going wrong.
The reason I'm pausing so much is I don't want
to, like, accidentally point a finger at someone.

Speaker 2 (17:03):
Yeah, you don't have to out anyone, I guess.

Speaker 1 (17:05):
I guess what I was getting at is that I know,
just from, you know, working outdoors, working with the wild,
and working in conservation, that sometimes your best lesson comes
from something you didn't even realize you had to
learn, or it's the most unexpected thing that makes you go,
oh duh, you know. You can try and outthink the situation
all you want, but it seems that the animals, and the
outdoors in general, will always teach

(17:26):
us something.

Speaker 4 (17:27):
So I mean, this isn't an example of the worst thing
that ever happened, but we often find that wood rats
like to nibble on the cables of the things that
we've got out in the Biodiversity Reserve. And it's a
simple solution: you just don't run the cable past
where they live. That's the area they care about, and
that's where they start exploring with their teeth, which is

Speaker 2 (17:45):
What rodents do.

Speaker 1 (17:47):
So I've been out on the reserve and saw a lot
of the different locations where the camera traps are. So
for here, just locally in our Southwest environment where we
have this stuff deployed, what's been some of
the most surprising, the most interesting things you have seen
come up in all this data? Or were there no surprises,
like, no, everything's out there that we

Speaker 4 (18:05):
Expected well, you know, it kind of speaks to your
earlier question about trying to do surveys with just people
out in the woods versus doing it with the equipment.
I've never seen a bobcat in the wild, ever. I
think someone pointed one out once and I couldn't tell
that it was a bobcat. But we actually see them on
the camera trap images all the time. So with these

(18:27):
regular camera traps, we're able to kind of see this
aspect of what's going on in there that otherwise you
wouldn't see at all. We see a lot of other
kinds of things. I mean, it's always interesting to see
who's eating whom. We get a lot of camera trap
images of cougars would say a skunk in their mouth,
you know, and yeah, maybe that's not so crazy, but
you know, then you know that they're actually catching skunks

(18:48):
and enjoying them.

Speaker 3 (18:49):
I remember seeing one of those field cams out in
the reserve, and I saw an image of a skunk walking
out with what looked like a gopher in its mouth, and
I forget, like, that they're omnivores. But you know, to
your point, you don't really see that with your own eyes.
So I think even being a wildlife care specialist, having these
field cams as a tool for us to use is
a good angle to really capture certain behaviors, like, for instance,
with the western burrowing owls at Condor Ridge. There were certain

(19:11):
aspects of their behavior between the pair that I had
never witnessed. Even though I was trying to be the
sneakiest specialist that I could be, hiding behind a pine tree,
they knew I was there.

Speaker 2 (19:20):
You know, they still modified their behavior.

Speaker 3 (19:22):
The second I got there, they knew it, and those field cams
just showed me some really interesting stuff.
So there's a lot of good potential in tech, I
think, with that.

Speaker 4 (19:29):
Yeah, something that I often think about is how animals
are alive all the time. It sounds like a simple thing,
but as a bird watcher, an animal watcher for most
of my life, you get to the point where you're
realizing that you're only seeing them at certain times, but
they're doing everything else the rest of the time. If
there's a storm, they're finding someplace to roost if they're
a bird, and they're wet and so on, and then

(19:51):
they're huddling in their nests if they're a squirrel or whatnot.
But we're diurnal beings, and even when we occasionally stay
up to be nocturnal, we're not doing it all the time,
and so there's a lack of overlap between when we're
active and when we're not, and when the animals are
active and when they're not. And the devices, the camera

(20:12):
traps and other sensors can be active all the time,
so they can catch all those little secrets.

Speaker 3 (20:16):
Yeah, no kidding, right? And all the nuances we'll
know with a certain species like the polar bear.
And again, back to the Asian Hub, where we're doing
work with the Asiatic black bear as an example. And
still, you know, in thick, vibrant rainforest habitats, it's
really hard to track a bear, and even if you could,
it's going to modify its behavior in some regard.
And we were talking about this before we started
recording, buddy, how I'm definitely not a tech guy, but

(20:38):
I can appreciate technology and the advances, and how it's helping
us out in these conservation efforts, and, tied to that,
helping out communities that are trying to live next to
this wildlife too, which I think is just one of
the most important things.

Speaker 1 (20:51):
And I want to take a moment and ask about
the rapid developments in technology. For example, and I'm probably
dating myself, but back when I started working professionally in
wildlife care and conservation, the trail camera was just a
box with a thirty-five millimeter camera in it, with
film and everything. And then that got updated eventually as
technology moved forward to digital, and then it was HD
as technology advanced even further, but you still had to

(21:11):
go out there and retrieve the data cards and so on,
you know. And with cell technology now, we're seeing
many devices that just upload data right to the cloud
directly via a cell signal, so there's no need to go
back out there and have humans disturb the environment.
It's amazing, and we hear that in the future, and probably
the not-too-distant future, robots are going to be
helping us with conservation as well. What are some of
the things robots will be used for in conservation?

Speaker 4 (21:33):
There's a broad spectrum of things we think we can
use them for. A lot of those would require that
sort of turnaround moment where the longevity of the robots
is greater. The short-term application that we're planning to use,
and I wouldn't say it's proprietary, because the philosophy of our
organization and also the Conservation Technology Lab is that we
share these kinds of results pretty openly, is to use
the quadrupedal robots in VHF tracking. So we have a

(21:56):
collaboration with Engineers for Exploration at UCSD, the University of California,
San Diego, and they had developed a device to do
VHF tracking. So this is where an animal has a
beacon attached to it, like a radio collar.
There are lots of different kinds of collars, but there are
some that use these VHF tags as a way of
(22:17):
tracking them. And historically you'd go out there with a
large antenna and you'd zero in on where they are
by wandering around and seeing where the signal strength was stronger.
Are collaborates at UCSD. They build a system that could
be borne on a drone and fly around and zero
in on that. That was a project with Glenn Gerber
connected a lot to his work with iguanas and the
Caribbean and elsewhere, and we're now going to use it

(22:40):
with rattlesnakes in the Biodiversity Reserve. And we're going to
do that not just with aerial drones but also with
the legged robots. So that's a plan to begin to
understand whether we can use the legged robots to do
that sort of thing. And there's always going to be
a balance where certain applications would be better served by
using the drones versus the legged robots, but there are

(23:01):
plenty of spaces where the drones aren't going to do
the thing we need, and so this is an exciting
new thing where you can have these essentially little robot
dogs wandering around trying to track down the snakes and
being able to do that kind of exhaustively in the space,
walking the fire roads and sitting down when they need
to rest and charge up, and then going back on
to duty to figure out where the snakes are so

(23:21):
that we can get a really good map of the
snakes' activity in the space.

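To illustrate the "home in on the strongest signal" idea behind VHF tracking with drones or legged robots, here is a toy Python sketch; the signal model and movement logic are deliberate simplifications for illustration, not the UCSD system described in the episode.

```python
import math

def rssi(robot_xy, tag_xy):
    """Toy received-signal model: strength falls off with distance (higher means closer)."""
    d = math.dist(robot_xy, tag_xy)
    return -20 * math.log10(max(d, 1.0))   # simplified log-distance path loss

def step_toward_signal(robot_xy, tag_xy, step=5.0):
    """Probe a few headings and move toward the one where the signal is strongest."""
    candidates = [(robot_xy[0] + step * math.cos(a), robot_xy[1] + step * math.sin(a))
                  for a in (0, math.pi / 2, math.pi, 3 * math.pi / 2)]
    return max(candidates, key=lambda p: rssi(p, tag_xy))

# Toy usage: a robot starting at (0, 0) walks toward a hidden tag at (40, 25).
pos, tag = (0.0, 0.0), (40.0, 25.0)
for _ in range(20):
    pos = step_toward_signal(pos, tag)
print(f"Final position: ({pos[0]:.1f}, {pos[1]:.1f}), distance to tag: {math.dist(pos, tag):.1f}")
```
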
Speaker 1 (23:26):
Wow, that's amazing. And you had mentioned earlier that sometimes
deploying the audio tracking equipment or the cameras, just
a human going into the space and doing that, can
be disruptive enough that it perhaps changes or alters the behavior,
at least for a few days, if not longer, of
the species in the environment. Therefore, the data you're collecting
isn't actually accurate. Would these robots potentially be an opportunity

(23:50):
to deploy these types of devices into the environment without
a human going in and disturbing that space?

Speaker 2 (23:57):
Yeah.

Speaker 4 (23:57):
Absolutely. One of the more blue sky ideas for using
these robots is to deploy one of our other systems,
the Dencam system that goes to Svalbard to monitor maternal
polar bears. And as it stands, in collaboration with Polar
Bears International, and we work very closely with them on the
Dencam project, they fly out in a helicopter and then
they land somewhere near where the bear is. They know

(24:17):
where it is because of the GPS fix, and then
they ski the last few kilometers in, and then they
have to, you know, make sure they don't get too close,
for the bear's safety and for their own safety,
and so on and so forth. But we imagine that something
like that, and this might be more like five or six
years out, instead of humans having to deploy it, a

(24:38):
terrestrial robot, likely quadrupedal, though there are
other versions, six-legged ones and things like that,
would just slowly wander across the landscape bearing that device
and then, over the course of probably weeks, get to
the location, sit down, and monitor the polar bears. So
there's a science fiction aspect to that whole idea, and

(24:59):
part of what we have to do is sort
of tease it apart and see whether there's some real
potential there and it's really actually going to be as
helpful as we hope it will be, or whether there's
some massive gotcha that makes it kind of nonsense. But
by having that blue sky idea, we can explore it,
and we can find all these other places where there's
a utility for this technology that's just about to become

(25:20):
very prevalent and ubiquitous.

Speaker 3 (25:23):
Oh, you know, and I really think it's worth noting
some of the concepts of the work that we do
here with wildlife. Here at the San Diego Zoo Safari
Park or the San Diego Zoo, we learn new aspects
of wildlife behavior and health because of the work we do.
I mean, we're literally side by side with amazing wildlife
and we share this information with our partners in those
conservation hubs. But honestly, I've never really thought about the

(25:46):
tech side of things and how we're applying all these
technological advances, I mean, like, robots, are you kidding? And
then being able to share this information

Speaker 2 (25:55):
around the world too. It's just epic.

Speaker 3 (25:58):
And that leads me to wanting to ask you, dude,
what does this mean to you, to be part of
this conservation technology, to be at this level of conservation,
this level of making sure that this world stays in
some form of ecological balance?

Speaker 2 (26:11):
What it means for me?

Speaker 4 (26:13):
Well, I mean I feel good about the work we're doing,
and that's important to me. I think that's important to
the people who work here in general. And as you mentioned,
it's a lot of different kinds of people who are
tackling this problem in a lot of different kinds of ways.
And yeah, we, the Conservation Technology Lab, facilitate that work
by providing data so that the ecologists and other sorts of

(26:34):
scientists can do the sort of studies that they've done,
but often faster or with more data than they otherwise
could have.

Speaker 1 (26:42):
It is truly amazing work, and I really appreciate you
spending time with us today and sharing everything you know
with our listening audience. All this work you do, all
the work your team does, it's truly amazing.

Speaker 2 (26:52):
Oh yeah, I definitely agree.

Speaker 4 (26:53):
Right.

Speaker 3 (26:53):
I mean, I think I learned a lot today. I
really appreciate it. Ian, thank you so much.

Speaker 4 (26:59):
Yeah, it was a pleasure, y'all.

Speaker 2 (27:02):
Rick.

Speaker 3 (27:02):
You know, I can't believe I'm going to say this
because, like I've said before, I'm definitely not a tech person
at all, but after listening to Ian, I am super
stoked about everything tech can do for conservation.

Speaker 4 (27:15):
Now.

Speaker 1 (27:15):
I know exactly what you mean, Marco. I just love
hearing Ian's passionate excitement for his work, knowing that he
and his team are doing a lot for conservation now,
and then knowing there's so much more new technology on
the horizon.

Speaker 2 (27:28):
Oh yeah, and it.

Speaker 3 (27:28):
Really reminds me of what you know, we've been saying before
San Diego Zoo Wildlife Alliance. It's like a city with
every kind of job and career working for wildlife conservation,
from being a mechanic to being a chef or a writer,
wildlife care even social media.

Speaker 1 (27:45):
Yeah, it's true, and that brings me to yet another point
I want to make, something I was reminded of as
Ian was sharing his

Speaker 2 (27:51):
Work with us.

Speaker 1 (27:52):
Now, I've had plenty of people ask me throughout my
career how they can be a part of wildlife care
or conservation if they aren't a biologist, or they've already
gone through school and they have a profession to something else.
And I think all of this talk about conservation technology
really shows us that you can be a computer programmer
or an engineer, or a roboticist or like you said,
social media specialist or a chef or whatever and still
be directly involved with saving wildlife.

Speaker 3 (28:13):
I mean, that's absolutely true, man, one hundred percent agree,
And I hope everyone listening understands that you can make
a difference for wildlife no matter what you do.

Speaker 2 (28:23):
Oh so true.

Speaker 1 (28:24):
And you know, Marco, I realized this episode was supposed
to have a main focus on our Asian Rainforest Hub,
but I'm really glad that, in researching our work there,
we found out about Ian and what the conservation technology
team is doing and how all of that is going
to benefit really all of our hubs.

Speaker 2 (28:38):
Well, I mean, I absolutely agree.

Speaker 3 (28:40):
I mean, the work is expanding, in the Amazon, the
Pacific Islands, here in the Southwest, and of course the
Asian Rainforest Hub. So, you know, what do you think
about taking some time to maybe focus a little bit
more on some of that wildlife in the Asian Rainforest Hub?

Speaker 1 (28:54):
Well that sounds good to me.

Speaker 2 (28:55):
What are you thinking about? Well, you know, International Tiger
Day is coming up this month.

Speaker 1 (28:59):
Oh well, you heard him.

Speaker 2 (29:01):
Folks.

Speaker 1 (29:01):
Be sure to subscribe and tune into our next episode,
in which Marco and I talk about everything whiskers and
stripes for International Tiger Day.

Speaker 2 (29:11):
Hasta la próxima. I'm Marco Wendt. And I'm Rick Schwartz.

Speaker 1 (29:14):
Thanks for listening. For more information about the San Diego
Zoo and San Diego Zoo Safari Park, go to SDZWA
dot org. Amazing Wildlife is a production of iHeartRadio's Ruby Studios.
Our supervising producer is Nikia Swinton and our sound designer
and editor is Sierra Spreen. For more shows from iHeartRadio,

(29:34):
check out the iHeartRadio app, Apple Podcasts, or wherever you
listen to your favorite shows.