
January 16, 2024 61 mins

Tech journalist Ed Zitron joins Robert, Garrison, and Tavia to discuss the AI-branded products that dominated the Consumer Electronics Show.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media. Oh my goodness, it's It Could Happen Here,
a podcast that is about things falling apart, our dystopian
now and tomorrow, and for the last several days has
been heavily about the Consumer Electronics Show, which is a
huge event every year where one hundred and twenty to
one hundred and fifty thousand people flood into Las Vegas

(00:24):
to show off all of the new gadgets and to
have big, fancy panels on the future of technology. And
this has been a particularly good year for the Dystopia beat.
Part of that is because the entire industry is obsessed right
now with artificial intelligence. Now there's a couple reasons for this.
Every laptop manufacturer is basically throwing out laptops with AI assistants.

(00:45):
Microsoft's is Copilot, and they're doing this because laptop sales
have stalled. A lot of people, like, the pandemic was
great for laptop sales, and then people stopped buying them
because most people don't need to replace their laptops very often.
So there's this desperate hope that by scaring everybody
into thinking they need AI immediately, they can get folks
to buy a new raft of machines. And outside of that,

(01:08):
it's just, as I'm sure you're aware, with interest rates
where they are, companies, tech companies, particularly startups, are having
trouble getting VC money, venture capital money, invested in them.
So there's this kind of desperate hope that by plugging
AI constantly they can fill in the gap. So today, well,
probably in a week or two, we're going
to be putting out a long investigation based on

(01:29):
a number of panels we went to with executives from Google,
from, weirdly enough, McDonald's, from Adobe, from Nvidia, from the
Consumer Electronics Association, and multiple government agencies, including DHS, on
what they see as the future of AI. That's going
to be some pretty in-depth reporting. But today we
want to talk about the AI products that we've been
seeing and as a spoiler, they're basically all the dumbest

(01:52):
shit you've ever heard of. So I want to introduce
our panel today. Coming back after catching a horrible, horrible
lung infection, throat infection, some kind of infection. Yeah, Garrison
got strep throat and despite the fact that we've been
hanging out together, I did not, which does prove I'm
genetically superior. We also have Tavia Mora coming back our

(02:12):
technical expert. Hello Tavia. Howdy everybody. And for the first
time on, well, no, not for the first time, for
the third time on It Could Happen Here, the upcoming
host of the Cool Zone Media tech-focused show Better
Offline, Ed Zitron.

Speaker 2 (02:28):
Hello, Hey, yeah, sorry, Hi. Yeah, hit
my head on the way in, yeah.

Speaker 3 (02:35):
It's a truly awful show this year.

Speaker 2 (02:39):
The thing that I said to Robert yesterday when we
were talking about the show, and this really stood out
to me, is if you had told me this was
twenty twenty one, I'd have believed you. Despite the use
of the word AI, it does not feel like tech
has actually moved that far.

Speaker 3 (02:53):
And it's very strange.

Speaker 1 (02:55):
Yeah, there was this period of time after the iPhone
came out where every year there would be really big
leaps in the tech you saw. And this is part of,
I think, why they're leaning on AI so heavily, is
otherwise it's just the same laptops, smartphones, speakers, connected gadgets,
you know, autonomous cars and shit that we've been seeing
for years, and they really haven't jumped forward much. But

(03:16):
you know, the downside of that is a lot of things.
But the upside of that is people are increasingly cramming
AI into insane shit in the hopes that somebody will
want to buy it. And so I want to start
off with Ed, since you are not just our newest
host but also a Las Vegas native. I think people
could probably assume that from your Vegas accent. Yes, natural, Yeah,

(03:39):
what is your favorite, or the first, AI product you
want to get into today?

Speaker 3 (03:44):
I want to talk about the Rabbit. The Rabbit R1. Oh God.

Speaker 1 (03:47):
Yes.

Speaker 2 (03:48):
So this thing is a square box, and I can't
tell if it acts without your phone or with your phone,
but it uses AI. You speak into it, like a
walkie-talkie, and it does a series of actions based on
what you say. So it can do all the things
that Siri could do five years ago, like change music
and such. But it also has, like, a three hundred
and sixty degree camera, which can, based on the extremely

(04:11):
awkward and agonizing demo, twenty-five minutes long,
but to me it felt like an hour. It can
look at a picture of Rick Astley and, after several
agonizing seconds, start playing "Never Gonna Give You Up."
It can also, it claims, do a series of nuanced
actions, like you can say, get me a cab home
and also put on my tunes and also change the

(04:34):
air conditioning to seventy-four degrees, all in one sentence.
Now you may think, why do I need to spend
two hundred dollars on a device to do this? And
the answer is you don't. You do not need to.
This thing looks cool and on some level, I'm just
glad we're getting new tat.

Speaker 1 (04:53):
Yeah, the design is not bad. It's like a square.
It looks like it's maybe two to two and a
half inches by two and a half inches, something
like that. Yeah, a little screen. It's well designed
from an industrial design standpoint. It's basically a Siri
that can use apps. It can use Uber, it can
book a flight for you.

(05:14):
One of the things they show is it, like, planning
a vacation in London for you, which does seem to
kind of go against the point of, like, going somewhere
new and figuring out what you want to do there,
as opposed to it basically pulling from a list, I'm
sure, that the AI wrote, of, like, top ten things
to do in London.

Speaker 2 (05:29):
And it's just very weird, because all of these tech guys,
who very loudly claim they're free spirits, they're independent,
they're not controlled by any authority, they cannot be manipulated,
all desperately want a machine to tell them exactly what
the hell to do with their lives. And it's so
bizarre, because we were discussing the different articles about

(05:51):
this, and people trying to argue why this thing exists.
It's like, oh, it takes out the friction between all
of these apps. I'm sorry, I just don't think there's
that much friction.

Speaker 1 (05:58):
Pull up my phone. I mean, yes, right, I
pull out my phone, I pull up Grubhub, I
order food. It's very simple. It's remarkably easy. I don't
see how talking to a square is better, Like it's
the same, Like I could call someone on the phone
and do it hands free, or I could text them,
and I always text them because that's more pleasant.

Speaker 4 (06:18):
I mean, like, I have my phone open to Signal
right now. I can swipe up and go to Uber in
less than a second. Saying the words "move from Signal
to the Uber app" takes a whole lot longer than
just doing it with my thumb.

Speaker 1 (06:32):
I also do love the idea of like completely ruining
the point of signal, which is an encrypted, extremely secure
messaging app, to be like, hey, random box, I want
to feed my private messages through you and have you
read them out to me as I go about my day.
I don't know what your data retention policy is or
what you'll be doing with it.

Speaker 3 (06:51):
They sold out and they made two million dollars.

Speaker 1 (06:54):
Like ten million of them? Ten thousand.

Speaker 2 (06:56):
Sorry, it's just, I've read, I read like
eleven articles about this thing, because I occasionally drive myself
insane with these things when I see everyone excited about something,
but I can't read a single article that tells me
why I should buy it, even though my rat brain says, ooh,
tech with screen, I want. But then I'd want to
use it.

Speaker 3 (07:14):
But I'll have to explain this to.

Speaker 2 (07:16):
The normal people in my life why I have this,
And I don't want to do that if it's useless.
But on top of that, I just don't think controlling
my life with voice is that useful.

Speaker 3 (07:24):
Yeah, I don't like that.

Speaker 1 (07:26):
I'm already and I think a lot of people are
already kind of fed up with the extent to which
my smartphone is a part of my life. Yeah, but
like it does irreplaceable tasks at the moment for me.
So I have it. This thing is, number one, adding
a device, because I think it does require your phone.
But it's also, like, you know, in addition to the
current problems I have with privacy on my smartphone, I

(07:48):
am adding another company and another device and another set
of potential security flaws to it.

Speaker 2 (07:55):
But on top of that, the thing they have failed
to explain anywhere, no journalist apparently thought to interrogate them about
this, is they claim this thing can log onto your
Uber and make a flight booking, ostensibly having your passport information,
your date of birth, and all this stuff. First and foremost,
like you mentioned, the data retention policy is very strange.
But where is this crap all happening? Is it happening

(08:16):
on my phone? Is my phone just doing all this?
I refuse to believe that. So are you doing this in
some kind of virtual machine environment? How is that possible?
Surely these companies are going to have a problem with that.
Mark Sullivan from Fast Company, actually, I think, asked them
this, and they were like, oh yeah, they'll be fine
with it, they just want people using their apps. I
do not think they're going to be fine with this.
Companies hate it when you hand off power from the user,

(08:39):
who will still be liable to another computer.

Speaker 1 (08:42):
Yeah. Well, the other thing is just that, like, part
of me kind of suspects, and when you watch the video,
we'll play a clip from it in a second, the
CEO of Rabbit very clearly, like a lot of guys
in tech, wants to be Steve Jobs. And I will
say one thing I kind of suspect, that actually
would be a Steve Jobs move, is he may have
just been hoping that this thing coming out,

(09:03):
selling a shitload on preorder and getting huge buzz,
would force these companies after the fact to allow integration.
Like, he may just be gambling: if I get
enough buzz behind me, Uber and whatnot will come to
the table and be willing to work with me, because
suddenly this is, like, the hippest new gadget.

Speaker 2 (09:19):
Except ten thousand customers is actually not that many. And
I actually look forward to, I really can't wait for
two months to pass, people to get this, and someone
to end up, like, sending the word penis to their
company Slack because they wanted to order pizza. And
on top of that, ordering a flight, ordering an Uber,
these are actually really nuanced actions. Coming to Mandalay Bay tonight, the

(09:42):
Uber took me to the wrong place because it decided
it wanted to go to the convention center. I did
not select that. If you go to the airport, you
need to put in Southwest Airlines and what have you.
With Grubhub, you need to do little bits. It's
just most people don't order lunch. They order something for lunch,
and I just don't. Ah, this whole thing just feels useless.

Speaker 5 (10:00):
Yeah. See, for me, it's the additional level of abstraction
on top of these already abstracted apps that we use
to order our basic necessities like eating and things like that.
It worries me in sort of like a fantasy dystopic way.
What happens when people suddenly can't use that after getting
used to using it? Like, what are they going to know?

(10:20):
Are they going to know how to operate a DoorDash
app? Are they going to know how to book
a flight? That kind of thing.

Speaker 1 (10:25):
Yeah, it is, kind of, because one of the things,
there was a CNET review that said, like, well,
the potential of this is that it completely removes physical
use of a device. So you're using these apps, but
they're just a part of your life. Uber's just a
thing you talk to. You never look at anything when
you do it, And like, is that better? Like I
don't like the idea that you basically have a robot

(10:46):
that you treat as, like, your nanny that plans your
life for you. Like, the amount of hype over this,
there will be a more concerted piece about this coming out.
But the first thing I thought when I looked at
all these guys talking about how cool it was to
be able to just tell a robot to book your
flight and plan your travel and book your hotels for you.
That's like part of the experience of traveling, and like

(11:07):
choosing things to do is like one of the things
that traveling is, and the desire so many people have
to hand off elements of choice really reminds me of
like cult dynamics. And I don't think this is a
consumer thing. I think this is specifically a weird subculture
of tech people, of AI people, a lot of the
same folks who got into NFTs. But this desire, like,

(11:28):
life is so complex and scary, I want to hand
over all of my agency to a robot. It's the
same thing that is behind a lot of like why
people join cults. And I don't think this is a
broad societal problem, but I think it is a weird
problem with the group of people who are most excited
to have a fucking rabbit.

Speaker 5 (11:46):
It seems like a sad thing to me that folks
might only attend bars or restaurants that are rated like
four point five and above, as decided by something else. Yeah,
and they don't get to have this, like, experience of
walking into, like, the seediest bar you've ever seen in
your life and have, like, maybe possibly a life
changing experience.

Speaker 2 (12:03):
I was just in South Korea, and we went to
this fried chicken place that ended up being closed. Actually,
it was, like, open, but nobody was there, which made
me just want to leave before getting killed. And so
I just went to a random chicken place across the
road from my hotel, and I thought, well, it'll feed me.
It was wonderful, it was delightful.

Speaker 3 (12:25):
And I could not find any reviews for it.

Speaker 2 (12:27):
It was just a flipping place. And I think these
people who are desperate for a device like this,
this kind of weird nanny device, first of all, I
don't think they think about the practicalities of this. I don't
don't think this is quicker or easier or better. But
also they're like, oh, I wish I could just say
one thing and all of these things could happen for me.
Same people, by the way, who were saying that people

(12:48):
need to pull themselves up by their bootstraps and do
things for themselves. It's just, I don't know if they'd
even call it dystopian. It's just weird and sad
to me.

Speaker 1 (12:57):
Speaking of weird and sad, We're going to move on
to the next product in a second, but first I
gotta play for everybody, in case you haven't seen it or
heard it, the CEO of Rabbit trying to rickroll the
audience with his hell device. Have you seen this, Garrison? Oh, okay,
eyes on the screen, everybody. To activate the eye, just

(13:21):
double tap the button. Oh funny seeing you here, Rick.

Speaker 6 (13:33):
Let me take a look. That's "Never Gonna Give You Up."
Enjoy it.

Speaker 3 (13:49):
What, am I getting rickrolled?

Speaker 2 (13:51):
In my own?

Speaker 3 (13:51):
Keynote? Let's move on to the next one.

Speaker 2 (13:54):
All right, I have a question real quick. So what
is the functionality he just activated? Is it that you
just point the eye at something and it chooses an
action? Does the

Speaker 1 (14:04):
AI automatically see Rick Astley and choose to play one
specific song of his? Because that actually doesn't seem like
a feature. That seems like a bug.

Speaker 2 (14:12):
Yeah, that seems like what happens if it sees certain people.

Speaker 1 (14:17):
Yeah, Jeffrey Epstein. Yeah, what happens if it sees Jeffrey? Yeah,
plays children screaming? Like, how does this thing work?

Speaker 2 (14:23):
Booking trips to Florida.

Speaker 1 (14:28):
Maybe it's respectable that they showed how bad the
lag is, because that moment where there's quiet after he,
like, clicks on it, it's loading, it's processing
for a considerable period of time.

Speaker 2 (14:41):
It's just, also, I feel for the bloke, because I
know he was probably so excited to do this, and
he's like, I'm going to be Steve Jobs. But man,
when you can't perform, you don't perform, like.

Speaker 1 (14:52):
Yeah, that's bad delivery.

Speaker 2 (14:54):
The "did I just get rickrolled in my own video?"
It was like, I forget what the movie is, "Oh hi, Mark."

Speaker 1 (15:01):
Yeah, it is. And obviously, like, English is his second language,
but, like, it's a performance. Like, you practice, right,
you get coached and stuff, because you're trying to represent
your company.

Speaker 2 (15:13):
Oh, I tell you this from experience, I've run a
few of these. Yeah, that guy actually did practice, because
his actual timing wasn't bad. He just
does not have that dog in him.

Speaker 1 (15:24):
Yeah. Yeah, you bring in other people to do things like that. Anyway, everybody,
anyone's mind on the Rabbit changed having seen that? Absolutely
not. Garrison has a look on their face.

Speaker 4 (15:37):
No, it's just, like, what I've always wanted in a
tech gadget is to be able to point a three
hundred and sixty degree camera at a picture of a musician
and then wait thirty seconds and then have an AI
pick a random song of theirs. That's always what I
wanted for the future.

Speaker 1 (15:53):
Yeah, yeah, that's the dream fucking Archimedes had.

Speaker 4 (15:58):
That's right.

Speaker 1 (15:58):
It was when he was building his laser that we
all saw in the most recent Indiana Jones film. Speaking
of the most recent Indiana Jones film, this podcast is
entirely sponsored by that movie. So here's some other ads.

Speaker 4 (16:22):
Why are we giving free advertising to Disney.

Speaker 3 (16:25):
Why are we.

Speaker 1 (16:27):
Why? Because that movie was so close to being worth it.
That last twenty minutes, Nazis machine-gunning Roman legionnaires?
Pretty funny.

Speaker 4 (16:40):
Well, do you know who would have loved the CES?
Archimedes? Probably. Yes, he probably would have
had a great time. What next, Garrison? What product do
we want to talk about?

Speaker 1 (16:55):
How about the pet one? Garrison? You saw that?

Speaker 4 (16:57):
All right. So I think me
and Ed saw ChatGPT for animals.

Speaker 1 (17:02):
Oh yeah, damn it.

Speaker 4 (17:03):
Which is not really what it is. It scans a
picture of your dog and then tries to tell you
if it has any health problems based on that picture.
You're not actually talking to your dog or anything. It
just takes pictures of animals and then analyzes them
to tell you how the dog is feeling. Blah blah

(17:25):
blah blah blah. I saw a product like
this earlier at CES. I saw a product like this
last year. They're just calling it
ChatGPT because it's an AI name.

Speaker 1 (17:36):
It's hip, like, because they're hoping that
will make people spend money.

Speaker 2 (17:41):
Every CES I see something that begins to
make me dissociate, and I walked past there,
and, Blovo, the ChatGPT for pets, and my brain was
just like, could it just, like, start, like, glitching out?
And then when I went to look it up, as
Garrison did, I was so disappointed, because I hoped that
these would just be crackpots who are like, yep, you put the

(18:03):
microphone to your dog. Now you know your dog's saying
that I would respect even if it didn't work, just
if you're like, yeah, fuck it. Yeah, your cat's said
he hates here. Your cat's been radicalized them Afraight.

Speaker 1 (18:14):
See, there's a fun product in here, which is, you
sell to rubes a product where you're like, it
translates your dog's micro-expressions into language. And then the
actual paying customers are sickos like us, and you just
take control of somebody's pet's voice.

Speaker 4 (18:30):
That would be so cool.

Speaker 1 (18:31):
You can have their, like, yeah, your cat's racist now,
your dog's a Nazi, like.

Speaker 4 (18:36):
This is the perfect product for H.P. Lovecraft.
He would have loved this.

Speaker 2 (18:43):
No, if you gave me, like, the show Lie to Me,
but for dogs, on my phone, I would spend whatever
you want. A thousand dollars, I will, yeah.

Speaker 1 (18:51):
I would pay, like, average West Coast rent prices to
be able to, like, gaslight some family into thinking their
dog is a terrorist.

Speaker 2 (18:58):
See, a friend of mine: oh, what's wrong, Ed?
ChatGPT said that my dog's joined ISIS, and
I don't know how he did it, but he's
talking about a caliphate, according to the app. I don't know.
This app is bankrupting me. I pay four and a half thousand
dollars a month for this app.

Speaker 3 (19:19):
I don't know why I need it.

Speaker 4 (19:21):
So, I unfortunately had to miss yesterday, so
there was probably an endless number of tech innovations that
I was unable to see because I had to miss
one day. But with the help of penicillin, I was
able to return today to do one final.

Speaker 1 (19:35):
The ChatGPT of antibiotics.

Speaker 4 (19:37):
That is exactly what my doctor said, actually. But
I did swear revenge on CES. So I
just walked around, mostly the Venetian, just seeing
all of the worst things I could find and documenting
them, so I could get revenge for that twink poisoning
me with strep throat. So the first really good thing

(19:57):
is this. I mostly walked around the award-winning sections,
because that's where you find only the best. There was
an award-winning speaker called Audio Cu, and all of
their marketing was built from this horrible, horrible, uh, AI
image generation of this, like, extremely busty blonde woman in

(20:17):
a latex suit. But if you zoom in on her fingernails,
her fingernails are, like, sticking through the wrong side
of her fingers.

Speaker 3 (20:26):
There, my god, oh my god.

Speaker 2 (20:29):
It's the woman from that one movie, oh damn it,
not Under the Skin, the other one. It was the woman
where the alien was sexy and then she killed people
when she had sex with them. It's the same thing. Yes, terrifying. Yes.
Write in and say what that is.

Speaker 4 (20:46):
Yeah, it looks just like that. It says "relax, stick
it in," which is pretty funny. So that was
pretty bad.

Speaker 1 (20:54):
Now, I respect that. That's a baller
move right there.

Speaker 4 (21:00):
Again, this is for a speaker company, like.

Speaker 5 (21:02):
DJ Girlfriend? Is it a speaker in the
shape of a girlfriend?

Speaker 4 (21:07):
No, it's just home theater speakers.

Speaker 1 (21:09):
They just have a horrible AI-generated woman as their spokesperson.

Speaker 5 (21:13):
I mean I would buy it if it was DJ Girlfriend. Though.

Speaker 1 (21:15):
DJ Girlfriend is a great idea for a product and
might stop several shootings.

Speaker 3 (21:20):
AI has brought back sexism.

Speaker 1 (21:22):
If you do DJ girlfriend right, you could stop at
least one mass shooting.

Speaker 5 (21:27):
Finally, we have a real solution now.

Speaker 4 (21:30):
Another product that won the CES twenty twenty four Innovation
Awards is an AI-powered coffee brewer and grinder system.
I'm just gonna read the description. What coffee's been missing?
That's right. I know, we wake up every morning, make
our little French press coffee. That's fine. But you know
what could be better? An AI system that does it
for you. I'm going to read the award

(21:54):
description for this product. Okay. Introducing the Barista Brew Coffee Brewer
and Grinder System, a smart coffee system that tailors your
brew to perfection with AI-guided personalization. Easily adjust brewing
parameters for a custom cup. No expertise needed. Rate to
track and refine your brews. Brew IQ AI suggestions for

(22:15):
your ideal taste. Simplify with one-touch favorites. Elevate your
coffee experience.

Speaker 2 (22:22):
Yeah, well, I hear all that. The one thing I
think is, simplify? That's simple. The movie's Species, by
the way.

Speaker 4 (22:29):
Love that movie. One of one of one of the
best hr Geiger art utilizations.

Speaker 1 (22:34):
Yeah, yeah, and easily the horniest movie of the nineteen
nineties, which is a lot.

Speaker 4 (22:39):
Which is a high bar. So, on
this AI coffee maker, on the front, there's a
little control panel with nine different settings that you
can change, because they're all in a graph. We
have citrus, spice, nutty, fruity, balanced, cocoa, floral, herbal,

(22:59):
and honey. So you can, with the ease of a
touchpad, start to customize your own AI coffee. So
that is revolutionary. I'm going to be getting one
for Robert this Christmas.

Speaker 1 (23:10):
Thank you, Garrison. I know I've always thought, you know,
what I hate is the experience of exploring new flavors
on my own and learning new ways of brewing coffee,
a beverage I consume every day. So I'm glad to
be handing that whole experience off to a machine.

Speaker 4 (23:26):
That's right, And I know a lot of people used.

Speaker 1 (23:29):
Tavia just brought something up that I think is relevant here.

Speaker 5 (23:33):
It's a Guardian article about an AI smoothie shop that
opened in San Francisco well before CES. It's a
combination of, uh, being driven forward with AI
technology as well as 5G stuff. I think it had
opened up and then, like, three weeks later had
shut down.

Speaker 1 (23:49):
Oh that's too They were like, a robot will pick
the perfect smoothie for you.

Speaker 2 (23:53):
Well, I actually want to bring something up. So, I
love smoking meat, I have pellet smokers
at home. And I saw a few times on this
show AI grills, and I just looked up one called
the Brisk It smart grill, and I was like, how
could you possibly make a thing which is basically maintaining
hot air in a tube long enough until the food's done?

(24:15):
And what it is, is it has a thing where you
can ask the grill, what seasoning should I add to
make my chicken skewers spicy, or how do I sear
a medium-rare steak? I don't fucking know. Why
don't you learn to cook,

Speaker 1 (24:26):
You twat, it's just like.

Speaker 2 (24:32):
The enjoyable part of cooking is the experimentation and learning taste.
But no, thank you, just like that goddamn coffee thing. Oh,
I don't want to learn anything. I don't want to
have a human experience.

Speaker 5 (24:42):
That's the thing with a lot of these AI solutions,
we'll call them: I feel like they're robbing people
of real experiences.

Speaker 1 (24:49):
Yeah, for, like, no benefit. Like, there's some stuff,
you know, the ability of a smartphone: once, you had
to be, like, in a building in order to access
a phone, or use a payphone. Now
you can connect with people everywhere. That's a clear benefit, right?
There's downsides to it, obviously, but it's a clear benefit.
But, like, now you don't have to learn. Now,

(25:11):
you don't have to cook, you can let a robot
do it for you. It's like, well, but why? Cooking
is pleasurable. And if I don't want to cook, I
will go to a restaurant or order food, and it's
cheaper than buying a several-thousand-dollar AI device.

Speaker 4 (25:25):
I mean, some things are hard to learn, which brings
me to the next product.

Speaker 3 (25:29):
That's smoking meat.

Speaker 1 (25:30):
But hard word.

Speaker 4 (25:34):
Kind of like like like.

Speaker 7 (25:36):
Parenting, right? So good, okay, nice. You know what, Garrison,
I'm proud of you. That was a good segue. So,
AI parenting, especially with your infant child. This was also
in the CES Awards section, so you know it's going
to be legit. I was able to see a demonstration
of an AI baby crib that will shake your baby
up and down based on facial expression analysis done by

(25:58):
an AI.

Speaker 4 (26:00):
Yeah. So, I'm gonna show it here. So here is
the cutting-edge facial expression set: we have anger, disgust,
fear, happiness, sadness, and surprise. And basically that data
will go into this little crib, which will start shaking
and moving up and down based on what it scans
on your baby's face.

Speaker 2 (26:18):
So, to be clear, there is a product that exists,
where did I drop my phone, there, there's a product
called the Snoo, which is like a bassinet for infants,
and it notices when they're fussing and it kind of,
like, lightly rocks them, but the way it rocks them is
so very light. It is very much what a mother
would do with a brand
new baby, freshly baked. You don't want to move it too much.

(26:41):
That one has, like, six pictures from the intro of
Lie to Me and a heart rate monitor, and it's like, yeah,
hand over your baby to AI.

Speaker 3 (26:50):
Great.

Speaker 5 (26:51):
Yeah, this product shakes your baby like a maraca, at a pace
that's dependent on what face your baby
makes, pretty much.

Speaker 4 (26:58):
Well.

Speaker 1 (26:58):
I love it also because, like, a real scandal, I
think it was in the eighties,
is, like, nannies shaking babies to death. Like, the idea
that, again, a machine that can only go at
a certain pace that's very light,

Speaker 8 (27:14):
You know.

Speaker 1 (27:14):
I get that's a labor saver, especially for, like, a
single parent or whatnot. Like, you know, some people
will need that. But I just worry. I worry that
we're not all that far from our first "an
AI killed my baby."

Speaker 4 (27:27):
You know, I think the real beauty of this product
is, usually when you have a newborn,
maybe you have to, like, watch it
all night because it'll wake up, you have to, like,
pick it up, pat it, make sure it gets back
to sleep. You can just leave that baby in the bed.
You can, like, go to the club. Yeah,
just leave the baby in the bed. If it starts crying,
don't worry. The AI will take over.

Speaker 1 (27:45):
We are on the verge of beds that can raise
our children, just like the Venture Brothers.

Speaker 4 (27:50):
That's right, and those kids turned out fine.
They turned out great, perfect. But I think, luckily,
luckily for you, because I know none of us are babies anymore,
but we are all, you know, eventually going to get old, hopefully, hopefully,

(28:11):
and there are AI products that will also assist us
as we get older, using the same AI baby tech.

Speaker 3 (28:17):
Here.

Speaker 4 (28:17):
One of the places
that me and Robert stopped by was called BlueSkeye AI.
It's spelled ridiculously offensively, and they refused to do an interview.

Speaker 1 (28:28):
They were not happy.

Speaker 4 (28:32):
But I was able to get a pamphlet, and they
have an AI that I think they're mostly targeting at,
like, older people. But, quote, by comparing the
way your facial and vocal behavior changes over time, using
your facial expressions, facial muscle actions, as well as where
you are looking, your body pose, and the tone of
your voice, we have the potential to identify and monitor

(28:52):
all kinds of medical conditions that manifest in the face
or voice. So it's facial scanning and voice
scanning that uses AI to try to diagnose you with medical
conditions. Specifically, the guy told us that it's
useful for Alzheimer's. Then he realized we were journalists and
asked us to go away.

Speaker 1 (29:14):
But yeah, that's how you know you've got a good medical device.

Speaker 4 (29:18):
A good product at CES. Blue Sky "uses a continuous approach, apparent valence and arousal, to measure expressed emotion. This better fits the real human experience of emotional states. This approach allows emotion regions to be defined, and to measure the transitions away from and towards these regions. This continuous approach, where appropriate, can be mapped back to a

(29:41):
much less exact categorical representation. For example: excited, calm, or angry."

Speaker 5 (29:46):
Did he have horny?

Speaker 4 (29:48):
They do not have horny, not that I can see.

Speaker 1 (29:50):
Look, if you know old people, one thing they never stop doing is fucking.

Speaker 4 (29:54):
Now, they do have a list of all human emotions here that they chart out on a map, so that, using AI, we can finally figure out what emotions you're feeling based on your face. You can use this just with your phone camera, with your iPad camera. They do data collection, data analysis. One of the weird use cases that we saw was,
One of the weird use cases that we saw was,

(30:16):
I know we saw something similar to this already, but
just scanning your face when driving to tell you how
you're feeling, which is just quite fun.

Speaker 1 (30:24):
Yeah, can I talk about that for a second?
What this reminds me of there was a product a
few years ago. It was like a robot for the military,
and the idea was this robot can run in dangerous
situations and pick up troops that have been injured and
run them out, which is probably a thing that will
exist at some point and might even save lives. Right,
I can see how that would be a useful thing
in the military can be very dangerous to retrieve people.

(30:44):
Much better for a robot to get shot or blown up in that situation than another person. But to try and comfort the soldiers, they gave the robot the head of a teddy bear, like a metal teddy bear head. It looked like a fucking nightmare. It's just like, what, what do you think? Who did you talk to? There's all sorts of guys who have been shot in combat. Did

(31:06):
you talk to one of them? Did you go, "Would the experience of having your arm blown off, Corporal, have been more pleasurable if a giant metal teddy bear did it?"

Speaker 2 (31:16):
So my first job was working on the characters in Twisted Metal, but then I moved into robotics. It's so cool how many of these products are very clearly made, funded, prototyped, R&D'd, with hired PR teams, everyone's done these big presentations, all without talking to a single fucking human being.

Speaker 3 (31:34):
It's so cool.

Speaker 2 (31:35):
It's so cool how much waste there is at this show, where not a single human soul... On a completely different subject, there was like an AI-powered nail salon thing as well, I saw, and I'm like, that's definitely one where you didn't talk to any women, though. Yeah, first and foremost, in my experience, everyone

(32:00):
is scared of a new nail place fucking up their hands. So are they going to spend eight hundred goddamn dollars on this thing to maybe get burned? And I saw in this article about it just now that their thing, they said, was, oh yeah, it's like an espresso machine at home. I've had espresso machines break multiple times, and I realize it may sound weird, how can you break an espresso machine? I'm just built different.

Speaker 1 (32:21):
But if I can break it in just like me
and strep throat, unbelievable.

Speaker 4 (32:27):
So I do have one more product, and then I'm done.

Speaker 1 (32:30):
Well, first, Garrison, I know you have one more product,
but we also have one more ad break. Ah, we're back, Garrison.
What's your next product?

Speaker 4 (32:48):
So we already talked about the Handy, which is, you know... it sure did... which by all accounts actually, like, works as intended.

Speaker 1 (32:57):
It's a good product. The people, the PR people, and we talked to the CEO, were not just knowledgeable, but like remarkably good at keeping a straight face while talking about their jack-off machine. Well, that's professionalism. You have to respect it.

Speaker 5 (33:10):
Honestly, that was the most professional booth I saw in the entirety of CES. They were really on point.

Speaker 1 (33:15):
If you are looking for a jack off machine, I
can't recommend anything more highly.

Speaker 4 (33:20):
Well, Robert, except for our next product, which is an AI-powered jack-off machine.

Speaker 1 (33:26):
Thank god.

Speaker 4 (33:28):
So this is called MyHixel. It is the first, it's the first app.

Speaker 1 (33:33):
That's an appealing name. That's a name that sounds like sex.

Speaker 4 (33:37):
It is the first app for climax control to incorporate AI. Now I'm gonna, I'm gonna read through their...

Speaker 5 (33:46):
Really redefines edge technology. Huh.

Speaker 1 (33:49):
I want to make a note before you get into it. The thing that they're claiming this is useful for, there are devices for, and it is a real use case, which is that, like, premature ejaculation, this is a serious problem for a lot of people. It's like a quality of life issue, right? Like it stops people from feeling confident. It's a serious problem. There are prosthetic devices people can use to train themselves. That's fine, they already exist. This

(34:13):
is basically like what if an AI could teach you
how to come slower?

Speaker 4 (34:17):
Yes, and we have a six-step layout here describing why MyHixel is right for you. The first step is secure and anonymized data collection, so you can get, oh good, all of your coming data stored. But don't worry, it's secure.

Speaker 8 (34:37):
See.

Speaker 1 (34:37):
My first question to that is why is data on
me masturbating being collected at all?

Speaker 4 (34:42):
Well, it could be because they're putting it towards an
eight week training program.

Speaker 2 (34:49):
No. So, first and foremost, one of the first things on the website for this is just the words "Happy sex here. Save sixty dollars on MyHixel Control." But "happy sex here" is going to be something I think about for a while. But also, it says it has MyHixel Care and MyHixel Control, two different things, and then MyHixel Academy, and sadly you can't click on that, because I've never wanted to know more about

(35:11):
what... How much material could that be?

Speaker 5 (35:14):
Unless it's a masturbation academy.

Speaker 1 (35:16):
Yeah, I thought they just called that Eton. That was a British public school joke.

Speaker 5 (35:20):
It's okay, I made an edging joke earlier and nobody
caught it.

Speaker 3 (35:23):
Yeah.

Speaker 2 (35:24):
I... there's one thing the Eton boys do, and they don't have sex.

Speaker 1 (35:29):
No masturbation.

Speaker 4 (35:30):
Yeah.

Speaker 3 (35:30):
Sorry.

Speaker 1 (35:32):
Part of what I hate about this is its name is so clearly trying to be respectful and, like, respectable, a tech-product name. As opposed to, like... one of the things that I respect about the Handy people is they just went ahead and called it the Handy.

Speaker 4 (35:45):
I mean, it's weird, because some of their free merch was labeled with stuff like "download the app to control your loads," "we bring the game, you bring the joystick," "the first day you went for a run, you couldn't last more than three minutes either."

(36:07):
So it's weird how they, yeah, had this very, like, sanitized branding, except for their free merch. But yeah, it has Bluetooth connection, interactive and personalized settings, you can monitor your "user evolution," and it is marketed as a medical device. But on their brochure there are just two really, really good sentences. There's "video

(36:28):
feedback from our sexual health professionals." So after you come, you can get on a video chat and talk about... There we go, there we go, looking good.

Speaker 5 (36:37):
It's the pillow talk add-on.

Speaker 2 (36:40):
Love to be one of those people as a gun man.
Three minutes. You can do better than that. Come on,
So they are you meant to encourage them?

Speaker 1 (36:47):
Yeah?

Speaker 4 (36:48):
Yeah?

Speaker 3 (36:48):
Are you meant to commiserate with them? Yeah?

Speaker 1 (36:50):
What is the goal here?

Speaker 3 (36:52):
Yeah?

Speaker 2 (36:52):
But also I cannot think of a single person i'd
want to talk about that with.

Speaker 1 (36:56):
Yeah, I'm just imagining like the guy in the other
and be like no, no, no, no, zoom me the camera
a little more. I want to see those ropes. No,
that's not bad, that's not bad. Good consistency.

Speaker 5 (37:05):
Okay, let's move that over. Let's see his O face again.

Speaker 3 (37:07):
Wow, you play that, my friend? That your load management
is very consistent.

Speaker 4 (37:14):
And what I think we're really missing is how much AI will assist in this, because they claim that, using cutting-edge technology, eh, eh, MyHixel Control is "the first solution to include AI and machine learning for climax control treatment," which is just really, really reassuring.

(37:34):
So yeah, it basically looks like a fleshlight that connects to your phone, and it's an app with "anatomical realistic interior design" and AI and secure and anonymized data.

Speaker 5 (37:47):
I think this is really going to open up some
avenues for sex workers.

Speaker 4 (37:51):
Yeah, hopefully, hopefully, Tavia.

Speaker 1 (37:54):
It's, it's also, like, the design. The Handy is very clearly a robot: you stick your dick inside and it jacks you off. This looks like a fleshlight, except the back... Like, the front end, when we unscrewed the top, it's like a fake vagina; it looks like a fleshlight. The back end looks like an incense diffuser. Like someone decided these two products needed to be

(38:18):
combined. Like, what if you could fuck your aromatherapy bot?

Speaker 4 (38:23):
Finally. So that is, that is most of the just groundbreaking AI products that I was able to see today. Does anyone else have any AI products they would love to talk about?

Speaker 1 (38:34):
It's time to talk about Ganert AI. Okay... you want to start us off about Ganert?

Speaker 5 (38:41):
Okay, I guess. We attended a panel. Which panel was it, y'all?

Speaker 8 (38:45):
That was the DHS AI... Yeah, that was, that was the AI panel with one of the heads of the Department of Homeland Security, who I can confirm, because he turned around to take a selfie, has a Hank Hill ass.

Speaker 5 (38:58):
He was very insistent on that.

Speaker 1 (39:00):
No, but absolutely, no. But, and I'm saying this not to shame him, but because there are orthotics for that. You can get help, sir. There's even a whole episode of King of the Hill about it, one of the better episodes.

Speaker 5 (39:15):
So Ganert AI was announced before this talk that we had, and I think the guy announcing both this event as well as the panel had taken some time to really focus on the fact that this was his, quote unquote,

Speaker 4 (39:31):
Opus. His, his opus. He said the word "opus" like five times. "Ganert is what I'll be remembered by."

Speaker 5 (39:38):
This is my legacy. Yeah, and then I guess two
of the designers had come up who stuck out like
a sore thumb compared to like the sea of khaki
and blazers and things like that.

Speaker 1 (39:50):
Yeah. Yeah, they had clearly never ordered a drone strike,
unlike our hero and Homelands security.

Speaker 4 (39:55):
One of them had a wide-brimmed hat that was color-matched to the Ganert logo, which is pretty cool.

Speaker 1 (40:01):
What does Ganert stand for?

Speaker 4 (40:03):
Ganert stands for "generate." So I think it's actually just called "generate"; they just took out the vowels. But this is going to be a three-day event, or a conference, held in Arlington, Virginia. They're claiming that it's gonna have like two hundred speakers, one hundred and fifty

(40:24):
AI sessions, more than five hundred startups, one hundred fifty partners, one hundred investors, and around five thousand attendees. They're trying to target enterprise, governments, platforms, AI tools, AI builders, services, investors, startups, and media. It's these three events held simultaneously. One's just called Ganert, or Generate AI, which is about

(40:47):
just AI tech. It's about, like, AI companies, classes, keynotes, funding, blah blah blah. There is then Voice and AI, which is about AI language services. And there's also For Gov AI, which is about the public sector and how the government's gonna start integrating AI or regulating AI. And they also have one for coding, called Code Forward, and it's

(41:11):
it's a bummer we can't just play the opening video, because the opening video had no, like, voiceover.

Speaker 1 (41:16):
But there's yeah, there's no voice. I can read it though.
Ninety seven million new jobs in AI, five hundred billion
in annual AI spend by twenty twenty seven, two hundred
and fifty billion in VC.

Speaker 3 (41:27):
Funding by twenty twenty five.

Speaker 1 (41:29):
"Ganert: generate for a new world in a new market. Ganert connects, informs, elevates, and inspires. It all happens at Ganert."

Speaker 4 (41:39):
And we cannot emphasize enough how they hyped up VC cash. There was, there was so much buildup for VC cash.

Speaker 1 (41:48):
I have, I have watched people who are dope sick on heroin with less jittery excitement in their hands and eyes.

Speaker 3 (41:58):
All right. So what about shit like this?

Speaker 2 (42:01):
So I just did a brief cursory lookup of Ganert, and it's connected: conferences, Voice AI and Gov AI and Code Forward, and all of them are claiming the following. They're featuring GitHub, Microsoft, OpenAI, Codeium, Tabnine.

(42:21):
Their thing on LinkedIn has twenty-eight followers, and their engagement is like when I post the word "Twitter" on Twitter; it's not very good at all. I could post a picture of my asshole and get more than that. But also, I cannot find a single person claiming to attend this, despite them claiming two hundred plus speakers, one hundred and fifty plus sessions, five hundred startups, one hundred fifty partners, one hundred investors,

(42:43):
five thousand attendees. I can't find a single bit of evidence that anyone is ganerting around at all. And also, they claim to have three different conferences: Code Forward, Gov AI, Voice AI, and of course Ganert AI. And of course, all of these are part of the AI beta experience.

Speaker 3 (43:01):
I don't know why you put beta.

Speaker 2 (43:03):
People are beta as hell. But also, why have you got "beta" on a conference?

Speaker 3 (43:08):
What are you doing? But also.

Speaker 2 (43:11):
Featuring OpenAI, Nvidia, Microsoft, Google, and Verituan. I'm gonna guess that they've got, like, ChatGPT open on a computer, an Nvidia GPU in something, Microsoft Word, and they've used Google. And it's very strange, because I don't know what this thing

Speaker 1 (43:29):
Is. You know, I think what it is is some guys who don't have any actual ideas for what to do with AI, so they're hoping that if they create a conference and make it be, like, the CES of AI, they can kind of force a place for themselves and also

(43:50):
suck up a bunch of money.

Speaker 2 (43:51):
I also found, I found some of the speakers. You've got a fellow called Adam Goldberg, who's an account director and head of Azure OpenAI enablement on the go-to-market team at OpenAI. They found a sales guy from OpenAI and then said they got someone from OpenAI. They got someone from JP Morgan Chase's data and AI design. These are all fake jobs. These aren't real jobs. And I think that these conferences

(44:14):
are amazing as well, because all people do at them is they go, they watch these things where people go up on stage and go, you know, generative AI is
going to create maybe even trillions of dollars of value
at some point, and you know, the synergy between generative
AI and data collection but also data silos is going
to be truly, truly innovative. And everyone's like, holy fucking shit,

(44:37):
whoa, holy shit. And then they all post it on Twitter, and they all forget it ever happened immediately.

Speaker 5 (44:44):
Yeah, we call that the dividend.

Speaker 4 (44:47):
We do call that the dividend. So Ganert's being put on by this guy who runs this, like, panel collection called BrandsGPT at CES.

Speaker 2 (44:56):
With a Z.

Speaker 1 (44:58):
No, it's not Pear and Gar with a Z.

Speaker 4 (45:01):
It should be. I think me and Robert both went to, like, one or two of these BrandsGPT panels. This is the one where Robert got to yell at Google and Microsoft and get them mad.

Speaker 1 (45:12):
No, Google and McDonald's. McDonald's head of AI, which is a thing.

Speaker 4 (45:17):
So they used to basically just focus on, like, convention programming, so now they're trying to put on their own convention that they're calling Ganert, instead of just running this BrandsGPT at CES. So that's, that's the background. It's done by Mode V Events. That's Mode and the letter V,

(45:37):
but one word. That's, like, the parent company for this. I'll be interested, once we get closer to October, I'll be interested to see if this is looking more like a real event. It's not going to be that far for me to travel. But no, they're promising five hundred billion dollars in annual AI spending, with two hundred and fifty billion in new VC cash investments, which is

(45:59):
quite a promise.

Speaker 1 (46:00):
Yeah, so hopefully this beta test goes like the last video game beta test that I went to, and everybody clips through the floor and disappears into a void. Well, I think that's gonna do it for us in this episode. And I want to leave you all with... well, before that, we've got one more thing, which will be fun. First I want to talk about

(46:22):
something sobering, which is that, as you may get from this, nearly one hundred percent of the AI use cases that we saw presented were either nonsense or incredibly vague. At these different panels, where you had people from, like, Nvidia and Adobe and whatnot, they wouldn't say, "we're going to use AI for this specific task." They would say, "we're going to use AI to get more nimble," which I think means firing people.
Speaker 3 (46:43):
You know.

Speaker 1 (46:43):
Outside of that, the only real specific use cases that were not clearly nonsense were stuff like replacing, you know, customer service workers with chatbots, which is bad. And, to be fair, also some really good stuff, like that telescope that used kind of machine learning in order to, like, clean up images so that you can get better images and whatnot when you're in an area with a lot of light pollution. There was some stuff like that,

(47:04):
but usually very vague. Where the use cases for AI were always extremely clear were the harms. In the very first panel we attended, there's a company called Deloitte. They're a huge consulting firm. If you know about McKinsey, because they're currently, somewhat rightfully so, a bit of a bugbear on the left, Deloitte is a similar kind of organization. I think they're a bit less toxic, but to a

(47:28):
marginal degree. They're like a massive consulting firm. Companies bring
them in in order to help them streamline and make
processes more efficient and stuff, and one of their people
said that according to their internal metrics, they expected half
a trillion dollars in fraud this year in one year
due just to voice cloning AI. And that was a

(47:52):
more specific statement of what AI is going to do
to change people's lives than absolutely any positive use case
I heard presented at this conference.

Speaker 4 (48:02):
Could you, like, explain what you mean by voice cloning AI?

Speaker 1 (48:06):
You know, we did a couple of Bastards episodes talking
about scams and like how they've contributed to the decline
of trust in our society. One of the things that
is in the last year or so become a massive
problem is there are now AI things that can generate
a human voice near perfectly to the point where, especially
if it is a voice of say your kid calls
you and they're telling you that they have been fucking

(48:27):
kidnapped, or, you know, something else has happened and they need you to wire them money desperately, and you send them the money. It's a fucking scam, right?
We had a person from Deloitte, and I think a person from Adobe, say that a colleague had gotten a call that seemed to be them, asking them to buy a bunch of Apple gift cards. Like, shit.

(48:49):
Like, this is extreme, and it's only going to get more common. You can automate the writing of the scams and the sending of the scams using these AI tools, and that is absolutely, in my opinion, much more of a direct way in which AI is going to affect people than any single product, or even, cumulatively, all of the AI products we saw at CES.

Speaker 4 (49:13):
On that uplifting note... Yeah, yeah.

Speaker 1 (49:16):
So that's a bummer, and, well, we will be going into more depth about that. But I wanted to end with this: Tavia took notes on all of the buzzwords, particularly the AI buzzwords, that we heard during the convention, and she's going to read them to us now.

Speaker 5 (49:31):
I gotta tell you, this list is incredible. I've worked in and out of corporate America, and much like a cult, they have their own internal vocabulary that they use, and this convention we went to was just filthy with these buzzwords. So I'm just going to dig in. The ones that I've written down are: double down. Love that one; that one comes up a lot. Versioning, versioning, versioning, which is

(49:54):
like a legitimate term in software, but I was hearing it used in places where it didn't make much sense to use it. Then our favorite: liar's...

Speaker 4 (50:02):
Dividend. By, by, by far the best term that we've heard at the conference. So flexible.

Speaker 1 (50:08):
Yeah, I'm using versions of that in everything. You know, it makes me think a lot about the murderer's dividend, which is when, you know, I no longer have to deal with an annoying person.

Speaker 5 (50:16):
We got content credentials, which is coming up a lot, especially around the topic of AI. We have data rich and its sister term, problem rich. Core values, which I heard in every single panel that we were in.

Speaker 1 (50:29):
Yeah. Usually the context of this was: we don't need regulations around how AI can be made and put together; the core values of the companies are what will make sure that AI isn't used in a harmful way.

Speaker 3 (50:43):
Great, that's that's gonna happen.

Speaker 5 (50:47):
No, very trustworthy, very trustworthy groups. Got risk model. And then my next term is my favorite one. It's so good, I think I'm gonna give this one to you.

Speaker 1 (50:56):
Yeah, because I don't think we talked about this. MM Guardian, or something like that. No, it's MMGuardian. Which is an app you put on... it's not, it used to be an app. Now it is a phone you buy for your child, a modified Samsung Galaxy something-or-other. It gives you, as the parent, complete access

(51:19):
to your kid's phone and everything they're doing. And it automatically monitors not just their conversations but their browsing history, and sends you alerts. So, like, if someone sends your kid a text that says you should "kys," you know, kill yourself (this is the example he showed us), you get a message that, like, there's this suicidal discussion or whatnot going on. We asked them,

(51:40):
you know... Garrison particularly was like, what if this is a situation where a parent is abusive and, like, using this in order to keep tabs on their kids? Or, like... you know, like if a child is gay or trans and their parents are not accepting of that, can parents still, like, spy on

(52:01):
them over that stuff? Are there any limitations? Are there any sort of safeguards built in in case a parent is being abusive, right? To, like, monitor, or send to the authorities, if a parent is using this in an abusive way? And their answer was no, we're purely about giving parents more power. And yeah, the term that they used was "tech contracts with children."

Speaker 5 (52:23):
I can't think of anything more dismal.

Speaker 1 (52:26):
Yeah, that is one of the most dystopian assemblies of
words I've ever heard.

Speaker 4 (52:31):
You should, you should never say the phrase "contracts with children." That's just... that's just, like, if you ever find yourself hearing the phrase "contracts with children" spoken by anyone, run away from that person as fast as you can. Maybe, maybe punch them in the face first, and then run away as fast as you can.

Speaker 1 (52:50):
So that's a good one.

Speaker 3 (52:52):
That's, that's some shit you just keep in Florida, I guess.

Speaker 1 (52:57):
It's a super Florida app.

Speaker 4 (52:58):
That is.

Speaker 1 (52:59):
That is the scent of this business.

Speaker 5 (53:01):
Moving on. We've got other terms, like visionary and thought leader, which come up a lot in these types...

Speaker 2 (53:07):
Of... I mean, the PR shit. People love saying thought leader.

Speaker 1 (53:11):
I love it.

Speaker 5 (53:12):
Eat it up. We also have edge computing. I know.

Speaker 1 (53:19):
Yeah again handy, great company.

Speaker 5 (53:21):
Incredible company, very, very excellent product. We have digital twin and horizon scan.

Speaker 2 (53:27):
So digital twin is really good, because it means, like, eight different things. It can mean literally a copy of something, or it can mean a digital version of something. It can mean, like, a metaverse thing. And these are all different industries using it, and no one can agree on the meaning.

Speaker 5 (53:43):
Yeah, that's just tradition. That's just, like, what they do. They have horizon scan. I actually kind of liked that one; it was the first time I heard that one. It's when they're just, like, looking into the future. Then use case, which came up a lot, because everyone was groping for use cases for their technology and didn't seem to have any that they could bring up.

Speaker 4 (54:04):
The next one I heard way more than I wanted to hear, which was accelerate. Yes, always, always a great term to hear in tech. There was, there was so much accelerate and accelerating relating to their tech development and their tech use cases. Or, another one of those terms

(54:24):
that Tavia just read off.

Speaker 1 (54:25):
Now, this next term is a real thing and an important thing, and not a thing that anyone in the tech industry wants or cares about: the right to be forgotten. This has actually been legislated; the reason they have to care about it to some extent is that it's been legislated in the EU, right? And it should be everywhere. I actually think this is an incredibly important concept, and it's basically the... you know, we have people go viral that

(54:48):
become a main character on whatever app, for being a piece of shit sometimes, or sometimes doing something stupid, or sometimes doing something innocuous that for no reason at all makes a huge number...

Speaker 3 (54:58):
There's actually a really good example.

Speaker 2 (54:59):
There was a kid who posted a video of himself, and it was like, four-point-oh GPA, had a job, raised money, didn't get into Harvard or something. He didn't mean it in this way, but someone took it and then turned it into a "why kids are being kept out of Harvard" thing. And he DMed them, like, you're ruining my fucking life. Yeah, this is why, like, the right to be forgotten should be everywhere.

Speaker 3 (55:20):
Yeah, and it's not.

Speaker 1 (55:21):
It is a hugely important thing, and, you know, I actually give the EU a lot of credit for the fact that that has to some extent been legislated. All of that needs to be more common in other countries, and more vigorously enforced. I say that having no idea how you do it, with the internet working the way it does. Some of this, I actually do think, is a values thing, where we all need
do think is a values thing where we all need

(55:43):
to be more okay with the fact that people, even
people who can do something shitty online, deserve to not
have that necessarily define the rest of their lives, especially
you know, teenagers.

Speaker 2 (55:56):
And the next one is one that I like to associate with my posts: data poisoning. I believe every time I interact with Twitter or Bluesky, that is what I am doing. I have some data poisoning gap, or I am data poisoning as a verb, or I am data poisoning myself.

Speaker 1 (56:14):
Yeah.

Speaker 5 (56:17):
Uh.

Speaker 1 (56:17):
And then we've got oh, Garrison, you want to do
this one?

Speaker 3 (56:21):
Sure?

Speaker 4 (56:21):
These, these are the last three that I got from an AI ethics panel. We have data silos, how data is all separated. We have data harmonization, kind of the opposite of data silos.

Speaker 1 (56:32):
Yeah, that's basically using AI to generate pictures of Dan Harmon, right?

Speaker 4 (56:37):
Yes. Then we have the last term, which I will, I will describe for you: the speed capacity gap. So the speed capacity...

Speaker 1 (56:49):
Gap. I know, I can answer that for you. So sometimes when I'm doing a shitload of amphetamines that I purchased from some Turkish website via the dark web, you know, I'm doing them with a friend and they OD, because there's a, there's a speed capacity gap, which is why we need two of us.

Speaker 4 (57:06):
Yeah, that's what that, uh, that's what that DHS guy was talking about; using AI to monitor dark web purchases is really going to get on that one. No. Speed capacity gap: the gap between tech acceleration and the capacity of society to keep up and make informed decisions about that technology. Which is actually kind of a useful term.

(57:27):
It's, it's just one of those, you know... it sounds like a silly tech term, but when it's actually explained, it's like, oh, that's actually a really good way to think about the way AI is being pushed in all of these new ways, and whether we are actually, as a society, whether that's, like, as a government or just, like, culturally, able to make informed decisions about how we want this tech to be integrated into our lives. And now, the dark side of this term, the speed

(57:49):
capacity gap. To kind of solve this gap, we can either slow down development or we can speed up our capacity, and the panelists obviously preferred the latter: we should just speed up our cultural capacity.

Speaker 5 (58:05):
Did they propose a solution for that?

Speaker 4 (58:07):
That, Well, kind of, but it's it's a little unclear.
We can go through my recording at a later date
once we do our full AI episode. But they're rationale
for why we should instead of instead of slowing down
tech development instead speed up our cultural capacity is because
of the many benefits that tech improvements can be made

(58:29):
via tech iterations. Right? The more iterations of technology you get, the more benefits you're able to get from said technology. Versioning! Versioning, exactly, which brings us all the way back to versioning. There we go.

Speaker 1 (58:41):
Yeah, which brings us all the way back to Turkish amphetamines, because I've been, for the last twenty years, trying different versions of Turkish amphetamines. And the blue pills, man. You know, normally you don't hallucinate on speed, but when you take enough, it turns out you can. And so I think what I'd like to leave everyone with is the knowledge that

(59:01):
Turkish amphetamines are a thing you can purchase on the dark web, and you should. There's no health consequences to it at all.

Speaker 2 (59:07):
I'm not part of this. Better Offline does not support illegal drug purchases.

Speaker 3 (59:12):
Respectful podcast.

Speaker 1 (59:14):
They're not illegal if they're so new that the DEA
hasn't banned them yet, that's innovation exactly exactly.

Speaker 2 (59:21):
That's versioning, and that is the speed capacity gap, folks.

Speaker 4 (59:32):
The DEA can't keep up with the tech improvements.

Speaker 1 (59:36):
All right, everybody, that's going to do it for us here at Cool Zone. Before we leave, I want to give Tavia and Ed both chances to plug their pluggables. Ed, people are going to be hearing from you every week on your new show, Better Offline, which is launching in, what I'm sure you'll agree, is a frighteningly short time frame.

Speaker 4 (59:52):
Soon.

Speaker 2 (59:53):
It is going to be the best weekly tech show. It is going to do the job that no one is strong enough to do, which is ask questions, listen to the answers, then actually ask a question that follows them. I'm very much looking forward to this and very excited to work with the Cool Zone team. And Tavia?

Speaker 5 (01:00:11):
Oh, you can find me on Twitter at cutma, and if you want to learn a little bit more about my interactive and immersive work, you can see that at Tavia Morra dot com.

Speaker 2 (01:00:22):
Now you may wonder why I didn't give you any links to anything, and that was a deliberate thing called subterfuge. But you can find me at wheresyoured.at, at edzitron on Twitter slash X, and of course on Bluesky at zitron dot bsky dot social.

Speaker 1 (01:00:37):
Yeah, and you can find my profile on here. All right,
We're fucking done here.

Speaker 3 (01:00:51):
It Could Happen Here is a production of Cool Zone Media. For more podcasts from Cool Zone Media, visit our website coolzonemedia dot com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.

Speaker 4 (01:01:02):
You can find sources for It could Happen here, updated
monthly at coolzonemedia dot com slash sources.

Speaker 5 (01:01:08):
Thanks for listening.


Hosts And Creators

Robert Evans

Garrison Davis

James Stout
