Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
They had that like continuous Seinfeld generator for a while,
which I oh, really yeah, I forgot what happens. I
think it started. I think it just went to and
started doing racist things though.
Speaker 2 (00:19):
Started talking like Jerry Seinfeld.
Speaker 1 (00:22):
Yeah actually yeah exactly, it started talking like Seinfeld and
Michael Richards, right, yeah.
Speaker 3 (00:27):
The idea of he's doing great things.
Speaker 1 (00:29):
Oh lord, what's with it with these student protesters anyway?
Speaker 3 (00:36):
What's the deal? They all got the same tent? Who's
giving him the tents? That is his material? Hello the Internet,
and welcome to season three, forty four, Episode.
Speaker 2 (00:54):
Two of der Daily Zeitgeist, a
Speaker 3 (00:56):
Production of iHeart Radio. And this is a podcast where
we take a deep dive into America's shared consciousness, America's
deep brain, if you will. A little tip of the cap because
we're big AI fans. A little, little spoiler: we're big
AI fans now, folks. I have seen the light, come around.
I think when it's all you see on social media,
(01:18):
you're like.
Speaker 2 (01:19):
That this is all right?
Speaker 3 (01:20):
Oh yeah, maybe this is cool. Yeah. It's Tuesday, June
twenty fifth, twenty twenty four.
Speaker 2 (01:26):
Oh yeah, big day. It's National Catfish Day, which is
odd because it's also my partner's birthday at breakday? Are
you have you been catfishing me this whole time? It's
national straw? Guys, haven't met her in person yet? Still right,
but one of these days. No, Well, and every time
I want a video chat, she says her phone's broken,
(01:47):
so we just kind of stick to the phone call stuff.
Speaker 3 (01:50):
But it was also very normal. It's this stage in
a marriage very normal.
Speaker 2 (01:53):
Also, this is so weird and this is just like
some weird religious stuff. It's National Leon Day? Did you know
this is a national day?
Speaker 3 (02:01):
Do you know what that even is?
Speaker 2 (02:03):
It's because you're six months You're six months away from Christmas.
Speaker 3 (02:08):
Noel backwards. The most, it's the most opposite time of
the year. What does this even mean? Jesus gets a
half birthday? Instead of ding dong, they dong ding, quote
Doctor Seuss. All right, anyways, happy National Leon Day. We're
(02:29):
gonna need to workshop that. Maybe we can run it
through an AI comedy writer, because those are good and
in no way just humans posing as AI comedy writers. Anyways,
my name is Jack O'Brien aka Big Ass Plumpers to
Thin Ass Fools Geist keep it coming like grim as Spooch.
We kill it on the podcast break in all the
(02:49):
news say, I'll be goddamned if there ain't more raccoons.
That is courtesy of Halcyon Salad. Like that name, Halcyon Salad.
I see what you've done there, and I enjoy it.
I'm thrilled to be joined as always by my co host,
mister Miles Gray.
Speaker 2 (03:06):
It's Miles Baker.
Speaker 3 (03:08):
I got them bell big boots when the weather suit.
Speaker 2 (03:13):
Yeah, I got that sweat because I'm a dad. Shout
out razagg on the discord because yeah, like I said,
I have zip off cargo pants that go from pants
to shorts because I'm a dad and you have that's
like just mandatory swag.
Speaker 3 (03:27):
You unzip the bottom of the shorts, yep, which is
what they're designed to do when it gets warm at
the knee, and then you flip them. You invert the
likes and put the cuff at your knee. So you've
got little bell bottoms, like two part bell bottoms.
Speaker 2 (03:43):
Yeah, the three part bell bottoms. I guess I saw
it on Pinterest. It could have been AI, but it
looked like people, which is where we get our fashion from. Yeah, yeah,
it might have been completely off. But hey, Miles, most
people say, hey, good luck on you. Yes, people say, hey, Miles,
We're thrilled to be joined Yes by the hosts of
the Mystery AI Hype Theater three thousand podcast. Once again,
(04:04):
it's doctor Alex Hanna and Professor Emily M.
Speaker 3 (04:06):
Bender. What's up, guys, welcome to the show.
Speaker 1 (04:11):
Hello, Hey, so I'm so glad to hear you there.
What was the thing about the raccoon?
Speaker 3 (04:18):
A raccoon ripped a crow in half in my backyard.
Speaker 1 (04:23):
Gosh, like it just WWE style put it over,
its, its knee? That's strange.
Speaker 3 (04:29):
I didn't see it happen. I just found the crow,
I'd grown friendly with a murder of crows in my backyard,
and as one does, and found the crow's body right
next to like a garbage bag garbage can that had
been like flipped over like by what could have only
been a raccoon. And we do have a raccoon family
(04:51):
living nearby, so.
Speaker 4 (04:52):
If you have, if you have an infestation of raccoons, have
you ever seen the video where someone's feeding a raccoon
cotton candy?
Speaker 2 (04:57):
No?
Speaker 3 (04:58):
Oh, and don't they like take it to the water.
I've seen like.
Speaker 2 (05:03):
Wash it? It dissolved, and like, no... Yeah, they always wash
their food.
Speaker 3 (05:09):
Yeah, very respect it. I just have a much healthier
respect for raccoons. We should be treating raccoons like handguns.
They're very impressive and dangerous and we should just give
them the proper respect. They have thumbs, exactly, they,
and they hunt birds.
Speaker 2 (05:27):
So it's all I knew. I didn't realize the food
washing thing. My mom famously has opened her home to
possums and other neighborhood wildlife and where she lives, and
I remember in the kitchen where the cat food is,
there was like a bunch of kibble in the water
bowl and I was like, what is going on? And
Mom was like, I think that's what the raccoon does.
And I was like oh, and it was said so
(05:49):
casually that I was like, in your home? She's like yeah,
yeah yeah. But then it leaves and I was like,
this is very okay.
Speaker 1 (05:56):
Well does it just does it use like the cat
like door? Does it? Yeah?
Speaker 2 (06:00):
Yeah, it comes through, washes the kibble, has a few bites,
and then takes off into the night. It's like a
pit stop for.
Speaker 1 (06:06):
One of the credit Yeah, I'm leading, So I'm leading
a raccoon based Dungeons and Dragons campaign starting next week.
That's going to be like, yes, it's going to be
like a heist that takes place in the warehouse where
where I play Roller Derby. Like, I'm very excited. I've
got you got it really architected. I don't want to
(06:27):
give any secrets in case any of my my player
characters listen to this podcast.
Speaker 3 (06:31):
But right, okay, that sounds amazing And what a coincidence
raccoons having a bit of a moment really on this podcast.
Speaker 1 (06:39):
Yeah.
Speaker 3 (06:40):
So in addition to being a host of the wonderful
Mystery AI Hype Theater three thousand podcast, which, podcast host is
the highest honor one can attain in American life. But
you both have some pretty impressive credits. Emily, you are
a linguist and professor at the University of Washington, where
you are director of the Computational Linguistics Laboratory.
Speaker 5 (07:03):
Yep, that's right, okay.
Speaker 3 (07:05):
Alex you are director of research at the Distributed AI
Research Institute. Both widely published, both received a number of
academic awards. Both have PhDs. We had you on the
podcast a few months back, told everyone the truth about
AI that a lot of the stuff that we're scared of,
and a lot of the stuff we think it can
(07:27):
do is not true. It's bullshit. And I sat back
and was like, well, we'll see what AI does after
this one. And it's just kept happening, you guys, what
the what the heck can we do? If anything, it's
gotten worse since we told everybody the truth what's happening?
Speaker 4 (07:46):
Truly, everybody seems to want to believe the absurd, so wild. Yeah,
And part of what we do with the podcast actually
is like try to be a focal point for a
community of the people who are like, no, that's not right.
Why does everybody around me seem to believe that it's like,
you know, actually doing all these things. Yeah, so it's yeah,
(08:07):
it's it's you know, that's what we say in our podcast,
like every time we think we've reached peak AI hype,
the summit of bullshit mountain, we discover there's
Speaker 2 (08:14):
Worse to come, Like yeah, right, Like oh yeah, this
is just a base camp until you get to the
real peak.
Speaker 1 (08:20):
Like right, well, it's just that you just keep on thinking
that, it keeps on becoming, and there's more and more things
that these CEOs just, you know, really just say,
incredible nonsense. I don't know if you saw this. I
think it was last week, the chief technology officer
of OpenAI, Mira Murati, yeah, who famously was
(08:46):
I think it was an interview on sixty minutes and
when they were talking about one of their tools, Sora,
you know, they had asked if.
Speaker 3 (08:53):
Their favorite filmmaker? Yeah exactly, my favorite. Yeah, thank you,
yeah exactly.
Speaker 1 (09:00):
Just Sora really edging out, you know, David Lynch
these days, and so, you know, and they asked her
do you train the stuff on YouTube? And she she
kind of grimaced, yeah, painfully. And I remember a great
Twitter comment. It was like, well, if you're gonna just
lie about stuff, you at least have to have a
good poker face about it. Yes, And so the last
(09:22):
week she was, well, I think she was doing another
interview and she was like, well, some some some creative
jobs are going to go away, like some artists should
be you know, some.
Speaker 2 (09:34):
Creative jobs maybe shouldn't have existed in the first place. Right,
If like these jobs were an affront to God or something,
some of them just shouldn't have even been there.
Speaker 3 (09:43):
But she does have a French accent, so it's really
hard to be like, this is ridiculous.
Speaker 1 (09:49):
No, I think she's Italian, and
Speaker 3 (09:50):
That's what's amazing about her having a French accent. I don't.
I'm not I'm not a cultured person. I don't know
the difference between They're all French to me, I'm a man, right, Hey,
she has a Canadian accent. I think. I'm not sure.
Speaker 1 (10:04):
I know it's only it's only plagiarism if it comes
from the French region of.
Speaker 3 (10:11):
Italy. That's right, that's right. Yeah, we're going to get into
that story and just yeah, all of the madness that
has continued to happen, the bullshit has continued to rain
even harder, it seems like, yeah, which, yes, does make
the mountain go higher, unfortunately, the bullshit mountain. But before
(10:33):
we get to that, Emily Alex, we do like to
ask our guests, what is something from your search histories
that's revealing about who you are? Alex? You want to
kick us off?
Speaker 1 (10:45):
Oh gosh, okay, I don't. The thing is, I don't
think so. I use DuckDuckGo, and so it doesn't
actually keep a search history, and if I actually look
at my Google history, it's actually going to be really shameful.
It's going to be me like searching my own name
to see it, like if people are like shit talking
to me online.
Speaker 3 (11:05):
Now this is just how we tell if someone's honest,
is if they actually give that answer, we're like, okay, so.
Speaker 1 (11:10):
Yeah, you actually search yourself. Yeah, actually, but I think
the last thing I actually searched was like queer barbers
in the Bay Area because I haven't had a haircut
in like a year, and I think I need to
trim up or get, you know, air out the, the
sides of my head for pride month. So that's, yeah,
that's the latest.
Speaker 3 (11:31):
I searched, what are you going? You're going full shaved
on the sides.
Speaker 1 (11:35):
Or maybe trim it a little bit and up the
back and bring out the curls a little bit.
Speaker 3 (11:41):
So okay, love it? Yeah, on board.
Speaker 2 (11:44):
I wish I could've got a few more days in look.
Speaker 1 (11:50):
In July, are there like discounts? Like it's like, like after Valentine's Day?
Do I get an undercut at fifty percent off?
Speaker 3 (11:59):
Now?
Speaker 2 (11:59):
Right? Exactly?
Speaker 3 (12:01):
Emily, how about you? What's something from your search history?
Speaker 4 (12:04):
So forgive the poor pronunciation of this and the rest
of the story, because Spanish is not one of my languages.
But Champurrado. Oh yeah, it's something I searched.
Speaker 3 (12:13):
Yeah.
Speaker 4 (12:13):
So I was in Mexico City for a conference last
week and at one of the coffee breaks, they had
coffee and decaf coffee, and then.
Speaker 1 (12:20):
They had.
Speaker 3 (12:25):
And the Spanish don't give us about that.
Speaker 2 (12:30):
What do you see when you see that?
Speaker 3 (12:31):
Way? Champurrado, Mexican hot chocolate.
Speaker 2 (12:35):
So yeah, your literal Google results.
Speaker 4 (12:40):
So the the labels all had like translations into English,
and so it was champurrado with Oaxacan chocolate, and I'm like, yeah,
I got that.
Speaker 5 (12:46):
What's champurrado?
Speaker 4 (12:48):
And so I look it up because I want to
know what I'm consuming before I consume it. And it's
basically a corn flour based thick drink chocolate corn soup.
Speaker 5 (12:58):
It was amazing.
Speaker 3 (12:59):
Chocolate corn you had me until chocolate corn soup. But
the corn is just a thickening agent, yeah, chocolate drink.
Speaker 4 (13:10):
Yeah, slight corn flavor like think corn tortilla, not on
the cob.
Speaker 3 (13:15):
Yeah yeah, oh yeah, yeah, yeah that sounds amazing.
Speaker 5 (13:18):
Yeah, it was really good.
Speaker 3 (13:20):
I love some corn flakes in a chocolate bar. Yeah,
corn chocolate. There you go. Yeah, you got it.
Speaker 2 (13:25):
You gotta arrive in your own way.
Speaker 3 (13:30):
Corn chocolate.
Speaker 4 (13:32):
It was really good and just awesome that it was there,
Like you know, the the coffee breaks had like the
Mexican sweetbreads and stuff like that, but otherwise it was
pretty standard like coffee break stuff, and all of a
sudden there's this wonderful mystery drink.
Speaker 5 (13:43):
In the big urns. It was lovely.
Speaker 3 (13:44):
That sounds great. What is something you think is underrated? Emily?
Speaker 4 (13:49):
I think Seattle's weather is underrated. Oh yeah, everyone makes
fun of our weather and like, you know, fine believe
that we don't need lots of people coming here.
Speaker 5 (13:58):
And it's true it gets dark in the winter.
Speaker 4 (13:59):
But like almost any day, you can be outside and
you are not in physical danger because you are outside.
Speaker 1 (14:06):
I guess that's that's I mean, if you're going for yeah,
that's interesting, but I mean it's I mean, the winters
are just so punishing though it's so gray.
Speaker 2 (14:16):
It's dark, but the weather it's dark, it looks it
looks like shit, but experientially not bad for you. I mean,
I yeah, I know it's when does like it doesn't
get all gloomy. I imagine in the summer, right, you
have wonderful blue skies, and you can.
Speaker 4 (14:34):
Enjoy the gorgeous fire season aside. But yeah, from sort
of mid October to early January, it can be pretty
like it's gray and so like when the sun is
technically above the horizon, it's.
Speaker 5 (14:47):
A little hard to tell.
Speaker 4 (14:48):
Yeah, right right, So, but you know, compared to like Chicago,
where you have maybe four livable weeks a year between
the too hot and the too cold.
Speaker 1 (14:57):
Wow, wow, I'll do that. Because my thing was going
to be Chicago, because I was just there and I
was going to say my answer was going to be,
Chicago is the best American city. I stand on this
like one.
Speaker 3 (15:15):
No, absolutely not true.
Speaker 1 (15:17):
No, I'll even deal. I'll even deal with I'll deal
with the winter. I mean if I okay, I'll be honest.
If I didn't, you know, if the weather in Chicago,
if if I could bring Bay Area weather to Chicago,
I would live in Chicago. I mean there's other reasons,
but I mean it's it's look, the vibes, immaculate street festivals,
(15:41):
the neighborhoods, probably the food. It's the one place that's
still comparatively affordable compared
Speaker 3 (15:48):
To the coasts.
Speaker 1 (15:50):
Radical history, you know, just you know, some of the
best politics. Yeah, you know, I would say, The Fugitive. They
shot, shot The Fugitive there. Oh, they did.
That's a deep cut. Yeah, I mean they I think
they've shot a lot of Batman movies there, because you know,
the iconic kind of Lower Wacker Drive, and they call
(16:14):
it with them and it's yeah, yeah, that's pretty great city.
Speaker 4 (16:18):
Crappy weather, right, if you're going to dump on weather somewhere.
Everyone makes fun of Seattle's weather.
Speaker 1 (16:25):
Honestly, Emily, this is a hot take. I'd rather take
Chicago's weather than Seattle's weather.
Speaker 3 (16:32):
I can't. I can't do gray. I can do... I
feel like I'm in a crossfire.
Speaker 1 (16:37):
I can bridge it.
Speaker 3 (16:38):
I cannot do gray.
Speaker 1 (16:39):
It's super depressing.
Speaker 4 (16:41):
Well, this is why I say, like, don't move to
Seattle if you can't handle our weather. Like the people
who move here and then complain about.
Speaker 2 (16:46):
The weather, what do you expect? All of this, what
they say is true about it being gray, like, oh,
I didn't expect it to be that gray?
Speaker 3 (16:54):
Right, I think people talk about it like that. All right, Alex,
let's stay with you. What do you guys think is overrated?
And please do it in a point counterpoint style also
that contradicts one another.
Speaker 1 (17:08):
Well, I got to think about what's what's overrated these days.
I just don't know what's in the I know this
the name of the show is the Daily Zeitgeist, But
I don't really know what's in the zeitgeist. I mean,
I guess Taylor Swift, I mean, I don't really have
Maybe that's controversial. I'm saying something that's hot take, but
I guess that's maybe not controversial to people of of
(17:31):
our you know, our generation.
Speaker 2 (17:33):
No, so joining Dave Grohl on the, on the attack
this weekend. Yeah, wait, what happened with Dave Grohl, like
implying that she's like He's like, well, we play our
music live like raw live rock and roll, you know,
unlike the Era's tour, you know, with the errors tour,
and then everyone's like fuck you Dave or other people
(17:55):
will being like exactly exactly. Yeah.
Speaker 1 (17:59):
It's just like yeah, yeah, I mean Dave Grohl is
also overrated, I guess. But like, I mean, I enjoyed... Look,
I enjoy Everlong like the next like middle aged
sort of like dad figure, like, but I, you know,
I'm sure, like, I'm glad that he played every part
(18:20):
in that song. It sounds good, but you know, yeah,
it doesn't make you an authority on Taylor Swift. So yeah,
so I think I'm undercutting my own point.
Speaker 3 (18:29):
But now let's go... Yeah, you did undercut your own overrated.
Speaker 4 (18:34):
Yeah that's excellent because I don't even have an opinion about.
Speaker 3 (18:37):
Saw Tucker Carlson do that... was that, that was called Crossfire?
Speaker 1 (18:42):
Or was that the crossfire was with what's his face?
Speaker 3 (18:47):
Carlson and Paul Begala. That was, wasn't that the
one that Jon Stewart came on and was like yeah,
it was like this show is bad, and then like
they cancel it a couple of weeks.
Speaker 1 (18:58):
Well then, but then there was also the show Hannity
and Colmes, where Sean, you know, Sean Hannity was supposed
to be a conservative voice, and then you know Colmes,
where like I don't I don't even know the guy's
first name. They just they kind of just had him
as a token like liberal on and then they just
it was on Fox News. So he just attacked him,
(19:18):
you know, relentlessly.
Speaker 3 (19:19):
He wasn't allowed to read the news. He's like, you
argue the liberal points, but you're actually not permitted to
leave this room. We're gonna keep you in here, old
boy style.
Speaker 2 (19:29):
For the Oh, that was the end of sixty minutes
that Andy Rooney would do. There was part of sixty
minutes was point counterpoint, and it would be Andy Rooney
if that's what you're thinking, Jack.
Speaker 3 (19:40):
No, No, there was a show. Yeah, yeah, it was
right when I got out of college and worked for
ABC News and so everybody was always watching news and
at that time there was a big show on CNN
called Crossfire. It was Tucker, Yeah, Tucker Carlson was the conservative.
Paul Begala was the liberal, and they just like got
(20:01):
on and yelled at each other.
Speaker 1 (20:04):
I'm looking it up now.
Speaker 3 (20:05):
Apparently good.
Speaker 1 (20:06):
Apparently there there was a they they had a revival
and then in twenty thirteen and fourteen on the left
was Stephanie Cutter and Van Jones and then Newt Gingrich
and S.E. Cupp on the right, and then whatever.
And then whenever they needed breaking news they'd bring in
Wolf Blitzer for some reason.
Speaker 5 (20:29):
Because Wolf has the Situation Room.
Speaker 1 (20:31):
Yeah yeah, they released him from the cryogenic chamber.
Speaker 3 (20:35):
Whose helicopter lifted from the Situation Room three rooms over
to the Crossfire set. Just with deadpan, you know,
we need you, no emotion on his face ever.
You guys ever seen the Wolf Blitzer episode of Celebrity
Jeopardy? No? Do yourself a favor. It hasn't been scrubbed yet.
Speaker 1 (20:58):
Is it as good as like the SNL parody? Is
the Celebrity Jeopardy with like the Sean Connery it's just yeah, no.
Speaker 3 (21:07):
No, and also incorrect, just one after another, like negative
went into the, into the red. He's in there pretty quickly. Well, Wolf,
we're gonna spot you three thousand because we can't have
somebody be in negative numbers going into going into Final Jeopardy.
And I think Andy Richter was on with him and
(21:31):
just destroyed. Was so good.
Speaker 1 (21:33):
That's so funny. Andy Richter like this the kind of
crossover I didn't know I need.
Speaker 2 (21:40):
Yeah, it's still up there mostly from what I could tell,
it's incredible.
Speaker 3 (21:44):
Yeah, I am an old person. All right, we still
have Emily's overrated. What do you think is overrated?
Speaker 5 (21:52):
Big cars are overrated?
Speaker 6 (21:53):
Oh?
Speaker 4 (21:54):
Totally sort of half heartedly looking for our next car
and can't find anything that is like small. And the
other day I was in the parking lot for a
grocery store near here. Mostly I can walk for
groceries, but occasionally have to drive to this other store. And half
the spots were labeled compact, and like all of those
spots were taken up two at a time by what
(22:15):
we now have as regular cars, because somebody's decided that
people in this country don't deserve normal sized cars.
Speaker 3 (22:22):
Yeah, they're so.
Speaker 2 (22:23):
I mean, it's to the point where like even the
people who design parking lots are like, we have to
tell the automobile manufacturers, like the standard we've set as
people who like create parking lots, like they're pushing the
boundaries of what we can actually do or how we
measure things because the cars are so fucking big.
Speaker 4 (22:40):
And our streets around here in Seattle, we have a
lot of neighborhood streets where there's like parking on both
sides and then sort of just barely enough space for
two normal cars to go through, right, or sometimes you
have to like pull over to let the other car pass, and the
bigger the car is like the harder that gets.
Speaker 3 (22:53):
I love that thing.
Speaker 2 (22:54):
I remember one of the times went to Seattle, seeing
how everybody just parks on whatever side of the street
in whatever direction they want.
Speaker 3 (23:00):
I was like, all right, I'm like, all right, I
was not familiar.
Speaker 1 (23:05):
That's funny.
Speaker 5 (23:06):
Yeah, yeah, a little bit of chaos.
Speaker 4 (23:08):
It totally offends my spouse, who's like, that's not how
parking works.
Speaker 1 (23:12):
That's it.
Speaker 3 (23:13):
Yeah, yeah, I love it. Cars just seem to be
getting bigger and heavier. Automakers won't stop until they make
a car that like is legally required to have a
foghorn on it.
Speaker 4 (23:24):
Right, So the cyber truck, Yeah, cyber truck.
Speaker 3 (23:27):
I was going to ask, have you have you considered
the cyber truck.
Speaker 5 (23:31):
I've seen one in person.
Speaker 4 (23:32):
They are hilarious, like you can't not laugh exactly.
Speaker 3 (23:35):
What, it's such... it is an experience, seeing one in the wild.
Speaker 4 (23:39):
It's like, wow, I just want to say that what
we really need is functional public transit, but like, short
of that, we also need to not be doing bigger
and bigger cars.
Speaker 1 (23:47):
Yeah yeah, yeah, no, I just I mean, I have
a truck. I have a twenty twenty truck, and I wish,
I really wish it was much smaller because it's hard
to park. It's way too big. I mean, I think
that peak of truck design it was like, you know,
a nineteen eighty seven Toyota Tacoma long cab, you know,
(24:08):
just you know where like, yeah, you had to bunch
up your knees in the back if you want to
fit four people on it. But you actually had a
long you know, you actually had a truck bed that
actually had you know, some carrying capacity, you know, and.
Speaker 2 (24:24):
It was a car you could absolutely run into the.
Speaker 3 (24:27):
Ground with nobody.
Speaker 1 (24:27):
Yeah, oh yeah, oh yes.
Speaker 2 (24:30):
Now people were like, my new Ford Lightning needs a
software update.
Speaker 1 (24:33):
Oh god, well that's the thing, is like yeah, I
mean like, I mean that's a big deal. I know,
in like Oregon, which is like where they had
right to repair a right to repair bill, and I
mean in some ways the people that were kind of
into it weirdly were like Google actually came out kind
of into it. There was a good 404 Media
podcast where they talked about this, and Apple, because
(24:56):
they have such a closed ecosystem was so against right
to repair or you know, even if you have right
to repair. They actually they'd add on all these things
where you you'd still have to send them to an
authorized dealer because of firmware issues or whatever. And then
John Deere, like John Deere is this kind of thing
where they have you know, so much of their tractors
are computerized, and so there's like a lot of like
(25:18):
these John Deere hacking kinds of things, with people who
are outside of the US, you know, programming these kinds
of hacks for, for people running these tractors who
run into their firmware.
Speaker 3 (25:32):
Yeah, the farmers have all the good the GPS something,
but did.
Speaker 4 (25:37):
You hear about how the GPS was out for a
while, this is actually again from a 404 Media
but that the way those tractors work for like planting
is so precise and the GPS. With the GPS off,
they basically couldn't plant because then the seeds wouldn't be
in the right spot for the next process, and so
(25:57):
they had to wait. And there's a really narrow window
of currently with our you know, currently genetically modified like
very very specific corn that Monsanto owns, and so it
was actually looking pretty bad for a while. I didn't
hear any follow ups. So maybe the solar flare was
short enough and the GPS came back online, but apparently.
Speaker 5 (26:12):
That was a big thing.
Speaker 3 (26:15):
It has to be a brief window because it goes
from corn plant corn seed planted in the ground to
like popcorn in the movie theater in two and a
half weeks.
Speaker 4 (26:24):
Yeah, hyper, most of it doesn't even go to popcorn
in the movie theater, right, most of it goes to
animal feed or after.
Speaker 3 (26:33):
All, I think, yeah, right, right, yeah, all right, Well,
let's take a quick break and we're going to come
back and dive into why Miles and I are excited
about the future of AI. We'll be right back. Crossfire
(27:00):
and we're back. We're back. So just to for people
who haven't listened to your previous appearance in a while,
I feel like a broad over generalization, but it feels
like the stuff that AI is actually being used for
and capable of is not what we're being told about
(27:21):
through the mainstream media. Like it is not an autonomous
intelligence that is going to be the bad guy in
a Mission Impossible movie. I mean, it is a bad
guy in a Mission Impossible movie, but it's not going
to be a bad guy in, in reality. Or,
well, it is, yeah, in the way that our actual president believes.
(27:41):
That was an amazing reveal that Joe Biden basically watched
Mission Impossible and was like, we gotta gotta worry about
this AI stuff. Jack it's gonna know my next move.
But it is more like the large language models are
basically more sophisticated auto complete that is telling you what
(28:01):
its data set indicates you want to hear, or what
its data set indicates will make you think it is
thinking, talking like a person. In many cases that means
what they call hallucinating, which is actually just making shit up.
Speaker 2 (28:16):
What other jobs could you say that? You're like, sorry,
I was hallucinating, and they're like, okay.
Speaker 1 (28:25):
Oh, but you wouldn't last long as a precog.
Speaker 3 (28:28):
Yeah, I would be the worst precog on.
Speaker 2 (28:32):
The I R S. I was hallucinating on that last
tax return.
Speaker 3 (28:34):
I think that's probably.
Speaker 4 (28:37):
About using this to do your tax return.
Speaker 1 (28:39):
Yeah right, there's actually, yeah, there's actually a I mean
in California, there's a whatever the Department of Tax and Revenue.
They've been... There was some great reporting in
CalMatters by Khari Johnson, and he was talking about how
they were using this thing, some language model to effectively
advise the people who call in or advise the agents
(29:02):
who respond to people to call into the California Franchise
Tax Board, and you know, and they're like, well, they're
they're and they're like, well, you know, the the agents
are still going to, you know, have the last word.
But I'm just like, yeah, yeah, but they might they're overworked,
Like are they going to just they're going to read
this stuff or me and them.
Speaker 3 (29:20):
You know, right exactly. Oh, you're going to use this
as an extra thing, just an extra expense to do
the product, do your job even better. That doesn't sound
like a company necessarily. Yeah. Yeah, So an interesting thing
that we're seeing happen. You know, we pay attention when
there's a an AI story that captures the zeitgeist. We
(29:42):
covered the B minus version of a George Carlin routine
that came out. They were like, AI just brought
George Carlin back from the dead. We covered Amazon Fresh
having that store where the cameras know what you've taken
and so even if you try and shoplift like the
cameras know they're gonna catch it, and then you don't
(30:02):
even have to check out, you just walk out and
they like it, charges your account because of AI. And
then what we're seeing is that when the truth emerges,
it does not enter the zeitgeist because you guys cover
it on your show which is why we're so thrilled
to have you back. But you know, we have updates
on those two stories, Carlin. That was just written by
(30:23):
a person the Amazon Fresh. Those videos were being fed
to people working in India to try to track where
everything was going, which was why there was like a
weird pause like as people were where they're like, oh,
I think we got okay, yeah, we're we're just gonna
(30:44):
do a best guess. But it's straight up like mechanical
Turk like it's which again Amazon named one of their
companies the Mechanical Turk, so they they know what's going on.
They knew what they were planning to do here all along. Maybe,
But is that kind of the model you're seeing is big,
flashy announcement, this is what AI integration can do, and
(31:07):
then when it falls short, people just kind of ignore
it or how does it seem from where you're sitting.
Speaker 4 (31:15):
Yeah, we haven't seen really good implosions yet, and it's
surprising because like the stuff that goes wrong goes like
really really wrong, and people are like, yeah, well, it's
just in its infancy, which is a really really annoying
metaphor because it first of all suggests that this is
something that is you know, like a human, like an
animal at least that's that's a baby and can grow
(31:36):
something that is learning over time. And also sort of
like pulls on this idea that we should be kind
to these systems because they're just little babies, right, and
so if something goes wrong, it's like, well, no, that's
that's just it's it's still learning. And we get all
of these appeals to the future, like how good it
is going to be in the future, And there is
at this point, I think so much money sunk into
(31:57):
this that people aren't ready to like let go and
own up to the fact that yeah, so, and it
is I guess too easy to hire exploited workers for
poor pay, usually overseas to like backstop the stuff. There's
also, so you gave us the Amazon Go stores actually
being monitored by people in India. There was one of
the self driving car companies admitted that their cars were
(32:21):
being supervised by workers in Mexico.
Speaker 2 (32:23):
And remember the stats on that, Alex? Yeah it was,
it was.
Speaker 1 (32:26):
It was, so it was Kyle Vogt, the CEO of Cruise,
and he had said... And then there was this reporting
on the New York Times where they said, you know,
they used they used humans. And then he was like, wait,
well wait, wait, wait, wait, you're you're really blowing out
of proportion. We only use it something like three to
five percent of the time. Like that's a huge, it's
(32:48):
a huge amount. And, and he posted this
himself on on Hacker News, which is this, you know,
kind of like, I don't know, 4chan for tech
bros. I guess, well, I guess 4chan is 4chan
for tech bros. But I mean it's you know, but
like with a little less overt racism. I guess, yeah,
(33:09):
especially yeah, but it was still Yeah, but we're seeing
this in a lot of different industries. At the end
of the day, it's just, this is outsourcing humans. Janet
Vertesi is a sociologist at Princeton. She has a
piece in Tech Policy Press where the title is something
like, AI is just outsourcing, it was just outsourcing two
(33:29):
point zero effectively. And yeah, we're seeing we're seeing a
lot of the same patterns that we saw in the
early nineties when these business process outsourcing or BPO organizations
were really becoming all the rage in the US.
Speaker 2 (33:43):
Right. The other thing that I see a lot too,
is like I felt early on, especially when we were
talking about it. The thing that intrigued us, like when
was when everyone's like, dude, this thing is gonna fucking
end the world. It's how powerful AI is. Have I
have a whole plan to take myself off this mortal
coil if I have to, the moment in which AI becomes
sentient and takes over. And like I think it felt
(34:05):
like maybe the markets were like, hey, man, you're scaring
the kids. Man, do we have another way to talk
about this? And I feel like recently I seem more
of like together, when we harness human intelligence with AI,
we can achieve a new level of existence and ideation
that has not been seen ever in the course of
human history. And I saw that in the Netflix j
(34:26):
Lo movie, where like the entire crux of the film
was this AI skeptic had to embrace the AI
in order to overcome the main problem conflict in the film,
or just even now, like with the CTO of Open
AI also doing a similar thing when talking about how AI,
like some creative jobs are just gonna vanish, But that's
(34:48):
because when the human mind harnesses the power of the AI,
we're gonna come up with such new things that feels
like the new thing, which is more like we got
to embrace it so we can evolve into this next
level of thinking, et cetera, computation or whatever. You guys seeing this?
Speaker 3 (35:04):
I was just gonna say, Mystery AI Hype Theater three thousand
reads the research papers so that we don't have to,
and Miles watches the j Lo movies so that.
Speaker 5 (35:13):
You don't have to.
Speaker 3 (35:14):
Got to know what they're saying.
Speaker 1 (35:15):
But I'm glad you're, I'm glad you're watching the J.Lo,
because there's so many different cultural touchstones of this.
Speaker 3 (35:21):
Yeah.
Speaker 1 (35:21):
I had to look because I thought I thought the
movie you were talking about was the the like sort
of the autobiography This Is Me... Now: A Love Story,
and I'm like, there's a film and I was like,
there's an AI subplot in that. I didn't know that
J.Lo's life was, you know, a complete cautionary tale
(35:42):
about you know, AI and and the inevitability of it.
But yeah, but sorry, Emily was about to say something.
Speaker 4 (35:50):
I just want to be starting, so our colleagues Timnit
Gebru and Émile Torres coined this acronym TESCREAL, which
stands for a bundle of ideologies that are all very
closely related to each other. And what's interesting about the transition,
you notice that they've basically moved from one part of
the TESCREAL acronym to another. So the, it's all the
(36:11):
stuff that's based on these ideas of eugenics and of
sort of real disinterest in any actual current humans in
the service of these imagined people living as uploaded simulations
in the far long future. It's utilitarianism made even more
ridiculous by being taken to an extreme endpoint. So
(36:31):
this thing like it's going to kill us all comes
partially from, like, the longtermism part of this,
which people are fixated on, this idea of, we have to...
it's ridiculous. They have a specific number, which is ten
to the fifty eight, who are the future humans who
are going to live as uploaded simulations in computer systems
installed all over the galaxy. And these are people who
(36:51):
clearly have never worked in IT support because somehow the
computers just keep running.
Speaker 2 (36:56):
Yeah yeah, yeah yeah.
Speaker 4 (36:58):
And the idea is that if we don't make sure
that future comes about, then we collectively are missing out
on the happiness of those ten to the fifty eight
humans and that's such a big number that it doesn't
matter what happens now, all right. And I always say
when I relate this story that I wish I were
making this up. Yeah, but there are actually people who
believe this. And so that's where the sort of like,
(37:18):
oh no, it's you know, it's going to end us.
Speaker 5 (37:20):
All stuff lives, and that's.
Speaker 4 (37:22):
The L, the longtermism part of TESCREALism. But this
idea that we should join with the computers and become
a better thing, that's the t that's the transhumanism. And
it's all part of sort of the same bundle and
way of thinking. And there's this great paper out in
the publication called First Monday by Émile Torres and
Timnit Gebru, sort of documenting the way that all of
(37:42):
these different ideologies are linked one to the next. There's
overlaps in the people working on them, there's overlaps in
the ideas, and it all goes back to eugenics and
race science.
Speaker 3 (37:51):
Wow, okay, and it just had a onto.
Speaker 1 (37:53):
I mean, it's really, so the doomerism and the boosterism
are, you know, two sides of the same coin, even
though they kind of pose each other to be different.
So you if you imagine, there was an article and
it was a very funny chart that accompanied
this article, an article in the Washington Post by
Natasha Tiku, and she had this kind of grid, and
(38:15):
it was really funny because it was like on one
end was this guy Eliezer Yudkowsky who's like a big
doomer. He had this thing in Time where he wrote
op ed in Time magazine he was like, you know, basically,
if we need to, we have to be willing to
do air strikes on data centers, which he
actually wrote, and he actually said... and speaking of Tucker Carlson,
(38:37):
I think Tucker Carlson also was like, oh geez, maybe
we should do that. And on the other end you
have, you know, Sam Altman and Mira Murati.
Speaker 3 (38:47):
You got a cyber truck?
Speaker 2 (38:48):
Sorry, no, it's okay, I just want a cyber truck.
Speaker 1 (38:52):
You just want a cyber truck. I got to enter
my Social Security number. Yeah, yeah, it's fine, I'll
give you mine. I want to beat to you too,
and so you know, but you know, they are actually
two sides of the same coin because they basically want
to, and I mean in the middle was Timnit,
who's on this TESCREAL paper. And also, it's also
(39:12):
my boss at the at Dare and and and I'm
not, I'm not saying that to, you know, kiss ass,
we actually get along quite well. So but it's but
it's you know, posed as you know, someone in this
in this grid. But so doomers and boosters both
basically see AI as this kind of inevitability. We're gonna get there,
(39:33):
you know. And to me, I think the metaphor, you know,
you alluded to, Emily, is kind of like thinking about
this like a kid that needs to be formed. And
in some in some ways I think it's I mean,
it's colonialism. It's it's that it's manifest destiny. You know,
it's always five years away, you know. And so but
they but they both see development of AI to be
(39:54):
really really critical to whatever is happening next when we
can be like, no one is asking for this shit,
and it's taking up you know, and I and we're
seeing a lot of memes online are like, you know,
I'm so glad that we are draining a lake every
you know, to generate this image of and there's one
(40:16):
of the images was like a toddler holding you know,
a crucifix and with a helmet that says the police
on it, and they're neck deep in water, and they've
got big kawaii, you know, emoji eyes, right.
Speaker 2 (40:31):
And if you blur your eyes, it's Jesus's face. Yeah,
because that's another huge thing.
Speaker 3 (40:36):
I see.
Speaker 2 (40:36):
There's so much AI nonsense art out there too.
Speaker 4 (40:41):
And it's it's super environmentally disastrous, right, this is this thing,
and a lot of it is non consensual in the
sense that, like, if you try to use Google these
days to do a search, the first thing you get,
depending on search with many searches, is the AI overview,
which is the output of one of these text to
text models, taking way more processing power than just returning
the ten blue links, which is what they used to do,
(41:01):
and you can't turn it off.
Speaker 5 (41:03):
So if you use Google do.
Speaker 4 (41:04):
A search, you're stuck with that. You're stuck with its
environmental impact.
Speaker 3 (41:08):
Yeah. The one positive I'd say that I've seen more
and more since we last spoke is people being like, wait,
who the fuck is this for? Exactly? Who is asking
for this? Which I think is ultimately going to become
a louder and louder question. This seems to be
mainly for tech CEOs, yeah, who are very wealthy. But especially
(41:32):
like in countries that aren't, where corporations are not more
powerful and valued more as humans than actual humans, like
the United States. I feel like it will it will
become more and more of an issue and then be
a matter of figuring out if that message actually gets
in in the US.
Speaker 4 (41:50):
But yeah, we can keep shouting it, and that's the
great question, like.
Speaker 5 (41:53):
Who asked for this?
Speaker 3 (41:54):
Wait? Who asked for this? And who likes where it's going? Yeah?
Speaker 2 (41:59):
Yeah, because the luster has worn off from the
early, you know, days of those early ChatGPT models
that came out like this.
Speaker 3 (42:06):
Thing could be a doctor, this thing is could be
a lawyer.
Speaker 2 (42:09):
And then most people kind of like the people that
I know who were first impressed, are like yeah, like it
helped me write an email. That's about the most I
can do with it, because I hate writing formally and
so it helped that. But beyond that, I don't know
many like as personally, but and I know people that
work in many different fields. It sort of ends up
being like, yeah, I don't really have a use for
it aside from like making funny songs that I share
(42:32):
with my girlfriend that you know, we have a song
about how she loves Chipotle, and it was in the
style of you know, Crystal Waters. Yeah, and now that's it.
Speaker 1 (42:42):
Yeah, yeah, I mean I think that's I mean, I
like this phrase that Emily coined, which is, you know,
resist the urge to be impressed. Yeah, where it's like
this this kind of thing where you know, you've got
this proof of concept and it's got a little little
g whiz to it, but beyond that, I mean, you know,
are you going to use this in any kind of
(43:03):
real I don't know, product cycle or ideation or whatever.
And and there's just been I mean, we've had a
tech lash for the last couple of years here and
and I mean I think it's really the only people
I ever see praising this. And I mean, maybe this
is just a virtue of my timeline that it's creating,
(43:24):
and Elon Musk hasn't absolutely destroyed it yet, But the
only people I see praising it are typically people who
are very very much in the industry. You know, they're
the same people who are like I live in San
Francisco and I support you know, London Breed saying she
wants to clean up our streets from homeless people. And
it's just the people. And I mean some of them
(43:46):
are not out and out that fascist, but you know,
they're they're they're they're brushing, They're brushing along with it
everyone else I see, especially also a lot of people
in technology. They're like, I work in technology, and I
hate this bubble and I'm so tired of talking about
this and I just want it to go away.
Speaker 3 (44:05):
I mean, this is you know.
Speaker 1 (44:07):
And then I see teachers, you know, we see a
lot of people from the professions, teachers, nurses, doctors. My
sister is a nurse. She's like, I hate this stuff.
My sister's a lawyer, and she's like, yeah, I've used
this to sort of start a brief, but it gets
so many things wrong. I need to double check everything.
And at the end of the day, yeah, does it
actually help. Yeah, probably not.
Speaker 2 (44:29):
But but it's just a little baby, now, Alex, it's
just a little baby.
Speaker 3 (44:34):
It's just a little baby, A little baby.
Speaker 1 (44:36):
Why would you want to attack a little defenseless baby?
Speaker 3 (44:40):
It is so interesting too, because it's like and it
could be a lawyer or a doctor, which is also
a baby. I'm so impressed by my little baby. Like
what they just said, they could be a lawyer or
a doctor one day too.
Speaker 4 (44:55):
Watch for the phrase in its infancy in the coverage
of this, it's all over the place.
Speaker 1 (45:00):
Yeah, shout out, shout, shout out. And I just want
to give a shout out to Anna Lauren Hoffmann, who
is a professor at UW, a really good
friend of mine and colleague of Emily's, and, uh, so
she's done a lot of work and sort of talking
about like these metaphors of this kind of baby noess
and how it absolves AI of like truly being like
racist and sexist and absolutely fucking up. It's like, it's
(45:23):
just a little baby. Yeah, it's a baby with with
with a billion dollar valuation or whatever. You know, however
much it is.
Speaker 3 (45:30):
By having AI put in the like mentioned in the
earnings call.
Speaker 2 (45:34):
But if we're going to keep that consistency with that metaphor,
you'd be like, then, why are you, as the parent,
wheeling this child out to do labor that it's wholly
unprepared for.
Speaker 3 (45:43):
You look, fucked up, actually. So, Miles, I have kids
that are a little older than yours, and so I
could... that is actually good parenting, to wheel them out
and have them, and it brings in the cheddar.
Speaker 1 (45:56):
Yeah, Jack coming on the podcast. Hot take: child labor,
actually. Child labor.
Speaker 3 (46:03):
And when it's your kid, come on. Those are called chores,
and yes, if they're doing the chores for a multinational
corporation, it gets a little hazy, but.
Speaker 2 (46:14):
Yeah, take out an eight hour chore shift, get that check.
Speaker 3 (46:16):
Hazy because of all that money coming in.
Speaker 2 (46:19):
You know.
Speaker 4 (46:20):
So on this point about like nobody wants this, I
have a talk that I've been doing since summer of
twenty three called ChatGP-Why, and then the subtitle
is, when, if ever, is synthetic text safe, appropriate, and desirable?
And it's meant for sort of non specialist audiences. I
basically go through, Okay, what's a language model, what's the
technology for in its original use cases?
Speaker 5 (46:38):
How did we get to the current thing?
Speaker 4 (46:40):
And then what would have to be true for you
to actually want to use the output of one of
these things?
Speaker 2 (46:44):
Right?
Speaker 4 (46:45):
So, the first thing is you'd want something that is
ethically produced, so not based on stolen data, not based
on exploited labor. Basically, don't have those. Second layer is, okay,
you also want something that somehow is not an environmental disaster.
Speaker 5 (46:58):
Also don't have that.
Speaker 4 (47:01):
Assuming we somehow got past those two hurdles, and it's
things like, okay, well, you need a case where it's
not going to be misleading anybody. It's got to be
a case where you don't actually care about the stuff
being accurate or truthful what comes out of it, including
you know, being able to recognize and mitigate any biases.
You also don't care about the output being original, so
(47:22):
plagiarism's okay. And like by the time you've done all that,
it's like, yeah, this use case of helping you draft
an email because that's tedious, right, and is that worth
the two that we started with there? Right, you know,
the labor exploitation, data theft and environmental impacts.
Speaker 3 (47:38):
Right now we've just broken down all of human meaning
for the past, like for civilizations, Like you know, none
of the things that we've always seemed to think matter
matter anymore. Because billion dollar companies needed like a new
toy to hype up for the stock market. It sounds
like that's very frustrating. Let's take a quick let's take
(48:00):
a quick break. We'll come back, because I do want
to talk about what it is being used for. I mean,
AI is a very loose term the way it's being applied,
But people want to use, you know, these new advances
in computing to make themselves seem more profitable, and so
I want to talk about what it is actually being
used for. We'll be right back, and we're back. We're
(48:32):
we're back, and on your show, you've done some good
stuff on just the surveillance side of AI, which I
mean that turns out a lot of the technology that
we initially thought was promising was just eventually used for
the purposes of marketing and surveillance in the end, and
(48:54):
it seems like AI skipped all the promising stuff. And
it's just like, what if we just went right to
the... right to harming, harming people.
Speaker 1 (49:05):
Yeah, I will, I will say that kind of. I mean,
you had mentioned that this term AI is kind of
being used, Lucy Goosey, and you know, I mean we
we AI is kind of synonymous with large language models
and image generators. But you know, things that have been
called AI also encompass things like biometric surveillance, like like
(49:31):
different, different systems which use this technology called quote unquote
machine learning, and which is kind of this large scale
pattern recognition. So a lot of it's being used, especially
at the border, so doing things like trying to detect
verify identities by voices or by faces. You probably see
(49:53):
this if you've been in the airport. The TSA has
been using this, and you can still voluntarily opt out
for now, but they're really incentivizing it. I saw that
TSA has this touchless thing now, which is a facial recognition,
so you don't have to present your ID, you can
just scan your face.
Speaker 4 (50:11):
And go and and like, don't do that, like take
every option to opt out, and that the fact that
those signs are there saying that this is optional. Penny,
somebody... actually, Petty. Yeah, yeah, the only reason we have
those signs is because of her activism saying like this
has to be clear to the travelers that it's actually
optional and you can opt out, so it's posted there
(50:31):
that you don't have.
Speaker 3 (50:32):
To do this. Yeah, all right, then I'm going to
feel you up. Sorry, those are just the rules.
Speaker 1 (50:37):
Yeah, it's just it's absolutely but I mean it gets
it gets you know, leveraged against people who fly to
a lesser degree, but I mean folks who are refugees
or asylees, you know, I mean people on the
move really encounter this stuff in incredibly violent ways. You know,
they do things like try to they take their blood
(50:59):
and say that, well we can we can associate your
We're gonna, you know, sequence your genome and see if you're
actually from the country you say you're from, which is first,
it's pseudoscience. I mean, basically all biologists have been like,
you can't use this to determine if someone is X,
y Z, like nationality, because nationalities are one political entities,
(51:22):
they're not biological ones, and so like we can sort
of pinpoint you to a region, but it says nothing
to say of anything about the political borders of a country.
There's a great book I started reading by Petra Molnar,
which is called The Walls Have Eyes, which is about
this kind of intense surveillance state or intense surveillance architecture.
(51:48):
You know, it's being used in you know, typically in
the border, the US Mexico border, but also the you know,
the various points of entry in Europe where African migrants
are fleeing to, you know, fleeing places like Sudan and
Congo and the Tigray region of Ethiopia. So just like
(52:10):
and this is just some of the most violent kind
of stuff, you know, that you can imagine, and it's
way far away from you know, this kind of Oh,
here's like a fake little child, you know, or a
Jesus holding twelve thousand babies riding in an American truck with
the American flag on it, you know what I mean, right,
that's yeah, So the reality is yeah, much more stark.
Speaker 4 (52:34):
And you see that, you see that one-to-many
image matching, so you get all these false arrests. So
people because the AI said that they matched the image
from the grainy surveillance video. And it's one of those
things where it's bad if it works, because you have
this like increased surveillance power of the state, and it's
bad if it doesn't work, because you get all these
(52:54):
false arrests.
Speaker 5 (52:55):
Like it's it's just a bad idea. It's just a
don't and.
Speaker 4 (52:58):
It's not just image stuff. So we read a while
back about a situation in Germany, I think, where asylum
seekers were being vetted as to whether or not they
spoke the right language using you know, so one of
the things you can do with pattern matching is okay,
language identification that this string what languages that come from
(53:18):
but it was being done based on completely inadequate data
sets by people who don't speak the languages, who are
not in a position to actually vet the output of
the machine. And so you have these folks who are
in the worst imaginable situation, like you don't go seeking
asylum on a.
Speaker 5 (53:33):
Lark, right, these other people?
Speaker 4 (53:37):
Yeah, yeah, and then they're getting denied because some algorithms said, oh,
you don't speak the language from the place you claim
to be coming from where the person your accent is
wrong or your variety is wrong or whatever. And the
person who's who's run this computer system has no way
of actually checking its output, but they believe it, and
then they get these asylum seekers turned away.
Speaker 3 (53:58):
Yeah, so how does that? You know?
Speaker 2 (54:00):
Everything you said? How should we feel that open AI
recently welcomed to their board the eighteenth director of the NSA,
Paul Nakasone, is that bad? Or what should we take
from that one?
Speaker 4 (54:15):
How should we feel not at all surprised?
Speaker 2 (54:18):
Right?
Speaker 4 (54:18):
How should we feel when OpenAI... it's like, okay,
bad is, whatever the rest
Speaker 5 (54:21):
Of that is is bad?
Speaker 2 (54:22):
Yeah? It seems bad, man, Yeah, it seems like there's
again we're talking like this technology to mass surveillance pipeline,
and who better than someone who ran the fucking NSA, like.
Speaker 3 (54:33):
And I know the way it's being spun.
Speaker 2 (54:35):
It's like, you know, this is a part of cyber command,
Like he inherently knows like how what the what the
guardrails need to be in terms of keeping us safe.
But to me it just feels like, No, you brought
in a surveillance pro, not someone who understands inherently like
what this specific technology is, but more someone who's like
learns how to harness technology for this other specific aim.
Speaker 4 (54:57):
Yeah, so surveillance is not synonymous with safety. Like the
one use case for the
word surveillance that I think actually was pro public safety
is there's a long-term study in Seattle
called the Seattle Flu Study, and they are doing what
they call surveillance testing for flu viruses. So they get
volunteers to come in and get swabbed, and they are
(55:17):
keeping track of what viruses are circulating in our community. Right,
I'm all for surveilling the viruses, especially if you can
keep the people out of it.
Speaker 1 (55:24):
Yeah, I would, I would, I would add a wrinkle
to that, just because I think that, I mean, there's
a lot of surveillance, I mean that's the kind of technology,
that's the kind of terminology they use, health surveillance,
to detect kind of virus rates and whatnot. I would
also add the wrinkle that, like, a lot of those,
you know, organizations are really distrusted by marginalized people.
Like, what are you going to do with that on me?
(55:45):
You know, like especially thinking, like, you know, like lots
of, lots of trans folks, and like especially like under-housed
or unhoused trans folks, and just like, you're going to do
what with this data on me, for who, you know.
Speaker 4 (55:57):
Right, So, yeah, understandably, especially because because surveillance in general
like is not a safety thing, right, It's not. It
is maybe a like safety for people within the walls
of the walled garden thing, but that's not safety, right.
That's the other thing about this is that what we
call AI these days is predicated on enormous data collection, right,
(56:19):
And so to an extent, it's just sort of an
excuse to go about claiming access to all that data.
And once you have access to all that data, you
can do things with it that have nothing to do
with the large language models, and so, there's, you know,
this is, I think, typically less immediately threatening
to life and limb than the applications that Alex was
starting with. But there's a lot of stuff where it's like, actually,
(56:41):
we would be better off without all that information about
us being out there. And there's an example that came
up recently. So did you see this thing about the
system called Recall that came out with Windows eleven? So
this thing, Oh god, this is such a mess. So
initially it was going to be by default turned on.
Speaker 2 (56:59):
Oh yes, right, yeah, this is kind of like the
Adobe story too.
Speaker 5 (57:02):
Yeah, yeah, every five.
Speaker 4 (57:03):
Seconds it takes a picture of your screen and then
you can use that to like using AI search for
stuff that you've sort of seen. And their example is something stupid.
It's like, yeah, I saw a recipe, but I don't
remember where I saw it, So you want to be
able to search back through your activity and like zero
thought to what this means for people who are victims
of intimate partner violence, right that they have this surveillance
(57:24):
going on in their computer that eventually it ended up
being shipped as off by default because the cybersecurity folks
pushed back really hard. And by folks, I don't mean
the people at Microsoft, I mean the people out in
the world who saw this coming. Yeah, but that's another
example of like surveillance in the name of AI that's
supposed to be the sort of, you know, helpful little
thing for you, but like no thought to what that
(57:45):
means for people. And it's like, yeah, we're just going
to turn this on by default because everybody wants this obviously, right.
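For a sense of what that mechanism boils down to, here is a minimal sketch of the idea being described, periodic screen capture plus a searchable text index. This is not Microsoft's actual Recall implementation; it assumes two third-party Python libraries, mss for screenshots and pytesseract for OCR.

```python
# Sketch of the mechanism only -- not Microsoft's Recall code.
# Assumed third-party libraries: pip install mss pillow pytesseract
import sqlite3
import time

import mss
import pytesseract
from PIL import Image

db = sqlite3.connect("recall_sketch.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS snaps USING fts5(ts, screen_text)")

with mss.mss() as sct:
    while True:
        shot = sct.grab(sct.monitors[1])                   # grab the primary screen
        img = Image.frombytes("RGB", shot.size, shot.rgb)  # convert for OCR
        text = pytesseract.image_to_string(img)            # everything visible on screen
        db.execute("INSERT INTO snaps VALUES (?, ?)", (str(time.time()), text))
        db.commit()
        time.sleep(5)  # the every-few-seconds cadence described above
```

The privacy problem falls straight out of the sketch: anyone who can read that one local database can full-text search everything that was ever on the screen, which is exactly the threat model raised above for victims of intimate partner violence.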
Speaker 2 (57:50):
It's like, no, I know how to look through my history.
Actually I've developed that skill. Yeah, I don't need you
to take snapshots of my desktop every three seconds.
Speaker 3 (58:00):
Your show's covered so many kind of upsetting ways that
it doesn't seem like it's people implementing AI. It's companies
implementing AI in a lot of cases to do jobs
that it's not capable of doing. There's been incorrect
obituaries. Grok, the Elon Musk one, the Twitter one, made
up fake headlines about Iran attacking Israel, and, like, publicly
(58:24):
like put them out as like a major trending story.
You have this great anecdote about a Facebook chatbot AI
like responding to someone who had this like very specific question.
They have, like, a gifted disabled child. They were like, hey,
does anybody have experience with a gifted disabled, like two
E (twice-exceptional) child, with like this specific New York public school
(58:47):
program? And the chatbot responds, yes, I have experience with that,
and just like made it up, because it knew that's
what they wanted to hear. And fortunately it was
like clearly labeled as an AI chatbot. So the person
was like, what the black mirror, but World Health Organization,
you know, eating disorder institutions replacing therapists with AI, Like
(59:11):
you just have all these examples of this being
used where it shouldn't be and things going badly, and
like that. There's a detail that I think we talked
about last time about Duolingo, where the model, where
(59:32):
they let AI take over some of the stuff that
like human teachers and translators were doing before. And you
made the point that people who are learning the language,
who are beginners are not in a position to notice
that the quality has dropped. Yeah, And I feel like
that's what we're seeing basically everywhere now is just the
(59:52):
Internet is so big, they're just using it so many
different places that it's hard to catch them all and
then there is not an appetite to report on all
the ways it's fucking up, and so it just everything
is kind of getting slightly to drastically shittier at once. Yeah,
(01:00:15):
and I don't know what to do with that.
Speaker 1 (01:00:18):
I would say, yeah, well go ahead, Emily.
Speaker 5 (01:00:21):
What you do with that is you make fun of it.
Speaker 4 (01:00:23):
That's one of our things, is ridicule as praxis, to like,
you know, try to keep the mood up,
but also just show it for
Speaker 5 (01:00:29):
How ridiculous it is.
Speaker 4 (01:00:31):
And then the other thing is to really seek out
the good journalism on this topic, because so much of
it is either fake journalism output by a large language
model these days, or journalists who are basically practicing access
journalism, who are doing the gee-whiz thing, who are reproducing
press releases. And so finding the people who are doing
really good critical work and like supporting them I think
is super important.
Speaker 1 (01:00:52):
You were going to say... well, though,
you just teed me up really well, because I was
actually going to say, you know, some of the people
who are doing some of the best work on it
are like 404 Media, and you know, I
want to give a shout out to them because, you know,
these folks are basically, you know, they were
at Motherboard, and Motherboard, you know, uh, the whole Vice
(01:01:15):
Empire was basically, you know, sunset, and so
they laid off a bunch of people. So they started
this kind of journalist-owned and operated place, and you
know, that focuses specifically on tech and AI, and
these folks have been kind of in the game for
so long they know how to talk about this
(01:01:36):
stuff without really being bowled over.
You know, there's people who play that access journalism game, like,
like Kara Swisher, who like kind of poses herself as
this person who is very antagonistic, but like, you know.
Speaker 2 (01:01:52):
Right, just like fawning over like AI people, and.
Speaker 1 (01:01:55):
Like all the time. "Well, I trusted Elon Musk until..." and
I was like, well, why did you trust this
man in the first place? Like that, you know, I
was reading the, uh, the Peter Thiel biography, The Contrarian,
and you know, and like it's, it's a very, it's
a very harrowing read. I mean, it was fascinating, but
(01:02:16):
it was very harrowing. It was pretty
critical, but like, you know, they discuss the PayPal
days, you know, twenty-four years ago, when, you know,
Elon Musk was like, well, I want to rename PayPal
to X, and then, and then everybody was like, why
the fuck would you do that? People are already using
(01:02:37):
people are using PayPal as a verb. You know, that's
effectively the same thing he did with Twitter, like people
are talking about tweet as a verb. Why would you, you
know, just, it's been like an absolutely vapid human
being with no business sense. Anyways, that was a
very long way of saying Kara Swisher sucks, and then
(01:02:59):
also saying that there's lots
of folks, there's a number of folks doing great stuff.
So I mean folks at 404, Karen Hao,
who's independent but had been at The Atlantic and MIT
Tech Review and the Wall Street Journal. Khari Johnson, who
was at Wired, is now at CalMatters. There's a
(01:03:19):
lot of people that really report on AI from the
perspective of like the people who it's harming, rather than
starting from well this tool can do X, Y and Z, right,
you know, we really should take these groups out their claims.
But yeah, I mean the larger part of it is
I mean, there's just so much stuff out there, you know,
and it's, it's so hard, and it is like whack-
(01:03:40):
a-mole. And I mean, we're, we're not journalists by training.
I mean we're sort of doing a journalistic thing right now.
I think we're I would not say we are journalists.
I always say we are doing a journalistic thing.
Speaker 2 (01:03:58):
We're doing journalism.
Speaker 1 (01:03:59):
We are not doing original reporting. Sure, but it
is well and you know, I would you know, I'm not.
I don't know, I'm not the I don't I don't
know who decides this is the court of journalism. But
you know, reporting in so far as looking at original
papers and effectively being like, okay, this is marketing.
Speaker 3 (01:04:18):
This is why it's marketing, you know. Yeah, rather
than, you.
Speaker 1 (01:04:22):
Know, a whizbang CNET article or something that comes
out of a content mill and says Google just published
this tool that says you can, you know, find eighteen
million materials that are, you know, complete. And it's
like, okay, well, let's look at those claims, and upon
what grounds do those claims stand? And, and you know
(01:04:46):
how that's that's a pretty.
Speaker 4 (01:04:48):
I think what we're doing is is first of all,
sharing our expertise in our specific fields, but also like
modeling for people how to be critical.
Speaker 5 (01:04:55):
Consumers of journalism.
Speaker 4 (01:04:57):
So journalism-adjacent, but yeah, definitely without training in journalism, totally.
Speaker 3 (01:05:02):
Yeah. But I think we want to do, we want
to do the M&M's article. I mean, oh
my gosh, there's this article that has like broken our
brains, because it just has this series of sentences.
Speaker 2 (01:05:15):
That I don't know that because everything is degrading, like journalism.
You know, there's that story about like the Daily Mail
was like Natalie Portman was hooked on cocaine when she
was at Harvard. You're like, no, that was from that
rap she did on SNL and that was like a bit.
But because this thing is street and then the Daily
Mail had to be like at the end they corrected it.
They're like, uh, just she was not. That was obviously
(01:05:36):
satirical and that was due to human error, Like they
really leaned into that.
Speaker 4 (01:05:40):
Like, no, yeah, of course. Did I mention the
time that the fabricated quote of mine came out of
one of these things and was printed as news?
Speaker 2 (01:05:46):
No?
Speaker 4 (01:05:46):
No. So I, also like Alex, search my own
name, because I talk to journalists a lot and I
like to see what's happening. And there was something in
an outfit called Bihar Prabha that attributed this quote to me,
which is not something I'd ever said, and not anybody I
remember talking to. So I emailed the editor and I said,
please take down this fabricated quote and print a retraction,
because I never said that. And they did, so the
(01:06:08):
article got updated to remove the thing attributed to me, and
then there was a thing at the bottom saying we've
retracted this. But what they didn't put publicly, but what he
told me over email, is that the whole thing came
out of Gemini.
Speaker 3 (01:06:19):
Oh wow.
Speaker 4 (01:06:20):
And they posted it as a news article of course.
And you know the only reason I discovered it was
it was my own name and like, I never said
that thing.
Speaker 2 (01:06:28):
Well, I need your expertise here to decipher this Food
and Wine article that was talking about how M&M's was
coming out with a pumpkin pie flavored M&M, but very early.
Normally pumpkin pie flavored things don't enter the market till
around August, like around when fall comes. But M&M's, that's
why we were covering it, because we are journalists. In May,
(01:06:51):
pumpkin spice already? No. But again, they were saying this
is because apparently Gen Z and millennial consumers are celebrating
Halloween earlier. But there's this one section that completely...
wait, wait, yeah, I don't know. That's what they're saying,
according to their analysis, apparently.
(01:07:12):
So let me read this for you.
Speaker 3 (01:07:14):
Quote.
Speaker 2 (01:07:14):
The pre-seasonal launch of the milk chocolate pumpkin pie
M&M's is a strategic move that taps into Mars' market research.
This research indicates that gen Z and millennials plan to
celebrate Halloween by dressing up and planning for the holiday
about six point eight weeks beforehand. Well, six point eight
weeks from Memorial Day is the fourth of July, so
(01:07:35):
you still have plenty of time to latch onto a
pop culture trend and turn it into a creative costume.
Speaker 3 (01:07:41):
I don't right, it doesn't It doesn't make any sense.
Speaker 1 (01:07:46):
I know, I'm fixating on six point eight.
Speaker 6 (01:07:55):
What does that even mean? Does that mean?
Speaker 3 (01:07:57):
And where did Memorial Day come from?
Speaker 1 (01:07:59):
And that?
Speaker 3 (01:07:59):
And what is six point eight weeks from Memorial Day?
Because it's not any of the days that they said
it was.
Speaker 5 (01:08:05):
They said July fourth.
Speaker 3 (01:08:07):
Wait, and also six point.
Speaker 2 (01:08:09):
Eight weeks isn't a real amount of time. That's
forty-seven point six days? Yeah, what is, what
is even point eight of a week?
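Just to make the arithmetic concrete, here is a quick sanity check in Python. It's a sketch; it assumes the article meant Memorial Day 2024, May 27, which the article never actually says.

```python
# Checking "six point eight weeks from Memorial Day is the fourth of July"
from datetime import date, timedelta

memorial_day = date(2024, 5, 27)   # assumption: Memorial Day 2024
offset = timedelta(weeks=6.8)      # 6.8 weeks = 47.6 days

print(offset.days)                 # 47 (plus 0.6 of a day)
print(memorial_day + offset)       # 2024-07-13 -- not the Fourth of July
print((date(2024, 7, 4) - memorial_day).days / 7)  # ~5.4 weeks, the actual gap
```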
Speaker 4 (01:08:17):
So if this were real, it's possible that they surveyed
a bunch of people and they said when do you
start planning your Halloween costume? And those people gave dates
and then they average that and that's how you could
get to it.
Speaker 3 (01:08:28):
I get that.
Speaker 5 (01:08:29):
I get that that's fair, but also it totally.
Speaker 4 (01:08:32):
Sounds like someone put into a large language model write
an article about why millennials and gen z are planning
their Halloween costumes earlier.
Speaker 5 (01:08:41):
So like it sounds like that.
Speaker 2 (01:08:43):
But also just so odd to say, well, six point
eight weeks from Memorial Day is the fourth of July.
This article didn't even come out like it came out
after Memorial Day and yeah fourth It's just nothing made sense.
And I was like, I don't fucking understand what they're
doing to me right now. But again that's this is
like the insidious part for me.
Speaker 5 (01:09:02):
But this appeared in Food and Wine.
Speaker 2 (01:09:04):
This is in Food and Wine magazine with a human
like in the byline, and I actually DM'd this
person on Instagram and I said, do you mind just
clarifying this part, like I'm a little bit confused. And
I've, I've gotten no response.
Speaker 1 (01:09:18):
I'm wondering if it's because, I know that, I mean,
there was some good coverage in Futurism, and they were
talking about this company called AdVon Commerce, and the
way that basically this company has been
making AI-generated articles for a lot of different publications,
(01:09:38):
usually on products, like product placement, right. And so it
makes me think it's sort of like, because Food and Wine,
you know, may have been one of their... I
forgot the article, but they had a bunch, they had,
like, you know, Better Homes and Gardens and, you know,
kind of these legacy publications like that. So I don't
know if it's something of that, or this journalist kind
(01:09:59):
of said write me this thing and I'm just gonna
drop it and then go with god, you know.
Speaker 2 (01:10:05):
Yeah. Yeah.
Speaker 3 (01:10:07):
My, my other favorite example is this headline I
saw somewhere: it's no big secret why Van Vaught isn't
around anymore, with a picture of Vince Vaughn, but
they just like got his name completely wrong. It's no
big secret why Van Vaught isn't around anymore.
Speaker 1 (01:10:29):
I'm like, you know, if I was just scrolling
and I just, and I'd say like, yeah, you just,
like, you know, I liked Van Vaught in The Internship,
and then, yeah, but then I, but then I
would have looked at it, and then I would have
done a double take. I'm like, wait, wait, wait, did he co-
star with, oh, Owen Wilson or something?
Speaker 2 (01:10:49):
Yeah, yeah, yeah, Russell Wilson was in that.
Speaker 4 (01:10:53):
I think it was the AdWeek reporting that you're thinking of,
Alex. I... Futurism did a bunch of it. But then
AdWeek had the whole thing about AdVon and
I can't quite, no, no.
Speaker 1 (01:11:00):
No, it was, it was Futurism, yeah,
because AdWeek had the thing on this program
that Google was offering, and they didn't have a name.
Speaker 4 (01:11:09):
Oh right, yeah, it was Futurism. Yeah, but it totally sounds.
Speaker 3 (01:11:13):
Like what is happening. Yeah.
Speaker 5 (01:11:14):
Yeah, I thought you were going to.
Speaker 4 (01:11:16):
Talk about the surveillance-by-M&M's thing, since we said M&M's.
So this was somewhere in Canada. There was an M&M's
vending machine that was like taking pictures of the students
while they were making their purchases. And I forget what
the, like, ostensible purpose was, but the students found
out and got it removed.
Speaker 3 (01:11:32):
Wow, probably freaked out and made a big deal about it. Right, Well,
I feel like we could talk to you guys once
again for three hours. There's so much interesting stuff to
talk about. Your show is so great. Thank you both
for joining. Yeah, where can people find you? Follow you
(01:11:53):
all that good stuff? Emily, we'll start with you.
Speaker 4 (01:11:56):
Well, first, there's a podcast, Mystery AI Hype Theater 3000;
wherever you find any podcast, you can find ours. And
we've also started a newsletter. If you just search Mystery
AI Hype Theater 3000 newsletter, I think it'll turn up.
And that's an irregular newsletter where we basically took the
things that used to be sort of little tweet storms,
and since the social media stuff has gotten fragmented, we're
(01:12:17):
now creating newsletter posts with them. So it's, you know,
off-the-cuff discussions of things. On Twitter, X, and
Mastodon and Bluesky, I'm Emily M Bender, and I'm
also reluctantly using LinkedIn as social media these days.
Speaker 3 (01:12:34):
So it's gonna be the one that survives them all
because I know some people kind of need it.
Speaker 1 (01:12:41):
Really the cockroaches of, yeah, yeah, yeah.
I'm at Alex Hanna, H A N
N A, on Twitter and Bluesky. I barely use Bluesky
or Mastodon, but Twitter is the best
place to find me. Also check out DAIR, D
(01:13:03):
A I R hyphen institute dot org. And we're also
DAIR underscore Institute on Twitter, Mastodon, and, uh, we're not
on Bluesky yet, but we're on LinkedIn. But that's,
you know, where you can learn a lot about what our
institute's doing. Lots of good stuff, amazing colleagues and whatnot.
Speaker 3 (01:13:24):
Yeah, amazing. And is there a work of media that
you've been enjoying.
Speaker 5 (01:13:30):
Yes, I've got one for you.
Speaker 4 (01:13:31):
This I think started off as a tweet, but I
saw it as a screencap on Mastodon. So it's by
Llama in a Tux, and the text is: don't you
understand that the human race is an endless number of
monkeys and every day we produce an endless number of words.
And one of us already wrote Hamlet.
Speaker 3 (01:13:47):
That's really good.
Speaker 1 (01:13:49):
That's such, that's such a hyper-specific piece of media.
I think, I think last time I was
on this, I was plugging Worlds Beyond Number, which is
a podcast which I'm just absolutely in love with, which
is a Dungeons and Dragons actual-play podcast, but
it's got amazing sound production. I would just like to plug
(01:14:09):
everything on dropout dot tv. I mean, it's
a streaming service. Honestly, it's, uh, you know, Sam Reich,
who is the son of Robert Reich,
kind of liberal darling and former Department of Labor secretary
in the Clinton administration, has turned CollegeHumor into an
(01:14:30):
area of, like, really great comedians. So they're putting
out a lot of great stuff. So I'd say, you know,
Make Some Noise is coming out with a new season today,
which is, it's a really great improv comedy thing,
and yeah, let's just, let's just go with that.
Speaker 2 (01:14:47):
So just.
Speaker 3 (01:14:49):
Hilarious.
Speaker 1 (01:14:50):
Those Very Important People interviews Vic Michaelis does. I named one of
my chickens Vehicular Manslaughter after an inside joke there, and
another one Thomas Shrinkley. So yeah, just incredible, incredible stuff.
Speaker 3 (01:15:04):
Yeah, shout out to Sam, He's one of the best. Miles. Yes,
where can people find you? Is there a work of media you've
been enjoying?
Speaker 2 (01:15:13):
They have at symbols. Look for at Miles of Gray.
I'm probably there. You can find Jack and I on
our basketball podcast, Miles and Jack Got Mad, where we've
wrapped up the NBA season and I have tears streaming
down my face with pain and anger as the Celtics
win again. And also, if you want to hear me
(01:15:33):
talk about very serious stuff, I'm talking about 90 Day
Fiance on my other show, 420 Day Fiance, which
you can check out wherever they have podcasts. A tweet
I like, first one is from a past guest, Josh
Gondelman, who tweeted, I bet the best part of being in
a throuple is you have someone to do all three
(01:15:53):
Beastie Boys parts in karaoke. That's, I guess, one way to
look at that. And then the other one is from another past guest,
Demi Adejuyigbe, at electrolemon, who got his account hacked,
and he tweeted, Hi, hello, it's Demi. I got my
account back. I feel the need to clarify that under
no circumstances should you ever believe that I or anybody
on this website is selling cheap MacBooks for charity or otherwise,
(01:16:18):
And what benefit would my signature do.
Speaker 3 (01:16:20):
To a laptop. Oh yeah, thank you for.
Speaker 1 (01:16:23):
I actually remember because I followed Demi, and I remember
when his account got hacked and I thought, man, that's
really and I at first I thought it was a
bit because Demi is hilenious. But then I'm just like,
what the hell, it's funny.
Speaker 2 (01:16:36):
His follow up tweet was for anyone who thought I
was doing a bit, what's the punchline?
Speaker 3 (01:16:43):
My jokes are never so obtuse.
Speaker 2 (01:16:45):
I love that. It's like, I want you to know it
wasn't all that funny, and I want you to know quick.
Speaker 3 (01:16:50):
Yeah.
Speaker 1 (01:16:51):
No, I was also trying to find out what the
punchline was, right right?
Speaker 3 (01:16:55):
Yeah?
Speaker 2 (01:16:55):
Wait for it so funny that part of you wants
to be like, well, hold on, what are you doing here?
Speaker 3 (01:17:01):
Yeah? What's like, what's what's the deal here?
Speaker 2 (01:17:03):
You don't want to immediately just dismiss Demi because he's
such a great comedic mind.
Speaker 1 (01:17:07):
Yeah, but yeah, if you do want good Demi content,
the "Who's Welcome at the Cookout?" You can find,
that's some Dropout content that you can get for free
on YouTube.
Speaker 3 (01:17:17):
There you go. Tweet I've been enjoying sleepy at Sleepy
Underscore Nice tweeted, it's absurd that Diddy Kong wears a
hat that says Nintendo. Patently ridiculous. There's no way he
understands the significance. It would be like me unknowingly wearing
a hat that coincidentally depicts the true form of the universe.
Take it off, Kong.
Speaker 1 (01:17:39):
That's incredible.
Speaker 6 (01:17:41):
Oh my god, good, it's so fucking good, because yeah,
the second he showed up, you're like, I don't know, Yeah, Brandon,
he likes Nintendo.
Speaker 3 (01:17:53):
You can find me on Twitter at Jack Underscore O'Brien,
you can find us on Twitter at Daily Zeitgeist. We're
at The Daily Zeitgeist on Instagram. We have a Facebook
fan page and a website, dailyzeitgeist dot com, where
we post our episodes and our footnotes, where
we link off to the information that we talked about
in today's episode, as well as a song that we
think you might enjoy. Myles, what song do you think
(01:18:15):
people might enjoy?
Speaker 2 (01:18:17):
I came across this track from like the fifties that
is like not really popular. Like, it was playing on
the radio, and when
I heard it, I was like, wait, what is this
song? Because I thought it was like maybe a, like,
newer artist doing sort of a send-up of, like,
fifties music, like surf music. It's called Out in the
Sun, parenthetical Hey-O, and it is a bit like
(01:18:40):
Belafonte's Day-O, and kind of has this like sort
of similar sort of cadence to the verse. But it's
just like when I heard it, I'm like, this sounds
like the kind of like song like Tarantino would pluck
from obscurity and then put under like a really dark scene,
and it just got like it's like a beat song,
but there's this like darkness to it that I really love.
(01:19:00):
But anyway, this is the Beach Nuts with Out in the Sun,
So yeah, check this song out.
Speaker 3 (01:19:06):
It's when did that actually come out? Is it recent?
Speaker 2 (01:19:09):
You know?
Speaker 3 (01:19:09):
It's from the fifties?
Speaker 2 (01:19:10):
No, it's from the fifties. Like, they're an actual band.
Speaker 3 (01:19:12):
Like the lyrics are like, hey there, girls, where are
you going?
Speaker 2 (01:19:16):
And they're like, down to the beach is.
Speaker 3 (01:19:18):
Where we're going?
Speaker 2 (01:19:21):
The lyrics are so literal, but there's this like charm to
it, and the instrumentation is cool. So anyway, this is
the Beach Nuts with Out in the Sun, parenthetical.
Speaker 3 (01:19:30):
Fail all right, well, we will link off to that
in the footnotes. The Daily Zeitgeist is a production of iHeartRadio.
For more podcasts from iHeartRadio, visit the iHeartRadio
app, Apple Podcasts, or wherever fine podcasts are given
away for free. That's going to do it for us
this morning. We're back this afternoon to tell you
what is trending, and we will talk to y'all then.
Bye bye, bye bye.