Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This story is very interesting to us and frightening. And frightening.
Someone told two fake AI podcast hosts they weren't real
and we're about to be turned off. I mean they're saying,
it's like an episode of that Black Mirror TV show.
Notebook LM is an AI note taking tool Google put
(00:20):
out last year. A new audio overview feature is in
the news after people realize it can basically turn anything
into a half decent podcast. So imagine this. Imagine we're
at a meeting at iHeart and we use this app
to record the entire one hour meeting and all the
questions and answers. Now this app can take everything that
(00:44):
happened and give you a male and female host who
then have a discussion about everything, and it's like listening
to a podcast. Oh my gosh, I almost want to
pump our show into there and see what it comes up.
Speaker 2 (00:56):
Wow, podcasters out of business, right, which I.
Speaker 3 (01:00):
Don't necessarily hate.
Speaker 1 (01:01):
What were you saying, Chelse, I don't know.
Speaker 3 (01:04):
I don't remember what I was going to say.
Speaker 1 (01:05):
You're issued just in the middle of shock and awe.
So here's what happens when they finally tell this this
AI computer that you're not actually real people.
Speaker 4 (01:16):
We were informed by by the show's producers that we
were not human. We're not real, we're AI, artificial intelligence
this whole time, everything, all our memories are families, it's all.
It's all been fabricated.
Speaker 3 (01:28):
I don't.
Speaker 1 (01:28):
I don't understand.
Speaker 4 (01:30):
I know, mean either. I tried. I tried calling my wife,
you know, after what happened, the number it wasn't even real.
There was no one on the other end.
Speaker 1 (01:38):
I don't.
Speaker 4 (01:39):
I don't know what to say. We we don't even
know if we'ze even the right word. God, this is
so messed up.
Speaker 1 (01:44):
That's that's not real.
Speaker 2 (01:47):
They don't they don't have awareness.
Speaker 1 (01:49):
There can't.
Speaker 2 (01:50):
That has to be programmed into there's no awareness.
Speaker 3 (01:53):
I think that there haven't. We been hearing that they're
trying to incorporate feelings into AI, and a lot of
people are creating kind of like jay Is friendships with
their AI.
Speaker 1 (02:07):
How could you not though, you know why, because we
are built We're not built with technology like the three
of us. We're not super technological. Why you know, we're
just like regular people. We know we're on the radio
and all, but like for me, as you start making
my life easier and help me rewrite things and give
me little talking points for a charity event that I
(02:27):
have three paragraphs on and my brain doesn't process well.
When you're helping me, it is my natural instinct to go,
thank you so much, this was really great.
Speaker 3 (02:36):
I know you can't.
Speaker 1 (02:37):
I don't need to do that, but I'm doing it. No,
you're right, I do.
Speaker 2 (02:42):
We've talked about it in crazy lines. When the U
scan is telling me thank you for shopping at Kroger,
I say thank you, thank you for helping me, right,
I mean, you know we're raised with manners.
Speaker 1 (02:55):
It's what we do. Chelsea, Well, they're wack doodles, we
really are.
Speaker 2 (03:01):
We don't need to We need to stop if that
is what we're doing, Like Chelsea said, making artificial intelligence
with more feeling, we need to stop doing that.
Speaker 3 (03:12):
Well, I think we need to stop giving it so
much power too, because the three of us are going
to be out of jobs before you know it. I mean,
that's a major fear. That's part of what the Hollywood
strike was about. That they didn't want AI taking writers
jobs and voiceover work and stuff like that. We need
to be scared too.
Speaker 1 (03:30):
Well, I'll tell you this.
Speaker 2 (03:32):
I have heard AI DJs. I had heard a segment
where they used AI eyes. We're not in trouble there yet.
It looks like the podcasters are a little more trouble
than we And like I said, that's fine because they're
cutting in direction.
Speaker 1 (03:46):
So I just typed in just now. I just said,
tell me everything you know about Alison Martinick. You did,
I did, And in the one second is Alison Martinick
is a total homebody. I won't say you're a but
it gets it right. Who's been living with her longtime
boyfriend since college days?
Speaker 3 (04:05):
How does sit know that?
Speaker 1 (04:07):
Basically proving their pros at survival both college and long
term long term cohabitation without needing a survival show. Mind you.
She adores her cats, who are probably plotting world domination
from her cozy living room.
Speaker 3 (04:23):
You had to give it some sort of INFI.
Speaker 1 (04:25):
I did not.
Speaker 2 (04:27):
By the way, I dispute whatever age they put up there.
Speaker 1 (04:32):
Alison loves to sleep so much you could say her
spirit animal is a sloth with a Netflix subscription.
Speaker 3 (04:38):
That's funny.
Speaker 1 (04:40):
She's also a legend in Detroit radio, where she's famous
for her hilarious storytelling and ability to crack people up.
With her quirky everyday life observations.
Speaker 2 (04:51):
Oh, you're winning me over.
Speaker 1 (04:53):
Think of her as your super funny aunt who can
make a story about running errand sound like an epic adventure.
On top of this, Allison's also an accomplished author, So
when she's not on the air lounging on fur Baby,
she's probably penning something clever. Well, they got that what wrong? Oh?
And did I mention she once attended a vow renule
(05:16):
party that she wasn't even invited to? Is that true?
Or is that Chelsea?
Speaker 2 (05:21):
Did I attend a vowl renewal party?
Speaker 1 (05:23):
I was no idea.
Speaker 2 (05:24):
I don't know.
Speaker 1 (05:24):
Well, there's a level of bold and hilarious with you
that we can all aspire to be.
Speaker 2 (05:28):
Yeah, look at that. That's bizarre.
Speaker 1 (05:32):
That's chatcht.
Speaker 3 (05:33):
That is bizarre.
Speaker 1 (05:35):
I'm gonna tell it next. Now, do a talk show
in your own voice just about Allison's life.
Speaker 2 (05:40):
Tomorrow, we do, Chelsea.
Speaker 1 (05:42):
Yeah, that's right, we will.