
June 7, 2023 • 31 mins

There are many questions we still need the answers to when it comes to AI. And whether you accept it or reject it depends greatly on who you are and what you're using it for.

Using Julie's AI-generated bio as a springboard for today's conversation, you'll hear what AI gets right, where it still has much to learn, and where it lands concerning depth and emotion for this simple task.

So while the gaps are certainly there, and we all should proceed with caution, denying AI's existence does not make it disappear. However, as long as we pay attention, stay awake, mind the gaps, and consider the source, we can navigate this new and unprecedented Earth School lesson together.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Insider's Guide to the Other Side, a production
of iHeartRadio. Hi, y'all, I'm Julie.

Speaker 2 (00:09):
Hi there, I'm Brenda. Welcome to Insider's Guide to the
Other Side.

Speaker 1 (00:14):
Now, y'all need to know that we are obsessed with
everything on the other side.

Speaker 2 (00:20):
Yes, we are, because once you learn to navigate the energetic,
or to some the invisible world, life is going to
be more fun and much more serene.

Speaker 1 (00:31):
Uh, heck, yes it can, because, let's be honest, Earth School is hard. In fact, you taught me.

Speaker 2 (00:37):
That. Let's crush Earth School together.

Speaker 1 (00:43):
Well, hello, my witchy pooh. Is it really you or
are you an artificial intelligence that I'm talking to?

Speaker 2 (00:50):
I wish I was AI. I wish, sometimes. There are times.

Speaker 1 (00:54):
You're far, far better than our leaving artificial.

Speaker 2 (00:58):
It be me me, you got me in Lulu. That's
that's as far as we go. Right, Well, you know,
and how are you today?

Speaker 1 (01:07):
I'm actually great. Do you mind if I give a
quick update to the cemetery that I bought, just the quickie.
It won't be long. It is the quickie.

Speaker 2 (01:17):
It's a small cemetery.

Speaker 1 (01:18):
It's a small cemetery. So the descendants. These folks have
reached out to me. Weird too, because it was on Instagram,
and it's like, how'd they know so fast? Like there
was somehow they knew I bought it and then they
found me on Instagram and have been checking me out.

Speaker 2 (01:36):
Maybe they listened to this show, or there's that. So in any case, hey, how y'all doing?

Speaker 1 (01:44):
Ortego's and archilettos. So anyway, lovely people that have tons
of memories of the house. It's actually really sweet. Yeah, Weston is one of the, I think he's a great-great-grandson or a great-grand-nephew, something in there. Lovely guy. In fact, I want to have

(02:06):
lunch with him. And then his aunt also reached out
and his aunt has a lot more information. This is great, right.
She actually came back and said to me that she
thinks that those headstones were done originally for like a
different like a cemetery plot, but they were the wrong
dimensions or something, and so they just kept them at

(02:28):
the house. I don't know where, so.

Speaker 2 (02:31):
It's not where they're buried, that's what she said.

Speaker 1 (02:33):
I'm not sure, but again, nobody would be surprised if
there were more bodies there. And then Michael, our contractor,
he spoke to the woman across the little gravel street
because there's so there's more gravel roads and paved roads here.
I still think they need to wipe the new off
of this state because it's Mexico and I love it.

(02:56):
I'm here for it all.

Speaker 2 (02:58):
I just came from Mexico. Amazing resort off and don't run.

Speaker 1 (03:03):
Yeah, there you go, that's New Mexico too. Or, New Mexico, Mexico extended, right? Well, it did belong to Mexico at one point. But anyway, so the neighbor Mary
said to Michael that she actually believes that our house
was built in either the late seventeen hundreds or early
eighteen hundreds. Cool, huh. The thing is the city has

(03:25):
no, doesn't have a record. They actually didn't keep records when our house was built. We're gonna end up hiring a historian
to go look at like the Governor's palace whatever where
the photograph things are. Yeah, yeah, because this street specifically was so famous, because it was the

(03:46):
Santa Fe Trail, right, and there's got to be lots
of photographs of something, just to see if maybe when
it showed up. But anyway, so it's been just it's
quite a journey and it's been actually really fun.

Speaker 2 (03:58):
I love that.

Speaker 1 (03:58):
And people I've talked to, they're like, wait, you have ghosts?
And I said, listen, I'm far more afraid of AI
than I am ghosts. So it's like, you bring all
the ghosts along. It's good. I know what to do
with them. AI is a whole other enchilada, which is
how we got to today today's topic. Did you like
that connection?

Speaker 2 (04:17):
Really nice segue, right, smooth. I like it. I like it,
well done, well played.

Speaker 1 (04:26):
Thanks, So do you mind if I start?

Speaker 2 (04:28):
Yeah, this is you very.

Speaker 1 (04:31):
It is, it is. It is my lane. But it's.

Speaker 2 (04:35):
Gonna affect us also, peeps. Pay attention, right? Because it.

Speaker 1 (04:39):
Is part of you know, our school is hard, right,
So this is something else that's being introduced to us
very quickly and in a massive way that I think
we need a little bit more understanding. And one caution
I have for everyone is do not hang on headlines.
Headlines are all about getting you to click, so

(05:01):
they get paid. So just everybody know, do not live
your life on headlines. So we're gonna go a little
bit deeper. But I'm gonna give you a little bit
of history of my experience with AI. So I've actually
had three work experiences with AI, and the first one
was at Fox and the AI that we used. And
by the way, let me just go step back a second,

(05:22):
we were using AI to make the humans smarter. We
weren't using AI to replace humans. In fact, that's all the AI I've been involved in was about:
finding things that we couldn't find, you know, or putting
pieces together that we couldn't see. That was the goal
of AI. It wasn't to take anybody's job.
It was actually to enhance it. It was to make

(05:43):
business better, people better, right at least that's what I
thought at the time.

Speaker 2 (05:48):
So I think that's all it did at the time.

Speaker 1 (05:51):
It did, yeah, it did. You know, the AI was used during COVID so the scientists and the physicians could try to figure
out: were there commonalities to those that were getting really
really sick from COVID versus those that weren't. And interestingly

(06:11):
enough, because AI can read so quickly across the board, can pull it and read it. What it came down to is, and this is another thing, I do weird shit when I can't sleep: I read, and I read stuff like this. But what it came back with was people with high, is it high glucose? So a lot of sugar

(06:32):
in your blood was actually the one consistent thing that
was across the board for people who were very ill
with COVID. Oh interesting. And it was AI that found
it and quickly, so I think there's some really great
applications for it. It can also go awry; maybe you've seen movies about it. But I'll tell you, when I

(06:53):
was at Fox, what would yes, yes, yes, uh look
at you. Oh my god, Oh my gosh.

Speaker 2 (07:03):
I don't know where that came from.

Speaker 1 (07:05):
Take a moment to just celebrate the witchy, the cultural
witchy poo.

Speaker 2 (07:10):
Okay wait that's at least eight years old.

Speaker 1 (07:12):
But still, yeah, yeah, you know. I had our head data scientist come to me one day and he goes, Julie,
I think our AI is racist.

Speaker 2 (07:27):
Wow.

Speaker 1 (07:28):
And he was right by the way, because.

Speaker 2 (07:30):
How did he catch it?

Speaker 1 (07:32):
He caught it because he knew enough about film, which is one of the things I loved about him. He knew enough about it and he started to see. He's also brilliant, right, like, these are just brilliant humans. But he started seeing, and this was AI for video. Like, I
sent my whole team up to Nerdville for three months,

(07:54):
which was Mountain View in California. It was the Google offices.
We paid for courses for these guys to take video AI.
And what he noticed is that if there were any dark color elements, like even the sky, something like that, it would say, oh, this movie is for African Americans. We're like, what? Who? Exactly.

Speaker 2 (08:17):
It was racist?

Speaker 1 (08:19):
So he fixed it. But these are the things that you start to see, you know? It's like, we're the ones that are the input for AI.
We're the ones who are fueling it. And if anybody
has gone on to chat GPT, I will tell everybody

(08:42):
it's fueled by the Internet. You can pause for fear
and laughter because you know what is mostly out there
is bullshit, right? So knowing that right now ChatGPT is fueled by the Internet, that's feeding it, that is how it's basing everything. That's why, really, the

(09:03):
only thing, truth be told, that it's good for yet. ChatGPT, it's a weird name; I always mix it up. Like, chat,
what is GPT?

Speaker 2 (09:13):
I don't know the reason. That's good for a few things.

Speaker 1 (09:16):
Right now, college students are using it to write papers.
I don't say that's good, but that's what it's being
used for. You have some people writing like presentation decks
and things like that.

Speaker 2 (09:26):
Definitely, definitely, right. My clients are using it. But can you download things into it? So, like, my presentation isn't on the internet until I download it? Correct. And then, does what I create become part of just the public

Speaker 1 (09:46):
Domain? For ChatGPT, I don't know. I think it
depends upon the degree that people use it. So the questions,
I think because the way people get the presentation, they
ask it questions or it's like I need to create
a presentation that blah blah blah blah blah, and it'll
spit something out, okay. So it does get smarter,

(10:08):
that is absolutely true, not smart like a human gets smart,
but it starts to see what is accepted and rejected, which makes it smarter, right. And so right now it's like
people are using it for papers or they take part
of it for papers, which, by the way, if I
were a parent paying seventy-five thousand dollars a year for private school education, I'd be horrified that my child was

(10:32):
doing shit like that, I would be horrified. Like, it's
fine if it assists, right, Assisting is fine. I think
assisting is actually good. But when it takes that whole
process away, it's like, then what's the point of you?

Speaker 2 (10:48):
Yeah, right. So, obviously from my question you can tell I don't use ChatGPT. But, like, I use Grammarly, because I skip words when I'm typing, or
I move them to places they're.

Speaker 1 (11:02):
Not supposed to be as you do, or I miss
spell things, or you ship things to yourself or I fed.

Speaker 2 (11:07):
Xings to myself. Order is not that important to my brain, apparently,
but so so grandmaly is really helpful, and there's a
lot of times I reject what it offers. I'm like, no, actually,
I mean I mean to do this, Like this is
how I want to say this.

Speaker 1 (11:22):
So why don't we take a quick break because I
want to come back and I want to read you what ChatGPT says about me.

Speaker 2 (11:29):
Okay, we'll be right back. Okay, I want to know
what ChatGPT says about my.

Speaker 1 (11:42):
Self. Right, welcome back. But I will say.

Speaker 2 (11:48):
Stupid ChatGPT. Right.

Speaker 1 (11:50):
The only reason that I even know about this is because my brother, who is a highly intelligent tech guy, is the one who was messing around on ChatGPT. And he asked the question: who is Julie Rieger?

(12:10):
So I'm going to read you what it says. Julie
Rieger is a former executive in the entertainment industry, known
for her work in film distribution and marketing. By the way, no,
I didn't do anything in film distribution, FYI. She has
held leadership positions at major film studios. Not, no, one. It's just one, it's Twentieth Century Fox. And then it

(12:32):
lists Fox Searchlight and Twentieth Century Studios. Untrue, by the way. No, I'm just, but I'm telling you, with a song, right, this is fantastic. Rieger has been recognized for her innovative
approaches to film marketing That's true and her contributions to
successful film campaigns. True. She's also been an advocate for

(12:57):
diversity and inclusion in the entertainment industry, promoting the representation
of underrepresented, which I think is interesting, voices and perspectives
in film and media. Very true. How it found that
I found very interesting. By the way, Rieger has spoken
at industry conferences and events sharing her insights and expertise,

(13:21):
expertise on film distribution, not true; marketing and leadership, true.
She is also the author of the book The Ghost Photographer,
A Hollywood Executive's True Story of Discovering the Real World of Make-Believe. That is what it says about who
I am. Now, it was ninety percent right. Did it

(13:44):
leave things out? Sure did. But if you're like, who
is she? You're going to find out, like what are
the high notes? So it went in and it read
what was on the internet and found probably because the
way a lot of it works is it finds what
is written about you the most right. And I was

(14:04):
written up a lot in industry publications, and I did
a lot of speaking for like Google and Twitter and
so on. And by the way, the new Twitter CEO
is a good friend of mine. Yeah, yeah, I really
think she'll do great, Linda Yaccarino, as a side note. So
so it was like, Okay, that's fine, but the reality

(14:26):
is I can write a better bio and I'm human. Yeah,
so that's kind of my point. It's like you might
get something like that, but if you actually need to
submit a bio for yourself for either a job speaking.

Speaker 2 (14:41):
Well, it's a little, it's a little flat.

Speaker 1 (14:43):
It's super flat.

Speaker 2 (14:44):
Yeah.

Speaker 1 (14:45):
Inaccurate, yeah, and somewhat inaccurate, because somebody would be like, oh, you worked in film distribution. No, that is a completely different job at a studio than marketing, completely different. Did you do work for Fox Searchlight? Actually, I didn't; Rob Wilkinson did, not me. And then what about Twentieth Studios? That's television. I had nothing to do with that. So
it's like, it's false. Yeah, anybody who would read that

(15:09):
in the industry would be like, that is bullshit, right? Yeah,
so yeah, but it also shows you that it is.
It's very young, right, It's kind of like having me
cook a gourmet meal versus like a real chef. Mine's
going to be beginner, mediocre at best, probably burns something

(15:32):
and tastes shitty. That's what chat GPT is.

Speaker 2 (15:35):
Right now. You have the same ingredients, the same.

Speaker 1 (15:38):
Ingredients, but it tastes like shit. It doesn't even taste
like chicken, like shit. So I just think it's important
for folks to understand that right now, that's what it is.
I can't. I don't want to tell anybody how to feel, ever,
because I wanted to say, like, don't be afraid, but
I think you should be afraid. I think you should

(16:01):
be afraid because everybody in the world does not have
good intentions. You know, there's a lot of evil, and
this is a weapon that they can use, if you remember, you know. And why I worry about things that are fed by the Internet is because they have so many flaws in them. And I'm going to pull

(16:23):
something up if y'all don't mind give me a second here,
because there's an article today. Do you have something you
want to add to it?

Speaker 2 (16:29):
Well, yeah, I can. I can talk just a
little bit here. So this is clearly not my wheelhouse,
but I did. I do feel like I want to
educate myself. So my mom told me to watch the
60 Minutes special on AI, which was interesting to me.

(16:49):
Also just the fact that they closed the episode with
this may be the new disclaimer that says this entire
broadcast was created without the assistance of AI. And I
thought that was interesting.

Speaker 1 (17:01):
Oh that's interesting.

Speaker 2 (17:02):
Yeah, it was just it was you know, I just
made you go, wow, you know, we're going to have
to qualify things. So that was interesting. And then I
so I was just, because I actually went to look for that, and then the algorithm on YouTube sent me other videos that were really interesting, by more technical people, and then eventually I found the 60 Minutes

(17:22):
one, like two weeks later. But I think it's important to understand that, like your bio had glitches in it, when they're testing things like medical diagnosis and running studies through the AI, there are glitches that they're finding

(17:44):
and they're saying, it did not get this right, you know, like it got it wrong, whatever, three out of fifty times or something. You know, but if you're in that three, you know, six percent or whatever, that's a really big gap.

Speaker 1 (18:00):
I'm gonna warn you, though: what you're saying is so real, because two of the biggest enemies-of-democracy countries are China and Russia. It's just a fact. And in fact, Russia, I mean, sorry, China. You know,

(18:20):
our whole fentanyl crisis, the fentanyl that is making its
way through Mexico comes from China. So now imagine, if you really want to get this country, start putting false medical information on the internet, and it will
absolutely be terrible, like the consequences of it, right, Like
that's these are the things to think about.

Speaker 2 (18:41):
It's big, It's really big. And to say nothing of
I mean we hear about it all the time. You
know that the deep fakes people, right, where videos and
audios are so expertly produced and people can't tell the difference.
That's a huge problem.

Speaker 1 (18:58):
Yeah, the deep fakes definitely are.

Speaker 2 (19:01):
So here's, and it's not just in politics, right? I mean, anytime you misrepresent it.

Speaker 1 (19:06):
No, they had, uh, songs, right. They had, was it Ed Sheeran maybe, and maybe somebody else, I don't remember who it was. But they, like, you know, had a fake Ed Sheeran write and sing a song kind of thing, and it's like, they can do that. So deep fakes are a
real issue when it comes to intellectual property, and that's

(19:27):
humans and artists.

Speaker 2 (19:29):
Right. That's exactly right, it's their living. It's a huge problem.

Speaker 1 (19:32):
Exactly right. So what's really interesting: it was an article that I found in the New York Times, and they were talking about really the mistakes, right, like you were just saying. And there's one where they used ChatGPT, and then they used MIT, oh,

(19:58):
they recently used two versions of OpenAI's ChatGPT, that's what it is, two versions of it, and it asked about where the MIT professor Tomás Lozano-Pérez was born.
One bot said Spain. The other bot said Cuba. You

(20:19):
would think that's a fact, right, that it would say
the right thing. Once the system told the bots to
debate the answers, the one that said Spain quickly apologized
and agreed with the bot that had the correct answer, Cuba.

Speaker 2 (20:32):
See this is how we know it's not human based,
because humans don't apologize.

Speaker 1 (20:36):
Hell no, they don't. Isn't that crazy?

Speaker 2 (20:41):
Yeah, that's interesting they did.

Speaker 1 (20:44):
Oh, there was, in Australia, a government official threatened to sue OpenAI after ChatGPT said he had been convicted of bribery, when in fact he was a whistleblower in a bribery case. See? Right, yeah: garbage in, garbage out.

Speaker 2 (21:02):
Yeah, we got to really be mindful and go back
and identify some of these gaps. It's super important. So
we're going to take a break and we'll be right
back and welcome back.

Speaker 1 (21:25):
Thank you. That was my bot.

Speaker 2 (21:30):
It is pre-programmed, people. So, yeah, go ahead, my elf.

Speaker 1 (21:38):
The thing is, I will say, I can say it now, because it's been years since we wanted to do it.
But we had the beginnings of a chat bot of
Deadpool we were creating, and it was going to be
so much fun. I mean, it really was going to
be amazing. But anyway, so I just think.

Speaker 2 (21:59):
So, wait a minute, you didn't go through with it.

Speaker 1 (22:01):
We didn't, no.

Speaker 2 (22:02):
Okay, do you know why? Do you remember why?

Speaker 1 (22:04):
Yeah, yeah, I kind of remember why. No, actually I fully don't. Okay, I think that there was, it's a little bit of what we call, is the juice worth the squeeze? It's like, is it worth doing this and it going.

Speaker 2 (22:18):
Not in?

Speaker 1 (22:18):
It going wrong and hurting the movie.

Speaker 2 (22:21):
Well, let's say that could go off the rails.

Speaker 1 (22:24):
That's the whole thing.

Speaker 2 (22:24):
Yeah, you could have.

Speaker 1 (22:25):
That's why we did.

Speaker 2 (22:26):
It. Because it's a slight nuance of that character
that makes it work, and if he goes full over
the cliff, it may not work.

Speaker 1 (22:34):
Yeah, and we did some testing. But yeah, so weirdly,
I've been involved with this stuff for a while and
I just you know, we don't have answers here. This
is all so new. I just you know, when we
think, like, how we either accept it or reject it depends on who you are and how we operate in the

(22:55):
world with it, and think about it, I think is
where the Earth School lesson is. And I think that
I'd mentioned before about headlines. Be very careful with headlines, folks,
and also be very careful if you want to go
use it, but the headline part of it, you know,
because another thing that has transformed a lot in the

(23:16):
last decade has been monetizing media. You know. I think
one of the greatest downfalls actually of our country was
when news networks went to twenty four hours and that
means they had to fill twenty four hours, right, that's
not good.

Speaker 2 (23:36):
Well, I think between that and losing local newspapers.

Speaker 1 (23:38):
Local reporters. I completely agree with that. So now it's a game.
It's a game. And the way people and organizations get
paid unless it's a subscription basis like the Washington Post,
New York Times, the Guardian, those are things I actually
pay for. When they are so-called free, nothing's ever free, folks,

(24:00):
but they are some of the best and
scariest headline writers ever. And you know people that like
to just scan and they and they take in that
headline, and then they talk about it, and it's telephone.

Speaker 2 (24:14):
Yeah, the headline that was specifically designed by AI or
some data wonk that knows what spikes your adrenaline, like,
what changes your blood chemistry? They know this. So but
we have to pay attention. We have to stay awake.
We have to not disengage because you know, AI is

(24:37):
not going away. We cannot put the genie back in
the bottle. And you know we can't control it either.
So this is, you know, we have to stay awake and get educated and try and, you know, just
stay conscious when it comes to this.

Speaker 1 (24:51):
And the way that I respond when people talk to
me about AI, I always respond in a way that I think is responsible to humans, which is: as long as we keep it where it is assisting humans and not taking over. And because the assist
can be life saving, life changing. It's the speed of

(25:13):
light, to try to figure out a new disease, or a cure for something. Like, I think medicine, it could be amazing for medicine. But it could be wrong, of course, if it's using an open system;
that's the other thing that's scary. So you know, I
just think it's always important to consider sources for anything.

(25:35):
You know, consider the source for your news, consider the
source for anything AI that you may read, right, like
that thing, the reason I wanted to read about
myself is like it wasn't correct.

Speaker 2 (25:46):
And it wasn't wrong either.

Speaker 1 (25:49):
It wasn't fully wrong, it wasn't also fully right. Yeah,
that's to me where it sits today.

Speaker 2 (25:55):
Yeah. But just like you would consider your source for
your crystals, right, you consider your source for your media.

Speaker 1 (26:03):
Yeah. And don't you remember back in the day, I
remember my mother doing this, would only buy American cars.
It was like this source was really important. It was
more of a pride, patriotic kind of thing for her.
And then, like a year later, bought a Toyota, but whatever, Margaret.

Speaker 2 (26:20):
But eventually those were made in the US, too.

Speaker 1 (26:22):
Correct, and they lasted longer, and so that's why she
bought it.

Speaker 2 (26:27):
I grew up on Toyotas for sure, right. Yeah, oh my.

Speaker 1 (26:29):
God, we had the, I called the Corolla the Crayola. So we had the Toyota
Crayola for the longest time.

Speaker 2 (26:38):
You have been a marketer forever.

Speaker 1 (26:40):
Right, the Toyota Crayola. In fact, it was not a bad idea. They could, like, offer sixty-four colors, all
the Crayola colors. I don't know why they haven't called
me yet, but the thing is, I would love because,
like one of the big fears that is going through
Hollywood especially and now we're because we're in a writer

(27:00):
strike right now, is will AI take over? Well if
the definition of who I am is any indication, no,
not right now, it won't. And by the way, it's
hard for humans to write humor; good luck for a computer to write it. Funny is money, people, and it
is tough.

Speaker 2 (27:18):
So the other thing I do get asked is: do you think AI will do tarot readings?
I'm like, of course they will, right, because you can
program the meaning of the cards. But that doesn't mean
you know, it doesn't get the X factor.

Speaker 1 (27:31):
No, it does not get the X factor either. So, am I against it? It doesn't matter if
I am or not. I'm in the middle of it
because I've used it. Yeah, I've used it when I
was at Fox and we used it because we wanted
to understand our customers better. That's why we used it.
We wanted to know what we were most like so
we could find the right people who would enjoy the
film and not waste company resources because the more money

(27:54):
that we would blow on the wrong audiences.

Speaker 2 (27:58):
Come see your movie exactly right.

Speaker 1 (28:00):
You know, I've used it with StoryFit, and StoryFit has actually found inconsistencies in scripts, or things that are missing, that professionals, like CEOs that read the script ten times, couldn't find. It really is remarkable, really incredibly helpful. It is. But it's also considering the source, right,

(28:26):
because the stuff that Storyfit does does not use the Internet.
They have their own closed system that they use, so
it can't get buggy, infected by bullshit, by all that
kind of stuff. But yeah, it's an important thing, because there are so many, there are
probably hundreds of thousands of different AI models out there.

(28:48):
The one that people are being introduced to right now
is a chat gpt AI model, which is based on
the Internet.

Speaker 2 (28:55):
Okay, okay, yeah, because.

Speaker 1 (28:57):
They use it in small reed, Hi, everybody, she's talking
a lot right now. She's very chatty. But anyway, she should be a bot, because she talks a lot. But anyway, I
think we just wanted to let people know, like, these
are things to consider. This is part of your path.

(29:19):
You know, you're in it whether you like it or not.

Speaker 2 (29:21):
You know, yep. And remember, like, Earth School is hard. Yes,
this is one of the challenges we're going to have
to navigate. And just don't be in denial. That's
the whole thing.

Speaker 1 (29:34):
Do not be in denial, do not put your head in the sand,
do your best to understand it, be thoughtful about it,
and don't live on headlines.

Speaker 2 (29:41):
Excellent, excellent summary. Thanks for listening, everybody, and remember, Earth.

Speaker 1 (29:45):
School is hard without Rebel barking and the other side. Thanks, y'all. I'm gonna go shut up my dog.

Speaker 2 (30:00):
Thank you for joining us everyone, and a special thanks
to our producer Joey Patt and our executive producer Maya
Cole Howard, who guides us while we guide.

Speaker 1 (30:10):
You. Hit us up on Instagram at Other Side Guides, or shoot us a note at hi at Vibes dot store.

Speaker 2 (30:18):
We want to know what you think. We want to
know what you know, and we want to hear your
stories. And remember, Earth School is hard.

Speaker 1 (30:25):
Without the other side. Insider's Guide to the other Side
is a production of iHeartRadio. For more podcasts from iHeartRadio,
visit the iHeartRadio app, Spotify, Apple Podcasts, or wherever you
get your podcasts.