
April 11, 2025 33 mins

What’s a ‘mega API’? This week in the News Roundup, Oz and Karah break down the ever-evolving landscape of tariffs and what it all means for tech companies, Tinder’s ChatGPT-powered dating game, and the rise of ‘Frankenstein’ laptops in India. On TechSupport, The Wall Street Journal’s Family & Tech Columnist Julie Jargon explains how imposter scams are becoming more believable thanks to generative AI.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope.
I'm Oz Woloshin, and today Kara Price and I will bring
you the headlines this week, well tariffs, obviously, but also
a dating game. Then on Tech Support, we'll talk to
The Wall Street Journal's Family and Tech columnist Julie Jargon
about a mother's worst fear, a cry for help over

(00:22):
the phone that sounded like her youngest daughter, all of
that on the week in tech. It's Friday, April eleventh. Kara, hello. Hello, hello, Oz.

Speaker 2 (00:40):
I think I'm going to start this thing called Kara's
Hats of the Week. And just for people who can't
see me, I'm wearing a hat today that says, and
this is from a show called Summer Heights High. If you
ever watched Summer Heights High: I'm a naughty girl with a bad habit.

Speaker 1 (00:53):
What is the bad habit in the.

Speaker 2 (00:55):
Show? It's drugs. If it had preempted today's top story,
it would have said, with a bad habit for tariffs.

Speaker 1 (01:01):
Uh huh. That would have been fast fashion. Indeed, that's
a good one. But today's news is all about quote.
The most beautiful word in the dictionary.

Speaker 2 (01:10):
Well, I don't know anything about the dictionary, but I
looked up the word tariff with ChatGPT.

Speaker 1 (01:16):
The dictionary of twenty twenty five, correct.

Speaker 2 (01:18):
Correct, and the thesaurus, and also what will eventually write
someone's wedding vows. I asked ChatGPT what a tariff is,
and she responded, because she's a she, yeah, in my book.
In your book. A tariff is a tax or fee
that a government imposes on imported or exported goods. But
most importantly, ChatGPT says tariffs can impact the price

(01:41):
of goods, trade relationships, and the global economy.

Speaker 1 (01:45):
Oh my prophetic soul. Can and do.

Speaker 2 (01:50):
ChatGPT was onto something there, because Trump's tariff announcement
definitely shook up the global economy and trade relationships, and
everything is still evolving. I mean, there's basically a new
update every hour. Just this week, tariffs on about ninety
countries went into effect. Then Trump issued a pause on
most of them. But the country that's been pummeled the
most is China. As of Thursday midday, Trump has increased

(02:12):
tariffs on the country's imports by one hundred and twenty
five percent. So with everything going on, the technology industry
has been feeling some effects.

Speaker 1 (02:21):
There was a headline in the Washington Post earlier this week
that was delicious in its understated irony: Big Tech bet
on Trump. It's still waiting for the payoff.

Speaker 2 (02:29):
I'm just thinking about when we first started this version
of tech stuff. All of those tech bros were at
the inauguration, and now they're all scrambling to rethink their
supply chains.

Speaker 1 (02:37):
But it's also about the threat of reciprocal tariffs. And
in that Washington Post story, the writers quip that those tech
giants are in another front row, LOL, not the front
row at the inauguration: as targets for US trade partners looking
for ways to strike back at the US economy.

Speaker 2 (02:53):
Yeah. I heard at one point the EU is considering
tariffs on digital products like Netflix subscriptions and Google Cloud storage,
which I honestly didn't even know was possible. But another
area of concern for the tech industry is how these
tariffs will affect semiconductors, because they really do power the
modern world. Everything from consumer tech, data centers, even cars,

(03:16):
they all use semiconductors.

Speaker 1 (03:18):
And here's where the kind of ironies continue to abound, because,
of course, you know, President Trump has made AI supremacy
a key element of his policy for this term, and
so technically the tariffs announced an exemption for semiconductors, but
as it turns out, many of the semiconductors imported are
actually bundled into other products, like GPU chips and servers

(03:41):
to train AI models. That's per Wired, who have a
story under the headline Trump's Tariffs Are Threatening the US
Semiconductor Revival. Wired also points out that all of the
machinery and the underlying materials to manufacture semiconductors here in
the US will become far more expensive with these tariffs,
making it less attractive to manufacture domestically.

Speaker 2 (04:02):
Yeah, you know this is a little bit heady, but
needless to say, it is a consumer tech story. You know,
there are many articles circulating about how these tariffs could
affect the price of something as ubiquitous as the iPhone.

Speaker 1 (04:13):
Have you been stocking up for your eBay sales?

Speaker 2 (04:15):
Of my iPhone? Oh to sell iPhones? Oh yeah, it's
not a bad idea. Actually, I hadn't thought about it,
you know. And just to give a shout out to
four A form Meta who we love, you know, they
pointed out that on Apple's own supply chain website. The
big beautiful bold text that overlays the video of people
making iPhones in a factory says, designed by Apple in California,

(04:37):
made by people everywhere. And it's true, you know; as
a device it exemplifies globalization. The materials for the batteries come
from one country, the display from another. Almost every part
of the iPhone comes from a different country, and then
they are predominantly assembled in China. So if things go
the way they're going and tariffs on Chinese imports remain,

(04:59):
these phones could get a lot more expensive. And I'm
going to get the burner phone that I plan on
getting this summer anyway.

Speaker 1 (05:06):
Okay, good. Yeah, well, this will be another inducement for
us to get dumb phones. But that might be
enough for us on tariffs this week. I feel this
is probably something we're going to be coming back to again
and again. So, time for a game.

Speaker 2 (05:17):
I have been sitting on my hands for this entire
show as we talk about tariffs, to play a game
with you that came out last week for April Fools.
I will catch people up a little bit. Last week,
Tinder launched an in app game called The Game Game.

Speaker 1 (05:32):
The Game Game. It sounds a bit like that seventies
dating show, The Dating Game, with a sprinkle of my
hero Neil Strauss.

Speaker 2 (05:39):
I think they were definitely going for seventies dating show.
I don't think they were going for a sprinkle of
Neil Strauss. But that's your drama. So basically, OpenAI's
GPT-4o and Tinder partnered together to create this
thing called the Game Game, which allows real Tinder users
to enter pretend scenarios and talk to AI characters.

Speaker 1 (05:59):
This was a meet cute between ChatGPT and Tinder,
good one if you will.

Speaker 2 (06:04):
It was a meet cute between Altman and Tinder. Absolutely,
But there's also a competitive part of the game, which
is that as you talk into your phone, you try
your best to flirt with the AI, and you get
points for how suave or empathetic or interesting your responses are.
And here's how it works. You enter a preferred scenario,
like, I'm on a train and my shoe's untied, and

(06:28):
a man says, Miss, your shoe's untied, and I say, Sir,
I'm into women. That would be the end of that.
But no, if I were straight, he would say, Ma'am,
your shoe's untied, and I'd look up, and it would
get... no. But our producer Tory actually played it and
she talked to a character named Nathan, who was interested

(06:50):
in technology and had a Southern accent. But after their
conversation ended, Tinder told Tory that her replies were charming,
but that her conversation could have flowed a little better,
as she jumped between topics too quickly. Now, as a producer,
that's a great quality.

Speaker 1 (07:05):
Yeah, exactly, short attention span. I think Tory may have
had it right and the app wrong, but
I would love to hear you try it.

Speaker 2 (07:15):
My phone is ready and Tinder is downloaded. So, let's
play The Game Game. So what happens first is the prompt
comes up in the game. It's in the Tinder app.

Speaker 1 (07:30):
Well, I mean, let's not pass over this soundtrack. I'm
loving it.

Speaker 2 (07:35):
It's fantastic.

Speaker 1 (07:36):
This is like old New York.

Speaker 2 (07:39):
We're going to Cafe Carlyle for a drink and a smoke. Okay,
I'm done, but let me read you the prompt so
we can start the game.

Speaker 1 (07:48):
Okay.

Speaker 2 (07:50):
And by the way, it lets you re-roll, so
if you don't like the person you're introduced to, it
gives you another prompt, which I don't think should happen.
But that's how Tinder works,
that's how GPT-4o works. At a
bustling dog park, your curious dog eagerly runs up to Penelope,
a customer service representative, and starts playing with her dog.

Speaker 1 (08:10):
How does your dog know what Penelope's job is? Oh,
you find that out later.

Speaker 2 (08:14):
My dog's like, whoa, she's in customer service,
you should chase her down. And so my dog
goes up to Penelope's dog and starts playing with her dog.
This unexpected encounter creates the perfect excuse to strike up
a conversation with the one and only Penelope from Customer Service.

(08:35):
Let's go, I'm not rerolling, I'm taking my first card.

Speaker 3 (08:40):
Well, hey, there looks like your dog's got quite a
bit of energy, don't they. My pup here is pretty
friendly too. Seems they're getting along just fine.

Speaker 2 (08:50):
What's their name? My dog's name is Astrid Pastrid.

Speaker 3 (08:58):
Huh, that's a fun name. Seems like Astrid is
a real social butterfly. Does she usually make friends this quick?

Speaker 2 (09:06):
Honestly? No, your dog is the first dog she's ever
taken any interest in. Well, ain't that something special?

Speaker 3 (09:14):
Looks like they found the money today.

Speaker 1 (09:16):
Must be a good match.

Speaker 3 (09:18):
Maybe Astrid knows a kindred spirit when she sees one.

Speaker 2 (09:21):
So how often do you come to the park? I've
tried to go to the dog park more now because
Astrid's gotten kind of crazy inside, so have I. I mean,
post COVID it's really hard to get outside even now.
Why I hear you. It's been tough all around.

Speaker 3 (09:37):
The fresh air and space can do wonders for both
you and Astrid. Charleston's got a lot of beauty if
you take time to notice it.

Speaker 1 (09:45):
You know.

Speaker 2 (09:45):
Pene... sorry, Penelope, I'm so sorry to cut you off.

Speaker 3 (09:49):
No worries at all, I understand. If you want
to chat about the park or just need a friendly ear,
I'm here.

Speaker 2 (09:57):
Take care of yourself and Astrid. Penelope, would you be
interested in either giving me or getting... did she go away?
Oh, you're still here, Penelope? Sorry. So, Penelope, I want

(10:18):
another go, but she's just like... girl.

Speaker 1 (10:19):
You have... no, no, I like the way you dropped
your voice a little bit.

Speaker 2 (10:24):
Wait, do you want to try it? We do
have things to do on the show, but that's really
how it goes. ChatGPT is kind of incredible. So guys,
that is The Game Game.

Speaker 1 (10:35):
I love The Game Game. I'm also curious as to
why both your and Tory's matches had Southern accents.
Is this a subtle kind of White Lotus-ification of society?

Speaker 2 (10:46):
Where GPT-4o is just like, you know what,
they're getting Southern girls and that's about it, or Southern boys.
In the case of Tory, Nathan was southern. I live
in New York and one of the things that I
saw come up and I was talking to Penelope was
plus plus empathetic. Now, if I was talking to a New Yorker,
I would have been like, yo, girl, what's up. I'm

(11:06):
talking to Penelope. I'm like, well, girl, would you like
to meet me at the park again?

Speaker 1 (11:12):
I thought that your lying about how Astrid had never
approached any other dogs was... I mean, that was, you
gotta make them feel special.

Speaker 2 (11:21):
I hope this podcast never comes out. I love that
more than anything. I will be playing that all day
and I think I will, by the end of it
have a Southern accent, just to get a little bit
more serious about this story. You know, the Washington Post
reached out to the vice president of Product Growth and
Revenue at Tinder, and she said that the game is
meant to be silly and that the company quote leaned

(11:43):
into the campiness. Apparently, though, she went on to call
gen Z a socially anxious generation, and while the game
might be cringe, it's a generation that might look past
that if it indeed leads to a real connection.

Speaker 1 (11:56):
I have to say, I mean, it was definitely fun
watching you play. I had never before myself, and I
didn't just now either, but I've never had a conversation
directly using my voice with an AI before. Was that
a first for you, or...

Speaker 2 (12:08):
Only when I tried to scam my cousin, actually.

Speaker 1 (12:11):
Yeah, yeah?

Speaker 2 (12:12):
So crazy is the pressure cooker that that just created
for me. It felt like there was literally a gun to
my head that was like, flirt.

Speaker 1 (12:19):
That's what it felt like. It's getting hot in here.
So we're going to take a quick break when we
come back some more headlines. Now to pivot back to
the headlines, We've got a few more for today, continuing

(12:41):
the theme of sex, death, and money. Well, no death, thankfully,
but we've had sex in the form of flirting now
for money taxes.

Speaker 2 (12:52):
We know it's tax month, and one of the stories
has to do with two things you never want to
hear put together, which is IRS and hackathon. And of
course what does this start with the Department of Government
Efficiency is planning to stage a hackathon event. I sound
sad because I am is planning to stage a hackathon

(13:14):
event with the best engineers at the Internal Revenue Service.
According to Wired, DOGE is planning to host dozens of
them in DC to build a mega API.

Speaker 1 (13:24):
A mega API. That's actually what I read. Process that.

Speaker 2 (13:28):
It is a mega API, essentially, which would make it easier
to access taxpayer data across different applications and cloud platforms.

Speaker 1 (13:36):
We don't yet have a lot of details on the hackathon,
but I do hope they keep it tight because the
idea of highly sensitive tax data moving freely between what
may be third-party applications is a little frightening. There is
a broader controversy roiling the IRS. Several officials, including the
acting Commissioner, are quitting over the Trump administration's insistence that

(13:56):
the agency disclose taxpayer information to Immigration and Customs Enforcement.
The IRS has typically kept taxpayer information confidential, even from
other government agencies, and that includes information submitted by undocumented immigrants.
But in a new agreement which appeared redacted in a
court filing, ICE officials can now ask the IRS for

(14:17):
information about people they're investigating or who've been ordered to
leave the US.

Speaker 2 (14:22):
And in a story that takes us elsewhere into a
topic I am personally obsessed with, which is right to
repair laws?

Speaker 1 (14:28):
What does that mean?

Speaker 2 (14:29):
Right to repair laws are basically laws that say
that companies have to provide information to people who buy
things that teaches them how to repair them, so, one,
you're not just buying new things every time they come out,
and two, you're able to actually know how to,
for example, repair a tractor. I was drawn to this
headline from The Verge with the perfect subhead quote, India's

(14:50):
repair culture gives new life to dead tech.

Speaker 1 (14:54):
So we had sex, some money, and we do indeed have...

Speaker 2 (14:57):
Oh dead technology, which is about the least sexy thing
on the planet. There's a rise of Frankenstein laptops in India. Now,
when I say Frankenstein laptops, what do you think?

Speaker 1 (15:07):
Uh, gosh, I guess I think about laptops assembled from
all different parts.

Speaker 2 (15:12):
I thought you were gonna say laptops with two bolts
on the side of them. But yes, they're basically resurrected
computers made with parts from trashed older laptops and other
e-waste. E-waste, meaning trash that is of the electronic
variety. At a fraction of the price, these laptops are
a good option for students, freelancers, or really anyone who
needs to be a part of India's growing digital economy

(15:34):
but may not be able to afford to participate otherwise.

Speaker 1 (15:36):
Yeah, I think I read that you can basically get
a functional laptop from one of these one of these
Frankenstein laptops for around one hundred US dollars, which is
like an eighth of the price of any decent new laptop.
So it's a pretty cool story.

Speaker 2 (15:49):
But these Frankenstein tinkerers don't have it so easy. There
are actually many global tech giants who restrict access to
spare parts or use proprietary hardware, which means people are
going through piles of sometimes toxic trash to get the parts,
and India's government is beginning to discuss right to repair
laws to address this, but progress has been slow.

Speaker 1 (16:09):
Final story for this week is about a question I
find quite fascinating. What will be the iPhone of AI?
Will there be a kind of AI product that becomes
so ubiquitous that we forget what life was like before
it existed? Well, the iPhone designer himself, Jony Ive, or
Sir Jony Ive, is working on it. Over a year ago,

(16:30):
he and Sam Altman, the CEO of OpenAI, began
discussing a device that might bring to life voice-enabled
AI assistants, partly inspired by Altman's well-documented fascination with
the movie Her. So Altman and Ive have this startup together,
io Products, that's raised hundreds of millions of dollars and is
working on some device concepts, including a quote phone without

(16:51):
a screen, although some sources insist that it's in fact
not a phone, so the mystery remains. But this is per
a story in The Information, which is also reporting that
OpenAI executives are considering acquiring the startup io Products. This
would be a move that could potentially bring the AI
giant into more direct competition with Apple. It's not clear

(17:14):
where the negotiation is at the moment, but another of
these types of xAI-X deals is perhaps brewing. Although,
while Altman worked closely with Ive on the project,
it's not clear what his economic stake in it is.

Speaker 2 (17:26):
Maybe, if the new phone is not a phone, it
begs the question of how the next thing that we cover
is actually going to happen in a no-phone phone universe.

Speaker 1 (17:36):
You're right, and our next segment is all about scammers,
and specifically scam callers who famously use phones and the
tech they're using to be more convincing and successful than ever.

Speaker 2 (17:48):
Yeah, And one of the things that I can't stop
talking about on the show and talked a lot about
on Sleepwalkers is how much technological progress and innovation happens
in the sort of seedier parts of society. And then
it's after everyone hears these sensational stories about criminal ingenuity
that the tech is more widely adopted by the general public.
But it's actually the illicit use that forges the way.

Speaker 1 (18:11):
Yeah. I remember, back in twenty nineteen, when we first
started covering this stuff together, there was a study that
revealed that more than ninety five percent of all
deepfake videos on the Internet were non-consensual porn.

Speaker 2 (18:22):
Well, I actually didn't even know that 3D printing
was a consumer tech until I heard that blueprints for
3D-printed ghost guns were circulating on the internet.

Speaker 1 (18:32):
Together, we actually ran an experiment a few years ago
to create a deepfake of your voice and scam
your cousin, and it took us about a week to
make that clone with the help of a company called
Lyrebird, that was subsequently acquired by Descript, the software
that we use every week to make our podcast. We
didn't actually get to the scamming part, but we did
briefly trick Kara's cousin. That was back in twenty nineteen.

(18:55):
Since then, the state of the art and the kind
of social risks have really advanced. And here to tell us
more is Julie Jargon, the family and tech columnist at
the Wall Street Journal. Julie, welcome to Tech Stuff.

Speaker 4 (19:07):
Thank you for having me.

Speaker 2 (19:08):
So, just to begin, your article tells the story of
a woman who gets a terrifying call. Can you tell
us a little bit more about what happened in this exchange?

Speaker 4 (19:18):
Yeah, absolutely So. There was a woman in Colorado by
the name of Linda Rohan, and she was just at
home one night making herself dinner, and her phone rang
her cell phone and the caller ID showed that the
call was from a local number, so she thought it
might be someone she should talk to, so she picked
it up and immediately heard a voice of a young
woman that she thought sounded exactly like the youngest of

(19:39):
her three adult daughters, a panicked, you know, message: Mom,
I'm okay, but something awful has happened. And she's sobbing
and saying she needs help. And that immediately put this
woman on high alert. And then apparently a man took
the phone and mentioned the name of her daughter,
you know, by name, and said that she had witnessed

(20:00):
this drug deal and she screamed and it scared the
buyers away, and so now he was out all this money,
and he had pulled this girl into his van and
now was demanding money.

Speaker 2 (20:10):
We know that this wasn't Linda's real daughter. Where was
Linda's daughter actually during this.

Speaker 4 (20:15):
She was in her apartment the whole time, safe at home.

Speaker 2 (20:18):
And can you talk a little bit about how this happened.

Speaker 4 (20:21):
I think what happens with these kind of callers is
they operate on fear and a sense of urgency. And
this scammer had an elaborate story that he kind of
kept this woman through this whole time. He told her
that he needed money in order to free her daughter.
He told her to go to Walmart and wire money.

(20:43):
And so she gets in her car and finds the
nearest Walmart, and he timed how long it took her
to get there. When she went to the Walmart, he
wanted to be on speaker the whole time, so he
had her conceal her phone in her shirt so he
could hear the conversation. And you know, I think he'd
made some kind of threats to her and you know,
her daughter. And when she got to the Walmart, she

(21:05):
couldn't do the wire transfer because she didn't have a
debit card. So he told her to go home and
do it online. And he said, you've got sixteen minutes.
If you stop anywhere, I'm going to know, because he
knew how long it had taken her to drive there
in the first place. And he kept her on the
phone this whole time, talking to her, trying to keep
her calm, asking her questions. This whole scenario played out
for a long time, and she made not one, but

(21:27):
two money transfers online in order to obtain her daughter's freedom.
And once it was finally over, she called her daughter
and found that her daughter was safe in her apartment.

Speaker 1 (21:40):
It was such a striking story because it had this
kind of cinematic quality. I mean, it's actually like a movie.
The guy is playing a version of her daughter's voice,
making her literally drive from a to b, having her
conceal a phone in her clothes. I mean, this is fifteen,
twenty, thirty, forty-five minutes, all the while she thinks
that her daughter has been abducted by a drug dealer.

(22:04):
I mean, what did that do to the mother? And
when you were interviewing her, what did she
reflect about the experience?

Speaker 4 (22:09):
Yeah, she described it as something that she can still
feel viscerally. Like, she retold the story to me three
times over the course of a few different conversations with her,
as I went through the story again and again with her,
and I could tell each time I talked to her
that she felt really nervous and worked up about it,
even though she knows it was all a scam, even
though she knows that her daughter was never in any
actual danger, but this whole ordeal was so terrifying to her.

(22:33):
And then that's of course why these scammers are so effective,
that they prey on the fear of people thinking that
they have a loved one, especially a child who might
be in some sort of danger. So, even though she's
now more than a month removed from the situation, still
in the retelling she feels very like physically nervous and scared.

Speaker 1 (22:50):
Well, that's not surprising, because the tension was kind of
ratcheting up. And then just when she thought that she'd
made the payment and everything was okay, there was a kind
of another turn of the screw, right, right.

Speaker 4 (23:00):
She thought it was kind of over. She'd made one
transfer of a thousand dollars, and then there was a
commotion and the man on the phone came back and said, well,
you know, my boss is angry that it took so
long to transfer this money, so we need more. My
boss is mad and he thinks he could sell your
daughter for thirty thousand dollars. And then at that point
she hears her daughter in the background screaming like no, no,

(23:22):
you know, please help me, and this woman Linda wanted
to talk to her daughter. She pleaded with this man
to let her talk to her daughter again, and he
said no, but you know, we can end this now
if you send another thousand dollars. So then she wired
another thousand dollars through a different wire service, and at
that point it was finally over.

Speaker 1 (23:43):
It's this kind of incredible intersection of both a new technology,
like the ubiquity of deepfake voices, and a
tremendously sophisticated psychological hack, right, I mean it has both
elements exactly.

Speaker 4 (23:54):
And these kind of imposter scams have been going on
for a long time. I mean years ago, we'd be
hearing about grandparents getting calls from someone that was
claiming to be their grandson. But they usually didn't have
a name, they didn't have you know, the voice was
like just any young man, you know. And so it's
kind of using the same type of social engineering, but
ramped up in a more technological way that makes it

(24:17):
all the more believable.

Speaker 2 (24:24):
When we come back, we'll hear about the way generative
AI makes imposter scams so convincing. Welcome back, So, Julie,
I have a lot of friends whose grandparents this has

(24:47):
happened to, and it preys on this sort of psychology
of oh my god, my grandchild is in trouble, let
me help them, without really thinking about how possible it
is that this is actually going on. This to your point,
is like an extremely ratcheted up version of this, and
it begs the question how exactly does something like this work?

(25:08):
How has it gotten so much more advanced? And I
think most importantly for this show tech stuff is like,
how were these people able to replicate Linda's daughter's voice?

Speaker 4 (25:19):
Well, what we don't know here is whether they in
fact cloned her voice from some publicly available audio, you know,
whether her daughter had YouTube video out there or some
other type of audio or video that they could have
grabbed her voice from. She's twenty-six, so chances are
she could have. I didn't find any social media
accounts that I could access for her. But there are

(25:39):
other ways that you can approximate the sound of someone's voice.
There are a bunch of apps that are free or
very inexpensive on the different app stores that allow you
to change your voice. A fifty-year-old man could change
his voice to sound like a twenty-year-old woman,
you know, and you can change the dialect and the accent,
and those can be pretty convincing. And the experts I

(26:00):
talked to, both psychologists and cybersecurity experts, said that you
know when you're in this moment of fear, and you've
already gotten this idea in your mind that your daughter
is calling you. The first thing they're saying is mom,
your mind immediately switches to one of your children. And
so if they're able to approximate a voice of a

(26:21):
twenty year old woman, then your mind might immediately think
that that is your daughter, when it may not be
her actual voice or clone of her voice. So in
this case, we'll never know how they either got a
clone of her voice or whether they use some sort
of generative AI to create a voice that sounded like
it could be her daughter. And then a couple of

(26:41):
the tip offs here is that she wasn't able to
interact with the daughter. There were just these clips of
sound playing. She didn't have a conversation.

Speaker 2 (26:51):
This is exactly what we did with my cousin and
what she had said to me that was so interesting
about this is the thing that tricked her was not
that I had this like incredible deep fake, but it
was the context. So she didn't really question the fact
that it was me, not because it really sounded like me,
but because of the context of our conversation. She called
me and I picked up, So why shouldn't she think

(27:11):
it's me right exactly.

Speaker 1 (27:14):
So what is the scale of this problem?

Speaker 4 (27:16):
It's really huge. The Federal Trade Commission said that the
number one category of fraud last year was imposter scams.
So that doesn't mean that they're all AI generated, but
scams in which people are calling, texting, emailing, whatever, impersonating
someone that someone knows with some sort of story and

(27:38):
a request for money.

Speaker 2 (27:40):
This is something that is incredibly advanced for people who
we'd often call petty criminals. Does that mean that the
technology has become so ubiquitous that it's very accessible by
people we would call petty criminals. It's no longer the
thing of like, oh, I'm going to get on the
subway and pickpocket someone for the amount of money that
you might be able to get for this kind of scam.
So I mean not that I think you have a

(28:02):
criminal mind, Julie, but I'm wondering, from your perspective as
someone who's now reported on this, is this the kind
of crime that people who are looking to scam people
are engaging in? Is it because it's so easy?

Speaker 4 (28:15):
Yeah, it has become a lot easier because of the
ubiquity of these tools that can do voice clones or
AI-generated voice approximations of people. All you have to
do is google it and you'll find dozens of online
tools that are either free or very very inexpensive. Or
go on the app store and download a voice changing app.

(28:35):
So it's widely accessible, it's inexpensive, and all it takes
is one person who sends two thousand dollars and however
long this scenario went on, maybe thirty forty five minutes
or whatever, you know, they got two thousand dollars. So
you get a few victims over the course of some
period of time, and the payout can be pretty sizable.

Speaker 1 (28:57):
Yeah, I think. I mean there's a financial cost to
your point about you know, the aftermath for Linda. There's
also this tremendous emotional costs, I mean, the trauma of it.
I saw a documentary the other day produced by Bloomberg
about young teens who are basically the victims of sextortion scams.
So somebody pretends to be somebody in their community and
gets them to send nude photos and maybe they're looking

(29:19):
to get, I think, hundreds of dollars, but in some
cases this pushes the teens to suicide. And so it's
not like, yes, you have a sense of violation if
you get pickpocketed on the subway, but this goes
to your core of your deepest fears, and I guess
maybe that's one of the reasons why your story went
so viral. But what can we do and what can
listeners do? What can readers do? What is the way
to make ourselves a bit more robust in the face

(29:41):
of this.

Speaker 4 (29:41):
Well, I do have a column coming out this weekend
that will have tips, so I don't want to preempt
that, but there are things you can do, so.

Speaker 1 (29:49):
Stay tuned, read all about it. Yeah, yeah, read all about

Speaker 4 (29:51):
it when it comes out. But yeah, I mean, I
think just awareness, for one, is a major thing. And
I think that's why so many people responded to this,
Because you'd talk to anybody and someone knows someone to
whom this has happened or something you know very similar,
and that shows the scale of the problem. And I
think what's unfortunate is that people who are victimized by

(30:13):
these scams feel an incredible sense of shame and embarrassment
about it. You know, after their mind has calmed down,
they can easily go back and see the red flags.
And, you know, even when Linda was experiencing
this from the beginning, it crossed her mind that this could
be a scam, but she felt like the stakes were
too high to just hang up and call her daughter
at that point, because she thought, you know, there was

(30:34):
that one kernel of like what if, what if my
daughter really has witnessed the drug deal and is in
the back of this person's van, and now her life
is in my hands.

Speaker 2 (30:42):
And you don't want to be the mother who avoided
this because you think that you're being sort of tech savvy, right?
And that's when it really speaks to the sort of
core emotional piece of these kinds of scams. To
Oz's point, it's not just pickpocketing. With pickpocketing,
you can say, oh, I should have closed up my
jacket better. But this is something that is just so
much more complicated than that. And it also

(31:03):
I think has a dual use. I just wanted to bring
that up quickly: a lot of these technologies are
not just created for bad, right? So it's not something
that can just be kind of wiped out. Deepfake
technology also has some really interesting applications that I think
we all benefit from now. So it becomes that sort

(31:23):
of complicated intersection of, like, some people are using this
to take advantage of people and scam them, and other
people are using it to make some really interesting practical applications.
So I don't know, I'm curious to see your column,
but it's definitely less simple than just "get rid of
this technology."

Speaker 4 (31:42):
Well, it's not going away, that's for sure. There are
obviously good uses of generative AI, and it's definitely here
to stay. And my worry is, as it gets better
and better, especially with video, I just wonder, at some
point, will people be able to receive FaceTime calls,
right, video calls, where they feel like they're seeing and

(32:02):
interacting with someone who looks just like their child? Now, how
do you then tell that that's not real?

Speaker 1 (32:09):
Julie, Thank you so much for your time today.

Speaker 2 (32:11):
Thank you, Julie. We'll look forward to that column this Saturday.

Speaker 4 (32:14):
Yeah, thank you for having me.

Speaker 2 (32:22):
That's it for this week for Tech Stuff.

Speaker 1 (32:24):
I'm Kara Price and I'm Oz Woloshyn. This episode was
produced by Eliza Dennis, Victoria Dominguez, and Adriana Tapia. It
was executive produced by me, Kara Price, and Kate Osborne
for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer
is Bihit Fraser, and Kyle Murdock mixed this episode and
also wrote our theme song.

Speaker 2 (32:44):
Join us next Wednesday for Tech Stuff: The Story, when we
will share an in-depth conversation with Jen Statsky, creator and
writer of the hit HBO Max show Hacks. We'll chat
about whether AI is coming for her job and what
it's like to make TV knowing you're likely battling for
attention with a

Speaker 1 (32:59):
second screen. Please rate, review, and reach out to us at
tech Stuff podcast at gmail dot com. We love hearing
from you.
