
April 23, 2025 • 44 mins

From humble beginnings as a dating website to one of the largest platforms in the world, YouTube has experienced a meteoric rise. However, that rise is not without issues. We comment on the comments, sexism, advertising, creators, recent controversies, and the general ecosystem of YouTube in this classic episode.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey, this is Annie and Samantha, and welcome to Stuff I Never Told You, a production of iHeartRadio, and in our continuing kind of look back on some technology things,
I wanted to bring back this classic about YouTube because.

Speaker 2 (00:31):
I feel like a lot of.

Speaker 1 (00:32):
Things have changed with YouTube, even though this episode isn't
that old since we talked about it, and also just
hearing about how some of the creators we've interviewed are using it differently, I'm just really intrigued and I want to do a more in-depth dive into some of the things that are going on with YouTube now, especially

(00:57):
like I feel like all of our technologies are in flux.
I just want to come back and talk about this.
But I have noticed some things personally. I know I've
talked about this before, but I've noticed.

Speaker 2 (01:08):
Some really.

Speaker 1 (01:10):
Really gendered ads that are shocking to me. So we'll
have to come back and revisit that. But in the meantime,
please enjoy this classic episode. Hey, this is Annie and Samantha,
and welcome to Stuff I Never Told You, a production of iHeartRadio,

(01:42):
and today we are continuing our tech kick, which feels
like it's never ending.

Speaker 3 (01:47):
To be honest, we keep adding things. You're right.

Speaker 1 (01:50):
We do, we do, but it's all fascinating stuff. And
I will say this one, which is about YouTube, was
very difficult to research because Google owns YouTube. So when
you type in, like, YouTube women, all it is
is like videos from YouTube with women in there, right right,

(02:12):
so tricky.

Speaker 3 (02:13):
Yeah, you know what. TikTok is similar to that. When
you try to find information on TikTok, it takes you
to TikTok and like things about that subject on TikTok.

Speaker 1 (02:22):
Mm hmm, oh yeah, speaking of, we have a TikTok now, yes we do, thanks to Joey, who has been on the show, who is amazing and patient. You can go check us out there at Stuff Mom Never Told You. Yes, yes, yes, yes, yes, so yeah, we're there as well. Do you use YouTube a lot, Samantha?

Speaker 3 (02:47):
So this is the thing. I do use YouTube a lot,
but it's very specific things. I have used it to
listen to music, so way back when before licensing was
a thing and people were really like and YouTube got
really aware of that, I would listen to songs that
I couldn't find. Now, of course Spotify has more songs,

(03:08):
but like some songs you couldn't find on Spotify. Beyoncé songs were not available there, so I would go to YouTube and listen to them there. At night, I like
their deep sleep sounds.

Speaker 2 (03:22):
Ooh, I set.

Speaker 3 (03:23):
That on a timer and they have a lot of
dark screens. So I use it like that. My partner
uses it nonstop. But that's, that's it. Like I don't
do things outside of that, huh.

Speaker 2 (03:35):
I use YouTube quite a bit.

Speaker 1 (03:37):
Actually, it's a thing that I have a love-hate relationship with, and we're going to talk about that, because I do feel like, as a viewer who doesn't pay for, like, the expensive buy, and I don't know, expensive, but I don't pay for the service to not have ads,
I feel like they time the ads at the worst

(03:58):
specific place on purpose. And I've even looked up like
on Reddit, does YouTube do this purposefully?

Speaker 2 (04:06):
There's no real answer yet.

Speaker 3 (04:08):
Yeah, I think, I know it's gotten longer. You used to be able to skip. Now you don't have that option as much. I do want to make a correction, because I do watch Hot Ones, oh yeah, uh huh, and that
is a series I've watched a lot on there because
that's where they originated. Yes, so I love that series
as well as Binging with Babish. Those are the two

(04:31):
shows that.

Speaker 2 (04:31):
I will watch.

Speaker 1 (04:33):
Yes, those are those are pretty great ones. There's a
lot of great stuff on YouTube.

Speaker 2 (04:37):
I watched. I watch a bunch of stuff.

Speaker 1 (04:39):
Actually, it's one of my biggest Like probably every morning
I watch a bunch of YouTube videos, which.

Speaker 2 (04:45):
Is not the case for a lot of people. I've learned.

Speaker 3 (04:48):
No, I don't do that.

Speaker 1 (04:50):
Yeah that is, that is like my go-to, that's
my morning routine.

Speaker 2 (04:56):
Yeah.

Speaker 1 (04:56):
Yeah. Content warning before we get into this one: general grossness, the same thing as in our past tech episodes. If you're like a woman or marginalized person on the internet, you know what I'm talking about. A brief discussion of very disturbing content: pedophilia, sexual assault. We're not gonna get too

(05:16):
much into detail, but just so you know. You can
see our past episodes on Airbnb, Twitch, Evan Rachel Wood and Tech Accountability, YouTube beauty groups, single use, hate accounts, and Meghan Markle. The Amber Heard trial is involved in this, where, Samantha knows, I have a very hilarious, spicy

(05:37):
outline coming up about DC phantom.

Speaker 3 (05:42):
Oh that's fun.

Speaker 2 (05:45):
Yeah, Oh, it's gonna be great.

Speaker 1 (05:46):
Also, we typically try to focus on intersectional feminist issues
in our episodes, for what I hope are obvious reasons,
and I think pretty much everything could be that. But
YouTube has a lot of issues, so we're going to
discuss some of them. But just so you know, there's
a lot going on here that we could get into.

Speaker 2 (06:08):
We didn't have time for this, but there's a lot.

Speaker 1 (06:11):
It was one of those ones where I think I
was like, oh, I'll get this done in two days,
and I start opening tabs, I'm like, oh, no, oh,
been there. And I did want to start with a
pretty brief explanation of my YouTube experience because a good

(06:31):
chunk of my job at, in my current, it's so confusing because we've been acquired by so many companies, but I've essentially been working with the same group of people for a while, was YouTube based, and I ran the Stuff Mom Never Told You YouTube, and so I'm going to be inserting a lot of personal experience in this one.

(06:52):
Off the top, I will say that my bosses, some
of my bosses, treated this much more seriously than I
thought it had any warrant to be treated. Kristen, who was in the videos, and I was filming them and
editing them, well, she filmed a lot of them. I
was editing them primarily. She was told to like wear

(07:15):
makeup and get her hair done, all of this kind
of stuff. We were instructed on, like the best thumbnails
to use, which were generally like quote pretty looking thumbnails
or something so people would click on them. We did
run into a couple of copyright issues, which I'll talk
about more later. As you know, probably a lot of

(07:39):
people know by now because it's become sort of a
conversation topic. There's a lot of videos you watch that
look like it's just vlogging, where they edit and do it themselves.

Speaker 2 (07:49):
Are not.

Speaker 1 (07:50):
There's like a huge team behind it, and they don't.
The team doesn't get credited. I was never credited on anything,
I don't think. And that's just like how it is.
I had to go through YouTube training every year and
it is the most boring thing you can imagine. Oh
and you can't skip it. Oh my gosh. But I
did get an award. I won a YouTube award. I

(08:11):
still have it. It's pretty cool. I mean it looks
pretty cool. On the not-so-fun side, I did experience a lot of harassment. One of my very favorite comments I've ever gotten, as I've said before, was, I know the producer and she's a slut.

Speaker 3 (08:26):
Cool? Wow?

Speaker 2 (08:29):
Yeah, whatever. And also I did get doxed.

Speaker 1 (08:32):
I got doxed based on a YouTube video that we posted that wasn't that inflammatory. And then accessibility was a thing I was really passionate about when I was doing those videos. So I would listen to the video and
type out the closed captions, and that was something I

(08:52):
kind of did and didn't get compensated for, but it
was really important for me to do it. But that
is a big issue with YouTube, and that's a
big issue with the show, and we've tried to get
transcripts with middling success because for a while we had them.
Before you and I were hosts, there were transcripts, and
now they are not, so that's an issue as well.

(09:16):
But yeah, that's kind of my brief encapsulation of my time.
Like I said, I have a bunch of comments throughout this one. But okay, let's
start with a very quick rundown of the history of YouTube,
which I found kind of surprising. So YouTube, which is
an online video sharing platform that allows for commenting, liking,

(09:36):
and disliking, sharing, making playlists, all kinds of things. I
feel like you know what YouTube is. It was founded by Chad Hurley, Steve Chen, and Jawed Karim in early two
thousand and five. A little over a year later, Google
purchased YouTube for a staggering one point sixty five billion dollars,
and it started.

Speaker 2 (09:57):
As a dating website.

Speaker 1 (09:59):
Yeah, So, the idea was users would upload videos of
themselves talking about their dream partners. The slogan was Tune In, Hook Up, and I was struck once again by how humans', and especially dudes', desire for sex really runs so much stuff.

Speaker 2 (10:20):
It's wild.

Speaker 1 (10:22):
The dating aspect failed even after they offered women twenty
dollars to upload videos, but the video uploading system was excellent,
so they decided to open it up to any video.
And they decided this in part because after the two
thousand and four Janet Jackson and Justin Timberlake Super Bowl incident,

(10:42):
they couldn't find video of it anywhere, so they thought, hey,
we could be the place people put videos like that
and you can watch them. Me at the Zoo, a short video of one of the creators at the zoo, was YouTube's first official video.

Speaker 2 (10:57):
Yeah.

Speaker 1 (10:58):
Several studies have found that the reasons people use YouTube
are, quote, information seeking, sharing information, status seeking, like that whole first thing, social interaction, and entertainment.

Speaker 3 (11:10):
Just by the way. So yeah, YouTube is huge. Behind Google, it is the second most popular website in the world. As early as two thousand and six, the site was getting about twenty million monthly visitors, and one billion hours of content are consumed daily as of twenty twenty three, with users about fifty-six percent male to forty-four percent female. That same year, two thousand and six,

(11:33):
the first targeted ad campaigns by major companies took off, and Time featured YouTube as Person of the Year. And in twenty eighteen, YouTube made twice the revenue of any major television network. Numbers from twenty nineteen found that five hundred hours of content were uploaded each minute, and the company makes an estimated fifteen billion dollars in revenue annually,

(11:58):
although a lot of that goes back to the creators. Twenty-seven percent of Americans rely on it
as a news source.

Speaker 1 (12:06):
Yes, and honestly, there are so many numbers we could
throw at you, so many awards we could throw at you,
and moments in history about YouTube. Like, did you know Rickrolling started in two thousand and eight? Gangnam Style was the first video to pass a billion views, in twenty twelve.
But the basic takeaway here is it's a big deal

(12:27):
and it makes a lot of money. It has also
been the source of a lot of conversation and controversy
since its founding. One of the big things that has
hounded YouTube is copyright issues. How they handle copyright, what
you can upload, what will get you in trouble, will get.

Speaker 2 (12:45):
Your videos taken down.

Speaker 1 (12:46):
When I was working there, it was a three-strike system, but it was very Wild West in what you could get away with and what random thing would get you in trouble. Sometimes you would get mistakenly flagged or
a video would be taken down without warning or any
clear reason, which may or may not have had to

(13:07):
do with their massive copyright library.

Speaker 2 (13:09):
So basically they have like this huge.

Speaker 1 (13:13):
library that just, like, searches for, oh, you're playing this copyrighted item, take it down. But it does make mistakes. In my experience, it made some mistakes, so probably it's improved, I should say. It's been a good, yeah, about
like seven years since I've been doing this job, so

(13:34):
I bet a lot of things have changed.

Speaker 2 (13:35):
Since I've been doing it.

Speaker 1 (13:36):
Another big problem YouTube has has to do with their
recommended videos function, because there have been reports about it
pushing things like conspiracy theories or lies, promoting violent or
sexual content to children, pedophilic content, and things like we
discussed in Bridget's episode about Evan Rachel Wood, that YouTube

(13:56):
is making money off of a video she alleges depicts
her sexual assault and they won't take it down even
though she said, like hey. In twenty nineteen, YouTube announced
they would recommend fewer videos that quote could misinform users
in harmful ways. This came after rising concerns linking platforms like YouTube to offline violence, death, and radicalization. This

(14:19):
of course led to heated discussion around free speech and
confirmation in some conspiracy theorist minds that they were being censored,
like see here's the proof.

Speaker 2 (14:30):
Right right.

Speaker 3 (14:31):
So this was after a Google engineer posted a ten-page manifesto in twenty seventeen criticizing the company's diversity initiatives, claiming they discriminated against white men. That led to the YouTube CEO's daughter asking, mom, is it true that there are biological reasons why there are fewer women in tech and leadership? But many pointed out that the engineer may

(14:54):
have been in part radicalized by YouTube and its algorithm.
He was fired and became an alt-right hero.

Speaker 1 (15:14):
As far as the disturbing content served to children goes,
these videos often feature beloved children's cartoon characters in violent
and/or sexual situations. One such video featured a woman with a Minnie Mouse head getting stuck in an escalator and bleeding profusely. It got millions of views in one day and could be viewed in the kid-friendly mode.

(15:36):
Other videos depict Peppa Pig tricked into eating bacon, the suicide attempt of a Paw Patrol character. A lot of these videos are discovered through autoplay or the recommended video sidebar,
so after you've watched legitimate content, you find these videos.

(15:56):
Some tests found that a toddler went from viewing one
of his favorite legitimate videos to a video depicting stomach parasites,
eye gouging, and kids setting each other on fire in only a few clicks, and that video had twenty million views. And that's part of the problem: these videos

(16:16):
get pushed because they have so many views, and that's how YouTube's algorithm works, in part. YouTube hosts millions of hours of children's entertainment, and much of this content makes.

Speaker 2 (16:29):
A lot of money.

Speaker 1 (16:30):
YouTube claims it's difficult to change their algorithm, and they largely remove these videos on a case-by-case basis, which isn't going to fix the problem. It also puts more pressure on the guardian to view the media with the child, when, you know, that's something
you often do to keep a child busy while you

(16:52):
do something else. On top of that, it's kind of complicated reporting a video in YouTube Kids, and there's no way to make sure it doesn't show up again. You can delete it from your history, you can do all kinds of things, and it still might show up again. YouTube
has been releasing improved reporting tools and methods for video
blocking in an attempt to improve the situation. There are

(17:13):
plenty of articles out there about what steps you can
take on your own if this is something you're worried about,
but again, that's sort of putting the impetus on you
to do that. In twenty nineteen, several articles reported on
pedophilic content running alongside ads for major companies, and the
videos showed children in their underwear and/or their genitals,

(17:33):
with what appeared to be pedophiles timestamping content in the comments,
basically so other users could click the timestamp to go
to the part in the video where nipples were exposed
or something like that, and then recommending other similar videos,
sometimes exchanging numbers to exchange more videos with each other.

(17:55):
These videos have thousands, if not millions, of views, hundreds
of comments, and yes, they are being monetized. You probably
heard about this because the advertisers were not happy to
learn their ads are running on this content. But to
be clear, a lot of these videos, though not all
of them, are pretty innocent, like girls playing Twister, and confused

(18:20):
girls who uploaded the video will respond to comments like how old are you, they'll answer, or ask what a comment or what a word means. Many of the comments are about the child in question's beauty, or claim that they're in love with them, and sometimes they request specific lighting or outfits. For a while, searching twister girl on YouTube autocorrected

(18:43):
to little girl Twister and skirt.

Speaker 3 (18:47):
Yeah, so the recommendation system was part of the issue too,
serving up other videos seemingly enjoyed by pedophiles. On top
of that, while some channels have been taken down over child abuse, there are plenty, as of writing this, still up dedicated to quote preteen models, girls bathing or doing

(19:08):
stretches or yoga, swimming, things like that. YouTube enacted a policy disabling comments on videos where the comments in question are overwhelmingly inappropriate, but even so, the algorithm still would
serve up these videos alongside others teeming with pedophilic comments.
A lot of the comments aren't in English as a

(19:29):
way to get around the disabling, but several of them are.

Speaker 2 (19:33):
Yes. Yeah.

Speaker 1 (19:35):
Another issue: advertising, because YouTube has a changing and not
very transparent policy on what type of content and videos
can be monetized on their platform, which is primarily how
creators make money.

Speaker 3 (19:50):
Yeah. YouTube claims it's ninety nine percent effective at not
running an ad over inappropriate content, but advertisers have pulled
their ads after they ran against videos containing things like
rape apologism, anti-Semitism, and terrorism. In twenty seventeen,
many big advertisers told YouTube they would end their relationship
if the platform didn't fix the issue. The solution decided

(20:13):
upon was that YouTube would work more directly with advertisers
to make sure their ads were only placed on the
desired content.

Speaker 1 (20:20):
Yes, and a quick aside here, my experience was you
had very little control over the ads on your videos.

Speaker 2 (20:26):
As a creator.

Speaker 1 (20:28):
You could flag topics you didn't want running, like cigarettes,
but what was served on your video could be fairly
random to downright offensive given the content. And this is
an issue we encounter pretty frequently as a feminist podcast
and what advertisers think of when it comes to women.
But basically it was pretty random with a very difficult

(20:50):
reporting system, especially if you didn't have a specific deal
with a sponsor. On top of that, this was how
you made money, from a random pool of ads, again,
if you didn't have a specific sponsor, and plenty of
advertisers didn't want to run ads on feminist videos talking
about abortion or honestly feminist videos in general. Anyway, YouTube

(21:12):
announces it's going to work more closely with advertisers to
prevent something like these things we've been talking about happening again.
One of their largest creators, PewDiePie, who had had preferred
advertising status, lost it after he posted a video with
anti-Semitic language and imagery in it, and basically preferred

(21:33):
advertising is like, oh, that's like top tier, you want to advertise with this

Speaker 2 (21:37):
creator. On the flip side.

Speaker 1 (21:40):
Creators were worried about how this policy and demonetization would
impact their own revenue. Advertisers could opt out of specific videos,
meaning if a channel about news wanted to talk about
a tragedy, ads could opt out, which is a financial
incentive not to talk about darker things that we need
to talk about. I do understand that it's strange, like

(22:03):
the whole Applebee's Ukraine invasion thing, where CNN was running this disturbing imagery and then there was this very in-your-face Applebee's commercial. It is weird, and
we've had these discussions ourselves about our darker episodes where
we don't want specific ads playing in episodes about sexual assault,

(22:23):
for instance. But at the same time, we have to make money, so it's just, like, a strange situation.
Creator Philip DeFranco reported an eighty percent drop in revenue
after the policy was enacted and argued that new creators
would feel the worst of it. To address the issue,
YouTube attempted to make it more transparent to creators what

(22:43):
videos were being monetized and which ones weren't. There is
a process for requesting a review about why a video
isn't being monetized, but it's tedious and not without human
bias because it requires a human to watch the video
and then kind of decide like I don't know. On
top of that, these reviews were prioritized for bigger creators.

(23:06):
YouTube did update the algorithm later and the update decreased
demonetization by thirty percent. And this is something I remember
working with YouTube is every time a major company, not
just YouTube, would change their algorithm or policy, we would
have to have a very serious meeting about it. Like we
would sit down and be like, what does this mean?
What do we have to do with our keywords? Like
what all of these things? And also, yeah, we just

(23:30):
kind of experienced a meeting like this because Sminty may
or may not be uploading episodes to YouTube soon of
like our whole episodes, and yeah, we had
to talk about it.

Speaker 3 (23:43):
We had to have a whole breakdown of what we're
afraid of and what we want to avoid, including having control over ads, because we try really hard to monitor that, whether we want to or not, because there's
some things that we really miss out on, like ah dang,
who loved that sponsorship?

Speaker 2 (24:01):
You know what?

Speaker 3 (24:01):
I mean, but we want to be very aware and
it's hard to do that on something like YouTube when
it's so, it has its own thing. God, each social media platform is so hard to learn because they're all so different.

Speaker 1 (24:15):
Yeah yeah, but I mean it's so different, but a
lot of the same issues.

Speaker 3 (24:20):
Right, the same issues, but like different standards. It's so weird. Okay,
But then the reports about pedophilic content broke and the
company was once again in crisis mode to keep advertisers
from bailing. Then Logan Paul, yes, one of the most popular vloggers, posted a video of a man who had killed himself, and he was stripped of his preferred advertising status,

(24:43):
And in the wake of this, the platform announced new
policies detailing which creators were even eligible to make money.
And by the way, he's still one of their top
money makers and just had a whole controversy happen because
a pig that he had for clout has been shown as being severely abused. Anyway, creators now had to have
over four thousand hours of watch time over the course

(25:05):
of a year and one thousand subscribers to receive ads.
Small creators were understandably hurt and outraged and there was
a whole host of emotional videos posted about the decision.
A YouTube official said of the policy, quote, changes will
affect a significant number of channels. Ninety nine percent of
those affected were making less than one hundred dollars per

(25:26):
year in the last year, with ninety percent earning less
than two dollars and fifty cents in the last month.
Creators call the day the policy went into effect the
demonetization day. Yeah, that makes sense. Smaller accounts waited up
to a year to have their accounts reviewed to
see if they could get ads. Multichannel networks dropped huge

(25:50):
amounts of small creators. Numerous creators quit. A lot of
the smaller or demonetized content is or was created by
marginalized folks or around less commercial topics, and many argue
that YouTube is losing what made it YouTube.

Speaker 1 (26:06):
Yeah, because like it's different now, but when it started,
it was like when I was reading the history of YouTube,
they had the first time a trailer from a company
appeared on YouTube, Like, it didn't.

Speaker 2 (26:17):
Used to be what it is now.

Speaker 1 (26:18):
It used to be like a much more kind of
hodgepodge random assortment of videos from various creators, and now
it's you know, it's got a lot of like movie
trailers or clips from these big companies, music videos, things
like that, which I don't think there's anything necessarily wrong with,
but if you're losing all of these small creators, then

(26:40):
it's just like the scale is tipping in one way.
And just for the record, Sminty had hundreds of thousands
of subscribers, tens of thousands, if not millions of views
on every video, and we made less than one hundred
dollars a year.

Speaker 3 (26:56):
Right, I think you still have that. We think we still have, like, around almost two hundred and fifty thousand subscribers, which is hilarious because there's not been a video we've posted since, I think, Bridget and Emilie introduced themselves as the new hosts. I think that's the last video.

Speaker 2 (27:11):
Yeah.

Speaker 1 (27:12):
Yeah, it's been a whole process finding out who owns
that channel.

Speaker 3 (27:16):
Now, now that's a whole different story.

Speaker 1 (27:19):
It is, it is, But I just wanted to put
that in there because like, we had the backing of
a pretty big company.

Speaker 2 (27:26):
There's a whole team dedicated.

Speaker 1 (27:28):
To this pretty much, and we got a lot of
views and we didn't make that much money. So but yeah, okay,
so a lot of the things we've been talking about
do impact women, but let's talk about women and YouTube specifically.
There are conflicting numbers when it comes to gender differences
in viewership on YouTube. One source reports sixty two percent

(27:48):
of users are male and seventy eight percent of men
use it in the US, compared to thirty eight percent
and sixty eight percent for women. Google's own numbers claim
that it's more like fifty to fifty, or at least
closer to that. Younger people are more likely to use YouTube,
and location, urban versus rural, is a big factor as well.

(28:10):
In twenty eighteen, YouTube CEO Susan Wojcicki said one of the reasons for tech's lack of women is because of its reputation as being quote a geeky male industry, which did not go over well. Fighting words.

Speaker 2 (28:26):
Yeah.

Speaker 1 (28:29):
One of the ways these moves around advertising has been
theorized to impact women is around beauty. Both the pressure on the creator to put on makeup, wear nice, perhaps revealing, clothes, which are things that are time-consuming and
often expensive, but also to make videos about beauty, like
makeup tutorials, because they know that those can be monetized,

(28:52):
they might be pressured to use a sexualized thumbnail, like we were. There's nothing inherently wrong with these videos,
these makeup tutorials. I know people who love them, but
if women are being pressured into doing them for financial
reasons and being pressured to look a certain way, that's
an issue. We've discussed before how the beauty industry has

(29:13):
pushed standards to sell products, telling women they have to
look a certain way, and that is partially at play here. Yeah,
because you know, makeup can be great. We just need
to be clear, We need a clear picture of what's
going on here because a lot of women reported feeling
like they had to do this because it was the
only way they could get ads. Many women creators report

(29:36):
criticism around their appearance, their weight, clothes, and makeup. Further,
a lot of the top channels created by women have
to do with stereotypical, more feminine topics like cooking, and
there's a bunch of research about this, and again it
seems to be that they feel there's nothing wrong with it,
but they feel like this is the only way they

(29:57):
can make money, which I think there is something wrong with.

Speaker 3 (29:59):
That, right right, Although I mean we could come back
and have the conversation about how also, if you're a
woman creator, you can only be for women. Yes, so
that's I mean we run into that a lot. So yeah,
that's a whole different conversation. And by the way, women
are more likely to be subjected to trolling comments and harassment,
even stalking and threats, and it increases with every intersection: race,

(30:23):
sexual orientation, gender identity, disability, and having pain dismissed, not
looking sick enough, not looking pretty enough. Wait, there was
even a study specifically on YouTube comments that showed gender
differences in how we talk about intoxication. Women are often
sexualized in fell videos. Women creators have reported fear after

(30:43):
accounts have posted hate videos about them and called for
their followers to attack them, and this travels onto other platforms.
By the way, I've seen this, it's really interesting. In
twenty nineteen, YouTube rolled out updated harassment policies, but creators say they haven't mitigated the issue. A twenty twenty two report
found that harassment against women is not only alive and

(31:06):
well on YouTube, it's flourishing. Women creators who have gone
viral have described the deluge of harassment they encounter. Some
even cited the Amber Heard Johnny Depp trial as emboldening
misogynistic creators and allowing them to amass huge numbers of followers,
and that it helped normalize a toxic level of hate
towards women. And I don't think we've talked about this yet,

(31:30):
but the Megan Thee Stallion trial also brought a whole amount of misogynistic trolls that got some notoriety, and similarly, YouTube started de-ranking Meghan Markle hate channels. People like men's rights activist Andrew Tate have amassed millions in revenue, though he was recently banned. But by the way,

(31:51):
he's been making a lot of money on the backs of harassing women, and just recently got banned. Women creators say that because YouTube is monetizing these accounts and not disciplining their top male creators who post misogynistic content, it empowers commenters to harass women. So many women have

(32:14):
left because of this. The report concluded misogyny is alive
and well on YouTube. Videos pushing misinformation, hate and outright
conspiracies targeting women are often monetized.

Speaker 1 (32:24):
Yeah, this is like within the year. Of note, all the studies mentioned that there was a lack of data and research around non-binary folks, so that is
something that's missing.

Speaker 2 (32:50):
And I want to include this.

Speaker 1 (32:52):
These are actual guidelines from Google's anti-harassment policy.

Speaker 2 (32:56):
Quote.

Speaker 1 (32:57):
Here are some examples of content that's not allowed on YouTube: repeatedly showing pictures of someone and then making statements like, look at this creature's teeth, they're so disgusting, with similar commentary targeting intrinsic attributes throughout the video; targeting an individual based on their membership in a protected group, such as by saying, look at this filthy, slur targeting a protected group,

(33:19):
I wish they'd just get hit by a truck. Targeting
an individual and making claims they are involved in human
trafficking in the context of a harmful conspiracy theory where
the conspiracy is linked to direct threats or violent acts,
using an extreme insult to dehumanize an individual based on
their intrinsic attributes. For example, look at this dog of

(33:41):
a woman. She's not even a human being, she must
be some sort of mutant or animal. Depicting an identifiable
individual being murdered, seriously injured, or engaged in a graphic
sexual act without their consent. Accounts dedicated entirely to focusing on maliciously insulting an identifiable individual.

Speaker 3 (34:00):
It has to be the entire account.

Speaker 2 (34:02):
Yeah.

Speaker 1 (34:04):
Well, and so I wanted to include this because it feels like you're trying to teach a child how to, like, behave, right? Like that we have to say, like, hey, don't, and it gives an example, here's an example.

Speaker 2 (34:19):
I was reading it like whoa, Okay, I mean there's.

Speaker 1 (34:25):
A part of me that's like, has there ever been
anybody who read these? Just like, oh, I see, I'll
start doing that, right? But anyway, okay. When it comes to money making, as of twenty twenty two, only one creator on the top ten list of YouTube was not a man, and that was seven-year-old Nastya. In

(34:46):
twenty twenty one, she made an estimated twenty eight million dollars. Uh,
and this is in keeping with the trends of recent
years for YouTube. One woman on the list, but in
twenty eighteen, no women at all made it.

Speaker 3 (34:58):
Wait, she's a seven year old? Yeah, what is her content?

Speaker 2 (35:02):
Yeah?

Speaker 3 (35:02):
Uh, because it's definitely not other seven year olds watching it. Well,
maybe I don't.

Speaker 2 (35:10):
Know, because there's also a ten year old boy on there.
I think I don't know that's concerning.

Speaker 1 (35:16):
Yeah, you've also probably heard of the credibility gap when
it comes to women and science or news content. Many
women creators who are scientists have posted the mansplaining comments they get, I'm sure you've seen some of them or heard some of them, or sexualization or outright harassment that they receive. Viewership on STEM-related videos skews male,

(35:39):
in some cases pretty drastically. This impacts interest in and
potential pursuit of STEM topics and careers, because several studies
have found that engaging in STEM content, especially at a
young age, can ultimately lead to a STEM career. Studies
have also found that, compared to an equally scientifically curious man,
women are twenty six percent less likely to watch a

(36:01):
science video on YouTube. Science creator Emily Graslie famously published a YouTube video in twenty thirteen called Where My Ladies At, where she just read some of the comments her
videos receive. And if you've never watched her videos, they're
pretty sweet. They're kind of like she works in a
museum and she talks about things in the museum, and
it was just all of this, like very sexualized hateful

(36:23):
comments she receives on all these videos.

Speaker 2 (36:25):
Yeah, who knows.

Speaker 3 (36:26):
Yeah, I was thinking about one of the YouTube channels that my partner really, like, gets me to watch, Simone Giertz. Someone's gonna correct me and tell me
I'm completely wrong, because she's really, really famous and her
whole thing is trying to create unusual hacks in her
house through whatever tools she's gotten. She's gotten huge, so

(36:48):
she's gotten really cool tools, and I think one she
did was a chair for her dog to sit next
to her at, like, a desk chair, so something big, and then creating something with tampons, I forgot already. But like, she has a lot, and she is technically a tech slash STEM creator and has millions of viewers,

(37:09):
so I would be interested to see what her comments
look like with her popularity.

Speaker 2 (37:13):
Yeah.

Speaker 1 (37:14):
I think one of the things that really disheartened me
in this research was that back in the early days
when Kristen and I were doing YouTube, we worked with a lot of big women on YouTube because it's like how you kind of cross-promoted and got more viewers,
and a lot of the women we worked with who
had like millions of views and followers left YouTube like

(37:36):
they would show up in a lot of the articles I was reading, more like here's why I left, and they're like big creators, and so I'm just imagining, like, the smaller creators. And also, didn't you, you showed me that video, Girlfriend Reviews? Yes, and now they had to, like, respond to a lot of hate.

Speaker 3 (37:58):
Yeah, last year too. Hey, yeah, yeah, they had to respond to a lot, a lot of hate in general. But as a family, they're doing huge stuff on Twitch. But the whole premise is pretty much the girlfriend is watching her boyfriend play these games and does commentary, and they grew it to a huge success and are, I think, living off of the content as content creators because they

(38:20):
have made sponsor deals. But yeah, they had to go
through some things with several video games, including The Last of Us Part Two, because they loved it, much like you. We always have to bring it back to The Last of Us or Star Wars somewhere in here.

Speaker 2 (38:40):
Yeah.

Speaker 3 (38:40):
So part of the issue is production budget and staff. Statistically, male-run shows are able to secure more funds and staff. Another is that women who are scientists often have families and have more of the share in taking care of children,
which means yes, less time for actually making content and videos.

Speaker 1 (38:59):
Yeah, which is something we heard during the pandemic with why there were fewer scientific papers turned in by women. And also, have you ever heard the saying never read the comments? Tell me.

Speaker 2 (39:14):
That I said it to you a million.

Speaker 1 (39:19):
Well, researcher Inoka Amarasekara, and I apologize if I butchered your name, I could not find a pronunciation anywhere, but they did read the comments, and they came out the other side with a published paper about sexism and YouTube. They looked at over twenty-three thousand YouTube comments, specifically

(39:42):
comparing the treatment of men and women when it came
to science content, and no surprise, the women were treated
more harshly: fourteen percent negative comments versus six percent for men.
I've said it a million times on the show, but
we once did this experiment at work where we had
a male host and a female host do the exact

(40:03):
same video and we compared the comments and it was
very clear it was very stark, which one they believed
was smarter, and all of the physical comments that the
woman got, and yeah, the women in the study got
a lot more sexually charged comments or comments about their
appearance in general, and the way YouTube comments work is

(40:26):
the most controversial ones rise to the top. And in fact,
I never personally would do this, but Kristen would sometimes
be like, let's do something really controversial because it'll generate.
It'll make you go to the top faster and make
you get more comments. And that's just how YouTube's algorithm works, which,
as I believe, one big YouTuber was like, this does

(40:47):
not facilitate healthy, helpful conversation, right.

Speaker 3 (40:53):
I feel like that's for everything, including news, like getting the worst and most controversial stuff out there first because that's going to get the attention, which sucks. Yes. So YouTube is trying to combat this, to varying degrees of success. In twenty eighteen, they launched the first hashtag Women to Watch as part of the Next Step program. In the

(41:14):
company's own words, this was the first time they had
focused on empowering female voices. In twenty sixteen, they formed
a yearlong partnership between women creators and the UN to advocate for gender equality, and other organizations have been
trying to fight this as well. In twenty fourteen, the You Coalition, now Uplift, formed together to combat sexual abuse, emotional manipulation,

(41:38):
and other forms of violence in the YouTube community.

Speaker 2 (41:41):
Yeah.

Speaker 1 (41:42):
Yeah, I mean with every tech topic we do, because
of the show we are, we focus on the negative.
There are a lot of positives about YouTube. It can
be a great place to find community, a great place to share ideas, to educate, all kinds of things.

Speaker 2 (41:58):
But this is I just there's so much fluctuation.

Speaker 1 (42:04):
I can't say for sure the situation is getting better,
like people are talking about it, but and it's also
kind of it annoys me in so many of these
tech episodes where it's like the creators are the ones
making YouTube money and they're not getting paid anything and

(42:27):
are forced to leave or I don't know. It's just
like there's a lot of things that need to be figured out with this situation.

Speaker 3 (42:36):
I think in general, content creation and content creators are such a new thing. We've talked about it before, we've
talked about influencers and the good and the bad.

Speaker 2 (42:44):
But yeah, it is.

Speaker 3 (42:45):
There's a lot of things that the law, policies, and
just human rights things haven't caught up with as fast
as this medium has become.

Speaker 2 (42:53):
Right, it is complicated.

Speaker 1 (42:56):
Like to YouTube's credit, I get that it's complicated because
you know, you don't want an advertiser to be angry
about what their ad is running on. But you also
don't want a creator to not be able to make
money and leave. But right now it's not working, at least that's.

Speaker 2 (43:12):
My view of it.

Speaker 1 (43:15):
So I would love if anybody listening has more recent
experience with YouTube, has any thoughts or resources or numbers
anything like that, because this was one where I kind
of got overwhelmed. There's a lot more we could talk
about with this, and a lot is changing. Some of the surveys I mentioned, even though they were recent, I

(43:35):
bet it's shifted. I bet the conversation has shifted since then.
So if there's anything like that, you can email us at Stuff Media Mom Stuff at iHeartMedia dot com. You can find us on Twitter at Mom Stuff Podcast, or on
Instagram and TikTok.

Speaker 2 (43:53):
At Stuff I've Never Told You.

Speaker 1 (43:55):
Thanks as always to our super producer, Christina. Thank you, and thanks to you for listening. Stuff I Never Told You is a production of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
