Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey everybody,
welcome back to the Mindful
Bytes podcast.
Speaker 2 (00:02):
Today we're going to talk about TikTok for kids and the new teen Instagram accounts.
Speaker 1 (00:08):
And with that, let's go ahead and dive on in. Hey everybody, thanks for tuning in to the Mindful Bytes podcast. Don't forget, you can check out the show notes and use the chapter markers to skip around to topics that you really want to hear about. Let's go ahead and introduce the panel.
Speaker 2 (00:24):
I'm Olivia, your social media savvy millennial.
Speaker 3 (00:28):
I'm Ashton, your Gen
Z tech enthusiast.
Speaker 1 (00:30):
I'm Brian, your Gen X
business leader, slash digital
guy.
Speaker 4 (00:34):
And I'm Shauna, your Xennial digital dinosaur.
Speaker 1 (00:37):
Shauna, tell us about your topic today. What do you want to talk about? You said TikTok — is that right? And is it TikTok and kids?
Speaker 4 (00:44):
Okay, that's what I said — TikTok for kids, because that is what this app is being called by a lot of different media outlets. It's been around for several years, but I personally just started hearing about it on ads for my podcast. It's called Zigazoo, I think it's called —
Speaker 1 (01:03):
Zigazoo, zigazoo, I
think it's called Zigazoo.
Speaker 4 (01:05):
Zigazoo, yeah. Zigazoo Kids is what the app is called, and they're branding it as TikTok for kids and a safe social media outlet for kids to use. It's branded as being for ages eight and up, but I don't think
(01:28):
they actually restrict any ages. Just, anybody who's under 18 has to have their parents' permission. So they have this whole kind of process where the parents have to verify that they're an adult through an email situation, and then it allows a kid to make their profile.
And there are —
(01:49):
They talk a lot in their ads about safety, security, no bullying, things like that. So you know, at first glance I thought, wow, this sounds really interesting and really a good option, possibly. So the first thing I want to do is tell you some of the pros. Then I want to ask you guys if you see any red flags or any
(02:11):
things to be concerned about, and then talk about the cons. So here are the pros. I did a lot of research through third party websites to find out a lot of this information.
I had heard the ads so much on my podcast that I could have recited it for you, and so I heard a lot of how they're
(02:32):
marketing it, which helped me start to see some of the red flags myself. But okay, here we go. Pros: more parental control. I already told you about the process of how they have to create an account.
(02:53):
This, I think, is a great thing. Every video (it's mostly video based) is reviewed by a human to look for any kind of bullying, any kind of negative speech, anything like a child saying their personal information — if they say any of their personal information, the video is not approved. So that's a good thing. And I want to say, like, I think they're coming from a good place.
This app is developed by parents and teachers to
(03:15):
take out all of the negative things about social media for kids and to increase all of the positive things, so I think that's a good mission. It's highly educational in nature, as far as they have challenges and things, and a lot of times they are based around doing an experiment or going to learn about something or going
(03:36):
to read something. Then the kids do a video talking about that or showing their experiment. So that's how they're trying to increase the positive things about social media for kids: they want to get them active, become creators and not just consumers.
There is no text commenting, so I think this is really smart.
(03:57):
You know, we've all heard about how damaging negative comments can be to kids — bullying, criticism, things like that, especially from strangers who have no reason to be gentle in their criticism. Kids can comment on each other's videos with stickers or
(04:18):
emojis, and they're all Zigazoo's own creations, like their stickers and their emojis. Okay, so what are some red flags you see right off the bat? Or what you think is good about this too — I want to hear that too. I don't want to be completely negative on these people and their app, but I just want to have a conversation about it.
Speaker 1 (04:41):
I do like the idea. If we're talking about the pros, I like the idea about the challenges, how they're more, like, educational based. Cause I think — Olivia, wasn't it? What was it on Snapchat? You said that the kids kind of become like, oh, the streaks — they don't want to lose their streak. And I love that this is kind of like the streaks on this.
Speaker 4 (05:03):
What is it called?
Speaker 1 (05:03):
again? Zigazoo. Zigazoo. Wow, I can't believe I remembered that. I can't remember a real name, a basic name, but Zigazoo — gotcha, gotcha covered. You can call me out as your affiliate influencer. No, but I like that target demo. I like that their challenges are educational and it's encouraging kids to do that, and I do like that.
(05:24):
I was concerned about the commenting and stuff like that, because one thing that was a red flag to me is — okay, it's great, I wrote down that it's cool that these are all going to be reviewed by a person. But I also ask, inside, how long will it be reviewed by a person?
Speaker 4 (05:41):
Yeah, because at some point —
Speaker 1 (05:42):
if it blows up, they're going to start leaning towards something — that's just my thought — they're going to lean towards the technology to do that. My concern is this is an app that you have to be very careful of when it comes to kids' apps. As we know — and they're probably no doubt already thinking about this — you have, you know, pedophiles and stuff like that. They have to be aware of it.
(06:03):
This could draw them to this app, and they could set up a fake account and then approve that fake account as an adult. So it's like, what things do they have in place for that? Because you don't want to open up a door where they have a door to these children of this age range. So that would be a big concern for me. Just having the custom stickers and emojis made by them
(06:26):
could help too, that they can't just put a normal comment in there. But do they have communications behind the scenes or anything like that?
Speaker 4 (06:35):
So I did see that you can also comment with a video. So that does kind of open a door for bullying, I think, although maybe those comments have to be, you know, reviewed too.
Speaker 3 (06:46):
That's probably what it is, I would assume they would.
Speaker 4 (06:50):
So they said — I read that there is a feature where, every so often, the child has to post a picture of their face to prove that they're a kid and to prove that they are who they say they are. I think there's probably plenty of ways around that, but it's an interesting idea.
(07:11):
I think it was probably a smart idea, yeah.
Speaker 1 (07:14):
Yeah. Cause I remember there was an app I was using that — remember? Oh, actually it was somebody on Facebook that texted me. They sent me a private message, and you guys have probably all got them. You're like, this person is sending me a friend request and they're sending me a message in chat, but I'm already friends with them. Has anybody else experienced that before?
Speaker 4 (07:33):
Yeah.
Speaker 1 (07:33):
And then we're like — so there's one person that did that, and Shauna, you'll probably remember. I said, can you prove to me that you are who you are? And they sent a picture of their license — like their personal license — just the picture of them holding it. And I said, oh, that's great, but just to be — I want to make
(07:54):
sure this is a real person and this is my friend — I would like you to take that picture again with the license flipped around. So the person that was messaging me literally took the photo and they flipped it multiple times. They didn't actually hold the ID a different way, they just kept flipping the picture, and I was like, sorry, you're not the real
(08:16):
person, because they obviously were trying to get around it. But that is a good thought, though.
Speaker 2 (08:22):
Yeah. Number one, I think it's a good idea. But my red flag is: why does a kid, eight years old, even need something like this? That's the one thing that I — you know, when it comes to other social platforms, you have to be at least 13 to get on them.
(08:44):
But I'm like, why does the eight, nine, ten-year-old — there's just no need for it, in my opinion. But also the same thing as Brian's red flag. I'm like, yes, a human is looking at them, but every human makes mistakes. And what if one video gets through
(09:07):
that's detrimental to everyone and everything? So, yeah, that's —
Speaker 3 (09:16):
Those are my quick red flags. I was kind of thinking along the same lines — like, why even risk it with that young of an age, especially when things are so impactful? Like, basic principles haven't been set up and stuff, so allowing influences like that could be dangerous. Brian, in the way that, like — it's not a great business model, basically.
(09:47):
It's not a solid business to have every single video reviewed by a human. That's not viable. The second it gets bigger, they have to hire so many people, and in the end it's a free social media, isn't it? It's a free app. Yeah, like, it's going to be so hard to keep that company in the
(10:11):
green.
Speaker 1 (10:12):
But I don't know. Well, I mean, even if it is in the green — let's say this app blows up — there's going to be people that want to buy that app, so this app could transition to somebody else's hands too. So there's a lot of things there. But I think that, you know, Olivia, I think that was a really good question. And I know, like you said, it was teachers and
(10:35):
parents that were over this. Is that what you said?
Speaker 4 (10:38):
The ones who started
it, yeah.
Speaker 1 (10:39):
The ones who started it. So I'm guessing — like, I would love to talk to them. What made them decide to start this? Because I'm sure they're probably seeing this age group that wants that. They don't feel part of something, so they're looking for something. They're probably asking their parents, I want to be on social media. But this is the thing. This is the line where us, as parents, have to draw the line and say no — not putting it out there as, like,
(11:04):
no, we don't trust you on social media. We did this with Ashton. He didn't get on social media until he was 18 years old. As parents, it's all right for us to tell our kids no, but I think it's explaining to them why. When Ashton was wanting to be on those things, we said, it's not that we don't trust you on there, it's just that there's a whole community and things that
(11:26):
happen on there that you don't need to be part of, because it's not good for you, it's not healthy, and there's other things that could be life-giving to him.
Speaker 4 (11:33):
I did read about that, and that was one of the reasons why they started it: because kids are wanting to be on social media at a younger and younger age, and they wanted them to at least have a safe place to do it. This was one of my big red flags: in the ad, the mom says, like, now I feel fine letting my kids run loose on
(11:55):
Zigazoo. But then in other places I was reading that it's actually designed to be used with your parent right there with you, and that just seems so unrealistic to me, that a parent's actually going to do that, because there's so many things, you know, that you could do with your kids. Do you really want to sit right next to them as they use a
(12:17):
social media app, and are you actually even going to be paying attention to what they're doing? That seems unrealistic to me.
Speaker 1 (12:22):
I think that's a great thing to point out, Shauna. It's not good if we're using it as a way to escape our kids. And we know what it's like to be sitting in a room with friends that have their phones and they're looking at their phones. This is a precious time with your kids. We know — Ashton is 20.
(12:42):
We talk about it all the time: it goes faster than what you think. So if parents are listening and they're thinking about this, do you really want them in there, engaged on that device all the time, when your time with them is so short, so short?
Speaker 4 (12:58):
And so another thing is that even though, you know, kids can join possibly younger than eight, but definitely by eight, their data is still treated as if they're over 13, which is all very confusing to me. But that has to do with COPPA, the Children's Online Privacy Protection Act.
(13:20):
Because of that act, they have to have certain protections in place for kids to be able to use it under 13, because that's where that act places the line: if you're under 13, there have to be additional protections in place. But yet the app still treats your data as if you're 13
(13:41):
or up, no matter what. So that's all confusing to me. I was starting to read through reviews. They were supposed to be parent reviews, and a lot of them did sound pretty trustworthy. One was talking about how, even though the videos are approved by a person, not every person has the same values, so
(14:04):
that doesn't automatically make everything safe. There were some videos that are kind of — I mean, I don't want to go real far into this, but, like, videos where children are trying to teach other children about, like, furry culture. I think a lot of people would agree that that's not really a safe thing for an eight-year-old to be watching.
Speaker 1 (14:25):
I think that is an interesting thought. I would love to hear what other people listening to the podcast have to say about it. Because, I mean, if you have young children, what are your thoughts about them being on a platform, even a platform like this? These platforms probably all start off with the mindset of, this is safe for kids. But this starts changing over and over, and I think it's kind of like, okay, let's just really look at it from a
(14:46):
business perspective too. Even stuff like in the metaverse, which was 13 and above, dropped to 11. Now they're dropping it to 10. They're always looking at, how can we expand —
Speaker 4 (15:02):
our marketing.
Speaker 1 (15:02):
How can we expand our target audience? I think we talked about it recently, about how we're in the last generations that have experienced life without digital, and the generations that are coming up have only experienced connection through digital. So, with that mindset, how do you expand your platform and
(15:24):
grow your audience unless you start going younger?
Speaker 4 (15:25):
The last thing I want to say about it — and it's kind of going back to what you guys were saying about what is the need — that's what I think is probably the biggest danger of the whole thing: you're setting up a pattern at such a young age of, like, needing the validation of social media, of spending so much time just producing content for someone
(15:47):
else to consume. And I think that can be done responsibly as the child is older, but yeah, it's just such a young age to start making those patterns, you know, those thought patterns. And so I think you're just ushering them in to the adult social media, and
(16:08):
they're going to want to do that at a younger age too, because they've already been doing this. Like, how long is a kid really going to be interested in a kid's social media? You know what I mean?
Speaker 1 (16:18):
So, yeah, and I think it's concerning. It's also like — you think about how much social media, and what people are saying on social media, can influence us as adults. Now put that in a child's mind, right? It's so different.
Speaker 4 (16:31):
One thing I try to tell parents is, like, keep the door shut for as long as you possibly can, because once you open it, it's so hard to take back. That's true with a lot of things, you know. That's true with video games. That's true with, you know, everything, really. Whatever you allow your kid to do, you almost can never take it back.
Speaker 1 (16:52):
And this is why — it's not that you're against it — this is why you're not the favorite co-host to the kids on this podcast.
Speaker 4 (16:59):
I know, I know, and that's all right.
Speaker 1 (17:01):
I should be, because I know. I know, exactly.
Speaker 4 (17:05):
Literally. But yeah, so it's just something I try to think about: you know, are there other things you want to be building in your children at these young ages to set them up well as adults, and does this really fit into your plan for them? And maybe it does. Maybe you're trying to, you know, grow a TikTok influencer.
Speaker 3 (17:27):
Something that kind of just reminded me of — I think it also has to do with subject. I kind of remember, this was a really long time ago — for me this would have been, like, nine years ago, I think — but I had a Lego social media.
(17:48):
I think it was called Lego Life, and I think it also kind of depends on the subject. If you have something like Lego Life — I don't remember if anything was even moderated on there, I have no memory of that — but I never remember having a negative interaction on there. I don't remember if it was because there weren't comments.
(18:09):
I just remember everybody was always publishing photos of their little figures, their little Lego figures, like, out on adventures. You may remember I had a Star Wars set and I would have them on an adventure — they would walk through all this snow and over our creek and all kinds of really interesting things.
Speaker 4 (18:30):
But I remember.
Speaker 3 (18:31):
I would get, like, hundreds of likes. I think a few of them were in the thousands, and so there definitely were people on there, and I remember seeing all kinds of posts on there, like stop motion movies. I never remember a comment. So maybe, if you are introducing your child that young, maybe it's a social media like that, where it's more
(18:55):
geared towards a certain subject — not open, kind of like YouTube Kids or, um, Zigazoo — is that what it was called? Yeah — nothing like that. Something that's, like, real niched down, because I don't even remember ever seeing a child's face on there. I only remember the Lego minifigures. I think that's
(19:17):
a good thought there.
Speaker 1 (19:19):
Ashton, too — I would even challenge that, to be thinking about, like, if something like that — we still have to be careful of that like button, because now we're teaching kids at such a young age to kind of judge everything they do by those likes. So maybe it's not even having a like button, you know what I mean? Those things can trigger us, those addictions, even at
(19:44):
a young age. All right, so let's go ahead and transition over to Olivia. Okay. So we've been ripping apart social media for kids, so let's go ahead and talk about Instagram for teens. Let's take it up a level. Well, we can all agree —
Speaker 2 (19:54):
Apparently, social
media is the devil.
No, um.
Speaker 1 (20:00):
The dinosaur said so.
Speaker 2 (20:01):
The dinosaur said so. Um, okay. So Adam Mosseri, the head of Instagram, my best friend — I'm always listening to — Are you guys, like, are you guys dating or something? Cause you're always talking about — I am married. He is married with children. Um, we're just besties. Just besties.
Speaker 1 (20:22):
So, uh, don't worry about it, Nate. You're good.
Speaker 2 (20:27):
Well, um, okay.
Speaker 1 (20:31):
It's another podcast
now.
Speaker 2 (20:33):
Yeah, yeah. Adam Mosseri came on his Instagram page and announced that they are now introducing Instagram teen accounts, and what these do, he said, is they automatically place teens in built-in protections and reassure parents
(20:54):
that teens are having safe experiences. So if currently any of your teens have just a normal Instagram account, right now they're automatically moving all teens over to these teen accounts that have these built-in protections. And so some of the protections: teen
(21:19):
accounts will limit who can contact the teens and the content that they see. So they are going to be moderating what is being put out in the feed, and they're also helping to ensure how much time the teen is actually spending on the platform.
(21:39):
So any teens under 16 will need a parent's permission to change any of the built-in protections — if they want to be, like, less strict, where they're like, okay, you're 17, you can manage your own time, or whatever. Anyone under 16 can't change any of the built-in
(22:05):
protections. So they said they're doing this to help parents feel more confident, and things like that. There are more little details I'll give, but I wanted to hear your guys' thoughts first.
Speaker 1 (22:24):
I'm here — I'm seeing red flags again. I'm sorry about this, but I'm thinking red. My day is red, everything's red. But first off — I mean, some of these things, gosh. We are living in this time where — I don't know. It's like we're trusting these tech companies, like how
(22:45):
much they want to let us rest assured that this is a safer environment. They're going to be moderating, they're going to be taking these things. But the question we just talked about earlier was: well, we don't really know who the person is that's making those decisions. We don't know where their ethics stand, we don't know what their beliefs are. So it's like, if we can customize those things, can we
(23:06):
choose some of the things? I mean, we're letting kids make decisions that — even today, kids' minds — Shauna, you've taught me that kids' minds are so underdeveloped. Like, do we want to trust these platforms to make those decisions on what is and is not okay for our
(23:27):
kids to see? I guess that's kind of a struggle with that.
Speaker 4 (23:33):
Honestly, too, for teenagers, I think some of the most dangerous things that they're consuming are things that make them feel like they're less than other people. So fashion, beauty, things like that — there's nothing inherently wrong with those things, but
(23:53):
that's what teenagers look at, and they're like, oh gosh, I'm never going to look like that, I can never afford to dress like that, I can never, whatever. And that's not going to be moderated out, because it's not bad. It's just not good for you to be consuming all the time and comparing yourself, and there's no boundary that anybody can set that's going to change a teenage boy or girl from
(24:16):
thinking that way.
Speaker 1 (24:17):
I would love to see Adam and them reach out — your buddy, Adam. Adam and them.
Speaker 4 (24:22):
Is that what you said?
Speaker 1 (24:23):
Yeah, Adam and them. Adam and them — it's a new sitcom, Adam and Them. I would love for them to put out a list of the things that they are going to moderate. What are the things that they're looking for? What hits those marks? What is considered okay and what's considered not okay?
Speaker 2 (24:41):
Those are, like, both really, really great points. A list would be fantastic. So they said that they're going to have sensitive content restrictions, so teens will automatically be placed into the most restricted setting of their sensitive content control. So this limits —
(25:04):
they're going to not show content of people fighting. It does say that they're going to limit content, or they're not going to show any content, that promotes cosmetic procedures or different things like that. So teens' Explore and Reels pages are going to look a lot
(25:27):
different. They're also going to have messaging restrictions, so you can only be messaged by people that you follow and they follow you back — so there has to be that mutual connection — and teens can only be
(25:48):
tagged or mentioned by people that they follow. They're also automatically turning on the most restrictive version of their anti-bullying features, so hidden words: offensive words and phrases will be filtered out, even of
(26:08):
the DMs — they're filtering those out of DMs and keeping an eye on those. And then teens will get a notification after they've been on for 60 minutes each day, telling them to get out of the app. And then the last one is sleep mode, which is going to be turned on
(26:30):
between 10 pm and 7 am, and will automatically mute any notifications that they get during that time.
Speaker 4 (26:40):
Olivia, let me ask you a question. It's been a long time since I've set up an Instagram account, but do you have to prove your age on there, or do you just choose your birthday or whatever?
Speaker 2 (26:49):
Yeah, you just choose your birthday, so there's an easy way to just make a different account with a different age.
Speaker 4 (26:57):
Yeah. And I mean, I'm sure you've heard of this, Olivia, but it was news to me, and I think Ashton knows about it too. But okay, let's ask Brian. Brian, what is a finsta?
Speaker 1 (27:12):
It's a fence, duh.
Speaker 2 (27:22):
Oh, Shauna. Yeah, okay. So I just came across —
Speaker 4 (27:23):
Was I right?
Speaker 2 (27:24):
Teens lie about their age, and that's why they're requiring them to verify their age. How? They're coming up with a new verification system to verify the age.
Speaker 1 (27:39):
I was going to say, the biggest way is to have the adults have to scan — I mean, if you get verified, you have to take a picture of your actual ID. But kids at that age — I don't know, sorry. Well, I guess they would have an ID. But yeah, go ahead, Shauna.
Speaker 4 (27:52):
If I learned anything from raising a teenager, it's that they will find a way around whatever block you put in the way. And they're so much faster at learning about new technologies and things like that. Ashton has thought of things that I never even knew, never even thought of thinking about.
(28:13):
So I appreciate that they're trying, though I have a feeling that it's probably more so for the optics of it. Like, look, we want to, you know, create a safe place. They know everyone's worried about this, so good on them for doing something. But at the end of the day, I don't think any of those are
(28:34):
really going to make a huge difference. Teenagers are going to get where they want to get.
Speaker 3 (28:38):
They need a group of teenagers to start combating teenagers. They need people just as fast as them, and they need people with good programming skills.
Speaker 1 (28:46):
I think that was a good point that you're sharing there, Shauna, because I think we have to give them credit. They are trying to do something. But I think even as parents, we have to say we can't trust it 100%. Again, we don't know the people behind all this that are running it. But I would love to see — if they can turn on features like sleep mode, where the phones automatically do that, then can they figure out a way where the parents can set that
(29:09):
sleep mode?
Speaker 2 (29:10):
Well, funny you say that — this is my last point. They are also adding a supervision feature. So what this includes is: a parent actually gets insight into the teen's account to see who they're chatting with and the
(29:32):
types of messages that they're sending in the past seven days. They also have the ability to set total daily time limits for their kid's usage. So if they only want their kid on there 15 minutes a day, they can set that up. And they can also block teens from using Instagram for specific time periods —
(29:53):
so they can block their account from, say, 7 pm to 7 am. And they can also see the topics that your teen is looking at and make sure that they're age appropriate.
Speaker 1 (30:12):
So supervision — I'm a fan of that. Sounds good, I like that.
Speaker 2 (30:17):
Yeah. So I mean, I agree with Shauna, and I unfortunately would say this to my bestie Adam's face: I think it's more of an optics thing versus, is this actually going to help with anything?
Speaker 1 (30:34):
I think it's good. I think it's good to be putting something out there, adjusting it, but, like you said, Shauna, kids figure out ways around it. I mean — awesome. Well, hey, that's all we have for today. These were some really good conversations. Again, we want to hear your guys' thoughts, so make sure you click that text link in the show notes. We'd love to hear your thoughts on each one of these topics. All right, if you enjoyed this episode, click
(30:57):
follow and don't forget to leave us a review.