In September 2021, following strong pushback from the US government, lawmakers, and parents, Meta (formerly Facebook), Instagram’s parent company, was forced to halt the development of Instagram for Kids, a version of the app for children under 13.
The pushback was based on leaked internal research revealing that Instagram contributes to rising rates of anxiety and depression among teenage girls.
However, the global tech company pushed back, claiming that an exclusive photo-sharing platform for children would help parents better supervise their children and control their experience instead of relying on Instagram.
An official comment from Instagram revealed that parental permission would be required to join the app, which would carry no ads and show only age-appropriate content.
The public pushback might seem surprising, given that YouTube, Facebook Messenger, and TikTok already have similar children-only platforms, while apps without separate versions offer safety and parental control tools instead.
This is no coincidence. Social media companies whose platforms were not built with children in mind have realised that it is almost impossible to retrofit fail-safe underage controls that give parents a meaningful level of oversight. Besides, children are getting internet-enabled phones at ever younger ages.
On a macro scale
The approach to addressing the issue of online safety for children — a global concern — differs across platforms. Nearly all social media platforms have age restrictions of at least 13 years to use their platform. These restrictions are supported mainly by countries' data protection regulations.
All these restrictions are necessary to avoid inappropriate and age-sensitive suggestions popping up on children's timelines. Also, they would protect them from Internet predators. But this does not necessarily protect young users from other issues that might affect them during their initial exposure to social media — like an addiction that can precede anxiety, depression, and other mental health challenges.
When we discussed the highly addictive nature of social media during one of Techpoint Africa’s team bonding sessions, most of the team opined that just as children below a certain age should be protected from addictive substances, they should also be kept away from social media.
Opinions remain divided on such moves, but China’s regulatory authority has implemented rules requiring social media and gaming companies to discourage young people from spending too much time on their platforms.
Douyin, a video-sharing app similar to TikTok, does not allow users below 18 in Youth Mode to use the platform for more than 40 minutes at a time. The app also becomes inaccessible between 10 p.m. and 6 a.m., and it screens content.
Meanwhile, a new set of rules was recently released for gaming platforms, allowing young users to play games only between 8 p.m. and 9 p.m. on Fridays, Saturdays, Sundays, and official holidays.
While other countries might not take this route, they often find legal means to check the activities of these platforms. In Instagram’s case, US lawmakers stood their ground.
A cursory glance at a report that surveyed nearly 3,000 parents of teenagers in the US shows that parents are aware of the dangers of having their children online, but they either don’t know how to manage those dangers or admit in resignation that social media is a necessary evil.
“How do you want to keep track of your child’s online activities without getting drained or getting them riled up to the point of finding loopholes to get back at you?” complains one bemused parent.
Platforms and companies can establish restrictive rules, but parents and guardians have a part to play, and ignorance of what to do just won’t cut it.
Don’t be a clueless parent
Before exploring the provisions that different social media apps put in place to keep children safe, it is necessary to state that it's best if parents open social media accounts for their children.
With that, they can effectively define settings that suit their children’s ages since these platforms only activate specific security restrictions when age is stated.
Parents should understand any app their children want to use. Where necessary, they should download the app and familiarise themselves with what goes on there.
Facebook/Facebook Messenger
Facebook is one of the oldest surviving social networking platforms globally. As of October 2021, with over 2.9 billion users globally, the platform ranked first on the list of the most popular social networks. Attempting to protect a teenager with a mobile phone from the lure of such a platform is probably an exercise in futility.
For one, a Facebook account cannot be created for anyone under 13 years old. For teens 13 years and above, parents should activate the following settings during account creation:
- Change the default audience from “Public” to “Friends” under the “Audience and visibility” option in the settings menu. With this, only friends can see their activities.
- Restrict who can send friend requests from “Everyone” to “Friends of friends” under the “How people find and contact you” option. Here, you can also remove them from public search; this will stop their profiles from popping up in a Google search or appearing as a suggestion to anyone with their phone number or email address.
- Also, review who can follow, tag, comment on, or share their posts by changing the settings to “Friends” or “Friends of friends”.
Meanwhile, Messenger Kids, a child-friendly, under-13 version of Facebook’s instant messaging platform launched in 2017, allows parents to decide who their children chat with.
To use the Messenger Kids app, the child does not need a Facebook account or a phone number since activation is only possible via an adult’s account.
After installing the Messenger Kids app on a child’s phone, a parent needs to authenticate it with their Facebook account before creating a mini-profile for their child.
The parent controls who becomes friends, hence who the child chats with. Even though parents control sending and accepting friend requests, they cannot spy on their children’s chats.
Instagram

The Meta-owned photo-sharing app, Instagram, is another social network that has caught the fancy of young people. Given several reported cases of online abuse on the platform, online safety precautions need to be taken, especially for young users.
To keep minors and all its users safe, the platform has several safety features — particularly 'Tools to fight bullying'.
To manage those who interact with your children, set the profile to 'Private account' in the 'Privacy' settings menu. With this, interaction will be limited to followers only. Profiles of users younger than 16 are automatically set to private.
You can also control comments, tags, mentions, direct messages, and other unwanted interactions.
YouTube
Like other platforms, YouTube does not allow people under 13 to open accounts, unless, of course, they misreport their age.
For teens and tweens, parents can supervise usage, change the video privacy settings, and disable autoplay. They can also take their children through these safety settings.
In 2015, Google introduced YouTube Kids to give younger users child-friendly content. YouTube's algorithm directs uploaded videos appropriate for children to the app.
Because a few cases have been reported where some offensive videos slipped through YouTube's filters, parents still have to monitor app use and report offensive content.
Snapchat
Snapchat has been around since 2011 and is popular for sharing temporary videos and images.
Unlike posts on the other platforms considered here, Snapchat posts, called Snaps, disappear after 24 hours. Direct messages also disappear after the recipient views them. However, the ephemeral nature of its content does not guarantee safety; if anything, extra care should be taken.
One of the ground rules on Snapchat's Terms of Use is that anyone under 13 is not permitted on the app.
An excellent way to help teens use the app safely is to discuss staying off explicit content and unfollowing any account that makes them uncomfortable.
Also, parents can adjust the safety settings and set location sharing to 'Ghost Mode'. With this, only friends can see their Snaps or send them direct messages.
Here's how to enable Snapchat's Parental control features:
- On the profile page, tap the settings icon and change the 'Contact me' option to 'My friends'. Select the same option under the 'Who can' section to manage who can see their Snaps.
- To make the account private so that they don't get suggestions to subscribe to channels or appear in other users' suggestions, uncheck all the boxes in the 'See Me in Quick Add' section.
- To deactivate location sharing, check 'Ghost Mode' under 'My location' settings.
TikTok
With over 2.6 billion downloads, TikTok is one of the most used social media platforms globally. Like every other social media platform, it has its safety issues.
The video-sharing app also has provisions for young users, although, like other platforms, age has to be reported in the account settings to activate the accounts.
The default account setting for users younger than 16 is 'Private account', which means videos will only be visible to followers, and direct messaging is disabled. Older users can activate this setting manually in the settings menu.
For children under 13, TikTok takes them to TikTok Kids, where they are exposed to curated content; sharing, commenting, and messaging are deactivated. In some countries, parents can link their child's TikTok account with theirs. Once the 'Family Safety Mode' is enabled, parents can manage screen time, restrict content, and supervise messaging, following, commenting, etc.