- Meta has announced new Instagram accounts for teenagers that automatically include safeguards limiting who can contact them and the content they see, along with new ways for them to pursue their interests.
- Teens under 16 will need parental permission to weaken any of the built-in protections in Teen Accounts.
- While Meta will gradually roll out the accounts in the United States, United Kingdom, Canada, and Australia over the next 60 days, it plans to launch Instagram's Teen Accounts globally in January and expand them to other Meta platforms in 2025.
Meta said the change aims to better support parents and give them peace of mind that their teens have the protections they need.
“We recognize parents are concerned that their teens might see mature or inappropriate content online, which is why we have stricter rules around the kinds of content teens see on our apps.
“We remove content that breaks our rules and avoid recommending potentially sensitive content—such as sexually suggestive content or content discussing suicide or self-harm.”
The platform has made teen accounts private by default, a setting that applies to both existing and new accounts. Existing users under 16 and new users under 18 must approve new followers, and non-followers cannot view or interact with their content.
Teens on Instagram will have strict messaging settings, allowing messages only from people they follow or are connected to. Their accounts will automatically limit exposure to sensitive content on Explore and Reels. Additionally, they'll get reminders to log off after 60 minutes of daily use, and sleep mode will activate from 10 PM to 7 AM, muting notifications and sending auto-replies to DMs.
To tackle age misrepresentation, Meta now requires teens to verify their age in more places and is developing technology to identify teen accounts even when they list an adult birthdate. This will help ensure those teens receive the same protections as others with Teen Accounts.
In January 2024, Meta revealed that it would begin hiding search results for terms related to suicide, self-harm, and eating disorders, directing users instead to expert resources for help.