On Tuesday, March 10, 2026, Nigeria’s Minister of Communications, Innovation and Digital Economy, Bosun Tijani, announced that the government has launched a public consultation on policies aimed at improving child online safety, including possible age limits for social media use.
The consultation invites Nigerians, including parents, educators, young people, and digital professionals, to share their views on how the country should regulate children’s access to social media platforms.
The announcement marks the start of the consultation phase. The government says it wants to build an evidence-based policy, meaning the survey results will be analysed before formal laws or technical requirements are drafted.
The consultation explores several possible approaches, including introducing minimum age requirements for social media accounts, implementing stronger age-verification systems, increasing platform accountability, and expanding government oversight of digital platforms.
According to the Minister, the goal is to create a framework that balances the benefits of Internet access with the need to protect children from online harm.
“While the Internet provides opportunities for learning, creativity, and communication, it also exposes young users to risks such as cyberbullying, harmful content, online exploitation, and misuse of personal data,” he said.
There is also growing concern over engagement-maximising features in social apps, which have been linked to rising mental health problems among teenagers. Moreover, existing laws, such as the Nigeria Data Protection Act, were not designed to address the hyper-targeted grooming or AI-driven misinformation that children face today.
For now, the government is still gathering public opinion before deciding on the next steps. If implemented, the policy could become part of a broader digital regulation led by the Ministry of Communications, Innovation, and Digital Economy.
How other countries regulate social media for children
This policy discussion also comes as governments around the world rethink how social media platforms should operate, especially regarding younger users.
Victoria Fakiya – Senior Writer
Techpoint Digest
Nigeria is not alone in considering tighter controls on children’s use of social media. Several countries have also introduced or proposed similar policies.
For example, Australia introduced a ban on social media use for users under 16, requiring platforms to enforce age restrictions. France requires parental consent for users under 15 to create social media accounts, and Denmark is planning to ban users under 15 as part of its child online safety efforts.
On March 6, 2026, Indonesia announced its own plan to deactivate accounts belonging to children under 16 on high-risk platforms starting March 28. Malaysia is following a similar path, aiming for full implementation by the end of 2026.
In some countries, such as China, regulation goes even further. Social media and digital platforms often require real-name registration tied to the national ID system, and minors can face restrictions on when, and for how long, they can use certain online services.
While no African country has yet implemented strict measures to regulate children’s use of social platforms, several are developing plans of their own. Like Nigeria, South Africa is debating age restrictions on social platforms, and Egypt’s parliament is drafting legislation to set a minimum age for social media accounts.
These approaches reflect a global shift toward stricter oversight of social media platforms as governments attempt to balance digital access with child protection.