Petition: Introduce 16 as the minimum age for children to have social media

We believe social media companies should be banned from letting children under 16 create social media accounts.

More details

We think this would help:
1. Stop online bullying
2. Stop children being influenced by false posts
3. Stop children seeing content that encourages violence or could otherwise be harmful to their future.

We believe social media is having more of a negative impact on children than a positive one. We think people should be of an age at which they can make decisions about their life before accessing social media applications. We believe we need to introduce a minimum age of 16 to access social media, for the sake of our children’s future and their mental and physical health.

120,096 signatures

Parliament will consider this for a debate

Parliament considers all petitions that get more than 100,000 signatures for a debate

Waiting for a debate date (22 days so far)

Government responded

This response was given on 25 November 2024

The Government is committed to tackling the harm children face on social media and we are working with Ofcom to ensure the swift and effective implementation of the Online Safety Act to achieve this.

I would like to thank all those who signed the petition on this incredibly important issue. It is right that tech companies should take responsibility to ensure their products are safe for UK children, and the Online Safety Act 2023 (the ‘Act’) is a crucial tool in holding them to account for this.

Children face a significant risk of harm online and we understand that families are concerned about their children experiencing online bullying, encountering content that encourages violence, or other content which may be harmful, including some instances of misinformation and disinformation.

The government is aware of concerns over the impact that being on smartphones and social media from a young age is having on children. The evidence on screen time is currently mixed, and a systematic review by the Chief Medical Officer in 2019 concluded that the evidence on the impact of screen time on mental health was inconclusive. This month, the government has commissioned a feasibility study into future research to understand the ongoing impact of smartphones and social media on children, to enhance the evidence base in this area.

We live in a digital age and must strike the right balance so that children can access the benefits of being online while we continue to put their safety first. The Act puts a range of new duties on social media companies and search services, making them responsible for their users’ safety on their platforms, with the strongest provisions in the Act for children. The government’s priority is the effective implementation of the Act, so those who use social media can benefit from its protections.

Firstly, social media platforms, other user-to-user services and search services that are likely to be accessed by children will have a duty to take steps to prevent children from encountering the most harmful content that has been designated as ‘primary priority’ content. This includes pornography and content that encourages, promotes, or provides instructions for self-harm, eating disorders, or suicide.

Secondly, online services must also put in place age-appropriate measures to protect children from ‘priority’ content that is harmful to children, including bullying, abusive or hateful content and content that encourages or depicts serious violence. Under the Act, where services have minimum age limits, they must specify how these are enforced and do so consistently. Ofcom’s draft codes set out what steps services may have to take to meet these duties. Ofcom’s draft proposals would mean that user-to-user services which do not ban primary priority or priority harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict areas of the service hosting such harmful content.

Finally, companies in scope of the Act must take steps to protect all users from illegal content and criminal activity on their services. Priority offences reflect the most serious and prevalent illegal content and activity, against which companies must take proactive measures to protect their users. These include, amongst others, intimate image abuse, public order offences, fraud, the encouragement or assistance of suicide, people smuggling and the illegal sex trade.

In her letter to parliamentarians on 17 October, Ofcom’s CEO, Melanie Dawes, stressed that Ofcom’s codes are iterative. The letter also noted that Ofcom is seeking to strike a balance between speed and comprehensiveness with its initial codes. As mentioned in the letter, minimum age limits are an area that Ofcom is seeking to do more work on.

The Act also updated Ofcom’s statutory duty to promote media literacy. Media literacy can help tackle a wide variety of online safety issues for all internet users, including children. It includes understanding that online actions have offline consequences, being able to engage critically with online information, and being able to contribute to a respectful online environment. Under the new duties, Ofcom is required to raise awareness of the nature and impact of misinformation and disinformation, and to take steps to build the public's capability in establishing the reliability, accuracy and authenticity of content found on regulated services. These duties are already in force.

The steps Ofcom has set out will represent a positive shift in how children and young people experience the online world. We expect that Ofcom's finalised Children’s Safety Codes of Practice will come into effect by the summer of 2025. However, the Act is designed so it can keep up with evolving areas, and Ofcom has been clear that the Children’s Codes of Practice will be updated as the evidence base on new and existing online harms grows.

The Government is working closely with Ofcom to ensure the Act is implemented as quickly and effectively as possible. We will continue to work with stakeholders to balance important considerations regarding the safety and privacy of children.

Department for Science, Innovation and Technology

Petitions Committee requests a revised response from the Government

The Petitions Committee (the group of MPs who oversee the petitions system) has considered the Government’s response to this petition. They felt the response did not directly address the request of the petition. They have therefore asked the Government to provide a revised response.

When the Committee receives a revised response from the Government, we will publish this and share it with you.