Petition: Introduce 16 as the minimum age for children to have social media
We believe social media companies should be banned from letting children under 16 create social media accounts.
More details
We think this would help:
1. Stop online bullying
2. Stop children being influenced by false posts
3. Stop children seeing content that encourages violence or could be harmful to their future.
We believe social media is having more of a negative impact on children than a positive one. We think people should be of an age where they can make decisions about their lives before accessing social media applications. We believe that we really need to introduce a minimum age of 16 to access social media for the sake of our children’s future, along with their mental and physical health.
Parliament will consider this for a debate
Parliament considers all petitions that get more than 100,000 signatures for a debate
Waiting for 55 days for a debate date
Government responded
This response was given on 17 December 2024
The government is aware of the ongoing debate as to what age children should have smartphones and access to social media. The government is not currently minded to support a ban for children under 16.
Read the response in full
I would like to thank all those who signed the petition on this incredibly important issue. It is right that tech companies should take responsibility to ensure their products are safe for UK children, and the Online Safety Act 2023 is a crucial tool in holding them to account for this.
The government is aware of the ongoing debate as to what age children should have smartphones and access to social media; however, the government is not currently minded to support a ban for children under 16. Children face a significant risk of harm online and we understand that families are concerned about their children experiencing online bullying, encountering content that encourages violence, or other content which may be harmful. We will continue to do what is needed to keep children safe online. However, this is a complicated issue. We live in a digital age and must strike the right balance so that children can access the benefits of being online and using smartphones while we continue to put their safety first. Furthermore, we must also protect the right of parents to make decisions about their child’s upbringing.
The current evidence on screentime is mixed and a systematic review by the UK Chief Medical Officers in 2019 does not show a causal link between screen-based activities and mental health problems, though some studies have found associations with increased anxiety or depression. Therefore, the government is focused on building the evidence base to inform any future action. Last month, the government commissioned a feasibility study into future research to understand the ongoing impact of smartphones and social media on children, to grow the evidence base in this area.
The government’s priority is working with Ofcom to effectively implement the Online Safety Act 2023, so social media users, especially children, can benefit from the Act’s protections as soon as possible. Additionally, the DSIT Secretary of State has outlined the government’s five online safety priorities through a draft Statement of Strategic Priorities. These priorities, which include safety by design, focus on delivering the safest online experiences for all users, in particular children, through the Act.
The Act puts a range of new duties on social media companies and search services, making them responsible for their users’ safety, with the strongest provisions in the Act for children.
Social media platforms, other user-to-user services and search services that are likely to be accessed by children will have a duty to take steps to prevent children from encountering the most harmful content that has been designated as ‘primary priority’ content. This includes pornography and content that encourages, promotes, or provides instructions for self-harm, eating disorders, or suicide. Online services must also put in place age-appropriate measures to protect children from ‘priority’ content that is harmful to children, including bullying, abusive or hateful content and content that encourages or depicts serious violence. Under the Act, where services have minimum age limits, they must specify how these are enforced and do so consistently. Ofcom’s draft proposals would mean that user-to-user services which do not ban primary priority or priority harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict areas of the service hosting such harmful content.
Finally, companies in scope of the Act must take steps to protect all users from illegal content and criminal activity on their services.
In her letter to parliamentarians on 17 October, Ofcom’s CEO, Melanie Dawes, stressed that Ofcom’s codes are iterative. The letter also noted that Ofcom is seeking to strike a balance between speed and comprehensiveness with its initial codes and stated that Ofcom is seeking to do more work on minimum age limits in the future.
In addition, the Act updated Ofcom’s statutory duty to promote media literacy, which can help tackle a wide variety of online safety issues for all internet users, including children. Ofcom is also required to raise awareness of the nature and impact of misinformation and disinformation, and to take steps to build the public's capability in establishing the reliability, accuracy, and authenticity of content found on regulated services. These duties are already in force.
The steps Ofcom has set out will represent a positive shift for how children and young people experience the online world. We expect that Ofcom's finalised Children’s Safety Codes of Practice will come into effect by the summer of 2025. However, the Act is designed so it can keep up with evolving areas, and Ofcom has been clear that the Children’s Codes of Practice will be updated as the evidence base of new and existing online harms grows.
We will continue to work with stakeholders to balance important considerations regarding the safety and privacy of children.
Department for Science, Innovation and Technology
This is a revised response. The Petitions Committee requested a response which more directly addressed the request of the petition. You can find the original response towards the bottom of the petition page (https://petition.parliament.uk/petitions/700086).
Related activity
Petitions Committee requests a revised response from the Government
The Petitions Committee (the group of MPs who oversee the petitions system) has considered the Government’s response to this petition. They felt the response did not directly address the request of the petition. They have therefore asked the Government to provide a revised response.
When the Committee receives a revised response from the Government, we will publish this and share it with you.
Inquiry: Social media, misinformation and harmful algorithms
MPs on the Science, Innovation and Technology Committee have launched an inquiry into: “Social media, misinformation and harmful algorithms”.
The Committee is considering issues including:
- The spread of harmful or false content online
- The effectiveness of current and proposed regulations
- Algorithms used by social media and search engines
- The role of generative AI
What happens next?
The Committee will hear from people with experience and understanding of the subject.
It will then consider the evidence it has taken and publish a report of its findings. This will include recommendations to the Government on any changes that might be needed.
Further information: Social media, misinformation and harmful algorithms.
What is the Science, Innovation and Technology Committee?
The Committee is a group of MPs from different political parties. It looks at the work of the Department for Science, Innovation and Technology.
Original Government response
The Government is committed to tackling the harm children face on social media and we are working with Ofcom to ensure the swift and effective implementation of the Online Safety Act to achieve this.
I would like to thank all those who signed the petition on this incredibly important issue. It is right that tech companies should take responsibility to ensure their products are safe for UK children, and the Online Safety Act 2023 (the ‘Act’) is a crucial tool in holding them to account for this.
Children face a significant risk of harm online and we understand that families are concerned about their children experiencing online bullying, encountering content that encourages violence, or other content which may be harmful, including some instances of misinformation and disinformation.
The government is aware of concerns over the impact that being on smartphones and social media from a young age is having on children. The evidence on screentime is currently mixed, and a systematic review by the UK Chief Medical Officers in 2019 concluded that the evidence on the impact of screentime on mental health was inconclusive. This month, the government commissioned a feasibility study into future research to understand the ongoing impact of smartphones and social media on children, to enhance the evidence base in this area.
We live in a digital age and must strike the right balance so that children can access the benefits of being online while we continue to put their safety first. The Act puts a range of new duties on social media companies and search services, making them responsible for their users’ safety on their platforms, with the strongest provisions in the Act for children. The government’s priority is the effective implementation of the Act, so those who use social media can benefit from its protections.
Firstly, social media platforms, other user-to-user services and search services that are likely to be accessed by children will have a duty to take steps to prevent children from encountering the most harmful content that has been designated as ‘primary priority’ content. This includes pornography and content that encourages, promotes, or provides instructions for self-harm, eating disorders, or suicide.
Secondly, online services must also put in place age-appropriate measures to protect children from ‘priority’ content that is harmful to children, including bullying, abusive or hateful content and content that encourages or depicts serious violence. Under the Act, where services have minimum age limits, they must specify how these are enforced and do so consistently. Ofcom’s draft codes set out what steps services may have to take to meet these duties. Ofcom’s draft proposals would mean that user-to-user services which do not ban primary priority or priority harmful content should introduce highly effective age checks to prevent children from accessing the entire site or app, or age-restrict areas of the service hosting such harmful content.
Finally, companies in scope of the Act must take steps to protect all users from illegal content and criminal activity on their services. Priority offences reflect the most serious and prevalent illegal content and activity, against which companies must take proactive measures to protect their users. These include, amongst others, intimate image abuse, public order offences, fraud, the encouragement or assistance of suicide, people smuggling and the illegal sex trade.
In her letter to parliamentarians on 17 October, Ofcom’s CEO, Melanie Dawes, stressed that Ofcom’s codes are iterative. The letter also noted that Ofcom is seeking to strike a balance between speed and comprehensiveness with its initial codes. As mentioned in the letter, minimum age limits are an area that Ofcom is seeking to do more work on.
In addition, the Act updated Ofcom’s statutory duty to promote media literacy. Media literacy can help tackle a wide variety of online safety issues for all internet users, including children. It includes understanding that online actions have offline consequences, being able to engage critically with online information, and being able to contribute to a respectful online environment. Under the new duties, Ofcom is required to raise awareness of the nature and impact of misinformation and disinformation, and to take steps to build the public's capability in establishing the reliability, accuracy and authenticity of content found on regulated services. These duties are already in force.
The steps Ofcom has set out will represent a positive shift for how children and young people experience the online world. We expect that Ofcom's finalised Children’s Safety Codes of Practice will come into effect by the summer of 2025. However, the Act is designed so it can keep up with evolving areas, and Ofcom has been clear that the Children’s Codes of Practice will be updated as the evidence base of new and existing online harms grows.
The Government is working closely with Ofcom to ensure the Act is implemented as quickly, and effectively, as possible. We will continue to work with stakeholders to balance important considerations regarding the safety and privacy of children.
Department for Science, Innovation and Technology
This response was given on 25 November 2024. The Petitions Committee then requested a revised response that more directly addressed the request of the petition.