Closed petition: Ban smartphones and camera phones for under 16s
I have many concerns regarding children under 16 using mobile phones that can access photographs and social media. I believe there are too many safeguarding concerns: exploitation, cyberbullying and group trolling can be mentally damaging to children's wellbeing.
More details
I would like an age restriction on phones for children up to the age of 16. Phones that can only text or call could be allowed for children, but smartphones and camera phones should be age restricted to avoid harming children.
This petition closed early because of a general election. Find out more on the Petitions Committee website.
Government responded
This response was given on 30 April 2024
The government is committed to making the UK the safest place to be a child online, as evidenced by the Online Safety Act. We are focused on implementing the regime as soon as possible.
I would like to thank all those who signed the petition and share our commitment to keeping children safe online. While most children have a positive experience online, using the internet to connect with peers and to access educational resources, information, and entertainment, I share concerns about the impact of harmful and age-inappropriate content and activity online, which can be particularly damaging for children.
It is important that we strike the right balance by protecting children from harm whilst also allowing them to benefit from safe internet use. We also recognise that parents will decide what is appropriate for their children. We were proud to pass our landmark Online Safety Act last year, which will protect children from harmful content. The government keeps all options under review to keep children safe online and build on the progress of the Online Safety Act. However, we are not considering a complete ban on children under 16 having smartphones, as that may impact children's ability to access some of the benefits of the internet.

The Online Safety Act instead takes a safety-by-design approach, enabling children to access the benefits of the internet with far fewer of the dangers. The Act received Royal Assent on 26 October 2023 and we are working closely with Ofcom to ensure the regime is operational as soon as possible.
The Online Safety Act places robust, much-needed responsibilities on technology companies – including social media platforms, search services and other services which host user-generated content – to keep all users, but particularly children, safe online.
All companies in scope will need to take robust steps to protect children from illegal content and criminal behaviour on their services. This includes removing and limiting the spread of illegal content and taking steps to prevent similar material from appearing. Additionally, all services which are likely to be accessed by children will be required to provide safety measures to protect child users from content that is legal but nonetheless presents a risk of harm to children. User-to-user services, including social media platforms, must prevent children of all ages from encountering 'primary priority' content. Pornographic content and content that encourages or promotes self-harm, eating disorders or suicide have all been designated as kinds of 'primary priority' content. Where these services allow 'primary priority' content, they will need to use highly effective age verification or age estimation to ensure children are not able to encounter this content on their service. The Act also includes a standalone provision which requires providers who publish pornographic content on their services to prevent children from accessing that content.
In addition, user-to-user and search service providers must also provide children with age-appropriate protections from 'priority' content that is harmful to them, such as bullying and content that depicts or encourages serious violence. Finally, providers which have age restrictions need to specify in their terms of service what measures they use to prevent underage access and apply these terms consistently. This ensures providers can be held to account for what they say in their terms of service and can no longer do nothing to prevent underage access.

The Act will be overseen and enforced by Ofcom. As the independent regulator, Ofcom will set out in codes of practice the steps that providers can take to comply with their duties. Ofcom will also have a range of enforcement powers, which will include substantial fines and, where appropriate, business disruption measures (including blocking).
In addition to the work on the Online Safety Act, the government has also recently published guidance on the use of mobile phones in schools. We know that mobile phones are a distraction to learning for pupils and, if unregulated in classroom settings, lead to significant loss of learning time. That is why the Department for Education is acting on this challenge by strengthening the position on mobile phones – making clear that use should be prohibited throughout the school day. The guidance will provide headteachers with support and advice on how to successfully prohibit mobile phone use, including at break times, to tackle disruptive behaviour and online bullying whilst boosting attention during lessons. If schools fail to implement the new guidance, the government will consider legislating in the future to make the guidance statutory.
The government continues to look at ways that children and other internet users can be kept safe online, to further build on the protections of the Online Safety Act. Our current priority is making the Act operational as soon as possible so that all children in the UK are provided with a safer online experience.
Department for Science, Innovation and Technology
This is a revised response. The Petitions Committee requested a response which more directly addressed the request of the petition. You can find the original response towards the bottom of the petition page (https://petition.parliament.uk/petitions/655473).
Related activity
Petitions Committee requests a revised response from the Government
The Petitions Committee (the group of MPs who oversee the petitions system) has considered the Government’s response to this petition. They felt that the response explains the Government’s policy on keeping children safe online but does not respond directly to the petition’s request for the Government to ban smartphones and camera phones for under 16s. They have therefore asked the Government to provide a revised response.
When the Committee receives a revised response from the Government, we will publish this and share it with you.
Original Government response
The government is committed to making the UK the safest place to be a child online, as evidenced by the Online Safety Act. We are focused on implementing the regime as soon as possible.
I would like to thank all those who signed the petition and share our commitment to keeping children safe online. While most children have a positive experience online, using the internet to connect with peers and to access educational resources, information, and entertainment, I share concerns about the impact of harmful and age-inappropriate content and activity online, which can be particularly damaging for children.
It is important that we strike the right balance by protecting children from harm whilst also allowing them to benefit from safe internet use. We also recognise that parents will decide what is appropriate for their children. We were proud to pass our landmark Online Safety Act – to make services such as social media safe for children, provide assurances for parents that their children are having safer online experiences, and place greater responsibility on platforms.
The Online Safety Act allows children to access the positives of online interaction with their friends without the same risks of encountering harmful content or activity. By reducing harmful content on online services and requiring tech companies to take a safety-by-design approach, we hope that we can redress the balance to allow children to access the benefits of the internet with far fewer of the dangers. The Act received Royal Assent on 26 October 2023 and we are working closely with Ofcom to ensure the regime is operational as soon as possible.
The Online Safety Act places robust, much-needed responsibilities on technology companies – including social media platforms, search services and other services which host user-generated content – to keep all users, but particularly children, safe online.
All companies in scope will need to take robust steps to protect children from illegal content and criminal behaviour on their services. This includes removing and limiting the spread of illegal content and taking steps to prevent similar material from appearing. Additionally, all services which are likely to be accessed by children will be required to provide safety measures to protect child users from content that is legal but nonetheless presents a risk of harm to children. User-to-user services, including social media platforms, must prevent children of all ages from encountering 'primary priority' content. Pornographic content and content that encourages or promotes self-harm, eating disorders or suicide have all been designated as kinds of 'primary priority' content. Where these services allow 'primary priority' content, they will need to use highly effective age verification or age estimation to ensure children are not able to encounter this content on their service. The Act also includes a standalone provision which requires providers who publish pornographic content on their services to prevent children from accessing that content.
In addition, user-to-user and search service providers must also provide children with age-appropriate protections from 'priority' content that is harmful to them, such as bullying and content that depicts or encourages serious violence. Finally, providers which have age restrictions need to specify in their terms of service what measures they use to prevent underage access and apply these terms consistently. This ensures providers can be held to account for what they say in their terms of service and can no longer do nothing to prevent underage access.
The Act will be overseen and enforced by Ofcom. As the independent regulator, Ofcom will set out in codes of practice the steps that providers can take to comply with their duties. Ofcom will also have a range of enforcement powers, which will include substantial fines and, where appropriate, business disruption measures (including blocking).
In addition to the work on the Online Safety Act, the government has also recently published guidance on the use of mobile phones in schools. We know that mobile phones are a distraction to learning for pupils and, if unregulated in classroom settings, lead to significant loss of learning time. That is why the Department for Education is acting on this challenge by strengthening the position on mobile phones – making clear that use should be prohibited throughout the school day. The guidance will provide headteachers with support and practical advice on how to successfully prohibit mobile phone use, including at break times, to tackle disruptive behaviour and online bullying whilst boosting attention during lessons. If schools fail to implement the new guidance, the government will consider legislating in the future to make the guidance statutory.
The government continues to look at ways that children and other internet users can be kept safe online, to further build on the protections of the Online Safety Act. Our current priority is making the Act operational as soon as possible so that all children in the UK are provided with a safer online experience.
Department for Science, Innovation and Technology
This response was given on 15 March 2024. The Petitions Committee then requested a revised response that more directly addressed the request of the petition.
MPs debate the impact of smartphones and social media on children
On Tuesday 14 May 2024, MPs held a Westminster Hall debate on the impact of smartphones and social media on children. The debate was led by Miriam Cates MP.
The Parliamentary Under-Secretary of State for Science, Innovation and Technology, Saqib Bhatti MP, responded on behalf of the Government.
What is a Westminster Hall debate?
Westminster Hall is the second debating chamber of the House of Commons. Westminster Hall debates give MPs an opportunity to raise issues and receive a response from a government minister. These are general debates that do not end in a vote.
Visual explainer: Westminster Hall debates