
Petition: Make verified ID a requirement for opening a social media account.

Make it a legal requirement to provide a verified form of ID when opening a new social media account. Where the account belongs to a person under the age of 18, verify the account with the ID of a parent or guardian, to prevent anonymised harmful activity and provide traceability if an offence occurs.

More details

My son Harvey is disabled. He is also the kind and gentle son of a person regularly in the public eye. The Online Harms Bill doesn’t go far enough in making online abuse a specific criminal offence and doing what ‘Harvey’s Law’ intended. Making the law work requires the removal of anonymity, to ensure that users cannot cause harm by using online platforms to abuse others. Where an offence has taken place, offenders ought to be easily identified, reported to the police and punished. We have experienced the worst kind of abuse towards my disabled son and want to make sure that no one can hide behind their crime.

172,012 signatures

Parliament will consider this for a debate

Parliament considers all petitions that get more than 100,000 signatures for a debate

Waiting 33 days for a debate date

Government responded

This response was given on 24 March 2021

Being anonymous online does not give anyone the right to abuse others. The Government’s online harms legislation will address harmful online abuse, including when the perpetrator is anonymous.

We are taking steps through the online harms regulatory framework to address abuse and other harmful behaviour online, whether committed anonymously or not.

In December 2020, we published the full government response to the Online Harms White Paper consultation, which sets out new expectations on companies to keep their users safe online. Social media, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal abuse on their services, including illegal anonymous abuse and other illegal harmful content, such as terrorist material or child sexual abuse imagery.

Major platforms will also need to set out clearly what legal content is acceptable on their platform and stick to it. This will include stating whether they will tolerate abuse that doesn’t meet a criminal threshold, whether anonymous or not.

The framework will deliver a higher level of protection for children online, with companies needing to protect children from inappropriate content and harmful activity. Companies will need to prove children are not accessing their service, or they will need to conduct a child safety risk assessment and provide safety measures for child users. These include protecting children from inappropriate and harmful content like pornography, and abusive behaviour such as trolling and pile-on abuse.

Companies will also have a duty to ensure they have effective and accessible reporting and redress mechanisms. These will need to allow users to report abuse, including anonymous abuse. Appropriate responses from the company might include removal of harmful content, sanctions against offending users, or changing processes and policies to better protect users. Users will also be able to report concerns to the regulator as part of its research and horizon-scanning activity.

Compliance with the online harms framework will be enforced by Ofcom, which will have a suite of powers to use against companies that fail to fulfil the duty of care. These include fines of up to £18m or 10% of annual global turnover, and business disruption measures. The Online Safety Bill, which will give effect to the regulatory framework outlined in the full government response, will be ready this year.

We are also considering the criminal law and its ability to deal with harmful communications online. The Government has asked the Law Commission to review existing legislation on abusive and harmful communications, including anonymous abuse. The Law Commission has consulted on proposed reforms and a final report is expected in the summer. We will carefully consider using the online harms legislation to bring the Law Commission’s final recommendations into law, where it is necessary and appropriate to do so.

The police already have a range of legal powers to identify individuals who attempt to use anonymity to escape sanctions for online abuse, where the activity is illegal. Police reporting shows that in 2017/18, 96% of attempts by public authorities to identify the anonymous user of a social media account, email address or telephone resulted in successful identification of the suspect in their investigation. The Investigatory Powers Act allows police to acquire communications data, such as an email address or the location of the device from which illegal anonymous abuse is sent, and use it as evidence in court. The government is working with law enforcement to review whether the current powers are sufficient to tackle illegal anonymous abuse online. The outcome of that work will inform the government’s future position on illegal anonymous online abuse.

It is important to recognise the benefits of anonymity, as well as the challenges. Anonymity underpins people's fundamental right to express themselves and access information online in a liberal democracy. This is particularly important for those who sadly have reason to fear the consequences of speaking freely, such as young people exploring their gender or sexual identity, whistleblowers, journalists' sources, victims of abuse or modern slavery and political dissidents. A blanket ban on anonymity would wash away these important benefits and would be unlikely to stop online abuse.

Department for Digital, Culture, Media and Sport

Other parliamentary business

MPs to debate Online Anonymity and Anonymous Abuse

MPs will debate Online Anonymity and Anonymous Abuse on Wednesday 24 March in the main House of Commons Chamber. The subject of the debate has been determined by the Backbench Business Committee.

This will be a general debate. General debates allow MPs to debate important issues; however, they do not end in a vote, nor can they change the law.

The debate will start at around 12.30pm, following Prime Minister's Questions.

Watch here this Wednesday: https://www.parliamentlive.tv/Event/Index/4ee11fcd-f567-4897-8c0b-f04de0be4caa

You'll be able to read a transcript of the debate a few hours after it happens: https://hansard.parliament.uk/commons/2021-03-24

Find out more about how Parliamentary debates work: https://www.parliament.uk/about/how/business/debates/

Find out more about the Backbench Business Committee: https://committees.parliament.uk/committee/202/backbench-business-committee/

This debate is in addition to any debate the Petitions Committee schedules on this petition. We’ll message you to let you know as soon as the Committee schedules a debate on this petition.

Relevant work by the Petitions Committee

The Petitions Committee has been looking at how to tackle online abuse, following a number of petitions calling for action on the issue.

You can find out more about the Committee's work on online abuse, and read transcripts of this evidence, here: https://committees.parliament.uk/work/307/tackling-online-abuse/