This petition was submitted during the 2015–2017 Conservative government

Petition: Make online abuse a specific criminal offence and create a register of offenders

Trolling is a major problem in this day and age. People of all ages and backgrounds suffer every day, including my family, especially my son Harvey. I have tried my best to expose people and even had two arrested, but nothing was done and there were no repercussions or penalties for this behaviour.

More details

This does not affect just high-profile people; it affects everyone from every walk of life: young children, teenagers, people at work, husbands and wives. This abuse includes racism, homophobia, body shaming and a whole range of other hate speech.

This petition addresses an important topical issue, and I want it to help bring justice to everyone who has ever suffered at the hands of trolls. Help me to hammer home worldwide that bullying is unacceptable, whether it's face to face or in an online space.

This petition closed early because of a General Election. Find out more on the Petitions Committee website.

221,914 signatures

Parliament debated this topic

This topic was debated on 29 April 2019

Government responded

This response was given on 18 April 2017

We recognise that behaviour that is not tolerated offline is now common online. DCMS is working on an Internet Safety Strategy which aims to make the UK a safer place for young people to be online.

The internet has provided everyone with some amazing opportunities. It can help isolated individuals find safe communities and can be an amazing resource for people to develop their creativity. However, the Government recognises that behaviour that would never be tolerated in the real world has become increasingly common online, with potentially devastating impacts.

The law does not differentiate between criminal offences committed on social media or anywhere else. Where something is illegal offline it is also illegal online and there is already legislation which applies in relation to online abuse. A variety of different offences already exist covering communications which are grossly offensive, obscene, indecent or false. It is also an offence to send certain articles with intent to cause distress or anxiety.

While we will continue to monitor the situation, we believe that our current legal approach is the right one. The House of Lords Communications Select Committee stated in their report into Social Media and Criminal Offences in July 2014 that the criminal law in this area, almost entirely enacted before the invention of social media, is generally appropriate for the prosecution of offences committed using social media.

The Crown Prosecution Service (CPS) published revised guidelines on prosecuting cases involving communications sent via social media on 10 October 2016. The guidelines provide prosecutors with a clear framework for a consistent approach to all cases involving social media and are available to the public on the CPS website.

The revised guidelines contain a number of new sections, including on:
● Hate Crime, which makes it more likely that a prosecution will be required
● Violence against Women and Girls (VAWG), including potential cyber-enabled VAWG offences such as “baiting” (humiliating peers online by labelling them sexually promiscuous)
● False or offensive social media profiles
● Vulnerable and intimidated witnesses
● Reporting and preventing abuse on social media
This update also includes types of offences that may be committed, such as hacking into social media accounts in order to monitor and control them, or potentially menacing or threatening actions such as posting pictures of the complainant’s workplace or children on social media sites, even if no reference is made to the complainant.

The issue of keeping people safe online is a top priority for this Government. DCMS is leading a new cross-Government drive on online safety, on behalf of the Prime Minister. We will involve ministers and officials from departments across Government as part of a coordinated effort to make the UK the safest place in the world for children and young people to go online.

We want to know more about the scale of the problems that young people face online, identify where the gaps are and start to think about solutions, which we will develop in a Green Paper to be published before the summer. The work is expected to centre on four main priorities: how to help young people help themselves; helping parents face up to the online risks and discuss them with children; how technology can help provide solutions; and importantly, industry’s responsibilities to society.

While the initial focus will be on children and young people, the Strategy will consider how to make the online world a safer place more generally, including by examining concerns around issues like trolling and other aggressive behaviour. The Strategy will draw out relevant links between solutions that will protect children online and those that will also help in the fight against online violence against women and girls.

The Strategy will build on the good work that has already been done in this area by the UK Council for Child Internet Safety (UKCCIS) which brings together government, industry, law enforcement, academia, charities and parenting groups to work in partnership to help to keep children and young people safe online.

In December 2015, UKCCIS published a practical guide for providers of social media and interactive services. The guide has examples of good practice from leading technology companies (e.g. Twitter, Facebook, YouTube), and advice from NGOs and other online child safety experts. Its purpose is to encourage businesses to think about “safety by design” to help make their platforms safer for children and young people under 18.

We already expect industry to improve online safety provisions at the same speed and with the same determination with which they bring out new products and innovations. We expect social media companies to have relevant safeguards in place, including access restrictions, particularly for children and young people who use their services. Social media companies should have reporting tools that are easy to access, act promptly when abuse is reported, remove content which does not comply with acceptable use policies or terms and conditions and, where appropriate, suspend or terminate the accounts of those breaching the rules in place.

Department for Culture, Media and Sport