Grooming and child sexual abuse (CSA) material offences recorded by police have increased by more than a quarter since ministers promised to legislate to protect children on social media.
A Freedom of Information request by the NSPCC found that grooming and CSA material offences recorded by police increased by 27% between 2018/19 and 2020/21.
The children’s charity is setting out solutions to ensure the Online Safety Bill strongly tackles grooming and online CSA ahead of its second reading this month.
NSPCC CEO Peter Wanless said: “This historic Online Safety Bill can finally force tech companies to systemically protect children from avoidable harm.
“With online child abuse crimes at record levels and the public rightly demanding action, it’s crucial this piece of legislation stands as a key pillar of the child protection system for decades to come.
“This NSPCC report sets out simple but targeted solutions for the Bill to be improved to stop preventable child sexual abuse and to finally call time on years of industry failure,” he added.
To strengthen the Bill, the government should:
The public also want the Bill to address CSA: a YouGov survey found that four in five UK adults think it is very important it tackles online grooming and CSA images.
The poll found:
Frida* was 13 when she was groomed by a man on Facebook before being abused over encrypted WhatsApp. She has since been campaigning to make social media safer for children.
She said: “The government now have a responsibility to ensure this legislation works and makes tech executives do everything in their power to address how their sites contribute to grooming.
“No one took responsibility for the abuse I suffered except me. Not the man who abused me or anyone at the tech firms that enabled him,” she concluded.
Meanwhile, the British Psychological Society and YoungMinds are warning that children and young people will still be left vulnerable to harmful content online unless the Online Safety Bill is toughened up as it comes before parliament.
Both organisations are concerned that a failure to consider the cross-platform nature of harmful material, together with potential loopholes around ‘legal but harmful’ content, means that children will still be at risk online if the Bill passes in its current form.
As things stand, the Bill only requires the largest ‘Category One’ platforms to address the risk of exposure to ‘legal but harmful’ content, which includes pornography and material glorifying eating disorders, self-harm and suicide.
They are calling for an explicit duty to be placed on Ofcom to address cross-platform risks, and for companies to be required to co-operate to mitigate these.
Emma Thomas, Chief Executive of YoungMinds, said: “It is completely unacceptable that young people are routinely shown harmful content. We are concerned that young people’s views have not been given significant enough weight in this legislative process and this must urgently be addressed as the Bill progresses.
“Young people tell us they want the Government to listen to their experiences online. We need to know more about what young people find distressing, as well as the positive parts of social media that they want to protect.
“It is clear that many young people find content on social media harmful, so companies do not need to wait for this legislation to take action now to make these platforms safer. There is no excuse to delay.”