Children’s charity the NSPCC is urging the government to deliver legislation that sets out a Duty of Care on tech firms to make their sites safer for children, within 18 months.
More than 10,000 online grooming crimes have been recorded by police under a new law that made it illegal for adults to send sexual messages to children, the charity states.
Peter Wanless, NSPCC Chief Executive, said: “Child abuse is an inconvenient truth for tech bosses who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom our kids.”
Freedom of information requests made by the NSPCC show that police in England and Wales have recorded 10,119 offences of sexual communication with a child in the two and a half years since the law came into force.
The analysis showed that:
- The number of offences is accelerating, with 23% taking place in the six months up to October last year.
- Facebook-owned apps (Facebook, Facebook Messenger, Instagram and WhatsApp) were used in 55% of cases, from April 2017 to October 2019, where police recorded information about how a child was groomed.
- There were over 3,200 instances of Facebook-owned apps being used, of which half involved Instagram. Snapchat was used over 1,060 times.
Then-Digital Minister Matt Warman promised in February to publish an Online Harms Bill, following proposals set out in a White Paper. These proposals set out independent regulation of social networks, with potential criminal sanctions if tech directors fail to keep children safe on their platforms.
However, the legislation is now not expected until the end of the year, and there are concerns that a regulator may not be in place before 2023.
The charity is now urging prime minister Boris Johnson to deliver, within 18 months, an Online Harms Bill that sets out a Duty of Care on tech firms to make their sites safer for children. The Online Harms Bill should:
- Enforce a Duty of Care on tech companies to identify and mitigate reasonably foreseeable risks on their platforms, including at the design stage, to proactively protect users from harm
- Create a regulator that can hand out GDPR equivalent fines and hold named directors criminally accountable for the most serious breaches of their Duty of Care
- Give the regulator robust powers to investigate companies and request information
- Create a culture of transparency by legally compelling tech firms to disclose any breaches of the Duty of Care and major design changes to their platforms
Peter Wanless added: “Last week the Prime Minister signalled to me his determination to stand up to Silicon Valley and make the UK the world leader in online safety. He can do this by committing to an Online Harms Bill that puts a legal Duty of Care on big tech to proactively identify and manage safety risks.
“Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm,” he concluded.
Three specialist residential schools in Doncaster are to be investigated by the Child Safeguarding Practice Review Panel.
Fullerton House, Wilsic Hall and Wheatley House, specialist independent residential schools, are to be subject to a national investigation by the Panel following allegations of abuse.
Annie Hudson, Chair of the Child Safeguarding Practice Review Panel, outlined plans [...]
The government’s draft Online Safety Bill in its current form is neither clear nor robust enough to tackle certain types of illegal and harmful content on user-to-user and search services, MPs have warned.
The Digital, Culture, Media and Sport Committee is urging the government to address types of content that are technically legal by [...]
Trauma-informed activities rarely lead to evidence-based treatments, a study by the Early Intervention Foundation has found.
Trauma Informed Care (TIC) practice varied widely across children’s social care services, with no two teams offering the same components or attending the same training. Furthermore, the study found that TIC activities rarely led to evidence-based treatments but were [...]