Nine out of 10 people want social media networks and messaging services to be safer for children, a YouGov survey for the NSPCC has found.
The poll of more than 2,000 adults found that they overwhelmingly backed new laws to keep children safe on social media and wanted tech bosses to be held responsible for safety.
NSPCC Chief Executive Sir Peter Wanless said: “Today’s polling shows the clear public consensus for stronger legislation that hardwires child protection into how tech firms design their platforms.
“Mr Dowden will be judged on whether he takes decisions in the public interest and acts firmly on the side of children with legislation ambitious enough to protect them from avoidable harm.
“For too long children have been an afterthought for Big Tech but the Online Safety Bill can deliver a culture change by resetting industry standards and giving Ofcom the power to hold firms accountable for abuse failings,” he added.
The survey found:
- 90% of respondents want firms to have a legal responsibility to detect child abuse, such as grooming, that takes place on their sites.
- 80% believed that tech bosses should be fined for failing to make their sites safe.
- 70% supported making it a legal requirement for platforms to assess the risks of child abuse on their services and take steps to address them.
- Only 8% of adults thought sites are routinely designed with children’s safety in mind.
In 2019 the NSPCC set out detailed proposals for an online safety bill. This week it published a report, ‘Delivering a Duty of Care’, which assessed the government’s plans for legislation against the six tests it created to measure whether online safety will be achieved.
The report found that the government is failing on 9 of the 27 indicators, and the charity says tougher measures are required to tackle sexual abuse.
The report argues that online safety can be achieved by making tech firms legally responsible for what appears on their platforms, and by:
- Clamping down on the “digital breadcrumbs” dropped by abusers to guide others towards illegal material. These include videos of children just moments before or after they are sexually abused - so-called ‘abuse image series’ - that are widely available on social media.
- Giving Ofcom the ability to tackle cross-platform risks, where groomers target children across the different sites and games they use.
- Getting government to commit to senior management liability, making tech directors personally responsible for decisions in order to drive cultural change and provide a strong deterrent.
- Making censure, fines and, in some cases, criminal sanctions the penalty for bosses who fail to make the online world a safe place for children.