Nine out of 10 people want social media networks and messaging services to be safer for children, a YouGov survey for the NSPCC has found.
The poll of more than 2,000 adults found that they overwhelmingly backed new laws to keep children safe on social media and wanted tech bosses to be held responsible for safety.
NSPCC chief executive Sir Peter Wanless said: “Today’s polling shows the clear public consensus for stronger legislation that hardwires child protection into how tech firms design their platforms.
“Mr Dowden will be judged on whether he takes decisions in the public interest and acts firmly on the side of children with legislation ambitious enough to protect them from avoidable harm.
“For too long children have been an afterthought for Big Tech but the Online Safety Bill can deliver a culture change by resetting industry standards and giving Ofcom the power to hold firms accountable for abuse failings,” he added.
The survey found:
- 90% of respondents want firms to have a legal responsibility to detect child abuse, such as grooming, that takes place on their sites.
- 80% believed that tech bosses should be fined for failing to make their sites safe.
- 70% supported making it a legal requirement for platforms to assess the risks of child abuse on their services and take steps to address them.
- Only 8% of adults thought sites are routinely designed to be safe for children.
In 2019 the NSPCC set out detailed proposals for an online safety bill. This week, it published a report called ‘Delivering a Duty of Care’, which assessed the government’s plans for legislation against the six tests it created to measure whether online safety would be achieved.
The report found that the government is failing on 9 out of the 27 indicators, and the charity says tougher measures are required when it comes to tackling sexual abuse.
Online safety can be achieved by making tech firms legally responsible for their output, and by:
- Clamping down on the “digital breadcrumbs” dropped by abusers to guide others towards illegal material. These include videos of children just moments before or after they are sexually abused - so-called ‘abuse image series’ - that are widely available on social media.
- Giving Ofcom the ability to tackle cross-platform risks, where groomers target children across the different sites and games they use.
- Getting government to commit to senior management liability, making tech directors personally responsible for decisions. This would drive cultural change and provide a strong deterrent.
- Making censure, fines and, in some cases, criminal sanctions the penalty for bosses who fail to make the online world a safe place for children.