The children’s commissioner for England has said that she is “simply not satisfied” that enough is being done to keep children safe online.
Dame Rachel de Souza’s comments came after an inquest into the death of Molly Russell, who took her own life, concluded that unsafe online content contributed “in a more than minimal way” to her death.
“Girls as young as nine told my team about strategies they employ when strangers ask for their home address online. In a room of 15- and 16-year-olds, three quarters had been sent a video of a beheading,” said Dame Rachel de Souza.
“I conducted a nationally representative survey of 2,005 children and their parents to understand families’ perspectives on online safety. My survey found that children are frequently exposed to a wide range of inappropriate and harmful content online, including sexualised and violent imagery, anonymous trolling, and material promoting suicide, self-harm and eating disorders.”
“Children tell me that they rarely seek out this content. It is promoted and offered up to them by highly complex recommendation algorithms, which are designed to capture and retain their attention. When harmful content is reported to platforms, children tell me that little is done in response,” the children’s commissioner added.
She went on to criticise tech companies for “failing” to self-regulate.
“The rights and protections which exist in the offline world must, therefore, extend online. And we must hold tech firms to the highest standards on children’s safety and wellbeing. As the Online Safety Bill is shaped and formalised in Parliament, below I set out three key principles for the legislation, distilled from my conversations with children and industry. These are: children’s voice, recognition of childhood and a collaborative approach to online safety,” said Dame Rachel de Souza.
Sir Peter Wanless, CEO of the NSPCC, has also urged that tech companies be held to account for children’s online safety following the inquest into Molly Russell’s death.
Senior coroner Andrew Walker said material viewed by 14-year-old Molly Russell on social media “shouldn’t have been available for a child to see”.
Mr Walker told North London Coroner’s Court: “She died from an act of self-harm while suffering from depression and the negative effects of online content.”
Sir Peter Wanless said: “Finally Molly’s family have the answers they deserve thanks to their determination to see Meta and Pinterest questioned under oath about the part they played in their daughter and sister’s tragic death.
“The ruling will send shockwaves through Silicon Valley – tech companies must expect to be held to account when they put the safety of children second to commercial decisions. The magnitude of this moment for children everywhere cannot be understated.
“Molly’s family will forever pay the price of Meta and Pinterest’s abject failure to protect her from content no child should see, but the Online Safety Bill is a once in a generation opportunity to reverse this imbalance between families and Big Tech.
“This must be a turning point and further delay or watering down of the legislation that addresses preventable abuse of our children would be inconceivable to parents across the UK,” he concluded.
Dave Wareham, Head of Services at WillisPalmer, said: “Undoubtedly, the online world can have a huge impact on young people. We know that young people are groomed through various platforms; often young people ‘learn’ their sex education through viewing online pornography, which is terrifying and dangerous; and online bullying is rife, meaning that bullying is present 24/7 rather than stopping at the school gates as it did years ago.”
“On top of that there is vile, indoctrinating, discriminatory content online which can have a huge detrimental effect on young people.”
“Everyone needs to be part of the solution, from parents using parental controls online, to the government ensuring that children are safe online through the Online Safety Bill, to tech companies ensuring that safety measures are properly implemented, so that inappropriate content can be reported safe in the knowledge that it will be immediately removed.”
“Rather than utilising algorithms to their benefit, tech companies need to use them to ensure children’s safety is a priority,” Dave Wareham concluded.