Social media giants urged to tackle inappropriate content

Social media companies have been accused of treating child users as an afterthought by the children's commissioner for England.

Anne Longfield has questioned whether the owners of the leading social media organisations, including Facebook (which also owns Instagram and WhatsApp), Snapchat, YouTube and Pinterest, have any control over their content.

"None of the platforms regularly used by vast numbers of children were designed or developed with children in mind, and for some children this is proving harmful, whether that is due to addictive in-app features, inappropriate algorithms or a lack of responsibility for the hosting of dangerous content," said Ms Longfield.

"Over the last few years, I have had dialogue with many of the big social media companies over how best to make sure children have the resilience, information and power they need to make safe and informed choices about their digital lives. I have been reassured time and time again that this is an issue taken seriously. However, I believe that there is still a failure to engage and that children remain an afterthought," she added.

The children's commissioner wrote to leading social media organisations following the tragic suicide of Molly Russell and her father’s appalled response to the material she was viewing on social media before her death.

"I would appeal to you to accept there are problems and to commit to tackling them – or admit publicly that you are unable to," said Ms Longfield.

She explained that by law, she has the power to demand data pertaining to children from public bodies. While social media organisations are not covered by this legislation, Ms Longfield said "in the interests of greater transparency" they should answer the following questions, or explain to their users why not:

- How many self-harm sites or postings are hosted on your platform?

- How many under 18s access these?

- How many under 13s (often breaching your own age restrictions) access them?

- What support options do you provide to those seeking images of self-harm on your platform, and what percentage of searchers choose the 'help' option?

- What are your criteria for removing content or people from platforms?

- What does your own research tell you about the impact of such sites on children’s mental health and links to self-harm?

Ms Longfield said she shares the concerns of the Duke of Cambridge who said: "I am very concerned though that on every challenge they face—fake news, extremism, polarization, hate speech, trolling, mental health, privacy, and bullying—our tech leaders seem to be on the back foot … The noise of shareholders, bottom lines, and profits is distracting them from the values that made them so successful in the first place."

Ms Longfield added: "My experiences are the same. The potential disruption to all user experiences should no longer be a brake on making the safety and wellbeing of young people a top priority. Neither should hiding behind servers and apparatus in other jurisdictions be an acceptable way of avoiding responsibility."

In earlier publications, the children's commissioner has called for the establishment of a Digital Ombudsman, financed by the internet companies themselves, but independent of them. The ombudsman would be an arbiter, able to respond to the concerns of children and parents by demanding greater transparency and action from internet companies so material that is detrimental to the wellbeing of children is removed quickly.

"I am more convinced than ever that this is needed now and that the time has come for action. I have also called for companies like yourselves to be bound by a statutory duty of care, a legal obligation to prioritise the safety and wellbeing of children using your platforms," said Ms Longfield.

"With great power comes great responsibility, and it is your responsibility to support measures that give children the information and tools they need growing up in this digital world – or to admit that you cannot control what anyone sees on your platforms," she concluded.
