Government urged to introduce rule book for social media giants

The government is being urged to ensure that all social networks follow a universal set of rules to protect children from online abuse.

The NSPCC wants new laws put in place to force social media sites to keep children safe, claiming they are not doing enough to tackle issues such as child abuse, grooming, hate speech and cyber-bullying on their platforms.

“That's why we're calling for new laws which will force social networks to protect children online, whichever sites they use,” said a statement from the children’s charity.

The NSPCC is calling for the government to create a rulebook that would be enforced by an independent regulator. Those rules would ensure that social media companies do three things:

  • Provide Safe Accounts for under-18s with extra protections built in
  • Create grooming and bully alerts to flag up sinister behaviour
  • Hire an army of dedicated online child safety guardians.

Safe Accounts

The NSPCC wants social media platforms to provide under-18s with Safe Accounts that have extra protections built in, which should include:

High privacy settings by default. Location settings locked off, and accounts not public or searchable by phone number or email. There should be privacy prompts when sharing personal information.

Control over who you connect with. Followers must be approved by the young person. Video chat and live streaming should be restricted to the young person's contacts.

Clear and child-friendly rules and reporting buttons. They should be easy to find and easy to read.
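For illustration only, the protections above can be thought of as a default configuration applied automatically to every under-18 account. The sketch below is a hypothetical Python representation: the `SafeAccountSettings` object and its field names are invented for the example and do not reflect any platform's actual settings.

```python
from dataclasses import dataclass

@dataclass
class SafeAccountSettings:
    """Hypothetical defaults for an under-18 'Safe Account' (illustrative only)."""
    location_sharing: bool = False              # location settings locked off
    public_profile: bool = False                # account not public
    searchable_by_phone_or_email: bool = False  # not searchable by phone number or email
    prompt_before_sharing_personal_info: bool = True
    followers_require_approval: bool = True     # the young person approves followers
    video_chat_contacts_only: bool = True       # video chat restricted to contacts
    live_streaming_contacts_only: bool = True   # live streaming restricted to contacts

def settings_for_new_account(age: int) -> SafeAccountSettings:
    """Apply the protective defaults automatically to anyone under 18."""
    if age < 18:
        return SafeAccountSettings()
    # For adults, a platform would apply its own standard defaults (illustrative values).
    return SafeAccountSettings(
        location_sharing=True,
        public_profile=True,
        searchable_by_phone_or_email=True,
        prompt_before_sharing_personal_info=False,
        followers_require_approval=False,
        video_chat_contacts_only=False,
        live_streaming_contacts_only=False,
    )
```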

Grooming alerts

The charity wants harmful online behaviour to be automatically flagged. For example, if someone sends a large number of friend requests to people they do not know, or an adult's friend requests to under-18s are rejected at a high rate, this should be flagged automatically for moderators to review and act on.

Patterns of grooming or abusive language can be tracked automatically to pick up sinister messages, which can then be reviewed by moderators. This works in a similar way to email prompts that remind you to attach a document if you've used the word 'attachment'.

When grooming behaviour is detected, a notification should be sent to the child so they pause and reflect on their contact with that person, and support can be offered where appropriate.
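As a rough sketch of the kind of automated signals described above (not the NSPCC's proposal in code, and not any platform's real system), the example below combines the two ideas: flagging adults whose friend requests to under-18s are rejected at a high rate, and matching message text against grooming-style phrases, with flagged contact routed to moderators and a prompt sent to the child. Every threshold, phrase and function name here is hypothetical.

```python
# Illustrative sketch only: thresholds, phrases and helpers are invented for the example.
GROOMING_PHRASES = {"keep this a secret", "don't tell your parents", "send me a photo"}
REJECTION_RATE_THRESHOLD = 0.8   # e.g. 80% of requests to under-18s rejected
MIN_REQUESTS = 10                # ignore accounts with too little activity to judge

def suspicious_request_pattern(requests_to_minors: int, rejections: int) -> bool:
    """Flag adults whose friend requests to under-18s are mostly rejected."""
    if requests_to_minors < MIN_REQUESTS:
        return False
    return rejections / requests_to_minors >= REJECTION_RATE_THRESHOLD

def suspicious_message(text: str) -> bool:
    """Flag messages containing grooming-style phrases, much as an email client
    reminds you to attach a file when it sees the word 'attachment'."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in GROOMING_PHRASES)

def send_to_moderation_queue(adult_id: str, child_id: str, message: str) -> None:
    print(f"Moderator review: {adult_id} -> {child_id}: {message!r}")

def prompt_child_to_pause(child_id: str) -> None:
    print(f"Notification to {child_id}: pause and reflect; support is available.")

def handle_contact(adult_id: str, child_id: str, message: str,
                   requests_to_minors: int, rejections: int) -> None:
    """Route suspicious contact to moderators and notify the child."""
    if suspicious_request_pattern(requests_to_minors, rejections) or suspicious_message(message):
        send_to_moderation_queue(adult_id, child_id, message)
        prompt_child_to_pause(child_id)
```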

Online guardians

The NSPCC is calling for every social media company to hire experts in child protection as dedicated child safety moderators. They must also make public the number of reports they receive and how moderation decisions are made.

Harmful, violent, abusive or adult content can be proactively filtered using key words – either to block it for Safe Accounts or to issue a pop-up warning to young people.
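Purely as an illustration of this kind of keyword filtering (the word list and behaviour are invented, not any site's real moderation rules), a filter might block matching content outright for Safe Accounts and issue a pop-up warning to other young users:

```python
# Hypothetical keyword list and policy, for illustration only.
HARMFUL_KEYWORDS = {"self-harm", "graphic violence"}

def content_action(text: str, is_safe_account: bool) -> str:
    """Return 'block' for Safe Accounts, 'warn' for other young users, else 'allow'."""
    if any(keyword in text.lower() for keyword in HARMFUL_KEYWORDS):
        return "block" if is_safe_account else "warn"
    return "allow"

print(content_action("clip containing graphic violence", is_safe_account=True))   # block
print(content_action("clip containing graphic violence", is_safe_account=False))  # warn
print(content_action("holiday photos", is_safe_account=True))                     # allow
```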

Peter Wanless, chief executive of the NSPCC, said: “The internet can be a wonderful resource for young people to learn, socialise and get support. But leaving social media sites free to make up their own rules when it comes to child safety is unacceptable.

“We need legally enforceable universal safety standards that are built in from the start.

“We've seen time and time again social media sites allowing violent, abusive or illegal content to appear unchecked on their sites, and in the very worst cases children have died after being targeted by predators or seeing self-harm films posted online.

“Enough is enough. Government must urgently bring in safe accounts, groomer alerts and specially trained child safety moderators as a bare minimum to protect our children. And websites who fail to meet these standards should be sanctioned and fined,” Mr Wanless concluded.
