The government is being urged to ensure that all social networks follow a universal set of rules to protect children from online abuse.
The NSPCC wants new laws to force social media sites to keep children safe, claiming the platforms are not doing enough to tackle issues such as child abuse, grooming, hate speech and cyber-bullying.
“That's why we're calling for new laws which will force social networks to protect children online, whichever sites they use,” said a statement from the children’s charity.
The NSPCC is calling for the government to create a rulebook, enforced by an independent regulator, whose rules would require social media companies to do three things:
The NSPCC wants social media platforms to provide under-18s with Safe Accounts that have extra protections built in. These should include:
High privacy settings as default. Location settings should be locked off, and accounts should not be public or searchable by phone number or email. There should be privacy prompts when sharing personal information.
Control over who you connect with. Followers must be approved by the young person. Video chat and live streaming should be restricted to the young person's contacts.
Clear and child-friendly rules and reporting buttons. They should be easy to find and easy to read.
The charity wants harmful online behaviour to be flagged automatically. For example, when someone sends large numbers of friend requests to unknown people, or when an adult's friend requests to under-18s are rejected at a high rate, this should be flagged automatically to moderators to review and act on.
Patterns of grooming or abusive language can be tracked automatically to pick up on sinister messages, which can be reviewed by moderators. This works in a similar way to email prompts to remind you to attach a document if you've used the word 'attachment'.
When grooming behaviour is detected, a notification should be sent to the child so they can pause and reflect on their contact with that person, and support can be offered where appropriate.
The NSPCC is calling for every social media company to hire experts in child protection as dedicated child safety moderators. They must also make public the number of reports they receive and how moderation decisions are made.
Harmful, violent, abusive or adult content can be proactively filtered using key words – either to block it for Safe Accounts or to issue a pop-up warning to young people.
Peter Wanless, chief executive of NSPCC, said: “The internet can be a wonderful resource for young people to learn, socialise and get support. But leaving social media sites free to make up their own rules when it comes to child safety is unacceptable.
“We need legally enforceable universal safety standards that are built in from the start.
“We've seen time and time again social media sites allowing violent, abusive or illegal content to appear unchecked on their sites, and in the very worst cases children have died after being targeted by predators or seeing self-harm films posted online.
“Enough is enough. Government must urgently bring in safe accounts, groomer alerts and specially trained child safety moderators as a bare minimum to protect our children. And websites that fail to meet these standards should be sanctioned and fined,” Mr Wanless concluded.