Clare Jerrom looks at what the government proposes in its White Paper 'Online Harms' to protect vulnerable children.
The government has launched measures to tackle online harms, including incitement to violence, encouragement of suicide, cyber bullying, terrorism, child sexual exploitation and abuse content, and children accessing inappropriate material.
Home Secretary Sajid Javid has published the 'Online Harms White Paper', which includes measures designed to make the internet a safer place.
He said: "The tech giants and social media companies have a moral duty to protect the young people they profit from.
"Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online.
"That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people – and we are now delivering on that promise," he added.
Duty of Care
The proposals include plans for:
- An independent regulator to be appointed to enforce stringent new standards.
- A mandatory "duty of care" requiring social media firms to protect users, with heavy fines for those that fail to deliver.
- Tough requirements on tech companies to ensure child abuse and terrorist content is not disseminated online.
- Requirements for companies to respond to users' complaints and act quickly to address them.
- Codes of practice, issued by the regulator, which could include measures such as requirements to minimise the spread of misleading and harmful disinformation.
The White Paper states: "Given the prevalence of illegal and harmful content online, and the level of public concern about online harms, not just in the UK but worldwide, we believe that the digital economy urgently needs a new regulatory framework to improve our citizens’ safety online. This will rebuild public confidence and set clear expectations of companies, allowing our citizens to enjoy more safely the benefits that online services offer."
Damaging for children
The document outlines that illegal and unacceptable content and activity are widespread online, and that UK users are concerned about what they see and experience on the internet.
The prevalence of the most serious illegal content and activity, which threatens national security or the physical safety of children, is unacceptable, it adds. The impact of harmful content and activity can be particularly damaging for children, and there are growing concerns about the potential impact on their mental health and wellbeing.
'Online Harms' highlights that:
- Terrorist groups use the internet to spread propaganda designed to radicalise vulnerable people, and distribute material designed to aid or abet terrorist attacks.
- Terrorists have broadcast attacks live on social media.
- Child sex offenders use the internet to view and share child sexual abuse material, groom children online, and even live stream the sexual abuse of children.
- Social media platforms use algorithms which can create ‘echo chambers’ or ‘filter bubbles’, where a user sees only one type of content rather than a range of voices and opinions; this can promote disinformation.
- Rival criminal gangs use social media to promote gang culture and incite violence.
- The internet can be used to harass, bully or intimidate vulnerable people.
- Young adults or children may be exposed to harmful content relating to self-harm or suicide.
As a result, the White Paper sets out a programme of action to tackle content or activity that harms individual users, particularly children, or threatens the way of life in the UK, either by undermining national security, or by undermining shared rights, responsibilities and opportunities to foster integration.
The proposals will set clear standards to help companies ensure the safety of users while protecting freedom of expression, especially in the context of harmful content or activity that may not cross the criminal threshold but can be particularly damaging to children or other vulnerable users.
The government will establish a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services. An independent regulator will be introduced to ensure compliance with this duty of care. The regulator will have a suite of powers to take effective enforcement action against companies that breach their statutory duty of care, such as imposing substantial fines or placing liability on individual members of senior management.
Social media companies and tech firms will therefore be legally required to protect their users and will face tough penalties if they do not comply. There will be particularly stringent requirements for companies to tackle terrorist content and child sexual exploitation and abuse content.
Companies will be obliged to fulfil the new legal duty, and the regulator will issue codes of practice setting out how to do so. Developing a culture of transparency, trust and accountability is a key element of the new regulatory framework. The regulator will have the power to require annual transparency reports from companies, outlining the prevalence of harmful content on their platforms and the countermeasures they are taking to address it.
As part of the new duty of care, companies will be expected to have effective and easy-to-access user complaints functions, overseen by the regulator, which will be responsible for implementing, overseeing and enforcing the new regulatory framework.
The regulator will have sufficient resources and the right expertise and capability to perform its role effectively. The government is consulting on whether the regulator should be a new or an existing body, and on which enforcement powers it should have at its disposal.
Given that this is "a complex and novel area for public policy," the document sets out the government's proposals and poses a series of questions about the design of the new regulatory framework and a non-legislative package. A 12-week consultation on the proposals has also been launched; at its conclusion, the government will set out its plans for developing proposals for legislation.
Exposed to grooming and harmful content
Prime Minister Theresa May said: "The internet can be brilliant at connecting people across the world - but for too long these companies have not done enough to protect users, especially children and young people, from harmful content.
"That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.
"Online companies must start taking responsibility for their platforms, and help restore public trust in this technology," added Mrs May.
The NSPCC welcomed the proposals, with CEO Peter Wanless saying: "This is a hugely significant commitment by the government that, once enacted, can make the UK a world pioneer in protecting children online.
"For too long social networks have failed to prioritise children’s safety and left them exposed to grooming, abuse, and harmful content. So it’s high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so."
Barnardo's chief executive Javed Khan added that the White Paper was "a very important step in the right direction".
Children's commissioner for England Anne Longfield said the era of self-regulation needs to end, and the government's proposals to introduce a statutory duty of care "cannot come a moment too soon".
The Information Commissioner, Elizabeth Denham, said: “I think the White Paper proposals reflect people’s growing mistrust of social media and online services. People want to use these services, they appreciate the value of them, but they’re increasingly questioning how much control they have of what they see, and how their information is used. That relationship needs repairing, and regulation can help that. If we get this right, we can protect people online while embracing the opportunities of digital innovation.
“While this important debate unfolds, we will continue to take action. We have powers, provided under data protection law, to act decisively where people’s information is being misused online, and we have specific powers to ensure firms are accountable to the people whose data they use.
“We’ve already taken action against online services, we acted when people’s data was misused in relation to political campaigning, and we will be consulting shortly on a statutory code to protect children online. We see the current focus on online harms as complementary to our work, and look forward to participating in discussions regarding the White Paper," she added.
Carolyn Bunting, CEO of Internet Matters, added: "We support the government’s desire to make the UK the safest place to be online. The internet simply wasn’t built with children in mind, so it is vital that government plays a greater role in determining and setting standards for the services that children commonly use, and that industry responds quickly and effectively.
"Proactive regulation and better technical solutions, whilst welcomed, are just one part of the solution. We have to help parents to have greater awareness and understanding of their child’s digital wellbeing. It would be unfair to leave those parents or guardians to figure it out for themselves. Instead we must make available as many accessible, simple resources for parents based on expert advice which makes it as easy as possible for them to understand," she concluded.
Freedom of speech
In a statement Facebook’s UK head of public policy Rebecca Stimson said: “New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech.
“These are complex issues to get right and we look forward to working with the Government and Parliament to ensure new regulations are effective," she added.
Susie Hargreaves, CEO of The Internet Watch Foundation, as part of the UK Safer Internet Centre, added: “We welcome the opportunity this consultation period affords, and we are looking forward to helping shape the future regulatory framework in the UK.
“It is of the utmost importance that the right thing is done for victims of child sexual abuse who deserve every opportunity to live in a world free from the circulation and reminder of the crimes committed against them.
“The world wide web is now 30 years old and we will bring our 23 years’ experience to help build a sustainable and safer online environment for the next 30 years and beyond," she concluded.