The long-awaited Online Harms White Paper, a joint publication from the Department for Digital, Culture, Media and Sport and the Home Office, was released on Monday. The paper proposes a new regulatory framework, setting clear expectations for technology companies to keep UK users (particularly children, young people, and vulnerable people) safer online.
The White Paper is released alongside a 12-week consultation. You can read the full paper and find out more about the consultation here.
The vision is outlined as:
A free, open and secure internet.
Freedom of expression online.
An online environment where companies take effective steps to keep their users safe, and where criminal, terrorist and hostile foreign state activity is not left to contaminate the online space.
Rules and norms for the internet that discourage harmful behaviour.
The UK as a thriving digital economy, with a prosperous ecosystem of companies developing innovation in online safety.
Citizens who understand the risks of online activity, challenge unacceptable behaviours and know how to access help if they experience harm online, with children receiving extra protection.
A global coalition of countries all taking coordinated steps to keep their citizens safe online.
Renewed public confidence and trust in online companies and services.
Why is this needed?
The internet has developed in ways we could not have predicted, and whilst many of these developments have enhanced our daily lives, there have also been unintended consequences. For many years, online platforms have operated under self-regulation, which has been applied inconsistently from platform to platform.
Just last week Mark Zuckerberg, CEO of Facebook, wrote in the Washington Post: “I believe we need a more active role for governments and regulators. By updating the rules for the Internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms.”
Corsham Institute fully agrees with this: for too long, social media companies have been playing by their own rules, and their approaches to online harms have varied widely. It is important to acknowledge, however, that other factors also need attention, such as public education in media literacy (which the White Paper recognises) and in digital citizenship. Whether online or offline, we are part of a connected society and should act responsibly and respectfully.
In his comment piece, Zuckerberg recognises that freedom of expression is important, and critics have noted that proposals in the paper could threaten freedom of speech. The Government is under public pressure to be seen to tackle online harms, but if a duty of care is introduced with the risk of fines, platforms will have a strong incentive to restrict content. It would be reasonable to assume that only people posting harmful content need be concerned.
However, an Ofcom study last year found that 45% of adult internet users had experienced some form of online harm, though it is important to note that this figure included targeted advertising, which fell into the ‘moderately annoying’ category. We must therefore think carefully about how we define harms. There are also risks in the Government effectively delegating censorship powers to the platforms themselves: if in doubt, they will restrict content to protect themselves.
What is the role of the regulator?
The independent regulator will be responsible for drawing up a ‘code of best practice’ for online platforms and for holding them to account. It has not yet been decided whether this role will go to an existing regulator (the White Paper hints at Ofcom) or a new body. The introduction of a regulator is a positive step that will bring a more consistent approach across platforms, but it is a huge task.
The new regulatory body will have to keep up with enormous volumes of content – on average, 300 hours of video are uploaded to YouTube each minute and 500 million tweets are posted each day. The White Paper encourages the use of technology as part of the solution, explaining that ‘companies should invest in the development of safety technologies to reduce the burden on users to stay safe online’. Safety by design is something the regulator will expect online platforms to be proactive about.
Implications for children and young people
For children and young people growing up in the digital world, the introduction of steps to limit the amount of harmful content online is positive. We know that young people can come across content online relating to self-harm or suicide and that these experiences can have a negative effect on their mental health.
The White Paper also recognises factors relating to the potentially ‘addictive’ nature of platforms and how this affects screen time. Corsham Institute contributed to the recent report #NewFilters, which looks at some of these issues, summarising the findings of the All-Party Parliamentary Group on Social Media and Young People’s Mental Health.
Whilst regulation is the focus of the White Paper, we cannot ignore the importance of education. We need to ensure young people are equipped with digital resilience and emotional intelligence, so that if they do come across something harmful online, they know what to do and who to tell.
We can never eliminate all risks online and doing so would not help young people become resilient. This is why we are working in partnership with ParentZone, offering their Digital Resilience and Wellbeing curriculum to local schools in Corsham. The curriculum covers topics including what it means to show empathy online, how to determine the safety of an online service and what to do if something troubling happens online.
We will respond to the consultation and look forward to seeing how this develops.