What the Online Safety Act will Mean for Businesses

October 2023

First introduced back in 2021, the Online Safety Bill set out to make the UK “the safest place in the world to be online”. It imposed a range of obligations on the operators of online platforms that enable the sharing of user-generated content, covering the design, operation, and moderation of those platforms.

The Bill had a tumultuous journey, seeing multiple amendments – some quite significant – and multiple prime ministers – some not so significant – along the way. On 19 September, however, the Bill passed its final parliamentary debate, and, on 26 October, it received Royal Assent, becoming the Online Safety Act.

What will the Online Safety Act Do?

The central purpose of the Online Safety Act is to make the internet safer, particularly for children. It aims to target a range of user-generated content shared online, particularly that which is illegal and/or harmful, such as child sexual abuse material (CSAM), online harassment, and content promoting self-harm, eating disorders, suicide, and similar dangerous material.

Other measures include those relating to content moderation, content filtering, age verification requirements for adult content, combatting fraudulent advertising, and more. Platforms that fail to comply will face GDPR-esque fines of up to £18 million or 10% of global revenue (whichever is greater), they may see their platforms blocked in the UK, and senior management could even face criminal penalties.

Since its introduction as the Online Safety Bill, the Act has faced a number of problems, not least the balancing act between protecting free speech, defining harmful content, and protecting users from that content.

Also highly controversial has been the issue of end-to-end encrypted messaging services and the potential for Ofcom to be given powers to require messaging platforms to allow such messages to be scanned. Indeed, so controversial were these proposals that big tech companies including Apple and Meta threatened to pull their services from the UK if the powers were implemented in their then-current form. The government has, however, recently stated that this requirement would not be enforced until suitable technology is available. That being said, plans to increase the scope and powers of the Investigatory Powers Act could still interfere with such messaging services, effectively putting an end to private encrypted communication in the UK.

The key features of the Online Safety Act are as follows. In relation to children, platforms will be required to:

  • remove illegal content or prevent it from appearing at all;
  • prevent children from accessing age-inappropriate and harmful content;
  • enforce age limits and employ age-verification methods;
  • publish risk assessments on potential online harms; and
  • provide reporting and complaints procedures for children and their parents.

Protections applying to all users include:

  • removing illegal content;
  • imposing a legal responsibility on platforms to comply with their terms and conditions; and
  • providing the facility to filter out content that is potentially harmful.

New criminal offences will also be introduced relating to cyber-flashing and the sharing of deepfake pornography. Further onerous obligations will address terrorism content.

What will the Online Safety Act Mean for You?

The Online Safety Act will apply to any service that enables user content to be shared between users or that enables users to search more than one site or database (i.e., search engines).

This will encompass a wide range of sites and services, including smaller platforms and more transient content. Blogs, forums, and listing sites, for example, will all be caught by the Online Safety Act, and the Act’s definitions cover websites, apps, and other software.

While big tech makes the headlines, the Government suggests that some 20,000 SMEs will be impacted by the new legislation as well. There are exemptions, but they are narrowly defined. Most notably, these include:

  • user-to-user services where the only user-generated content is emails, SMS, or MMS messages;
  • user-to-user services where the only user-generated content is one-to-one live aural communications;
  • services with limited functionality where users only communicate by posting or interacting with comments or reviews (or other interactions such as “liking” or “disliking”) relating to content published by the service provider (and not by other users); and
  • internal business services.

Implementing the Online Safety Act

A lot of the detail about the implementation of these tough provisions will come in the form of Codes of Practice to be issued by Ofcom. Now that the Online Safety Bill has become law, Ofcom is expected to begin consulting on its Codes of Practice and guidance soon, with the first consultation launching on 9 November.

The Online Safety Act’s working life promises to be as colourful and controversial as its development. Some would even argue that the legislation places impossibly high expectations on the operators of online platforms, and certainly many big tech companies have lodged strong objections to some of the more controversial aspects.

The full impact of the obligations, and the finer detail, are yet to be seen, but now that the Online Safety Act has become law we will be monitoring developments and awaiting Ofcom’s Codes of Practice. Our documents and guidance – our Website Terms & Conditions templates in particular – will be reviewed and updated in line with those developments and we will, as always, be keeping you up to date.

The contents of this Newsletter are for reference purposes only and do not constitute legal advice. Independent legal advice should be sought in relation to any specific legal matter.

Simply-4-Business Ltd Registered in England and Wales No. 4868909 Unit 100, Parkway House, Sheen Lane, London SW14 8LS