Welcome to Simply-Docs

User-to-User and Search Services and Protecting Children from Harm Online

May 2024

The topic of children’s safety online is rarely far from the news in an online world filled with user-generated content. Even with moderation and algorithms, it has become all too easy for users to share inappropriate, harmful, and even illegal content on user-to-user services. As the risks, and indeed the (sometimes fatal) consequences, increase, more effective regulation and enforcement are clearly needed, and they come in the form of the Online Safety Act. Under this new legislation, user-to-user and search services that are likely to be accessed by children face new safety duties in respect of harmful content.

User-to-user and search services must carry out a Children’s Access Assessment to determine whether their service is likely to be accessed by children. This is likely to be a low threshold in practice, and Ofcom suggests that most services that do not already employ highly effective age assurance measures will qualify. Having established that children are able to access the service, a service should then determine whether or not the Child User Condition is met:

  • A significant number of children are using the service; and/or
  • The service is of a kind to attract a significant number of children.

If this condition is met, the service in question will be subject to a wide range of duties designed to assess and mitigate the risk of child users being harmed. For those operating a user-to-user or search service that may meet these conditions, Ofcom’s latest Online Safety Act consultation (external link), launched on 8 May 2024, is both important reading and an opportunity to get an early look at the rules, codes, and guidance that they will need to follow.

What is in the Consultation?

This latest consultation from Ofcom covers the following:

  • How to assess if a service is likely to be accessed by children;
  • The causes and impacts of harms to children; and
  • How services should assess and mitigate the risks of harms to children.

Other key documents provided by Ofcom include:

  • Proposed Codes at a glance;
  • Guidance on completing children’s access and risk assessments (including risk profiles);
  • Draft Children’s Safety Codes; and
  • Large services guidance (a “large service” is one with an average user base of more than 7m per month in the UK).

Ofcom is proposing more than 40 safety measures in its draft Children’s Safety Codes for user-to-user and search services under the following broad headings:

  • Robust age checks. Ofcom expects much greater use of age assurance. All services which do not ban harmful content, along with those at higher risk of such content being shared on their service, should implement highly effective age checks to prevent children from seeing such content.
  • Safer algorithms. Services that use personalised recommender systems and are at higher risk of harmful content should configure their algorithms to filter the most harmful content out of children’s feeds and reduce the visibility of other harmful content.
  • Effective moderation. All user-to-user services should have content moderation systems and processes to ensure that action is taken quickly against content that is harmful to children. Search services should also have appropriate moderation systems in place.
  • Strong governance and accountability. Ofcom’s proposed measures under this heading include appointing a named person accountable for compliance with the children’s safety duties; an annual senior-body review of all risk management activities relating to children’s safety; and an employee code of conduct that sets standards for employees around protecting children.
  • More choice and support for children. This includes providing clear and accessible information for children and their carers, easy reporting and complaints processing, and tools and support to help children stay safe.

The most onerous obligations will, of course, only apply to the largest services. The draft Children’s Safety Codes take a proportionate approach, taking account of key factors including the type and size of service and the level of risk involved.

Ofcom expects its new measures to make a significant difference to children’s online experiences. If effective, these measures should help to ensure that:

  • Children cannot normally access pornography;
  • Children will be protected from seeing (and from having recommended to them) potentially harmful content;
  • Children will not be added to group chats without their consent; and
  • It will be easier for children to complain when they do see harmful content, and they can be more confident that such complaints will be acted upon.

Time will tell, of course, how effective such measures prove to be in the real world. Certainly, in a world where VPNs can easily circumvent any number of restrictions and ID credentials can be forged, those determined to access content inappropriate for their age will likely still be able to do so. It is to be hoped, however, that the Online Safety Act and Ofcom’s new measures will reduce the amount of harmful content that children are exposed to online against their wishes.

What’s Next?

This consultation is open until 17 July 2024 and represents a valuable opportunity for those businesses whose offerings are likely to be impacted to have their say. The consultation is available here (external link).

Ofcom expects to publish final documentation in this area early next year, after which in-scope services will have three months to comply. Given the level of detail in this consultation, there is certainly no harm in preparing ahead of time.

The contents of this Newsletter are for reference purposes only and do not constitute legal advice. Independent legal advice should be sought in relation to any specific legal matter.

Simply-4-Business Ltd Registered in England and Wales No. 4868909 Unit 100, Parkway House, Sheen Lane, London SW14 8LS