The UK’s long-heralded Online Safety Act (“the Act”) has now passed into law, and the likely impact of this legislation on a handful of the largest online platforms has been highly publicised. However, Ofcom estimates that around 100,000 services will be in scope – so what is the potential impact of the Act on the wider range of businesses that will need to comply?
Tiered legislation
The Act applies to two types of services: “user-to-user services”, which allow users to share content directly with other users, and “search services”, which are (or include) a search engine – in each case where the service has links with the UK. These are known as “regulated services”, which the Act divides into four categories:
- Category 1: the highest reach user-to-user services with the highest risk functionalities. The Act imposes the most onerous obligations on these services, including transparency, fraudulent advertising, and user empowerment requirements;
- Category 2a: the highest reach search services, which will face transparency and fraudulent advertising requirements;
- Category 2b: services with potentially risky functionalities or other risk factors, which will face transparency requirements; and
- all other regulated services in scope of the Act.
Although the Act creates various obligations for certain other services, notably pornography sites, most of its obligations apply to these four categories of regulated services. Ofcom has estimated that only around 30-40 services (of the 100,000 in scope) will meet the thresholds for Categories 1, 2a and 2b, so the vast majority of regulated services will fall into the final category above rather than one of the special categories.
But what about the rest of us?
The requirements that apply to all regulated services, including those outside the special categories, are naturally the least onerous under the Act. They nonetheless introduce new legal obligations, and for many businesses they will require looking at compliance through a new lens. Key areas for those services to consider are:
- Risk assessments. Regulated services will have to conduct certain risk assessments at defined intervals. The type of risk assessment a service provider must conduct depends on the nature and users of the service.
- Illegal content assessment: all providers of regulated services must conduct a risk assessment of how likely users are to encounter and be harmed by illegal content, taking into account a range of factors including user base, design and functionalities of the service and its recommender systems, and the nature and severity of harm that individuals might suffer due to this content.
- Children’s access assessment and children’s risk assessment: all regulated services must carry out an assessment of whether the service is likely to be accessed by children, and if so they must carry out a children’s risk assessment of how likely children are to encounter and be harmed by content on the site, giving separate consideration to children in different age groups.
These risk assessments must be updated periodically, and additional risk assessments carried out before the design or operation of the service is changed, to assess the impact of that change. Ofcom has suggested that regulated services should look to existing privacy compliance best practice as a model for this, and take “online safety into account in the way they currently do with data protection – through proactive risk assessments at the point at which they are developing new or updated products”.
- Safety duties. Having conducted the relevant risk assessments, service providers are then required to use proportionate measures to mitigate and manage the risks identified. Although this sounds onerous, Ofcom has taken pains to emphasise that “proportionality is central to the regime”, and that the Act “does not necessarily require that services are able to stop all instances of harmful content”, but rather that the safety duties are limited by what is proportionate and technically feasible. In addition, service providers will need to balance compliance with these safety duties against protection of freedom of expression and privacy, and keep records of all risk assessments and mitigation measures adopted.
- User reporting, complaints and rights. Regulated services also need mechanisms that allow users to easily report (i) content which they consider to be illegal, and (ii) content that is harmful to children, where the service does not use age verification or age estimation to prevent access by children. The Act also requires regulated services to include provisions in their terms of service addressing:
- how users can make specific complaints, relating to both non-compliance with the law and non-compliance with the service terms, and
- the user’s right to bring a claim for breach of contract if they are suspended or banned, or their content is taken down or restricted, in breach of that service’s terms.
The complaints provisions in particular must be easy to access and use, including by children; this will be challenging, as the concepts covered by these provisions are not straightforward. Providers may wish to look to privacy best practice again here and adopt techniques widely employed in privacy notices to convey complex information, such as layering, infographics and multimedia approaches.
- Notification – and possible fees? Providers of regulated services who are not exempt will need to notify Ofcom, and those meeting specified threshold conditions will need to pay a fee – but the exemptions and thresholds will be set out in delegated legislation, so the number of platforms in scope is not yet clear. As a comparison, organisations processing personal data must pay the ICO a similar regulatory fee of between £40 and £2,900 per annum, tiered according to size, and there are a number of exemptions to this requirement.
The consequences of not complying with obligations under the Act are potentially serious. The Act gives Ofcom a wide range of enforcement powers, from investigating and auditing services suspected of being in breach of their obligations, to taking business disruption measures against the worst and most evasive offenders. Ofcom can also issue fines, capped at the greater of £18 million or 10% of qualifying worldwide revenue – but any fine must be appropriate and proportionate to the failure for which it is imposed.
The devil is in the details
Despite running to over 250 pages, the Act relies on Ofcom and the Secretary of State to fill in most of the details about what providers actually need to do, including setting the thresholds for the special categories and for payment of fees. Ofcom also needs to produce more than 40 regulatory documents, including guidance and codes of practice, which will indicate and shape what good compliance looks like in practice: if a service takes the measures set out in the codes relating to the safety duties, it is treated as compliant with those duties under the Act. The Secretary of State even has discretion to make specific amendments to the legislation, for example potentially widening it to regulate content that is harmful to children on app stores. Therefore, although providers of services likely to fall outside the special categories should start thinking now about their key compliance areas, there is still a way to go until these providers know the full weight of their obligations under the Act.
Ria Moody is TMT Managing Associate at Linklaters