Frequently Asked Questions

  • June 29, 2023

Online Safety Bill FAQ

Q: How is the Online Safety Bill endangering public interest projects (PIPs)?

A: As currently drafted, the Bill applies to organisations that allow UK users to see user-generated content,(1) or that allow them to search other websites.(2)  This affects numerous public interest projects that serve or directly involve the public.

As the Bill stands, PIPs will be required to understand and apply this new 260-page law, which imposes at least 29(3) new and often onerous legal duties.  Worse still, because the Bill is a “skeleton” (or “future-proofed framework”) law, its full impact on PIPs will only become clear once they have also mastered dozens of additional “implementation” rules, guidelines and Codes of Practice to be issued by Ofcom and the Secretary of State.

New or evolving PIPs — no matter how important and beneficial they may be for the UK — will then be outlawed unless they first conduct “child access assessments” (“CAAs”, Clause 31) and “illegal content risk assessments” (“ICRAs”, Clause 8) for all projects that will involve user-generated content (such as a photography contest, or a discussion forum).  Each assessment must be documented (with records kept for inspection) and must be repeated: ICRAs have to be revised every time there are “significant changes” to the design or operation of the service, or to Ofcom’s guidance; CAAs have to be revised at least annually, again whenever the service design changes, and again if signs emerge that more under-18s may be using the service.  Assessments will in turn give rise to extra obligations (e.g. Clause 9, requiring new compliance measures).

The Bill’s clearest requirements are often the most problematic for PIPs: for example, even “citizen history” and “open science” projects will be required to perform statutory assessments of their impact on (i) illegal immigration; (ii) operation of unlicensed crossbow rental businesses; (iii) selling stolen goods; (iv) controlling prostitutes; and (v) displaying words contrary to the Public Order Act 1986 (among many other “Priority offences”) (Clause 8(5), read with Schedule 7).

The Bill may even subject the more widely-used PIPs to a new duty to submit annual earnings and userbase statistics to Ofcom, so that Ofcom can, if it sees fit to do so, charge that PIP a new “fee” — in essence, a tax to operate in the UK (Clauses 74-77).  Ofcom is also given the power to force PIPs to use content filtering and user blocking technologies, without judicial authorisation.  Those same “proactive technology requirement” powers have already attracted widespread criticism for threatening the privacy and confidentiality of WhatsApp and Signal conversations.

Noncompliance exposes PIPs to serious fines, UK blocking orders, and even staff imprisonment. 

Unable to manage this entirely new legal environment, many existing PIPs — some of which have served or been run by the UK public for decades — face closure, or could geoblock UK users.  New PIPs may never see the light of day, and those already operating will become change-averse (since some of the Bill’s obligations are triggered by “significant changes” to the “design or operation” of a website or app).  Many PIPs that do attempt to comply with the Bill, without Big Tech’s legal resources at their side, are likely to cut their risks: they can exclude under-18s, or suppress borderline-but-lawful content.  Even larger PIPs, like the non-profit Wikimedia Foundation that hosts Wikipedia, have spoken up about the risk of age-based discrimination and risk-reactive censorship.

Notes

(1) A “user-to-user service” is: “an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service” – OSB, clause 2(1)

(2) A “search engine”:

“(a) includes a service or functionality which enables a person to search some websites or databases (as well as a service or functionality which enables a person to search (in principle) all websites or databases);

(b) does not include a service which enables a person to search just one website or database.” – OSB, clause 201(1)

(3) Approximate figure only.  This only counts new duties applicable to regulated “user-to-user services”, based on a count of obligations drafted in the form “a duty (…) to… .”  This conservative approach means additional duties (and prohibitions) are omitted from the count, e.g. those expressed in the form “[x] shall” or “[x] shall not”.  This methodology also excludes (i) additional duties applicable only to services designated as “Category 1” or “Category 2a”, search engines, and/or pornography websites; and (ii) additional duties that arise only in relation to compliance with Ofcom regulatory actions, e.g. cooperation with investigations.

Q: Doesn’t the Bill just require sites to take “proportionate” steps, so requirements only cause problems if the sites actually pose a risk to their users?

A: No.  Some of the Bill’s requirements are indeed written in a nebulous, “future-proofed” and “proportionality-centric” way — allowing Ofcom and future governments to spell out more concrete requirements down the line — but other parts are already extremely specific.

For example, Clause 8(5) requires PIPs and other covered entities to specifically assess the risk of their projects being used to see content corresponding to every offence listed in Schedule 7 of the Bill, and to more generally assess the likelihood of their service being used to commit (or facilitate) those offences.  The list in Schedule 7 is four pages long and, as noted above, includes assisting illegal immigration, unlicensed crossbow rental, selling stolen goods, controlling prostitutes, and displaying words contrary to the Public Order Act 1986.

Q: What do the signatories want to see changed?

A: The fix is simple: the signatories request the addition of a new paragraph to Schedule 1 that would exempt PIPs from the Bill.  The suggested drafting of this amendment is as follows:

“Services provided in the public interest

(10A) A user-to-user service or a search service is exempt if it is provided for the purpose of indexing, curating, adapting, analysing, discussing or making available content in the public interest, including but not limited to historical, academic, artistic, educational, encyclopaedic, journalistic, or statistical content.”

Q: Would that exemption be open to abuse? How is “public interest” defined?

A: “Public interest” exemptions are already widely used in other UK laws.  For instance, they set aside some of the most onerous provisions of UK data protection law.  “Public interest” also appears in other important legislation, such as whistleblower protection law.

Someone abusing the exemption to harm the UK public would not be acting in the public interest, and would therefore be automatically disqualified from the exemption.

The Bill could also pair this exemption with a new power for judges, or Ofcom, to selectively suspend exemptions, in response to abuse – modelled on a similar provision in the Gambling Act 2005 (s. 284).  However, the UK’s neighbouring countries — such as France (which just built a similar exemption into its new social media law) — seem to view this as unnecessary.