The Online Safety Bill has completed its passage through the UK Parliament, marking a significant moment in the country’s legislative history. Originally conceived as a strategy paper on internet safety, the bill came to be defined by the contentious debates it generated. The law aims to address harmful online behavior such as child sexual abuse, terrorist content, revenge pornography, and fraud: conduct already targeted in the analog world, but which required new legislation to compel technology companies to help combat it online.
In recent months, companies offering end-to-end encrypted messaging platforms have opposed the bill. They argued that one of its provisions would compromise user privacy and expose messages to interception, and some threatened to withdraw their services from the country. Reports suggested the provision’s scope had been narrowed, but those reports drew criticism. Now that the bill has completed its progress through the House of Lords, it is worth examining what the Online Safety Act actually entails.
Contrary to these concerns, the act does not ban end-to-end encryption. It includes a provision allowing the communications regulator, Ofcom, to direct messaging platforms to use accredited technology to identify specific types of content, namely terrorism content and child sexual abuse material. Ofcom must consider such a direction necessary and proportionate, and the technology used must be accredited. At present, no accredited technology exists, and Ofcom has yet to define the accreditation process. These conditions pose significant hurdles for the regulator before it could justify mandating technology, such as client-side scanning, that checks content before it is transmitted.
In 2022, researchers from GCHQ proposed measures to secure a CSS-style content-checking system, although their proposal did not reflect government policy. It is unclear whether the British government will endorse the GCHQ paper, or whether messaging companies would accept its findings. A similar power already exists under the UK’s Investigatory Powers Act, known as a Technical Capability Notice, which is subject to comparable limitations when served on technology companies. However, significant obstacles remain, including gaining support from the US Congress, since most of these companies operate under US jurisdiction.
Beyond encryption, the Online Safety Act covers a wide range of online activity. The 225-page law introduces safety and security requirements for platforms, including measures to prevent children from accessing inappropriate content and an obligation to verify users’ ages. Platforms must also protect content of democratic and journalistic importance, take action against fraudulent advertising, and report instances of child sexual abuse material to the National Crime Agency. Failure to comply can result in fines of up to £18 million ($22.3 million) or 10% of a company’s global turnover.