Legislative proposals
In recent years, there has been an increase in the number of legislative proposals and other initiatives around the digital environment that affect encryption, often with the aim of keeping people safe.276
In 2018, the UN Special Rapporteur on freedom of expression identified a variety of trends in State restrictions on encryption.277
- Some States have adopted criminal laws banning the use of encryption, as Iran has done through its Computer Crimes Act.
- Some, like Russia, have passed laws requiring registration and government approval of encryption tools.
- Some countries have put forward frameworks to provide law enforcement and security agencies with access to communications. For example, China’s Cybersecurity Law requires network operators to give ‘technical support and assistance’ to public and national security organs. The UK’s Investigatory Powers Act 2016, supplemented by secondary regulations in 2018, allows authorities to issue a “technical capability notice” to online services, which can compel them to build backdoors and remove end-to-end encryption. It has been dubbed “the Snoopers’ Charter” by privacy campaigners and was described as “the most intrusive and least accountable surveillance regime in the West” by Edward Snowden in 2015.278 Australia followed suit, passing the Assistance and Access Act 2018, which requires service providers to develop the technical capability to assist law enforcement and intelligence agencies. A similar proposal in the US, the Lawful Access to Encrypted Data Act, was put forward in 2020.
- Other States have used encryption as justification to institute broad hacking regimes, or have required online services to store personal or sensitive data locally, including encryption keys.
- Yet others, like India and Brazil, have proposed traceability requirements, asking service providers to be able to identify the original sender of a message (a simplified sketch of one possible traceability mechanism follows this list).279
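To make the traceability idea concrete, the sketch below shows one mechanism that has featured in public debate around such requirements: the platform keeps a registry mapping a fingerprint of each message to its first sender, so that a message produced under a court order can be traced to its originator without decrypting traffic in transit. This is a minimal Python illustration under those assumptions, not the mechanism of any actual proposal; the registry, identifiers and function names are hypothetical.

```python
import hashlib

# Hypothetical platform-side registry mapping a message fingerprint to the
# first account that sent it. Later forwards of the same content do not
# overwrite the original entry.
origin_registry: dict[str, str] = {}

def fingerprint(message: bytes) -> str:
    """Derive a stable fingerprint of the message content."""
    return hashlib.sha256(message).hexdigest()

def record_send(sender_id: str, message: bytes) -> None:
    """Record the first sender of a message; forwards keep the original."""
    origin_registry.setdefault(fingerprint(message), sender_id)

def trace_originator(message: bytes) -> str | None:
    """Given a message (e.g. disclosed under a court order), return the
    first sender, if the platform has seen that exact content before."""
    return origin_registry.get(fingerprint(message))

# Usage: Alice sends a message, Bob forwards it; tracing returns Alice.
record_send("alice", b"example forwarded message")
record_send("bob", b"example forwarded message")
assert trace_originator(b"example forwarded message") == "alice"
```

Even this simplified scheme shows why critics treat traceability as an erosion of end-to-end encryption guarantees: the platform must retain per-message metadata about every sender, and the lookup is defeated by changing a single byte of the message.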
The first attempt at legislation for online safety came from Australia in 2015, with its Enhancing Online Safety Act. This was updated in 2021 by the Online Safety Act, which came into effect in January 2022.280 It provides a set of Basic Online Safety Expectations for online services that make them accountable for users’ safety. It also requires industry to develop mandatory codes for illegal and restricted content, which can require platforms to remove child sexual abuse material and put greater pressure on online services to protect children from content which is not age-appropriate. The Act gives considerable power to the eSafety Commissioner, who can impose standards for the industry if no agreement is reached on the codes or if the standards developed are not appropriate.
Arguably, 2022 has been the most important year so far for regulatory discussions about protecting children online, particularly from sexual abuse and exploitation. Three proposals were put forward and are currently under discussion in the US, the UK and the EU. Their aims are uncontroversial, but the means they propose for keeping children safe, and the impact of those means on (end-to-end) encryption, are giving rise to disagreement.
US Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2022 (EARN IT Act of 2022) 281
Changes envisaged by the proposal
The EARN IT Act of 2022 was introduced in January 2022. The original version of the bill had been introduced in 2020.
The Act creates the National Commission on Online Child Sexual Exploitation Prevention, whose purpose is to “develop recommended best practices” that platforms can choose to implement to “prevent, reduce and respond to the online sexual exploitation of children”.282
The Act also amends Section 230 of the Communications Act of 1934, the current regime for intermediary liability online.284
Currently, Section 230 prevents platforms from being treated as the publisher or speaker of what users post online. No liability can be imposed under State law that is inconsistent with this section. However, under Section 230 platforms’ immunity does not extend to federal criminal law regarding the sexual exploitation of children. It is a federal crime for platforms to knowingly possess or share child sexual abuse material.285 Platforms are also required to report such material on their services that they know about.286
The EARN IT Act removes platforms’ immunity regarding the “advertisement, promotion, presentation, distribution, or solicitation” of child sexual abuse material in civil and criminal actions under State law.287 However, none of the following three factors — that platforms use end-to-end or other encryption on their services, that they do not have the necessary information to decrypt a communication, or that they fail to take actions that would undermine their ability to offer end-to-end or other encryption — can be an “independent basis for liability”. Courts can consider evidence regarding those three factors if it is otherwise admissible.288
Areas of disagreement regarding encryption and children’s rights
The aim of the EARN IT Act of 2022 is to fight the online sexual exploitation of children.
However, there have been warnings that the Act threatens the privacy of all users, that it could lead to over-removal of content, and that it might make it more difficult to prosecute those who exploit children online.283
There are concerns that the Act might impose, in effect, a monitoring obligation on platforms.
This is because, even though under federal law the standard for liability is actual knowledge of child sexual abuse material, State laws might set a lower standard, such as recklessness or negligence regarding its existence. A platform that does not know about child sexual abuse material on its services might therefore be liable under State law if it should have known about, or was negligent regarding, the existence of this material. The concern is that claimants or prosecutors might argue, for example, that offering encryption services is not an independent basis for liability but is one of the factors contributing to the platform’s reckless behaviour. Therefore, in order to avoid costly and lengthy litigation, platforms could be pressured to weaken or remove encryption from their services. They might also be incentivised to use client-side scanning to detect child sexual abuse material before a communication is encrypted or after it is decrypted, as sketched below.
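For readers unfamiliar with the technique, here is a minimal sketch of hash-based client-side scanning under simplifying assumptions. Deployed and proposed systems (such as Microsoft’s PhotoDNA or Apple’s shelved NeuralHash design) use perceptual hashes that survive resizing and re-encoding; the exact cryptographic hash and all names below are illustrative, not any vendor’s implementation.

```python
import hashlib

# Illustrative fingerprints of known prohibited images. Real systems use
# perceptual hashes robust to re-encoding; SHA-256 exact matching is a
# deliberate simplification here.
KNOWN_HASHES = {hashlib.sha256(b"known-prohibited-sample").hexdigest()}

def matches_known_material(plaintext: bytes) -> bool:
    """Check the plaintext against the fingerprint list.

    This runs on the user's device before encryption (or after
    decryption), which is why critics say client-side scanning
    sidesteps, rather than technically breaks, end-to-end encryption.
    """
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_HASHES

def send_message(plaintext: bytes, encrypt, transmit, report) -> None:
    """Scan on-device, then encrypt and transmit; report any match."""
    if matches_known_material(plaintext):
        report(plaintext)
    transmit(encrypt(plaintext))

# Usage with stand-in callbacks (no real cryptography or networking):
send_message(
    b"known-prohibited-sample",
    encrypt=lambda p: p[::-1],        # placeholder for real E2E encryption
    transmit=lambda c: None,          # placeholder network send
    report=lambda p: print("match reported to provider"),
)
```

The design point that drives the disagreement is visible in the sketch: the end-to-end encryption itself is left intact, but the plaintext is inspected on the device before it is protected, which critics argue defeats the confidentiality that encryption is meant to guarantee.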
Critics also warn that the Act would deputise platforms as agents of the government, making the evidence they obtain inadmissible in court under the Fourth Amendment to the US Constitution, which prohibits “unreasonable searches and seizures” of individuals’ communications by law enforcement acting without a warrant.289
UK Online Safety Bill 290
Changes envisaged by the proposal
The UK Government introduced the Online Safety Bill in the House of Commons in March 2022. It had proposed the draft bill in May 2021, following its response to the public consultation on the Online Harms White Paper of April 2019. In 2022, the UK Government invested £500,000291 in the “No Place to Hide” campaign,292 which asks social media companies to commit to rolling out end-to-end encryption only when “they have the technology to ensure children will not be put at greater risk as a result”.
The Online Safety Bill imposes duties of care on providers of user-to-user services and search services.293 All these providers have a duty to address illegal content such as child sexual exploitation and abuse, by carrying out risk assessments and taking proportionate measures to effectively mitigate and manage the risk of harm to individuals.294 For example, user-to-user services have a duty to prevent individuals from encountering child sexual exploitation and abuse content, minimise the time for which such content is present, and swiftly take it down where they become aware of it.295 All such content that is detected must be reported to the National Crime Agency.296
In addition, user-to-user services and search services that are likely to be accessed by children must carry out a children’s risk assessment and take proportionate measures to protect children from content that is harmful to them.299 Such content is to be defined by the Secretary of State in secondary legislation.300
Ofcom, the UK’s communications regulator, can impose a “proactive technology requirement” on a service for the purpose of complying with the illegal content duties and children’s online safety duties.301 Moreover, Ofcom can order a provider of services to use “accredited technology” to identify and swiftly take down child sexual exploitation and abuse content, whether communicated publicly or privately.302 In deciding whether it is necessary and proportionate to order this, Ofcom must consider a number of factors, including the kind of service, its functionalities, its user base, the prevalence and dissemination of the content, the risk and severity of harm, the systems and processes used by the service to identify and remove the content, and the risks to users’ freedom of expression and privacy.303
Ofcom can request providers of services to give any information that Ofcom requires for exercising, or deciding to exercise, its functions.304 It is a criminal offence to provide information which is encrypted such that Ofcom cannot understand it, where the intention is to prevent Ofcom from understanding that information.305
Areas of disagreement regarding encryption and children’s rights
The Online Safety Bill is intended to deliver the commitment to “make the UK the safest place in the world to be online”, including for children.297
Concerns around the Online Safety Bill centre on the fact that, in practice, it seems to impose a general monitoring obligation, even for providers of services that use end-to-end encryption. In order to comply with the risk assessment and content moderation duties, as well as with any requirements from Ofcom, service providers would need to scan all user content. The failure to distinguish between public platforms and private messaging services means that offering end-to-end encryption might violate the duties under the Bill.298 Platforms might have to use client-side scanning before the communication is encrypted or after it is decrypted.
More broadly, critics have warned that the Bill focuses too much on content moderation instead of tackling the business model of platforms (the monetisation of users’ attention), deputises platforms to make determinations regarding the illegality of content, infringes users’ freedom of speech and privacy by covering “harmful” content that is not illegal, and endangers the independence of Ofcom by giving too much power to the Secretary of State over the implementation of the Bill.306
From a children’s rights perspective, it is concerning that so much power lies with Ofcom, even though it does not have specific expertise in this area.
EU proposal for a “Regulation laying down rules to prevent and combat child sexual abuse” 307
Changes envisaged by the proposal
The EU proposal for a Child Sexual Abuse Regulation (“CSAR”) was put forward in May 2022.
The EU CSAR was developed in the context of the EU strategy for a more effective fight against child sexual abuse, which was adopted in July 2020.308 The strategy provides a framework for developing a comprehensive response to online and offline child sexual abuse. It sets out various initiatives, including ensuring the full implementation of existing legislation such as the Child Sexual Abuse Directive,309 identifying gaps and proposing new legislation, strengthening law enforcement and prevention efforts, and creating a European centre to prevent and counter child sexual abuse. In November 2020, the Council of the EU issued a resolution on “Security through encryption and security despite encryption”.310 In July 2021, the EU adopted a temporary derogation from the ePrivacy Directive,311 allowing service providers to take voluntary measures to detect, report and remove child sexual abuse material. In October 2022, the EU adopted the Digital Services Act,312 amending a 20-year-old directive313 that applies to online services.
The EU CSAR sets out rules to address “the misuse of relevant information society services for online child sexual abuse”.317 These services are defined as: hosting services, interpersonal communications services, software applications stores, and internet access services.318
The CSAR imposes risk assessment, mitigation and reporting obligations on hosting and interpersonal communication services regarding online child sexual abuse. This covers the “dissemination of material previously detected and confirmed as constituting child sexual abuse material (‘known’ material), but also of material not previously detected that is likely to constitute child sexual abuse material but that has not yet been confirmed as such (‘new’ material), as well as activities constituting the solicitation of children (‘grooming’)”.319
When carrying out a risk assessment regarding online child sexual abuse, among other factors, services must take into account various functionalities to address the risk such as prohibitions and restrictions laid down in terms and conditions and ways to enforce them, age verification and reporting tools.320
They must also consider the manner in which users use the service,324 and the manner in which the provider designed and operates it.325 Regarding the risk of solicitation of children, they must consider, for example, functionalities such as enabling users to contact others directly and share images or videos with them, particularly through private communications.326 Services must then take mitigation measures to minimise the risks identified. These measures must be effective, targeted and proportionate, and they must be applied in a non-discriminatory manner, with due regard to the consequences for fundamental rights.327 Services must also report the risk assessment and mitigation measures to the national Coordinating Authority.328 The Coordinating Authority can request the competent national judicial authority to issue a detection order where there is a significant risk of the service being used for abuse and the benefits of issuing the detection order outweigh the risks for the rights of all parties.329
Services that receive detection orders must install and operate technologies that detect abuse,333 which must be effective, sufficiently reliable, not able to extract any information other than that which is strictly necessary, and the least intrusive in terms of the impact on users’ privacy, including the confidentiality of communication.334
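To illustrate the breadth of what a detection order covers, below is a minimal sketch of the three detection categories the CSAR names: ‘known’ material, ‘new’ material and grooming. It assumes upstream classifiers already produce confidence scores; the threshold, names and structure are hypothetical, and real deployments would use perceptual hashing for ‘known’ material and machine-learned models for the other two.

```python
import hashlib
from dataclasses import dataclass

# Fingerprints of previously detected and confirmed ('known') material.
# Purely illustrative content and matching method.
KNOWN_FINGERPRINTS = {hashlib.sha256(b"confirmed-sample").hexdigest()}

@dataclass
class Flag:
    category: str  # "known", "new" or "grooming"
    score: float   # confidence in [0, 1]

def detect(content: bytes, new_material_score: float,
           grooming_score: float, threshold: float = 0.99) -> list[Flag]:
    """Apply the CSAR's three detection categories to one item.

    A fingerprint match on 'known' material is near-certain, whereas
    'new' material and grooming rely on classifier scores, so the
    threshold directly trades detection rate against false positives.
    """
    flags = []
    if hashlib.sha256(content).hexdigest() in KNOWN_FINGERPRINTS:
        flags.append(Flag("known", 1.0))
    if new_material_score >= threshold:
        flags.append(Flag("new", new_material_score))
    if grooming_score >= threshold:
        flags.append(Flag("grooming", grooming_score))
    return flags

# Usage: an exact fingerprint match plus a sub-threshold classifier score.
print(detect(b"confirmed-sample", new_material_score=0.95, grooming_score=0.2))
```

The distinction between the categories matters for the proportionality debate: matching ‘known’ material is comparatively precise, while scoring ‘new’ material and grooming is probabilistic, so the reliability condition in the Regulation effectively constrains how aggressively providers can set their classifier thresholds.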
The EU CSAR also establishes the EU Centre on Child Sexual Abuse as an independent entity, although one that relies on the support services of Europol. The EU Centre has a number of tasks, from facilitating the generation and sharing of knowledge and expertise to acting as a dedicated reporting channel for the EU and, in some circumstances, conducting online searches for publicly accessible abuse material.335
Areas of disagreement regarding encryption and children’s rights
By way of background, the various EU initiatives and laws take differing approaches to end-to-end encryption. The EU strategy for a more effective fight against child sexual abuse acknowledges the use of encryption for criminal purposes and calls for “possible solutions which could allow companies to detect and report child sexual abuse in end-to-end encrypted electronic communications”.314 The EU Council Resolution on Encryption refers to “technical solutions for gaining access to encrypted data”, noting that they should respect the “principles of legality, transparency, necessity and proportionality including protection of personal data by design and by default”.
On the other hand, the text of the temporary ePrivacy derogation specifically states that nothing in it should be interpreted as “prohibiting or weakening end-to-end encryption”.315 The Digital Services Act retains the prohibition on general monitoring, meaning that service providers cannot be asked to monitor information transmitted or stored, or actively seek circumstances indicating illegality.316
The EU Parliament had approved language protecting end-to-end encryption, but this did not make it into the final version of the Digital Services Act.321
With regard to the EU CSAR itself, some EU authorities and civil society organisations have warned that the proposal poses risks to encryption and fundamental rights.
Data protection bodies consider that the proposal raises “serious data protection and privacy concerns” and have called for it to be amended, “in particular to ensure that the envisaged detection obligations meet the applicable necessity and proportionality standards and do not result in the weakening or degrading of encryption on a general level”.322
Looking at communications around the proposal, such as the Impact Assessment and public statements from the EU Commission, it has been argued that end-to-end encryption would be treated as a factor making a service risky. To mitigate that risk, services could feel pressured to remove encryption or to apply client-side scanning. This pressure would apply even to services that are not subject to a detection order.323
Concerns have also been raised regarding the degree to which the EU Centre would actually be independent from Europol and law enforcement, with some fearing that the proposal in practice gives a mass surveillance mandate to a centralised police organisation.330
Internal documents suggest that EU Member States are divided.331 Austria, for example, adopted a binding resolution to reject the EU proposal in its current form, given the risk of a general monitoring obligation and the threat this poses to encryption and fundamental rights.332
***
Footnotes
276 For an overview of recent regulatory discussions, see Tech Against Terrorism, Terrorist Use of E2EE: State of Play, Misconceptions, and Mitigation Strategies, 2021; For a global overview of the legal status of encryption, see Global Partners Digital, World map of encryption laws and policies, see here.
277 Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression as follow-up to the 2015 report on the use of encryption and anonymity to exercise the rights to freedom of opinion and expression in the digital age, A/HRC/38/35/Add.5, 13 July 2018, see here.
278 For his remarks on Twitter, see here.
279 See Tech Against Terrorism, Terrorist Use of E2EE: State of Play, Misconceptions, and Mitigation Strategies, 2021, pp. 33-34.
280 Australian eSafety Commissioner, Online Safety Act 2021 - Fact sheet, July 2021, see here.
281 Available here.
282 Section 3 of the EARN IT Act of 2022.
283 See, for example: Riana Pfefferkorn, The EARN IT Act Is Back, and It’s More Dangerous Than Ever, 4 February 2022, see here; Jeffrey Westling, Unintended Consequences of the EARN IT Act, 23 February 2022, see here.
284 See under Title 47 of the US Code: 47 U.S.C. 230 (e), available here.
285 Section 2252A, available here.
286 Section 2258A, available here.
287 Section 5 of the EARN IT Act of 2022. It also authorises federal civil suits for conduct that violates Sections 2252 or 2252A of the US Code.
288 Section 5 of the EARN IT Act of 2022.
289 Available here.
290 Available here. The analysis is based on the text of the Bill as of 5 December 2022.
291 Computer Weekly, Government funds charity campaign to warn big tech over the risks of encryption to children, 19 January 2022, see here.
292 Available here.
293 See, for example, sections 2, 6, 7, 22, 23 of the Online Safety Bill.
294 See, for example, sections 8, 9, 24, 25 of the Online Safety Bill.
295 Section 9 of the Online Safety Bill.
296 Section 60 of the Online Safety Bill.
297 UK Government, Online Safety Bill: factsheet, last updated on 19 April 2022, see here.
298 See, for example: ARTICLE 19, UK: Online Safety Bill is a serious threat to human rights online, 25 April 2022, see here.
299 See, for example, sections 10, 11, 26, 27 of the Online Safety Bill.
300 Section 54 of the Online Safety Bill.
301 Section 120 of the Online Safety Bill.
302 Section 106 of the Online Safety Bill.
303 Section 108 of the Online Safety Bill.
304 Section 87 of the Online Safety Bill.
305 Section 94 of the Online Safety Bill.
306 ARTICLE 19, UK: Online Safety Bill is a serious threat to human rights online, 25 April 2022, see here.
307 Available here.
308 Available here.
309 Directive 2011/93/EU, available here.
310 Available here.
311 Regulation (EU) 2021/1232, available here.
312 Regulation (EU) 2022/2065, available here.
313 Directive on electronic commerce or E-Commerce Directive 2000/31/EC, available here.
314 Introduction to the strategy.
315 Recital 25 of the temporary derogation.
316 Art. 8 of the Digital Services Act.
317 Art. 1 of the CSAR.
318 Art. 2(f) of the CSAR.
319 Recital 13 of the CSAR.
320 Art. 3(2)(b) of the CSAR.
321 European Pirate Party, Digital Services Act: Decision in part strengthens, in part threatens privacy, safety and free speech online, 20 January 2022, see here.
322 European Data Protection Board and the European Data Protection Supervisor (EDPB-EDPS), Joint Opinion 04/2022 on the Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse, 28 July 2022, see here.
323 EDRi, Private and secure communications attacked by European Commission’s latest proposal, 11 May 2022, see here.
324 Art. 3(2)(c) of the CSAR.
325 Art. 3(2)(d) of the CSAR.
326 Art. 3(2)(e)(iii) of the CSAR.
327 Art. 4(2) of the CSAR.
328 Art. 5(1) of the CSAR.
329 Art. 7(4) of the CSAR.
330 Centre for Democracy and Technology (Europe Office), Briefing on Key Concerns Relating to a Proposal for Regulation laying down the Rules to Prevent and Combat Child Sexual Abuse (CSAM), 26 May 2022, see here.
331 Patrick Breyer MEP, Chat control: Internal documents show how divided the EU member states are, 15 September 2022, see here.
332 epicenter.works, Chat control - a good day for privacy, 3 November 2022, see here.
333 Art. 10(1) of the CSAR.
334 Art. 10(3) of the CSAR.
335 Arts. 40-50 of the CSAR.