EU: Combatting child abuse online is more complex than privacy vs protection
To protect children’s rights, policymakers, civil society and others need to engage fully with the complexities of the EU’s draft law to prevent and combat child sexual abuse.
EU policymakers are about to decide on one of the most contentious legislative proposals in recent memory: the draft Regulation to prevent and combat child sexual abuse. Its premise - protecting children online - is uncontroversial and necessary. However, its implications are causing profound disagreements. Child protection advocates broadly support the proposal as a proportionate and effective response to online abuse of children. But digital rights advocates and others think it is legally and technologically unsound, and warn against normalising surveillance.
Addressing a growing threat to children
It is essential that the EU act to address child sexual abuse. It is an extremely serious, cross-border problem, exacerbated by the internet. There has also been an alarming increase in the volume of reported abuse. Current responses are clearly inadequate and need reform. At the same time, child sexual abuse is a complex issue which no single law can solve. Progress against abuse can only be made if the proposal is seen within a wider context.
Some elements of the proposal could command consensus. For example, the proposal would require providers - platforms like Instagram, chat services like Messenger, cloud services like iCloud, file-sharing services like Dropbox, and other services - to assess and reduce the risk of children being abused on their services. Anyone who finds abuse online should be able to report it easily to the service. The proposal would also require providers to notify authorities whenever they become aware of abuse on their services. And an independent EU Centre could promote EU-wide knowledge sharing, coordination and victim support. As far as these go, the intention is good and clear.
Unclear impact on children’s rights
What is trickier is whether these uncontroversial aspects would be implemented in a rights-respecting way. And other parts are much more contentious, even in theory. Indeed, the impact of the proposal as a whole on children’s rights is far from clear.
This is partly due to some serious omissions in drafting the proposal. Children were not consulted directly. This is part of a bigger issue: the European strategy for a better internet for kids recognises that children’s underrepresentation in policymaking leads to unfulfilled needs and unmet expectations. Having their views heard in matters which affect them is not optional - it is children’s right.
The failure to consult children is a consequence of not carrying out a truly comprehensive “child rights impact assessment” - an analysis of how the proposal engages the full range of children’s rights. The insufficient understanding of how human rights are impacted more generally has led to discussions pitting privacy and protection against each other as incompatible. Even the Eurobarometer asks what is more important: the ability to detect child abuse or the right to online privacy.
But as an organisation working on all children’s rights, CRIN underlines that the “privacy versus protection” dilemma is false. All rights - from privacy and free expression to protection - are equally important, and the fulfilment of one right is necessary for the realisation of others.
In order to propose rights-respecting legislation, it is crucial to first carry out human rights impact assessments. These should identify all of the rights at stake - not just some - for all of those affected, not just some. And the analysis of possible impacts should reveal rather than conceal complexities. Policymakers should have a fully informed understanding of what is at issue before examining various options. These could include less intrusive measures that reconcile conflicting rights where possible, or going further and balancing those rights where necessary.
This question of balance is reflected in the fact that several aspects proposed by the Commission require more discussion to prevent the violation of the rights of the very children the proposal seeks to protect. The Parliament and the Council appear to be debating some of these in the ongoing legislative process, and it is important that this continues.
The main remaining challenge revolves around the detection of online abuse. Under the proposal, in some cases where “significant risk” is identified, providers could be ordered to detect abuse on a specific service. But several EU bodies - including the Council’s Legal Service - argue that detection orders, as envisaged by the Commission, might not be necessary and proportionate under human rights law. This is because they would allow generalised access to the content of people’s communications, instead of targeting users suspected of connections to child sexual abuse. There are concerns that the detection orders might infringe the essence of the fundamental rights to privacy and data protection. So there is a real risk that detection orders, if passed in their current form, would be challenged before the EU Court of Justice and struck down.
Should detection orders apply in end-to-end encrypted environments? Some argue that these environments enable the spread of child abuse material. Others see restrictions on encryption as jeopardising everyone's privacy. But again, CRIN's research shows that it is not black and white. Encryption poses both benefits and risks to children's rights, and its impact - positive or negative - varies significantly for different children. Given this complexity, there should be no generalised ban on encryption in law.
The UN High Commissioner for Human Rights has warned against "all direct, or indirect, general and indiscriminate restrictions on the use of encryption". Where technologies that weaken or circumvent encryption are used, especially if intended for deployment at scale, there should be a thorough understanding of their implications, a stringent review of their workability, and sufficient agreement on their compliance with fundamental rights. None of this is a foregone conclusion under the proposal.
Additionally, the proposal does not properly deal with sexual material shared consensually between teenagers. It could effectively draw some under-18s into the criminal justice system for consensual sexting. It could also put LGBT+ children, for example, at risk: they could face marginalisation and even violence if their intimate communications were revealed.
The proposal also seems to introduce the verification or estimation of users’ ages online, but without adequate safeguards. This is concerning, because age verification and assessment methods each impact rights differently. For instance, biometric analysis of users’ faces or voices could be discriminatory due to skewed data and biased algorithms, and its high intrusiveness could threaten children’s privacy and data protection. Meanwhile, verifying age through official documents could disadvantage particular groups.
Continuing the conversation
A lot remains to be discussed if the proposal is to fully respect the rights of the children it aims to protect. It is an important effort towards combatting abuse. But no single law, policy or technology can protect children online or fulfil their rights more broadly. For progress to happen, the conversation must continue. Child sexual abuse is a societal problem, which needs to be addressed by a wide range of actors working on child protection and beyond.
Read CRIN’s explanation of the technology for detecting child sexual abuse online.
Read CRIN and defenddigitalme’s research on a children’s rights approach to encryption.
Sign up to our newsletter to stay updated and follow us on Twitter, LinkedIn and Instagram.