A children’s rights approach to encryption: Principles for policy-makers

 

The realisation of the full range of children’s rights in digital environments is complex and nuanced. There are no one-size-fits-all solutions. This report sets out a principles-based set of recommendations for future regulation that respects children’s rights.

Challenges persist in upholding children’s right to protection from violence in the digital environment, including in the detection and reporting of abusive content and in taking action against perpetrators. State support also remains insufficient and inconsistent for the prevention of violence against children, for assistance to victims and survivors, and for cross-border cooperation.

Where there is interference in children’s behaviour and activity in the digital environment, including access to digital content and/or content moderation, whether encrypted or not, the law must be applied to the specific context and case, and its impact assessed both in terms of the big picture at scale and the specific incident.

An understanding of the functioning of encryption and the roles that it plays in the digital ecosystem is essential for effective and rights-respecting regulation. The different purposes of cryptography need to be understood in the context of where it is used in the digital environment.

Encryption cannot be addressed in isolation as a child protection issue, or placed in opposition to privacy and security, but must be seen as part of the systems in the digital environment. The digital environment itself forms part of the wider societal ecosystem. No single law or technological development can protect children online or secure their human rights in isolation. Each part of the wider societal ecosystem requires both proper investment and recognition of its limitations. Attention should be paid to the wide range of actors that engage children in society, including law enforcement, health services, social services, schools and other institutions, the role that each can and should effectively and legitimately play, their boundaries, and the need for cooperation between them.


Framing and Process

1. Actions affecting the digital environment must respect the full range of children’s rights.

All interventions that affect the digital environment in general, and actions that engage encryption in particular, must respect the full range of children’s rights, from protection from violence to privacy and freedom of expression.

  • Privacy and protection: Discussions should move beyond the dichotomy “privacy versus protection”. All those involved in decision-making processes should recognise that all children’s rights, including privacy and protection, are universal, indivisible and interdependent. This means that these rights apply to all children everywhere. No set of rights is more important than others; all rights are equally important. These rights also support each other, with the fulfilment of each being necessary for the realisation of others.

  • Child rights impact assessments: All interventions that have a significant impact on children must be based on child rights impact assessments. This should involve pre-legislative scrutiny that assesses the impact of any law reform proposal on the full range of children’s rights. Where an independent body is responsible for regulation, that regulator must include sufficient child rights expertise. Businesses with a significant impact on children’s rights in this context should also conduct children’s rights impact assessments, act on the outcomes of those assessments, and report on their implementation.

2. Interventions engaging encryption must be seen within a wider ecosystem.

No single law, policy or technological development can protect children online or secure their human rights more broadly. Encryption cannot be addressed in isolation, but only as part of a wider ecosystem with a range of actors that can meaningfully interact, each with its own role that it can effectively and legitimately play.

  • Start with the societal problem: Encryption should not be the starting point in debates around societal problems affecting children. Instead, policy-makers should identify the policy goal to be achieved and consider the range of options, of a technological nature or otherwise, that could be implemented for this purpose. In assessing possible solutions, policy-makers should consider the variety of actors interacting in the societal ecosystem, including governmental agencies, law enforcement, health services, social services, schools, care centres and other institutions.

  • Beware of techno-solutionism: Policy-makers and other stakeholders should avoid relying on one-size-fits-all technological fixes. Decision-making should be based on a thorough understanding of the complex technological landscape, including in particular the multiple roles that encryption and other technologies play. Policies should be grounded in the reasonable capabilities of technology as it is, not as it is hoped to be.

  • Support the complete child protection ecosystem: Child protection requires human trust and meaningful interaction across solid infrastructures for knowledge-sharing and intervention. To the extent that laws, policies and other initiatives already exist for the purpose of child protection, they should be fully implemented. There should be an emphasis on prevention and education, and appropriate funding should be provided to the wide range of services interacting in the ecosystem, from law enforcement and the justice system, to social services and victim support. Particular emphasis should be given to staff training, which should include, where appropriate, digital evidence management, analysis and practice, in order to promote the investigation and prosecution of the perpetrators of technology-enabled violence against children. Physical and mental health support services for child and adult victims and survivors of child sexual exploitation and abuse must be a priority. The need for a multidisciplinary approach to protection should be emphasised in order to break down barriers to cooperation between disciplines and professionals.

3. All those with relevant expertise must be involved.

All professionals with relevant knowledge must be able to engage in discussions and decision-making regarding children and the digital environment, including on encryption. They must be able to do so on an equal footing and in an environment of mutual respect. Conversations must include specialists working on child protection, technology and Internet regulation, data protection and privacy, as well as participants with more generalist expertise in children’s rights, human rights and digital rights. The views of civil society, academia, government, law enforcement and the business sector must be taken into account. Particular efforts should be made to include those working outside currently dominant Anglo- and Euro-centric spaces.

  • Language: There should be a recognition of the extreme sensitivity of aspects of the debate around encryption and children’s rights, particularly as regards online child sexual exploitation and abuse. Those involved in discussions should exercise empathy and pay special attention to the framing and language used, as well as the expectations that are being created for victims and survivors of abuse.

  • Data: Emphasis should be placed on the importance of accurate data, in particular about the scale of abuse and the accuracy of content-detection technologies. All participants in discussions should strive to fully explain how they use data in support of their arguments, in order to help distinguish between the various causes of problems and move the debate on solutions forward.

4. Children and other directly affected communities must be heard and their views given due weight.

Children’s right to have their voices heard and given due weight must be upheld in all decision-making processes which concern them. Other directly affected communities, such as the adult victims and survivors of child sexual exploitation and abuse or those disproportionately affected by policing, surveillance, intelligence gathering or other intrusive data practices, must also be meaningfully included in these processes. Assumptions should not be made about the outcomes these groups may want. Not all children or members of a community have the same experiences, views or concerns. Decision-making processes should therefore aim to include diverse voices.


5. Policy-makers engaging with encryption must address the impact beyond their own jurisdiction.

The digital environment is interconnected and regulation in one jurisdiction is very likely to cause ripple effects in others, or even globally. Policy-makers must work to understand these links, including by engaging in conversations with those working in different jurisdictions, especially where they are not part of the dominant Anglo- and Euro-centric debates.


Substance

6. There should be no generalised ban on encryption for children.

If encryption were removed from all services that children use, far from protecting them, this would leave them vulnerable to a wide range of exploitation and abuse. It is possible to regulate the applications of encryption; however, any such regulation must be consistent with children’s rights.


7. Interventions engaging encryption must be context-specific.

Measures should be tailored to the diverse experiences of children as full rights-holders, including children from disadvantaged and marginalised groups. Interventions must consider and address specific political, economic, social and cultural contexts and the varied ways in which children relate to the State, businesses, and their community and family.

  • Real-world uses of the digital environment: Those involved in decision-making should promote a better understanding of the variety of real-world uses of the digital environment, including communications involving medical information, legitimate political organisation in repressive environments, or routine reliance on particular platforms where access to other services is limited. More efforts should be made to include perspectives which are not necessarily consistent with the expectations of those within Anglo- and Euro-centric contexts.

  • The repurposing of technology: There should be a recognition that technologies for content detection in the digital environment can be repurposed. The nature of the content that needs to be identified is not technology-specific, but policy-specific. Tools used to detect illegal content, such as child sexual abuse material, could also be deployed to identify legitimate content and infringe the rights of those accessing it.

8. Measures engaging encryption must be legal, necessary and proportionate.

Interventions engaging encryption should respect the principles of legality, necessity and proportionality. These principles apply to the content of communications, but also to the collection, sharing and retention of metadata. Measures should be provided for by law and should be sufficiently clear and precise. They should be limited to achieving a legitimate policy goal and should be the least intrusive way of doing so. Because interventions constitute limitations on children’s qualified rights such as privacy, they must be necessary and proportionate, and must strive for a high degree of specificity rather than applying indiscriminately.


9. Policy-making should address the role of business.

Regulation and policy should mandate more transparency around how platforms prevent and remedy violations of children’s rights, including by requiring clear, accessible and child-friendly terms of service. Platforms should receive guidance on how to improve the design of services, especially user reporting for children. Businesses whose activities have a significant impact on children’s rights should be encouraged to invest in researching, developing and sharing findings on new technologies, as well as in supporting the efforts of others working in this area.

  • Reporting to authorities: Where businesses obtain knowledge of the existence on their services of illegal content such as child sexual abuse material or illegal activity such as violence against children, they should take action under their terms of service, and expeditiously report this to law enforcement or other appropriate authorities.

  • Transparency: Companies should publish transparency reports regarding the scale of online child sexual exploitation and abuse on their services that comes to their knowledge, detailing the types of content and behaviour identified and the actions taken as a result. Efforts should be made to reach as much specificity as possible, disaggregating events into individual instances of abuse, analysing the prevalence of revictimisation through the sharing of identical or altered content, and indicating the context in which the events took place if relevant for ascertaining the intention of the users involved (e.g. consensual image sharing between children, or content shared in outrage).

10. Children must have access to justice.

Free, effective and child-friendly complaint mechanisms, both judicial and non-judicial, must be in place to ensure that children are able to access remedies, in a timely manner, for all violations of their full range of rights in the digital environment. There must be independent oversight mechanisms to ensure the lawful and rights-respecting implementation of measures engaging encryption.

  • User reporting: Confidential, safe and child-friendly user reporting tools should be made available to ensure that children are able to report material and behaviour on services they use, and seek action. “Trusted flagger” mechanisms should also be considered. The decision following user reporting should be made in a timely manner, and it should be based on a clear and transparent process, giving users the possibility to resort to appeal mechanisms. Transparency reports should be produced to enable the scrutiny of systemic policy and practice around user reporting, while protecting the rights of users, as well as victims and survivors.

  • Content detection accuracy: An overreliance on automated tools risks errors in the detection process and the wrongful removal of content, as well as other potential negative consequences such as the banning of users’ accounts. Automation may support but cannot replace human content moderation. Any inadvertent outcomes due to errors from automated processes must be reversible through human support.

***