Frictions and faultlines: The search for consensus

 

"There is a lot more common ground in the debate than perhaps some of us recognise. [...] It’s the details where it gets tricky."97

 

The encryption debate was once described as “thermonuclear”, with “emotions running high on either side”.98 To move beyond the divides that currently exist with regard to encryption, it is necessary to understand the frictions, fractures and faultlines that exist in this space as well as where there is room for consensus.

This chapter explores the themes that emerged throughout the interviews, private conversations and literature review that form the backbone of this report, with a view to better understanding the perspectives and thinking regarding children’s rights and encryption and identifying a way forward.


The pressing need to address online child sexual abuse and exploitation

The discussion about the proper regulation of encryption and the challenges of preventing and identifying online child sexual abuse and exploitation have become inextricably linked. Particularly in the European and North American context, this issue occupies a central point in legislative and regulatory reform processes. It is also in this context that many of the most explicit tensions emerge. Yet despite these tensions, across the full range of interviews, conversations and literature reviewed as part of this research, there was no dispute that online child sexual abuse and exploitation requires urgent action to protect children and secure the accountability of abusers. Where disagreement was evident, it related to the different perspectives on how to achieve this goal and how to protect human rights more broadly in doing so.


“It’s not a question of: should we protect children or not? We completely agree on the need for protection.”99

“We all want to protect children. [...] The point is that the means of doing so can be different.”100

Despite this fundamental basis of agreement, many interviewees who reflected on the public debate about encryption and online child sexual exploitation and abuse described an environment that has become hostile and emotive in a way that has held back reform. Some participants recalled that during conversations on the risks of encryption – particularly in the context of child abuse – they witnessed a tendency to move away from criticism of arguments towards more personal denunciations of what were perceived to be callous, immoral positions. As one interviewee pointed out, “rarely do we get the chance to have a nuanced informed debate around this because it’s just so emotional”.101 Others identified instances of “scaremongering” and “rhetoric” in the debate that is inflammatory, sometimes even toxic.

This tension risks preventing the engagement across different areas of expertise that will be necessary to meaningfully address online child sexual abuse and exploitation. Yet despite this challenge, interviewees commonly felt that the conversation was now shifting to make progress possible. In the words of one interviewee: “It does feel like ground has been conceded on both sides. I feel kind of quietly optimistic about getting to a place where there’s more understanding on both sides.”102

 

A note on the scale of online child sexual abuse and exploitation


“Numbers just look very flat when there’s a much more robust story behind them.”103
“What’s an acceptable number of children being sexually abused? I just don’t think that’s ever a question we should be asking ourselves.”104

In 2021, NCMEC received 29.3 million reports of suspected child sexual exploitation, 35 per cent more than in 2020. The reports provided by electronic service providers included 39.9 million images, of which 16.9 million images were unique, and 44.8 million videos, of which 5.1 million were unique.105

Given the prominence of the principles of necessity and proportionality in the debate on encryption and children’s rights, there is a tendency to reach for numbers in order to advocate for particular solutions.

The data has been seen as “vital to enabling nations to understand the extent” of the problem of child sexual abuse material online and to making the case for “increased government investment” in tackling it.106 The data has also been used in arguing about the effectiveness and necessity of automated detection tools, and in warning about the consequences of turning them off.107 For example, in the period after the EU ePrivacy Directive went into effect, which limited industry’s ability to detect, report and remove child sexual abuse material, and before the temporary derogation was adopted,108 NCMEC saw a 58 per cent decrease in reports of EU-related child sexual exploitation.109

However, numbers are not as helpful in moving the debate forward as they may seem. Some fear that at times “people might glaze over numbers that feel just too large to think about”.110 In any case, the current numbers are far from accurate depictions of the problem. As one interviewee from NCMEC explained, underreporting is felt to be a significant issue, because platforms fear the reputational risks of making a large number of reports: “there are many companies out there that are maybe in everyone’s pockets or everyone’s purse right now, and they have very few reports, and we just know that this is not a reflection of what is happening on their services”.111 The emerging trend of “sextortion”, a “combination of white collar crime and child sexual exploitation”,112 also complicates the picture, because the digital payment platforms over which the exchange between the abuser and the child takes place do not report financial activity as sexual abuse.

On the other hand, the data is sometimes felt not to be a true reflection of the nature and extent of the problem because of the number of duplicate pieces of content in circulation. For instance, a study carried out by Meta on content reported to NCMEC in October and November 2020 revealed that “90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period”, indicating that “the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly”.113

To these points survivors and child protection advocates responded forcefully: “People are assuming that that’s a good thing, because fewer images are being shared more times. But that doesn’t offer me any comfort. So if my image is shared one time, that’s horrific. If my image is being shared 1,000 times or 10,000 times… am I supposed to feel better because it’s the same image?”114 and “There’s something kind of disingenuous in saying it’s repetitive. It’s not. It’s a new crime every single time, with a new perpetrator and a new victimisation. It’s like the humanity is lost in this conversation, right?”115

If numbers must be sought, perhaps what matters more than the total number of reports is the number of “meaningful reports”,116 which provide information that could potentially save a child. But, according to the interview with NCMEC, too many companies instead provide “barely a shell of a report”117 in what seems like a box-ticking exercise.

Even where there is some agreement on the importance of numbers and what they mean, it is not clear to what extent reports lead to actually solving crimes against children. Regarding the UK specifically, the National Crime Agency “received 102,842 reports from NCMEC, but some of these were incomplete or, once investigated, not found to be child abuse. Of these, 20,038 [reports] were referred to local police forces and started (or contributed to) investigations. In the same year, over 6,500 individuals were arrested or made voluntary attendances due to offences related to child abuse and over 8,700 children were safeguarded.”118 But the response from national law enforcement “varies widely as a consequence of capacity and resource constraints”. It is far from clear “how many investigations and arrests directly derive from NCMEC reports at the global level, or how many fewer would have been made with end-to-end encryption implemented”.119

 

Privacy and protection


“Really the challenge that we have here is: how do we safeguard children, whilst protecting privacy and other fundamental rights?”120

“We need to have a balanced conversation about all of the rights that takes into account safety as well as privacy.”121

A second, central point around which debate has formed has been the characterisation of online regulation as a matter of “privacy versus protection”, or sometimes more bluntly “protection of children versus privacy of adults”. This divide, often seen as being at the core of disputes with regard to encryption, did sometimes appear in advocacy messaging reviewed during the research for this report, but among the interviewees and in more in-depth written analyses, issues related to privacy and encryption were rarely treated in those terms.

As might be expected, the defence of the value of privacy in the regulation of encryption was strongly made among organisations and experts whose work focuses on privacy, as well as those working most directly with technology. As one interviewee put it, “We fight to protect privacy because we know that it’s a really important right and in many ways a gatekeeper to other rights. [...] Under surveillance, people are suppressed and their rights are limited [...] Privacy is a fundamental underpinning of how states work and violations to privacy are very, very concrete and can lead to huge harms. [...] We know from history how dangerous it is when our privacy is intruded on by the state - having the right to privacy is about redressing that balance of power.”122

This perspective, however, was not limited to organisations focused exclusively on privacy rights. Some children’s rights advocates stressed that, both in the debate around encryption and more generally, privacy is often wrongly seen as the preserve of adults. They viewed this as a symptom of a general failing in the way that children’s rights are discussed, which tends to regard children as “objects of protection instead of fully formed subjects of rights”.123

The same interviewee emphasised that children have the right to privacy, but also added that there needs to be a better understanding of how privacy impacts their development.124 One interviewee warned that “if privacy is violated, especially during childhood, when key aspects of psychological and emotional life are being developed, this can jeopardise the formation of social and political individuals.”125 Another echoed this concern, stating that “children who are being surveilled feel that they cannot actually express themselves freely, in an independent way. And this might actually affect their development and the way they put their personalities out there in the world.”126


"[Privacy] allows children to safely develop their personality, to find out who they are.”127

Many interviewees emphasised that privacy enables the exercise of other rights, including protection from violence, strongly supporting the idea that privacy has a protection element to it. This protective element was particularly stressed in so far as it relates to children from disadvantaged and marginalised groups. Interviewees pointed to the link made between privacy and safety in General Comment No. 25 of the UN Committee on the Rights of the Child, which states that “privacy is vital to children’s safety”.128

Digital privacy advocates also suggested that the “privacy versus protection” polarisation might be partly due to “a perception that privacy is somehow abstract and hypothetical, getting in the way of the concrete right to protection of children”.129 They were concerned by the suggestion that those who have nothing to hide should not fear the weakening of encryption. To this perception they forcefully responded that a preference for encrypting communications to keep them private does not in and of itself indicate any harmful activity. Overall, they strongly emphasised the importance of privacy as a fundamental right which is not inferior or secondary to protection.

Among the most contentious applications of children’s right to privacy that emerged from this research was the privacy of survivors of child sexual abuse. A number of child protection advocates argued that there tends to be a one-sided view of privacy in the debate around encryption, one that treats encryption as wholly positive in terms of promoting privacy. They felt that too little attention is paid to the way encryption threatens the privacy of those who have been sexually abused: “What about the rights of victims whose images are being spread using encrypted channels? What about survivors who know that their images have been repeatedly shared?”130


"I don’t know who’s seen my images, I don’t know who will ever see them. I don’t want anybody to see them.”131

An interviewee with lived experience of online child sexual exploitation and abuse emphasised that “in order to protect children’s privacy, we need to be able to identify and remove images [of online abuse]”, adding that they would want all those involved, from technology companies to law enforcement, to be doing “every single thing they possibly could to get those images down before anybody saw them.”132

Across the full spectrum of interviewees who took part in this research, the focus and emphasis of the privacy issues that they addressed varied significantly, but there was a shared recognition that children’s privacy matters and is a legitimate concern in regulating encryption.

 

Understanding children’s perspective on privacy133

Researchers working with children have warned that “it is vital not to confuse interpersonal with institutional and commercial contexts for privacy, for these contexts differ hugely in who or what one might seek privacy from.” They have pointed out that in the common discourse around privacy and children, for example when children are seen to lack a sense of privacy because they share personal information freely with others, or when parents are concerned about grooming, “the focus is children’s interpersonal privacy and its safety implications”. Children then tend to “(over)extend what they know of interpersonal relations to the operation of platforms. For example, they might talk trustingly of Instagram because so-and-so’s father works in technology, and he would surely play fair. They assume ethical reciprocity: if they would never track someone without their knowledge or keep images against someone’s will, why would a company?” Children also seem to assume that the way in which they keep their information private from other people (pseudonyms, ghost mode, incognito search, clearing one’s history) also keeps it private from companies.

When 11- to 16-year-olds in the UK were encouraged in workshops to think beyond e-safety to how data is processed by schools, doctors, search engines and social media platforms, their attitude changed. “Their confident expressions of agency and expertise would falter, and they would say, outraged: it’s creepy, platforms shouldn’t be poking around in the online contacts, I want to control who they share my data with and, most tellingly, it’s none of their business!”

 

Encryption and the voices of survivors of child sexual abuse


“It’s very easy for people to make assumptions about what we would like or what we would say.”134

In a 2021 global survey about sexual violence, 54 per cent of respondents said that they had experienced online sexual harms as children.135 But survivors are not a uniform group; they are diverse, have varied experiences and varied views about all issues, including encryption and online child sexual abuse.

Among the survivors engaged throughout this research, some felt that there is already a significant emphasis on privacy - though not necessarily the privacy of victims and survivors - but that not enough attention is being paid to online safety. There was a recognition that there is a consensus about the horrific nature of online child sexual abuse and exploitation and a desire to address it, and that victims’ and survivors’ rights must be upheld in achieving this. However, a concern that emerged during interviews was that the severity and urgency of online abuse can be downplayed in how the issue is discussed. As one interviewee explained: “A lot of people do not fully understand the nature of what we’re dealing with here, the sophistication of the offenders [...] And most of them have never had to deal with victims or survivors.”136

Several participants identified conspicuous examples of victim-blaming, particularly in public-facing discussions. For instance, one interviewee expressed their dismay at how a radio magazine programme presented the issue as “perpetrators grooming children online and coercing them into sexually abusing themselves”.137 “Now just think of that language, just think of it. You’re talking about children’s rights. Their right to be safeguarded is key. And it is actually our duty as adults to safeguard children. Children do not go around sexually abusing themselves. So where do we start with children’s rights? We need to start with language.”138

Many interviewees argued that victims and survivors’ voices should be heard more in the debate. They emphasised that even though there are sizeable organisations that argue for the benefit of those who are or have been abused, very rarely are they actually led by people with lived experience. “We need to find a way to include the voices of victims and survivors. Pure, not diluted or interpreted. [...] I’ve seen far too many professionals going around calling themselves ‘survivor consultants’ or ‘safeguarding consultants with expertise in working with victims and survivors’ being consulted by [tech companies] [...] That’s second-hand, it’s that person’s view on it, through their filtering, with their bias. It is not the true, pure voice of the victim and survivor.”139

One survivor explained, “My voice has been heard in this debate, because I’ve chosen to speak out. But I don’t hear, I don’t see very many other people with lived experience having the opportunity to do that at all [...] There has been some engagement [with tech companies], but it was initiated by me. It’s still, if I can say, fairly defensive on the tech side. It’s not collaborative in any way with victims and survivors, which is quite disappointing because it’s such a big issue for us.”140

This consensus on the need for meaningful inclusion of survivors in reform processes was clear and unambiguous, but it was not a simple expression of support for any specific outcome. Some survivors of child sexual abuse emphasise the need for stronger technological development to address online child sexual abuse material. As one interviewee explained, “For me, the ultimate goal would be for content to be pre-screened prior to upload or sharing. And then it isn’t on the platform, it doesn’t see the light of day.”141 Other people with lived experience, by contrast, are staunch privacy advocates who find it offensive that abuse survivors are being used, as they see it, to “further a political surveillance agenda”. They worry that current proposals to protect children online leave the door open for abuse of power and would push harmful activities underground, making them more difficult to detect.142


The role, possibilities and limitations of technology


“This is a technology and society debate that we haven’t really been having so far [...] Technology’s kind of happened, the internet’s kind of happened [...] and you get to a crisis point where we don’t know how to have that debate, and that’s where the polarisation comes.”143

“It is a myth that if you just make the law, then the technologists will figure it out.”144

The role of and potential for technology in tackling online child sexual abuse cuts across the debate on how to regulate the digital space in a way that respects human rights, and it held a prominent place in the interviews conducted as part of the research for this report.

There was consensus across the spectrum of interviewees on the central role of technology in addressing this issue. Interviewees approaching the issue from a child protection perspective recognised that child sexual abuse is a complex problem with many causes, but stressed that technology plays a key role in the problem. Technology directly facilitates abuse, enabling the spread of child sexual abuse material at a vastly greater scale than was possible before. More indirectly, it can also contribute to a culture of normalisation of abuse and the sexualisation of children. Building on this perspective, these interviewees argued that this strong technological aspect must be addressed and technical solutions developed.145

Several interviewees thought the focus on technology flows naturally, at least in part, as a consequence of privacy advocates’ efforts to find the least intrusive option in terms of interfering with the right to privacy. Suggesting a way forward, one interviewee analysed the problem in this way: “I can’t think of anything more necessary than protecting a child from sexual exploitation and abuse. So let’s have this debate. Let’s look for those technologies that can make legitimate inroads into privacy, but don’t impair the essence of this right.”146

When looking for these solutions, some suggested that companies, particularly those that are very large and politically influential, could research new technologies, consult with governments about what these technologies would look like in practice, and perhaps even get to a point where they can test some of the empirical claims being made.147

There was also a note of caution, however, from some organisations working on the issue from a children’s rights perspective that technology cannot be a silver bullet, but that, given the pace of change in the digital world, some technological solutions are needed: “We need to ensure that we have the tools at our disposal that are as good and as modern as the environment that children are inserted into”.148

The caution about overstating the potential role of technology found its strongest expression among those who warned against “techno-solutionism”. They warned of the limitations of the ability of technology to address such a complex problem as online child sexual abuse while upholding fundamental rights.

A significant concern that emerged from interviews was that a focus on technology - and specific technologies in the context of encryption - as the solution risked obscuring the nature of the wider issue.149 One interviewee framed encryption as a tertiary part of the discussion. They identified the primary level as being about a detailed and comprehensive understanding of the problem of sexual abuse of children online and defining the outcomes that should be achieved in addressing abuse. At a secondary level, they saw a variety of solutions, some of a technological nature and some that were not, that could address aspects of the problem. They considered that encryption, particularly end-to-end encryption, comes into play at a third level of balancing the possible solutions and deciding on those that are most effective. Another interviewee similarly argued, “We need to reframe the debate: what are we actually trying to achieve? Different policy options can be used to try to achieve different outcomes [...] Identifying images is only a means to an end. A higher number of images is not really a key metric in determining success or failure.”150

A connected theme that emerged was a challenge to the idea that technology can be a quick fix. There were concerns that this idea can lead to broad claims, without the necessary supporting evidence, regarding what various technical proposals can achieve in terms of accuracy and security, and the extent to which they are rights-compliant. One participant stated, “There is an overfocus on encryption in the sense of ‘we can’t do much against abuse because of encryption’ […] and an overbelief in what technology is even able to achieve.”151 Another concluded, “It’s a myth that if you just make the law, then the technologists will figure it out.”152

This expression of the limits of what technology is capable of achieving to address online child sexual abuse and exploitation was most clearly stated in examining particular technological proposals:


“The technology on prevention doesn’t exist yet. When you look at things like grooming, for example, the notion of trying to predict what language somebody might use… if we can’t do that in real life, which we can’t - we can’t unfortunately predict what language somebody with that intent would use - then technology can’t do that either because the data and input obviously has to come from the real world. So I would definitely say that on prevention, it is particularly dubious to turn to technology for a solution.”153

Among those interviewees who were critical or cautious about the possibilities of technology to address online child sexual abuse and exploitation, substantial debate emerged about the role that specific technologies could play and who can have a legitimate role in employing these technologies.


The role of law enforcement

One interviewee who approached this issue from the perspective of the right to privacy explored the potential for the use of existing technology and existing powers that would not require further legislation or regulation in many jurisdictions:


“[T]here are many technologically-based investigative techniques right now that law enforcement have access to that do not require the breaking of encryption. So, for instance, if they have a particular suspect, they can get a warrant to seize their device and then look at what’s on the device itself. Or they can get a warrant to look at the metadata of particular communications.”154

This approach, which relies on the use of existing law enforcement powers by law enforcement authorities, was a common theme across several interviews. A challenge that was posed in the context of online child sexual abuse and exploitation, however, was addressing the scale of the abuse. Looking at the ubiquity of the Internet and the proliferation of illegal material, some interviewees explained that “you realise that you can’t moderate your way out of that with just people checking”, and that some intensive intervention in the form of automation is necessary.155

For some, this challenge is unavoidable if the matter is treated as within the remit of law enforcement. Others questioned this framing, in particular arguing that the narrative of “stranger danger” is not supported by evidence.156 They further argued that if child sexual abuse is more often than not perpetrated not by strangers, but by family members and others known to the child, for example teachers and religious figures, then proposals for protecting children might need to focus more on the role of law enforcement in identifying these perpetrators, and less on the use of automation to detect child sexual abuse material in all private communications. Others, sometimes acknowledging the dangers of relying too heavily on the “stranger danger” narrative, sounded a note of caution about the capacity of law enforcement to fulfil their role in general.

One interviewee from the UK, who has been working in this space for almost 25 years, said: “The police service has deteriorated in the last 10 years [...] The police were getting a lot better, but unfortunately that has gone back and that, I think, is mainly a result of lack of funding and very experienced officers being laid off because they’re more expensive [...] But experience is hugely valuable, it’s about mentoring new officers etc.”157 The interviewee also stressed the importance of investing in providing a standard level of training to law enforcement: “Some teams I’ve dealt with have been absolutely abysmal. Some have been absolutely fantastic. So it’s a bit of a lottery.”158

Training has been flagged as particularly important where police officers need to speak directly to children who are potential victims of abuse. Another UK-based interviewee described the lack of sensitivity and of trauma-informed interviewing techniques: “[T]hey are called to the school because the child has got an image on their phone. How do they have that conversation? They don’t know. And it’s not because they don’t want to know, it’s because we’re cramming their training in such a short period of time.”159 A participant warned that, if funds are lacking for basics like officer training, the police cannot be expected to apply more innovative investigation methods, for example going undercover in video games and using the in-game microphone and chat in order to speak to children in confidence and identify instances of abuse.160

This assessment of a deterioration in law enforcement’s capacity to address child sexual abuse and exploitation was also met with a wariness of overly empowering law enforcement entities:


“There is an overarching trend across the European region and globally for a creep of power for law enforcement and a dilution of checks and balances on that power.”161

This concern was particularly evident in discussions about marginalised children, who are more likely to have negative experiences of policing, including racism. Some participants warned that technology-enabled police surveillance of disadvantaged communities would worsen injustice and would contribute to a climate of impunity.162 One interviewee perceived a lack of consistency at the European level in discussions about law enforcement and artificial intelligence, on the one hand, and technologies for detection of child sexual abuse, on the other. “I would say there is very strong agreement at the moment [regarding the EU AI Act proposal] that law enforcement deploying AI is high-risk and needs to be heavily regulated. So it’s extraordinary that in the [EU CSA Regulation proposal] we then have law enforcement using different degrees of AI with the most vulnerable children”.163


The wider ecosystem

The limitations of detection technologies and the practical restrictions on what law enforcement can achieve even with these technologies led a number of interviewees to call for a systems approach to the problem of online child sexual abuse. As one participant explained, “The more we learn about it, the more we realise that it requires lots of different interventions. [...] There is not one magic thing. You should be doing everything.”164

An opportunity for consensus emerged from interviewees when the value and merits of any particular technological application were put into context.


“System design and the design of the services also play a huge part. And, in fact, many of these services could make relatively small adjustments - whether adults can contact children directly, whether they are able to befriend or follow a child - you know, some of these kinds of designs as a way of preventing grooming pathways.”165

This recognition that no individual application of technology will prevent and secure redress for online child sexual abuse and exploitation, but that many small adjustments in conjunction can be effective, sets out a space where the potential for consensus could be explored.

Beyond prevention by design and the interconnectedness of the online space, many participants emphasised the need to pay more attention to the various actors in the wider ecosystem. Some suggested that the excessively narrow focus on finding a technological silver bullet is a product of politics: it is more convenient to put forward proposals to tackle abuse without seeming to violate human rights than to recognise that there are still many unanswered questions and that long-term effective solutions to what is ultimately a societal problem are difficult to achieve.

Therefore a number of interviewees called for some honest conversations about the need for state as well as business investment at various levels. They identified schools and the health sector as vital actors, and suggested that there should be an increased focus on digital literacy, particularly among young people, to help them better understand the risks of generating material of themselves and sharing it with people they know; on awareness raising among parents about how technology might be used by their children; and on better equipping doctors and other health professionals to identify the physical and psychological signs of abuse.


"Social workers, teachers, we’re all letting victims and survivors down. And that’s not because we don’t have the will [to fight against abuse], it’s because we don’t have the resources to.”165

"I didn’t get therapy for nine years after my experience [of abuse]. That’s never ok. There need to be resources put into recovery as well.”167

Social services in particular were the focus of some animated discussions in interviews. A clear need for investment was identified by many. As one interviewee who worked as a social worker in the UK public sector before transitioning to the charity sector explained, “I couldn’t make a difference in our political climate. Experienced social workers were leaving left, right and centre. Good ones move on. [...] Whenever you have austerity, the first thing that goes is training. The second is staff morale.”168 A particularly important area which deserves considerably more attention, as pointed out by an interviewee with lived experience, is recovery services.169

All these investments, it was said in some interviews, should be complemented by deeper research into what drives the behaviour of abusers, starting, for example, with a real questioning of the phenomenon of the sexualisation of children, which - as some survivors have emphasised - has been hugely profitable for the advertising, fashion and entertainment industries.170 Some also suggested that there should be interventions with known offenders while they are incarcerated or on probation, and a much stronger focus on rehabilitation in the criminal justice system.


Beyond self-regulation

The role of online platforms - particularly, but not exclusively, large technology companies - has been part of the debate about online regulation for decades. Diverse views emerged from the interviewees who took part in the research for this report about the best way of achieving effective online regulation, but on first principles there was a great deal of consensus.

There was broad agreement that under international human rights standards, States have a duty to respect, protect and fulfil children’s rights, which applies in the context of business activities.

There was also a general consensus that the impact that platforms have on society is so significant that the era of self-regulation is over. As one interviewee argued, “There does need to be a degree of oversight, and democratic oversight is preferable in many cases.”171

There was also agreement that there is a lack of uniformity and transparency in the way that platforms tackle child sexual abuse material in the absence of regulation. Interviewees saw a discrepancy that should be addressed and identified the need for clear guidance telling companies what is expected of them and how they are supposed to achieve it. There were strong arguments in favour of consistency and accountability. Advocates whose work focused specifically on the Internet saw the values of openness and trust as essential for the Internet to flourish, and stressed that technology must be trustworthy and secure for this to be achieved.172

Beyond this broad basis of agreement, divergence began to enter the frame around the precise role and functioning of regulation, including where to place the burden for action. A common concern among most privacy- and technology-focused actors was that if too great an emphasis is placed on the responsibilities of businesses to detect criminal activity, particularly related to child sexual abuse, there is a risk of privatising law enforcement functions. They were concerned about the shirking of responsibility on the part of democratically elected governments and the passing of the buck to politically unaccountable platforms. They warned that this would lead to a dependence on monopolistic tools built by private actors, to the detriment of traditional methods of investigation and prosecution.173

A related concern that emerged from interviews was that where platforms are overly empowered, other services, such as social services and education actors, can be disempowered. An overfocus on platforms would lead to a narrow concern with technological solutions and a corresponding failure to take fully into account the roles that other services play, their needs and how they interact in the wider ecosystem.

By contrast, interviewees who emphasised the focus on technology as natural tended to highlight that, ultimately, the tools that private companies build benefit law enforcement, as they are being used to report child sexual exploitation and abuse to authorities.174


Beyond Europe and North America

For laws to be effective, they must be well tailored to national contexts and regulatory structures. The same law transplanted from one jurisdiction to another can have a significantly different impact and implementation. As one interviewee expressed the challenge: “[t]here is the danger of replicating legislation from one jurisdiction in another. It’s always important to have these widespread, public, transparent consultations, in order to develop legislation that’s tailored to each jurisdiction.”175

One participant highlighted that those working outside Europe and North America face a particular set of challenges in dealing with the issue of platform regulation.176 Since the big technology companies are mostly based in the US and Europe, most of their approaches and resources are directed towards these geographical areas.177 Research has highlighted the phenomenon of “design discrimination”, whereby some children are afforded less privacy and less protection on a platform than other children on the same platform, depending on where in the world they live.178 Consequently, there needs to be substantially more engagement between platforms and countries outside Europe and North America, which account for a high proportion of the user base, in order to take diverse contexts and specificities into account. For example, the technological solutions that platforms might adopt in order to protect children’s rights online need to be compatible with the wide variety of devices that children use across the world, crucially including low-end devices. Another concern is around children’s access to the Internet. An important example here is the practice of zero-rating, particularly in developing markets: offering packages that provide cost-free access to particular applications and services. The platforms that children have free access to will in practice control the flow of information. Whether these platforms are encrypted or not will have a disproportionate impact on children if they are not able to access alternatives.

The interviewee argued that there is a particular tension at play. On the one hand, countries outside the Anglo- and Euro-centric spaces need to make more efforts to put in place regulation to hold platforms to account. Otherwise, there is a real danger that, in jurisdictions where regulation is less advanced, platforms will not extend the same protections that they are extending to children from countries “closer to the decision centre”. At the same time, regulation is a difficult and slow process, so realistically, in some jurisdictions it will constantly lag behind platforms’ initiatives. In this case, platforms must still be pressured to take proactive steps in protecting children’s rights in the digital environment. This could be achieved, for example, by making creative use of legislation that is not specifically about encryption, like child protection or consumer protection laws.

***

 

Footnotes

97 CRIN and ddm interview with WeProtect Global Alliance, 19 August 2022.


98 POLITICO, Europe’s thermonuclear debate on privacy and child sexual abuse, 20 November 2020, see here.


99 CRIN and ddm interview with Electronic Frontier Norway, 15 September 2022.


100 CRIN and ddm interview with ISOC, 30 August 2022.


101 CRIN and ddm interview with 5Rights, 5 September 2022.


102 Ibid.


103 CRIN and ddm interview with NCMEC, 3 November 2022.


104 CRIN and ddm interview with IWF, 3 November 2022.


105 NCMEC, CyberTipline 2021 Report, see here.


106 Kardefelt-Winther, D. et al., Encryption, Privacy and Children’s Right to Protection from Harm, 2020, UNICEF Office of Research – Innocenti Working Paper 2020-14, p. 9, see here.


107 Dan Sexton (IWF), Not all Encryption is the same: social media is not ready for End-to-End Encryption, 14 March 2022, see here.


108 See the chapter on recent legislative proposals.


109 NCMEC, Battle won but not the war in the global fight for child safety, 11 May 2022, see here.


110 CRIN and ddm interview with NCMEC, 3 November 2022.


111 Ibid.


112 Ibid.


113 Meta, Preventing Child Exploitation on Our Apps, 23 February 2021, see here.


114 CRIN and ddm interview with the Marie Collins Foundation, 22 November 2022.


115 CRIN and ddm interview with NCMEC, 3 November 2022.


116 Ibid.


117 Ibid.


118 Levy, I. and Robinson, C., Thoughts on child safety on commodity platforms, 2022, p. 3.


119 Kardefelt-Winther, D. et al., Encryption, Privacy and Children’s Right to Protection from Harm, 2020, UNICEF Office of Research – Innocenti Working Paper 2020-14, p. 9.


120 CRIN and ddm interview with EDRi, 9 August 2022.


121 CRIN and ddm interview with IWF, 3 November 2022.


122 CRIN and ddm interview with EDRi, 9 August 2022.


123 CRIN and ddm interview with the Alana Institute, 22 September 2022.


124 See, for example, the idea that Art. 8 of the European Convention on Human Rights protects the right to personal development, whether in terms of personality or of personal autonomy: European Court of Human Rights (Registry), Guide on Article 8 of the European Convention on Human Rights Right to respect for private and family life, home and correspondence, updated on 31 August 2022, p. 25, see here.


125 Answer provided by a researcher at the Alexander von Humboldt Institute for Internet and Society to CRIN and ddm’s questionnaire.


126 CRIN and ddm interview with the Alana Institute, 22 September 2022.


127 Answer provided by Bits of Freedom to CRIN and ddm’s questionnaire.


128 UN Committee on the Rights of the Child, General comment No. 25 (2021) on children’s rights in relation to the digital environment, CRC/C/GC/25, 2 March 2021, see here.


129 CRIN and ddm interview with EDRi, 9 August 2022.


130 CRIN and ddm interview with ECPAT, 23 August 2022.


131 CRIN and ddm interview with the Marie Collins Foundation, 22 November 2022.


132 Ibid.


133 This text is based on findings and quotes from: Prof Sonia Livingstone OBE, “It’s None of Their Business!” Children’s Understanding of Privacy in the Platform Society, 2020, see here.


134 CRIN and ddm interview with the Marie Collins Foundation, 22 November 2022.


135 WeProtect Global Alliance, Global Threat Assessment 2021, p. 6, see here.


136 CRIN and ddm interview with WeProtect Global Alliance, 19 August 2022.


137 The programme was Woman’s Hour on BBC Radio 4, 18 November 2022, see here.


138 CRIN and ddm interview with the Marie Collins Foundation, 22 November 2022.


139 Ibid.


140 Ibid.


141 Ibid.


142 Alexander Hanff, Why I don’t support privacy invasive measures to tackle child abuse, 11 November 2020, see here.


143 CRIN and ddm interview with ECPAT, 23 August 2022.


144 CRIN and ddm interview with Privacy International, 26 September 2022.


145 These points were made mainly in CRIN and ddm’s interview with WeProtect Global Alliance, 19 August 2022.


146 CRIN and ddm interview with a civil society representative, 12 August 2022.


147 Some of these points were made in CRIN and ddm’s interview with Ian Brown, 6 October 2022.


148 CRIN and ddm interview with the Alana Institute, 22 September 2022.


149 CRIN and ddm conversation with a civil society representative, 1 June 2022.


150 CRIN and ddm interview with 5Rights, 5 September 2022.


151 CRIN and ddm interview with EDRi, 9 August 2022.


152 CRIN and ddm interview with Privacy International, 26 September 2022.


153 CRIN and ddm interview with the Centre for Democracy and Technology (Europe Office), 13 October 2022.


154 CRIN and ddm interview with Privacy International, 26 September 2022.


155 CRIN and ddm interview with IWF, 3 November 2022.


156 WeProtect Global Alliance, Global Threat Assessment 2021, p. 6.


157 CRIN and ddm interview with One in Four, 14 November 2022.


158 Ibid.


159 CRIN and ddm interview with the Marie Collins Foundation, 22 November 2022.


160 One example given was the Undercover Avatar project by the youth protection association L’Enfant Bleu: see here.


161 CRIN and ddm interview with the Centre for Democracy and Technology (Europe Office), 13 October 2022.


162 For a discussion of the risks posed by data-driven approaches to policing, see: BBC, Civil liberties group says data not silver bullet to reduce crime, 24 November 2022, see here.


163 CRIN and ddm interview with the Centre for Democracy and Technology (Europe Office), 13 October 2022.


164 CRIN and ddm interview with IWF, 3 November 2022.


165 CRIN and ddm interview with 5Rights, 5 September 2022.


166 CRIN and ddm interview with the Marie Collins Foundation, 22 November 2022.


167 Ibid.


168 Ibid.


169 Ibid.


170 Alexander Hanff, Why I don’t support privacy invasive measures to tackle child abuse, 11 November 2020.


171 CRIN and ddm interview with Richard Wingfield, 6 September 2022.


172 These points were most clearly made in CRIN and ddm’s interview with ISOC, 30 August 2022.


173 CRIN and ddm conversation with a civil society representative, 1 June 2022.


174 WeProtect Global Alliance and ECPAT International, Technology, privacy and rights: keeping children safe from sexual exploitation and abuse online - Expert Roundtable Outcomes Briefing, 8 April 2021, see here.


175 CRIN and ddm interview with a civil society representative, 12 August 2022.


176 These points were made in CRIN and ddm’s interview with the Alana Institute, 22 September 2022.


177 One example given was Facebook’s language gap in content moderation: WIRED, Facebook Is Everywhere; Its Moderation Is Nowhere Close, 25 October 2021, see here.


178 Fairplay, Global platforms, partial protections: Design discriminations on social media platforms, July 2022, see here.