Surveillance and digital privacy during Covid-19
This article is part of a feature series exploring how the Covid-19 pandemic and the measures to prevent its spread are impacting the human rights of under-18s.
In the final days before lockdown was introduced in the United Kingdom, CRIN hosted a panel discussion on surveillance and facial recognition at the Tate Modern where we addressed some of the risks they pose for children's rights. Since then, the Covid-19 pandemic has forced many people to move their lives almost exclusively online, as adults began working from home and schools resorted to online learning. Such big changes, however, raise basic questions.
When online classes were introduced, were proper safeguards put in place? Does studying online ensure school children’s privacy? What are the concerns around the collection of children’s data and their surveillance? And now, post-lockdown, how are surveillance tools shaping our children’s lives, from parental use of spyware to the collection and monitoring of health data?
To find out more, CRIN spoke with three experts on the issue:
three members of the youth organisation, the European Council for Digital Good (ECDG) - Philippine (France), Elisavet (Greece) and Diego (Italy);
Melanie Penagos, working on UNICEF’s AI and Children policy project; and
Amelia Vance, the Director of Youth and Education Privacy and Senior Counsel at Future of Privacy Forum.
This interview has been edited and condensed for clarity.
Are there any specific concerns regarding the surveillance and data collection of under-18s in response to Covid-19?
Melanie: As children around the world have shifted to remote, online learning and are spending more time online while under lockdown, their digital footprint continues to grow. Not many data privacy regulations specifically relate to children, and those that do may be fragmented or open to interpretation by the parties that process this data. Children’s data requires a higher duty of care and the pandemic is underscoring the need to strengthen government data protection laws by including child-specific safeguards and enforcement mechanisms for children at different ages and stages of development. Furthermore, education technology companies must be transparent about how they use children’s data and clearly communicate their company policies in a way that students and educators can understand.
ECDG: Schools use online platforms for remote learning, but what most people are unaware of - even people working for the schools - is that in most cases these platforms collect their data. There is no way to fully eliminate this problem, because that would mean the companies would no longer make money. Schools should therefore educate their personnel and think very carefully about which platform they use, so that it is as safe as possible for their teaching community. They should also educate students and their parents to make them aware of what the platforms are doing and how their data is being used.
Our main concern is the fact that many children, teenagers and even adults do not know that they are being surveilled. How can you ask someone to give their consent if they are not even aware of what the companies are doing? The platforms’ terms and conditions make it almost impossible to find out, since the texts are very long and use complex language that is incomprehensible to the majority of users. Nor do we really have a choice in how we answer, since the question is rhetorical: if we want to use the service, we have to consent by clicking “I agree”, otherwise we cannot use the platform.
Amelia: A lot of education technologies include some level of learning analytics - data and metadata collected to track a student’s progress or to show the school or the company that the product is working. This might involve things as innocuous as crash reports, but it can also send information about what that user was doing, and those reports could be identifiable.
Then you have more proactive learning analytics: information collected to give the school, the teacher or the company itself insight into how the product is working and how the student is doing. It may show, for example, how many students are actively using a particular product and how much they have learned. Some of this is less privacy-invasive, but what you most likely see is personally identifiable learning analytics that can provide a variety of insights.
You also have parents as actors who potentially affect children’s right to privacy. [They’re] worried about the amount of time their children are spending online, and we’ve heard that there’s been a big growth in parents downloading spyware to install on their child’s device. There are lots of companies that offer this, and it’s really difficult for a lot of parents because it’s often framed as “you’re a bad parent if you’re not watching what your child does online.” Unfortunately, many of these companies have a really bad history of security and many have had breaches in the past. Not to mention, of course, the fundamental right of the child to have some level of privacy.
We’ve seen extraordinary measures introduced in response to the pandemic, such as health monitoring through contact tracing apps. Are there any specific concerns about the collection and use of under-18s’ health data during the pandemic?
Melanie: Unfortunately, not many technologies are designed with children in mind. This is particularly problematic as public health surveillance tools to track and combat Covid-19 may be quickly rolled out and potentially bypass traditional cycles of rigorous testing. Therefore, the efforts to immediately contain the virus may cause long-term privacy risks to be overlooked. To date, there have been some documented cases of children being stigmatised or placed at risk as part of broader health surveillance efforts. However, beyond anecdotal evidence, very little is known about the long-lasting impact these surveillance measures can have on children.
ECDG: We hear, for example, about apps that can tell you if any of the people you have been in close contact with are infected with Covid-19. In this pandemic, is this ethical and legal, given that the aim is to keep people safe and limit infection? Who holds this private data, and for how long? Such questions are crucial, especially for minors, and they need accurate answers.
Amelia: When you have a pandemic like this, you're dealing with a health or safety emergency, and there is an exemption in the law for that, but it's not unlimited. If disclosing the identity of a student or their parents or a staff member [who was diagnosed with coronavirus] isn’t essential, then you can't. You can only disclose as much information as you feel is necessary to protect against an active, imminent health or safety threat. For example, in the vast majority of cases, schools don't need to say Jane Smith has Covid-19; they can just tell the school that a third grader or someone on the basketball team has Covid-19 and therefore you should take appropriate precautions.
How can we make sure extraordinary measures are not used beyond the crisis?
Amelia: The absolute best thing that children, parents and others who care about this can do is push all the people who are collecting this information and adopting these technologies on what policies they have put in place to protect it. How are you protecting that information? Can you get access to information about you or your child? Can you challenge information that may be inaccurate? When will the information be deleted? Is there a legally binding promise that the information won't be used down the road? Obviously people have a million other priorities, and privacy is not at the top unless people put it there.
ECDG: International laws concerning minors’ privacy should be established worldwide, and all companies should be required to comply with them explicitly.
Melanie: International organizations and civil society have an important role to play in advocating for government and corporate transparency in decision-making on public health surveillance measures. Any new measures imposed during this time should be necessary, clearly defined, proportional, accountable and timebound. They must protect children’s rights and prioritize their interests, to ensure they are not left behind. Policymakers and technologists should also be mindful of the unique data collection risks for children and continuously evaluate their methods to track and better understand the pandemic.*
Are there any resources that you’d like to share on children, privacy, surveillance and Covid-19?
Melanie:
COVID-19 and children’s digital privacy: How do we use technology and data to combat the outbreak now, without creating a ‘new normal’ where children’s privacy is under constant threat? (UNICEF, April 2020)
Digital contact tracing and surveillance during COVID-19: General and Child-specific Ethical Issues (UNICEF, June 2020)
Schools and Covid-19 (Privacy International, April 2020)
Children’s privacy is at risk with rapid shifts to online schooling under coronavirus (The Conversation, April 2020)
ECDG:
Proton Mail is an encrypted email service based in Switzerland that doesn’t require any personal information, which makes it very safe.
Privacy Tracker series by the IAPP.
PRIVO is an industry expert in children’s online privacy and consent.
Amelia:
The 30,000 Hours has been doing a podcast interviewing people on the intersection of the pandemic and children.
Student Privacy Compass has a ton of resources.
CRIN would like to thank Melanie, Amelia and the ECDG members Philippine, Elisavet and Diego for their time and expertise.
*Edited to add on July 2nd 2020