The concerns surfacing ahead of proposed COVID-19 tracing apps show that privacy isn’t dead: far from it. Fears of government overreach and corporate tracking, if left unaddressed, could doom the apps to failure: experts say that to be useful, the apps need to be adopted by at least 60% of the population.
Adoption rates will vary among countries depending on whether the use of tracing apps is obligatory or voluntary. In the US and EU, a greater focus on civil liberties means that people are much less likely to download an application that has any perceived risk of surveillance, regardless of whether that risk comes now or later. Trust issues will be “make or break” for COVID-19 tracing apps, and this has led to a serious discussion of just how much privacy we are willing to trade away in order to protect ourselves.
While the underlying need to make this difficult choice appears valid, it is actually a false trade-off. Yuval Noah Harari explains that “when people are given a choice between privacy and health, they will usually choose health”, but this is a binary question that nobody wants to answer. Fortunately, technical controls that can enforce the legal and ethical rights underlying privacy are now available, which allows the choice to be reframed from an “either/or” answer to “both”.
Privacy and Trust Issues
Interest in COVID-19 tracing apps began when governments realized that a vaccine would not be available quickly, and that ongoing lockdowns would harm the economy. Current proposals take either a “centralised” or a “decentralised” approach: the latter is intended to be more protective of privacy, while proponents of the former argue that it provides “more insight into Covid-19’s spread.” These two approaches have created a massive debate over privacy, data use, and trust.
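To make the distinction concrete, the decentralised design can be sketched in a few lines: each phone broadcasts short-lived random identifiers and records those it hears nearby, and exposure matching happens on the device rather than on a central server. This is a simplified illustration only (the `Phone` class and its methods are hypothetical, not any real app’s API):

```python
import secrets

# Hypothetical sketch of the "decentralised" approach: phones exchange
# rotating random identifiers; a central server only publishes the
# identifiers of confirmed cases, and matching happens on-device.

class Phone:
    def __init__(self):
        self.own_ids = []       # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers received from nearby phones

    def broadcast_id(self) -> bytes:
        rid = secrets.token_bytes(16)  # random ID, carries no identity
        self.own_ids.append(rid)
        return rid

    def hear(self, rid: bytes):
        self.heard_ids.add(rid)

    def check_exposure(self, published_case_ids) -> bool:
        # The server never learns who met whom; the phone computes
        # the match locally against the published case identifiers.
        return any(rid in self.heard_ids for rid in published_case_ids)

alice, bob = Phone(), Phone()
bob.hear(alice.broadcast_id())  # Alice and Bob were in proximity
published = alice.own_ids       # Alice tests positive and uploads her IDs
exposed = bob.check_exposure(published)  # True: Bob is notified locally
```

A centralised design would instead upload the contact graph to a health authority’s server, which is what gives authorities more epidemiological insight at the cost of concentrating sensitive data.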
The issue is that governments face a real and urgent problem: they need to roll out COVID-19 monitoring apps to manage the spread of the disease with an exit strategy in mind. But without trust, people won’t use the apps, and without widespread adoption, the apps are useless. Moving past the impasse requires recognising that everyone is framing the problem as a false, binary choice, which forces a trade-off between privacy and data use.
A New Hope: Embedded Technical Controls
As technology and law have developed, newer approaches are emerging that mitigate privacy risks in a way that enables data use. These approaches, such as GDPR-compliant Pseudonymisation and data protection by design and by default, provide superior privacy protection without degrading the accuracy of the data. The technical controls are embedded into the data and flow with it, providing protection while the data is in use. Using these kinds of dynamic technical controls together with a functional separation approach to data processing, it is possible to process information about people without knowing who those people are. This allows both data utility and protection of privacy while data is in use.
This risk-based approach provides numerous benefits to organisations, governments, businesses, and society, which means that governments can roll out COVID-19 tracing apps that ensure privacy without compromising the potential value of these tools.
When moving away from traditional models of privacy and data use, it is crucial to remember that these new technical controls exist even if not all regulators are yet aware of them. When a binary choice between data utility and privacy is pushed to the forefront of the debate, new solutions can be overlooked simply because they sit outside the traditional approaches to the issue.
All big crises provide big opportunities for significant positive change, and the next steps we need to take include the adoption of an integrated framework of data use alongside privacy protection. This kind of data use can ultimately provide more benefits to society, and we need to be ready to reap those rewards.
Author: Magali Feys, Chief Strategist - Ethical Data Use, Anonos