28203 Bremen, Germany
+49 (421) 336592-55
Data Protection Risks of a Corona App: Full version of the Data Protection Impact Assessment (DPIA) now available in English
The debate about the data protection-compliant design of a corona app has intensified in recent days. The app digitally supports so-called "contact tracing", which aims to break COVID-19 infection chains by warning people who have been exposed to someone who tested positive. Initially, the German government's only goal was to introduce an app with a warning function for those potentially infected, but in the meantime further purposes beyond tracing are being discussed, which would cause additional infringements of fundamental rights. However, there are still general doubts about the effectiveness of digital contact tracing for containing the pandemic, as the discussion about false positives caused by, for example, walls, masks or varying Bluetooth signal strengths shows. The accusations that pushing such a corona app project primarily signals political actionism, or that the project might accustom the general population to future tracing or tracking projects by government bodies, have not yet been dispelled.
In the course of the current discussion about an exit strategy from the 'stay at home' orders, the use of a corona app has been considered strategically important in other countries and is now also being considered by the German government. The German Minister of Health, Jens Spahn, has recently switched his preference from a centralised architecture, which is riskier from a data protection point of view, to a decentralised model. Austria and Switzerland have already adopted the decentralised DP-3T implementation. With the publication of a DPIA, we pursue the goal of informing the discussion about the far-reaching consequences of these decisions and of contributing to making this app as data protection-friendly as possible.
Materials regarding the DPIA (Creative Commons license: Attribution, CC BY 4.0 Int.):
English version 1.6 (PDF, 97 p., 1 MB), for other languages see here: https://www.fiff.de/dsfa-corona
Discussion via the FIfF GitHub repository
One of the central questions relevant to data protection is: How is the purpose limitation of the overall system secured and enforced? How can misuse, especially by the operators, be prevented by technical, organisational, and legal means? It will be decisive for the success of a data protection-friendly corona app to restrict the purpose solely to informing potentially infected persons. In our view, adding other purposes such as epidemiological studies, an immunity pass function, or detailed quarantine monitoring poses disproportionate risks and infringements of fundamental rights and is therefore not justifiable.
The question of centralisation vs. decentralisation is of crucial importance for data protection due to the following circumstance: In a centralised architecture, an almost 'omniscient' server coordinates all procedural activities; it collects all contact events from infected users and notifies persons at risk. In a decentralised architecture, however, the server has no access to the contact events of users. It only stores non-identifying, infection-indicating data. The apps themselves detect possible infection events; the necessary calculations are performed on the devices of the respective users. If a government agency were to be given blanket access to the contact events of infected and non-infected persons, this would not only be a considerable violation of data protection, but also a collection of data that is simply not necessary for the purpose, i.e. a violation of the principle of data minimisation. "So far, the European Parliament, Germany, Austria, Ireland and Switzerland have spoken out in favour of a decentralised variant, whereas France still favours the centralised one. The FIfF would like to urgently point out the danger that a centralised system will be followed by extensive possibilities for subsequent use, which generates considerable potential for abuse," warns Kirsten Bock from the FIfF.
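The decentralised matching described above can be illustrated with a deliberately simplified sketch in the spirit of DP-3T. The key derivation, identifier lengths, and function names here are illustrative assumptions, not the actual protocol: phones broadcast short-lived identifiers derived from a local secret key, an infected user uploads only that key, and every other phone checks locally whether it has seen any identifier derivable from the published keys.

```python
import hashlib
import secrets

# Illustrative sketch only: the real DP-3T key schedule and ID format differ.

def ephemeral_ids(day_key: bytes, n: int = 4) -> list:
    """Derive n short-lived broadcast identifiers from a daily secret key."""
    return [hashlib.sha256(day_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(n)]

# Phone A generates a secret day key and broadcasts ephemeral IDs via Bluetooth.
key_a = secrets.token_bytes(32)
broadcast_a = ephemeral_ids(key_a)

# Phone B only records the ephemeral IDs it observes nearby; no server involved.
observed_by_b = set(broadcast_a)

# If A tests positive, A uploads only its day key. The server thus stores
# non-identifying, infection-indicating data and never sees contact events.
published_keys = [key_a]

# Phone B downloads the published keys, re-derives the ephemeral IDs, and
# performs the exposure check entirely on the device.
exposed = any(eph in observed_by_b
              for k in published_keys
              for eph in ephemeral_ids(k))
print(exposed)  # True: B was in proximity of an infected user
```

The data-minimisation point of the decentralised design shows up directly: the only information leaving a phone is a random key with no identity attached, and the matching computation that reveals a risk contact happens exclusively on the user's own device.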
A decentralised model is clearly preferable to a centralised one, but it is also not free of serious data protection risks. Therefore, the FIfF now presents a model data protection impact assessment (DPIA) for decentralised architectures. In doing so, we refer to a requirement under Art. 35 of the General Data Protection Regulation (GDPR), which is directed at the future controller of such data processing. The purpose of this model DPIA is to demonstrate, in a publicly accessible way, the risks for data subjects. "It needs to be underlined that the data protection risks also affect persons who do not use the app themselves," says Rainer Mühlhoff, FIfF e.V. Furthermore, with this document we present recommendations for the (re)design of the app and the processing procedure, as well as protective measures addressing a whole range of possible weaknesses and attacks.
"With this DPIA, we have set a new standard that others whose data processing creates high risks for fundamental rights and freedoms have to meet from now on," comments Rainer Rehak from FIfF. "And we are also showing that DPIAs must be published as a matter of principle so that society can discuss these risks in an informed manner and exert pressure on those responsible to protect our basic rights when processing data," adds Jörg Pohle, also from FIfF.
With this DPIA, now completely available in English, we intend to enrich the pan-European discussion on data protection. Data protection, not privacy, is the guarantor for the protection of all fundamental rights in the digital age.
The use of the information published here for one's own information and for editorial reuse is generally free of charge. Please clarify any copyright questions with the named publisher before further use. Upon publication, please send a specimen copy to firstname.lastname@example.org.