'Tracking, tracing, trust: contemplating mitigating the impact of COVID‐19 through technological interventions' by Kobi Leins, Christopher Culnane and Benjamin IP Rubinstein in (2020) 213(1) Medical Journal of Australia 6-8.e1 comments:
A false impression of technological panacea may see much needed interventions overlooked and may introduce unintended consequences and risks.
With coronavirus disease 2019 (COVID‐19) limiting free movement, experts are scrambling to mitigate the profound impact that the disease is having on our lives. For many countries, this approach involves increased testing, isolation, and education about hygiene practices until a vaccine is found. To varying degrees, and without much evidence as to their efficacy, countries are turning to technology to solve some of the current challenges. Increasingly, smartphone applications (apps) are being contemplated for tracking people's proximity to one another in order to determine possible sources of transmission, with elements of technological solutionism. Such technical solutions require trust, and without honest and clear information about the possibilities and limitations of the technologies, an app's benefits may be undermined by low adoption; conversely, a false impression of a technological panacea may see much needed interventions overlooked. For example, the Australian Government's target of 40% uptake of the COVIDSafe app may or may not be effective in helping to control the disease, whereas independent modelling from the United Kingdom supports a threshold of 60% uptake. Furthermore, such summary statistics do not clarify to the public the wide range of other factors and assumptions that must be considered in predicting the app's efficacy.
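One reason headline uptake figures can mislead: a proximity app only records a contact when both parties run it, so under an (admittedly strong) assumption of independent, uniform adoption, coverage of contact events scales with the square of uptake. The sketch below is illustrative only; the function name and the simple independence model are assumptions, not drawn from the article or the UK modelling.

```python
def contact_detection_fraction(uptake: float) -> float:
    """Fraction of contact events in which BOTH parties run the app,
    assuming independent, uniform adoption (a strong simplification:
    real adoption clusters by age, region and device ownership)."""
    return uptake * uptake

for uptake in (0.4, 0.6):
    covered = contact_detection_fraction(uptake)
    print(f"{uptake:.0%} uptake -> {covered:.0%} of contacts logged")
```

On this toy model, the Government's 40% target covers only 16% of contact events, while the UK-modelled 60% threshold covers 36%, which helps explain why a single uptake percentage says little on its own about epidemiological effect.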
Much is being written about the different technological models and whether they trace, track and comply with privacy and human rights frameworks, including whether this information can, in fact, ever be anonymised. Fully effective anonymisation is unlikely when collecting data as granular as regular interactions with others alongside age, gender and postcode demographics, as previous attempts to de‐anonymise data have demonstrated. If these data are accidentally or deliberately linked with other datasets, such as hospital birth records or the public Myki transport dataset, anonymity is virtually impossible to guarantee. Successful uptake of new technologies requires trust. When adoption is insufficient, collective benefits are not guaranteed. Civil society in the United Kingdom called for clear and comprehensive primary legislation to regulate data processing in symptom tracking and digital contact tracing applications, including strict purpose, access and time limitations.6 Such regulation may improve trust.
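The linkage risk described above can be made concrete with a toy uniqueness check: even after names are stripped, records often remain unique on quasi-identifier combinations such as age, gender and postcode, and any unique combination can be joined against an auxiliary dataset (a voter roll, a transport dataset) to re-identify the person. The records and attribute names below are invented for illustration; this is a sketch of the general technique, not the authors' analysis.

```python
from collections import Counter

# Hypothetical "anonymised" records: names removed, quasi-identifiers kept.
records = [
    {"age": 34, "gender": "F", "postcode": "3052"},
    {"age": 34, "gender": "F", "postcode": "3052"},
    {"age": 71, "gender": "M", "postcode": "3000"},
    {"age": 29, "gender": "F", "postcode": "3121"},
    {"age": 29, "gender": "M", "postcode": "3121"},
]

# Count how often each (age, gender, postcode) combination occurs.
combos = Counter((r["age"], r["gender"], r["postcode"]) for r in records)

# A record whose combination occurs exactly once is re-identifiable by
# joining on the same attributes in an auxiliary dataset.
unique = [r for r in records
          if combos[(r["age"], r["gender"], r["postcode"])] == 1]
print(f"{len(unique)} of {len(records)} records are unique "
      "on quasi-identifiers")
```

Here three of the five toy records are unique on just three attributes; adding anything as granular as a log of regular interactions only makes uniqueness, and therefore linkability, more likely.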