An interview with Michael Veale. He is a Lecturer in Digital Rights and Regulation at the Faculty of Laws, University College London, and a Digital Charter Fellow at the Alan Turing Institute in London. He works at the intersection of law, emerging technology, public policy and society, with a current focus on machine learning and privacy-enhancing technologies. He is also on the advisory council of the Open Rights Group. He holds a PhD on the intersection of data protection, machine learning and public values from University College London, an MSc from Maastricht University and a BSc from the London School of Economics. Michael Veale is part of the research team that developed the DP-3T approach. This interview took place over the phone on 21 April.
At the beginning of April, your research team published the whitepaper for the Decentralised Privacy-Preserving Proximity Tracing protocol, DP-3T, joining the PEPP-PT initiative. A couple of weeks ago, you publicly dissociated from PEPP-PT. What prompted that decision?
Michael Veale: We became increasingly aware that PEPP-PT was not sharing information with us, and was using the privacy reputation of DP-3T, of decentralised systems, and of its researchers to push its own solution, which we find dangerous and flawed. We further suspected that it was misrepresenting ‘decentralised systems’ as risky while hiding the fact that its own systems were vulnerable to the very attacks it described DP-3T as being susceptible to.
According to you, are the European and national authorities aware of the problems inherent to a centralised approach?
Michael Veale: We are working with the various governments that request our help in understanding the differences between these systems. However, we are just a group of scientists — we cannot afford to hire expensive PR firms to push an unclear agenda.
A technological solution of some kind is increasingly seen as an effective and necessary response. What do you think?
Michael Veale: We do not make claims about how effective any Bluetooth contact tracing app could be. This is the first time something like this has been attempted in this context, at this scale, so our understanding is strictly limited by our models. We are hopeful it could be useful, but it will require individuals' absolute trust that their data cannot be misused — trust that centralised systems cannot provide.