Libra project, a scientist’s opinion
Interview with Maurizio Mensi, professor of Law & Economics at the National School of Administration (SNA), professor of Information & Communications Law at the LUISS Guido Carli University of Rome, and co-head of @LawLab, a research centre on Cyber & Telecommunications-Media-Technology Law.
How could the new cryptocurrency Libra affect European privacy regulations?
Maurizio Mensi: While Facebook and Calibra, its crypto-wallet entity (a subsidiary regulated by FinCEN, the US Financial Crimes Enforcement Network), have made broad public statements about privacy, they have failed to address specifically the information-handling practices that will be in place to secure and protect personal information. In fact, the Libra Association does not cite any data protection mechanisms, although it specifies that Calibra was created to ensure the separation of social and financial data.
It has to be pointed out that although financial data is not considered sensitive data under either the General Data Protection Regulation (GDPR) or the modernised Convention 108 of the Council of Europe, its processing does require particular safeguards: digital transactions can be traced back to purchase data and enable customer profiling, since they reveal how money is spent and how decisions are made. It is therefore necessary to ensure accuracy and data security, data protection by design and by default, and specific legal tools to counter the risks of insider dealing and market manipulation.
First of all, the measures to protect the personal information of network users have to be clear, including how profiling and algorithms are used, as highlighted in a joint statement of 5 August 2019 issued by several data protection authorities, including the European Data Protection Supervisor (EDPS).
The Libra network has to provide end users with clear information on how their data will be used by project participants, ensuring the lawfulness of the processing and privacy-protective default settings. Facebook should use only the minimum amount of personal data required for the service and verify the lawfulness of its processing through data protection impact assessments. Simple procedures have to be in place so that participants can exercise their privacy rights, and privacy-by-design principles have to be incorporated into the development of the infrastructure. And, given the transnational nature of its activities, the Libra network would need to demonstrate that its data privacy policies, standards and controls apply consistently across its operations in all jurisdictions.
What societal impacts could result if the global financial system is dominated by a few players?
Maurizio Mensi: Despite its multiple benefits, the digital age poses challenges to privacy and data protection, as huge amounts of personal information are being collected and processed in increasingly complex and opaque ways. Technological progress has led to the development of massive data sets that can easily be cross-checked and further analysed to look for patterns or to feed algorithm-based decisions, which can provide unprecedented insight into human behaviour and private life.
This is especially true when the global financial system is dominated by a few players. The quality of the rules and the presence of efficient regulators is therefore crucial for economic actors and citizens alike.
As highlighted by Professor Robert Reich at the hearing before the US Senate Committee on the Judiciary Subcommittee on Antitrust, Competition Policy, and Consumer Rights on 5 March 2019, personal data processing is a powerful tool in the hands of a few giants. This raises several issues which Reich encapsulated perfectly, saying: ‘First, it stifles innovation. Big Tech’s sweeping patents and copyrights, huge and growing data collections, dominant networks and platforms, capacities for predatory behavior have become formidable barriers to new entrants. The second problem is that such large size and gigantic capitalisation normally translate into political power. They allow vast sums to be spent on lobbying, political campaigns, and public persuasion. A third problem is the ease by which misinformation can be ‘weaponised’ through these giant networks, as long as Facebook and YouTube possess a virtual duopoly over information flows. The fourth problem is privacy. Big Tech is amassing huge amounts of data about the personal behavior of virtually everyone. The sheer quantity of data has become an entry barrier of its own. So, the concentration of data, information and power in a few giant corporations threaten to undermine civil society’. I think this summarises the situation nicely.
That is why proper antitrust enforcement unlocks competition that benefits user privacy and averts the need for additional regulation. A good balance between data privacy regulation and antitrust enforcement is therefore essential, both for the market and for a healthy economic system.
Why is data privacy important in the digital economy in particular?
Maurizio Mensi: Because data privacy is a cornerstone of personal freedom, particularly in a data-driven economy, where data is such a valuable asset.
The advent of digital technology is shaping a new way of life, in which social relations, business, and private and public services are digitally interconnected. All of this is underpinned by a huge amount of data, much of it personal data. In so-called surveillance capitalism, data privacy and antitrust enforcement are closely intertwined for the benefit of citizens: together they can unlock competition that protects user privacy and avert the need for additional privacy regulation in the platform economy. Data processing techniques may also have an impact on economic governance and democratic processes when used to influence elections, for instance through the 'micro-targeting' of communications to voters. In other words, while privacy was once perceived as a right protecting individuals against unjustified interference by public authorities, in the modern era it may also be threatened by the power of private actors.
Maurizio Mensi: So-called digital platforms are the interface that allows everyone to exchange ideas and knowledge, goods and services: a virtual, data-based market that drastically reduces any form of intermediation and accelerates a process of economic, social and cultural transformation. The added value of the organisational and cooperative models these platforms introduce, in which it becomes difficult to distinguish between producers and consumers, or between employees and the self-employed, lies in almost zero transaction costs, in the multiplication of exchanges and in the ability to satisfy individual demand for any service rapidly and effectively, in many cases through non-professional operators.
All this is based on data, including personal data. Data, collected and analysed with increasingly advanced techniques, is the fundamental resource of this new order, one characterised by flexibility, autonomy and decentralisation. When digital platforms are more powerful than states, the guarantee of a level playing field lies in the power of rules, particularly data privacy rules. Rules rebalance power relations and ensure that compliance is monitored, protecting the general interest together with individual rights. This is the only way to ensure that innovation does not become a source of new discrimination, but remains a tool at the service of collective well-being.