The European Union’s General Data Protection Regulation (GDPR), adopted in 2016 and applicable since May 2018, is one of the most detailed legislative schemes in the field of data protection. This article discusses two libertarian-minded objections to its approach. First, I argue that the notion of “right” adopted in the GDPR is flawed. Second, I show that the GDPR does not protect individuals from data-hungry governments and corporations. In the end, data protection legislation makes people strong in theory but weak in practice, while making powerful private and public entities weak in theory but strong in practice.
The GDPR seeks to protect fundamental individual rights relating to the collection and processing of personal data. These include the right of access, the right to rectification, the right to erasure (also known as the right to be forgotten), the right to restriction of processing, the right to data portability, the right to object, and the right not to be subject to automated decision-making.
Libertarian reductionism holds that human rights are natural rights and that natural rights are property rights. The nonaggression principle states that any initiation of violence, that is, any aggression against property, is illegitimate. However, some of the fundamental rights protected by the GDPR violate the nonaggression principle. For example, the right to be forgotten can be invoked by an individual to force tech companies like search engine providers to obscure results about her. The GDPR seems to adopt the view that data subjects own their personal data, but this is debatable.
For example, a user who interacts with Google’s hardware and software, thus producing personal data, is not the only owner of this data, because she generated it using Google’s infrastructure. The same goes for any personal data that is produced by interacting with other people, both online and in person. Moreover, when Google shows publicly available information in its search results, it is hardly violating anyone’s property rights.
From a libertarian perspective, it is a big stretch to claim that the law should give users the “right” to force companies to delete data about them, because this implies that these companies are not free to use their property (their hardware and software) and public information as they wish. Similar objections can be raised against other “rights” as well. The fact that at least some of the “fundamental rights” protected by the GDPR cannot be reduced to property rights is highly problematic: in the absence of well-defined property rights, the GDPR can be used to legalize aggression against persons and entities.
The GDPR aims at protecting individuals from the exploitation of personal data, but as is often the case with state regulation, it endangers individuals and favors big companies and governments.
First, the GDPR treats privacy as a fundamental right, but in most cases that right must be invoked by individuals in order to be enforced. For example, in the case of the automated processing of data, users are granted the right to ask for human intervention before a decision is taken. Given that the vast majority of people have neither the time, the resources, nor the ability to actively engage with the tens or hundreds of private and public entities that handle their data, this amounts to giving controllers and processors carte blanche with regard to data processing in general and automated processing in particular.
Second, the GDPR does little to nothing against the abuse of power that may come from the state. Users’ privacy rights can be suspended or restricted whenever some kind of public security issue or “legitimate interest” arises. For example, recital nineteen of the GDPR states,
This Regulation should provide for the possibility for Member States under specific conditions to restrict by law certain obligations and rights when such a restriction constitutes a necessary and proportionate measure in a democratic society to safeguard specific important interests including public security and the prevention, investigation, detection or prosecution of criminal offenses or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security. This is relevant for instance in the framework of anti–money laundering or the activities of forensic laboratories.
These kinds of clauses sound appealing, but they are full of empty words (“democracy,” “important interests,” “public security,” and the like) that recur throughout the GDPR. On the one hand, public institutions are supposed to protect individuals’ privacy rights; on the other hand, they can exempt themselves from the obligations laid down in the GDPR in the name of “national security.” It is ironic that individuals are granted so many “rights” only for governments and companies to be legally authorized to override them in so many ways. Regulators are not protecting data when they carve out exceptions and give themselves and private companies the green light to ignore individuals’ privacy: they are just making these “exceptions” legal.
Third, the GDPR states clearly that the most important duty of controllers, processors, data protection officers, the European Data Protection Board, and the like is to ensure compliance with the GDPR. However, complying with the GDPR is one thing; protecting data effectively is another.
For example, Daniel Solove points out that GDPR consent requirements are a fiction, because the scale of data processing is so overwhelming that individuals cannot possibly deal with hundreds of privacy notices. Also, individuals must take active steps to invoke their rights, something most people will not and cannot do. Moreover, the GDPR lays down many legal grounds for processing personal data that do not require individual consent, such as legitimate interest or public safety. In the end, as long as private and public entities formally comply with the GDPR, they do not need to care much about actual individual preferences or about actual data protection.
The GDPR both overshoots and undershoots. On the one hand, it overshoots because individuals are granted “rights” that may be used to violate other entities’ property; the main issue is that property rights in personal data are not well-defined. On the other hand, the GDPR undershoots because individual “privacy rights” as defined by European Union regulators are a fiction that can be legally overridden by corporations and by public institutions for a variety of reasons.
The GDPR paradox is that it gives individuals rights that they do not have while undermining their practical ability to protect personal data from powerful third parties. Conversely, private and public processors are denied legitimate property rights but are protected by law in their daily mission to take advantage of personal data. Without a clear definition of property rights and privacy in the domain of personal data, regulations can only generate confusion and paradoxes.
Andrea Togni is a philosophy and history teacher at Liceo Medardo Rosso (Lecco).