Tuesday, August 15, 2017

Puzzling over Big Data and Data Protection Rights in a European Perspective

We are pleased to publish this guest post by Monica Cappelletti, a Post-Doc Researcher at Dublin City University (DCU).

Puzzling over Big Data and Data Protection Rights in a European Perspective1

The Big Data phenomenon,2 the latest stage in the information and communication revolution, is radically transforming people's daily lives. At the same time, it redefines the relationship between individuals and power, a notion whose very definition is becoming more and more nuanced and articulated. In this context, we should ask what the legal notion of Big Data is and, consequently, what impact this phenomenon has on privacy and data protection rights.

From a technical point of view, Big Data is characterized by the 4Vs: high-Volume, high-Velocity, high-Value and high-Variety “information assets that demand cost-effective innovative forms of information processing for enhanced insight and decision making."3 In other words, Big Data derives from a complex data analysis process, which relies on varied techniques, such as the use of algorithms, the collection of “all data”, and the opacity of the processing.4

Focusing on the European legal framework and the new General Data Protection Regulation (Regulation (EU) 2016/679 - GDPR), it is worth noting three open questions: one regarding the lack of an express definition of Big Data in legal terms; another concerning the consequences for the notion of personal data; and a third dealing with the concept of data protection as a fundamental right.

1. Big Data definition
Although there is no explicit definition of Big Data in the GDPR, it is possible to recover this concept indirectly from the notion of profiling. This “consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyze or predict aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behavior, location or movements” (recital 71 and article 4, GDPR).5

Given the complexity and the evolving nature of Big Data, the EU legislator preferred not to mention or specify this particular type of data as an autonomous one, but instead to focus on the activities of the process, more precisely on the activities and analyses that this data makes possible. In some ways, we have been moving from a type-of-data perspective (personal data, sensitive data, etc.) to a process-based approach, which elaborates different categories of data.

This trend of emphasizing the process in order to conceptualize Big Data6 has been confirmed recently by the European Parliament, arguing that “big data refers to the collection, analysis and the recurring accumulation of large amounts of data, including personal data, from a variety of sources, which are subject to automatic processing by computer algorithms and advanced data-processing techniques using both stored and streamed data in order to generate certain correlations, trends and patterns.”7

2. Personal Data definitions and Big Data
The shift towards the process may have an impact on the traditional European categories of data (personal data and sensitive data, articles 4, n. 1 and 9, GDPR), which tend to blur. In fact, compared to the classification of personal/sensitive data, we are faced with something innovative. Although Big Data can consist of personal, or even sensitive, data, new data categories are emerging, such as observed data, derived data or inferred data.8 These new categories are generated automatically by technology, even if they are only partially linked to a person. Consequently, we should ask what kind of legal guarantees should be implemented in order to ensure the same standard of protection of fundamental rights whenever these new data categories directly affect individuals' private lives.

Furthermore, the GDPR has specified some peculiar types of sensitive data, such as genetic and biometric data (article 4, nn. 13 and 14, GDPR). Regarding the latter, it is worth pointing out that this notion is broad enough to include the physical and physiological features of a person, as well as behavioral characteristics.9 In other words, we should reshape the conception of sensitive data, since it seems to identify not only “racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership” (article 9, GDPR), but also whatever is linked with our behavior as a whole. In this perspective, we should first delineate what behavior is, and second identify the “new” line between what is sensitive and what is not.

3. Data Protection as a fundamental right
The third open question is the issue of data protection as a fundamental right, more specifically whether it is an autonomous right or a specification of the right to privacy.10 Due to the Big Data phenomenon, we should address a different phase in the legal debate. As a matter of fact, the essence of the guarantee should be centered more on data and on the right of individuals to protect and control their own personal information, considering that there is a new asymmetry between these two rights. In other words, the axis should no longer tilt in favor of protecting the privacy of individuals, but increasingly in favor of the protection of personal data.

4. Concluding remarks
The multidimensional Big Data phenomenon opens new and unprecedented fronts in the field of personal data protection. The challenge depends not only on the achievable implementation of the privacy-by-design principles already embodied in the GDPR, but also on the strengthening of the data protection right as an autonomous and distinct fundamental right, increasingly providing for specific safeguards and guarantees.

However, since a large part of the information used in Big Data analytics may concern “behavioral information” about people, constitutional and legal solutions (in terms of individuals' - rectius, data subjects' - rights and data security requirements) should go hand in hand with a greater awareness in each person of his or her right to control his or her own data flow in an effective and transparent way.
Just as freedom of expression shaped democratic society, the right to data protection will be the pillar of the future hyper-technological society.


1.  Short summary of the discussion paper presented at the 2017 ICON Annual Conference (University of Copenhagen, 5-7 July 2017).
2. Big Data is currently a hot topic that requires much scrutiny. Consider the everyday news regarding Big Data, for example concerning the use of algorithms in finance or security. See recently O’Neil, How can we stop algorithms telling lies?, The Guardian, July 16, 2017, available at https://www.theguardian.com/technology/2017/jul/16/how-can-we-stop-algorithms-telling-lies. For a preliminary legal reflection, see Zeno-Zencovich, Vincenzo, and Giannone Codiglione, Giorgio, Ten Legal Perspectives on the “Big Data Revolution”, 23 Concorrenza e Mercato 2016 (February 1, 2017), Special Issue on Big Data (F. Di Porto ed.), 29-57, available at SSRN: https://ssrn.com/abstract=2834245.
3. For the 4Vs theory see Iafrate, Fernando, Advances in information systems set. From big data to smart data [2015]. Consider also Information Commissioner’s Office, Big Data, artificial intelligence, machine learning and data protection, 6 [2017], available at https://ico.org.uk/for-organisations/guide-to-data-protection/big-data/.
4. The Information Commissioner’s Office describes aspects of big data analytics in the Big Data context; see Information Commissioner’s Office, Big Data, artificial intelligence, cit., 10-14.
5. Regarding Big Data and the new Regulation, refer to European Commission, The EU Data Protection Reform and Big Data (Factsheet), March 2016, available at http://ec.europa.eu/justice/data-protection/files/data-protection-big-data_factsheet_web_en.pdf.

6. In the European framework there are many definitions of Big Data. However, all these notions point out its “process nature”. Refer to the European Data Protection Supervisor, Opinion 7/2015, Meeting the challenges of big data, 19 November 2015, 7; recently, ID., Opinion 8/2016, EDPS Opinion on coherent enforcement of fundamental rights in the age of big data, 23 September 2016. Consider also the Article 29 Data Protection Working Party, Statement of the WP29 on the impact of the development of big data on the protection of individuals with regard to the processing of their personal data in the EU, adopted on 16 September 2014. Lastly, it is worth mentioning that even the Council of Europe has recently adopted Guidelines on the protection of individuals with regard to the processing of personal data in a world of Big Data (T-PD(2017)01), Strasbourg, 23 January 2017.

7. European Parliament, Fundamental rights implications of big data. European Parliament resolution of 14 March 2017 on fundamental rights implications of big data: privacy, data protection, non-discrimination, security and law-enforcement (2016/2225(INI)), P8_TA-PROV(2017)0076 (Provisional Edition), March 17, 2017.
8. Information Commissioner’s Office, Big Data, artificial intelligence, cit., 12-14.
9. According to the GDPR, “biometric data means personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data” (article 4, n. 14, Regulation (EU) 2016/679).

10. In the EU context, the right to privacy and the right to data protection are expressly recognized by articles 7 and 8 of the EU Charter of Fundamental Rights. There is a wide legal debate regarding the autonomous or separate nature of these rights; among various commentaries, refer to Lynskey, Orla, The Foundations of EU Data Protection Law, Oxford University Press, 2015; Tzanou, Maria, Is Data Protection the Same as Privacy? An Analysis of Telecommunications’ Metadata Retention Measures, 17(3) Journal of Internet Law, 2013, 20-33.

