The New Rules of Data Privacy

  • Hossein Rahnama
  • Alex “Sandy” Pentland


Navigating privacy protection, new regulation, and consumer revolt.

After two decades of data management being a wild west, consumer mistrust, government action, and competition for customers are bringing in a new era. Firms that generate any value from personal data will need to change the way they acquire it, share it, protect it, and profit from it. They should follow three basic rules: 1) consistently cultivate trust with customers, explaining in common-sense terms how their data is being used and what’s in it for them; 2) focus on extracting insight, not personally identifiable information; and 3) have CIOs and CDOs work together to facilitate the flow of insights, with a common objective of acquiring maximum insight from consented data for the customer’s benefit.

The data harvested from our personal devices, along with our trail of electronic transactions and data from other sources, now provides the foundation for some of the world’s largest companies. Personal data is also the wellspring for millions of small businesses and countless startups, which turn it into customer insights, market predictions, and personalized digital services. For the past two decades, the commercial use of personal data has grown in wild-west fashion. But now, because of consumer mistrust, government action, and competition for customers, those days are quickly coming to an end.


  • Hossein Rahnama is Associate Professor with the Creative School at Ryerson University in Toronto and a Visiting Professor with the MIT Media Lab in Cambridge, Massachusetts. A recognized computer scientist known for his work in context-aware computing, Hossein is the founder and CEO of Flybits, a technology firm that helps companies synthesize digital customer experiences from enterprise data assets.
  • Alex “Sandy” Pentland is the Toshiba Professor of Media Arts and Sciences with the Media Lab, Sloan School of Management, and College of Computing at MIT. Sandy directs MIT’s Connection Science and Human Dynamics research laboratories, advises the OECD and UN, and co-led the World Economic Forum personal data initiatives.


Article Contents

1. Introduction
2. The Law of Everything Applied to Everyone
3. Meaningless Law on the Books: The Tension between Complete and Effective Protection
4. Introducing ‘Site-Level’ Flexibility
5. Conclusion


Complete and Effective Data Protection


Orla Lynskey, Complete and Effective Data Protection, Current Legal Problems, Volume 76, Issue 1, 2023, Pages 297–344, https://doi.org/10.1093/clp/cuad009


Data protection law is often invoked as the first line of defence against data-related interferences with fundamental rights. As societal activity has increasingly taken on a digital component, the scope of application of the law has expanded. Data protection has been labelled ‘the law of everything’. While this expansion of material scope to absorb the impact of socio-technical changes on human rights appears justified, less critical attention has been paid to the questions of to whom the law should apply and in what circumstances. The Court of Justice has justified an expansive interpretation of the personal scope of the law in order to ensure ‘effective and complete’ data protection for individuals. This article argues that the attempt to make the protection offered by the law more ‘complete’ risks jeopardising its practical effectiveness and raises doubts about the soundness of the regulatory approach to data protection. In the quest for effective and complete protection, it seems that something must give.

The right to data protection enjoys a privileged position in the EU legal order. 1 The right is strictly interpreted by the Court of Justice of the EU (CJEU) and is given remarkable weight when balanced with other rights and interests. 2 While data protection sits alongside the more established right to respect for private life in the EU Charter, 3 it is data protection rather than its more established counterpart that is specifically referenced in the EU’s General Data Protection Regulation (GDPR). The GDPR, like its predecessor, a 1995 Directive, has influenced the adoption of European-style data protection laws globally. 4 Recently adopted EU legislative initiatives in the digital sphere, such as the Digital Markets Act 5 and the Digital Services Act, 6 are all ‘without prejudice to’ the GDPR. 7 Data protection is, therefore, both a cornerstone of EU digital regulation and its international poster child, and is treated as an ‘issue of general and structural importance for modern society’. 8 Yet, set against this success story of EU data protection law, recurring reservations have been expressed about both its boundaries and its capacity to achieve its objectives in practice. 9

A key concern is that EU data protection has become the law of everything applied to everyone, putting compliance with the legal framework, and those charged with its enforcement, under strain. This development of the law is driven, to a large extent, by the jurisprudence of the CJEU. Scholars attribute the broad scope of the law to the need to protect fundamental rights in the context of significant socio-technical changes. 10 Since the 1970s, when data protection laws were first adopted, these laws have sought to address the risks and harms for fundamental rights that stem from personal data processing. 11 At that time, the primary focus was on mitigating the adverse effects that might follow for individuals from the holding and controlling of files on them and the combining of information across databases and computer systems. 12 Although these concerns are still present, the technological and societal landscape has shifted dramatically. Advances in automation, such as the widespread availability of generative AI, will further unsettle the environment to which the law applies and which shapes its application.

To date, the law has expanded to absorb the impact these socio-technical changes might have on fundamental rights with the Court emphasising the need for ‘effective and complete’ data protection in its jurisprudence. This article argues that the broad personal scope of application of the law—the attempt to make the protection offered by the law more ‘complete’, in the language of the Court—risks jeopardising its practical effectiveness and raises doubts about the soundness of the regulatory approach to data protection. 13 In the quest for effective and complete protection, it seems that something must give. While a broad application of the concept of personal data is necessary to protect fundamental rights in light of socio-technical developments, the legislature may need to revisit to whom the law applies and what obligations adhere to distinct controllers under the legal framework. This inquiry also illuminates the need for further reflection and research on the relationship between the law’s scope, compliance with the law by its addressees and its enforcement by regulators.

This argument proceeds in three parts. First, it outlines why it is now argued that data protection has become the law of everything but suggests that the more significant development is the application of the law to everyone, with few exceptions to its material and personal scope of application. While existing legal literature has queried whether the law should apply to everything, much less attention has been dedicated to the question of whether everyone should be subject to the same legal obligations. Second, it demonstrates that this ideal of complete protection is leading to cracks in the legal framework and suggests that these cracks are currently being patched over by Courts and regulators in a way that is itself antithetical to effective data protection. Third, it interrogates whether some of these problems might be addressed by adopting a more flexible approach to data protection interpretation and enforcement. This approach itself raises fundamental questions that must be addressed, suggesting the time may be ripe for a more radical rethink of the data protection framework. 14

Data protection is a regulatory regime that puts in place a series of both rules and principles that must be applied whenever personal data is processed. It regulates the creation, collection, storage, use and onward transmission of personal data, amongst others. 15 At its most basic, when the data protection framework applies, personal data processing can be legitimised if certain conditions are met: there must be a legal basis for processing and adherence to the principles of fair data processing. 16 The legal framework thus imposes compliance obligations primarily on ‘data controllers’ and grants rights to individuals (‘data subjects’). 17 An innovation in the GDPR is the introduction of a suite of meta-regulatory obligations, including an obligation of demonstrable accountability applicable to controllers and various other compliance requirements such as the need to conduct data protection impact assessments and to appoint a data protection officer (DPO) in some circumstances. 18 In the EU, this legislative framework is undergirded by the right to data protection found in the EU Charter of Fundamental Rights. 19 The Court has held in its caselaw that the very act of personal data processing engages the right to data protection and must therefore comply with the requirements set out in Article 8 EU Charter. 20 The legislative framework could therefore be viewed as something that simultaneously facilitates the interference with a fundamental right while allowing for the justification of this interference if its legal requirements are satisfied. 21 From a human rights law perspective, the entire legislative framework functions as a justificatory regime. The implicit aim of the legal framework is to ensure that data processing operations are proportionate in that they pursue a legitimate aim and contain safeguards to ensure they do not go beyond what is necessary to achieve that aim.

Since the adoption of data protection laws by the EU in 1995, the data protection framework has been characterised by its expansive scope of application. The key concepts determining the material scope of application of the EU system are defined broadly, with exceptions construed narrowly. It follows that as societal activity now increasingly has a digital component, data protection has become an almost unavoidable legal framework 22 : data protection is the law of everything, 23 applied to everyone. This is, however, as much a result of a legal evolution as it is a socio-technical one. This section will trace how this has come to pass. The material and personal scope of the rules are defined and interpreted expansively while exceptions to their scope have been construed restrictively. Moreover, attempts to limit this expansionist approach have been rejected by the CJEU. Later sections will explore the implications of this expansionist approach for effective data protection.

A. The Law of Everything

Data protection law applies to the processing of personal data. Any operation or set of operations performed upon personal data, whether by automatic means or not, constitutes processing. It is therefore difficult to conceive of any type of activity with a digital component that would not constitute processing. 24 The only limitation found in the law is that where the processing is conducted manually, as opposed to fully or partly automated processing, the data processing must form part of a filing system which allows for the easy retrieval of an individual’s data file. 25 For the law to apply, however, it is personal data that must be processed.

Data protection law operates in a binary way: it applies when the data processed are classified as ‘personal’ data but does not apply to the processing of non-personal data. 26 Much therefore hinges on what is classified as ‘personal data’. Anonymous data is not treated as personal data whereas data that is pseudonymised, where the data can only be attributed to a specific individual once combined with additional information which is separately held and subject to additional measures to ensure non-attribution, is personal data. 27 The scope of the term personal data is wide, as we shall see, and what constitutes personal data is varied. 28 Personal data is defined as ‘any information relating to an identified or identifiable natural person’. 29 While much of the focus in the existing doctrine is on the issue of identifiability 30 —what does it mean for an individual to be identified and when is an individual identifiable—the other elements of the definition may be equally consequential for its application. Indeed, while it is necessary to disaggregate these elements in order to apply this definition, it is only by considering them together that the overall reach and impact of the law can be determined. Some examples may help to illustrate these points.

Many publishers describe the peer review process as anonymous on the basis that the data being processed—in this case the article distributed for peer review and the comments of the reviewers—do not reveal the identity of the individuals at stake. 31 Anonymity in this colloquial sense is distinct from anonymity as defined in the GDPR. In the peer review context, individuals are deemed anonymous if they cannot be identified or identifiable from the data immediately available to authors or reviewers (an errant reference to previous work revealing an author’s identity, for instance). 32 However, for GDPR purposes, irrespective of whether the article or review allowed for an individual’s immediate identification, they would meet the legal standard for identifiability. An individual is considered identifiable where they can be identified, directly or indirectly, using means reasonably likely to be used by the data controller or by any third party. In this example, the identifiability threshold is easily met as the journal editor is able to identify both the author of the article and the reviewer even where they remain unknown to one another. We might be tempted to stop the analysis here; however, the remaining elements of the definition must also be met. If an unreliable author submitted a piece of work that had been generated by ChatGPT and contained inaccuracies attributed to non-existent sources, this would nevertheless constitute ‘information’. 33 Instinctively, we might also think that an academic article could not be personal data as its content is not about a particular academic; it is simply the output of their efforts. 34 However, the Court endorsed a much more capacious vision of personal data in Nowak, finding that information can relate to an individual in so far as it is linked to the individual by reason of its ‘content, purpose or effect’. 35 In Nowak, the CJEU considered that the examination script of a candidate in an open book examination ‘related to’ the candidate as the content of the answers reflected the extent of the candidate’s knowledge and competence; the purpose of the processing was to evaluate their professional abilities and suitability for practice; and the use of that information would be liable to have an effect on their rights and interests. 36 The Court also held that the examiner’s comments related to the candidate as, amongst other reasons, their purpose is to record the examiner’s evaluation of the candidate’s performance and they are liable to affect the candidate. 37 This reasoning would apply by analogy to an article submitted for peer review and the comments of the reviewer. Despite the fact that publishers tend to refer to this process as anonymous, suggesting it would fall outside the law’s scope, we would therefore conclude that the peer review process constitutes personal data processing to which the data protection framework applies.

A further example is the act of uploading some content to social media, for instance, a photograph with friends or a video of colleagues. This would again easily meet the threshold criteria for the law to apply. Personal data can be any information: it is not restricted to information that is private or sensitive. 38 This information is linked to the individuals depicted in terms of its content: it is about them, and the processing of this information might impact upon them, for instance, if they were photographed with friends during the working day. Even if they could not be immediately identified on the basis of the photograph, they are identifiable at least to the person who uploaded the content online. Notably, they are also potentially identifiable to third parties such as phone companies if, using means reasonably likely to be used, they could combine this data with other data they hold, such as geo-location data, to identify the individuals concerned. 39 Here one might object that the social media user has a right to impart information as part of their right to freedom of expression, thus excluding the data protection rules. However, rather than excluding protected free speech from the scope of data protection law, such speech is brought within the data protection framework, and tensions between data protection and freedom of expression are reconciled from within it. 40 This is similar to the example provided by Advocate General Bobek in his Opinion in X and Z: an individual in a pub who shares an e-mail containing an unflattering remark about a neighbour with a few friends becomes a data controller subject to the GDPR’s obligations. At the hearing in that case, the Advocate General noted that the Commission accepted that even the incidental processing of personal data triggers the GDPR’s rights and obligations and that it had difficulty explaining where the limits of the law lie. 41

At its more extreme, the literature provides examples of data which can plausibly be argued to meet the definition of personal data although intuitively ‘far from being “personal”’. 42 Purtova takes the example of a smart city project in the Dutch city of Eindhoven initiated by a public–private collective to anticipate, prevent or de-escalate anti-social behaviour on Stratumseind, a street known for its social life. The data used for this behavioural regulation is gathered from multiple sources and includes weather data, such as rainfall per hour and wind direction and speed. Purtova reasons that weather contains information which is then datafied; that this relates to individuals, as it can be used to assess and influence behaviour deemed undesirable; and that this information, when combined with other data collected via sensors, can lead to the identification of individuals. Indeed, this is the very purpose of the Stratumseind 2.0 project. She proposes that weather data could therefore be classified as personal data. Others have applied similar analysis to other environmental artefacts, such as wastewater. 43 Once we start to look around us to apply this definition, we see that almost all data is potentially personal data if applied to evaluate or influence individuals, thus making data protection the law of everything (or almost everything).

This development is desirable if we consider that it is no longer simply data about an individual that might be leveraged to impact upon their rights. 44 Take, for instance, synthetic or artificial data derived from personal or non-personal data to create replica datasets. Such synthetic data may be used to make significant and impactful decisions about identified individuals. In such circumstances, it could be classified as personal data under the GDPR. 45 While this might seem to confirm Purtova’s concerns that data protection law is the law of everything, Dalla Corte highlights that information that relates to someone as a result of its impact on them will not necessarily be personal throughout its entire lifecycle. 46 For instance, data about the performance of a vehicle is non-personal data until the point when it relates to someone, such as when it is used to evaluate a driver’s performance. 47

A further feature of the legal framework is that while ‘personal data processing’ is potentially all encompassing, the limited derogations to the material scope of the GDPR are construed restrictively. Data processing for EU external action, national security purposes and processing by competent authorities for law enforcement purposes fall outside of the GDPR’s ambit, 48 as does data processing undertaken by the EU institutions. 49 The only other derogation is for data processing for ‘purely personal or household purposes’. 50 The uploading of content to social media might seem to constitute such a purpose; however, this is not necessarily so, as the Lindqvist case demonstrates. Mrs Lindqvist was a church catechist in Sweden who, as coursework for an evening class on computer processing, uploaded short descriptions of her colleagues to the church website. She was criminally prosecuted for illegal data processing; amongst the many defences invoked in the ensuing court proceedings was the argument that Mrs Lindqvist was engaged in ‘purely personal or household’ processing. The CJEU acknowledged that Mrs Lindqvist’s activities were charitable and religious rather than commercial 51 but refused to apply this derogation. It considered that the processing concerned was ‘clearly not’ carried out in the course of an activity relating to the private or family life of individuals, as the internet publication resulted in the data being made accessible to ‘an indefinite number of people’. 52 In later jurisprudence, the Court found that when a home security camera used for personal security captures not only the home but the public footpath outside, it too cannot benefit from this derogation. 53 In this way, many of the routine data processing operations of individuals are brought within the law’s fold.

As this section suggests, the concept of personal data has the capacity to bring all impacts of data usage on the fundamental rights of individuals within the remit of data protection law. Given that the law is concerned with the protection of rights rather than the protection of data per se, this expansion is desirable and legitimate. For instance, at the point at which weather data is used to assess an individual’s potential criminality, it is appropriate that legal protections are activated. However, as Mrs Lindqvist’s case suggests, this does raise questions about to whom the law applies and the extent of their obligations under the framework. It is these questions of scope that require further consideration and to which we shall now turn.

B. Applied to Everyone? The Data Controller and Joint Controllership

To whom does this vast legal framework apply? Data protection law distinguishes between data subjects, the individuals whose personal data are processed, and data controllers and processors, who initiate and undertake the data processing. Data controllers act as the ‘intellectual lead’ 54 or brains behind the data processing operation—determining the purposes and means of processing 55 —while data processors act as the brawn—conducting the data processing under the instruction of the data controller. 56 Primary legal responsibility is attributed to the data controller, although the GDPR does confer specific responsibilities on the data processor for some tasks. 57

While these concepts and the division of labour between them appear clear, already in 2010 it was noted that their concrete application was becoming increasingly complex, leading to uncertainty regarding responsibilities under the framework. 58 The main reason for this complexity is that modern data processing is itself complex 59 : unlike the conditions that prevailed when data protection laws were first adopted, control over processing is no longer centralised 60 or exercised by singular actors who use available technologies for easily distinguishable purposes. 61 Moreover, technologies confound the distinction between means and ends that the GDPR deploys: determining the appropriate technical tools for the job (de facto a task often assumed by the processor) can have a significant bearing on the purposes to which those tools can be put and, ultimately, the functioning of a socio-technical system.

This messiness of the socio-technical environment is recognised, to some extent, through the concept of joint controllership: controllers can determine the purposes and means of processing alone or ‘jointly with others’. This joint control can take different forms: it can result from a common decision on purposes and means or from ‘converging decisions’, where complementary decisions have a tangible impact on the purposes and means of processing and the processing would not be possible without the participation of the jointly controlling entities. 62 For our purposes, what is significant is that the concept of controllership is both defined and interpreted expansively. Per the definition, a controller or a joint controller can be a ‘natural or legal person, public authority, agency or any other body’. 63 Like other forms of regulation, such as environmental regulation and consumer protection laws, data protection is a form of mixed-economy oversight: the law, therefore, applies equally to public actors, such as local authorities or departments of government, and to private enterprise. For the latter, there is little differentiation made between large multinational companies and the local corner shop. 64 Moreover, the law brings individuals within its reach as data controllers, subject to the limited derogation for purely personal and household processing noted above.

The CJEU has had opportunity to interpret the notion of controllership on numerous occasions, taking these opportunities to stretch the concept to ensure the ‘complete and effective’ protection of individuals. We could locate the foundations for this broad approach in the Court’s Google Spain judgment. While this ruling is best known for its recognition of a ‘right to be forgotten’ in EU data protection law, its finding that the Google search engine is a data controller was also momentous. 65 Notably, in an earlier advisory opinion on the application of data protection law to search engines, the advisory body comprised of data protection regulators (the Article 29 Working Party) had considered that where a search engine acts purely as an intermediary, the principle of proportionality requires that it should not be considered the principal controller of the content. 66 However, the Court implicitly rejected analogies with other areas of law where intermediaries such as Google Search enjoy quasi-immunity from liability for hosting illegal content until they have actual or constructive awareness of such content. Google had argued that when providing hyperlinks to content already available online it did not differentiate between links to primary publications containing personal data and those that did not. 67 The Court applied the controllership test broadly, finding that in the context of this linking activity it is the search engine operator that determines the purposes and means of the personal data processing. 68 It considered that it would be contrary to the clear wording of the definition of data controller and its objective to exclude search engine operators, going on to note that the role of search engine operators is distinct from that of primary publishers and that the former is liable to affect the fundamental rights to privacy and data protection ‘significantly and additionally’ compared with the latter. 69 Importantly, the Court considered that the objective of the broad definition of data controller is to ensure ‘effective and complete protection of data subjects’. 70

Later jurisprudence brought this concern for the ‘effective and complete’ protection of individuals to the fore, sometimes at the expense of the law’s literal meaning. 71 In Wirtschaftsakademie (Facebook fan pages) the Court held that the administrator of a fan page on Facebook was a joint controller. 72 Visitors to the fan page, both Facebook users and non-users alike, had data collecting cookies placed on their devices by Facebook, and the Court reasoned that the fan page operator provided Facebook with this opportunity. 73 Moreover, the fan page operator also defined the parameters for the statistical analysis of visitors’ data conducted by Facebook, thereby contributing to and determining the purposes and means of processing. 74 The later Fashion ID case, where the Court considered whether the integration of a Facebook social plug-in (the Facebook ‘like’ button) into a website was sufficient to make the website operator a joint controller, confirmed that the definition of parameters for data analytics by a Facebook fan page was not what was decisive. 75 In Fashion ID, the mere presence of the piece of Facebook code on the website—triggered when the website was consulted—was sufficient to transmit data from the website user’s device to Facebook. The website visitor did not need to click on the plug-in or be a Facebook user for this to occur. 76 The Court was asked whether embedding a piece of Facebook code on a website was sufficient for the website operator to constitute a data controller, particularly given that once the data was transmitted to Facebook the website operator had no influence on the subsequent data processing. The Court broke the data processing operations down into segments. It determined that Fashion ID exercised joint control over the collection and transmission of the personal data of visitors to its website, a first segment; however, it was not responsible for subsequent processing operations, over which it had no influence. 77 Specifically with reference to the means of processing, the Court emphasised that Fashion ID was ‘fully aware’ of the fact that the embedded plug-in served as a tool for the collection and transmission of personal data to Facebook. 78 The Court concluded that through the embedding of the plug-in on its website, Fashion ID exerted ‘decisive influence’ over the data processing that would not have occurred in the absence of the plug-in 79 and that there was joint control over the data processing operation. 80 In support of this conclusion, the Court pointed to the mutual benefit the data processing provided to Fashion ID and Facebook Ireland. 81

In both of these instances, the fan page and website operators did not ‘hold’ or have access to the data undergoing processing, thus rendering them incapable of complying with the vast majority of the regulatory framework (a point to which we shall return). The Court addresses this point, finding that the classification of data controller does not necessitate that the data controller has access itself to the personal data collected and transmitted. 82 Implicitly, the role of facilitating and benefiting from data processing is sufficient to incur legal responsibility. 83 Jehovan todistajat offers more explicit confirmation of this understanding of controllership in the context of the relationship between the Jehovah’s Witness community, its congregations and its preaching members. 84 In the conduct of their preaching activities, preaching members of the Jehovah’s Witness community (the community) took notes regarding the people they met. These notes served the dual purpose of acting as an aid for future visits and of compiling a ‘refusal register’ of those who did not want to be contacted again. The community and its congregations coordinated this preaching activity by creating maps allowing for the allocation of areas between preaching members and keeping records about preachers and the number of leaflets they distributed. 85 While the preaching members received written guidelines on note-taking published in a magazine for members, they exercised their discretion as to the circumstances in which they should collect data; which data to collect; and how those data were subsequently processed. 86 Yet, the role of the community in ‘organising, coordinating and encouraging’ this preaching activity was sufficient for it to be deemed a joint controller. 87

In Jehovan, we might distinguish between the overarching aim or purpose of data processing—to encourage new members to join the community—which was determined by the community, and more essential elements of the processing (such as which data should be processed and who should have access to them), which were determined by the preaching members. 88 The orchestrating role of the community is sufficient to establish responsibility under data protection law, without the need for access to the data 89 or for the production of written guidelines around data processing. 90 This is perhaps unsurprising given that the preaching was carried out in furtherance of the overarching objectives of the community—to spread its faith—and the community acted as the ‘intellectual lead’ on the data processing. In a subsequent case, the Court is asked to determine whether a standard-setting organisation that offers its members a standard for managing consent, specifying how personal data is stored and disseminated, is a data controller. 91 The way in which the standard-setting organisation ‘organises and coordinates’ personal data processing through this standard seems highly likely to meet the criteria set by the Court in Jehovan.

This low legal threshold for controllership, when combined with technical–organisational developments, particularly the increasingly interconnected nature of information systems and markets, will therefore make joint controllership more prevalent. 92 This has the benefit of enabling regulators to more easily bring complex data processing structures within their regulatory remits, as was the case in the standard-setting investigation noted above. However, it also brings more individuals and tangential actors within the law’s fold. We might conclude that, to the extent that it is necessary to establish ‘which level of influence on the “why” and “how” should entail the qualification of an entity as a controller’, 93 the answer is very little. This caselaw leaves one with the impression that everyone is responsible for data processing, from the facilitators (such as Fashion ID) to the orchestrators (such as the community). Data protection is, it seems, the law of everything applied to everyone. We will return to the question of whether this is desirable below.

C. Failed Attempts to Limit the Law

This expansive evolution of the scope of data protection law has been challenged. Prior to the development of European case law, British courts tended to interpret its material scope more restrictively. The notion of processing was interpreted narrowly to exclude the act of anonymising personal data on the grounds of ‘common sense and justice alike’ 94 while information only constituted personal data relating to someone when it was private or biographical in a significant sense. 95 At European level, pushback has come from within the Court in the Opinions of its Advocates General.

Advocate General Sharpston sought to keep the material scope of the rules in check by proposing alternative readings of the concepts of automation, processing and personal data in her Opinions. It is recalled that the GDPR applies to personal data that is processed manually as part of a filing system or that is processed ‘wholly or partly by automated means’. In an early case where the right to access documents was pitted against the data protection rights of those featuring in the documents, she sought to avoid a balancing of interests by suggesting that the data protection rules did not apply. The retention and making available of these meeting minutes using a search function was not, she opined, ‘automated’ processing. Her reasoning was that throughout this process the ‘individual human element plays such a preponderant part and retains control’ 96 in contrast to ‘intrinsically automated’ processing operations such as the loading of a website. The search function, like the use of an electric drill, could be replicated by humans but simply with less efficiency. 97 This reasoning was undoubtedly influenced by the Advocate General’s opinion that ‘the essence of what is being stored is the record of each meeting, not the incidental personal data to be found in the names of the attendees’. 98 Had the Advocate General’s reasoning been accepted, the range of processing operations to which the data protection framework would apply would have been dramatically limited. 99 The Court did not follow, or even acknowledge, the Advocate General’s attempt to place boundaries around the notion of personal data processing. 100

When the Court was asked to consider whether the legal analysis found in an administrative note concerning the immigration status of several individuals constituted personal data in the YS, M and S case, Advocate General Sharpston again proposed to restrict the law’s material scope. As in Bavarian Lager, she emphasised the human dimension of the processing: legal analysis is a process controlled entirely by individual human intervention, through which personal data (in so far as they are relevant to the legal analysis) are assessed, classified in legal terms and subjected to the application of the law, and by which a decision is taken on a question of law. 101 Once again, the Court did not acknowledge this perspective. It did, however, find her opinion on what constitutes personal data more persuasive. She suggested that the definition of personal data should be confined to ‘facts’ about an individual, whether objective (e.g. weight in kilos) or subjective (underweight or overweight), 102 to the exclusion of the reasoning or explanation used to reach such conclusions or facts. 103 She was unconvinced that the definition of personal data should ‘be read so widely as to cover all of the communicable content in which factual elements relating to a data subject are embedded’. 104

The Court concurred, finding that legal analysis is not information relating to the applicant but is, at most, information about the assessment and application of the law to the applicant’s situation. 105 Like the Advocate General, it supported this conclusion by reference to the broader legal framework, suggesting that its interpretation was borne out by the law’s objectives and general scheme. 106 It reasoned that in order to promote the law’s objectives of protecting fundamental rights, including privacy, the law gives individuals the right to access data to conduct ‘necessary checks’ (to check its legality; to rectify or delete in some circumstances). In this instance, as the legal analysis itself is not liable to be subject to the checks set out in the right to access, such as an accuracy check, granting access to the data would not serve the law’s purpose. 107 The Court’s reasoning in this case is flawed: it rendered the scope of application of the legal framework contingent on whether substantive rights can be exercised in a particular scenario, although the scope of the legal framework is a logically prior question. 108 What is notable, however, is that YS is a ‘rare instance in which the Court has read the concept of “personal data” restrictively’. 109

However, in the later Nowak case, the Court seems to recognise this misstep as it differentiates explicitly between ‘classification’—the scope of the rules—and ‘consequences’—the substantive responsibilities they impose. It held that whether the answers and exam comments could be classified as personal data should not be affected by the consequences of that classification. 110 To confirm this point, the Court emphasised that if data are not personal data they are entirely excluded from data protection’s principles, safeguards and rights. 111 While the Court made a weak reference to YS, M and S, intimating that it might be distinguished on the facts, its findings and reasoning in Nowak stand in opposition to YS. At best, the current status of YS is ‘somewhat uncertain’. 112 However, given the Court’s later expansive line in Nowak, it is perhaps more reasonable to treat YS as an anomaly.

The scope of the notion of controllership has also been subject to contestation. In Facebook fan pages, the referring court hinted at the possibility of a ‘third way’ to attribute responsibility for data processing beyond controllership and joint controllership. It considered that the operator of a fan page was not a controller but queried whether the action of choosing which operators to engage with should entail some responsibility for the fan page host. 113 The Court simply considered the fan page operator to be a joint controller. In their opinions on data controllership, Advocates General also expressed their unease about the expansive personal scope of the law, albeit without fully articulating their concerns. In Google Spain, the Advocate General proposed a knowledge component to controllership 114 : the data controller should be aware in some ‘semantically relevant way’ of what kind of personal data they are processing and why 115 and then process this data ‘with some intention which relates to their processing as personal data’. 116 Advocate General Bobek was most forthright in expressing his concerns, openly querying whether this strategy of broadly interpreting controllership—making ‘everyone’ responsible—would enhance effective protection. 117 The Court was not ‘faced with the practical implications of such a sweeping definitional approach’. 118 The Advocate General does not, however, develop how the broad scope of the law might hinder its effectiveness or what the practical implications of this broad scope might be. Having shown how judicial developments in the EU mean that data protection law might now be credibly classified as the law of everything applied to everyone, we turn to examining this question: what are the consequences of this broad scope for the effectiveness of the law?

The scope of data protection law has been interpreted expansively with a view to preventing human rights infringements. Simitis argued that, to achieve their preventive function, these rules should be strictly applied but, primarily, that they should adapt to ‘both the exigencies of an evolving technology and of the varying structural as well as organisational particularities of the different controllers’. 119 No doubt the Court considers that it has remained true to this mission in its jurisprudence. However, this approach is increasingly questioned. Advocate General Bobek suggests that the current approach is ‘gradually transforming the GDPR into one of the most de facto disregarded legislative frameworks under EU law’. 120 Similar reservations are expressed in the academic literature. Bygrave and Tosoni note that the law’s enormous scope of application is ‘perhaps beyond what it can cope with in terms of actual compliance and enforcement’, 121 Nolan observes that the Court’s approach appears to assume that ‘by applying data protection law to more actors better protective outcomes will be achieved’ 122 while Koops more explicitly declares data protection law to be ‘meaningless law on the books’ as a result of, amongst others, its broad scope. 123 Therefore, although the Court justifies its expansive application of the law on human rights grounds, this quest for completeness may be in tension with the law’s effectiveness and the attainment of these human rights objectives. In other words, we must query whether data protection law can be both complete and effective.

A. Assessing the Effectiveness of the Law

When we test this claim—that data protection law can be all encompassing or effective but not both—we are immediately faced with the challenge of determining appropriate parameters to assess the effectiveness of the law. As one data protection authority has noted, while the volume of work they undertake is ever intensifying, what remains elusive ‘is any agreed standard by which to measure the impacts and success or otherwise of a regulatory intervention in the form of GDPR that applies to literally everything’. 124 While the idea of measuring the impact of human rights and the methodologies used remain contested, scholars such as De Búrca have sought to break the deadlock by proposing an experimentalist account of human rights to assess their effectiveness. 125 However, such accounts speak predominantly to how Treaty and Charter rights, rather than the legislative frameworks that implement them, have been harnessed for social change. Policymakers, journalists and civil society organisations tend to speak of the effectiveness of the GDPR in terms of the complaints resolved by authorities and the remedies and sanctions imposed. 126 The number of complaints lodged by data subjects was also deemed by the European Commission to be an appropriate indicator of the impact of the GDPR to be taken into consideration when monitoring the implementation of the law. 127 However, the number of complaints alone provides an inconclusive indication of success. Not only is data gathering in this area very inconsistent, detracting from its reliability, 128 but, more fundamentally, interpreting this data is difficult. A low number of complaints or insignificant fines could be indicative of either a dysfunctional system of enforcement or widespread compliance with existing obligations. 129 Equally, while by August 2023 an impressive 1.4 million requests for the erasure of links from Google’s search engine had been submitted pursuant to the GDPR, 130 this figure gives us only a small insight into the overall exercise of individual rights and tells us nothing of who is exercising their rights and whether these requests were appropriately handled. 131 In assessing the effectiveness of the law, we might then return to a simple test that asks what the law’s objectives are and whether these objectives have successfully been attained. 132

The stated objectives of the GDPR are two-fold: to remove impediments to the free flow of personal data within the EU and to protect fundamental rights, in particular data protection. 133 These different ambitions of data protection are often complementary but are sometimes in tension. 134 The GDPR’s fundamental rights objective has become dominant in its interpretation in recent years. 135 However, parsing this fundamental rights objective further, we can see that the content of the right to data protection itself remains contested. The right has been characterised in different ways: as promoting individual control over personal data; as ensuring ‘fair’ processing of personal data; as a right which simply guarantees legislative safeguards for data processing; and as instrumental for other rights. 136 Moreover, the Court has explicitly acknowledged that not all violations of the GDPR entail a fundamental rights interference, 137 thereby confirming that there are provisions of the law that do not have a fundamental rights character.

Whether the law is successful in achieving the protection of fundamental rights, in particular data protection, may differ depending on which of these conceptualisations of data protection one prefers. However, for simplicity, assuming that the GDPR gives at least partial expression to the right to data protection, 138 we might then infer that compliance with the GDPR would itself achieve the law’s objective of fundamental rights protection. This vision of effectiveness equates legal compliance with success. It assumes that the legal rules are the ‘right’ ones to achieve the objectives of data protection laws. In other words, by achieving high levels of compliance we would achieve the law’s objectives of fundamental rights protection. However, existing legal scholarship appears to challenge this assumption. Bygrave, for instance, observes a paradox in the enactment of ‘increasingly elaborate legal structures’ for privacy while privacy protection is increasingly eroded. 139 Richards similarly queries why people are so concerned about the Death of Privacy when there is so much privacy law. 140 There is also some limited empirical evidence to suggest that modern data protection frameworks encourage ‘symbolic compliance’ by allowing the information industry to apply the law in a way that aligns to corporate rather than public objectives. 141 While this empirical work was conducted in the USA, its findings are said also to hold for the GDPR. Further empirical research is required to assess how the law is being received on the ground; early evidence suggests that, far from merely encouraging symbolic compliance, there remains widespread non-compliance with the law in reality. Writing in 2022, Lancieri examined 26 independent empirical studies assessing the impact of the GDPR and the California Consumer Privacy Act on legal compliance and concluded that non-compliance remains widespread. 142 Such non-compliance includes obvious violations: for instance, 85% of Europe’s most accessed websites continued to track users even after they had opted out of such tracking. 143 Thus, while compliance requirements will undoubtedly play an important role in securing the application of the GDPR, 144 this suggests that relying on controller compliance at the expense of enforcement would be erroneous. 145 Yet, even where the desire to comply is present, the law’s complete scope makes compliance with its provisions impossible in some circumstances (B) while rendering the enforcement needed to complement compliance strategies more challenging for regulators (C). In this way, complete protection is pitted against effective protection.

B. The Practical Impossibility of Compliance

It follows from the Court’s jurisprudence that the broad scope of responsibility it envisages renders compliance with the law practically impossible in some circumstances, one of Fuller’s characteristics of a bad law. 146 The practical impossibility of compliance is best illustrated through the Court’s caselaw on joint controllership, discussed above. It follows from this case law that in networked situations, for instance, where a student society uses Facebook to host a fan page, data controller responsibility is segmented. The student society would need to comply with data protection law for any element of the processing that it facilitates, while Facebook would need to comply for any data processing operations it undertakes jointly with or independently of the student society. Some provisions of the GDPR apply awkwardly to this situation. For example, the requirement in Article 26 GDPR that joint controllers arrange between them their respective responsibilities either functions as a legal fiction when applied between big technology platforms and natural persons or is widely disregarded. Both scenarios detract from the law’s credibility and legitimacy. However, joint controllership also leads to situations where it will be impossible in practice for the student society to comply with all of its obligations under data protection law. The Court has, for example, held that joint controllership is not contingent on the controllers having access to the data being processed. 147 Without such access the student society cannot comply with requests from individuals in relation to that data (such as data access, rectification or deletion requests). This necessarily raises the question of whether an individual or entity ought to be designated a data controller if they do not have, or have not had, access to the data that renders them legally responsible.
In principle, as a joint controller the student society or individual could require others to provide such access pursuant to Articles 26 and 28 GDPR. Indeed, companies such as Meta have put in place a contractual addendum indicating that Meta will retain responsibility for compliance with data subjects’ rights that necessitate data access. 148 This fills the legal lacuna in this instance, but it is noteworthy that it renders the compliance of the student society with the GDPR contingent on Meta’s contractual wishes. More broadly, this approach to controllership assumes that cooperation is feasible, a questionable assumption given the number of entities deemed joint data controllers pursuant to this approach and the often asymmetrical power relations between them. Similar problems arise even for legal requirements that require no access to the data for compliance, such as the GDPR’s transparency requirements. 149 Mahieu and Von Hoboken provide the example of the following transparency notice to illustrate this point evocatively:

We collect your IP address and browser-ID and transfer this personal data to Facebook. We do not know what Facebook does with the data. Click here to proceed.

Segmenting responsibility to ensure complete data protection thus renders key provisions of data protection law meaningless. The Court had been warned of this consequence by one of its Advocates General, who considered that, when it came to controllership, a conceptual lack of clarity upstream about who was responsible for what processing might cross ‘into the realm of actual impossibility for a potential joint controller to comply with valid legislation’. 150 This warning did not influence the Court.

The Opinions of the Advocates General in these cases on joint controllership give some insights into the Court’s thinking in developing responsibility in this way. The ambition, it seems, was a policy one: that by making more individuals and entities responsible for data protection compliance this would introduce some bottom-up pressure on more significant data controllers to take compliance seriously. This approach has been subsequently vindicated to some extent as it has given data protection regulators more leverage to apply the law to address systemic data protection concerns. For instance, civil society organisation NOYB submitted 101 complaints to various European data protection authorities arguing that website operators that used Google Analytics and Facebook Business Tools transferred data illegally from the EU to the USA. In its initial advisory assessment of this practice, the European Data Protection Board (EDPB) emphasised that each website operator must ‘carefully examine whether the respective tool can be used in compliance with data protection requirements’. 151 Moreover, given the difficulties experienced in the use of the GDPR’s pan-European enforcement mechanism (the one-stop-shop), 152 this approach also potentially returns competence to national data protection authorities if the data processing operations of the joint controller affect residents in that State only. 153

Therefore, while this approach is not without merit, what is overlooked in the equation is that the business models in question co-opt individuals and entities into data processing without giving them any real stake or meaningful control in the data processing operations. The real locus of power over data processing lies not with the millions of joint controllers who embed such analytics tools in their content and services but with the operators who provide them. One might also wonder how the data subject stands to benefit from the designation as a data controller of an entity that cannot comply with core data protection rights, such as access and erasure. Joint controllership as conceived by the Court in Jehovan, extending responsibilities to those who coordinate and orchestrate data processing operations, appears to more accurately capture the real site of power in digital ecosystems and therefore offers a more effective leverage point for regulatory intervention. Indeed, relying on the Jehovan logic, the Belgian regulator has analysed the data processing operations of almost the entire online advertising technology ecosystem by focussing on a critical apex entity, the Interactive Advertising Bureau (IAB). 154 We might be more willing to accept the practical impossibility of compliance with the law’s provisions if it delivers real gains for fundamental rights protection.

C. Data Protection Authorities as the Regulators of Everything

Securing effective data protection in Europe will require an appropriate blend of private enforcement (including by civil society actors), 155 compliance by regulated data controllers and public enforcement by regulators. The regulator alone is not responsible for the full application of the law. However, it could be argued that regulators continue to play an outsized role in the success or failure of the EU data protection regime, since the extent to which follow-on private enforcement is initiated, or regulatees voluntarily comply with the law, depends on their actions. It is therefore significant that the law’s broad scope of personal application also poses challenges for the regulators tasked with interpreting and enforcing its provisions.

At a very basic level, the volume of cases that regulators deal with has increased significantly since the entry into force of the GDPR, suggesting a ‘new level of mobilisation on the part of individuals’ to tackle data misuses. 156 For instance, while the Irish regulator received 910 complaints in 2013, between May and December 2018, following the entry into force of the GDPR, it saw this number triple. 157 Regulators report on the number of complaints that they receive annually in their Annual Reports and these figures have been collated on occasion at European level. 158 While this mobilisation is to be welcomed, regulators may lack the capacity to handle the increase in demand for their services. In response to a questionnaire of the EDPB, 82% of regulators explicitly stated that they do not have enough resources to conduct their activities. 159 In this sense, with finite budgets and human resources at their disposal, the broad scope of the law means regulators struggle to fulfil their legal supervisory obligations. The solution may lie, in part, with providing regulators with more resources.

Yet, while a lack of resources no doubt exacerbates the enforcement challenge for regulators, the problem may also be one of delimiting appropriate regulatory boundaries when data protection law is applied to everyone. It is not simply the number of regulatees that might complicate the work of regulators but also that the regulated community is extremely diverse. We might contrast this with other areas of regulation, such as energy regulation, where the regulator deals primarily with energy firms, or even competition law, where the regulator deals only with ‘undertakings’ engaged in economic activity. 160 Data protection regulators must regulate, amongst others, the activities of individuals, charities, political parties, public authorities and commercial actors. This diversity of regulatees is significant as regulation—and regulators—benefit from the existence of a ‘cohesive interpretive community’. As Black emphasises, for rules to work, that is, to apply in a way that would further the overall aims of the regulatory system, the person applying the rule has to ‘share the rule maker’s interpretation of the rule; they have to belong to the same interpretive community’. 161

A lack of cohesion amongst regulatees may make a common understanding of the law more difficult to attain, resulting in over- or under-compliance. Tales of such compliance misadventures are plentiful in data protection law. In 2019, for example, the Irish regulator needed publicly to reassure the Irish General Post Office that maintaining public bins outside its premises would not violate the GDPR. 162 The more diverse the regulated community, the less the regulator will be able to assume some minimum levels of understanding of the rules and the more demanding its task becomes. Moreover, it is apparent that, as a result of the diversity of regulatees under the law, some legal requirements apply awkwardly to individuals. Not only are many of the law’s requirements predicated on centralised control over a file, 163 but they also assume that a data controller will have certain organisational and bureaucratic capacities at its disposal. The GDPR introduced a wide range of ex ante meta-regulation obligations that apply to controllers, such as the record keeping needed to comply with demonstrable accountability requirements 164 and the requirement to appoint a DPO in some circumstances. 165 As Nolan observes, implicit in these responsibilities is the assumption that controllers are ‘commercial, institutional or bureaucratic entities, if controllers are to ever be able to meaningfully comply with their obligations’. 166 While some of these requirements contain exceptions for small- and medium-sized enterprises (and implicitly individuals), this is not universally true. 167 In short, by undermining common understandings of the law and stretching the application of its requirements to all regulatees, the lack of cohesion in the regulated community can detract from the effectiveness of the law.

The diversity of the regulated community also puts pressure on regulators because they deal with a huge variety of regulatory issues. Recent examples include the systemic issues arising in data-centric industries, such as the ongoing legal investigations into the AdTech industry across Europe 168 ; assessing the compliance of public data processing initiatives, such as the use of contact tracing applications at the peak of the Covid-19 pandemic 169 ; complaints by individuals about institutional data controllers 170 ; and interpersonal complaints, including about the use of technologies such as smart doorbells and home security devices. 171 The diversity of contexts in which the law applies and actors within its regulatory ambit renders it impossible for regulators to provide general and authoritative guidance that is appropriate to all. Consider, for instance, the meaning of open-ended principles, such as fairness, found in the GDPR. 172 This concept could encompass both procedural and substantive fairness 173 and has been interpreted in differing ways by national regulators to date. 174 We might interpret fair processing differently if it is our neighbour processing our data compared to an international company such as Meta. Moreover, the capacity required to interpret open-ended principles such as fairness appropriately scales down badly, with individuals and small enterprises less likely to have the knowledge and resources at their disposal to do this.

In conclusion, while it cannot be stated authoritatively that the pursuit of complete data protection has rendered the law ineffective, it is apparent that this completeness is in tension with effectiveness in two key ways. First, it has rendered compliance with the law’s requirements practically impossible in some circumstances. As we shall see in the next section, the Court’s response to such practical impossibility has been to develop an ad hoc rationalisation of the law—the responsibilities doctrine, a response which itself jeopardises the law’s effectiveness. Second, the law’s broad scope has further diversified the regulated community, making it more difficult for regulatees to have a shared understanding of the law and for regulators to exercise effective oversight of the broad array of data processing operations they must supervise. We will now consider how this problem might be addressed.

Can the law be both complete and effective, as the Court aspires? The literature on the effectiveness of regulatory instruments is surprisingly sparse. Not all problems with the GDPR’s enforcement stem from its broad scope. As Lancieri highlights, information asymmetries between regulators and data controllers undermine compliance and enforcement, as do high levels of market power in data-related markets. 175 Some problems in Europe also stem from the difficult cooperation between regulators foreseen by the GDPR. 176 However, the problems with the law’s effectiveness also stem, at least in part, from the over-inclusiveness of the law at rule level (in particular, as a result of the expanded scope of responsibility under the law). Bardach and Kagan suggest that such over-inclusiveness at rule level might be mitigated by a flexible application of the law at ‘site-level’. 177 Black similarly observes the reflexive relationship between rules and enforcement: it may be possible to use over-inclusive rules knowing that their application might be tempered through a conversational model of regulation. 178

It is possible to envisage mechanisms to facilitate such site-level accommodation in data protection law in two broad ways. 179 Such flexibility could come, firstly, through the interpretation of the law (A). Alternatively, or additionally, the law could be applied and enforced flexibly through graduated enforcement, applying insights from responsive regulation (B). These approaches are already evident to some extent in data protection law and practice. Yet, it is argued, without appropriate legislative underpinning and transparency regarding their application, they too risk jeopardising the attainment of the law’s objectives (C).

A. Flexible Interpretation: the Ad Hoc Rationalisation of the Law

The undesirable effects of an over-inclusive legal framework might be mitigated by interpreting the law in a ‘sensible’ or proportionate manner. Moreover, calls for such a ‘common sense’ approach to the interpretation of data protection law have been made from inside the Court. In Rīgas satiksme the Court was asked to consider whether data protection law provided legal grounds to compel the police to provide the personal information of an offender to a third party so that the third party could initiate civil proceedings against the offender. 180 Specifically, the referring court asked the CJEU to consider whether the legitimate interests legal basis—which enables data processing where necessary for the legitimate interests of the controller or of third parties provided such interests do not override the fundamental rights of the data subject—could be interpreted in this way. While the Court suggested this question should be answered in the affirmative, the Advocate General was more sceptical, expressing a ‘certain intellectual unease as to the reasonable use and function of data protection rules’. 181 In the domestic proceedings leading to the case, the police—the data controllers—had refused the request on the basis, amongst other grounds, that alternative options to access this information were available, leading to litigation and a referral to the national regulator. For the Advocate General, the application of data protection law in this context deviated from what he saw as the main concern of the law: namely, large-scale processing of personal data by mechanical, digital means. 182 He cautioned against their application in this context, suggesting that such ‘“application absolutism” might result in discrediting the original idea’. 183 Instead, he suggested that when balancing interests under the law, a rule of reason ought to be deployed, necessitating a distinction between situations entailing large-scale mechanical processing and those where a ‘lighter touch’ is required. 184 While this has been interpreted as a call to introduce more flexibility and less formalism into the application of proportionality assessments under the data protection framework, 185 it could also be seen as a broader appeal for more flexibility in the law’s application outside the structures of proportionality assessments. It is noteworthy that the Advocate General refers to a rule of reason, rather than proportionality as such.

The challenges of introducing a dose of ‘common sense’, or site-level flexibility, to the law’s application are best illustrated by the Court’s designation of Google Search as a data controller and the subsequent jurisprudential contortions it has engaged in to ensure that Google’s Search operations can comply with the law. In Google Spain the Court concluded that Google Search was a data controller and was therefore responsible for ensuring its search engine activities were compliant with data protection law. In his Opinion, the Advocate General encouraged the Court to take into consideration proportionality, the objectives of the law and the means the law contains to achieve those objectives to reach a ‘balanced and reasonable outcome’. 186 His concern was that a search engine operator could not comply in law or in fact with the law’s provisions, leading to the ‘absurd’ conclusion that a search engine could not be compatible with the law. 187 This concern had also been expressed by academic observers. 188 The Court was confronted with these concerns in the later case of GC and Others , which laid bare the mismatch between the operations of a search engine and the law’s requirements. At stake in GC was the prohibition on the processing of ‘special category’ personal data found in Article 9(1) GDPR. This provision reads as follows:

Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purposes of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.

This provision is clearly worded as a prohibition, which is then subject to a number of exceptions found in Article 9(2) GDPR, none of which readily apply to Google’s search engine activities. A literal interpretation of the law would therefore put Google’s search engine operations in direct conflict with the prohibition on sensitive data processing and render them illegal. As the rules on sensitive data processing are clearly linked to the fundamental rights of individuals, the inescapable conclusion would be that Google should cease or significantly alter its search engine operations.

In GC , the Court was asked to consider whether this prohibition applied to Google Search. The national referring court prefaced this question by asking whether the general prohibition also applies to search engines, ‘having regard to the specific responsibilities, powers and capabilities of the operator of the search engine’. 189 The inspiration for this qualification to controller duties came from the Court in Google Spain when it stated that a search engine operator must ensure that its activity complies with the law’s requirements ‘within the framework of its responsibilities, powers and capabilities’. 190 The meaning of this phrase, and in particular its ramifications for the responsibilities of controllers under data protection law, were left unexplored until GC and others .

In GC, the Court invoked this responsibilities formula to devastating effect. It began by emphasising that the prohibition applies to all kinds of processing by all controllers 191 and that an a priori exclusion of search engines from the prohibition would run counter to its ambition of enhanced protection for such rights-infringing processing. 192 Nevertheless, the Court went on to highlight the ‘specific features’ of a search engine which would have an effect on the extent of its responsibility under the law. 193 In particular, as the search engine operator is responsible as a data controller by linking to existing publications, the Court held that the prohibition ‘can apply to that operator only by reason of that referencing and thus via a verification, under the supervision of the competent national authorities, on the basis of a request by the data subject’. 194 The end result of GC is that the Court, relying on the responsibilities formula, maintained the fiction that the law applied to Google Search in full, while interpreting a provision of the law clearly worded as a prohibition as a right. This ad hoc rationalisation of the law to accommodate Google’s business model not only goes against a literal interpretation of the provision but also contradicts the law’s general scheme. 195 The consequences of this approach will be elucidated below.

B. Flexible Enforcement: the Role of Regulatory Discretion

An alternative option to interpreting the law in a flexible manner would be to introduce flexibility at the point at which decisions regarding the enforcement of the law are made. Two distinct options present themselves. Regulators might first exercise judgment in deciding which actions or complaints they will pursue. They might subsequently display further flexibility in determining how they deal with these cases.

The extent to which regulators can exercise this first-level flexibility in complaint handling under the GDPR is unclear. In other fields, the idea of risk-based regulation has taken root. This is a strategy which allows regulators to ‘prioritize how they consume their limited enforcement resources such that threats that pose the greatest risks to the regulator’s achievement of its institutional objectives are given the highest priority, while those that pose the least risk are allocated with few (if any) of the regulator’s limited resources’. 196 European data protection regulators are already prioritising their resources in this way. The Irish regulator, for instance, states that it applies a ‘risk-based regulatory approach to its work, so that its resources are always prioritised on the basis of delivering the greatest benefit to the maximum number of people’. However, while risk might be used to prioritise regulatory resources, it cannot be used as a criterion to exclude the handling of complaints entirely. The law requires regulators to ‘handle complaints … and investigate, to the extent appropriate, the subject matter of the complaint and inform the complainant of the progress and outcome of the investigation’. 197 Authorities have seemingly sought to stem the flow of complaints coming their way by indirectly imposing on individuals ‘preliminary actions or evidence requirements that do not directly derive from the GDPR’, calling into question their legality. 198 Yet, an authority cannot simply ignore a complaint or decline to deal with it as it is not a regulatory priority. 199 This is supported by the fact that data subjects have an explicit right to an effective judicial remedy against a regulator where the regulator ‘does not handle a complaint or does not inform the data subject within 3 months on the progress or outcome of the complaint’. 200 Nevertheless, authorities must only handle complaints ‘to the extent appropriate’. This suggests that they may inject discretion into the process at the second level of flexibility.

Flexibility in terms of the response of regulators to an infringement is in keeping with the idea of responsive regulation. Ayres and Braithwaite’s influential work queried when regulators should punish and when they should persuade. Their enforcement pyramid proposed that regulators begin at the pyramid’s base with persuasion, moving up to warnings and then penalties if the regulatory engagement did not have the desired effect. 201 Is such a tit-for-tat approach permitted under the GDPR? According to the Court in Schrems II , the primary responsibility of regulators is to monitor the application of the GDPR 202 and to ensure that it is ‘fully enforced with all due diligence’. 203 Data protection regulators, which are endowed by the Charter with ‘complete independence’ in the discharge of their duties, might argue that such complete independence enables them to tailor the approach they take in order to ensure the ‘full’ enforcement of the law. This might entail starting at the bottom of the enforcement pyramid by relying on persuasion before escalating to credible sanctions at the top where required. Some national laws, such as the Irish Data Protection Act of 2018, 204 expressly foresee the possibility of the amicable resolution of disputes.

However, other aspects of the law appear to place a greater constraint on regulatory discretion. The provisions on administrative sanctions suggest that they were not envisaged as part of an enforcement pyramid. The GDPR text provides that regulators shall ensure that the imposition of administrative fines is effective, proportionate and dissuasive in each individual case, 205 while the non-binding recitals state that penalties including administrative fines ‘should be imposed for any infringement … in addition to, or instead of appropriate measures imposed by the supervisory authority’. 206 By way of exception, it specifies that for minor infringements, or where the fine would constitute a disproportionate burden to a natural person, a reprimand may be issued instead of a fine. Erdos, for instance, claims that the GDPR therefore establishes a presumption that a national data protection authority will ‘at least take formal corrective action once cognisant of a significant infringement of data protection law’. 207 This seems also to be borne out by the wider text of the GDPR. The idea of amicable dispute resolution is mentioned only once, in a recital, and only then in the context of disputes that are localised because of their nature or impact. 208 We could conclude that, at a minimum, amicable resolution is inappropriate in the context of transnational disputes which might require cooperation between various concerned authorities. It is notable also that while data subjects have the right to challenge a regulator before a court where it does not handle a complaint or where it issues a legally binding decision, 209 this seems to leave a gap in situations where the complaint is handled but no legally binding decision is adopted. 210 Again, this suggests that the legislature did not foresee such flexible enforcement of the rules at scale. Beyond the doctrinal question of whether data protection law allows for the exercise of such site-level discretion, this discretion also raises broader normative challenges to which we shall now turn.

C. The Challenges of Site-Level Flexibility

In an ideal world, the ‘unreasonable and excessive legal consequences’ 211 of the broad scope of application of data protection law might be avoided or mitigated by interpreting and enforcing the law flexibly while continuing to offer effective and complete protection to individuals. The reality, however, is that site-level flexibility itself entails potential negative repercussions that must be addressed. Two negative consequences stand out: these concern the effectiveness and the quality of the law, respectively.

(i) The effectiveness of the law

The impact that the flexible interpretation and enforcement of data protection law will have on the law’s effectiveness remains uncertain. In GC the Court was left with a choice: to declare Google Search’s data processing, and therefore its business model, to be incompatible with the law or to accommodate the business model. The Court’s solution—treating an ex ante prohibition as an ex post right—does the latter: it is a bespoke interpretation of the law designed to accommodate a business model that does not fit the mould. It has been suggested that this finding provides a ‘safety valve’ against the disproportionate extension of data protection obligations to search engine operators. 212 Such accommodation might be justified on the basis of the societally beneficial role search engines play in organising the world’s information. 213 It was likely for this reason that the Advocate General considered that any finding of incompatibility with the law by search engines would be absurd. Yet, the relationship between law and technology in this instance is worth highlighting. The law is often simplistically characterised as seeking to keep up with technology; in GC , however, we see that technological design impacts the interpretation and application of the law. 214 Specifically, the responsibilities formula deployed by the Court to rationalise the law’s application means that technologies that are designed in a way that renders data protection compliance impossible may avoid the law. It is thus no longer safe to assume that when there is personal data processing, ‘the entire body of the data protection guarantees applies’. 215 The Court’s approach is likely to embolden proponents of the ‘move fast and break things’ model of technological practices and design. We might, for instance, query whether data protection rights such as the right to delete can be exercised on an immutable decentralised ledger technology such as blockchain 216 or whether a tool like ChatGPT could avoid ex ante or ex post data protection requirements as they are not commensurate with the ‘powers, capabilities and responsibilities’ of the relevant data controllers. In short, the risk is that the responsibilities formula creates an incentive for technologists to circumvent the law through design, a scenario that almost certainly militates against effective data protection. 217

Nor is it clear that the flexible enforcement of the law will yield more effective data protection. While it is generally acknowledged that the success of data protection law should not be measured using a crude assessment such as the number of fines issued, 218 this is in part because the law offers a broader array of corrective powers that regulators can draw on, such as a ban on data processing operations, that may have an equal, if not greater, effect than fines. 219 Evidence to date indicates that European data protection regulators have made limited use of the full palette of corrective powers. 220 If flexible enforcement, anchored in the enforcement pyramid, secured the more effective application of data protection law, a purposive interpretation of the law would support its application. However, we lack the empirical evidence needed to assess whether flexible enforcement leads to more effective protection. In situations where the overall level of formal enforcement drops dramatically due to a regulatory preference for informal interactions between regulators and regulatees, doubts arise as to the impact of the law in practice. For instance, in the UK, although the regulator ‘handled’ 40,000 data subject complaints in the 2021–2022 period, only four fines were issued for breach of the GDPR, totalling £663,000. 221 No other enforcement notices or penalties were issued. Some of the examples of situations where the regulator opted not to use its formal enforcement powers are striking. For instance, the Information Commissioner’s Office (ICO) did not impose an administrative sanction on two police forces that surreptitiously recorded and stored over 200,000 phone conversations involving victims, witnesses and perpetrators of suspected crimes, as part of its revised approach towards the public sector. 222 We might legitimately query in these circumstances whether informal enforcement is delivering effective fundamental rights protection.

(ii) The quality of the law

The flexible interpretation and application of the law is difficult to square with some of the core qualities of law that ensure its internal morality, including that law be general, publicly promulgated and that there be congruence between official action and declared rule. 223 This is particularly important in the data protection context where the foreseeability of the law is a requirement to justify interferences with fundamental rights 224 while the foreseeability of data processing operations is central to garnering public trust in processing and technology. 225

The data protection framework is ‘all or nothing’ in so far as it applies when the data processed is personal but not to non-personal data. 226 However, it has arguably never been accurate to characterise the data protection framework as a one-size-fits-all model, or an ‘intensive and non-scalable regime of rights and obligations’, 227 due to the existence of the general principle of proportionality and the introduction of risk-management obligations. These already introduce a significant degree of flexibility into how the law is interpreted. For instance, Gellert observes that while the GDPR provides some guidance to data controllers regarding potential sources of risk (toxicological factors), it leaves the consequences and harms (epidemiological factors), as well as the methodologies for assessing harms, largely undelineated. 228 However, the use of the responsibilities formula marks a qualitative shift in the law’s flexibility. 229 While some may welcome a doctrine that enables the application of the law to be calibrated to the powers of the data controller, 230 this must be set against the uncertainty that this formula introduces regarding how, and to whom, the rules apply. Unlike other elements of the legal regime which also introduce elements of scalability, such as the provisions introducing risk-management requirements, the application of this formula comes with no guidance or legislative footing. Quelle suggests that this gap could be filled by applying the responsibilities formula with reference to risk. 231 While this may help to anchor the application of the responsibilities formula more firmly to the text of the GDPR in some circumstances, it would not be helpful when interpreting provisions that make no reference to risk. The result will be further unpredictability in the regime’s application, to the detriment not only of its effectiveness but also of its transparency and predictability.

Moreover, while the ‘rule of reason’ applied by the Court might be likened to the principle of proportionality, proportionality analysis does not feature explicitly in the Court’s reasoning. Like the application of the rule of reason in competition law, where a restriction on competition was removed from the scope of competition law because the restriction was inherent in the pursuit of public policy objectives, this might be characterised as ‘bold and innovative or unprincipled and misconceived’ 232 depending on one’s perspective. More generally, the extent of the role that proportionality could play in introducing flexibility to the law’s application remains ambiguous. If the data protection framework is correctly characterised as a justificatory framework for data processing that interferes with fundamental rights, then the provisions of the GDPR and their interpretation should embody the principle of proportionality. Primarily through the jurisprudence of the Court, proportionality has emerged as a ‘data privacy principle in its own right’, with some viewing it as being ‘at the core of the GDPR’s structure’. 233 While the data protection principles do not explicitly include proportionality, it is said to underpin them and ‘shines through in their interstices’. 234 Proportionality therefore potentially offers a more rigorous tool through which to introduce flexibility into the data protection framework. This, however, depends on how the proportionality principle is applied. The Court has, for instance, on occasion replaced an assessment of whether data processing was compatible with the specific provisions of the GDPR with a more general assessment of whether the processing was compatible with the principle of proportionality, grounding its reasoning directly in the EU Charter rights to data protection and to respect for private life. 235 Regulators are more likely than courts to engage in a loyal and specific application of the law’s provisions than to replace their application with a broader proportionality analysis, as the Court did in this case. Moreover, while some provisions of the law lend themselves readily to proportionality analysis, 236 notably the principles found in Article 5 GDPR, many of the law’s other ex ante requirements, such as transparency obligations and the abovementioned prohibition on special category data processing, are less amenable to proportionate interpretation. The appropriate role of this principle in calibrating the application of data protection law, and its relationship with the risk requirements introduced by the GDPR, requires further research and consideration.

The compatibility of responsive regulatory enforcement with rule of law requirements has received surprisingly little attention. 237 The complete independence of data protection authorities dictates that these regulators exercise their powers free from internal and external influence. However, some accountability mechanisms must exist if regulators fail to discharge their primary responsibility of enforcing the law. 238 The status quo also does nothing to prevent the overly zealous application of the law, such as fining individuals for the positioning of their home or business surveillance cameras or for posting footage of public disorder incidents on social media. 239 The transparency of the criteria applied in deploying the enforcement pyramid will be critical in this regard. 240 For instance, the ICO has adopted a revised approach towards the public sector, opting to use its discretion to reduce the impact of fines on public sector operators. Pursuant to this approach, the ICO will rely on powers to warn, reprimand and issue enforcement notices, with fines only handed down in the ‘most serious cases’. 241 However, the example mentioned above of the covert recording of conversations by the police, where no fine was issued, raises the question of what the ICO considers to be a ‘serious case’. More broadly, empirical evidence suggests that where regulators have adopted a strategic approach to enforcement, this has neither been calibrated to the extent to which data controllers demonstrated compliance with relevant legal requirements nor systematically assessed against the overarching requirement to achieve effective and complete protection of data subjects. 242

In the absence of clear and transparent criteria guiding the enforcement of the law, the ensuing regulatory roulette offends against the equal protection and application of the law to the detriment of its beneficiaries—individuals in the first instance but ultimately society. Moreover, it may be inappropriate to apply the ‘conversational approach’ to the enforcement of the law, found at the bottom of the enforcement pyramid, in some circumstances. These include situations where the stakes are high (such as where there is a risk of irreversible harm), where there are no repeated interactions with regulatees, or where the regulatee is reluctant to comply. 243

Data protection law faces mounting criticism, both from human rights scholars and activists and from those who treat it as an unnecessary impediment to boundless data processing and the claimed innovation this would entail. Despite the technological developments during its lifespan, it has proven to be a resilient and adaptable legal framework, most recently acting as a first brake on the deployment of generative AI in ways that violate fundamental rights. The expansive interpretation of responsibility under the law has already yielded some benefits. Equally, however, many of the challenges that the law faces stem from its application, not to everything, but to everyone. While we could think of data protection as a broad church, it has also been characterised (perhaps more accurately) as an indiscriminate obsession. 244 Thinking about the law’s future, we could be pulled in different directions. On the one hand, it is challenging to interpret the law in a way that is sensitive to different contexts while, on the other, its broad application puts regulators under pressure with rising numbers of complaints which they have an imperative to handle. The judicial response has been to overlook these problems, or simply to patch them by rationalising the law’s application in an ad hoc manner.

Turning to the future, the possibility of using increased site-level flexibility must be further explored and the rule of law challenges it entails addressed. This can be done by the EDPB without legislative change under the auspices of the GDPR. More broadly, however, it is clear that the current lack of empirical assessment of how the law applies in practice ‘leaves legal reformers shooting in the dark, without a real understanding of the ways in which previous regulatory attempts have either promoted or thwarted privacy’s protection’. 245 Recognising that no law is ever fully enforced, what is required for data protection is agreement on an appropriate standard against which to gauge regulatory effectiveness. Determining an appropriate balance between data protection compliance and data protection enforcement will be necessary. Finally, and perhaps most ambitiously, the purposes of data protection law need to be further specified by the Court. A starting point may be to disentangle the intersecting demands of informational privacy from those of fair information governance. 246

This may seem like an uphill battle. Data protection pioneer Spiros Simitis spoke of data protection as an ‘impossible task’. 247 However, Simitis also saw data protection as an ‘unending learning process’ necessitating a ‘continuous critical review of the regulatory approach’ to ensure its efficiency. 248 It is in this spirit that the challenge of securing effective fundamental rights protection in the digital era should be approached.

Associate Professor, LSE Law School and Visiting Professor, College of Europe Bruges, Belgium. E-mail: [email protected]. I am very grateful to the Editors of Current Legal Problems for the invitation to contribute to this series, with particular thanks to Despoina Mantzari for her guidance throughout. My thanks also to the anonymous referees for their valuable comments and to Mr Wojciech Wiewiórowski, the European Data Protection Supervisor, for his generosity in attending and chairing the lecture. I benefited from helpful feedback on earlier drafts of this text from Gloria Gonzalez Fuster, Hielke Hijmans, Filippo Lancieri, Rotem Medzini, Katherine Nolan and Thomas Streinz. All views, and any errors, remain my own.

Bilyana Petkova, ‘Privacy as Europe’s First Amendment’ (2019) 25 European Law Journal 140.

For instance, in Google Spain the Court held that ‘as a general rule’ the data subject’s rights to data protection and to respect for private life override the interests of internet users in access to information (Case C-131/12, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González EU:C:2014:317, para 81).

Gloria Gonzalez Fuster and Hielke Hijmans, ‘The EU Rights to Privacy and Personal Data Protection: 20 Years in 10 Questions’, VUB Discussion Paper (2019) https://cris.vub.be/ws/portalfiles/portal/45839230/20190513.Working_Paper_Gonza_lez_Fuster_Hijmans_3_.pdf .

Anu Bradford, The Brussels Effect: How the European Union Rules the World (OUP 2020) 132; Anu Bradford, ‘The Brussels Effect’ (2012) 107 Nw U L Rev 1, 22–26. The Council of Europe’s Convention 108 is also a highly influential instrument and a likely standard for global convergence; Global Privacy Assembly, ‘Privacy and Data Protection as Fundamental Rights – A Narrative’ https://globalprivacyassembly.org/wp-content/uploads/2022/03/PSWG3-Privacy-and-data-protection-as-fundamental-rights-A-narrative-ENGLISH.pdf , 48–50.

Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act) (Text with EEA relevance) OJ [2022] L265/1.

Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) (Text with EEA relevance) OJ [2022] L277/1.

Ibid, Article 2(4)(g) and recital 10. This also follows from recital 12 and Article 8(1) Digital Markets Act (n 5).

Peter Hustinx, ‘The Role of Data Protection Authorities’ in Serge Gutwirth et al. (eds), Reinventing Data Protection (Springer 2009) 131, 133.

From within the Court see, for instance, Case C-245/20, X, Z v Autoriteit Persoonsgegevens ECLI:EU:C:2021:822, Opinion of AG Bobek, paras 55–56. Nadezhda Purtova, ‘The Law of Everything. Broad Concept of Personal Data and Future of EU Data Protection Law’ (2018) Law, Innovation and Technology 40; Bert-Jaap Koops, ‘The Trouble with European Data Protection Law’ (2014) 4 International Data Privacy Law 250.

Colin J. Bennett and Robin M. Bayley, ‘Privacy Protection in the Era of “Big Data”: Regulatory Challenges and Social Assessments’ in Bart van der Sloot, Dennis Broeders and Erik Schrijvers (eds), Exploring the Boundaries of Big Data (Amsterdam University Press 2016) 205, 210.

Raphaël Gellert, The Risk-Based Approach to Data Protection (OUP 2020), 186.

Luca Tosoni, ‘Article 4(6): Filing System’ in Christopher Kuner, Lee A Bygrave, Christopher Docksey and Laura Drechsler (eds), The EU General Data Protection Regulation (GDPR): A Commentary (OUP 2020) 138, 141.

The expansive approach to the territorial application of the GDPR is justified on the same grounds, but consideration of the jurisdictional reach of the rules is beyond the scope of this article. On jurisdictional issues see, Merlin Gömann, ‘The New Territorial Scope of EU Data Protection Law: Deconstructing a Revolutionary Achievement’ (2017) Common Market Law Review 567.

Before the enactment of the GDPR, Erdos remarked that its ‘almost unfathomable scope, inflexible nature and sometimes unduly onerous default standards’ are ill suited to digital realities, recommending a more radical shift of focus and balance in the law. David Erdos, European Data Protection Regulation, Journalism, and Traditional Publishers: Balancing on a Tightrope? (OUP 2019) 146.

Colin Bennett, Regulating Privacy: Data Protection and Public Policy in Europe and the United States (Cornell University Press 1992).

Articles 5 and 6 GDPR.

Articles 12–22 GDPR.

Claudia Quelle, ‘Enhancing Compliance under the General Data Protection Regulation: The Risky Upshot of the Accountability- and Risk-based Approach’ (2018) 9 European Journal of Risk Regulation 502; Reuben Binns, ‘Data Protection Impact Assessments: A Meta-regulatory Approach’ (2017) 7 International Data Privacy Law 22.

On the phenomenon of legislative instruments giving expression to fundamental rights in equality and data protection law see Elise Muir, EU Equality Law: The First Fundamental Rights Principle of the EU (OUP 2018) 137–143.

Joined Cases C-293/12 and 594/12, Digital Rights Ireland Ltd and Seitlinger and others EU:C:2014:238, para 36. See also C-311/18, Data Protection Commissioner v Facebook Ireland Limited and Maximillian Schrems EU:C:2020:559, para 170; Opinion 1/15, ECLI:EU:C:2016:656, para 123.

The Court has conceptualised the application of the right to data protection in this way; however, the content and application of the right remain contested. See, González Fuster and Hijmans (n 3).

While it remains possible to envisage daily activities that do not entail personal data processing, such as riding a bicycle or reading a book, many of our activities now have a digital component (such as the digital transactions required to rent a bike in a city or the use of an e-reader to read books).

This term was coined by Purtova in her influential article ‘The Law of Everything’ (n 9).

Damian George, Kento Reutimann and Aurelia Tamò-Larrieux, ‘GDPR Bypass by Design? Transient Processing of Data Under the GDPR’ (2019) 9 International Data Privacy Law 285.

Article 4(6) GDPR defines a ‘filing system’ as ‘any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis’.

The GDPR recognises a category of pseudonymous data but this is still categorised as personal data (Article 4(5) GDPR).

Recital 26 GDPR. Article 4(5) GDPR defines pseudonymisation.

See, for instance, the examples recognised in the Court’s jurisprudence referred to by Wachter and Mittelstadt in Sandra Wachter and Brett Mittelstadt, ‘A Right to Reasonable Inferences: Re-thinking Data Protection Law in the Age of Big Data and AI’ (2019) Columbia Business Law Review 1, 30–31.

Article 4(1) GDPR.

Paul Ohm, ‘Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization’ (2009) 57 UCLA Law Review 1701; Michèle Finck and Frank Pallas, ‘They Who Must Not Be Identified—Distinguishing Personal from Non-personal Data Under the GDPR’ (2020) 10 International Data Privacy Law 11; Nadezhda Purtova, ‘From Knowing by Name to Targeting: The Meaning of Identification Under the GDPR’ (2022) 12 International Data Privacy Law 163.

See, Taylor and Francis, ‘What Are the Different Types of Peer Review?’ https://authorservices.taylorandfrancis.com/publishing-your-research/peer-review/types-peer-review/ ; or OUP, ‘Five Models of Peer Review: A Guide’ (23 September 2021) https://blog.oup.com/2021/09/five-models-of-peer-review-a-guide/ .

Anonymity in this context serves the purpose of limiting the risk of bias in the evaluation procedure (as distinct from under the GDPR where it serves to determine the law’s scope of application).

Case C-434/16, Nowak v Data Protection Commissioner EU:C:2017:994, para 34. One might argue that the article itself is simply data—a source of information—that needs to be read to reveal information about the individual; however, the Court has not yet made this distinction between data and information.

Durant v Financial Services Authority [2003] EWCA Civ 1746.

Nowak (n 33) para 35.

ibid, paras 37–39.

ibid, para 43.

ibid, para 34.

Case C-582/14, Patrick Breyer v Bundesrepublik Deutschland ECLI:EU:C:2016:779, para 43 confirms that it is not necessary that the information enabling identification be in the hands of one entity. However, for such data to constitute identifiable information it must also be assessed whether the data combination is a means reasonably likely to be used to identify an individual (para 45).

See, for instance, Articles 17(4) and 85 GDPR. The balancing of data protection and related rights and freedom of expression must therefore occur within the data protection framework.

Case C-245/20, X, Z v Autoriteit Persoonsgegevens, Opinion of AG Bobek (n 9) paras 56 and 57.

Purtova, ‘The Law of Everything’ (n 9) 57.

Bart van der Sloot, ‘Truth from the Sewage: Are We Flushing Privacy Down the Drain?’ (2021) 12 European Journal of Law and Technology https://ejlt.org/index.php/ejlt/article/view/766.

Salóme Viljoen, ‘A Relational Theory of Data Governance’ (2021) 131 Yale L J 573.

Michal S. Gal and Orla Lynskey, ‘Synthetic Data: Legal Implications of the Data-Generation Revolution’ (forthcoming 2023) Iowa Law Review; LSE Legal Studies Working Paper No. 6/2023.

Lorenzo Dalla Corte, ‘Scoping Personal Data: Towards a Nuanced Interpretation of the Material Scope of EU Data Protection Law’ (2019) 10 European Journal of Law and Technology https://ejlt.org/index.php/ejlt/article/view/672 .

Article 2(2)(b), (a) and (d) GDPR, respectively.

Article 2(3) GDPR.

Article 2(2)(c) GDPR.

Case C-101/01 Bodil Lindqvist [2003] ECR I-12971, para 39.

ibid, para 47.

Case C-212/13 František Ryneš v Úřad pro ochranu osobních údajů EU:C:2014:2428.

Article 29 Data Protection Working Party, ‘Opinion 1/2010 on the concepts of “controller” and “processor”’ WP169, adopted on 16 February 2010, 25. This Opinion was superseded by European Data Protection Board (EDPB) Guidelines. EDPB, ‘Guidelines 07/2020 on the concepts of controller and processor in the GDPR’ version 1.0, adopted on 2 September 2020, 8.

Article 4(7) GDPR.

Article 4(8) GDPR.

For instance, the processor is under a general obligation to ensure that appropriate technical and organisational measures are in place so that the processing complies with the Regulation, and that any sub-processors it engages comply with the terms of the original contract with the controller (Article 28 GDPR).

Opinion 1/2010 (n 54) 2.

René Mahieu, Joris van Hoboken and Hadi Asghari, ‘Responsibility for Data Protection in a Networked World: On the Question of the Controller, “Effective and Complete Protection” and its Application to Data Access Rights in Europe’ (2019) 10 Journal of Intellectual Property, Information Technology and Electronic Commerce Law 85, 87.

Heleen Janssen, Jennifer Cobbe, Chris Norval and Jatinder Singh, ‘Decentralized Data Processing: Personal Data Stores and the GDPR’ (2020) 10 International Data Privacy Law 356.

Brendan Van Alsenoy, ‘Allocating Responsibility among Controllers, Processors, and “Everything in Between”: The Definition of Actors and Roles in Directive 95/46/EC’ (2012) Computer Law & Security Review 25, 27.

EDPB Guidelines (n 54) 3 and 19.

Article 4(7) GDPR. Data controllers and data processors also benefit from procedural rights, such as the right to lodge a complaint with a supervisory authority: Article 77(1) GDPR.

The GDPR does recognise the specific needs of micro, small- and medium-sized enterprises to some extent in several recitals (recitals 13, 98, 137 and 167). It provides that their specific needs should be taken into account when codes of conduct are drawn up to contribute to the Regulation’s proper application and when certification measures are introduced, although neither codes of conduct nor certification have been widely adopted so far (Articles 40 and 42 GDPR).

The jurisdictional component of this case was also notable. The Court held that although Google Inc., the parent company responsible for the coordination of Google’s data processing operations, was established in the USA, the presence of a subsidiary in Spain selling advertising to cross-subsidise these operations was sufficient to bring the processing within the scope of EU data protection law. Google Spain (n 2) para 55.

Article 29 Working Party, ‘Opinion 1/2008 on data protection issues related to search engines’, adopted on 4 April 2008 WP148, 14.

Google Spain (n 2) para 22.

ibid, para 33.

ibid, paras 34 and 38.

Mahieu and van Hoboken note that the Court is more concerned with ensuring effective and complete protection ‘than a more literal interpretation of the law’s text would seem to point to’. René Mahieu and Joris van Hoboken, ‘Fashion ID: Introducing a Phase-oriented Approach to Data Protection?’ (European Law Blog, 30 September 2019).

Case C-210/16, Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH (Facebook fan pages) ECLI:EU:C:2018:388, para 39.

ibid, para 35.

ibid, para 36.

This was a reasonable assumption based on the way in which the Court set out its reasoning. For instance, Mahieu et al (n 59) 94 were critical of the Court’s decision in Facebook fan pages stating that ‘it seems unreasonable that if Facebook would not offer the so-called Insights function, the fan page administrator would no longer have responsibility for the data processing’.

Case C-40/17, Fashion ID GmbH & Co.KG v Verbraucherzentrale NRW eV ECLI:EU:C:2019:629, para 75.

ibid, para 76. The Court noted that it seemed ‘impossible’ that Fashion ID determines the purposes and means of these subsequent processing operations.

ibid, para 77.

ibid, para 78.

ibid, para 79.

ibid, para 80.

ibid, para 82. Facebook fan pages (n 72) para 38.

This was noted by the Advocate General in Fashion ID, who considered that, taken to extremes, this makes anyone in a ‘personal data chain’ who makes data processing possible a controller. Case C-40/17, Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV ECLI:EU:C:2019:629, Opinion of AG Bobek, para 74.

Case C-25/17, Jehovan todistajat EU:C:2018:551.

ibid, para 16.

ibid, para 23.

ibid, para 73.

The EDPB distinguishes between essential means of processing (which are closely linked to the purposes), including determining what and whose personal data is processed and for how long, and non-essential means, which concern more practical aspects of implementation (e.g. hardware choices). EDPB Guidelines (n 54) 14.

Jehovan (n 84) para 69.

ibid, para 67.

C-604/22, IAB Europe v Gegevensbeschermingsautoriteit (application pending).

Lee A. Bygrave and Luca Tosoni, ‘Article 4(7): Controller’ in Christopher Kuner, Lee A Bygrave, Christopher Docksey and Laura Drechsler (eds), The EU General Data Protection Regulation (GDPR): A Commentary (OUP 2020) 145, 152.

EDPB Guidelines (n 54) 13.

R v Department of Health; ex parte Source Informatics Ltd [2000] 1 All ER 786, 799. In R v Department of Health the UK Court of Appeal held obiter dicta that the process of anonymising personal data did not qualify as a form of ‘processing’ under the 1998 DPA.

Durant (n 34).

Case C-28/08P, European Commission v The Bavarian Lager Co. Ltd ECLI:EU:C:2009:624, Opinion of AG Sharpston, paras 144–146.

ibid, para 146.

ibid, paras 137 and 139.

It is perhaps also notable that the Advocate General took a holistic approach to ‘processing’, viewing the processing operation as a composite whole: she looked at the overall process of retrieving a legally contested digital document as opposed to a series of smaller, distinct processing operations.

Instead, the Court simply endorsed the General Court’s finding that the ‘communication of data, by transmission, dissemination or otherwise making available, falls within the definition of processing’. Case C-28/08P European Commission v The Bavarian Lager Co. Ltd [2010] ECR I-06055, para 69; endorsing para 105 of Case T-194/04 The Bavarian Lager Co. Ltd v Commission [2007] ECR II-04523.

Case C-141/12 YS v Minister voor Immigratie, Integratie en Asiel and Minister voor Immigratie, Integratie en Asiel v M and S ECLI:EU:C:2013:838, Opinion of AG Sharpston, para 63.

ibid, para 56.

ibid, paras 58 and 59.

ibid, para 55.

Case C-141/12 YS v Minister voor Immigratie, Integratie en Asiel and Minister voor Immigratie, Integratie en Asiel v M and S ECLI:EU:C:2014:2081, para 40.

ibid, para 41.

ibid, paras 42–46.

Orla Lynskey, ‘Criminal Justice Profiling and EU Data Protection Law: Precarious Protection from Predictive Policing’ (2019) 15 International Journal of Law in Context 162, 169. This finding was likely influenced by a desire to avoid undermining established principles of administrative law, like freedom of information, in Member States.

Lee A. Bygrave and Luca Tosoni, ‘Article 4(1): Personal Data’ in Christopher Kuner, Lee A Bygrave, Christopher Docksey and Laura Drechsler (eds), The EU General Data Protection Regulation (GDPR): A Commentary (OUP 2020) 103, 110.

Nowak (n 33) para 46.

ibid, para 49.

Bygrave and Tosoni, ‘Article 4(1)’ (n 109) 110.

Facebook fan pages (n 72) para 24(1).

This aligns with the findings of the Supreme Court of Milan, which held that as long as the illicit data is unknown to the service provider it cannot be a data controller. Giovanni De Gregorio, Digital Constitutionalism in Europe: Reframing Rights and Powers in the Algorithmic Society (Cambridge Studies in European Law and Policy, CUP 2022), 138.

Case C-131/12, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González EU:C:2014:317, Opinion of AG Jääskinen, para 83.

ibid, para 82.

Case C-40/17, Fashion ID , Opinion of AG Bobek (n 83), para 71.

ibid, para 72.

Spiros Simitis, ‘Legal and Political Context of the Protection of Personal Data and Privacy’ (Speech in Montreal, September 1997) Council of Europe Archives (T-PD (97) 17—on file with the author), 7.

Case C-245/20, X, Z v Autoriteit Persoonsgegevens, Opinion of AG Bobek (n 9) para 65.

Bygrave and Tosoni, ‘Article 4(1): Personal Data’ (n 109) 113. We will return to the distinction between compliance and enforcement below.

Katherine Nolan, The Individual in EU Data Protection Law (PhD thesis, LSE Law School), 130.

Koops (n 9) 251.

Irish Data Protection Commission, ‘Annual Report 2021’, 5.

Gráinne de Búrca, Reframing Human Rights in a Turbulent Era (OUP 2021), 46.

Adam Satariano, ‘Europe’s Privacy Law Hasn’t Shown Its Teeth, Frustrating Advocates’ New York Times (27 April 2020), https://www.nytimes.com/2020/04/27/technology/GDPR-privacy-law-europe ; Johnny Ryan and Alan Toner, ‘Europe’s Governments are Failing the GDPR’ (Brave Report 2020).

Impact Assessment, ‘Commission Staff Working Document accompanying SEC(2012) 72 final’, Brussels (2 January 2012), 103.

Access Now, ‘The right to lodge a data protection complaint: OK, but then what? An empirical study of current practices under the GDPR’, June 2022. More generally, it noted that ‘there is a lack of precise information on complaint-handling, including on the number of complaints lodged with DPAs’ (ibid, 4).

The number of complaints received could also be an indicator of the relevance and visibility of the law to individuals.

Google Transparency Report, ‘Requests to Delist Content Under European Privacy Law’, https://transparencyreport.google.com/eu-privacy/overview?hl=en-GB .

Julia Powles, ‘The Case That Won’t Be Forgotten’ (2015) 47 Loy U Chi LJ 583. See also, Julia Powles and Enrique Chaparro, ‘How Google Determined Our Right to Be Forgotten’, The Guardian (18 February 2015).

Julia Black, Rules and Regulators (OUP 1997), 9.

Article 1(2) and (3) GDPR. Chapter V GDPR subjects data flows outside the EU to distinct legal requirements to ensure that the level of protection individuals receive when their data is transferred out of the EU is ‘essentially equivalent’ to that within the EU, thereby preventing circumvention of the data protection framework.

Macenaite, for instance, considers the aims of developing a data-driven economy and protecting fundamental rights and freedoms to be essentially contradictory while Yakovleva envisages their reconciliation. Milda Macenaite, ‘The “Riskification” of European Data Protection Law through a Two-Fold Shift’ (2017) 8 European Journal of Risk Regulation 506, 507; Svetlana Yakovleva, ‘Personal Data Transfers in International Trade and EU Law: A Tale of Two Necessities’ (2020) 21 Journal of World Investment & Trade 881, 888.

Kristina Irion, ‘A Special Regard: The Court of Justice and the Fundamental Rights to Privacy and Data Protection’ in Ulrich Faber et al (eds), Gesellschaftliche Bewegungen - Recht unter Beobachtung und in Aktion: Festschrift für Wolfhard Kohte (Nomos 2016) 873. This was foreseen by Spiros Simitis, ‘From the Market to the Polis: The EU Directive on the Protection of Personal Data’ (1994–1995) 80 Iowa Law Review 445.

Plixavra Vogiatzoglou and Peggy Valcke, ‘Two Decades of Article 8 CFR: A Critical Exploration of the Fundamental Right to Personal Data Protection in EU Law’ in Eleni Kosta, Ronald Leenes and Irene Kamara (eds), Research Handbook on EU data protection (Edward Elgar 2022). See also, Gonzalez Fuster and Hijmans (n 3).

C-60/22, UZ v Bundesrepublik Deutschland ECLI:EU:C:2023:373, para 65.

The Court has not explicitly confirmed that the GDPR ‘gives expression’ to the right to data protection, which might result in a self-referential system whereby the right to data protection is interpreted in light of secondary law. Nadezhda Purtova, ‘Default Entitlements in Personal Data in the Proposed Regulation: Informational Self-determination Off the Table … and Back On Again?’ (2014) 30 Computer Law & Security Review 6, 11.

This echoes Koops’ earlier observation that ‘we see data protection bodies moving all around, but they do not provide us with real protection’. Koops (n 9) 259.

Neil Richards, Why Privacy Matters (OUP 2022) 52.

Ari Ezra Waldman, Industry Unbound: The Inside Story of Privacy, Data and Corporate Power (CUP 2021), 114. This echoes the findings of Black in the field of financial services regulation where she refers to ‘creative compliance’. Julia Black, ‘Learning from Failures: “New Governance” Techniques and the Financial Crisis’ (2012) 75 Modern Law Review 1037.

Filippo Lancieri, ‘Narrowing Data Protection’s Enforcement Gap’ (2022) 74 Maine Law Review 15, Appendix: 65–72.

Lancieri cites Sanchez-Rola et al. to this effect. See, Iskander Sanchez-Rola et al., ‘Can I Opt Out Yet? GDPR and the Global Illusion of Cookie Control’ (2019) Proceedings of the 2019 ACM Asia Conference on Computer and Communications Security 1, 3–5.

Hodges advocates that effective data protection requires ‘a system of constructive engagement in resolving problems, involving relationships based on evidence of trust’ between regulators and businesses. Christopher Hodges, ‘Delivering Data Protection: Trust and Ethical Culture’ (2018) 1 European Data Protection Law Review 65, 79.

Hielke Hijmans, ‘How to Enforce the GDPR in a Strategic, Consistent and Ethical Manner? A Reaction to Christopher Hodges’ (2018) 1 European Data Protection Law Review 80, 82.

Fuller refers to ‘rules that require conduct beyond the powers of the affected party’. Lon L. Fuller, The Morality of Law (Revised edn, Yale University Press 1969), 39.

Facebook fan pages (n 72) para 38; Jehovan (n 84) para 69.

See https://www.facebook.com//legal/controller_addendum accessed 23 August 2023.

Articles 12–14 GDPR.

Case C-40/17, Fashion ID , Opinion of AG Bobek (n 83) para 84.

EDPB, ‘Report of the Work Undertaken by the Supervisory Authorities within the 101 Task Force’ (28 March 2023), 10.

There is emerging consensus that there are structural impediments to its effective enforcement. For instance, the European Data Protection Supervisor hosted a conference in May 2022 on data protection enforcement to make progress on this issue. European Data Protection Supervisor, ‘Effective Enforcement in the Digital World’ (June 2022) https://www.edpsconference2022.eu/en.

This was the case in Facebook fan pages (n 72).

Michael Veale, Midas Nouwens and Cristiana Teixeira Santos, ‘Impossible Asks: Can the Transparency and Consent Framework Ever Authorise Real-Time Bidding After the DPA Decision?’ (2022) Technology and Regulation 12.

On the enhancement of the role of civil society actors and public regulators in this space see Lancieri, ‘Data Protection’s Enforcement Gap’ (n 142) 57–60.

Irish Data Protection Commission, ‘Annual Report 2018’, 5.

EDPB, ‘Overview on Resources Made Available by Member States to the Data Protection Authorities and on Enforcement Actions by the Data Protection Authorities’ (5 August 2021), 10. However, in its study on complaints Access Now notes that what can be gleaned from such figures is limited due to disparities in what is treated as a complaint and the handling of complaints at national level. Access Now (n 128), 4.

ibid, 5. With the exception of Germany, which has over 1,000 employees, all other regulators had fewer than 300 employees in 2021 (ibid).

Niamh Dunne, ‘Knowing When to See It: State Activities, Economic Activities, and the Concept of Undertaking’ (2010) 16 Colum J Eur L 427.

J Black, Rules and Regulators (n 132) 30. This is in keeping with later work describing regulation as a ‘communicative process’. See, Julia Black, ‘Regulatory Conversations’, (2002) 29 Journal of Law and Society 163, 164.

Ian Begley, ‘Office of Data Protection Commissioner Says GPO Can Keep their Bins as Public Litter Is Not in Breach of GDPR rules’, Irish Independent (2 May 2019). https://www.independent.ie/irish-news/office-of-data-protection-commissioner-says-gpo-can-keep-their-bins-as-public-litter-is-not-in-breach-of-gdpr-rules-38073828.html .

Chris Reed, ‘The Law of Unintended Consequences – Embedded Business Models in IT Regulation’ (2007) Journal of Information, Law and Technology 33, 9 (noting the law’s ‘implicit assumption that there is central control of personal data processing’).

Article 5(2) and 30 GDPR.

Rotem Medzini, ‘Credibility in Enhanced Self-regulation: The Case of the European Data Protection Regime’ (2021) 13(3) Policy & Internet 366.

Nolan (n 122) 37.

Article 30(5) GDPR contains a derogation from the requirement to maintain a record of processing activities for SMEs, however, Article 25 on data protection by design and by default contains no such exceptions.

See, for instance, Autorité de Protection des Données, ‘The BE DPA to restore order to the online advertising industry: IAB Europe held responsible for a mechanism that infringes the GDPR’, Press Release (2 February 2022); Decision of the litigation chamber, Case number: DOS-2019-01377 (2 February 2022).

EDPB, ‘Guidelines 04/2020 on the Use of Location Data and Contact Tracing Tools in the Context of the COVID-19 Outbreak’ (21 April 2020); EDPB, ‘Guidelines 03/2020 on the Processing of Data Concerning Health for the Purpose of Scientific Research in the Context of the COVID-19 Outbreak’, 21 April 2020.

The French regulator (the CNIL) received 14,143 complaints in 2021 and responded to a further 33,329 phone calls and 16,898 e-mail contacts with advice and information (representing a 39% increase on 2020). Commission Nationale de l’Informatique et des Libertés (CNIL), ‘The CNIL in a Nutshell 2022’, 4.

Dr Mary Fairhurst v Mr Jon Wakefield (Oxford County Court) (12 October 2021), Case No: G00MK161.

Article 5(1)(a) GDPR.

Damian Clifford and Jef Ausloos, ‘Data Protection and the Role of Fairness’ (2018) 37 Yearbook of European Law 130.

Reporting on the findings from national rapporteurs see, Orla Lynskey, ‘General Report Topic 2: The New EU Data Protection Regime’ in Jorrit Rijpma (ed), The New EU Data Protection Regime: Setting Global Standards for the Right to Personal Data Protection (Eleven International Publishing 2020) 23, 36.

Lancieri, ‘Data Protection’s Enforcement Gap’ (n 142) 28–55.

Giulia Gentile and Orla Lynskey, ‘Deficient-By-Design? The Transnational Enforcement of the GDPR’ (2022) 71 International and Comparative Law Quarterly 799.

Eugene Bardach and Robert A. Kagan, Going by the Book: The Problem of Regulatory Unreasonableness (2nd edn, Transaction Publishers 2003) 7.

Black, Rules and Regulators (n 132) 43–44.

Practically, the remaining option for a data subject to initiate private enforcement action against a data controller for breach of the GDPR would seemingly undermine any attempt to mitigate the hard edges of the law by public enforcers applying site-level flexibility.

Case C-13/16, Rīgas satiksme ECLI:EU:C:2017:336.

Case C-13/16, Rīgas satiksme ECLI:EU:C:2017:336, Opinion of AG Bobek, para 93.

ibid, para 95.

ibid, para 96. As in any other area of law, rules governing certain activity must be sufficiently flexible in order to catch all the potential eventualities that arise. That might, however, lead to the danger of an overbroad interpretation and application of those rules. They might end up being applied also to a situation where the link with the original purpose is somewhat tenuous and questionable.

This lighter touch would be needed in situations ‘when a person is asking for an individual piece of information relating to a specific person in a concretised relationship, when there is a clear and entirely legitimate purpose resulting from the normal operation of the law’. ibid, para 98.

Lorenzo Dalla Corte, ‘On Proportionality in the Data Protection Jurisprudence of the CJEU’ (2022) 12 International Data Privacy Law 259, 265.

Google Spain, Opinion of AG Jääskinen (n 115), para 79. He deemed it inappropriate to apply teleologically a law that was drafted prior to the emergence of the decentralised internet (paras 77 and 78).

ibid, paras 89 and 90.

Miquel Peguera, ‘The Shaky Ground of the Right to Be Delisted’ (2016) 18 Vanderbilt Journal of Entertainment and Technology Law 507, 539.

C-137/17, GC and Others v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:773 para 31.

Google Spain (n 2) para 38; repeated at para 83.

GC and Others (n 189) paras 42 and 43.

ibid, para 44.

ibid, para 45.

Rights of individuals are clearly found in a chapter of the law labelled ‘Rights of the data subject’ while the Article 9 prohibition is found in the ‘Principles’ chapter.

Karen Yeung and Lee A. Bygrave, ‘Demystifying the Modernized European Data Protection Regime: Cross-disciplinary Insights from Legal and Regulatory Governance Scholarship’ (2022) 16 Regulation & Governance 137, 146.

Article 57(1)(f) GDPR.

Access Now (n 128) 41.

Hijmans, for instance, observes that ‘DPAs are free to set their own agenda, but with one limitation which is their obligation to handle complaints’. Hielke Hijmans, The European Union as Guardian of Internet Privacy (Springer 2016), 383.

Article 78(2) GDPR.

Ian Ayres and John Braithwaite, Responsive Regulation: Transcending the Deregulation Debate (OUP 2002) 35.

Case C-311/18, Data Protection Commissioner v Facebook Ireland Limited and Maximillian Schrems (Schrems II) ECLI:EU:C:2020:559, para 108.

ibid, para 112.

S.109(2) Data Protection Act 2018 (Ireland).

Article 83(1) GDPR.

Recital 148.

David Erdos, ‘Ensuring Legal Accountability of the UK Data Protection Authority: From Cause for Data Subject Complaint to a Model for Europe?’ (2020) 5 European Data Protection Law Review 444, 452.

Recital 131.

Article 78(2) and (1), respectively.

A lacuna explored, but not filled, in the UK case of Killock & Veale v ICO [2021] UKUT 299 (AAC).

Google Spain , Opinion of AG Jääskinen (n 115) para 30. He highlighted that currently ‘the broad definitions of personal data, processing of personal data and controller are likely to cover an unprecedentedly wide range of new factual situations due to technological developments’.

De Gregorio (n 114) 141.

For a more critical assessment of the power wielded by Google Search see Powles (n 117).

Therefore, while it is often claimed that the law is designed to be technologically neutral, we cannot claim that the law applies in a way that is technologically neutral.

Purtova, ‘The Law of Everything’ (n 9) 71.

Michèle Finck, ‘Blockchains and Data Protection in the EU’ (2018) 1 European Data Protection Law Review 17, 30–31.

System design can not only frustrate rights but also entail trade-offs between rights that are not made explicit by the law. See further, Michael Veale, Reuben Binns and Jef Ausloos, ‘When Data Protection by Design and Data Subject Rights Clash’ (2018) 8 International Data Privacy Law 105.

Commission Staff Working Document (n 127) 5.

European Parliament, ‘European Parliament resolution of 25 March 2021 on the Commission evaluation report on the implementation of the General Data Protection Regulation two years after its application (2020/2717(RSP))’ [2021] C494/29, para 13.

EDPB, ‘Overview on resources’ (n 134) 14.

ICO, Information Commissioner’s Annual Report and Financial Statements 2021–22 July 2022 HC 392, 33.

ICO, ‘ICO reprimands Surrey Police and Sussex Police for recording more than 200,000 phone calls without people’s knowledge’, 18 April 2023.

Fuller (n 146). These criteria also reflect those set out by Diver in his work on the optimal precision of legal rules. He notes that the success of a rule will depend on qualities such as its transparency (whether the words have a well defined and universally accepted meaning within the relevant community) and their accessibility (their application to concrete situations without excessive difficulty or effort). Colin S. Diver, ‘The Optimal Precision of Administrative Rules’ (1983) 93 Yale Law Journal 65.

Joris van Hoboken, ‘From Collection to Use in Privacy Regulation? A Forward-Looking Comparison of European and US Frameworks for Personal Data Processing’ in Bart van der Sloot, Dennis Broeders and Erik Schrijvers (eds), Exploring the Boundaries of Big Data (Amsterdam University Press 2016) 231, 248.

Lee A. Bygrave, Data Protection Law: Approaching Its Rationale, Logic, and Limits (Kluwer Law International 2002), 107–112.

Koops (n 9) 257.

Peter Blume, ‘The Data Subject’ (2015) 1 Eur Data Prot L Rev 42.

Gellert (n 11) 215.

The role of risk in data protection law remains ambiguous. As Yeung and Bygrave note, although regulatory scholars are familiar with the idea of ‘risk’ in various guises, the concept of ‘risk to rights’ is unfamiliar and the traditional focus of risk on quantifying tangible harms sits uneasily alongside the dignitarian basis for human rights. Yeung and Bygrave (n 170) 143.

Quelle, for instance, suggests that this formula serves the function of maintaining a broad scope of application for the data protection rules while ‘keeping the consequences of controllership in check’. Claudia Quelle, ‘GC and Others v CNIL on the Responsibility of Search Engine Operators for Referring to Sensitive Data: The End of “Right to be Forgotten” Balancing?’ (2019) 5 Eur Data Prot L Rev 438, 440.

Giorgio Monti, ‘Article 81 EC and Public Policy’ (2002) 39 Common Market Law Review 1057, 1088.

Lee A. Bygrave, Data Privacy Law: an International Perspective (OUP 2014) 147; De Gregorio (n 114) 141.

Lee A Bygrave and Dag Wiese Schartum, ‘Consent, Proportionality and Collective Power’ in Serge Gutwirth and others (eds), Reinventing Data Protection (Springer 2009), 162.

Case C-439/19, Latvijas Republikas Saeima (Points de pénalité) EU:C:2021:504, para 97.

ibid, para 98. In the penalty points case, the Court affirmed that the principle of data minimisation (Article 5(1)(c) GDPR) ‘gives expression to the principle of proportionality’.

Jan Freigang, ‘Is Responsive Regulation Compatible with the Rule of Law?’ (2002) 8 European Public Law 463.

Erdos, ‘Ensuring Legal Accountability’ (n 207). The one-stop-shop and consistency mechanisms foreseen in Chapter VII, Sections 1 and 2 GDPR are ill-equipped to force an authority to handle a complaint in a particular manner: Gentile and Lynskey (n 176).

Easy GDPR, ‘GDPR fine for Austrian kebab store’ https://easygdpr.eu/en/gdpr-incident/gdpr-fine-for-austrian-kebab-store/; OneTrust DataGuidance, ‘Spain: AEPD fines individual €6,000 for unlawfully processing personal data’ https://www.dataguidance.com/news/spain-aepd-fines-individual-600-data-minimisation.

The importance of transparency in this regard has been emphasised by the European Parliament, which has called for harmonisation of penalties by means of guidelines and clear criteria ‘in order to increase legal certainty and to prevent companies settling in the locations that impose the lowest penalties’. European Parliament (n 191).

ICO, ‘ICO Sets Out Revised Approach to Public Sector Enforcement’ (30 June 2022).

Erdos, Balancing on a Tightrope (n 14) 199.

Roger Brownsword and Morag Goodwin, Law and the Technologies of the Twenty-First Century: Text and Materials (Law in Context) (CUP 2012), 310.

Kenneth A. Bamberger and Deirdre K. Mulligan, Privacy on the Ground: Driving Corporate Behavior in the United States and Europe (MIT Press 2015), 9.

Brownsword and Goodwin (n 241) 312.

Simitis (n 119).


  • Online ISSN 2044-8422
  • Print ISSN 0070-1998
  • Copyright © 2024 University College London

[Header image: a man using a smartphone on a cobblestone street, with illuminated office windows and a shop in the background. Photo by Raghu Rai/Magnum]

Privacy is power

Don’t just give away your privacy to the likes of Google and Facebook – protect it, or you disempower us all.

by Carissa Véliz

Imagine having a master key for your life. A key or password that gives access to the front door to your home, your bedroom, your diary, your computer, your phone, your car, your safe deposit, your health records. Would you go around making copies of that key and giving them out to strangers? Probably not the wisest idea – it would be only a matter of time before someone abused it, right? So why are you willing to give up your personal data to pretty much anyone who asks for it?

Privacy is the key that unlocks the aspects of yourself that are most intimate and personal, that make you most you, and most vulnerable. Your naked body. Your sexual history and fantasies. Your past, present and possible future diseases. Your fears, your losses, your failures. The worst thing you have ever done, said, and thought. Your inadequacies, your mistakes, your traumas. The moment in which you have felt most ashamed. That family relation you wish you didn’t have. Your most drunken night.

When you give that key, your privacy, to someone who loves you, it will allow you to enjoy closeness, and they will use it to benefit you. Part of what it means to be close to someone is sharing what makes you vulnerable, giving them the power to hurt you, and trusting that person never to take advantage of the privileged position granted by intimacy. People who love you might use your date of birth to organise a surprise birthday party for you; they’ll make a note of your tastes to find you the perfect gift; they’ll take into account your darkest fears to keep you safe from the things that scare you. Not everyone will use access to your personal life in your interest, however. Fraudsters might use your date of birth to impersonate you while they commit a crime; companies might use your tastes to lure you into a bad deal; enemies might use your darkest fears to threaten and extort you. People who don’t have your best interest at heart will exploit your data to further their own agenda. Privacy matters because the lack of it gives others power over you.

You might think you have nothing to hide, nothing to fear. You are wrong – unless you are an exhibitionist with masochistic desires of suffering identity theft, discrimination, joblessness, public humiliation and totalitarianism, among other misfortunes. You have plenty to hide, plenty to fear, and the fact that you don’t go around publishing your passwords or giving copies of your home keys to strangers attests to that.

You might think your privacy is safe because you are a nobody – nothing special, interesting or important to see here. Don’t shortchange yourself. If you weren’t that important, businesses and governments wouldn’t be going to so much trouble to spy on you.

You have your attention, your presence of mind – everyone is fighting for it. They want to know more about you so they can know how best to distract you, even if that means luring you away from quality time with your loved ones or basic human needs such as sleep. You have money, even if it is not a lot – companies want you to spend your money on them. Hackers are eager to get hold of sensitive information or images so they can blackmail you. Insurance companies want your money too, as long as you are not too much of a risk, and they need your data to assess that. You can probably work; businesses want to know everything about whom they are hiring – including whether you might be someone who will want to fight for your rights. You have a body – public and private institutions would love to know more about it, perhaps experiment with it, and learn more about other bodies like yours. You have an identity – criminals can use it to commit crimes in your name and let you pay for the bill. You have personal connections. You are a node in a network. You are someone’s offspring, someone’s neighbour, someone’s teacher or lawyer or barber. Through you, they can get to other people. That’s why apps ask you for access to your contacts. You have a voice – all sorts of agents would like to use you as their mouthpiece on social media and beyond. You have a vote – foreign and national forces want you to vote for the candidate that will defend their interests.

As you can see, you are a very important person. You are a source of power.

By now, most people are aware that their data is worth money. But your data is not valuable only because it can be sold. Facebook does not technically sell your data, for instance. Nor does Google. They sell the power to influence you. They sell the power to show you ads, and the power to predict your behaviour. Google and Facebook are not really in the business of data – they are in the business of power. Even more than monetary gain, personal data bestows power on those who collect and analyse it, and that is what makes it so coveted.

There are two aspects to power. The first aspect is what the German philosopher Rainer Forst in 2014 defined as ‘the capacity of A to motivate B to think or do something that B would otherwise not have thought or done’. The means through which the powerful enact their influence are varied. They include motivational speeches, recommendations, ideological descriptions of the world, seduction and credible threats. Forst argues that brute force or violence is not an exercise of power, for subjected people don’t ‘do’ anything; rather, something is done to them. But clearly brute force is an instance of power. It is counterintuitive to think of someone as powerless who is subjecting us through violence. Think of an army dominating a population, or a thug strangling you. In Economy and Society (1978), the German political economist Max Weber describes this second aspect of power as the ability for people and institutions to ‘carry out [their] own will despite resistance’.

In short, then, powerful people and institutions make us act and think in ways in which we would not act and think were it not for their influence. If they fail to influence us into acting and thinking in the way that they want us to, powerful people and institutions can exercise force upon us – they can do unto us what we will not do ourselves.

There are different types of power: economic, political and so on. But power can be thought of as being like energy: it can take many different forms, and these can change. A wealthy company can often use its money to influence politics through lobbying, for instance, or to shape public opinion through paying for ads.


That tech giants such as Facebook and Google are powerful is hardly news. But exploring the relationship between privacy and power can help us to better understand how institutions amass, wield and transform power in the digital age, which in turn can give us tools and ideas to resist the kind of domination that survives on violations of the right to privacy. However, to grasp how institutions accumulate and exercise power in the digital age, first we have to look at the relationship between power, knowledge and privacy.

There is a tight connection between knowledge and power. At the very least, knowledge is an instrument of power. The French philosopher Michel Foucault goes even further, and argues that knowledge in itself is a form of power. There is power in knowing. By protecting our privacy, we prevent others from being empowered with knowledge about us that can be used against our interests.

The more that someone knows about us, the more they can anticipate our every move, as well as influence us. One of the most important contributions of Foucault to our understanding of power is the insight that power does not only act upon human beings – it constructs human subjects (even so, we can still resist power and construct ourselves). Power generates certain mentalities, it transforms sensitivities, it brings about ways of being in the world. In that vein, the British political theorist Steven Lukes argues in his book Power (1974) that power can bring about a system that produces wants in people that work against their own interests. People’s desires can themselves be a result of power, and the more invisible the means of power, the more powerful they are. Examples of power shaping preferences today include when tech uses research about how dopamine works to make you addicted to an app, or when you are shown political ads based on personal information that makes a business think you are a particular kind of person (a ‘persuadable’, as the data-research company Cambridge Analytica put it, or someone who might be nudged into not voting, for instance).

The power that comes about as a result of knowing personal details about someone is a very particular kind of power. Like economic power and political power, privacy power is a distinct type of power, but it also allows those who hold it the possibility of transforming it into economic, political and other kinds of power. Power over others’ privacy is the quintessential kind of power in the digital age.

Two years after it was founded and despite its popularity, Google still hadn’t developed a sustainable business model. In that sense, it was just another unprofitable internet startup. Then, in 2000, Google launched AdWords, thereby starting the data economy. Now called Google Ads, it exploited the data produced by Google’s interactions with its users to sell ads. In less than four years, the company achieved a 3,590 per cent increase in revenue.

That same year, the Federal Trade Commission had recommended to US Congress that online privacy be regulated. However, after the attacks of 11 September 2001 on the Twin Towers in New York, concern about security took precedence over privacy, and plans for regulation were dropped. The digital economy was able to take off and reach the magnitude it enjoys today because governments had an interest in having access to people’s data in order to control them. From the outset, digital surveillance has been sustained through a joint effort between private and public institutions.

The mass collection and analysis of personal data has empowered governments and prying companies. Governments now know more about their citizens than ever before. The Stasi (the security service of the German Democratic Republic), for instance, managed to have files only on about a third of the population, even if it aspired to have complete information on all citizens. Intelligence agencies today hold much more information on all of the population. To take just one important example, a significant proportion of people volunteer private information in social networks. As the US filmmaker Laura Poitras put it in an interview with The Washington Post in 2014: ‘Facebook is a gift to intelligence agencies.’ Among other possibilities, that kind of information gives governments the ability to anticipate protests, and even pre-emptively arrest people who plan to take part. Having the power to know about organised resistance before it happens, and being able to squash it in time, is a tyrant’s dream.

Tech companies’ power is constituted, on the one hand, by having exclusive control of data and, on the other, by the ability to anticipate our every move, which in turn gives them opportunities to influence our behaviour, and sell that influence to others. Companies that earn most of their revenues through advertising have used our data as a moat – a competitive advantage that has made it impossible for alternative businesses to challenge tech titans. Google’s search engine, for example, is as good as it is partly because its algorithm has much more data to learn from than any of its competitors. In addition to keeping the company safe from competitors and allowing it to train its algorithm better, our data also allows tech companies to predict and influence our behaviour. With the amount of data it has access to, Google can know what keeps you up at night, what you desire the most, what you are planning to do next. It then whispers this information to other busybodies who want to target you for ads.


Companies might also share your data with ‘data brokers’ who will create a file on you based on everything they know about you (or, rather, everything they think they know), and then sell it to pretty much whoever is willing to buy it – insurers, governments, prospective employers, even fraudsters.

Data vultures are incredibly savvy at using both the aspects of power discussed above: they make us give up our data, more or less voluntarily, and they also snatch it away from us, even when we try to resist. Loyalty cards are an example of power making us do certain things that we would otherwise not do. When you are offered a discount for loyalty at your local supermarket, what you are being offered is for that company to conduct surveillance on you, and then influence your behaviour through nudges (discounts that will encourage you to buy certain products). An example of power doing things to us that we don’t want it to do is when Google records your location on your Android smartphone, even when you tell it not to.

Both types of power can also be seen at work at a more general level in the digital age. Tech constantly seduces us into doing things we would not otherwise do, from getting lost down a rabbit hole of videos on YouTube, to playing mindless games, or checking our phone hundreds of times a day. The digital age has brought about new ways of being in the world that don’t always make our lives better. Less visibly, the data economy has also succeeded in normalising certain ways of thinking. Tech companies want you to think that, if you have done nothing wrong, you have no reason to object to their holding your data. They also want you to think that treating your data as a commodity is necessary for digital tech, and that digital tech is progress – even when it might sometimes look worryingly similar to social or political regress. More importantly, tech wants you to think that the innovations it brings into the market are inevitable. That’s what progress looks like, and progress cannot be stopped.

That narrative is complacent and misleading. As the Danish economic geographer Bent Flyvbjerg points out in Rationality and Power (1998), power produces the knowledge, narratives and rationality that are conducive to building the reality it wants. But technology that perpetuates sexist and racist trends and worsens inequality is not progress. Inventions are far from unavoidable. Treating data as a commodity is a way for companies to earn money, and has nothing to do with building good products. Hoarding data is a way of accumulating power. Instead of focusing only on their bottom line, tech companies can and should do better to design the online world in a way that contributes to people’s wellbeing. And we have many reasons to object to institutions collecting and using our data in the way that they do.

Among those reasons is institutions not respecting our autonomy, our right to self-govern. Here is where the harder side of power plays a role. The digital age thus far has been characterised by institutions doing whatever they want with our data, unscrupulously bypassing our consent whenever they think they can get away with it. In the offline world, that kind of behaviour would be called matter-of-factly ‘theft’ or ‘coercion’. That it is not called this in the online world is yet another testament to tech’s power over narratives.

It’s not all bad news, though. Yes, institutions in the digital age have hoarded privacy power, but we can reclaim the data that sustains it, and we can limit their collecting new data. Foucault argued that, even if power constructs human subjects, we have the possibility to resist power and construct ourselves. The power of big tech looks and feels very solid. But tech’s house of cards is partly built on lies and theft. The data economy can be disrupted. The tech powers that be are nothing without our data. A small piece of regulation, a bit of resistance from citizens, a few businesses starting to offer privacy as a competitive advantage, and it can all evaporate.

No one is more conscious of their vulnerability than tech companies themselves. That is why they are trying to convince us that they do care about privacy after all (despite what their lawyers say in court). That is why they spend millions of dollars on lobbying. If they were so certain about the value of their products for the good of users and society, they would not need to lobby so hard. Tech companies have abused their power, and it is time to resist them.

In the digital age, resistance inspired by the abuse of power has been dubbed a techlash. Abuses of power remind us that power needs to be curtailed for it to be a positive influence in society. Even if you happen to be a tech enthusiast, even if you think that there is nothing wrong with what tech companies and governments are doing with our data, you should still want power to be limited, because you never know who will be in power next. Your new prime minister might be more authoritarian than the old one; the next CEO of the next big tech company might not be as benevolent as those we’ve seen thus far. Tech companies have helped totalitarian regimes in the past, and there is no clear distinction between government and corporate surveillance. Businesses share data with governments, and public institutions share data with companies.


Do not give in to the data economy without at least some resistance. Refraining from using tech altogether is unrealistic for most people, but there is much more you can do short of that. Respect other people’s privacy. Don’t expose ordinary citizens online. Don’t film or photograph people without their consent, and certainly don’t share such images online. Try to limit the data you surrender to institutions that don’t have a claim to it. Imagine someone asks for your number in a bar and won’t take a ‘No, thank you’ for an answer. If that person were to continue to harass you for your number, what would you do? Perhaps you would be tempted to give them a fake number. That is the essence of obfuscation, as outlined by the media scholars Finn Brunton and Helen Nissenbaum in the 2015 book of that name. If a clothing company asks for your name to sell you clothes, give them a different name – say, Dr Private Information, so that they get the message. Don’t give these institutions evidence they can use to claim that we are consenting to our data being taken away from us. Make it clear that your consent is not being given freely.

When downloading apps and buying products, choose products that are better for privacy. Use privacy extensions on your browsers. Turn your phone’s wi-fi, Bluetooth and location services off when you don’t need them. Use the legal tools at your disposal to ask companies for the data they have on you, and ask them to delete that data. Change your settings to protect your privacy. Refrain from using one of those DNA home testing kits – they are not worth it. Forget about ‘smart’ doorbells that violate your privacy and that of others. Write to your representatives sharing your concerns about privacy. Tweet about it. Take opportunities as they come along to inform business, governments and other people that you care about privacy, that what they are doing is not okay.

Don’t make the mistake of thinking you are safe from privacy harms, maybe because you are young, male, white, heterosexual and healthy. You might think that your data can work only for you, and never against you, if you’ve been lucky so far. But you might not be as healthy as you think you are, and you will not be young forever. The democracy you are taking for granted might morph into an authoritarian regime that might not favour the likes of you.

Furthermore, privacy is not only about you. Privacy is both personal and collective. When you expose your privacy, you put us all at risk. Privacy power is necessary for democracy – for people to vote according to their beliefs and without undue pressure, for citizens to protest anonymously without fear of repercussions, for individuals to have freedom to associate, speak their minds, read what they are curious about. If we are going to live in a democracy, the bulk of power needs to be with the people. If most of the power lies with companies, we will have a plutocracy. If most of the power lies with the state, we will have some kind of authoritarianism. Democracy is not a given. It is something we have to fight for every day. And if we stop building the conditions in which it thrives, democracy will be no more. Privacy is important because it gives power to the people. Protect it.


Data Privacy Perception, Protection and Assessment Essay


Privacy can be defined in several ways, depending on the context. In general, privacy is the state of freedom from scrutiny: the ability to keep secrets and confidential information from others (Cox, 2019). It extends to freedom from public attention and from unlawful intrusion. In Information Technology (IT), privacy refers to an organisation’s ability to determine which data and information in a system may be shared with other parties (Ghiglieri & Waidner, 2016). In IT, privacy aims to protect a firm’s resources from unwanted exposure to third parties and therefore serves as a safeguard against the manipulation of data and information.
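The IT sense of privacy described above — an organisation deciding which data may leave its systems — can be illustrated with a minimal, hypothetical sketch of field-level data minimisation: only fields on an explicit allow-list are shared with a third party. The field names and values here are invented for illustration, not drawn from any real system.

```python
# Hypothetical sketch: share only allow-listed fields with third parties.
# Field names (postcode_area, age_band, ...) are illustrative examples.

ALLOWED_FIELDS = {"postcode_area", "age_band", "purchase_category"}

def minimise(record: dict) -> dict:
    """Return a copy of the record containing only allow-listed fields."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

customer = {
    "name": "A. Example",          # direct identifier: never shared
    "email": "a@example.com",      # direct identifier: never shared
    "postcode_area": "N1",
    "age_band": "25-34",
    "purchase_category": "books",
}

shared = minimise(customer)
# 'name' and 'email' do not leave the organisation
```

An allow-list (rather than a block-list) is the safer default: a newly added sensitive field is withheld automatically instead of leaking until someone remembers to block it.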

In legal terms, privacy refers to the right to make important decisions about personal matters free from state compulsion, pressure, and regulation; it is thus bound up with autonomy, self-determination, and self-respect. Under the common law, however, privacy means the right to be let alone and the liberty to conceal things from other parties (Cox, 2019). Privacy is a fundamental right that underpins human dignity and the freedom to communicate in the public domain.

The average person’s perception of privacy differs from the law’s, and both perceptions affect risk assessment for privacy protection. This variation makes privacy a contested subject, since both the average person and the law tend to treat privacy as an all-inclusive matter rather than addressing specific instances (Ghiglieri & Waidner, 2016). The outcome of a risk assessment may therefore suffer if the terms it uses fail to analyse the circumstances concretely.

Best practices for protecting the privacy of the information handled can be achieved in several ways. The US federal government can comply by putting privacy risk management and compliance documentation in place (Jamieson & Salinas, 2018), leveraging existing resources to identify privacy-compliant programs. The Federal Information Security Management Act (FISMA) and the Certification and Accreditation (C&A) process are two such resources. When FISMA and C&A are applied, the federal government acts as the body responsible for identifying privacy-conforming matters (Cox, 2019); the two tools are key to identifying programs for risk assessment and to implementing privacy in the available systems and programs.

Another practice that applies once essential information has been collected and retained by the federal government is information security. Security responsibilities include safeguarding the collected information from access by unauthorised parties through various measures (Seigneur Nininahazwe, 2018). First, records containing sensitive data and information should be maintained under retention and disposition schedules approved by the parties involved in the process (Jamieson & Salinas, 2018). Where the federal government is approved to collect and maintain data, comprehensive privacy policies should be in place to ensure confidentiality, data availability, and the integrity of the personnel handling the data.
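The retention-and-disposition idea above can be sketched as a simple check: each record type has an approved retention period, and records that have outlived it are flagged for disposal. The schedule, record types, and periods below are hypothetical, not any agency’s actual policy.

```python
# Hypothetical sketch of a retention/disposition check. The record types
# and periods are illustrative examples, not a real retention schedule.

from datetime import date, timedelta

RETENTION = {
    "incident_log": timedelta(days=365),
    "access_request": timedelta(days=90),
}

def due_for_disposal(record_type: str, created: date, today: date) -> bool:
    """True if the record has outlived its approved retention period."""
    return today - created > RETENTION[record_type]

# An access request created 120 days ago exceeds its 90-day period,
# so it would be flagged for approved disposal.
```

Automating the check matters less than having an approved schedule at all: disposition decisions should trace back to a documented policy rather than ad hoc judgement.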

The last practice that can be applied in handling data and information to ensure privacy is incident response. Planning for privacy incidents requires the federal government to develop reporting and notification procedures at all levels (Seigneur Nininahazwe, 2018): responders will include senior leadership, the heads of the relevant programs, legal counsel, and other management bodies. The government must educate all parties on how to report privacy incidents, and any privacy incident program must account for the potential of security breaches. For example, programs may publish guidance on the essentials of privacy incident reporting and on how risk measures can be applied in such situations.
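The tiered notification described above — different responders at different levels — can be sketched as a small routing table. The severity tiers and role names are hypothetical examples of the reporting levels, not a prescribed scheme.

```python
# Hypothetical sketch of tiered privacy-incident notification routing.
# Severity tiers and role names are illustrative examples only.

ROUTES = {
    "low": ["program_head"],
    "medium": ["program_head", "legal_counsel"],
    "high": ["program_head", "legal_counsel", "senior_leadership"],
}

def notify_list(severity: str) -> list[str]:
    """Return the responders who must be notified for this severity."""
    if severity not in ROUTES:
        raise ValueError(f"unknown severity: {severity}")
    return ROUTES[severity]
```

Rejecting unknown severities loudly (rather than defaulting to the lowest tier) reflects the point made above: every party must know exactly how an incident is to be reported, with no silent fall-through.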

Privacy Impact Assessments (PIAs) are a tool through which agencies apply adequate safeguards for the confidentiality of personal information and so build a citizen-centred relationship with the government. Government agencies are required to conduct PIAs for IT systems that handle information, especially information about members of the public (“Privacy Impact Assessments (PIA)”, 2021). The US federal government completes PIAs to ensure that system and program owners are responsible and accountable for handling privacy matters appropriately; through PIAs, it assures the public that their confidential information is protected (Ghiglieri & Waidner, 2016). Every federal IT system should have a PIA providing basic documentation of the personal information the system contains.

The benefits citizens gain from the federal IT systems’ use of PIAs include a lower cost of managing the databases and other files that individuals own: assured that their information is safe, people do not incur the cost of imposing other data-protection methods (“Privacy Impact Assessments (PIA)”, 2021). Citizens are also protected from public criticism when data would otherwise be exposed to public opinion, so individuals can continue developing information-based enterprises that foster growth and expansion. A further advantage US citizens draw from the federal government’s protection of privacy is the potential for innovative ideas about data and information (“Privacy Impact Assessments (PIA)”, 2021): people have the liberty to think critically about strategies for protecting data.

Cox, K. (2019). The legal landscape of consumer/user data privacy and current policy discussions. The Current Privacy Landscape, 5(297), 15-37. Web.

Ghiglieri, M., & Waidner, M. (2016). HbbTV security and privacy: Issues and challenges. IEEE Security & Privacy, 14(3), 61-67. Web.

Jamieson, T., & Salinas, G. (2018). Protecting human subjects in the digital age: Issues and best practices of data protection. Survey Practice, 11(2), 1-10. Web.

Privacy Impact Assessments (PIA). (2021). Www2.ed.gov. Web.

Seigneur Nininahazwe, F. (2018). Best practices to protect your privacy against search engines data mining: A review. Internet of Things and Cloud Computing, 6(3), 56. Web.


IvyPanda. (2022, October 12). Data Privacy Perception, Protection and Assessment. https://ivypanda.com/essays/data-privacy-perception-protection-and-assessment/



Reflections on Data Privacy


By Mahsa Hedayati and Amanda Wang, Information System Officers, Policy, Strategy and Governance Division, OICT

The topic of privacy is gaining momentum in the mainstream big data discourse. Governments, policy makers, businesses, academics, and civil society are increasingly reflecting on how to strike the right balance between the potential benefits of exponential data collection and discovery, on the one hand, and the protection and privacy of data subjects, on the other hand.  

Is the privacy discourse new?

While interest in data privacy appears to be growing in both the public and private sectors, it is not a new topic. In 2018, the European Union introduced the General Data Protection Regulation (GDPR). However, this was not the first regulatory framework related to data privacy, and it will likely not be the last. For example, in 1980, as computers became increasingly used to process business transactions, Organisation for Economic Co-operation and Development (OECD) policy makers published a first set of guidelines on the protection of privacy and transborder flows of personal data. At the national level, data privacy regulation was introduced decades ago in several countries, including Japan, Sweden and the United States.

When looking at the concept of privacy more broadly, we see that the UN General Assembly adopted the Universal Declaration of Human Rights in 1948, with privacy enshrined in Article 12. This historic international document, which was created just three years after the UN was founded, demonstrates that privacy as a global standard has been important to the UN system for more than seven decades.

We believe that the UN’s role as a global privacy champion will grow in the digital world. And as part of this evolving role, it will be important to continue ensuring that no country or society is left behind. Much like the traditional notion of privacy established by the UN in 1948, data privacy must be a common standard of achievement for all people and all nations.

Privacy from within

With respect to the Organization’s internal operations, some notable progress has been made to advance data privacy work. For example, in 2018, the same year that GDPR came into effect, the UN High Level Committee on Management (HLCM) published Personal Data Protection and Privacy Principles for the entire UN system. These principles were designed to inform how to process personal data, defined as “information relating to an identified or identifiable natural person (“data subject”), by, or on behalf of, the United Nations System Organizations in carrying out their mandated activities”.

Currently, entities across the UN system are working on developing policies and programmes that can integrate the 2018 HLCM principles into their operations. And to further strengthen the Organization’s commitment to data privacy internally, the Secretary-General’s new Data Strategy has called for the integration of data protection and privacy into all current and future data-related work across the UN.

Thoughts on next steps

Regarding the application of data privacy measures within an organization, we would like to share three thoughts.

First, we believe that regulations associated with data privacy in an organization’s internal operations must include, from the very start, a multidisciplinary approach that brings together professionals from a variety of backgrounds, including law, policy, computer science, and management.

We stress this point because we believe that there is a bias in the technology industry, which perceives “regulation” as a matter that should be solely solved by lawyers and policy makers. While the role of such professionals is certainly crucial, it is important to recognize that regulation can also be achieved through digital infrastructures. Indeed, depending on their design, the technologies that we use to collect, discover and analyze data can significantly influence our behaviors and actions, including in the realm of data privacy. In this way, “regulation” can be achieved not only through law or policy, but also through a technology’s design. [1]

For this reason, we believe that creating an environment that protects the privacy of data subjects must be a collaborative, multi-stakeholder process, with the concept of “regulation” understood more broadly.

Second, we believe that to respect and protect the privacy of data subjects, all data work within an organization should be guided by the principle of “do no harm”. For example, in international affairs literature, the concept of “do no harm” stresses that certain forms of international support in post-conflict or development settings historically may have inadvertently weakened rather than strengthened local processes. The literature notes that, consequently, such efforts may have done more harm than good, even if unintentionally.

The principle of “do no harm” could also be useful to the data privacy discourse. As digital technologies increasingly enable us to connect and discover more data, committing to the “do no harm” principle provides us with an important ethical lens. Such a lens can help us be more mindful of unintended consequences that could potentially arise when working with ever-growing volumes of data – especially data we must keep protected and private. This means that whether we are creating, collecting, sharing, connecting, or analyzing data related to our work, we should keep the principle of “do no harm” in mind. And of course, this concept must be backed by smart regulation, as per our first point.

Third, and closely related to the principle of doing no harm, we believe that it is important to recognize the role of the individual in the responsible handling of private data. Whether it be in relation to data that each of us create or manage in our professions, or data we generate and share about ourselves and others as private citizens, our individual actions can have an impact on the protection and privacy of identifiable and other sensitive data. As such, each of us should become better informed about how our own unique actions affect data privacy, including through trainings, workshops, readings, and broad discussions with our peers and communities.

In conclusion, the topic of data privacy, while not new, is an increasingly important subject area. In line with the UN’s historical commitment to privacy, we, in OICT, will work closely with our partners across the UN Secretariat and beyond to help design and build digital infrastructures and policies that protect the privacy of data subjects. The principle of “do no harm” will provide us with an important ethical foundation throughout this journey. And we will also be mindful that, ultimately, data privacy awareness begins at the level of the individual.

*The views expressed in the blog piece are those of the authors and do not reflect the views of OICT or the UN.

[1] This broader, more collaborative approach to solving for data privacy is often referred to as “privacy by design”. For more on the concept of privacy by design, as well as the role of infrastructure as a form of regulation, we recommend exploring published works by Dr. Ann Cavoukian, Luk Arbuckle and Dr. Benedict Kingsbury.


Essay on Data Privacy

Students are often asked to write an essay on Data Privacy in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on Data Privacy

What is Data Privacy?

Data privacy is about keeping your personal information safe. It’s like a secret that you don’t want others to know. When you use the internet, you often share your details like your name, address, or credit card number. Companies collect this information to give you better services. But, if they don’t protect it well, bad people can steal it.

Why is Data Privacy Important?

Data privacy is very important because it protects you from harm. If bad people get your personal information, they can use it to steal your money or your identity. They can also use it to harm you in other ways. So, it’s important to keep your data safe.

How to Protect Your Data?

Protecting your data is not hard. You can do things like creating strong passwords, not sharing your personal details online, and only using secure websites. You should also be careful about what you post on social media. Remember, once you share something online, it’s hard to take it back.

Role of Companies in Data Privacy

Companies play a big role in data privacy. They collect your details and should keep them safe. They should tell you what they do with your data and ask for your permission. If they don’t, it’s not right. You should be careful about sharing your details with such companies.

250 Words Essay on Data Privacy

Data privacy is about keeping your personal information safe. It’s like keeping a secret that only you should know. In the digital world, this secret can be your name, address, phone number, or even things you like and dislike.

Imagine if your secret got out and people you don’t know started using it. It would feel bad, wouldn’t it? That’s why data privacy is important. It stops people from using your information in ways you don’t want.

How is Data Privacy Protected?

There are rules called ‘laws’ that tell companies how they can use your information. These laws are like a big fence that keeps your data safe. Companies must ask for your permission before they can use your data.

What Can We Do to Protect Our Data Privacy?

We can do a lot to protect our data. We can use strong passwords, not share too much information online, and always check if a website is safe before using it. We should also read and understand the privacy policies of websites and apps we use.

Data privacy is very important in our digital world. It’s about keeping our personal information safe from people who might misuse it. We can help protect our data privacy by being careful about what we share online and using safe and secure websites and apps.

500 Words Essay on Data Privacy

Introduction to Data Privacy

What is Personal Data?

Personal data is any information that can be used to identify you. This could be your name, address, phone number, or even the school you go to. When you use the internet, you often give out this information without even realizing it. For example, when you sign up for a new game or social media site, you often have to give them your email address or other personal details.

Imagine if someone you didn’t know could find out where you live, what school you go to, or even what your favorite food is, just by looking at the information you’ve shared online. This could be very scary and dangerous. That’s why data privacy is so important. It helps to keep your personal information safe, so you can use the internet without having to worry about people finding out things about you that you don’t want them to know.

Another way to protect your data privacy is by being careful about what information you share online. Before you give out your personal information, think about who will have access to it and what they might do with it. If you’re not sure, it’s always best to keep your information to yourself.

Data privacy is a very important part of using the internet safely. By understanding what personal data is and how to protect it, you can make sure your personal information stays safe. Remember, your personal data is like your secret treasure, and it’s up to you to keep it safe!



What is Data Protection?

Data protection is the process of protecting sensitive information from damage, loss, or corruption.

The amount of data being created and stored has increased at an unprecedented rate, making data protection increasingly important. In addition, business operations increasingly depend on data, and even a short period of downtime or a small amount of data loss can have major consequences for a business.

The implications of a data breach or data loss incident can bring organizations to their knees. Failure to protect data can cause financial losses, loss of reputation and customer trust, and legal liability, considering most organizations today are subject to some data privacy standard or regulation. Data protection is one of the key challenges of digital transformation in organizations of all sizes.

Therefore, most data protection strategies have three key focuses:

  • Data security – protecting data from malicious or accidental damage
  • Data availability – quickly restoring data in the event of damage or loss
  • Access control – ensuring that data is accessible to those who actually need it, and not to anyone else
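As a small illustration of the access-control focus, a minimal role-based check in Python (the roles, resources, and permission map are hypothetical, not any product's API):

```python
# Minimal role-based access control sketch: map roles to the (resource,
# action) pairs they are allowed, and check each request against that map.
ROLE_PERMISSIONS = {
    "analyst": {("customer_data", "read")},
    "dba": {("customer_data", "read"), ("customer_data", "write")},
}

def is_allowed(role: str, resource: str, action: str) -> bool:
    """Return True only if the role explicitly grants this action."""
    return (resource, action) in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("analyst", "customer_data", "write"))  # False
print(is_allowed("dba", "customer_data", "write"))      # True
```

Note the default-deny design: any role or action not explicitly listed is refused, which matches the "not to anyone else" requirement above.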

Elements of a data protection program

Principles of Data Protection

The basic tenet of data protection is to ensure data stays safe and remains available to its users at all times. These are the two key principles of data protection: data availability and data management.

Data availability ensures users can access the data they need to do business, even if the data is corrupted or lost.

Data management encompasses two main areas of data protection:

  • Data lifecycle management —automatically distributes important data to online and offline storage, depending on its context and sensitivity. In today’s big data environment, this includes methods of identifying valuable data and helping the business derive value from it, by opening it for reporting, analytics, development, and testing.
  • Information lifecycle management —assesses, classifies, and protects information assets to prevent application and user errors, malware or ransomware attacks, system crashes or malfunctions, and hardware failures.
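The tiering behavior described under data lifecycle management can be sketched in a few lines of Python; the 90-day threshold and the directory layout below are illustrative assumptions, not part of any particular product:

```python
import os
import shutil
import time

ARCHIVE_AGE_SECONDS = 90 * 24 * 3600  # files untouched for 90 days go offline

def tier_files(online_dir, archive_dir, now=None):
    """Move files not modified within the retention window to archive storage."""
    now = time.time() if now is None else now
    moved = []
    os.makedirs(archive_dir, exist_ok=True)
    for name in os.listdir(online_dir):
        path = os.path.join(online_dir, name)
        if os.path.isfile(path) and now - os.path.getmtime(path) > ARCHIVE_AGE_SECONDS:
            shutil.move(path, os.path.join(archive_dir, name))
            moved.append(name)
    return moved
```

A real lifecycle engine would also consult classification labels and access frequency rather than modification time alone, but the shape of the policy is the same.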

Enterprise Data Protection Trends

The latest trends in data protection policy and technology include the following:

Hyper-Convergence

With the advent of hyper-converged systems, vendors are introducing devices that can provide backup and recovery in one device that integrates compute, networking, and storage infrastructure. Hyper-converged systems are replacing many devices in the traditional data center, and providing cloud-like capabilities on-premises.

Ransomware Protection

Ransomware is a type of malware that infects a system, encrypts its data, and demands a ransom fee to release it. Traditional backup methods are useful for protecting data from ransomware. However, new types of ransomware are able to infect backup systems as well, rendering them useless. This makes it very difficult to restore the original version of the data.

To solve this problem, new backup solutions are designed to be completely isolated from the corporate network, and use other measures, like data encryption at rest, to prevent ransomware from infecting backups.
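Isolation can be paired with integrity verification: recording a cryptographic digest when a backup is written and re-checking it before restore makes silent tampering with backup files detectable. A minimal sketch using Python's standard hashlib (the helper names are ours, not any vendor's API):

```python
import hashlib

def fingerprint(path: str) -> str:
    """Return the SHA-256 digest of a backup file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(path: str, expected_digest: str) -> bool:
    """True if the backup still matches the digest recorded at backup time."""
    return fingerprint(path) == expected_digest
```

For this check to be trustworthy, the recorded digests must themselves live outside the reach of an attacker who can modify the backups.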

Disaster Recovery as a Service

Disaster Recovery as a Service (DRaaS) is a cloud-based solution that allows an organization to create a remote copy of local systems or even an entire data center, and use it to restore operations in case of disaster. DRaaS solutions continuously replicate data from the local data center to provide a low recovery time objective (RTO), meaning they can spring into action within minutes or seconds of a disastrous failure.

Copy Data Management (CDM)

CDM solutions simplify data protection by reducing the number of copies of data stored by the organization. This reduces overhead, maintenance, and storage costs. Through automation and centralized management, CDM can accelerate development lifecycles and increase the productivity of many business processes.

Data Protection Strategy

Every organization needs a data protection strategy. Here are a few pillars of a robust strategy:

Audit of Sensitive Data

This involves assessing internal and external risks and defining a data protection policy, a security strategy, and a compliance strategy.

Before adopting data protection controls, you must first perform an audit of your data. Identify data sources, data types, and storage infrastructure used throughout the organization.

Classify data into sensitivity levels, and see what data protection measures already exist in the organization, how effective they are, and which can be extended to protect more sensitive data. Often, the biggest potential is in leveraging existing data protection systems that are “lying around” or are not used consistently throughout the organization.
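The classification step described above can be prototyped with simple pattern matching; the two patterns below are deliberately crude illustrations, since real discovery tools rely on checksums, context, and machine learning rather than bare regexes:

```python
import re

# Illustrative patterns only: flag likely email addresses and
# card-like digit sequences in free text.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text: str) -> set:
    """Return the set of sensitive-data labels detected in the text."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}

print(classify("Contact alice@example.com"))  # {'email'}
print(classify("Card: 4111 1111 1111 1111"))  # {'card_number'}
```

Scanning samples of each data source with a detector like this gives a first cut at sensitivity levels, which the audit can then refine manually.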

The security team in the organization should regularly assess security risks that may arise inside and outside the organization. Data protection programs must be designed around these known risks.

Internal risks include errors in IT configuration or security policies, the lack of strong passwords, poor authentication, and user access management, and unrestricted access to storage services or devices. A growing threat is malicious insiders or compromised accounts that have been taken over by threat actors.

External risks include social engineering strategies such as phishing, malware distribution, and attacks on corporate infrastructure such as SQL injection or distributed denial of service (DDoS). These and many other security threats are commonly used by attackers to gain unauthorized access to sensitive data and exfiltrate it.

Based on the organization’s analysis of its data assets, and the most relevant threats, it should develop a data protection policy that determines:

  • The tolerance for risk for every data category —data protection has a cost, and protection measures must be applied in accordance with the sensitivity of the data.
  • Authorization and authentication policy —use best practices and historical information to identify which business applications or user accounts should have access to sensitive data .

With respect to data protection, an organization’s security strategy should:

  • Take measures to prevent threat actors from accessing sensitive data
  • Ensure that security measures do not hurt productivity or prevent employees from accessing data when and where they need it
  • Manage backups effectively to prevent ransomware or other threats, and ensure constant data availability

Finally, a data protection strategy must consider compliance obligations. Organizations or specific business units may be subject to a variety of regulations or industry-specific compliance standards. Below are the most significant regulations affecting data protection today.

European Union (EU): the GDPR

The General Data Protection Regulation (GDPR) applies to all organizations that do business with EU citizens, regardless of whether the company is located inside or outside the EU. Failure to comply can result in fines of up to 4% of annual worldwide turnover or 20 million euros, whichever is higher. The GDPR protects personal data such as name, ID number, date of birth, address, web analytics data, medical information, and biometric data.

Data protection laws in the USA

The USA does not have a sweeping regulation equivalent to GDPR, but it does have several regulations that affect data protection:

  • The Federal Trade Commission Act requires organizations to respect consumer privacy and adhere to privacy policies.
  • The Health Insurance Portability and Accountability Act (HIPAA) regulates the storage, confidentiality, and use of health information.
  • The Gramm Leach Bliley Act (GLBA) regulates the collection and storage of personal data by financial institutions.
  • The California Consumer Privacy Act (CCPA) protects the right of California residents to access their personal information, ask for deletion, and request that their personal data will not be collected or resold.

Data protection laws in Australia

The Australian Prudential Regulatory Authority (APRA) introduced a mandatory data privacy regulation called CPS 234 in 2019. CPS 234 requires organizations to improve information security measures to protect personal data from attacks.

CPS 234 applies to authorised deposit-taking institutions (ADIs), general insurance companies, life insurance companies, private health insurance organizations, and RSE (registrable superannuation entity) licensees.

Data Protection with Imperva

Imperva’s data security solution protects your data wherever it lives—on-premises, in the cloud, and in hybrid environments. It also provides security and IT teams with full visibility into how the data is being accessed, used, and moved around the organization.

Our comprehensive approach relies on multiple layers of protection, including:

  • Database firewall —blocks SQL injection and other threats, while evaluating for known vulnerabilities.
  • User rights management —monitors data access and activities of privileged users to identify excessive, inappropriate, and unused privileges.
  • Data masking and encryption —obfuscates sensitive data so it would be useless to the bad actor, even if somehow extracted.
  • Data loss prevention (DLP) —inspects data in motion, at rest on servers, in cloud storage, or on endpoint devices.
  • User behavior analytics —establishes baselines of data access behavior, uses machine learning to detect and alert on abnormal and potentially risky activity.
  • Data discovery and classification —reveals the location, volume, and context of data on-premises and in the cloud.
  • Database activity monitoring —monitors relational databases, data warehouses, big data, and mainframes to generate real-time alerts on policy violations.
  • Alert prioritization —Imperva uses AI and machine learning technology to look across the stream of security events and prioritize the ones that matter most.
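As a small illustration of the masking layer listed above, a sketch that obfuscates all but the last four digits of a card number (a simplified stand-in for real masking products, which handle many data types and formats):

```python
def mask_card(number: str) -> str:
    """Replace all but the last four digits with '*', keeping separators."""
    digits = [c for c in number if c.isdigit()]
    keep = len(digits) - 4  # how many leading digits to mask
    out, seen = [], 0
    for c in number:
        if c.isdigit():
            out.append(c if seen >= keep else "*")
            seen += 1
        else:
            out.append(c)  # preserve dashes and spaces for readability
    return "".join(out)

print(mask_card("4111-1111-1111-1111"))  # ****-****-****-1111
```

The masked value keeps just enough structure to be useful for support and analytics while being useless to an attacker who extracts it.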




DATA PROTECTION ESSAY---SUBMITTED VERSION

Moses Fabiyi

Related Papers

In this paper I will offer several arguments in support of the view that individuals have moral claims to control personal information. Coupled with rights to control access to one's body, capacities, and powers, or physical privacy rights, we will have taken important steps toward a general right to privacy. In Part I, a definition of privacy is offered along with an account of the value of privacy. Simply put, privacy, defined as control over access to locations and information, is necessary for human well-being. In Part II, an attempt to move beyond claims of value to claims of obligation is presented and defended. Policies that sanction the capturing, storing, and trading of personal information about others are something we each have reasons to avoid. In the final part, the tension between privacy and security is considered. It is argued that privacy rights may be set aside only if specific procedural conditions are followed.


Bhavana Muralidhar

Ján Valášek

“Is privacy just something that is interesting from a theoretical perspective, a ‘right’ that matters only on paper or in the minds of philosophers? Is it something that people care about when you ask them, but do not care about enough that they will actually change their habits or actions if it is at all inconvenient?”, Internet Privacy Rights (Cambridge University Press 2014)

Journal of Siberian Federal University. Humanities & Social Sciences

Tigran Oganesian

The article considers the legality of mass surveillance and protection of personal data in the context of the international human rights law and the right to respect for private life. Special attention is paid to the protection of data on the Internet, where the personal data of billions of people are stored. The author emphasizes that mass surveillance and technology that allows the storage and processing of the data of millions of people pose a serious threat to the right to privacy guaranteed by Article 8 of the ECHR of 1950. Few companies comply with the human rights principles in their operations by providing user data in response to requests from public services. In this regard, States must prove that any interference with the personal integrity of an individual is necessary and proportionate to address a particular security threat. Mandatory data storage, where telephone companies and Internet service providers are required to store metadata about their users’ communications ...

Article Journal

Haekal Al Asyari

It is well accepted that all aspects of society have been affected by digitization. Issues of privacy and data protection have exceeded individual interests and constitute major challenges in recent times. Due to the complex interrelations between state, businesses, and citizens, data protection has become a shared concern and responsibility. The protection of personal data that competes with collective interests of society presents a public-good dilemma. Building on the study of Fairfield and Engel, who established privacy as a public good, this study dwells on further inquiry in the context of the legal policy of data protection as a public good. It discusses in more detail the concept of personal data protection laws between the public and private spheres. Using a normative methodology based on existing literature, this article re-elaborates the understanding of privacy as a public good. It further explains the common interest shared in privacy and provides a contextual example of data protection policy in the European Union. Lastly, it discusses the interrelations between actors of data protection and the shared interests between them.

The Daily Observer

Dr. Md. Toriqul Islam

The need for comprehensive privacy protection law

Protection of privacy has received utmost priority in political discourse and policy making in the contemporary world. The United Nations in Resolution No. 68/167 (2013) affirmed that 'the rights held by people offline must also be protected online'. Nearly 121 countries have enacted data privacy legislation, and over 130 countries have recognised privacy in their Constitutions globally. Privacy is an elusive and ill-defined idea, as people from diverse nations, cultures and times interpret and evaluate privacy differently. Indeed, there is no one-size-fits-all definition of 'privacy'. However, the noted scholars Warren and Brandeis termed privacy 'the right to be let alone'. Privacy is an important human right and is often blended with numerous other rights and guarantees, e.g., the right to life, personal liberty, well-being, bodily and social autonomy, security, and safety. Hence, from the cradle to the grave, privacy must be respected unless there is a compelling State interest. In such a context, Bangladesh has enacted the Digital Security Act, 2018. Let us examine the provisions of the Digital Security Act, 2018 and other prevailing laws of Bangladesh.

Gergely László Szőke

37th IBIMA Conference Proceedings, 2021, Cordoba, Spain

ALINA GENTIMIR

The current paper aims to underline the new dimensions of the right to privacy in the field of personal data protection, within the context of the ascending limitations imposed by security requirements in the era of globalization. Its premise is that technological and communications developments, within the general context of globalization, entail an extension of privacy at every level of analysis: national, European, and global. At the same time, the geographic area in which privacy protection regulations apply is enlarging, particularly with the appearance and evolution of specific grounds such as medical crises, cybercrime, and mass surveillance, grounds which justify new and distinctive regulations. For a comprehensive analysis, using a research methodology based on the provisions of legal science concerning personal data protection, together with the logical-legal, historical, comparative-legal, and sociological research methods, the paper sets out the relevant conceptual delimitations involved in the most recent challenge to the right to privacy: the protection of personal data. Beyond their strategic relevance, Council of Europe, European Union, and United Nations legal documents provide fundamental rules for personal data protection, such as the principles of lawfulness, fairness, and transparency of processing, purpose limitation, data minimization, data accuracy, storage limitation, data security, and accountability.
Taking into consideration the current European and international level of regulation of personal data protection, and the high frequency of cases in which privacy is violated both intentionally and unintentionally, the main conclusion is that, unfortunately, the balance between the protection of security and the safeguarding of human rights is seriously broken.

In: Keresztes, Gábor (ed.): Tavaszi Szél 2016 Tanulmánykötet I., Budapest, Doktoranduszok Országos Szövetsége

Adrienn Hadady-Lukacs

Rakesh Chandra

The Supreme Court's judgment of August 2017 is indeed a landmark in the fundamental rights jurisprudence of India. The Court unanimously declared that the right to privacy is a fundamental right and recognised informational privacy as one of its facets. The right to informational privacy requires that an individual be able to affirmatively control her life and personality by controlling her personal information. This implies that the law must guarantee an individual the ability to exercise control over the collection, use, and disclosure of her personal information. In the digital age this right assumes enormous significance, as people share huge volumes of personal information to access digitised services. At present, State and non-State actors alike have access to a citizen's personal data and activities, such as biometric information, internet-use patterns, geometric information, and financial information. All this data is out there without a law securing its integrity and protecting the data subject against harm. As the Supreme Court noted, this must be achieved by enacting an effective data protection law for India. Undoubtedly, a comprehensive law on data protection to safeguard an individual's right to privacy is imperative. On July 31, the central government set up a five-member committee chaired by former Supreme Court judge Justice (retd.) B.N. Srikrishna to draw up a draft Data Protection Bill. One of the primary guides for the committee will be the exhaustive report submitted in October 2012 by a group of experts on privacy led by former Delhi High Court Chief Justice A.P. Shah, which was constituted by the erstwhile Planning Commission. Both the government and the Court have agreed that this would be the "conceptual foundation for protecting privacy" in the form of the new Data Protection Bill.
In the last few months we have read a great deal about the Aadhaar data leak and about hacking in banks and other financial institutions. We are also aware that nearly all the social networking sites have their headquarters in the U.S.A. and control all the data available on their sites. Furthermore, in India the maintenance and upkeep of data is in a pathetic state, reminiscent of our colonial legacy. All these factors pose an important and disturbing question: do we need only a strong data protection law? Any such law, however strong and comprehensive it might be, cannot account for the human element in the upkeep and security of data. We badly need a modern, scientific infrastructure for the upkeep of data and highly trained staff for the job. We also have to develop appropriate technology for indigenous social networking sites so that our data remains in our own hands, in our own country. Lastly, in this age of Artificial Intelligence, we can also consider taking recourse to it in the maintenance and upkeep of this huge volume of data. Only then will the law prove effective in protecting our data and our privacy too. Keywords: Right to Privacy, Informational Privacy, Committee on Data Protection Bill, Data Protection Law.




The General Data Protection Regulation (GDPR)

  • Categories: Security

Words: 763 | Pages: 2 | 4 min read | Published: Jan 4, 2019

  • How “personal data” is defined
  • Appointing a Data Protection Officer
  • Privacy Impact Assessments
  • Documenting “valid consent”
  • Requesting the “right to be forgotten”
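Two of the topics above, documenting "valid consent" and honouring the "right to be forgotten", can be illustrated with a minimal sketch. The sketch below is purely hypothetical: `ConsentLog`, `record_consent`, and `forget_subject` are invented names, not part of any GDPR compliance API, and a real system would need durable, auditable storage rather than an in-memory list.

```python
from datetime import datetime, timezone

class ConsentLog:
    """Illustrative, in-memory record of consent events (hypothetical sketch)."""

    def __init__(self):
        self._records = []  # one entry per consent event

    def record_consent(self, subject_id, purpose, granted=True):
        # Document who consented, to what purpose, and when (UTC timestamp).
        self._records.append({
            "subject": subject_id,
            "purpose": purpose,
            "granted": granted,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def has_consent(self, subject_id, purpose):
        # The most recent event for this subject/purpose decides,
        # so a later withdrawal overrides an earlier grant.
        for rec in reversed(self._records):
            if rec["subject"] == subject_id and rec["purpose"] == purpose:
                return rec["granted"]
        return False

    def forget_subject(self, subject_id):
        # "Right to be forgotten": erase every record tied to the subject.
        self._records = [r for r in self._records
                         if r["subject"] != subject_id]

log = ConsentLog()
log.record_consent("user-42", "marketing-email")
print(log.has_consent("user-42", "marketing-email"))  # True
log.forget_subject("user-42")
print(log.has_consent("user-42", "marketing-email"))  # False
```

The design point the sketch makes is that consent must be recorded as dated events (so it can be evidenced later) and that erasure must remove every trace of a data subject, not merely flag the account.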




2024 Next Generation of Antitrust, Data Privacy and Data Protection Scholars Conference

This day-long Next Generation of Antitrust, Data Privacy & Data Protection Scholars Conference provides an opportunity for professors in law, economics, accounting, finance, management, information systems, operations management, and marketing who began their full-time tenure-track careers in or after 2016 to present their latest research. Senior scholars and practitioners in the field will comment on the papers.

Continuing Legal Education (CLE) and Sign-In for MCLE (U.S. CLE)

The ABA will seek 5.25 hours of CLE credit for this program in 60-minute states and 6.3 hours of CLE credit in 50-minute states. Credit hours are estimated and are subject to each state's approval and credit rounding rules.

Registration

The conference is free and open to the public. For registration and more information, visit the ABA Section of Antitrust Law.

© 2024 New York University School of Law. 40 Washington Sq. South, New York, NY 10012.   Tel. (212) 998-6100

IMAGES

  1. Data Protection Essay

    essay on data protection

  2. SOLUTION: Part 1 data protection explained

    essay on data protection

  3. Digital Privacy and the Right to be Protected Free Essay Example

    essay on data protection

  4. Data Protection

    essay on data protection

  5. (DOC) DATA PROTECTION ESSAY---SUBMITTED VERSION

    essay on data protection

  6. AACSB Standard: Global, Data Protection

    essay on data protection

VIDEO

  1. Online Privacy And Security Protection |Digital Empowerment

  2. #essay #factbook #facts #essaywriting #climatechange #climate #airpollution #smog #css #css2025 #pms

  3. #essaydata #factbook #facts #essaywriting #airpollution #climatechange #climate #environmental #css

  4. IOM Director General on Data Protection and Privacy (FR)

  5. UTS Essay Data Mining-Daffa Rifqi Abyansyah

  6. AFRI BABA DIGITAL SOLUTIONS about, Market Analysis, POlls, Consumer Insight

COMMENTS

  1. The New Rules of Data Privacy

    Firms that generate any value from personal data will need to change the way they acquire it, share it, protect it, and profit from it. They should follow three basic rules: 1) consistently ...

  2. Complete and Effective Data Protection

    1. Introduction. The right to data protection enjoys a privileged position in the EU legal order. 1 The right is strictly interpreted by the Court of Justice of the EU (CJEU) and is given remarkable weight when balanced with other rights and interests. 2 While data protection sits alongside the more established right to respect for private life in the EU Charter, 3 it is data protection rather ...

  3. PDF PART 1: Data Protection, Explained

    A Guide for Policy Engagement on Data Protection. PART 1: plainedData Protection, ExplainedWhat is Data Protection?Data protection is commonly defined as the law desi. ned to protect your personal data. In modern societies, in order to empower us to control our data and to protect us from abuses, it is essential that data protection laws ...

  4. Conceptualising the right to data protection in an era of Big Data

    The Data Protection Directive of 1995 made no mention to a human right to data protection. As Van der Sloot has argued, the original rules in the Data Protection Directive and related rules 'could best be regarded as principles of good governance', as they were not framed as relating to the human rights of the data subject, but rather focussed on the procedural obligations of controllers ...

  5. Privacy matters because it empowers us all

    But your data is not valuable only because it can be sold. Facebook does not technically sell your data, for instance. Nor does Google. They sell the power to influence you. They sell the power to show you ads, and the power to predict your behaviour. Google and Facebook are not really in the business of data - they are in the business of power.

  6. A Review of Data Protection Regulations and the Right to Privacy: the

    across borders through our screens. Along with the rise of cyber warfare and terrorism, unwanted. surveillance, and collecting people's personal data without proper data protection regulation, the. right to privacy is at risk. In a 2021 Data Breach Investigation Report (DBIR) by Verizon, one of the largest.

  7. Data Privacy: Safeguarding Personal Information in The Digital Age

    The question of why data privacy is important essay arises as we navigate the complex landscape of data collection and usage. This essay will delve into the significance of data privacy, exploring its implications for individuals, businesses, and society as a whole. By understanding the importance of data privacy, we can ensure the protection ...

  8. Data protection, scientific research, and the role of information

    Introduction. This paper aims to critically assess the information duties set out in the General Data Protection Regulation (GDPR) and national adaptations when the purpose of processing is scientific research. Due to the peculiarities of the legal regime applicable to the research context, information about the processing plays a crucial role ...

  9. Full article: The European Union general data protection regulation

    Data protection by design broadly means that privacy rules are implemented in the technical infrastructure, for example pseudonymization and data minimization. Footnote 167 Data protection by default indicates that privacy-enhancing choices are made the default in the technical infrastructure, while data subjects can change the default. Data ...

  10. Data Privacy Perception, Protection and Assessment Essay

    In legal terms, privacy refers to the right to make an important decision that concerns personal issues free from state compulsion, pressure, and regulation. Therefore, privacy is related to the welfare of autonomy, self-determination, and self-respect. However, under the common law, privacy means the right to be alone and have the liberty to ...

  11. Reflections on Data Privacy

    The topic of privacy is gaining momentum in the mainstream big data discourse. Governments, policy makers, businesses, academics, and civil society are increasingly reflecting on how to strike the right balance between the potential benefits of exponential data collection and discovery, on the one hand, and the protection and privacy of data ...

  12. Essay on Data Privacy

    Data privacy is about keeping your personal information safe. It's like a secret that you don't want others to know. When you use the internet, you often share your details like your name, address, or credit card number. Companies collect this information to give you better services. But, if they don't protect it well, bad people can ...

  13. What is Data Protection

    What is Data Protection. Data protection is the process of protecting sensitive information from damage, loss, or corruption. As the amount of data being created and stored has increased at an unprecedented rate, making data protection increasingly important. In addition, business operations increasingly depend on data, and even a short period ...

  14. "Protecting Privacy in the Age of AI: Understanding Data Security

    This essay aims to explore the challenges surrounding data privacy and security issues within AI-powered technology from both legal and ethical perspectives. ... seeks to provide a comprehensive ...

  15. General Data Protection Regulation: [Essay Example], 777 words

    It is a European Union directive adopted in 1995 which regulates the processing of personal data within the European Union. The General Data Protection Regulation (GDPR) which was adopted in April 2016 will replace the Data Protection Directive and will be enforceable from May 2018. GDPR is a regulation by which the European Parliament, the ...

  16. Concepts of Privacy and Data Protection

    Even a single profile can provide up to 1500 data points. This can incorporate an individual's sexuality, browse history, political connection, and hospital records. In this document, we aim to investigate this discussion, focusing on the related however distinct concepts of privacy and data protection.

  17. Protection Against Data Breaches: [Essay Example], 539 words

    And the volume and velocity is also increasing. The bad news is that organizations of all verticals and sizes are being hit with data breaches. Ponemon reports the average total cost of a data breach rose from $3.62 to $3.86M, an increase of 6.4 percent. However, the same study reports companies that contained a breach in less than 30 days ...

  18. Data Protection Essay Examples

    Data Protection Essays. Enhancing Hypervisor Security: Insights From Recent Research. Overview Article The article by Aalam et al. (2021) outlines a hypervisor, a piece of programming or an application for PCs that empowers clients to make and oversee numerous concurrent virtual PCs. The abbreviation VMM, which represents virtual machine ...

  19. DATA PROTECTION ESSAY---SUBMITTED VERSION

    Undoubtedly, a comprehensive law on data protection to safeguard an individual's right to privacy is imperative. On July 31, the central government set-up a five member committee chaired by former Supreme Court Judge, Justice (retd.) B.N. Srikrishna, to draw up a draft Data Protection Bill.

  20. Data Protection Essay

    STUDENT ID: u1627043 WORD COUNT: 3212 Critically evaluate the difference that the new EU Data Protection Regulation will make in Europe. WHAT IS GDPR AND ITS AIMS The EU Data Protection Regulation (GDPR) 1 was recently adopted to replace the Data Protection Directive (Directive) 2.

  21. The General Data Protection Regulation (gdpr)

    Published: Jan 4, 2019. The General Data Protection Regulation (GDPR) is coming, and it will have an impact on your business whether you're operating within the UK or the European Union (EU).As of May 25th 2018, the current Data Protection Act will be updated and replaced with the GDPR. Not only will the new regulation detail existing laws ...

  22. A Study On Data Protection Act Social Policy Essay

    This policy was designed to conform to the (ref 1) Data Protection Act of 1998. This Act ensures client confidentiality and any information written about a client is accurate, truthful and any opinions are objective, substantiated by factual evidence. The Act also allows the individual to make a formal application to see the information held on ...

  23. Overview of the Data Protection Act

    Legislation governs communication in Ireland concerned with communication freedom of information act. It was amended in 2003 and is called the Data Protection Act. What is the Data Protection. When you give personal details to an organisation or individual, they have a duty to keep these details private and safe.

  24. 2024 Next Generation of Antitrust, Data Privacy and Data Protection

    Senior scholars and practitioners in the field will comment on the papers. Continuing Legal Education (CLE) and Sign-In for MCLE (U.S. CLE) The ABA will seek 5.25 hours of CLE credit in 60-minute states and 6.3 hours of CLE credit for this program in 50-minute states.