Paul Voigt, Axel von dem Bussche: The EU ePrivacy Regulation – Preliminary Guidance and Commentary

Article 8 ePrivacy Regulation – Protection of end-users’ terminal equipment information

1. The use of processing and storage capabilities of terminal equipment and the collection of information from end-users’ terminal equipment, including about its software and hardware, other than by the end-user concerned shall be prohibited, except on the following grounds:

(a) it is necessary for the sole purpose of providing an electronic communication service; or

(b) the end-user has given consent; or

(c) it is strictly necessary for providing a service specifically requested by the end-user; or

(d) if it is necessary for the sole purpose of audience measuring, provided that such measurement is carried out by the provider of the service requested by the end-user, or by a third party, or by third parties jointly on behalf of or jointly with the provider of the service requested, provided that, where applicable, the conditions laid down in Articles 26 or 28 of Regulation (EU) 2016/679 are met; or

(da) it is necessary to maintain or restore the security of information society services or terminal equipment of the end-user, prevent fraud or prevent or detect technical faults for the duration necessary for that purpose; or

(e) it is necessary for a software update provided that:

(i) such update is necessary for security reasons and does not in any way change the privacy settings chosen by the end-user,

(ii) the end-user is informed in advance each time an update is being installed, and

(iii) the end-user is given the possibility to postpone or turn off the automatic installation of these updates; or

(f) it is necessary to locate terminal equipment when an end-user makes an emergency communication either to the single European emergency number ‘112’ or a national emergency number, in accordance with Article 13(3).

(g) where the processing for a purpose other than that for which the information has been collected under this paragraph is not based on the end-user’s consent or on a Union or Member State law which constitutes a necessary and proportionate measure in a democratic society to safeguard the objectives referred to in Article 11, the person using processing and storage capabilities or collecting information processed by, emitted by or stored in the end-users’ terminal equipment shall, in order to ascertain whether processing for another purpose is compatible with the purpose for which the electronic communications data are initially collected, take into account, inter alia:

(i) any link between the purposes for which the processing and storage capabilities have been used or the information have been collected and the purposes of the intended further processing;

(ii) the context in which the processing and storage capabilities have been used or the information have been collected, in particular regarding the relationship between end-users concerned and the provider;

(iii) the nature of the processing and storage capabilities or of the collecting of information as well as the modalities of the intended further processing, in particular where such intended further processing could reveal categories of data pursuant to Article 9 or 10 of Regulation (EU) 2016/679;

(iv) the possible consequences of the intended further processing for end-users;

(v) the existence of appropriate safeguards, such as encryption and pseudonymisation.

(h) Such further processing in accordance with paragraph 1 (g), if considered compatible, may only take place provided that:

(i) the information is erased or made anonymous as soon as it is no longer needed to fulfil the purpose,

(ii) the processing is limited to information that is pseudonymised, and

(iii) the information is not used to determine the nature or characteristics of an end-user or to build a profile of an end-user.

(i) For the purposes of paragraph 1 (g) and (h), data shall not be shared with any third parties unless the conditions laid down in Article 28 of Regulation (EU) 2016/679 are met, or the data is made anonymous.

2. The collection of information emitted by terminal equipment of the end-user to enable it to connect to another device and/or to network equipment shall be prohibited, except on the following grounds:

(a) it is done exclusively in order to, for the time necessary for, and for the purpose of establishing or maintaining a connection; or

(b) the end-user has given consent; or

(c) it is necessary for statistical purposes, is limited in time and space to the extent necessary for this purpose, and the data is made anonymous or erased as soon as it is no longer needed for this purpose; or

(d) it is necessary for providing a service requested by the end-user.

2a. For the purpose of paragraph 2 points (b) and (c), a clear and prominent notice shall be displayed, informing of, at least, the modalities of the collection, its purpose, the person responsible for it and the other information required under Article 13 of Regulation (EU) 2016/679 where personal data are collected, as well as any measure the end-user of the terminal equipment can take to stop or minimise the collection.

2b. For the purpose of paragraph 2 points (b) and (c), the collection of such information shall be conditional on the application of appropriate technical and organisational measures to ensure a level of security appropriate to the risks, as set out in Article 32 of Regulation (EU) 2016/679.

3. The information to be provided pursuant to paragraph 2a may be provided in combination with standardized icons in order to give a meaningful overview of the collection in an easily visible, intelligible and clearly legible manner.

4. The Commission shall be empowered to adopt delegated acts in accordance with Article 25 determining the information to be presented by the standardized icon and the procedures for providing standardized icons.

(20) Terminal equipment of end-users of electronic communications networks and any information relating to the usage of such terminal equipment, in particular where such information is processed by, stored in, or collected from such equipment, or where information is collected from it or processed in order to enable it to connect to another device and/or network equipment, are part of the end-user’s private sphere, including the privacy of one’s communications, and require protection in accordance with the Charter of Fundamental Rights of the European Union. Given that such equipment contains or processes information that may reveal details of an individual’s emotional, political and social complexities, including the content of communications, pictures, the location of individuals by accessing the device’s GPS capabilities, contact lists, and other information already stored in the device, the information related to such equipment requires enhanced privacy protection. Furthermore, so-called spyware, web bugs, hidden identifiers, tracking cookies and other similar unwanted tracking tools can enter end-users’ terminal equipment without their knowledge in order to gain access to information, to store hidden information and to trace their activities. Information related to the end-user’s device may also be collected remotely for the purpose of identification and tracking, using techniques such as so-called ‘device fingerprinting’, often without the knowledge of the end-user, and may seriously intrude upon the privacy of these end-users. Techniques that surreptitiously monitor the actions of end-users, for example by tracking their activities online or the location of their terminal equipment, or subvert the operation of the end-users’ terminal equipment pose a serious threat to the privacy of end-users.
Therefore, the use of processing and storage capabilities and the collection of information from end-users’ terminal equipment should be allowed only with the end-user’s consent and/or for other specific and transparent purposes as laid down in this Regulation. The information collected from end-users’ terminal equipment can often contain personal data.

(20aa) In light of the principle of purpose limitation laid down in Article 5 (1) (b) of Regulation (EU) 2016/679, it should be possible to process in accordance with this Regulation data collected from the end-user’s terminal equipment for purposes compatible with the purpose for which it was collected from the end-user’s terminal equipment.

(20aaa) The responsibility for obtaining consent for the storage of a cookie or similar identifier lies with the entity that makes use of processing and storage capabilities of terminal equipment or collects information from end-users’ terminal equipment, such as an information society service provider or ad network provider. Such entities may request another party to obtain consent on their behalf. The end-user’s consent to the storage of a cookie or similar identifier may also entail consent for subsequent readings of the cookie in the context of a revisit to the same website domain initially visited by the end-user.

(20aaaa) In contrast to access to website content provided against monetary payment, where access is provided without direct monetary payment and is made dependent on the consent of the end-user to the storage and reading of cookies for additional purposes, requiring such consent would normally not be considered as depriving the end-user of a genuine choice if the end-user is able to choose, on the basis of clear, precise and user-friendly information about the purposes of cookies and similar techniques, between an offer that includes consenting to the use of cookies for additional purposes on the one hand, and an equivalent offer by the same provider that does not involve consenting to data use for additional purposes, on the other hand. Conversely, in some cases, making access to website content dependent on consent to the use of such cookies may be considered, in the presence of a clear imbalance between the end-user and the service provider, as depriving the end-user of a genuine choice. This would normally be the case for websites providing certain services, such as those provided by public authorities. Similarly, such an imbalance could exist where the end-user has only few or no alternatives to the service, and thus has no real choice as to the usage of cookies, for instance in the case of service providers in a dominant position.

To the extent that use is made of processing and storage capabilities of terminal equipment, or information from end-users’ terminal equipment is collected, for purposes other than what is necessary for providing an electronic communication service or for the provision of the service requested, consent should be required. In such a scenario, consent should normally be given by the end-user who requests the service from the provider of the service.

(20a) End-users are often requested to provide consent to the storage of and access to stored data in their terminal equipment, due to the ubiquitous use of tracking cookies and similar tracking technologies. As a result, end-users may be overloaded with requests to provide consent. This can lead to a situation where consent request information is no longer read and the protection offered by consent is undermined. The implementation of technical means in electronic communications software to provide specific and informed consent through transparent and user-friendly settings can be useful to address this issue. Where available and technically feasible, an end-user may therefore grant, through software settings, consent to a specific provider for the use of processing and storage capabilities of terminal equipment for one or multiple specific purposes across one or more specific services of that provider. For example, an end-user can give consent to the use of certain types of cookies by whitelisting one or several providers for their specified purposes. Providers of software are encouraged to include settings in their software which allow end-users, in a user-friendly and transparent manner, to manage consent to the storage of and access to stored data in their terminal equipment by easily setting up and amending whitelists and withdrawing consent at any moment. In light of the end-user’s self-determination, consent directly expressed by an end-user should always prevail over software settings. Any consent requested and given by an end-user to a service should be directly implemented, without any further delay, by the applications of the end-user’s terminal. The same should apply if the storage of information or the access to information already stored in the end-user’s terminal equipment is permitted.
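The whitelist mechanism envisaged in this recital can be pictured as a per-provider, per-purpose consent store with withdrawal possible at any moment. The following Python sketch is purely illustrative: the Regulation prescribes no data model or API, and the provider names, purpose labels and method names below are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Set


@dataclass
class ConsentWhitelist:
    """Illustrative per-provider, per-purpose consent store (cf. Recital 20a)."""
    # maps a provider to the set of purposes the end-user has consented to
    grants: Dict[str, Set[str]] = field(default_factory=dict)

    def grant(self, provider: str, purpose: str) -> None:
        """Record consent for one specific provider and one specific purpose."""
        self.grants.setdefault(provider, set()).add(purpose)

    def withdraw(self, provider: str, purpose: Optional[str] = None) -> None:
        """Withdraw consent at any moment, per purpose or entirely."""
        if provider in self.grants:
            if purpose is None:
                del self.grants[provider]
            else:
                self.grants[provider].discard(purpose)

    def is_permitted(self, provider: str, purpose: str) -> bool:
        """Check whether a given use is covered by recorded consent."""
        return purpose in self.grants.get(provider, set())


wl = ConsentWhitelist()
wl.grant("news.example", "audience measurement")
print(wl.is_permitted("news.example", "audience measurement"))  # True
print(wl.is_permitted("news.example", "ad profiling"))          # False
wl.withdraw("news.example")
print(wl.is_permitted("news.example", "audience measurement"))  # False
```

Note that, per the recital, consent expressed directly by the end-user to a specific service would prevail over whatever such software settings record.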

(21) Use of the processing and storage capabilities of terminal equipment or access to information stored in terminal equipment without the consent of the end-user should be limited to situations that involve no, or only very limited, intrusion of privacy. For instance, consent should not be requested for authorizing the technical storage or access which is necessary and proportionate for the purpose of providing a specific service requested by the end-user. This may include the storing of cookies for the duration of a single established session on a website to keep track of the end-user’s input when filling in online forms over several pages, authentication session cookies used to verify the identity of end-users engaged in online transactions, or cookies used to remember items selected by the end-user and placed in a shopping basket. In the area of IoT services which rely on connected devices (such as connected thermostats, connected medical devices, smart meters or automated and connected vehicles), the use of the processing and storage capacities of those devices and access to information stored therein should not require consent to the extent that such use or access is necessary for the provision of the service requested by the end-user. For example, storing of information in or accessing information from a smart meter might be considered as necessary for the provision of a requested energy supply service to the extent the information stored and accessed is necessary for the stability and security of the energy network or for the billing of the end-users’ energy consumption. The same applies, for instance, to the storing, processing or accessing of information from automated and connected vehicles for security-related software updates.
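The distinction this recital draws between a consent-free cookie scoped to a single established session and a persistent identifier can be made concrete with Python's standard `http.cookies` module. This is a simplified sketch; the cookie names and attribute choices are assumptions for illustration, not requirements of the Regulation.

```python
from http.cookies import SimpleCookie

jar = SimpleCookie()

# A session cookie in the sense of Recital 21: no Max-Age or Expires
# attribute, so the browser discards it when the session ends, e.g. a
# shopping-basket cookie. Its use would not require consent.
jar["basket_id"] = "abc123"
jar["basket_id"]["path"] = "/"
jar["basket_id"]["httponly"] = True

# A persistent identifier, by contrast, carries an expiry and would
# generally require consent under Art. 8 Sec. 1 lit. b.
jar["tracking_id"] = "xyz789"
jar["tracking_id"]["max-age"] = 60 * 60 * 24 * 365  # one year

print(jar["basket_id"].OutputString())   # no Max-Age attribute
print(jar["tracking_id"].OutputString())  # includes Max-Age=31536000
```

The legally relevant point is not the attribute itself but the necessity test; the expiry merely makes the persistence, and hence the intrusion, visible.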

(21aa) In some cases, the use of processing and storage capabilities of terminal equipment and the collection of information from end-users’ terminal equipment may also be necessary for providing a service requested by the end-user, such as services provided in accordance with the freedom of expression and information, including for journalistic purposes, e.g. online newspapers or other press publications as defined in Article 2 (4) of Directive (EU) 2019/790, that are wholly or mainly financed by advertising, provided that, in addition, the end-user has been provided with clear, precise and user-friendly information about the purposes of cookies or similar techniques and has accepted such use.

(21a) Cookies can also be a legitimate and useful tool, for example in assessing the effectiveness of a delivered information society service, for example of website design and advertising, or by helping to measure the number of end-users visiting a website, certain pages of a website or the number of end-users of an application. This is not the case, however, regarding cookies and similar identifiers used to determine the nature of who is using the site, which always require the consent of the end-user. Configuration checking by information society service providers in order to provide the service in compliance with the end-user’s settings, and the mere logging of the fact that the end-user’s device is unable to receive content requested by the end-user, should not constitute access to such a device or use of the device’s processing capabilities.

(21b) Consent should not be necessary either when the purpose of using the processing and storage capabilities of terminal equipment is to fix security vulnerabilities and other security bugs or for software updates for security reasons, provided that the end-user concerned has been informed prior to such updates, and provided that such updates do not in any way change the functionality of the hardware or software or the privacy settings chosen by the end-user and the end-user has the possibility to postpone or turn off the automatic installation of such updates. Software updates that do not exclusively have a security purpose, for example those intended to add new features to an application or improve its performance, should not fall under this exception.

(25) Accessing electronic communications networks requires the regular emission of certain data packets in order to discover or maintain a connection with the network or other devices on the network. Furthermore, devices must have a unique address assigned in order to be identifiable on that network. Wireless and cellular telephone standards similarly involve the emission of active signals containing unique identifiers such as a MAC address, the IMEI (International Mobile Station Equipment Identity), the IMSI, the WiFi signal etc. A single wireless base station (i.e. a transmitter and receiver), such as a wireless access point, has a specific range within which such information may be captured. Service providers have emerged who offer tracking services of physical movements based on the scanning of equipment-related information with diverse functionalities, including people counting, such as providing data on the number of people waiting in line or ascertaining the number of people in a specific area, referred to as statistical counting, for which the consent of end-users is not needed, provided that such counting is limited in time and space to the extent necessary for this purpose. Providers should also apply appropriate technical and organisational measures to ensure a level of security appropriate to the risks, including pseudonymisation of the data and making it anonymous or erasing it as soon as it is no longer needed for this purpose. Providers engaged in such practices should display prominent notices located on the edge of the area of coverage, informing end-users prior to entering the defined area that the technology is in operation within a given perimeter, of the purpose of the tracking, the person responsible for it and the existence of any measure the end-user of the terminal equipment can take to minimise or stop the collection. Additional information should be provided where personal data are collected pursuant to Article 13 of Regulation (EU) 2016/679.
This information may be used for more intrusive purposes, which should not be considered statistical counting, such as sending commercial messages with personalised offers to end-users, for example when they enter stores, subject to the conditions laid down in this Regulation, as well as the tracking of individuals over time, including repeated visits to specified locations.
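The safeguards Recital 25 names for statistical counting (pseudonymisation, anonymisation or erasure once no longer needed, limitation in time) can be sketched in a few lines of Python. This is a simplified illustration under assumptions of our own: the hashing scheme, the salt rotation and the function name are not prescribed by the Regulation.

```python
import hashlib
import secrets


def count_devices(observed_macs, salt):
    """Count distinct devices without retaining raw identifiers.

    Each observed MAC address is replaced by a salted SHA-256 pseudonym;
    only the number of distinct pseudonyms is kept.
    """
    pseudonyms = {
        hashlib.sha256(salt + mac.encode()).hexdigest()
        for mac in observed_macs
    }
    return len(pseudonyms)


# The salt is random and short-lived, e.g. rotated per counting period,
# so pseudonyms cannot be linked across periods (limitation in time).
salt = secrets.token_bytes(16)
macs = ["aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02", "aa:bb:cc:dd:ee:01"]
print(count_devices(macs, salt))  # 2 distinct devices

# Discarding the salt renders the pseudonyms unlinkable: only the
# aggregate count survives, in line with the recital's erasure duty.
del salt
```

Tracking repeated visits of the same device over time, by contrast, would require stable identifiers and is exactly what the recital excludes from consent-free statistical counting.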

(25a) Processing the information emitted by the terminal equipment to enable it to connect to another device would be permitted if the end-user has given consent or if it is necessary for the provision of a service requested by the end-user. This kind of processing might be necessary for example for the provision of some IoT related services.

Art. 8 ePrivacy Regulation prohibits the use of processing and storage capabilities as well as the collection of information from end-users’ terminal equipment. The protection of terminal equipment had already been subject to regulation under the preceding ePrivacy Directive.[1] According to Art. 5 Sec. 3 ePrivacy Directive, the use of electronic communications networks to store information or to gain access to information stored in the terminal equipment was restricted to cases in which the person concerned had previously been provided with clear and comprehensive information as well as the right to refuse the processing entirely. These provisions remain in place under the ePrivacy Regulation, yet in a modified manner. The legislator refined and expanded the range of exceptions, so that the storage or collection of information on an end-user’s terminal equipment is only permissible under the exhaustive conditions of Art. 8 Sec. 1 lit. a ff.[2]

The rationale for this rule is found in Recital 20 ePrivacy Regulation. According to the legislator, terminal equipment, in particular when used to process, store or collect information, is an integral part of the end-user’s privacy.[3] This consideration corresponds to the fact that today end-users store an abundance of information about their private life on such devices. As a result, these often contain intimate insights into the emotional, social, financial, health or other circumstances of their user. Moreover, it is not only the information stored on devices, but also and in particular the devices themselves that represent a part of the end-user’s privacy. Terminal equipment is carried around and used almost all the time. In this way, the relationship between an individual and his or her personal environment, anchored in the idea of privacy, materialises in these devices. It is because of this relationship that the legislator considers terminal equipment to be in particular need of protection.

This need is opposed by a multitude of tools that enable surreptitious access to device information. Individually or in combination, these tools are able to monitor an end-user’s digital and physical behaviour in its entirety.[4] Spyware, web bugs, hidden identifiers and tracking cookies serve as examples. The legislator points out that in many cases end-users are not aware of these technologies and therefore unknowingly become victims of a “serious intrusion” into their privacy.[5]

From a dogmatic point of view, Art. 8 ePrivacy Regulation reflects the interplay of protective subjects which can already be found in the GDPR. There, protection focuses on natural persons and their personal data. Here, the range expands to a triad of subjects: the persons concerned (both natural and legal), their privacy and their terminal equipment. At the same time, Art. 8 ePrivacy Regulation shifts the focus from the abstract concept of privacy to its more specific materialisation. In doing so, it clarifies that privacy does not only refer to the sphere of personal development itself, but also to the preconditions for its enactment and its continuing effects. Indeed, such enactment cannot be granted unlimitedly. Yet, the provision points out that restrictions will not in every case result from external interferences alone, but rather from the deliberate decisions of end-users themselves. End-users will often find that participation in everyday life depends on both the use of terminal equipment and the acceptance of interferences with it. Knowingly or unknowingly, self-information, communication and organisation in a digital environment are connected to small sacrifices of privacy, which end-users need to take into account. Art. 8 ePrivacy Regulation addresses this fact by balancing the conflicting interests, technical realities and prerequisites, all with a focus on the protective purpose and its determination by fundamental rights. Conversely, this balancing exercise determines the interpretation of the provision and needs to be kept in mind in the following comments.

[1] Art. 5 Sec. 3 of Directive 2002/58/EC of the European Parliament and of the Council (ePrivacy Directive) concerning the processing of personal data and the protection of privacy in the electronic communications sector.

[2] See No. I.3.

[3] Cf. already Rec. 24 ePrivacy Directive.

[4] Cf. already Rec. 65 of Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009 amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services, Directive 2002/58/EC concerning the processing of personal data and the protection of privacy in the electronic communications sector and Regulation (EC) No 2006/2004 on cooperation between national authorities responsible for the enforcement of consumer protection laws (Cookie Directive).

[5] Rec. 20 ePrivacy Regulation.

a) Terminal equipment

The definition of ‘terminal equipment’ is laid out in Art. 4 Sec. 1 lit. c, which itself refers to Art. 1 Sec. 1 Directive 2008/63/EC (Terminal Equipment Directive)[6]. The Terminal Equipment Directive includes any device which is directly or indirectly connected to the interface of a public telecommunications network and sends, processes or receives information. Consequently, the definition covers a wide range of internet-attached devices, such as computers, smartphones and tablets, smart TVs, speakers or watches, but also routers, modems and other devices linking the end-user to the internet.[7] Accordingly, only two factors are decisive in order to qualify an item as ‘terminal equipment’: (a) the existence of a public telecommunications network and (b) a connection with it.

aa) Public Telecommunications Network

The term ‘public telecommunications network’ is not defined in EU law. However, Art. 2 Secs. 1, 8 Directive (EU) 2018/1972 (EECC)[8] serves as a resource for definitions in the context of the ePrivacy Regulation[9] and introduces the resembling terms ‘electronic communications network’ and ‘public electronic communications network’. According to these definitions, an ‘electronic communications network’ is a transmission system which permits the conveyance of signals by wire, radio, optical or other electromagnetic means, irrespective of the type of information conveyed.[10] Thus, the term includes all networks, regardless of whether they are permanent or provisional, centralised or decentralised, and encompasses all necessary equipment of such networks, including active as well as non-active components, switching or routing equipment and other resources. Examples encompass classic telephone networks consisting of copper cable and the respective active infrastructure, fibre-optic networks, satellite networks or cellular networks (e.g. 4G/LTE), and also Wi-Fi networks, both free and for-pay (e.g. in a hotel).

Furthermore, according to Art. 2 Sec. 8 EECC, the electronic communications network must be ‘public’, i.e. it must be used wholly or mainly for the provision of publicly available electronic communications services. The term ‘public’, however, does not mean that the network must be operated by or on behalf of a sovereign authority. Neither does it mean that the service must be operated free of charge. The only criterion is that the network can be accessed by any member of the public on equal terms. Thus, a private provider of an electronic communications network falls under the definition just as well as a publicly available Wi-Fi routed from within a town hall. Nonetheless, despite this wide definition of Art. 2 Sec. 8 EECC, there are still conceivable cases in which an end-user’s terminal equipment is not connected to the public network and therefore not subject to the provisions under Art. 8 Sec. 1 ePrivacy Regulation.[11] Even if these cases might be rather theoretical in nature, they can occur when several individuals set up a private network and do not connect their individual devices to the internet. An example is a corporate network that is accessible only by end devices which are not capable of connecting to the internet or are simply not connected to it. E-sports tournaments and LAN parties may also be subsumed hereunder, however, again, only if no other connection to the internet is present.

bb) Connection

The connection must be made between the end-user’s device and the public telecommunications network. However, it is of no importance for the applicability of the ePrivacy Regulation whether the connection of the device to the telecommunications network is made on a physical or non-physical level. Art. 1 Sec. 1 Directive 2008/63/EC expressly includes both direct and indirect connections, e.g. by wire, optical fibre or electromagnetic means.

b) Affiliation to an end-user

The applicability of Art. 8 ePrivacy Regulation requires the involvement of an end-user’s device.[12]

aa) Individual assessment of actions

In real-life ePrivacy scenarios, it is not always easy to determine whether someone is an end-user or not. The EECC Directive, from which the definition of ‘end-user’ is taken, mainly deals with regulatory questions around the provision of telecommunications networks and services, where the difference between a provider and an end-user is rather clear. In that context, the person concerned is either one or the other. Meanwhile, when it comes to the protection of terminal equipment under the ePrivacy Regulation, there might be entities that both provide public communications networks or publicly available electronic communications services on the one hand and are end-users on the other. Also, they may engage in different activities, either for internal purposes or in other fields of business. Therefore, it seems preferable not to base the qualification of an end-user on absolute or general categories, but rather on the individual actions concerned. Someone is not an end-user with regard to actions that entail the provision of public communications networks, publicly available electronic communications services or other actions that do not concern his privacy. Conversely, with regard to activities that involve his privacy, e.g. using or requesting a publicly available electronic communications service, the same person may qualify as an end-user after all. This might, for instance, be the case for professional individuals operating business devices purely in an official environment. As soon as their device is used in a private context, however, they acquire the status of an end-user.[13]

bb) Relation between end-users and terminal equipment

The ePrivacy Regulation leaves unclear whether an end-user has to own a device in order for it to qualify as ‘terminal equipment of end-users’ within the meaning of Art. 2 Sec. 1 ePrivacy Regulation, or whether simple usage of the device suffices. In view of the objective of the ePrivacy Regulation to provide effective protection for an end-user’s privacy, it seems preferable to interpret the term broadly.[14] After all, the privacy risks an end-user faces from a device potentially revealing information about him are the same regardless of whether he owns it or uses it on other grounds, e.g. under a rental contract. Thus, any device rightfully used by the end-user should fall under the protection of the ePrivacy Regulation.

Example: Individual R subscribes to an Internet access service. The plan includes a modem and router device that is owned and administrated by the Internet service provider but installed in R’s apartment. Although R does not own the device, it qualifies as R’s terminal equipment, since R rightfully uses it.

cc) Multiple persons and employer-employee relationships

There are, however, constellations where in principle more than one person must be considered the end-user of a certain device, because all of those in question have an interest in the device’s integrity. This might be the case for work computers, shared family devices, computer pools in universities or comparable situations. A distinction can easily be made where clearly separated user accounts exist on one device. In this case, the information of the person currently using the device is shielded from the information of other users. Since privacy risks are limited to information related to the user currently logged in, only this user should be considered an end-user in the context of Art. 8 ePrivacy Regulation.

However, the determination proves more difficult where such separations do not exist, e.g. in the case of shared accounts or devices. In such cases it is nearly impossible for a provider to determine which persons are using the device and which of these usages are rightful. Consequently, qualifying all persons as end-users makes effective consent, as the most relevant exception, very difficult to obtain, or at least makes it difficult to prove that all requirements are fulfilled in the individual case. Art. 4 Sec. 2a ePrivacy Regulation addresses this problem with regard to the demonstration of consent: where a provider is not able to identify the data subject, i.e. to prove that consent was given by the right person, it suffices to prove that consent was given at all, as long as it originates from the terminal equipment.[15] The legislator hence solves the problem on a practical level. Conversely, however, it must be deduced that the legislator applies a broad concept of the term ‘end-user’, meaning any person with a relation to the equipment.

Indeed, it is left to legal interpretation what the nature of this relation is. The preceding regulation in Art. 5 Sec. 3 ePrivacy Directive distinguished between subscribers and users, both of whom could consent to interference with a device without one having precedence over the other. Art. 8 ePrivacy Regulation, however, gives up this distinction. By reference in Art. 4 Sec. 1 lit. b, it instead focuses on the end-user, defined as “a natural or legal person using […] a […] service” as per Art. 2 Nos. 13-14 EECC Directive. This might imply that it is of no relevance who subscribed to a service or who owns a device; rather, only the current user of a device might be relevant.[16] This view is supported by the wording of Art. 8 ePrivacy Regulation, which speaks of the end-user concerned in the singular, indicating that the ePrivacy Regulation itself assumes that only a single person can be considered an end-user for the purposes of Art. 8 ePrivacy Regulation. Also, Recital 20aaaa ePrivacy Regulation addresses the situation of multiple concerned individuals, stating that consent should “normally be given by the end-user who requests the service”. This person will be the one who actually uses the device. Eventually, by a strict understanding of the term, one might consider only the person who uses the device at last instance, i.e. the ‘final’ or ‘last’ user, to be the end-user. This interpretation corresponds to the fact that regularly only the ‘final’ user connects his private sphere to the device. Prior users might indeed also have an interest in the integrity of the device, e.g. when they themselves have stored personal information on it. Yet, by passing on the device, they impliedly consent to the risk involved with its use by another person and are therefore less worthy of protection.
After all, focussing on the actuality of use provides a precisely determinable, objective criterion for third parties that need to identify the right end-user if they want to access his device. Considering only the current user as the end-user therefore represents the most pragmatic way in practice to deal with the uncertainty regarding multiple users of the same device. The interests of other users of the same device would, in that regard, have to stand back for the sake of legal clarity.

Yet, in light of the clear wording of Art. 4 Sec. 2a ePrivacy Regulation, such an interpretation of the relation between end-user and terminal device is hardly defensible. In fact, the Regulation considers everybody with a reasonable affiliation to the device an end-user. This results not least from the extensive wording of Art. 4 Sec. 1 lit. b ePrivacy Regulation in connection with Art. 2 Sec. 14 EECC Directive, defining an end-user as everyone not themselves providing networks or communication services, but also and in particular from the protective purpose of the ePrivacy Regulation. Privacy does not end where usage of terminal equipment by another person begins, but rather prevails as long as elements of an individual’s private sphere are still concerned. This, indeed, makes it hard to draw a clear demarcation line between end-users or even to determine one single individual as the decisive addressee of the provision.

However, Art. 4 Sec. 2a ePrivacy Regulation provides guidance as to how the provision may be applied in practice. Accordingly, in a first step, providers need to identify all relevant data subjects, in the sense of individuals (natural or legal) exhibiting a ‘reasonable affiliation’ to the device. Affiliation will typically draw on a privacy-related context, but does not require privacy to have materialised within the equipment. Rather, one might consider in relation to whom a potential interference with privacy might arise when making use of the processing or storage capabilities of, or when collecting information from, the terminal equipment. Regular examples of such affiliation are a proprietary relation or actual use. If, however, not all respective individuals become apparent, it suffices in a second step to concentrate on the actual operator of the device, i.e. the ‘final’ user in the above-mentioned sense.

When it comes to employment relationships, it is as yet unspecified whether the employer or the employee using a work-related device has to be considered the relevant end-user. The Council of the European Union has proposed to state explicitly that the employee is the relevant end-user.[17] The European Parliament, in turn, assumes the employee to be the relevant end-user already under the text of the European Commission’s proposal and therefore proposes to add permissions allowing the employer work-related access to devices.[18] Meanwhile, industry associations call for amendments that consider the employer to be the relevant end-user.[19] In fact, there is no reason to deviate from the above-mentioned definition, since privacy may be affected with respect to both parties. This finding proves accurate in the light of Recital 3 ePrivacy Regulation, stating that the Regulation secures the interests of both natural and legal persons, since the latter will also be affected by a disclosure of sensitive information concerning their business. Indeed, reasons of practicality require a restriction of evidence requirements when obtaining consent. However, this restriction is already provided for by Art. 4 Sec. 2a ePrivacy Regulation (see above), making deviations once again pointless. Consequently, both the employer and the employee must be regarded as end-users.

dd) Usage by the end-user himself

Art. 8 Sec. 1 ePrivacy Regulation prohibits the use of a device’s storage and processing capabilities as well as the collection of information from the device by anyone ‘other than the end-user concerned’. This means: the end-user is free to use and interfere with his device, while anyone else needs a justification for interfering with it. However, at times it may be difficult to assess whether the device is actually “used” by the end-user or by a third party. On the one hand, it is clear that actions which cannot be traced back to the use of the end-user in any way must not be considered a ‘use by the end-user concerned’. They need to be justified pursuant to Art. 8 Sec. 1 ePrivacy Regulation. For example, self-installing viruses exploiting a security vulnerability in the device’s operating system fall under this scenario. On the other hand, processes that are the intentional and direct result of a deliberate action of the informed end-user obviously represent a ‘use […] by the end-user concerned’. Here, it eventually comes down to an assessment of cause and effect.

However, most processes that occur on devices can causally be traced back to actions of the end-user. For example, if an end-user installs and runs programs, opens websites or clicks on links, these actions trigger certain processes on the device. Some of them might be in accordance with the end-user’s intent and useful to him, while other processes might be unnecessary to achieve the result desired by the end-user. Since Art. 8 ePrivacy Regulation only protects the privacy of the end-user, especially with regard to access techniques that take place without his knowledge[20], it should not be interpreted to include all processes that can be causally linked to some kind of interaction of the end-user. Rather, it is preferable to interpret the provision to cover only those processes that are the intended result of a deliberate action of the informed end-user. The relevant criterion is whether a common end-user could reasonably expect that a certain process is the result of his interactions.[21] Only such ‘expected’ processes are exempt from the need of justification under Art. 8 Sec. 1 ePrivacy Regulation. Admittedly, the focus on the reasonable expectations of an informed end-user can at times make it difficult to distinguish between processes that still constitute a ‘use […] by the end-user concerned’ and those that require consent of the end-user. If in doubt, it is advisable to obtain consent in these cases.

[6] Commission Directive 2008/63/EC of 20 June 2008 on competition in the markets in telecommunications terminal equipment.

[7] Cf. also in view of the extensive meaning Rec. 21, naming IoT services, which rely on connected devices, such as connected thermostats, medical devices, smart meters or automated and connected vehicles.

[8] Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast).

[9] It is most probably not the intention of the legislator to introduce a new and different term for the definition of ‘terminal equipment’. It is rather likely that the existing legal terminology of European law was used in the legislative process. Any other course of action would contradict both the unity of the legal system and the principle of legal certainty. Therefore, the terms ‘public telecommunications network’ and ‘public electronic communications’ should be used synonymously.

[10] See Art. 2 Sec. 1 EECC.

[11] Cf. Rec. 13 ePrivacy Regulation.

[12] For a detailed description of the term see Art. 4 No. I.2.e).

[13] Note: Comments above refer to actions by individuals, not their employing legal entities as regulated by Rec. 3 ePrivacy Regulation. These have to be assessed individually.

[14] Cf. Rec. 6, 20 ePrivacy Regulation.

[15] This approach moreover entails the advantage of avoiding an otherwise necessary identification of the end-user, which would also be undesirable from a GDPR perspective.

[16] A similar interpretation has already been made by the British supervisory authority ICO with regard to the ePrivacy Directive and cases involving multiple users. The ICO advises, when in doubt, to repeatedly ask for consent over short periods of time and to observe only the last of such declarations: ICO, Guide to the Privacy and Electronic Communications Regulations, v. 2.4.21, 9 May 2018, p. 29 – 30.

[17] Council of the European Union, ST 6771/19, Rec. 19b.

[18] European Parliament, LIBE report A8-0324/2017, 20 October 2017, amendment 91.

[19] E.g. Bitkom views on the Presidency Discussion Paper 6771/19, p. 3, 6.

[20] Cf. Rec. 20, 21 ePrivacy Regulation.

[21] The concept of ‘reasonable expectations’ is explicitly mentioned in the GDPR, where it constitutes one factor in the determination of ‘legitimate interest’ in Art. 6 Sec. 1 phrase 1 lit. f GDPR, see Rec. 47 GDPR.

Art. 8 Sec. 1 ePrivacy Regulation prohibits three different ways of interference[22] with a device’s integrity: the use of its storage capabilities (a), the use of its processing capabilities (b) and the collection of information contained in the device (c). Any interference is only permissible if one of the legal conditions in Art. 8 Sec. 1 lit. a to i ePrivacy Regulation is met.[23] In practice, effective consent will regularly be required (see under No. I.3.b).

The prohibition applies regardless of whether the device is accessed in an unauthorised way or in a way that is technically intended (i.e. designed to allow the execution of code). Furthermore, Art. 8 ePrivacy Regulation encompasses the processing of personal as well as non-personal data. This approach is in line with the regulatory content of its predecessor, Art. 5 Sec. 3 ePrivacy Directive.[24] Otherwise, the scope of application would be considerably reduced, because the mere placing of a cookie on a device does not in itself necessarily constitute processing of personal data. In fact, it is precisely this activity (both already under Art. 5 Sec. 3 ePrivacy Directive and now under the new Art. 8 of the Regulation) that is meant to be subject to regulation and, as a rule, to a justification requirement. Consequently, the processing of pseudonymous or even anonymous data, e.g. in order to research or develop new services and features, is likewise governed by the requirements and preconditions of the provision.

As already mentioned, the main case of application for Art. 8 Sec. 1 ePrivacy Regulation is the setting of cookies on user terminals, i.e. the storage of text segments or so-called device identifiers that enable providers to track and recognise end-users. This procedure is particularly valuable for the advertising industry, as it allows the evaluation of users’ online behaviour and, thus, the implementation of targeted advertising.

However, the scope of protection under Art. 8 Sec. 1 ePrivacy Regulation is not restricted to the use of cookies alone. The Regulation rather seeks to cover the entirety of options for interference. This approach follows from the aforementioned consideration that not only cookies, but also “similar technologies” represent a threat to the end-user’s privacy.[25] “Similar technologies” include inter alia secretly installed spyware, malware or other information on devices, and tools of unwanted tracking, monitoring and surveillance.[26] Particularly noteworthy is so-called “device fingerprinting”, which allows end-user monitoring that is not restricted to web browsers and thus covers a broad variety of internet-connected devices. It can be seen as one result of the search for alternatives to cookies, not only, but especially, in order to avoid the consent requirement under Art. 5 Sec. 3 ePrivacy Directive.[27] Against this backdrop, and in the light of future technological developments, it seems comprehensible that the legislator maintains its previously established regulatory line[28] and addresses the use of processing and storage capabilities in a technology-neutral manner.[29]

a) Use of storage capabilities

Art. 8 Sec. 1 ePrivacy Regulation prohibits the “use of storage capabilities of terminal equipment” by anyone other than the end-user concerned. Briefly summarised, the provision is intended to protect users from third parties saving data sets on their devices. As mentioned before, the extensive use of cookies in particular is targeted here, as was already the case under the preceding stipulation of Art. 5 Sec. 3 ePrivacy Directive.[30] In doing so, Art. 8 Sec. 1 ePrivacy Regulation answers imprecise, incorrect or inadequate implementations by individual EU Member States, such as Estonia and Germany.[31] Indeed, Art. 5 Sec. 3 ePrivacy Directive in its first version had been unclear with regard to the specific requirements for an effective refusal of such use. This uncertainty was resolved, however, in the course of the amendments by Directive 2009/136/EC (‘Cookie Directive’), clarifying that the use of (non-functional) cookies was subject to prior consent by the end-user, i.e. an opt-in procedure. Yet, in Germany, instead of adjusting the respective transposition in § 15 Sec. 3 of the German Telemediengesetz (TMG), the original wording was kept, leaving pseudonymous profiling with cookies subject to an explicit objection (“opt-out”). In the light of the resulting legal uncertainty, the German Federal Court of Justice (GFCJ) eventually submitted the question of interpretation to the CJEU, asking about the effectiveness of consent pursuant to Art. 5 Sec. 3 and Art. 2 lit. f ePrivacy Directive in the case of a pre-set tick box which the user must uncheck in order to refuse cookie use.[32] The CJEU, in its much-noticed Planet49 decision, answered that question in the negative, making the case for a clear “opt-in” rule.[33] This led the GFCJ to interpret § 15 Sec. 3 TMG in conformity with the Directive, however contrary to its explicit wording, to the effect that the term ‘objection’ (“opt-out”) actually required consent, i.e. an “opt-in” procedure.[34] It was only following the reforms of telemedia and communication law in late 2021 that this predicament was straightened out: § 25 Sec. 1 of the German Telemedia-Telecommunications-Data Protection-Act (TTDPA) henceforth adopts the wording of Art. 5 Sec. 3 ePrivacy Directive, stipulating the requirement of prior consent (“opt-in”) to cookie use.

Against this backdrop it seems plausible that cookie law will now be subject to a Regulation directly applicable in all Member States, leaving them considerably less scope for individual interpretation.

aa) Storage Capabilities

The term ‘storage capabilities’ includes all parts of an electronic device that enable it to store information over time. This definition encompasses permanent storage (e.g. on hard disk drives or in flash memory) as well as temporary storage (meaning storage that is only available as long as the device is running, such as RAM). Recital 65 ePrivacy Directive clarified that interferences with terminal equipment pose a privacy threat regardless of whether the respective applications are downloaded via electronic communications networks or delivered offline, e.g. by installation of software transported on external storage media such as CDs or USB keys. In the absence of contrary indications, particularly in view of the comprehensive wording of Recital 20 ePrivacy Regulation[35], this assessment also applies to Art. 8 Sec. 1 ePrivacy Regulation.[36] Thus, the technical scope of the provision is unrestricted. Any storage operation not explicitly triggered by the end-user needs to be justified by one of the exceptions listed in Art. 8 Sec. 1 lit. a – i ePrivacy Regulation.

bb) Use of cookies

The use of cookies represents the traditional form of end-user tracking and must currently be regarded as the main case of application for Art. 8 Sec. 1 ePrivacy Regulation.[37] Cookies are small website-related text files stored in the end-user’s browser. These text files are either sent by a web server to the browser or created directly by the browser itself. When the end-user visits the website again or navigates between different pages, the web server can later read and identify the text file or transmit it to other web servers. The provider is thereby able to monitor the end-user’s individual surfing behaviour and, to the extent possible, determine patterns in it. Thus, his or her personal characteristics, e.g. interests and preferences, become apparent, sometimes even allowing the creation of a detailed user profile. Alongside simpler applications facilitating ordinary website use, such as storing the contents of an online shopping cart or the user’s login information, the use of cookies therefore represents the most significant basis for so-called targeted marketing strategies.
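The technical mechanism described above can be sketched as follows. The snippet is a minimal, hedged illustration using Python’s standard-library cookie parser; the cookie name “uid” and its value are purely hypothetical examples of a recognition identifier, not taken from any real service.

```python
from http.cookies import SimpleCookie

# 1) First response: the server attaches a "Set-Cookie" header carrying
#    a (hypothetical) identifier that the browser will store.
server_cookie = SimpleCookie()
server_cookie["uid"] = "a1b2c3"          # hypothetical identifier value
server_cookie["uid"]["path"] = "/"
set_cookie_header = server_cookie["uid"].OutputString()
# set_cookie_header is now e.g. 'uid=a1b2c3; Path=/'

# 2) On every later visit, the browser sends the stored value back in
#    the "Cookie" request header of its own accord.
request_header = "uid=a1b2c3"

# 3) The server parses the header and thereby recognises the returning
#    end-user, which is the basis of cookie-based tracking.
parsed = SimpleCookie()
parsed.load(request_header)
recognised_id = parsed["uid"].value
print(recognised_id)
```

The sketch shows why the mere placement of the identifier, independent of any personal data it may contain, already suffices to recognise a device across visits.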

The ePrivacy Regulation does not distinguish between different kinds of cookies (i.e. between temporary, so-called session cookies, and permanent cookies, first- and third-party cookies, essential and non-essential cookies or other types). Rather, the relevant aspect is that cookies occupy storage capacity on an end-user’s device. Their placement is subject to the prohibition of the use of storage capabilities according to Art. 8 Sec. 1 ePrivacy Regulation and requires a justification under Art. 8 Sec. 1 lit. a – i.[38] Indeed, in most cases this will mean explicit consent. Yet, the legislator has provided for a variety of legal justifications, which include, for example, the necessity of such use of storage for the sole purpose of providing the electronic service at hand (Art. 8 Sec. 1 lit. a ePrivacy Regulation). This exemption may apply particularly in the case of so-called ‘essential’ or ‘functional’ cookies, which are crucial to the technical functioning of a website.

Corresponding to Art. 5 Sec. 3 ePrivacy Directive, Art. 8 Sec. 1 does not require a cookie to contain personal data or tracking information in order to apply. Beyond the information retained within the cookie, there is no need for any processing of personal data in the context of dropping or accessing cookie information. Art. 8 Sec. 1 ePrivacy Regulation applies to the mere placement of (and access to, cf. below) information on end-users’ terminal equipment, regardless of whether personal data is involved. However, when personal data actually is involved in the use of cookies, special attention must be paid to the requirements of the GDPR (cf. under No. I.5.).

b) Use of processing capabilities

The prohibition of the “use of processing […] capabilities of terminal equipment” by third parties in Art. 8 Sec. 1 ePrivacy Regulation was not included in the previous ePrivacy Directive. It has been introduced into the ePrivacy Regulation in order to remedy a shortcoming identified in the former Art. 5 Sec. 3 ePrivacy Directive: since the latter only regulated tracking techniques based on the use of storage capabilities in terminal equipment, it was sometimes understood not to cover other tracking methods, such as device fingerprinting (see No. I.2.b)bb).[39]

aa) Processing capabilities

The term ‘processing capabilities’ is not defined in the ePrivacy Regulation or elsewhere in EU law. It relates to the ‘processor(s)’ or ‘processing unit(s)’ of an electronic device, meaning the parts that allow it to perform mathematical operations on data. Every piece of software, regardless of its range of applications, is dependent on these processing capabilities and uses them for multiple purposes, first and foremost its own execution. Even a device running idle will thus most likely use processing capabilities for the background processes of its operating system. The principal prohibition of the use of ‘processing capabilities’ by third parties therefore presents a significant extension of the protection of device integrity under ePrivacy rules. Linking this conceivably broad concept to the general prerequisite of justification for third parties entails the danger of over-regulation and excessive regulatory effects. Here, Art. 8 Sec. 1 ePrivacy Regulation overstretches its scope of application, encompassing practically any imaginable process carried out on a device. This makes it challenging for both operators and the users concerned to draw a clear demarcation line between permitted and prohibited practices.

bb) Main case of application: Device fingerprinting

The main case of application for the prohibition of the use of terminal equipment’s processing capabilities is so-called ‘third-party active device fingerprinting’. ‘Device fingerprinting’, as the umbrella term for related techniques, comprises various methods of collecting and combining data about a certain device. Its aim is to distinguish the device from other devices, making it identifiable and, more importantly, recognisable to third parties. The basic idea behind this approach is that any network-connected device discloses certain amounts of information about itself – either voluntarily or upon request.[40] While most of this information is certainly not unique on its own, the combination of different pieces of information might be. This is especially true when large amounts of data are collected and compared, allowing for the recognition of individual devices by means of unique identifiers, hence called ‘fingerprints’. As a main use case, operators apply device fingerprinting for the creation of tracking profiles, relating both to the device itself and, presumably, the individual using it. For example, whenever a device accesses a website, this process can be used to create a fingerprint and later repeated to determine recurring visits.

Device fingerprinting is regularly performed covertly, generally without the knowledge of the users, and leaves them with nearly no room for countermeasures. Theoretically, one could restrict the amount of information his or her device discloses or prevent the execution of scripts. However, this would come at the cost of a loss of convenience or usability on other websites. Even worse, such an ‘unusual’ and therefore rare configuration could have the reverse effect of further increasing the recognisability of a device. Thus, device fingerprinting is probably the most subversive form of information gathering today.

(1) Examples

To begin with, a classic means of device fingerprinting is the HTTP protocol. While used to transfer websites to the browser, it does not only retrieve content data (the requested website) from the server but also provides for the transfer of metadata in so-called ‘header fields’. This information can inter alia be used to customise websites according to the user’s preferences and devices. The ‘user agent’ is also included among this information, describing the browser software, the preferred language and character sets for the requested content and other technical specifications of the software used. This technique, being a mere evaluation of information conveyed anyway, is referred to as ‘passive device fingerprinting’. A distinction is drawn from more active technologies which, in contrast to the pure utilisation of conveyed information, actively draw on the capacities of the end-user’s terminal device. These techniques are referred to as ‘active device fingerprinting’. Scripting languages serve as an example, enabling the inclusion of active program code into websites, which is later executed on the user’s device (for example JavaScript). Such scripts allow the collection of further configuration details of a device, like screen size, installed fonts or plugins etc., from the device itself.[41] So-called ‘canvas fingerprinting’ should also be mentioned here as a rather modern variation of device fingerprinting. The idea behind this technique is that, due to different software, configuration etc., devices carry out instructions to draw images in slightly different ways. So, instead of obtaining information stored on the device, the server, respectively the website, instructs the device – without the user noticing – to actively draw an image in the background. That image, containing text, shapes, colours etc., is then mathematically transformed into an identifier and transferred back to the website.[42]
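The core idea of passive device fingerprinting described above can be sketched in a few lines: header values the browser transmits anyway are combined and condensed into a quasi-unique identifier. This is a hedged illustration only; the header values below are invented, and real fingerprinting draws on many more attributes.

```python
import hashlib

# Hypothetical metadata a browser sends in HTTP header fields anyway
# (values invented for illustration).
headers = {
    "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/115.0",
    "Accept-Language": "de-DE,de;q=0.9,en;q=0.8",
    "Accept": "text/html,application/xhtml+xml",
}

# Individually, none of these values is unique. Concatenated in a fixed
# order and hashed, however, the combination is often distinctive enough
# to recognise the same device on a later visit.
material = "|".join(f"{k}={v}" for k, v in sorted(headers.items()))
fingerprint = hashlib.sha256(material.encode()).hexdigest()[:16]
print(fingerprint)
```

Because the same device sends the same headers on the next visit, recomputing the hash yields the same identifier, which is precisely what makes the technique suitable for tracking without storing anything on the device.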

(2) Legal classification

With regard to the legal classification of device fingerprinting under Art. 8 Sec. 1 ePrivacy Regulation, a further distinction is necessary: between application scenarios qualifying both as the use of processing capabilities and as the collection of information on the one hand, and scenarios qualifying for only one of the alternatives on the other. Measures of device fingerprinting by third parties relying on the execution of code on a device (active device fingerprinting), as in the aforementioned usage of JavaScript code to obtain system parameters, not only collect information of and about a device, but also use the processing capabilities of the respective device.[43] Thus, they fall within the scope of both alternatives subject to the prohibition in Art. 8 Sec. 1 ePrivacy Regulation. Consequently, active device fingerprinting must take place on the basis of one of the justifications mentioned in the provision in order to be lawful.[44] By contrast, whenever information is ‘voluntarily’ disclosed by the end-user’s device itself, for example as part of the aforementioned HTTP headers when accessing websites, this does not constitute a use of the processing capabilities of the device by a third party. Yet, the procedure is still subject to the prohibition in Art. 8 Sec. 1 ePrivacy Regulation in so far as it represents a collection of such information by the web server (passive device fingerprinting). Consequently, both active and passive device fingerprinting require a justification according to Art. 8 Sec. 1 ePrivacy Regulation. When in doubt, and where no other justification is available, it is therefore advisable to explicitly inform the end-user of the specific usage of processing capabilities prior to its execution and to obtain his consent as a precautionary measure.[45]

c) Collection of information

Art. 8 Sec. 1 ePrivacy Regulation forbids the collection of information already stored on the device. This includes information about both its software and hardware.

aa) Information stored on terminal equipment

The ePrivacy Regulation does not define the term ‘information’. However, an overview of the possible range of application of Art. 8 Sec. 1 Var. 3 ePrivacy Regulation is offered by Recital 20 ePrivacy Regulation, explaining the purpose of the provision. Accordingly, “information” within the scope of Art. 8 Sec. 1 ePrivacy Regulation covers both personal and technical information related to the end-user’s terminal equipment. According to Recital 20, the former especially comprises details “of an individual´s emotional, political [and] social complexities, including content of communications, pictures, location of individuals, contact lists and other information already stored in the device”. The latter relates to information both stored in the device, such as settings or its capacities, and information deducible from its use, e.g. its location or operational data.

Art. 8 Sec. 1 ePrivacy Regulation does not restrict its scope of application to personal data within the meaning of Art. 4 No. 1 GDPR. Recital 20 captures the rationale of this approach by stating that not only personal information itself but also “terminal equipment of end-users of electronic communications networks and any information relating to the usage of such terminal equipment […] are part of the end-user´s private sphere”.[46] This broad concept of privacy reflects the consideration that effective protection of privacy concerns not only its materialisation (in the form of personal data and its technical embodiment), but also its principal conditions. The ability to make use of freedom for personal development is therefore just as worthy of protection as the terminal equipment of an end-user itself.[47]

bb) Differentiation from Art. 8 Sec. 2 ePrivacy Regulation

The collection of information under Art. 8 Sec. 1 ePrivacy Regulation has to be differentiated from Art. 8 Sec. 2 ePrivacy Regulation, which also regulates the collection of information related to terminal equipment. The key difference is that Sec. 1 applies to the ‘collection of information from end-user’s terminal equipment’, while Art. 8 Sec. 2 ePrivacy Regulation applies to ‘the collection of information that is emitted by terminal equipment to enable it to connect to another device and, or to network equipment’. As a result, the legislator also sets out different catalogues of justification (see under No. II.2.).

Comparing the provisions with regard to their wording, one could assume that Art. 8 Sec. 2 ePrivacy Regulation encompasses all data emitted from terminal equipment, while Section 1 refers to all data that remains on it. This rather superficial delimitation, however, does not match the legislative intent and leads to misjudgements with regard to the scope of available justifications. Correctly, the line of demarcation must be drawn where information is emitted as an integral part of the connection-making by the respective equipment itself. This comprises both technical data regarding the device, e.g. the MAC address or IMEI, and metadata, such as the location of the device.[48] Art. 8 Sec. 1 ePrivacy Regulation thus concerns the entire rest of the information, both technical and personal, which is either retained on the respective device or emitted at a later stage, particularly in the course of an existing connection to the network. On the one hand, this follows from Art. 8 Sec. 2 ePrivacy Regulation, which by its explicit wording refers to information ‘emitted by terminal equipment of the end-user’, while Section 1 comprises any ‘information from [the] end-users’ terminal equipment’. Here, the absence of a restriction terminologically clarifies that Art. 8 Sec. 1 ePrivacy Regulation claims comprehensive application to information irrespective of its later emission or retention. Moreover, Section 2 refers to information emitted “by” the terminal equipment, envisaging the automatism of a connection initiation and thus excluding any other information irrelevant to this procedure. On the other hand, this also follows from Recital 25 ePrivacy Regulation, which explicitly lists the envisaged use cases, such as the MAC address, the IMEI, IMSI, WiFi signal, other ‘equipment related information’ or the counting and tracking of persons within a certain area. E contrario, Section 1 includes the entire rest of the information.

However, this differentiation does not fit the technical realities in all cases, and from case to case its legal application will require exceptions from a purely conceptual approach. Cookies, for instance, are from a technical point of view not actually read by the party accessing them, but instead transmitted by the end-user’s web browser as part of an HTTP request for a specific website. This would imply that the handling of cookies in fact falls under Art. 8 Sec. 2 ePrivacy Regulation, as it involves an emission in the course of connection-making.[49] Such a classification, however, would deviate from the legislative history and thus from the resulting systematics (meaning in particular that the applicable scope of justifications differs). The predecessor of Art. 8 Sec. 1 ePrivacy Regulation, Art. 5 Sec. 3 ePrivacy Directive (‘gaining of access to information already stored, in the terminal equipment’), traditionally comprised cookies under the impression that they are set and read not by the end-user, but rather by the party accessing them, i.e. the operator of the respective website. Hence, in disregard of the technical reality, cookies are not seen as being ‘emitted’ by the device, but as stored on it. Consequently, also under Art. 8 ePrivacy Regulation the use of cookies should be subject not to Section 2 but to Section 1.
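The technical point described above, namely that a cookie is not actively ‘read’ by the server but volunteered by the browser with every request, can be illustrated with a minimal sketch (host, path and cookie values are hypothetical):

```python
# Cookies previously stored on the device by the website operator
stored_cookies = {"session_id": "abc123", "tracking_id": "user-42"}

def build_http_request(host, path, cookies):
    """Assemble a minimal HTTP/1.1 GET request. The browser attaches the
    stored cookies as a Cookie header, i.e. it emits them to the server
    rather than the server reading them from the device."""
    cookie_header = "; ".join(f"{k}={v}" for k, v in cookies.items())
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Cookie: {cookie_header}\r\n"
        "\r\n"
    )

request = build_http_request("www.example.com", "/index.html", stored_cookies)
```

This is only a schematic reconstruction of the HTTP mechanics the text refers to, not a statement about the legal classification, which, as argued above, remains with Section 1.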

By contrast, it becomes apparent that the legislator considers the prime field of application of Art. 8 Sec. 2 ePrivacy Regulation to be mobile devices (e.g. Wi-Fi, cellular or Bluetooth) attempting to connect to base stations of mobile networks or to other mobile devices, and third parties intercepting these attempts (so-called ‘offline tracking’).[50] This is reflected by the available justifications, especially Art. 8 Sec. 2a ePrivacy Regulation, which requires a clear and prominent notice at the edge of an area subject to statistical counting of network-connected equipment. This notice must inform passers-by of the modalities and purpose of the collection and the person responsible for it. Such notices only make sense in a context where connections are made locally over wireless networks, not where connections are made over long distances, e.g. to a server over the internet.
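The statistical counting scenario addressed by Art. 8 Sec. 2a ePrivacy Regulation can be pictured with a schematic sketch (the frame data is hypothetical): nearby devices announce themselves via Wi-Fi probe requests, and counting the distinct source MAC addresses yields a footfall statistic.

```python
def count_devices(probe_requests):
    """Statistical counting of network-connected equipment: each Wi-Fi
    probe request carries the source MAC address of the emitting device;
    counting the distinct addresses approximates the number of devices
    present in the monitored area (MAC randomisation aside)."""
    return len({frame["src_mac"] for frame in probe_requests})

# Hypothetical captured frames
frames = [
    {"src_mac": "aa:bb:cc:00:00:01", "ssid": "CityWifi"},
    {"src_mac": "aa:bb:cc:00:00:02", "ssid": "CityWifi"},
    {"src_mac": "aa:bb:cc:00:00:01", "ssid": "ShopNet"},  # same device again
]
device_count = count_devices(frames)
```

The sketch shows why this information is ‘emitted … to enable it to connect’ in the sense of Section 2: the MAC address is conveyed automatically as part of the connection attempt itself.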

To sum up, Art. 8 Sec. 1 ePrivacy Regulation should be understood to apply to information that is either still contained in the device and obtained from it by a third party, or sent or requested as part of a network communication after a physical connection has already been established. Correspondingly, information that is emitted in the course of the making of such a connection must be subsumed under Art. 8 Sec. 2 ePrivacy Regulation. Moreover, it does not matter whether the information concerned is personal data or not. This interpretation is in line with the legislator’s concern to provide for a comprehensive prevention of the collection of information from devices.[51]

cc) Examples

(1) Web Bugs, Beacons and other tools

Recital 20 ePrivacy Regulation names ‘web bugs’ and ‘hidden identifiers’ as measures that are subject to Art. 8 Sec. 1 ePrivacy Regulation. These measures appear under various names, e.g. web beacons, tracking pixels or JavaScript tags, but usually refer to the same technique.[52] It consists of a small graphic element embedded in a website or email. Sometimes invisible to the end-user, the element serves to track the frequency of website or email accesses. The access information obtained may also contain the place and time of access as well as specifics about the accessing device. Hosts are thereby able to monitor end-users’ surfing behaviour or, if the beacon is embedded in an email, whether the email was opened.

Web beaconing techniques allow for a variety of use cases, most prominently so-called targeted marketing. The scope of application is further expanded by including so-called inline frames in a website or email, depicting third-party content. This content is usually hosted by servers different from those of the initial website. Thus, in order to depict the content, an email reader or web browser will request it from the third-party server and simultaneously provide information about the end-user to the extent described above. Prominent use cases include Facebook’s ‘like button’, YouTube videos or Twitter content within interactive websites or news providers. In this respect, third-party beaconing can contribute elements to the use of web services that are useful for both parties, i.e. the end-user and the host. The technique remains decidedly one-sided, however, whenever it is used for the sole purpose of obtaining information about an end-user and exploiting it without offering anything in return. Analytics tools such as Google Analytics may be listed here, comprising a set of scripts that supply website operators with in-depth statistics on visits to their websites.
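The mechanics of a tracking pixel can be sketched as follows (the host, parameter names `uid` and `page`, and all values are purely illustrative assumptions): the identifying information travels in the URL of the embedded image, so that merely rendering the page or email triggers the disclosure.

```python
from urllib.parse import urlencode

def beacon_url(base, user_id, page):
    """Construct the URL of a 1x1 tracking pixel. Fetching the image
    discloses the access time, the IP address and the parameters below
    to the third-party host, without any visible interaction."""
    params = {"uid": user_id, "page": page}
    return f"{base}?{urlencode(params)}"

# Embedded in a page or email as: <img src="..." width="1" height="1">
url = beacon_url("https://tracker.example.com/pixel.gif", "user-42", "/news")
```

This makes the twofold character discussed below tangible: the browser both executes the retrieval (use of processing capabilities) and conveys identifying information (collection of information).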

Since web beacons usually work without being noticeable to the person concerned and will therefore to a substantial extent be operated without their knowledge, the legislator treats this technique as an equivalent to the above-mentioned use of cookies. Web beacons trigger an exchange of information: on the one hand, the retrieval and processing of content from a third-party server, and on the other hand, the conveyance of identifying information to that server. They thus use the processing capabilities of an end-user’s device and at the same time collect information from it. Their use is therefore in a twofold need of justification under Art. 8 Sec. 1 ePrivacy Regulation.

(2) Passive Device Fingerprinting

As already mentioned under No. 2.b)bb), techniques solely collecting and evaluating a device’s automatically conveyed information, so-called ‘passive device fingerprinting’ measures, are subsumed under variant 3 of Art. 8 Sec. 1 ePrivacy Regulation. Other techniques, in particular ‘active device fingerprinting’ and ‘canvas fingerprinting’, encompass both the use of processing capabilities and the collection of information from terminal devices, so that it is left to the party applying the law whether to invoke variant 1 or variant 3 of Art. 8 Sec. 1 ePrivacy Regulation, or even both cumulatively. Since all variants lead to the same legal consequence under the Regulation, a principal prohibition subject to permission, the classification has no further significance.
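What distinguishes passive fingerprinting technically is that it needs nothing beyond information the device conveys automatically with every request. A minimal sketch (the header values are hypothetical) shows how such data can be condensed into a stable identifier without any script execution on the device:

```python
import hashlib

def passive_fingerprint(headers):
    """Derive a stable identifier solely from information the device
    conveys automatically (HTTP headers), i.e. without actively
    executing code on the terminal equipment."""
    material = "|".join(f"{k}:{headers.get(k, '')}" for k in sorted(headers))
    return hashlib.sha256(material.encode()).hexdigest()[:16]

# Hypothetical request headers, sent automatically by the browser
headers = {
    "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/115.0",
    "Accept-Language": "de-DE,de;q=0.9",
    "Accept-Encoding": "gzip, deflate, br",
}
fp = passive_fingerprint(headers)
```

Active or canvas fingerprinting, by contrast, would additionally run code on the device to elicit further characteristics, which is why those techniques also engage variant 1 (use of processing capabilities).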

dd) Further processing of legally obtained information

Art. 8 Sec. 1 lit. g – lit. i ePrivacy Regulation regulates the processing of information after it has been obtained from end-users’ terminal equipment. The provision thus not only covers the use of processing and storage capabilities of a device and the collection of information from it, but also all subsequent steps of processing. Here, as Recital 20aa ePrivacy Regulation points out, the principle of purpose limitation laid down in Art. 5 Sec. 1 lit. b GDPR applies, so that the use of information collected from terminal equipment of an end-user is only permissible in accordance with the purposes for which it was obtained in the first place, or purposes compatible with them.[53] This textual realisation within Art. 8 Sec. 1 ePrivacy Regulation had not been part of earlier versions. Its addition represents an important complement, making further processing in practice easier and, in particular, legally more certain.[54]

Moreover, whenever personal data is concerned, further processing remains subject to the GDPR.[55] Regarding tracking tools, particularly cookies and fingerprinting processes, this is especially relevant to profiling.[56] According to Art. 4 No. 4 GDPR, profiling is defined as ‘any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements’. Thus, profiling by means of cookie usage, e.g. when users’ interests are evaluated on the basis of their online behaviour in order to deliver personalised advertising, is a form of personal data processing in terms of the GDPR and therefore subject to its requirements, especially Arts. 6 and 13 GDPR.[57] Accordingly, profiling requires a lawful basis under Art. 6 GDPR, and the data subject must be provided with all necessary information relating to the processing of their personal data, such as the identity of the controller and the purposes of processing, Art. 13 GDPR. While the ePrivacy Regulation requires consent in many or most cases for the collection of data from terminal equipment, the GDPR provides alternative justifications for the processing of personal data, for example where the processing is necessary for the performance of a contract (Art. 6 Sec. 1 S. 1 lit. b GDPR) or where the controller pursues a legitimate interest (Art. 6 Sec. 1 S. 1 lit. f GDPR). This can lead to a contradictory situation: the mere collection of personal data from a user’s device may require explicit consent under Art. 8 Sec. 1 ePrivacy Regulation, while the further processing of the collected data could be justified by legitimate interests of the controller.

Situations where a controller needs to obtain consent only for the mere collection of data, but not for its further processing (e.g. for online advertising), are hard to explain. It is consequently not surprising that this contradictory interplay with data protection law was evaluated by the relevant authorities in relation to the preceding norm of Art. 5 Sec. 3 ePrivacy Directive. There seems to be an emerging opinion that where consent is required for specific actions relating to the initial collection of personal data, the controller cannot rely on the full range of justifications provided by Art. 6 GDPR.[58] In other words, a consent requirement under the ePrivacy Regulation for the collection of data also results in a consent requirement within the scope of application of the GDPR concerning the further processing of such data.[59] The question of the relation between the GDPR and the ePrivacy Regulation, however, remains the subject of a separate discussion, which exceeds the scope of the explanations at this point.[60]

For the sake of completeness, it must be noted that the GDPR contains special provisions on automated decision-making based on the results of profiling, or where profiling itself constitutes an automated decision. For instance, Art. 13 Sec. 2 lit. f GDPR obliges the controller to inform a data subject of the existence of automated decision-making in the data processing operation.[61] In addition, Art. 22 GDPR states that data subjects have the right not to be subject to a solely automated decision – including profiling – if this decision produces legal effects concerning the data subject or similarly significantly affects them.

[22] Used here as a general term for the use of processing and storage capabilities and the collection of data.

[23] See No. I.3.

[24] Cf. Arning/Born, in: Forgo/Helfrich/Schneider, Betrieblicher Datenschutz, Part XI. rec. 53 (2019).

[25] Rec. 20 ePrivacy Regulation; cf. also Art. 29 WP, WP 194, Opinion 04/2012 on Cookie Consent Exemption, p. 2 ff.

[26] Cf. Rec. 20 ePrivacy Regulation; Directorate-General for Internal Policies, An Assessment of the Commission’s Proposal on Privacy and Electronic Communications (2017), p. 74 f.

[27] Art. 29 WP, WP 224, Opinion 9/2014 on the application of Directive 2002/58/EC to device fingerprinting.

[28] Art. 5 Sec. 3 ePrivacy Directive.

[29] EDPB, Statement of the EDPB on the revision of the ePrivacy Regulation and its impact on the protection of individuals with regard to the privacy and confidentiality of their communications from 25 May 2018, p. 2.

[30] Steinrötter, in: Specht/Mantz, Handbuch Europäisches und deutsches Datenschutzrecht, 2019, § 5 Rec. 34; cf. also Rec. 20 ePrivacy Regulation.

[31] European Commission, ePrivacy Directive: assessment of transposition, effectiveness and compatibility with proposed Data Protection Regulation, SMART 2013/0071, p. 63 ff.

[32] German Federal Court of Justice (GFCJ), decision of 5 October 2017, I ZR 7/16 – Cookie Einwilligung.

[33] CJEU, judgment of 1 October 2019, C-673/17 – Planet49, Rec. 65.

[34] GFCJ, judgment of 28 May 2020, I ZR 7/16 – Cookie Einwilligung II; further remarks by Schmitz, in: Spindler/Schmit, TMG, § 15 (2018), Rec. 97; National data protection authorities criticised this approach and connected legal uncertainties in regard to the principle of legal clarity, demanding legislator action in this regard: DSK, Orientierungshilfe der Aufsichtsbehörden für Anbieter von Telemedien (2019), p. 3 ff.

[35] Recital 20 ePrivacy Regulation seeks to establish an all-encompassing applicability to intrusions via end-users´ terminal equipment. Sentence 5 mentions “techniques that surreptitiously monitor the actions of end-users, for example by tracking their activities online (…)”. This shows the non-exhaustive nature of various examples that also, but not exclusively, cover online activities.

[36] While the ePrivacy Regulation does not explicitly regulate offline media, a respective regulation can be derived from a comparison to Art. 5 Sec. 3 ePrivacy Directive; see also rec. 65 Directive 2009/136/EC (‘Cookie Directive’).

[37] Steinrötter, in: Specht/Mantz, Handbuch Europäisches und deutsches Datenschutzrecht, 2019, § 5 Rec. 34.

[38] En detail, see No. I.3.

[39] Explanatory Memorandum ePrivacy Regulation, 3.1; however, the Art. 29 WP, Opinion 9 (2014) on the application of Directive 2002/58/EC to device fingerprinting considered certain methods of device fingerprinting to be subject to Art. 5 Sec. 3 ePrivacy Directive.

[40] For an overview see Alich/Voigt, CR 2012, 344; for further reference see ‘Device fingerprint’, in: Wikipedia, retrieved 18 January 2022, from: https://en.wikipedia.org/wiki/Device_fingerprint.

[41] For detailed information and evaluation of Device Fingerprinting under the ePrivacy Directive see Art. 29 WP, Opinion 9/2014 (2014), pp. 4-7.

[42] For further reference see ‘Canvas fingerprinting’, in: Wikipedia, retrieved 18 January 2022, from: https://en.wikipedia.org/wiki/Canvas_fingerprinting.

[43] For the prohibition of the use of processing capabilities according to Art. 8 Sec. 1 ePrivacy Regulation see below, section (C).

[44] Rec. 20 ePrivacy Regulation.

[24] See No. I.3.b).

[45] This approach had already been subject of the preceding ePrivacy Directive, cf. Rec. 24 Directive 2002/58/EC; Art. 29 WP, WP 171 (2010), p. 9.

[46] Cf. Art. 7 Charter of Fundamental Rights of the European Union; in this regard Jarass, in: Jarass, EU-Grundrechte-Charta, Art. 7 (2021), Rec. 14, states that respect for private life regards first and foremost the actions within privacy. By implication the ability to pursue such actions is a precondition for privacy; cf. for the case of German law also Rixen, in: Sachs, Grundgesetz, Art. 2 (2021), Rec. 69.

[47] Cf. Rec. 25 ePrivacy Regulation.

[48] Art. 5 Sec. 3 ePrivacy Directive, see Art. 29 WP, WP 171 (2010), p.8.

[49] Cf. Rec. 25 ePrivacy Regulation.

[50] See Rec. 20, 21 ePrivacy Regulation; Example: Personal data, like phone book files or text messages as well as configuration data, like the set language and descriptive data, considering the type of device or version of the installed operating system, are both protected.

[51] A detailed description of the web beacon technology is provided by ibi research an der Universität Regensburg, eCommerce Leitfaden, chapter 3.3, available at https://www.ecommerce-leitfaden.de/ecl-v2/140-kapitel-3-lasst-zahlen-sprechen-kontinuierliche-verbesserung-durch-web-controlling#anchor_3_3_2, last retrieved 18 January 2022.

[52] See Schleipfer, ZD 2017, 460, 461 for an instructive presentation of the individual processing steps necessary for web tracking.

[53] For details see No. I.3.h).

[54] This finding, which had also been subject to discussions about German legislation (cf. Datenschutzkonferenz, OH Telemedien 2021, 20 December 2021, p. 5 with reference to EDPB, Opinion 5/2019 on the interplay between the ePrivacy Directive and the GDPR, 12 March 2019, Rec. 21 ff.), leads to the problematic result that for the same practical activity, two or even more national data protection agencies might be responsible.

[55] See Schleipfer, ZD 2017, 460, 461 for an instructive presentation of the individual processing steps necessary for web tracking.

[56] See Rec. 72 GDPR. However, further requirements will often apply, e.g. the requirement to implement appropriate technical and organisational security measures, or potentially the requirement to conduct a data protection impact assessment.

[57] EDPB, Opinion 5/2019 on the interplay between the ePrivacy Directive and the GDPR, in particular regarding the competence, tasks and powers of data protection authorities, rec. 40 (2019).

[58] For the case of so called Real Time Bidding, cf.: UK ICO, Update report into adtech and real time bidding, 20 June 2019, https://ico.org.uk/media/about-the-ico/documents/2615156/adtech-real-time-bidding-report-201906-dl191220.pdf, last retrieved 18 January 2022; Voigt, Stellungnahme des UK ICO zu Real Time Bidding, 28 June 2019, https://www.taylorwessing.com/de/insights-and-events/insights/2019/06/stellungnahme-des-uk-ico-zu-real-time-bidding, last retrieved 18 January 2022.

[59] For details see No. I.5, and, with regard to the specific question on the requirement of an additional legal basis for further processing, No. I.3.h).

[60] So does Art. 14 Sec. 2 lit. g) GDPR accordingly in cases where the personal data was not directly obtained from the data subject.

Art. 8 Sec. 1 ePrivacy Regulation contains an exhaustive list of legal bases that allow for an interference with the integrity of end-users’ terminal equipment under strict conditions.[61] They can be divided into deliberate consent on the one hand and legal justifications on the other. Most of these justifications, however, have a narrow and rather technical scope of application. In practice, interference with the integrity of terminal equipment will therefore predominantly rely on consent (Art. 8 Sec. 1 lit. b ePrivacy Regulation).[62] From a legislative point of view, this state of affairs is deplorable, since, compared to the GDPR, the set of permissions could have been significantly larger.[63] Indeed, within the legislative process voices were raised promoting an alignment of the catalogue with the extensive examples of the GDPR. For instance, the Council promoted the implementation of conventional permissions, such as the performance of a contract as in Art. 6 Sec. 1 lit. b GDPR.[64] In particular, a balancing and catch-all clause comparable to Art. 6 Sec. 1 S. 1 lit. f GDPR was demanded to provide for new or possibly unforeseen future cases. Yet, in contrast to Art. 6 ePrivacy Regulation, such an alignment was eventually not implemented. Instead, the legislator maintained a set of rather specific and fragmentary exceptions.

It has to be noted that even where the conditions of one of the legal permissions in Art. 8 Sec. 1 ePrivacy Regulation are met, only activities that are necessary for the respective purpose can be justified. When it comes to cookies, for example, their lifespan has to be adjusted to their respective purpose: once the purpose has been fulfilled, the cookie has to expire. To that end, it will in many cases be necessary to work with so-called session cookies.[65]
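The technical lever for limiting a cookie’s lifespan is its expiry attribute. A minimal sketch (cookie names and values are hypothetical): a `Set-Cookie` header without `Max-Age` or `Expires` creates a session cookie, discarded when the browser session ends, while adding `Max-Age` makes the cookie persist for exactly that duration.

```python
def set_cookie_header(name, value, max_age=None):
    """Build a Set-Cookie header. Without Max-Age/Expires the result is a
    session cookie, which the browser discards at the end of the session;
    with Max-Age the cookie persists for the stated number of seconds."""
    header = f"Set-Cookie: {name}={value}; Path=/; Secure; HttpOnly"
    if max_age is not None:
        header += f"; Max-Age={max_age}"
    return header

session = set_cookie_header("cart", "item-17")               # expires with the session
persistent = set_cookie_header("lang", "de", max_age=86400)  # kept for one day
```

Choosing between the two variants is how the proportionality requirement described above is implemented in practice: the lifespan is set no longer than the purpose demands.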

Aside from cases where consent of the end-user is obtained, the ePrivacy Regulation does not require the party relying on one of the justifications for an interference with terminal equipment to inform the end-user about this activity. However, where an interference involves the processing of personal data, the information obligations pursuant to Arts. 13 and 14 GDPR have to be observed. These require that, in addition to the legal basis for the processing, the data subject be provided with all information necessary to assess the scope and significance of the processing of their personal data. This information needs to contain, inter alia, the identity of the controller, the purposes for which the data is collected, the foreseen duration of data storage, and any intended transfers of data to third parties. Furthermore, data subjects must be informed about their individual rights, particularly any rights of objection. Only in strictly exceptional cases, for instance if the data subject already has all the information mentioned in Art. 13 GDPR (which will rarely be the case), do the information obligations not apply.[66]

a) Provision of an electronic communication service (Sec. 1 lit. a)

Art. 8 Sec. 1 lit. a ePrivacy Regulation allows for the use of processing or storage capabilities or the collection of information if this is necessary for the sole purpose of providing an electronic communication service. Compared to earlier versions of this provision[67], the wording was changed in the course of the Council’s proposal, replacing the formulation ‘necessary for the sole purpose of carrying out the transmission of an electronic communication over an electronic communications network’ with the above-mentioned version. Thus, the particularly narrow scope of the earlier exemption, which only covered a concrete act of transmission, was slightly expanded. It now encompasses any (necessary) act related to the provision of an electronic communication service. The justification can be invoked by any party that is legitimately part of the transmission of an electronic communication, be it a provider of an electronic communications network or service, another service provider or the recipient of the communication.

The term ‘necessary’ is defined neither in the ePrivacy Regulation nor in the GDPR. It therefore seems reasonable to construe the term in accordance with Art. 5 Sec. 1 lit. b GDPR. In that sense, ‘necessary’ relates only to activities to which there are no meaningful and appropriate alternatives.[68] Accordingly, the activity has to be ‘necessary’ for the sole purpose of providing an electronic communication service pursuant to Art. 8 Sec. 1 lit. a ePrivacy Regulation. A comparison with Art. 5 Sec. 3 ePrivacy Directive reveals that no further limitations, in particular no requirement of ‘strict’ necessity, have been added to the text. However, this does not imply that the assessment of purpose and means is left to the respective party’s discretion. Exceptions must, as always, be interpreted narrowly.[69] Here, this already follows from the legislator’s express intent according to Recital 21 ePrivacy Regulation, stating that the use of processing and storage capabilities of terminal equipment or access to information stored in it should be ‘limited to situations that involve no, or only very limited, intrusion of privacy’. Thus, the necessity of an action must be considered in the light of a strict assessment of what is actually needed, as opposed to what is merely helpful. Scholars have developed different approaches to specifying what is necessary.[70] Steinrötter has noted critically that prevailing technical communication standards should not be decisive.[71] Rather, what is technically feasible and sensible has to be taken into account. This opinion corresponds to the legislator’s normative approach, which also focuses on technical leeway and its possible elaboration rather than its present state.[72] A measure, consequently, is only necessary if the provision of the respective service cannot be carried out without it.[73]

Nonetheless, some ambiguity on the definition of ‘necessary’ remains in this context, especially when it comes to convenience features such as detecting and collecting the screen size of a device in order to deliver a mobile-optimised version of a site, or detecting the browser used in order to enable browser-specific features. While such convenience and usability features might not strictly be ‘necessary’ for the provision of the service, the Art. 29 WP considered short-term access for the purpose of ‘adapting the content to the characteristics of the device’ to be covered by a comparable previous legal permission in the ePrivacy Directive.[74] That finding can be transferred to Art. 8 Sec. 1 lit. da ePrivacy Regulation.

The provision of electronic communications services and related activities includes various examples, some of which are described in Recital 21 ePrivacy Regulation by the legislator itself. Mainly, it distinguishes between cookies and services related to the Internet of Things (IoT)[75]. Accordingly, storing information in or accessing information from a smart meter ‘might be considered as necessary for the provision of a requested energy supply service’, however only insofar as it is necessary to ensure the stability or security of the energy network or the billing of end-users’ energy consumption. Also, automated and connected vehicles might be accessed, e.g. by the manufacturer or software operator, for security-related software updates.

aa) Conflict with the layout of technical protocols

The technical foundations and protocols of electronic communications networks in general and the internet in particular are designed not with regard to privacy needs, but for robustness of connections, compatibility and, to a lesser degree, convenience. Therefore, not all of the information processed in the course of the use of standard communications protocols is strictly necessary in the aforementioned sense.[76] Under the strict understanding of the term described above, such network systems or parts of them are therefore not in fact ‘necessary’ to provide the actual service, yet they form an integral and essential part of it. The ePrivacy Regulation does not offer guidance on how to resolve this contradiction, and it has not been addressed in deliberations by the European Parliament or the Council of the European Union as of now. As a practical solution, however, it seems advisable to understand Art. 8 Sec. 1 lit. a ePrivacy Regulation as allowing the use of state-of-the-art protocols whenever there are no obvious, more privacy-focused alternatives.

bb) Cookies for transmission purposes

Beyond the cases mentioned above, the most prominent examples include the use of cookies. Indeed, only few types of cookies can be justified as ‘necessary for the sole purpose of providing an electronic communication service’ under Art. 8 Sec. 1 lit. a ePrivacy Regulation. Though not explicitly defined in the Regulation, an outline of this term becomes visible when comparing the provision to its predecessor, Art. 5 Sec. 3 ePrivacy Directive. Such a comparison is admissible, since the Directive referred to the more specific act of transmitting ‘an electronic communication over an electronic communications network’, which represents a mere part of the superordinate provision of an electronic communication service. Thus, the case groups developed for Art. 5 Sec. 3 ePrivacy Directive apply accordingly.[77] Transmission purposes in this respect are: (1) routing information over the network, notably by identifying the communication endpoints, (2) exchanging data items in their intended order, notably by numbering data packets, and (3) detecting transmission errors or data loss.
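Case group (1), routing, can be sketched with a hypothetical ‘sticky session’ cookie of the kind used by load balancers (the cookie name `SERVERID` and the backend names are illustrative assumptions, not taken from the Regulation):

```python
BACKENDS = ["backend-1", "backend-2", "backend-3"]

def route(cookies):
    """Sticky-session routing: a cookie pins the client to one backend
    server so that all requests of the same communication reach the same
    endpoint -- a routing purpose in the sense of case group (1)."""
    if "SERVERID" not in cookies:
        cookies = dict(cookies)
        cookies["SERVERID"] = BACKENDS[0]  # first contact: assign a backend
    return cookies["SERVERID"], cookies

backend, cookies = route({})       # first request: a backend is assigned
backend_again, _ = route(cookies)  # later requests: same backend ("sticky")
```

A cookie of this kind serves only to keep the transmission itself working and stores no information beyond the chosen endpoint, which is why it is a plausible candidate for the lit. a justification.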

b) Consent by the end-user (Sec. 1 lit. b)

Art. 8 Sec. 1 lit. b ePrivacy Regulation exempts interferences with the integrity of terminal equipment on the basis of the end-user’s consent. Pursuant to Art. 4a Sec. 1 ePrivacy Regulation, consent is defined in accordance with Art. 4 No. 11 GDPR. It refers to ‘any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her’. It encompasses both personal and other data, such as anonymised or technical information.[78] Corresponding to similar findings within the scope of application of Arts. 6 Sec. 1 lit. a and 9 Sec. 2 lit. a GDPR, consent is of high practical importance to the ePrivacy Regulation. It will often represent the only permissible way to interfere with an end-user’s device, because Art. 8 Sec. 1 ePrivacy Regulation only provides for a narrow scope of alternative justifications. For the legal context of the term and the general requirements for the declaration of consent, the comments on Art. 4a ePrivacy Regulation apply. Here, by contrast, only aspects specific to the handling of terminal equipment are considered, particularly with regard to the usage of cookies and other tracking mechanisms.

aa) Consent subject

Consent refers to the active approval of a specific interference with a terminal device’s integrity. It cannot be given in the form of a mere blanket declaration, but instead has to be specific with regard to the approved measures and purposes.[79] In order to obtain consent, the entity using processing and storage capabilities or collecting information from terminal equipment needs both to identify the relevant consent subject and to make sure it fulfils all consent requirements. Particularly the former may cause complications, as there might be multiple persons and entities interested in the integrity of the same device (e.g. in employment relationships or with household-shared devices).[80] In practice, obtaining consent will therefore often come down to the actual or ‘final’ end-user, i.e. the operating individual.[81] That is justified particularly if information is collected from him or her, since it is then their privacy that is being interfered with. Yet, as Art. 4 Sec. 2a ePrivacy Regulation clarifies, this will only suffice as far as the provider is (definitely) not able to identify the data subject. Aiming at a more holistic justification, the respective service provider thus needs to consider procedures that are also available to the device’s owner or the entity requesting the service.[82] As Recital 20aaaa ePrivacy Regulation clarifies, this will be necessary particularly in cases where the interference with device integrity goes beyond the actual features of the justifications under lits. a and c – f. In this regard, a practical solution might be to either periodically ask for a renewal of consent or to include device settings (cf. the stipulations in Art. 4 Sec. 2 ff. and Rec. 20a ePrivacy Regulation).

bb) Consent addressee

Generally, when relying on consent as the legal basis for an interference with end-users’ terminal equipment, the entity making use of its processing and storage capabilities or collecting information from it is responsible for demonstrating that valid consent has been obtained.[83] Yet, this does not apply without exception. According to the general principles of the distribution of the burden of proof, it is always the party invoking circumstances in its favour[84] or, respectively, asserting the exception to a particular rule,[85] who must provide evidence. Since Art. 8 Sec. 1 ePrivacy Regulation generally prohibits device interference, this means that third parties, too, are subject to the requirement whenever they rely on the exception pursuant to Art. 8 Sec. 1 lit. b ePrivacy Regulation. This will in particular be the case whenever a third party sets non-functional cookies. Yet, since, apart from cases where consent is given by a technical setting, third parties will have no technical possibility to display appropriate information and consent banners to the end-user, they have to rely on the website operator to comply with all respective requirements.[86]

In this regard it is questionable whether, in addition to the third party, website operators themselves are also responsible for both obtaining and demonstrating consent under Art. 8 Sec. 1 lit. b ePrivacy Regulation, even though they only indirectly interfere with devices’ integrity. This question corresponds to a similar problem within Arts. 4 No. 7 and 26 GDPR, pursuant to which the model of joint controllership applies whenever various persons hold joint decision-making powers over the purposes and means of processing of personal data.[87] Joint controllership encompasses both ‘classical situations’, where responsibilities and powers with regard to specific processing are equally distributed, and cases of pluralistic control and cooperation following from more complex business structures.[88] In its Wirtschaftsakademie decision[89] the CJEU stated that with regard to this model it suffices if the website operator causally enables a third party to process personal data of the operator’s website users.[90] Thus, joint controllership might apply even though the website operator does not have any direct access to the personal data at hand.[91] Transferred to Art. 8 ePrivacy Regulation, this would mean that whenever a website operator enables third parties to place cookies or otherwise interfere with end-users’ devices, the operator is directly responsible in the above-mentioned sense.

Yet, in light of the systematics, interests and recitals of the ePrivacy Regulation, this assumption must be called into question. Recital 20aaa ePrivacy Regulation stipulates that the responsibility for obtaining consent lies with the entity making use of the terminal equipment. According to its wording, such entities “may request another party to obtain consent on their behalf”. Consequently, only direct interferences invoke the requirement to obtain consent, while indirect interferences (i.e. those of a website operator) are generally not covered by it. Thus, only the third parties placing the cookie are addressees of the provision. Website operators, conversely, can only be subjected to the requirement by way of an agreement under the law of obligations. This also makes sense in view of the general distribution of interests. If shared responsibility followed directly from Art. 8 Sec. 1 lit. b ePrivacy Regulation, adequate and comprehensive justification might be jeopardized by third parties’ efforts to evade responsibility at the expense of the more visible website operators. Compliance with the standards of the ePrivacy Regulation would thus be shifted to the side of legal remedies under Arts. 21 ff. ePrivacy Regulation, rather than initial safeguards. Also, from a systematic perspective, this interpretation does not represent a deviation from the stipulations under Art. 26 GDPR. That is because, according to the CJEU, one must differentiate between joint responsibility on the one hand and equal responsibility on the other.[92] Since operators and third parties may be involved at different stages of processing and to different degrees, the level of responsibility must be assessed with regard to each case individually.[93] Generally speaking, this means that the degree of interference has to be weighed against the operability of obtaining and demonstrating consent in each case. Particularly in the case of a high degree of autonomy in handling the cookies, this weighing must work to the detriment of third parties. Consequently, a shift of responsibilities to the website operator will only apply by means of a private obligation and not by virtue of Art. 8 Sec. 1 lit. b ePrivacy Regulation itself. Thus, third parties will continue to rely on commissioning website operators to comply with the consent requirements.

cc) Consent by technical settings

A distinctive feature of consent under Art. 8 Sec. 1 lit. b ePrivacy Regulation is that it can be expressed by an appropriate technical setting of software, e.g. a web browser (Art. 4a Sec. 2 ePrivacy Regulation).[94]

dd) Consent into the use of cookies

One of the European Commission’s main goals with the introduction of the ePrivacy Regulation was to combat the so-called ‘consent fatigue’ resulting from the ubiquitous use of cookies and the corresponding consent requests.[95] Nearly every website asks end-users for consent via ‘cookie banners’. Yet, in the absence of an applicable ePrivacy Regulation, the competent data protection authorities relied on the GDPR as the main legal basis for the lawfulness of cookie placement. With the ePrivacy Regulation, particularly Art. 8, coming into force, the GDPR is replaced as the primary legal framework for handling cookies. However, since the other legal permissions under Art. 8 Sec. 1 ePrivacy Regulation have a very narrow scope of application, consent will remain the primary legal basis. Therefore, the legal situation is not expected to change dramatically. This is particularly due to the lack of a legal basis relying on the legitimate interests of providers.[96]

(1) Active approval – “opt-in”-declaration

According to Art. 4 Sec. 1 of the ePrivacy Regulation, the procedure of obtaining consent under the regime of the Regulation shall follow the same rules as included in the GDPR, in particular Art. 4 Sec. 11 and Art. 7. Thus, valid consent requires an unambiguous statement or a clear affirmative action of the user (Art. 4 Sec. 11 GDPR).[97] Traditional cookie banners that merely inform about the placement of cookies and do not demand affirmative action by the user are insufficient to obtain consent. Instead, consent has to be obtained by active approval of the end-user.[98] This approval has to be obtained as a separate act clearly distinguishable from other matters.

In its Planet49 decision, the CJEU clarified that pre-ticked boxes, or other so-called “opt-out” solutions, are insufficient to obtain valid consent in terms of the GDPR and the ePrivacy Directive.[99] Besides the CJEU, this view has now also been endorsed by the EDPB,[100] the German Data Protection Conference (Datenschutzkonferenz)[101] and the Federal Court of Justice of Germany.[102] In summary and simply put, website operators must obtain the explicit consent of their users for the setting of non-functional[103] cookies by means of a so-called “opt-in”-declaration. The default setting relating to such cookies must be set to “reject” or “deactivated”, and the cookies may only be set if and after the user changes this manually. The CJEU has further clarified that this applies regardless of whether ‘personal’ data is processed through the cookie placement or not. Thus, its judgement is of particular relevance to the ePrivacy Regulation.[104] Whether a website operator continues to use a non-compliant opt-out procedure to obtain consent despite these developments is easily recognizable to the public, and thus generally also to competitors and consumer protection organizations. Consequently, operators who do not implement a proper opt-in procedure expose themselves to legal remedies.[105]
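The opt-in logic described above can be illustrated in code. The following is a minimal sketch only, not a reference implementation: all function and field names are hypothetical, and a real consent management layer would persist this state and tie it to the banner UI. The point it demonstrates is that the default is “reject” and only an explicit affirmative action changes it.

```javascript
// Default state: non-functional cookies are rejected until the user opts in.
function createConsentState() {
  return { nonFunctionalCookiesAllowed: false }; // "reject"/"deactivated" by default
}

// Only an explicit, affirmative user action ("opt-in") changes the state.
function recordOptIn(state) {
  return { ...state, nonFunctionalCookiesAllowed: true };
}

// A pre-ticked box or mere inactivity must never be treated as consent.
function maySetNonFunctionalCookie(state) {
  return state.nonFunctionalCookiesAllowed === true;
}
```

A pre-ticked box would correspond to `createConsentState()` already returning `true`, which is exactly what Planet49 rules out.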

(2) Informed decision

Furthermore, consent needs to be specific and informed, in accordance with the transparency requirements of Art. 5 GDPR.[106] Data subjects and end-users have to be informed about all relevant circumstances regarding the cookies that are about to be placed. For this purpose, the GDPR stipulates specific information obligations that need to be fulfilled prior to obtaining consent for data processing. Those obligations are determined in Arts. 13, 14 GDPR and can be transferred to the ePrivacy Regulation accordingly (cf. Art. 4a Sec. 1 ePrivacy Regulation).[107] Pursuant to Arts. 13, 14 GDPR, information must be sufficient in extent, yet provided in clear and plain language. That means the end-user must at least be informed about:

the identity of the actor placing the cookie,

the kind of information that is being stored in the cookie,

the purposes for which the information is stored and read,

the type of cookie (first or third party, session or persistent), and

the right to withdraw consent at any time.[108]
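As an illustration, the required information items could be captured in a structured record that a consent banner renders before requesting consent. The field names and the completeness check below are hypothetical modelling choices, not prescribed by the GDPR or the ePrivacy Regulation:

```javascript
// Hypothetical consent-notice record covering the items listed above.
const cookieNotice = {
  controller: 'Example GmbH',                  // identity of the actor placing the cookie
  storedInformation: 'pseudonymous user id',   // kind of information stored in the cookie
  purposes: ['reach measurement'],             // purposes for storing and reading
  cookieType: { party: 'first', lifetime: 'session' }, // first/third party, session/persistent
  withdrawalNotice: 'Consent can be withdrawn at any time via the cookie settings.',
};

// A banner should only request consent once every required item is present.
function isNoticeComplete(n) {
  return Boolean(
    n.controller &&
    n.storedInformation &&
    Array.isArray(n.purposes) && n.purposes.length > 0 &&
    n.cookieType && n.cookieType.party && n.cookieType.lifetime &&
    n.withdrawalNotice
  );
}
```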

(3) Genuine choice

The specific design of cookie banners capable of producing effective consent has long been subject to debate under the GDPR. In this context, it has been discussed in particular how to deal with the requirement of voluntariness whenever a data subject is only offered the choice between accepting cookies (or other tracking mechanisms) and leaving the website, so-called “cookie walls”.[109] The EDPB and various national data protection authorities took the position that such a procedure is unlawful. Given the lack of a genuine choice, the activation of cookies would not be based on a voluntary decision.[110] Users would instead need to be given an opportunity to refuse consent and still access the content of the website.[111] This required websites to explicitly inform the user accordingly.[112] Others supported the view that at least in certain cases, cookie walls allowed for a genuine choice. Those cases typically applied where an equivalent service was available without tracking. Paid services, such as news portals, were cited as examples.[113] However, it is correct to assume that cookie walls were in principle already permissible under the status quo ante. This follows from the fact that, in light of the comprehensive consent requirement under the GDPR, cookie walls are the only option compatible with both the regulatory requirements of the public and the economic reality of the free digital market. Online services are provided because they allow direct or indirect remuneration. Consequently, it is a natural prerequisite that websites are accessible only if such remuneration is provided. Insofar as the means of payment deviate from the “offline” reality, this merely reflects the specific characteristics of online trade and is not per se illegitimate. Rather, the disclosure of data serves as one particular and feasible alternative. Rec. 24 f. and Art. 3 Sec. 1 S. 2 Directive (EU) 2019/770 confirm this finding, stating that data serve as a common means of remuneration for unpaid online services.[114] Thus, cookie walls safeguard providers’ remuneration and serve as an essential part of the balancing of interests. Their prohibition, conversely, undermines this balance and evokes dysfunctional effects. It extends the consent authority of end-users into the providers’ sovereign sphere, which in the first place encompasses the decision on whether and how to deliver a service, and in this way overstretches the actual functionality of consent. In fact, a normative consideration requires limiting the scope of ‘genuine choice’ and applying the threshold of access to the requested service as a boundary line. A ‘voluntary decision’ consequently relates only to the sphere of the provider, which is able to decide on the delivery of a service and its conditions in the first place. The end-users, by contrast, are merely left with the choice of either using the service or refraining from it completely.

A prohibition of cookie walls was also discussed within the legislative proceedings regarding the ePrivacy Regulation. The first amendments by the European Parliament of 20 October 2017 even included a proposal prohibiting making access to online services conditional on consent to online tracking.[115] The Council did not follow through on these attempts. Both the draft of 10 February 2021 and that of 4 November 2021 omitted the provision, relegating the topic to the recitals.[116] Now, according to Rec. 20aaaa ePrivacy Regulation, cookie walls are not generally considered as depriving the end-user of a genuine choice. In particular, if the end-user is able to choose between offers including and not including tracking mechanisms (such as paid services), cookie walls are considered admissible.[117] This rule-exception relationship follows from the legal systematics, with the superordinate provision of Art. 8 Sec. 1 lit. b ePrivacy Regulation as the rule and its explanatory Recital 20aaaa as the exception. Accordingly, consent by the end-user is generally decisive. “In some cases”, pursuant to Recital 20aaaa ePrivacy Regulation, a deprivation of genuine choice is however to be assumed if an imbalance between the end-user and the service provider emerges. Such an imbalance may, for example, follow from a dominant market position of the service provider or from services provided by public authorities. The legislator also presumes an imbalance where the end-user has only few or no alternatives to the service.

According to the wording of Recital 20aaaa, the provider should make the alternative service available himself. However, constellations are conceivable in which a referral to equivalent alternatives offered by third-party services suffices. Since in some cases the end-user will be provided with a choice between a paid service and a cookie-based one, such an alternative might consist of a free-of-charge service with a comparable range of features.[118]

(4) Further requirements

In view of the further design of mechanisms for obtaining consent, competent authorities have sought to establish regulatory standards under the legal situation prior to the adoption of the ePrivacy Regulation.[119] Accordingly, a declaration of consent is considered valid only if end-users are provided with an overview of all respective tracking processes and the actors involved.[120] The standards apply to each cookie and each purpose separately. Thus, the aforementioned information must be available for every cookie used, in the form of a differentiated display.[121] Having said that, consent itself does not have to be given for each individual act of processing, but rather in a bundled way, distinguishing only the different purposes of processing.[122] This opinion aligns with Recital 32 GDPR, stating that “consent should cover all processing activities carried out for the same purpose or purposes”. Otherwise, the differentiation of consent actions would negatively affect the simplicity of the process and thus de facto make access to the respective website considerably more difficult.[123] The required granularity consequently differs between the information on the one hand and the respective consent on the other.

Consent only covers the purposes it was once obtained for. When additional purposes are being pursued, new and specific consent has to be obtained for each individual purpose.[124] This follows from the principle of purpose limitation laid down in Article 5 Sec. 1 lit. b GDPR.[125] Yet, as an exception, Art. 8 Sec. 1 lit. g ePrivacy Regulation stipulates certain conditions under which additional purposes might be pursued without specific consent.[126]

Furthermore, the end-user has the right to withdraw his or her consent at any time (Art. 7 Sec. 3 GDPR). Withdrawal must be as easy as providing consent, meaning that websites have to include a corresponding withdrawal option that is easy to access. Website operators have to remove the respective cookies from the end-user’s terminal equipment as soon as consent is withdrawn.[127] However, the consent continues to serve as a justification for any cookie operation carried out until that point in time.
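The temporal effect of withdrawal (consent-based cookies must be removed going forward, while operations before withdrawal remain justified) can be sketched as follows. The store shape, cookie names and function names are hypothetical:

```javascript
// Remove every cookie that relied on the withdrawn consent and record the time.
function withdrawConsent(store, consentBasedCookies, withdrawalTime) {
  for (const name of consentBasedCookies) {
    delete store.cookies[name];
  }
  store.consentWithdrawnAt = withdrawalTime;
  return store;
}

// Operations carried out before the withdrawal stay covered by the consent;
// later ones would need a different legal basis.
function operationWasLawful(store, operationTime) {
  return store.consentWithdrawnAt === undefined
      || operationTime <= store.consentWithdrawnAt;
}
```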

With regard to the form of consent, the CJEU decided in its landmark Planet49 judgment that no visual separation from other declarations is required.[128] Thus, consent may be given together with other declarations, e.g. the acknowledgment of the privacy policy.[129] Yet, in the interest of transparency, visual partitioning can be advisable.

Finally, it is important to make sure that cookies relying on consent are placed on the end-user’s terminal equipment only after consent has been properly obtained. Contrary action will violate the ePrivacy Regulation and can be subject to fines according to Art. 23 ePrivacy Regulation and Chapter VIII of the GDPR.[130]

Example: O operates the website example.com. When a visitor enters this website, a cookie that enables web audience measuring is immediately placed on the visitor’s computer. At the same time, a persistent cookie that serves the purpose of tracking the visitor’s browsing behaviour across the site is placed. The visitor is shown a dialogue box asking for consent to the placement of tracking cookies, along with the necessary information. This design violates Art. 8 Sec. 1 ePrivacy Regulation. While the placement of the web measurement cookie is permitted by Art. 8 Sec. 1 lit. d ePrivacy Regulation, the other cookie must not be placed before the visitor’s consent has been properly obtained.
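The example can be restated as a simple gating rule. The category names and the mapping below are a hypothetical simplification of the legal bases discussed in this section, not an exhaustive classification:

```javascript
// Decide whether a cookie may be placed, given its (hypothetical) category
// and whether the visitor has already given consent.
function mayPlaceCookie(category, consentGiven) {
  switch (category) {
    case 'audience-measurement':   // lit. d: permitted without consent (conditions apply)
      return true;
    case 'essential':              // lit. c: strictly necessary for the requested service
      return true;
    case 'tracking':               // lit. b: only after consent is properly obtained
      return consentGiven === true;
    default:                       // anything else needs its own legal basis
      return false;
  }
}
```

In the example above, O places the tracking cookie while `consentGiven` is still `false`, which is precisely the violation.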

ee) Consent to the use of processing capabilities

The requirements for consenting to the use of processing capabilities of a device are, in general, the same as those for consenting to the use of cookies.[131] It is equally important to obtain valid consent from the end-user concerned before engaging in any such activity.

As of now, there is no established technical solution in web browsers or other software that would allow for consent to such measures by a technical setting. While there is usually an option to disable active content like JavaScript altogether, it cannot be assumed that a user who does not disable active content consents to the execution of such code (akin to an “opt-out”). At the same time, third parties are not able to prompt a dialogue box to the end-user in the way the website operator can in order to ask for consent. Therefore, they have to rely on the first-party website operator, which includes their code, to obtain valid consent on their behalf as well.

ff) Consent to device fingerprinting

Device fingerprinting is either a collection of information from terminal equipment, a use of its processing capabilities or a mixture of both, depending on the actual implementation as active or passive device fingerprinting.[132] Therefore, the remarks above on cookie consent and on consent to the use of processing capabilities apply accordingly.

c) Provision of specifically requested services (Sec. 1 lit. c)

Art. 8 Sec. 1 lit. c refers to the use of an end-user’s terminal equipment that is strictly necessary for providing a service he or she has specifically requested. This exception corresponds almost word-for-word to its predecessor, Art. 5 Sec. 3 S. 2 ePrivacy Directive.[133] Only upon closer examination do small but significant alterations become apparent, i.e. the extension of the term ‘services’, now encompassing not only services of the information society (cf. under No. I.3.e)aa) but all kinds of services. While the former classically included media, search engines, online shops or other publicly available services, the latter adds particularly those which had previously not been encompassed, e.g. all services under the reference of Art. 4 Sec. 1 lit. b ePrivacy Regulation, Art. 2 EECC (i.e. electronic communications services and interpersonal communications services[134]) and services excluded by interpretation pursuant to Annex I of Directive (EU) 2015/1535.[135] Recital 21aa specifies that such services, pursuant to the new layout of the term, might be of a journalistic nature, e.g. online newspapers or other press publications. These, being oftentimes wholly or mainly financed by advertising, must provide end-users with “clear, precise and user-friendly information about the purposes of cookies or similar techniques” and await their consent. However, it must be noted that this consent requirement only applies where cookies are actually used. Services displaying untargeted advertising are free to do so without additional consent. Initial information regarding the end-user’s choice to either use a paid, advertisement-free service or a free-of-charge one financed by advertising must be distinguished from the above and is left to the provider’s sole discretion.

Services are specifically requested by the user if he or she took positive action to request a service within a clearly defined perimeter.[136] Such action can take various forms: the visit of a website, the use of a messenger service or any comparable form of implied request. Conversely, an explicit request is not required. End-users, for example, do not have to sign a form, tick a box or otherwise expressly state their intent to use a certain service.

‘Strict necessity’, as laid out under No. I.3.a), requires that there are no meaningful and appropriate alternatives.[137] In the most obvious example, the visit of a website, the transmission of the URL to the device and of the IP address from it is strictly necessary. In order for the website to be displayed correctly, processing also concerns information about the device settings, both of the hardware and the software, such as the screen resolution, operating system and browser.[138] Last but not least, in many cases the processing and temporary storage of a session ID will also become necessary within this procedure.

Another use case particularly relevant to the provision of Art. 8 ePrivacy Regulation is cookies, which in some cases may be considered “necessary for providing an information society service explicitly requested by the end-user” in terms of Art. 8 Sec. 1 lit. c ePrivacy Regulation, even though their range needs to be construed very narrowly. To this end, the Art. 29 WP published an interpretation of Art. 5 Sec. 3 ePrivacy Directive[139] which, in terms of object and purpose, remains valid and can be directly transferred to the new ePrivacy Regulation (cf. No. I.3.a)bb). According to this interpretation, cookies are viewed as necessary and allowed only if they are “essential to the service”, meaning that without them a functionality that has been explicitly requested by the user, as part of a certain service, would not be available at all. These cookies are called ‘functional’ or ‘essential cookies’. Examples of types of explicitly allowed essential cookies are as follows:

User input / shopping cart cookies: In line with the Art. 29 WP interpretation, Recital 21 ePrivacy Regulation clarifies that the storing of cookies might be permissible “for the duration of a single established session on a website to keep track of the end user’s input when filling in online forms over several pages”. Here, most probably, the legislator had in mind the time-consuming procedures required in various cases, e.g. when creating user profiles on websites. Even where not strictly necessary for the provision of a service as such, session cookies might help to bypass tedious repetitions, particularly where interim results would otherwise be unexpectedly deleted. Also, in the case of online transactions, session cookies might prove useful to verify the identity of end-users, or, as Recital 21 ePrivacy Regulation illustrates, to remember the items they selected and placed in shopping baskets. All of the latter, being mere additions to the underlying service, are not necessary for the act of provision itself and are therefore, even though qualified as ‘essential’, to be distinguished from other cookies which directly enable the service (esp. cookies for transmission purposes, cf. under No. I.3.a)bb).

Authentication cookies: Cookies used to identify the user once he or she has logged in not only facilitate the use of online services, but sometimes also enable it. Without them, end-users would be required to provide a username and password every time they return to a service; particularly where passwords have been lost or time-consuming procedures are to be avoided entirely, authentication cookies thus provide an important form of access to online services. They are considered to be ‘essential’ in the meaning defined above. However, their range of use is restricted, since, again, exceptions under Art. 8 ePrivacy Regulation must be interpreted narrowly (cf. No. I.3.a)). Authentication cookies are not justified hereunder insofar as they serve other, secondary purposes, such as behavioural monitoring or advertising. In contrast to the traditional assessment, however, one distinction has to be dropped, as it seems outdated in view of today’s technical reality and user habits. Previously, only session cookies were considered to fall under the exception, whereas now persistent cookies shall be included in this understanding as well, because the earlier assumption that users returning to a website after a browser session believe they are anonymous to the service (while in fact they are still logged in to it) is no longer valid. Much rather, today’s users will assume they are able to use a certain service at the exact same stage and would even be irritated if they were not. This interpretation is supported by implication by the wording of Recital 21 ePrivacy Regulation. Session cookies within its Sentence 2 only relate to either the completion of online forms (cf. above) or authentication cookies used to verify the identity of end-users in online transactions. Any other form of authentication is therefore privileged and open to a different configuration.
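On a technical level, the difference between a session and a persistent authentication cookie comes down to the cookie’s lifetime attributes in the `Set-Cookie` header (a session cookie carries no `Max-Age` or `Expires` and is discarded when the browser closes; a persistent cookie carries one and survives the session). The sketch below builds such a header value using standard cookie attributes; the function name and defaults are illustrative assumptions:

```javascript
// Build a Set-Cookie header value for an authentication cookie.
// Without Max-Age the cookie is a session cookie; with it, a persistent one.
function buildAuthCookie(name, value, { persistent = false, maxAgeSeconds = 0 } = {}) {
  const parts = [
    `${name}=${value}`,
    'Path=/',
    'Secure',      // only send over HTTPS
    'HttpOnly',    // not readable by scripts, limiting theft via XSS
    'SameSite=Lax',
  ];
  if (persistent) {
    parts.push(`Max-Age=${maxAgeSeconds}`);
  }
  return parts.join('; ');
}
```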

User-centric security cookies: Cookies used to increase the security of the service requested by the user protect his or her privacy against intrusions by third parties. By detecting, for example, repeated failed login attempts or other infringements of the end-user’s privacy, they make an important contribution to the purposes of the Regulation. For this reason, user-centric security cookies are also considered to be ‘essential’. Like authentication cookies, they can be designed both as persistent and as session cookies.

Media player cookies: This kind of cookie is needed to play back video or audio content. They store technical data such as image quality, network link speed and buffering parameters. Also known as ‘flash cookies’, after the eponymous internet video technology, media player cookies evidently form an essential part of the use of internet-based services.

Load balancing session cookies: ‘Load balancing’ is a technique allowing processing requests to be distributed over a pool of web servers instead of a single machine. Within this technique, cookies serve to direct a user’s request to one of the available internal servers in the pool and to ensure that further requests end up at the same server. In this way, consistency of processing is maintained, making load balancing session cookies an essential element of communication over the network.
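A minimal sketch of this sticky-session mechanism: if the request carries a load-balancing cookie pointing at a known server, the request stays there; otherwise a server is assigned and pinned via a session cookie. The cookie name `LB_SESSION` and the round-robin assignment are illustrative assumptions:

```javascript
// Route a request to a backend server, keeping later requests on the same one.
function routeRequest(cookieServerId, servers, counter) {
  // If the load-balancing cookie already points at a live server, stay there.
  if (cookieServerId && servers.includes(cookieServerId)) {
    return { server: cookieServerId, setCookie: null };
  }
  // Otherwise pick a server (simple round-robin) and pin it via a session cookie.
  const server = servers[counter % servers.length];
  return { server, setCookie: `LB_SESSION=${server}; Path=/` };
}
```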

Customization cookies: Typical examples of customization cookies include language settings or display preferences. The distinctive characteristic here is that the user consciously interacts with the service, e.g. by ticking a box or choosing from a drop-down menu. Customization functionalities serve to facilitate the user experience and are therefore likewise considered ‘essential’.

Social network sharing cookies: While cookies are mostly thought of as an internet-related technique working imperceptibly in the background, there are in fact also some cookies that play an active role in the conscious handling of online services. In particular, social network sharing cookies are integrated into websites in order to allow users to share social media content with other users or to express their approval (e.g. via so-called ‘like buttons’). This form of interaction concerns users of particular social networks and is therefore, sensibly, limited to their respective usage. In order to safeguard a restrictive handling of social network sharing cookies, and especially to prevent networks from expanding their offers by way of surreptitious monitoring and marketing, these cookies may only apply to users who are logged in to the specific network at the time of the respective use.

To conclude: in any case where it is not possible to claim that cookies are necessary for the provision of a service, even if that service is funded by advertisements, their use is prohibited unless it can be based on another exception under Art. 8 Sec. 1 ePrivacy Regulation.[140] Such cookies include, in particular, so-called social plug-in tracking cookies, third-party advertising cookies and first-party analytics cookies.[141]

d) Audience measuring (Sec. 1 lit. d)

Art. 8 Sec. 1 lit. d ePrivacy Regulation allows for interference with the user’s terminal equipment if it is necessary for audience measuring, provided that such measurement is carried out by the provider of the service requested by the end-user, by a third party, or by third parties jointly on behalf of or jointly with the provider. Web audience measuring signifies the measuring of web traffic to a certain website and its individual pages.[142] Thus, it refers to the collection of information regarding a group of people, rather than an individual user.[143] The total number of visits to a page can be measured by counting page impressions, i.e. how often the specific page is requested from the web server.[144] Measuring the number of unique visitors is often implemented by placing a cookie onto the user’s device.[145]
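The two metrics just described can be sketched as follows: every request increments the page impression counter, while unique visitors are approximated by setting an identifier cookie on first contact. The cookie name `wa_uid` and the identifier scheme are hypothetical, and a real implementation would also have to respect the privacy limits discussed below:

```javascript
// Count one page impression per request; count a unique visitor whenever the
// request arrives without the (hypothetical) measurement cookie.
function handleVisit(stats, visitorCookieId) {
  stats.pageImpressions += 1;
  let setCookie = null;
  if (!visitorCookieId) {
    const id = Math.random().toString(36).slice(2); // illustrative id scheme
    setCookie = `wa_uid=${id}`;
    stats.uniqueVisitors += 1;
  }
  return { stats, setCookie };
}
```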

The interference has to be necessary for the audience measuring. In this context, only measures to which there are no meaningful and appropriate alternatives are considered necessary (cf. No. I.3.a). Therefore, in most cases, the website operator will have to choose the most privacy-friendly of the available measures. Recital 21a ePrivacy Regulation specifies that, in the case of cookies and similar identifiers, necessity is not given if these are used to determine the nature of who is using the site. Such a procedure always requires the consent of the end-user.

Under the European Commission’s draft of the ePrivacy Regulation of 19 January 2017 it was unclear whether measures based on Art. 8 Sec. 1 lit. d ePrivacy Regulation would only be allowed to be carried out by the service provider itself, as the wording of the stipulation implied. Following such a strict interpretation, website operators would have needed to host and perform audience measuring themselves, without the possibility of outsourcing such analytics services to a third party. In order to avoid such an unwanted result, the European Parliament[146] and the Council of the European Union[147] both intended to add corresponding permissions to Art. 8 Sec. 1 lit. d ePrivacy Regulation. This was implemented in the latest version, so that audience measuring can now be carried out both by the provider and by a third party. According to an explicit reference, the processing only requires a data processing agreement pursuant to the conditions laid down in Arts. 26, 28 GDPR, to be concluded by the provider and the third party. Adobe Analytics, Google Analytics or INFOnline serve as examples.

e) Security of information society services or terminal equipment (Sec. 1 lit. da)

Art. 8 Sec. 1 lit. da ePrivacy Regulation allows interference with the integrity of a device, e.g. the placing of a cookie on an end-user’s device, if it is necessary to maintain or restore the security of information society services or terminal equipment of the end-user, to prevent fraud or prevent or detect technical faults for the duration necessary for that purpose.

aa) Information society service

According to Art. 4 Sec. 1 lit. d ePrivacy Regulation[148] in connection with Art. 1 lit. b Directive (EU) 2015/1535, an ‘information society service’ is any service[149] which is normally provided

  • for remuneration,
  • at a distance, meaning the service is provided without the parties being simultaneously present,
  • by electronic means, i.e. the service is sent initially and received at its destination by means of electronic equipment for the processing (including digital compression) and storage of data; it must be entirely transmitted, conveyed and received by wire, by radio, by optical or by other electromagnetic means, and
  • at the individual request of a recipient of services, meaning the service is provided through the transmission of data on individual request.

This definition is very broad and covers a variety of different services, such as social networks, search engines, online shops and other websites. While the term was previously part of Art. 8 Sec. 1 lit. c and d ePrivacy Regulation, it has been retained only within lit. da. It is not evident whether the legislator pursued a specific end by this differentiation: neither a teleological extension to all services in general nor a deliberate restriction to ‘information society services’ alone is discernible. In the absence of additional indications, however, a literal application seems preferable.

bb) Necessity for security purposes

Art. 8 Sec. 1 lit. da ePrivacy Regulation requires a ‘necessity’ for maintaining or restoring the security of an information society service or terminal equipment of the end-user, to prevent fraud or to prevent or detect technical faults. As already outlined in the context of Art. 8 Sec. 1 lit. a ePrivacy Regulation (cf. No. I.3.a) and within Art. 6 ePrivacy Regulation, the term ‘necessity’ is not defined; in this respect, reference is made to the statements above. Recital 21 ePrivacy Regulation confirms that the justification is supposed to be limited to situations with “no, or only very limited intrusion of privacy” and to measures that are “strictly necessary and proportionate”, likewise implying a rather narrow scope of this justification. The provision therefore has to be understood restrictively, allowing only measures without which the security of the information society service concerned could not be maintained or restored. Consequently, measures going beyond this purpose, especially those serving to extract additional value from such maintenance, are not encompassed by the exception. This concerns, for instance, any tracking or advertising; such measures will have to be based on consent instead.

f) Software updates (Sec. 1 lit. e)

Art. 8 Sec. 1 lit. e ePrivacy Regulation excludes necessary software updates from the prohibition of the use of processing and storage capabilities or the collection of information from end-users’ terminal equipment. As lit. e in connection with its corresponding Recital 21b ePrivacy Regulation specifies, software updates are only considered necessary if they serve security reasons. Other updates, for example those intended to add new features or improve a service’s performance, are not covered by this exception. Furthermore, the update must not in any way change the privacy settings chosen by the end-user. The end-user must be informed prior to the installation and be given the possibility to postpone or turn off the update in case it is unwanted.
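The cumulative conditions of lit. e can be illustrated as a short, purely hypothetical compliance check (the names and the `Update` structure are invented for illustration; this is a sketch of the legal test, not legal advice or a real API):

```python
from dataclasses import dataclass

@dataclass
class Update:
    security_related: bool          # (i)  necessary for security reasons
    changes_privacy_settings: bool  # (i)  must not alter the user's chosen settings
    user_informed: bool             # (ii) informed in advance of each installation
    user_can_postpone: bool         # (iii) postpone / turn off automatic installation

def may_install_without_consent(u: Update) -> bool:
    """Hypothetical gate mirroring the cumulative conditions of
    Art. 8 Sec. 1 lit. e ePrivacy Regulation: all must be satisfied."""
    return (u.security_related
            and not u.changes_privacy_settings
            and u.user_informed
            and u.user_can_postpone)

# A mere feature update fails condition (i) and needs another legal basis:
print(may_install_without_consent(Update(False, False, True, True)))  # False
```

The point of the sketch is that the conditions are cumulative: failing any single one, e.g. an update that silently resets privacy settings, removes the update from the scope of the exception.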

g) Locating terminal equipment in case of emergency communications (Sec. 1 lit. f)

Art. 8 Sec. 1 lit. f ePrivacy Regulation extends the justification regime of ‘security purposes’ already pursued in lit. da and lit. e. Accordingly, interference with device integrity is allowed if it is necessary to locate terminal equipment when an end-user makes an emergency communication, either to the single European emergency number ‘112’ or to a national emergency number (cf. Art. 13 Sec. 3 ePrivacy Regulation). This provision, like lit. e, was added by the Council’s proposal of 10 February 2021.

h) Further processing of collected information (Sec. 1 lit. g – lit. i)

Art. 8 Sec. 1 lit. g ePrivacy Regulation allows further processing of collected information for purposes other than those for which the information was initially acquired. The admissibility is subject to further restrictions under Art. 8 Sec. 1 lit. h and lit. i ePrivacy Regulation.

aa) Principle of purpose limitation, Sec. 1 lit. g

In accordance with the principle of purpose limitation, laid down in Rec. 20aa and Art. 8 Sec. 1 lit. g ePrivacy Regulation in connection with Art. 5 Sec. 1 lit. b and Art. 6 Sec. 4 GDPR, processing of information for further purposes is admissible if these are compatible with the initial ones. The principle of purpose limitation represents an important component of privacy-related regulation.[150] It takes into account that data, once collected and stored, can be used in various ways and thereby repeatedly interfere with the rights and interests of the respective data subjects. The limitation of the purposes for which data is collected thus restricts arbitrary handling of data: one or several purposes must already be determined when the collection or processing takes place.[151] Yet, the principle is not to be interpreted so strictly that it completely excludes processing for purposes other than the initial ones.[152] Where the law provides for exemption clauses, these can justify further processing of data already in the possession of the controller.[153] Conversely, such processing is prohibited only if the respective purposes are “incompatible” with the initial collection purpose.[154] Technically speaking, the principle of purpose limitation must therefore rather be read as a principle of ‘purpose compatibility’.[155]

Legal scholars discussed the legal nature of provisions on further processing in connection with Art. 6 Sec. 4 GDPR.[156] Some argued that a separate justification pursuant to Art. 6 Sec. 1 GDPR was needed for every case of further processing, qualifying Art. 6 Sec. 4 GDPR as a mere additional requirement.[157] This interpretation was derived from the term “compatible”, which corresponds to Art. 5 Sec. 1 lit. b GDPR and was therefore considered a rule of interpretation without the character of an individual justification. In addition, being apprehensive of possible dysfunctional effects, some voices sought to narrow down the scope of application of Art. 6 Sec. 4 GDPR in an effort to prevent excessive use of an additional legal basis. In their view, such use undermined the ‘limiting function’ of the purpose compatibility assessment.[158]

It is correct, however, that already under the GDPR, Art. 6 Sec. 4 represented an individual legal basis.[159] That, first and foremost, followed from its Recital 50, which explicitly clarified that “no legal basis separate from that which allowed the collection of the personal data [was] required”. In light of the systematic position, practical orientation and explicit wording of Art. 6 Sec. 4 GDPR, contrary opinions were difficult to substantiate and, as a result, not compelling. Not only did Art. 5 Sec. 1 lit. b GDPR state that further processing in a compatible manner would be admissible;[160] such further processing was also specified within Art. 6 GDPR, which sets out the grounds for “lawful processing”. Finally, qualifying Art. 6 Sec. 4 GDPR as a mere catalogue of additional criteria to Sec. 1 would have thwarted its general aim of privileging further processing and, as a result, would have imposed additional restrictions on it. These remarks apply accordingly to Art. 8 Sec. 1 lit. g ePrivacy Regulation: it must be considered an independent legal basis for further processing pursuant to the principle of purpose limitation and, subsequently, the sole yardstick for justification.

Art. 8 Sec. 1 lit. g ePrivacy Regulation gives shape to the principle of purpose limitation by enumerating a non-exhaustive list of criteria which must be taken into account when assessing the compatibility of purposes in further processing. It adopts the catalogue already proposed by the Art. 29 WP in 2013[161] and later implemented in Art. 6 Sec. 4 GDPR. The legal practitioner is thus required to appraise his or her situation in each individual case.

First, quite obviously, the link between the purposes must be considered. A specific connection does not follow from mere textual correspondence alone.[162] In fact, in some cases a textual similarity might mislead the assessment and produce counter-intentional results (e.g. ‘software updates’ and ‘software evaluations’). The focus should rather lie on the substance of the relationship between the matters and their actual connectability. For example, an adequate link can exist if further processing represents a logical follow-up to the initial collection or was already implied in the initial purposes.[163] Conversely, compatibility proves more problematic the greater the distance between the purposes turns out to be.

Secondly, the context of processing must be taken into account. This concerns the reasonable expectations of the person concerned about what his or her information is going to be used for. Expectations will in large part be influenced by the relationship between end-users and collectors. Power imbalances, legal obligations or specific contractual provisions will usually restrict the range of compatible purposes. For example, the legal obligation of telecommunications companies to store certain data for subscribers and law enforcement services will not justify further processing for commercial purposes. Common customs will also play a part, even though they do not justify further processing per se. Rather, customs will have to be considered in the light of the legislative intention, which is to provide effective and comprehensive protection of end-users’ privacy.[164] For example, the mere fact that targeted advertising represents a common practice does not automatically justify it in the course of a specifically requested online-shopping service (cf. Art. 8 Sec. 1 lit. c ePrivacy Regulation). Here, the general rule applies that the more specific the context of collection and the more surprising a further use appears, the more incompatible further purposes will be.

Furthermore, the nature and modalities of the intended further processing represent important aspects. Art. 8 Sec. 1 lit. g No. iii ePrivacy Regulation particularly mentions the significance of possible revelations of data categories pursuant to Art. 9 or 10 GDPR. These include sensitive data, e.g. data about ethnicity, political opinions, religious confessions or criminal convictions. Even where the collection of such information is not the focus, further processing might combine different data points into a bigger picture. For example, location data collected to connect an end-user’s telephone to the network might, when further processed together with other location data, reveal affiliation with a certain migrant group. In general, one may therefore conclude that sensitive data narrows the scope of compatible use cases.

When assessing compatibility, the impact of further processing, both positive and negative, must be considered. Possible consequences include future decisions or actions by third parties or discriminatory effects against individuals. For example, IoT services collecting data or using the storage of terminal equipment in order to undertake a software update are not allowed to use the same data to deploy additional applications, e.g. where these collide with software or hardware preferences or interfere with the overall functionality of the device. For instance, it will not be compatible to scan for unauthorised repair works in order to revoke a software update or to install colliding applications. Particularly where equivalent alternative methods to achieve the pursued objective are available and impose less of a (negative) impact, compatibility will be in doubt.

Finally, the existence of appropriate safeguards needs to be considered. According to the Art. 29 WP, deficiencies in certain respects may in some cases be compensated by better performance in others. Thus, technical measures in the course of further processing represent a decisive factor in the overall assessment. Art. 8 Sec. 1 lit. g No. v ePrivacy Regulation mentions encryption and pseudonymisation as examples. Pseudonymisation is defined in Art. 4 Sec. 1 lit. a ePrivacy Regulation and Art. 4 No. 5 GDPR as the processing of personal data in such a manner that it can no longer be attributed to a specific data subject without the use of additional information. It must be distinguished from encryption, which describes the more general encipherment of all kinds of data in such a way that the data can later be accessed only with a specific key. Additionally, one might consider the transparency of further processing, as well as the ability of the end-user concerned to intervene, as significant.
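The conceptual difference between the two safeguards can be sketched in a few lines of Python (a minimal illustration with hypothetical names; real deployments would use vetted cryptographic libraries rather than the toy XOR cipher shown here):

```python
import hmac
import hashlib

# The 'additional information' of Art. 4 No. 5 GDPR, held separately:
SECRET_KEY = b"hypothetical-key-held-separately"

def pseudonymise(identifier: str) -> str:
    """One-way keyed hash: without the key (and a lookup table), the
    pseudonym can no longer be attributed to the data subject."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption' for illustration only: unlike pseudonymisation,
    the original data is fully recoverable by anyone holding the key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

pseudonym = pseudonymise("user@example.com")   # stable, non-reversible token
ciphertext = encrypt(b"user@example.com", SECRET_KEY)
plaintext = encrypt(ciphertext, SECRET_KEY)    # XOR twice restores the data
assert plaintext == b"user@example.com"
```

The sketch shows why the two measures weigh differently in the compatibility assessment: encryption preserves the full reference to the person for every key holder, whereas pseudonymisation removes that reference unless the separately held additional information is used.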

All of the above applies only if the end-user has not already provided specific consent pursuant to the requirements of Art. 4a and Art. 8 Sec. 1 lit. b ePrivacy Regulation. Consent must be given on the grounds of an informed decision.[165] It therefore requires clarification of the additional subject matter of the consent, i.e. the fact that processing for further purposes will take place and which purposes will or might be involved.[166] However, further purposes do not have to be listed exhaustively. Rather, the end-user can consent to interferences with terminal equipment for one or for multiple purposes.[167] The purposes must merely be defined as precisely as possible at the time of data collection.[168]

Such consent or a compatible purpose according to Art. 8 Sec. 1 lit. g ePrivacy Regulation will not be given in every case. This, however, does not render further processing inadmissible per se. Rather, the respective controller should consider obtaining consent subsequently or re-collecting the data under the additional, cumulative purposes.[169]

bb) Further restrictions, Sec. 1 lit. h and lit. i

Art. 8 Sec. 1 lit. h and lit. i ePrivacy Regulation set out further restrictions in regard to processing under additional purposes pursuant to lit. g. Such processing, if considered compatible, may subsequently take place only provided that the information is afterwards erased or anonymised (i), processing is limited to already pseudonymised information (ii) and the information is not used to determine the nature or characteristics of an end-user or to build a profile of them (iii). The provision corresponds to the principle of storage limitation under Art. 5 Sec. 1 lit. e GDPR. This principle imposes a temporal limit on the processing of personal data from the point at which the data is no longer necessary for the initially designated purposes.[170] It thus complements the principle of purpose limitation in a time-related context. Specific means of limitation include deletion or anonymisation of collected data.[171] Yet, Art. 8 Sec. 1 lit. h No. ii and No. iii ePrivacy Regulation go beyond that range: any reference of collected information to the end-user is prohibited. While the GDPR, within its direct field of application (personal data), permits such reference, the ePrivacy Regulation does not. One might therefore consider this provision a contradiction of values, since it imposes stricter requirements despite the ePrivacy Regulation’s different regulatory responsibility (cf. its Art. 1 Sec. 1 and Art. 2 Sec. 1 lit. a and lit. b). With regard to personal data, one might expect stricter limitations within the more specific regulatory matter of the GDPR, whereas broader (and to that extent possibly less intrusive) stipulations should prevail within the ePrivacy Regulation.
Yet, particularly because of its different scope of application, general assertions of that sort are misleading: the ePrivacy Regulation both expands and limits its regulatory scope with respect to the processing of data and the context in which such processing takes place. Thus, further differentiations and restrictions within its regulatory matter are not contradictory but rather indicated.[172]

Finally, for the purposes of further processing, data shall not be shared with any third parties unless they qualify as processors within the meaning of Art. 28 GDPR. Privacy, referring to the right to decide who has access to one’s personal sphere, is understood particularly literally at this point. In the notion of the legislator, access by one person does not imply access by others. Third-party access therefore requires further justification under Art. 8 Sec. 1 ePrivacy Regulation.

[61] Steinrötter, in: Specht/Mantz, Handbuch Europäisches und deutsches Datenschutzrecht (2019), § 5 Rec. 22.

[62] See 3.b)

[63] Steinrötter, ibid., Rec. 32.

[64] Council of the EU, 15333/17, p. 3.

[65] See Art. 29 WP, WP 194 (2012), p. 8.

[66] For details see Voigt/von dem Bussche, GDPR – A Practical Guide (2017), pp. 143-147.

[67] European Commission, Proposal for a Regulation of the European Parliament and of the Council, 10 January 2017, COM(2017) 10 final.

[68] See also Buchner/Petri, in: Kühling/Buchner, DS-GVO BDSG, Art. 6 (2018), rec. 45.

[69] Herberger, in: Schriften zum Internationalen Recht, Band 217 (2017), p. 30; critically: Rosenkranz, in: Juristische Ausbildung 2016, pp. 783-788.

[70] Cf. Steinrötter, in: Specht/Mantz, Handbuch Europäisches und deutsches Datenschutzrecht, § 5 Rec. 35; Engeler/Felber, ZD 2017, 251 (255).

[71] Steinrötter, ibid.

[72] Cf. Rec. 20a, Arts. 4a Sec. 2; 14 Sec. 2 lit. a ePrivacy Regulation.

[73] Cf. the definition in Art. 29 WP, WP 194 (2012), p. 8.

[74] Art. 29 WP, WP 224 (2014), p. 10, 11.

[75] Instructive: European Commission, Internet of Things – An action plan for Europe, COM(2019) 278 final, pp. 2 ff.

[76] See Engeler/Felber, ZD 2017, 251, 255.

[77] Art. 29 WP, WP 194 (2012), p. 3, 6, 8.

[78] Piltz, DB 2018, 749 (750).

[79] See Rec. 32 GDPR.

[80] Cf. No. I.1.b)cc).

[81] Cf. No. I.1.b)cc).

[82] Cf. Rec. 20aaa ePrivacy Regulation.

[83] Recital 20aaa ePrivacy Regulation; with regard to the corresponding provision in Art. 7 Sec. 1 GDPR, cf. Buchner/Kühling, in: Kühling/Buchner, DS-GVO BDSG, Art. 7 DSGVO (2018), Rec. 22.

[84] Transnationally acknowledged: http://translex.uni-koeln.de/966000/_/distribution-of-burden-of-proof/.

[85] Prütting, in: Münchener Kommentar zur ZPO, § 286 (2020), Rn. 121.

[86] Recital 20aaa ePrivacy Regulation clarifies the admissibility of such procedure.

[87] As the Art. 29 WP points out, typical examples are found with regard to the cooperation between different parties for the purpose of behavioural online-targeting, Art. 29 WP, Opinion 2/2010 on online behavioural advertising, WP 171, p. 12.

[88] Hartung, in: Kühling/Buchner, DS-GVO BDSG, Art. 4 DS-GVO (2020), rec. 12 with further reference to Art. 29 WP, Opinion 1/2010 on the concepts of “controller” and “processor”, WP 169, p. 22.

[89] Better known as ‘Facebook fanpage-decision’, CJEU, judgement of 5 June 2018, Case C-210/16-Wirtschaftsakademie Schleswig-Holstein.

[90] The additional requirement of access to statistics pursuant to self-defined parameters regarding data processed by the third party of the website operator was later dropped in favour of a more expansive interpretation. According to this, any exertion of influence on the processing in the course of a corresponding personal interest of the operator suffices to assume joint controllership. Cf. CJEU, judgement of 5 June 2018, Case C-210/16 – Wirtschaftsakademie Schleswig-Holstein, rec. 35 f. and later CJEU, judgement of 10 July 2018, Case C-25/17 – Zeugen Jehovas, rec. 68, plus CJEU, judgement of 29 July 2019, Case C-40/17 – Fashion ID, rec. 68 f.

[91] CJEU, judgement of 5 June 2018, Case C-210/16-Wirtschaftsakademie Schleswig-Holstein, rec. 38; cf. also CJEU, judgement of 10 July 2018, Case C-25/17 – Zeugen Jehovas, rec. 69.

[92] Cf. Hartung, in: Kühling/Buchner, DS-GVO BDSG, Art. 26 DS-GVO (2020), rec. 32; CJEU, judgement of 5 June 2018, Case C-210/16-Wirtschaftsakademie Schleswig-Holstein, rec. 43.

[93] Hartung, ibid; CJEU, judgement of 5 June 2018, Case C-210/16-Wirtschaftsakademie Schleswig-Holstein, rec. 43.

[94] For more detailed comments see Art. 4a No. I.2.c)bb).

[95] Cf. Rec. 20a ePrivacy Regulation.

[96] Indeed, an opening clause corresponding to Art. 6 Sec. 1 lit. f GDPR was discussed prior to the votings on the initial proposal (cf. draft proposal of 14 July 2017, 2017/0003(COD), Amendments 332-705, Amendment 525, 526, p. 88). This was due to fears about a distortion of competition in the course of a better market-position for consent-obtainment of leading advertisers such as Google and Facebook. Also, legislators worried about possible job losses in the media sector and financing concepts of free business models (cf. Herbrich, jurisPR-ITR 23/2017, Anmerkung 2 with further references).

[97] Cf. Datenschutzkonferenz (DSK), Orientierungshilfe der Aufsichtsbehörden für Anbieter von Telemedien, March 2019, p. 10.

[98] See Klement, in: Simitis/Hornung/Spiecker gen. Döhmann, Datenschutzrecht, Art. 7 (2019), rec. 35.

[99] CJEU, judgement of 1 October 2019, C-673/17 – Planet49, Recs. 44-65.

[100] EDPB, Guidelines 05/2020 on consent under Regulation 2016/679 – 4 May 2020, rec. 40, 41.

[101] Datenschutzkonferenz, Orientierungshilfe der Aufsichtsbehörden für Anbieter von Telemedien, March 2019, p. 8.

[102] See German Federal Court of Justice (Bundesgerichtshof), press release no. 67/2020 from 28 May 2020 relating to judgement in case I ZR 7/16.

[103] However, there is no obligation to obtain consent (at all) for so-called ‘functional’ or ‘essential’ cookies; for the legal and technical differentiation between ‘functional’ (or: ‘essential’) cookies and other types of cookies see below, No. I.3.c).

[104] Cf. CJEU, Press Release No. 125/19 from 1st October 2019, regarding C-673/17-Planet49; for more details see also further below.

[105] The German Federal Court of Justice recently referred the question to the ECJ for a preliminary ruling, whether the GDPR also provides for an abstract right of judicial action of consumer protection organisations against responsible companies, or rather whether this can be regulated nationally by Member States due to Art. 80 Sec. 2 GDPR, see German Federal Court of Justice (Bundesgerichtshof), press release no. 66/2020 from 28 May 2020 relating to judgement in case I ZR 186/17 (CJEU, C-319/20). According to the legal situation under the Data Protection Directive, such a right of judicial action existed, whereas the wording of the GDPR provides for this only in the event that the relevant organisations refer to a concrete infringement of a specific data subject’s rights, and not only to an abstract risk.

[106] EDPB, Guidelines 05/2020 on consent under Regulation 2016/679, Version 1.1, 4 May 2020, p. 16 Recs. 62 et seqq.; and already Art. 29 WP, WP 259 rev.01 (2018), p. 18.

[107] Cf. Statement of the EDPB on the revision of the ePrivacy Regulation and its impact on the protection of individuals with regard to the privacy and confidentiality of their communications, at p. 2 (2019).

[108] Art. 29 WP, WP 259 rev.01 (2018), p. 15.

[109] Cf. for an overview Baumann/Alexiou, ZD 2021, 349.

[110] EDPB, Guidelines 05/2020 on consent under Regulation 2016/679, rec. 39 – 41 and EDPB, Statement 03/2021 on the ePrivacy Regulation, p. 3; German Bundesbeauftragter für den Datenschutz und die Informationsfreiheit (BfDI), press statement 10/2020; Dutch Autoriteit Persoonsgegevens (AP), press statement of 7 March 2019; Greek Helenic Data Protection Agency (HDPA), guidelines of February 2020; French Commision Nationale de lÍnformatique et des Libertés (CNIL), Deliberation No 2019-093 of 4 July 2019.

[111] German Konferenz der unabhängigen Datenschutzaufsichtsbehörden des Bundes und der Länder (DSK), Orientierungshilfe der Aufsichtsbehörden für Anbieter von Telemedien, March 2019, p. 10.

[112] EDPB, Guidelines 05/2020 on consent under Regulation 2016/679, at rec. 37, 38.

[113] Austrian Datenschutzbehörde (DSB), Bescheid vom 25. Mai 2018.

[114] Cf. Rank-Haedler, Daten als Leistungsgegenstand: Vertragliche Typisierung, in: Specht-Riemenschneider/Werry/Werry, Datenrecht in der Digitalisierung, p. 489 ff.; Directive (EU) 2019/770 of the European Parliament and of the Council of 20 May 2019 on certain aspects concerning contracts for the supply of digital content and digital services.

[115] Amendment to Art. 8 Sec. 1a, Report of the European Parliament, of 20 October 2017, A8-0324/2017.

[116] The EDPB criticized this decision, demanding the reintroduction of a respective provision, prohibiting such “take it or leave it”-techniques, EDPB, Statement 03/2021 on the ePrivacy Regulation of 9 March 2021.

[117] Cf. also Baumann/Alexiou, ZD 2021, 349 (353).

[118] Note that to date practice tends to even offer a choice between cookie-involving and free of charge services in the first place. Yet, this will most probably change in the course of time, the more often visitors to a website will click on a respective “decline” button.

[119] Cf. e.g. Datenschutzkonferenz (DSK), Orientierungshilfe der Aufsichtsbehörden für Anbieter von Telemedien, March 2019, p. 9 ff.

[120] Ibid., p. 11.

[121] EDPB, Guidelines 05/2020 on consent under Regulation 2016/679, Version 1.1, 4 May 2020, p. 16 Rec. 69 with reference to Art. 29 WP, Guidelines on transparency under Regulation 2016/679 of 29 November 2017, WP 260 rev.01, p. 19 Rec. 35 et seqq, suggesting a “layered approach” to ensure transparency.

[122] Datenschutzkonferenz (DSK), Orientierungshilfe der Aufsichtsbehörden für Anbieter von Telemedien, March 2019, p. 14 f.; EDPB, Guidelines 05/2020 on consent under Regulation 2016/679, Version 1.1, 4 May 2020, p. 12 Rec. 42 et seqq.

[123] It must be noted, however, that the GDPR pursues a differentiated approach in this regard. According to its Rec. 43, in principle, separate consent for each processing operation is required, however, only as far as this is appropriate in the individual case. Taking into account simplicity and feasibility, this will, hence only regard simple and confined processing procedures. More complex ones, such as cookies, consequently fall outside that requirement.

[124] EDPB, Guidelines 05/2020 on consent under Regulation 2016/679, Version 1.1, 4 May 2020, p. 12 Recs. 42 et seqq.; and already Art. 29 WP, WP 259 rev.01 (2018), p. 18.

[125] Cf. Rec. 20aa ePrivacy Regulation.

[126] For more details, see No. I.3.h).

[127] This, for example, can be done by placing a cookie with an expiry date in the past.

[128] CJEU, judgement from 1 October 2019, C-673/17 – Planet49.

[129] By that, the CJEU contradicted its Advocate General, having previously supported the opinion that a visual separation for each different declaration is required (Opinion of the Advocate General, 21 March 2019, Planet 49, C-673/17, rec. 87-93).

[130] DSK, Orientierungshilfe der Aufsichtsbehörden für Anbieter von Telemedien, March 2019, p. 9.

[131] For details see No. I.3.b)cc).

[132] For technical details on the term see No. I.2.b)bb) and No. I.2.c)cc)(2).

[133] In view of the initial Commission´s proposal of 10 January 2017, this has not always been the case. At first, the word “strictly” had been omitted. This lead to critique by the Art. 29 WP, pointing out already existing difficulties of interpretation on the provision within the ePrivacy Directive (Art. 29 WP, Opinion 01/2017 on the Proposed Regulation for the EPrivacy Regulation (2002/58/EC), wp247, p. 19). The Counsel´s proposal of 10 February 2021 later included the term again.

[134] Cf. Art. 4 No. I.2.

[135] Annex I of directive (EU) 2015/1535 presents a list of such services that are not considered information society services, while at least partly falling into the presented definition.

[136] Art. 29 WP, WP 194 (2012), p. 3.

[137] See also Buchner/Petri, in: Kühling/Buchner, DS-GVO BDSG, Art. 6 (2018), rec. 45.

[138] Schleipfer, in: ZD 2017, 460, p. 464.

[139] Art. 29 WP, WP 194 (2012), p. 8.

[140] A respective clarification is proposed by the Council of the European Union, ST 11001/19, p. 27. For an assessment under the GDPR see Buchner/Petri, in: Kühling/Buchner, DS-GVO BDSG, Art. 6 (2018), rec. 41.

[141] For definitions, see Art. 29 WP, WP 194, p. 9.

[142] Schleipfer, ZD 2017, 460 (464).

[143] Ibid.

[144] For details see ‘Web traffic’, in: Wikipedia, retrieved 18 January 2022, from: https://en.wikipedia.org/wiki/Web_traffic.

[145] Rec. 21a ePrivacy Regulation.

[146] European Parliament, LIBE report A8-0324/2017, 20 October 2017, amendment 89.

[147] Council of the European Union, ST13256/18, 19.10.2018, p. 57.

[148] Cf. the same reference in Art. 4 No. 25 GDPR.

[149] Annex I of directive (EU) 2015/1535 presents a list of services that are not considered information society services, while at least partly falling into the presented definition.

[150] Albers, Informationelle Selbstbestimmung, pp. 498 ff.; also cf. Art. 8 Sec. 2 S. 1 CFR; a brief overview on the history of the principle of purpose limitation provides Art. 29 WP, Opinion 03/2013 on purpose limitation, WP 203, pp. 6-10.

[151] Art. 29 WP, ibid. pp. 4, 21 ; Cf. also Art. 8 Sec. 2 S. 1 CFR.

[152] Herbst, in: Kühling/Buchner, DS-GVO BDSG, Art. 5 DSGVO (2020), Rec. 24.

[153] Herbst extensively deals with the question, whether an additional legal basis for further processing is needed or compatible purposes are already implied within the initial provision (ibid., Rec. 48). Contrary to his position, the explicit wording of Rec. 50 GDPR denies such further requirement. Yet, here, the dispute does not have to be decided, since Art. 8 Sec. 1 lit. g ePrivacy Regulation anyways provides for an additional legal basis by means of a general clause.

[154] Herbst, ibid.; cf. also German Federal Constitutional Court (GFCC), judgement of 20 April 2016, 1 BvR 966/09.

[155] Herbst, ibid., rec. 24, 27.

[156] For more details, see Art. 6c No. I.

[157] Cf. Buchner/Petri, in: Kühling/Buchner, DS-GVO/BDSG (2020), Art. 6 para. 181 et seqq.; Albers/Veit, in: Wolff/Brink, BeckOK-Datenschutzrecht (2020), Art. 6 DSGVO para. 69.

[158] Herbst, in: Kühling/Buchner, DS-GVO/BDSG (2020), Art. 5 Rec. 48.

[159] For more details, see Art. 6c No. I.

[160] This follows from a reverse to Art. 5 Sec. 1 lit. b GDPR´s wording, data should not be “further processed in a manner that is incompatible with [the initial] purposes”.

[161] Art. 29 WP, ibid., pp. 23 ff.; cf. the herein provided explanations also for the following comments.

[162] Art. 29 WP, ibid.

[163] Herbst, ibid., rec. 24, 27.

[164] Cf. Rec. 6 and Art. 1 Sec. 1 ePrivacy Regulation.

[165] Cf. Art. 4a ePrivacy Regulation Rec. 42.

[166] Cf. also Rec. 20aaa S. 3 ePrivacy Regulation.

[167] Cf. Art. 4a Sec. 1, as well as Recs. 20aaa and 20a ePrivacy Regulation in connection with Art. 6 Sec. 1 lit. a GDPR.

[168] Cf. Art. 29 WP, ibid., p. 16; Buchner/Petri, in: Kühling/Buchner, DS-GVO BDSG, Art. 6 DSGVO (2020), Rec. 179.

[169] Herbst, in: Kühling/Buchner, DS-GVO BDSG, Art. 5 DSGVO (2020), Rec. 47.

[170] Roßnagel, in: Simitis/Hornung/Spiecker gen. Döhmann, NK-Datenschutz, Art. 5 DSGVO (2019), Rec. 150.

[171] Roßnagel, ibid., Rec. 155.

[172] For details see No. I.5.

The provision in Art. 8 Sec. 1 ePrivacy Regulation is subject to the opening clause in Art. 11 Sec. 1 ePrivacy Regulation, which allows for additional legislation by the European Union and the EU Member States. Such legislation may restrict the rights and obligations set forth in Arts. 5 to 8 ePrivacy Regulation in the public interest.[173] As of now, it is not clear to what extent the national legislators in the EU Member States will make use of this opening clause.

[173] For details see the comments on Art. 11 ePrivacy Regulation.

According to Art. 1 Sec. 3 ePrivacy Regulation, “the provisions of this Regulation particularise and complement [the GDPR] by laying down specific rules for the purposes mentioned in paragraphs 1 to 2”. Thus, whenever personal data is involved, the ePrivacy Regulation represents lex specialis to the GDPR.[174] While the GDPR regulates the protection of personal data, the ePR protects the respect for private life and communications.[175] Pursuant to the explicit wording of Rec. 2a ePrivacy Regulation, its provisions particularise the GDPR by translating its principles into specific rules. Conversely, where no such rules are established, the GDPR applies to its full extent.[176] The specific character of the ePrivacy Regulation becomes particularly apparent in Art. 8: besides its direct regulatory content (the protection of privacy with regard to terminal equipment), it provides for various parallels to the GDPR, yet concretises these for a communication- and technology-related environment. For instance, the range of exceptions under Art. 8 Sec. 1 ePrivacy Regulation, though inspired by those of Art. 6 GDPR, addresses specific communication-related matters, such as audience measuring (lit. d) or software updates (lit. e). Moreover, at many points the provision refers to the GDPR explicitly, such as in lit. d (Arts. 26, 28 GDPR), lit. g No. iii (Arts. 9, 10 GDPR), lit. i (Art. 28 GDPR) and both Sec. 2a and 2b (Arts. 13 and 32 GDPR).

Even though this relation is clearly regulated in principle, it becomes a problem whenever the ePrivacy Regulation makes specific statements without including all regulatory matters in the provision.[177] This was the case, for example, with the initial proposal, which applied the ‘catch-all’ character of the GDPR only to personal data in electronic communications, but not to information from terminal equipment.[178] E contrario, one had to conclude that personal data on terminal equipment, such as IP addresses, would have been excluded from the complementary provisions of the GDPR.[179] Eventually, this was remedied in Recital 2a ePrivacy Regulation, so that now, by confining its wording to the core stipulation, the rule-exception relationship between GDPR and ePR is restored for all subject matters.

Contradictions in valuation also become apparent between GDPR and ePR in that the ePR applies to both personal and non-personal data, yet at some points provides for stricter rules than those of the GDPR. This is the case in Art. 8 Sec. 1 lit. g – h ePrivacy Regulation, where processing for further purposes is subject to a variety of restrictions going well beyond those provided for in the GDPR. Not only do the principles of purpose limitation and storage limitation apply here (as they do under Art. 5 lit. b and lit. e GDPR), but it is also forbidden to use non-pseudonymised data, to process such data to determine the nature of a user, or to share it with a third party.[180] Any reference to the affected person must therefore vanish under Art. 8 Sec. 1 ePrivacy Regulation. It seems contradictory, however, that the GDPR, regulating a broader range of possible cases, allows further processing of personal data, while the ePrivacy Regulation, regulating a narrower range of applications, does not. It is even more questionable why the ePrivacy Regulation sets out additional restrictions for this data, even where it is pseudonymised, instead of loosening those already existing under the GDPR. Rather, it would seem advisable to distinguish between both kinds of data and to take into account that the risk of infringement of personal rights is lower where no personal data is involved. In the literature, this distinction is made particularly with regard to the creation of user profiles.[181] On this view, non-pseudonymised user profiles represent a deeper infringement of user rights than pseudonymised ones. By differentiating between different qualities of infringement, providers might perceive regulation as an incentive to concentrate on the latter.[182]

However, this perception is misguided with regard to the specific protective purpose of the ePrivacy Regulation. It protects not only personal data, but in particular the goods of private life and communications.[183] Indeed, if the ePrivacy Regulation regarded only the interests of data autonomy, a gradation between certain qualities of impact on personal data in respect of the interference with terminal equipment would have made sense and would probably even be indicated. Yet, it is the additional and – by its systematic position within Art. 1 Sec. 1 ePrivacy Regulation – primary protected good of privacy in life and communications that rules out such a graduated approach. Infringement of privacy takes place regardless of whether personal data is involved or not. Affected end-users need to be in a position to decide effectively and autonomously if and to what extent third parties interfere with their privacy. Privacy, moreover, includes not only the freedom of personal development or interpersonal relations, but in particular all devices used in connection with them.[184] Thus, it is justified not to loosen safeguards but to apply additional ones whenever privacy-related subject matters are involved, and not to make such protection dependent merely on the question whether personal data is infringed. Here, the legislator chose to attach an additional range of protected goods (private life and communications) to an additional range of restrictions (Art. 8 lit. h and lit. i ePrivacy Regulation). Not least in view of its legislative assessment prerogative, this decision appears comprehensible.

[174] Steinrötter, in: Specht/Mantz, Handbuch Europäisches und deutsches Datenschutzrecht, p. 132, Rec. 7.

[175] Rec. 2a ePrivacy Regulation.

[176] For more details to the interplay in general, cf. Art. 1 No. I.2.

[177] Cf. Steinrötter, ibid., Rec. 8.

[178] Cf. Rec. 5 S. 1 of the Proposal from 10 January 2017, COM(2017) 10 final; Herbrich, jurisPR-ITR 18/2017 Anm. 2, Sec. C.I.

[179] Herbrich, ibid.

[180] Art. 8 Sec. 1 lit. h ePrivacy Regulation.

[181] Schleipfer, ZD 2017, 460, 465.

[182] Schleipfer, ibid.

[183] Rec. 2a, Art. 1 Sec. 1 and 1a ePrivacy Regulation.

[184] Cf. Rec. 20 ePrivacy Regulation.

Art. 8 Sec. 2 ePrivacy Regulation restricts third-party collection of information that is emitted by terminal equipment while trying to connect to networks or other devices. Such collection is prohibited in general, unless it can be based on one of the legal permissions in Art. 8 Sec. 2 ePrivacy Regulation. This is the first time that so-called ‘offline tracking’ is addressed, which was not legally regulated in the past, but which, according to Recital 25 ePrivacy Regulation, may well entail privacy risks for end-users, e.g. when their location is tracked over a longer period of time or repeated visits to specific locations are documented.

As a general rule, Art. 8 Sec. 2 ePrivacy Regulation prohibits the collection of information that is emitted by terminal equipment of end-users[185] while connecting (or trying to connect) to another device or to network equipment. Since Art. 8 Sec. 1 ePrivacy Regulation regulates the “collection of information from end-users’ terminal equipment”, the scopes of application of Art. 8 Sec. 1 and Sec. 2 ePrivacy Regulation overlap. Thus, the two provisions have to be differentiated from each other.[186] In summary, it is preferable to interpret Art. 8 Sec. 2 ePrivacy Regulation as applying only to information emitted in the initial connection to a network. Examples include MAC or Bluetooth addresses that identify a network interface, if transmitted during such connection attempts. In contrast, Art. 8 Sec. 1 ePrivacy Regulation should be understood to apply to all other data, i.e. data that is either still contained in the device and otherwise obtained by a third party, or information sent or requested as part of a networking communication after a physical connection has already been established. Examples of conduct prohibited by Art. 8 Sec. 2 ePrivacy Regulation consequently include the capturing of data from unencrypted wireless networks or routers, the deployment of IMSI catchers to capture connection data from mobile phones[187], or the tracking of end-users’ devices over time by capturing unique identifiers (such as MAC addresses, IMEI or IMSI).[188]
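Why the capture of such identifiers enables the prohibited tracking can be illustrated with a minimal sketch (the sightings, locations and MAC addresses below are hypothetical): because a hardware identifier is stable and unique, merely grouping passively captured emissions by it reconstructs an individual movement profile.

```python
from collections import defaultdict

# Hypothetical probe-request sightings: (MAC address, location, time).
sightings = [
    ("AA:BB:CC:DD:EE:01", "store_entrance", "09:00"),
    ("AA:BB:CC:DD:EE:02", "store_entrance", "09:01"),
    ("AA:BB:CC:DD:EE:01", "checkout", "09:15"),
    ("AA:BB:CC:DD:EE:01", "store_entrance", "17:30"),  # repeated visit
]

# Grouping by the stable identifier yields a per-device movement
# profile over time -- the conduct Art. 8 Sec. 2 prohibits by default.
profiles = defaultdict(list)
for mac, location, time in sightings:
    profiles[mac].append((location, time))

assert len(profiles["AA:BB:CC:DD:EE:01"]) == 3  # three linked sightings
```

The sketch shows that no content data is needed at all: the emitted identifier alone suffices to document repeated visits, which is why Recital 25 treats its capture as a privacy risk in its own right.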

It has to be noted that Art. 8 Sec. 2 ePrivacy Regulation is a general provision, which applies not only to providers of electronic communications networks and services, but to any person coming into contact with the emissions of end-users’ devices.

[185] See material scope, Art. 2 Sec. 1 lit. b ePrivacy Regulation.

[186] Cf. as described above in No. I.2.c)cc).

[187] Rec. 15 ePrivacy Regulation, second half.

[188] Rec. 25 ePrivacy Regulation.

Art. 8 Sec. 2 lits. a to d ePrivacy Regulation contain a set of permissions for the collection of information emitted by terminal equipment.

a)  Establishment or maintenance of a connection (Sec. 2 lit. a)

Art. 8 Sec. 2 lit. a ePrivacy Regulation permits the collection of information emitted by another device in order to establish a connection.[189] In this case, only information necessary for the purpose and duration of establishing a connection may be recorded. In order to find the correct counterpart, there may be situations where it is necessary to collect information not only from the device to which the connection shall be made, but also from other devices that are ready to be connected. For example, the connection to a public Wi-Fi network requires the activation of the device’s discovery mode in order to list available networks. The discovery process will record emissions of all wireless routers in reach (especially identifiers, such as the SSID and MAC address of the respective network adapter). After selecting the right network and connecting to it, the device will usually delete the collected information about other networks. Such a procedure is covered by Art. 8 Sec. 2 lit. a ePrivacy Regulation.
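The discovery procedure described above can be sketched as follows; the network names, addresses and the `connect` helper are hypothetical and serve only to illustrate the legally decisive step, namely the deletion of information collected about non-selected networks.

```python
# Hypothetical scan results: identifiers emitted by all routers in reach.
scan_results = [
    {"ssid": "CafeGuest", "bssid": "11:22:33:44:55:66", "signal_dbm": -48},
    {"ssid": "OfficeNet", "bssid": "AA:BB:CC:DD:EE:FF", "signal_dbm": -70},
    {"ssid": "Neighbour", "bssid": "01:23:45:67:89:AB", "signal_dbm": -82},
]

def connect(ssid: str, scan: list) -> dict:
    """Keep only the information needed for the selected connection and
    discard everything recorded about other networks (Sec. 2 lit. a)."""
    selected = next(n for n in scan if n["ssid"] == ssid)
    scan.clear()  # delete the collected information about other networks
    return selected

network = connect("CafeGuest", scan_results)
assert network["bssid"] == "11:22:33:44:55:66"
assert scan_results == []  # no residual data about non-selected networks
```

The point of the sketch is the purpose and time limitation: the broad initial collection is permissible only because everything beyond the chosen counterpart is discarded once the selection is made.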

Yet, whenever the information is indispensable for maintaining the connection, it may be kept for that purpose once the connection is established. After that, the information has to be deleted. Thus, the provision leaves no room for storing connection data in a permanent log or the like. An exception might be seen in information that is stored to speed up or simplify future connections, e.g. saving the names of Wi-Fi networks to which a connection has already been established. This, at least, would reflect common practice.

b) Consent (Sec. 2 lit. b)

As in Art. 8 Sec. 1 lit. b ePrivacy Regulation, the collection of information emitted by the device is also possible, if the end-user has given his or her consent. With regard to the prerequisites in this respect, reference is made to the explanations in the context of Arts. 4a and 8 Sec. 1 lit. b ePrivacy Regulation.

c) Statistical purposes (Sec. 2 lit. c)

Art. 8 Sec. 2 lit. c ePrivacy Regulation provides for an exception in cases where information is collected for statistical purposes. It must be considered within the context of a general privilege for statistical purposes in privacy-related matters, as already provided for in Art. 5 Sec. 1 lit. b and Art. 89 GDPR. This privilege is justified against the backdrop of prevailing public interests.[190]

‘Statistics’ refers to the methodical handling of empirical data.[191] ‘Statistical purposes’ comprise those acts necessary to conduct such handling and to produce statistical results.[192] It must be noted that the term ‘statistics’ in this context implies that data collected during surveys might indeed include data of natural persons, yet it is not used in connection with the individual. Statistical use requires an aggregate or pseudonymised form of data.[193] Recital 25 ePrivacy Regulation exemplifies respective fields of application. Hence, the tracking of physical movements, based on the scanning of equipment-related information, is admissible, provided that such counting is limited in time and space and to the extent necessary for this purpose. Also, the data must be made anonymous or erased as soon as it is no longer needed.[194] In the legislator’s notion, functionalities of physical tracking include the counting of people, such as the number of people waiting in line, or ascertaining the number of people in a specific area. By contrast, no statistical counting is seen in commercial messages to end-users with personalised offers, e.g. when entering stores, or in the tracking of individuals over time, including repeated visits to certain locations.
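A privileged counting operation in the sense of Recital 25 could technically look like the following sketch (the salt handling and identifiers are assumptions, not a prescribed method): device identifiers are hashed with a throwaway per-survey salt, only the aggregate headcount is returned, and no individual-level record survives the survey.

```python
import hashlib
import secrets

# Ephemeral, per-survey salt: hashes cannot be linked across surveys,
# which prevents the prohibited tracking of repeated visits over time.
salt = secrets.token_bytes(16)

def count_devices(observed_macs: list) -> int:
    """Return only the aggregate headcount for one counting interval;
    raw identifiers are salted, hashed and discarded after counting."""
    seen = {hashlib.sha256(salt + mac.encode()).digest() for mac in observed_macs}
    return len(seen)  # aggregate result, no individual-level data retained

macs = ["AA:BB:CC:DD:EE:01", "AA:BB:CC:DD:EE:02", "AA:BB:CC:DD:EE:01"]
assert count_devices(macs) == 2  # one device was sighted twice
```

Discarding the salt after each survey is what turns the hashed values into effectively anonymous data, matching the requirement that the data be anonymised or erased as soon as it is no longer needed.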

Being a particularly intrusive form of collecting information, the tracking of physical movements is made subject to further restrictions under Art. 8 Sec. 2a and 2b ePrivacy Regulation. Accordingly, a clear and prominent notice has to be posted at the boundaries of the zone of coverage.[195] The notice must inform end-users that the technology is in operation within a given perimeter prior to entering the defined area. The information should also include the purpose of and the person responsible for the tracking, the modalities of the collection, as well as the existence of measures available to minimise or stop the collection. Where personal data is collected, additional information pursuant to Art. 13 GDPR must be provided.[196] Since it will usually not be possible to rule out the collection of personal data, the latter obligation must in practice be considered mandatory. In turn, the collected information may also be used for non-statistical purposes related to the specific area, i.e. for the above-mentioned commercial messages.[197]

As to the concrete form of display, Art. 8 Sec. 3 ePrivacy Regulation allows the use of standardised icons instead of a textual display of the aforementioned information. Furthermore, Arts. 8 Sec. 4, 25 and 26 ePrivacy Regulation empower the European Commission to adopt delegated acts providing further requirements.[198]

Moreover, the collection of information emitted by terminal equipment is conditional on the application of appropriate technical and organisational measures (TOM) to ensure a level of security appropriate to the risks, as set out in Art. 32 GDPR. In particular, this includes the encryption of personal data, safeguards for the technical integrity and availability of related systems, as well as the regular review of their effectiveness.[199] As Rec. 25 ePrivacy Regulation stipulates, pseudonymisation, in principle already being part of these measures under Art. 32 GDPR, applies regardless of whether personal data is concerned or not. Thus, as far as it is technically operable and sensible in scope, information in the context of Art. 8 Sec. 2 lit. c ePrivacy Regulation always needs to be pseudonymised, so that personal data becomes apparent only upon the addition of further information.[200]
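One common technical realisation of such pseudonymisation is a keyed hash, sketched below; the key management shown is a hypothetical assumption, not a requirement of the Regulation. The key constitutes the “additional information” within the meaning of Art. 4 No. 5 GDPR: kept separately, it is what would be needed to link a pseudonym back to a device.

```python
import hashlib
import hmac

# Hypothetical key, to be stored separately under access control; without
# it, the pseudonym cannot be traced back to the original identifier.
key = b"stored-separately-under-access-control"

def pseudonymise(identifier: str) -> str:
    """Replace a device identifier with a keyed-hash pseudonym (HMAC-SHA256)."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()

p1 = pseudonymise("AA:BB:CC:DD:EE:01")
p2 = pseudonymise("AA:BB:CC:DD:EE:01")
assert p1 == p2             # deterministic: still usable for aggregate counting
assert "AA:BB" not in p1    # the raw identifier no longer appears anywhere
```

Unlike anonymisation (cf. footnote 200), this mapping remains reversible in principle for whoever holds the key, which is precisely why the data remains pseudonymised rather than anonymous.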

Art. 8 Sec. 2 lit. c ePrivacy Regulation is associated with various practical problems. First, it is difficult to determine the boundaries of the zone of collection. Indeed, both capturing devices, e.g. a wireless base station, and emitting devices, such as cellular telephones, have a specific range within which information may be emitted or captured (cf. Rec. 25 ePrivacy Regulation). However, from a technical perspective it is nearly impossible to mark the reach of the network or the range of the device exactly. Thus, entities engaging in offline tracking would either have to mark a wider area with information signs or reduce the signal strength of their devices in order to ensure that there is no capturing outside the marked boundaries.[201] Such restrictions will either limit the usefulness of a corresponding setup or create liability risks, since any capture outside the marked boundaries can be grounds for administrative fines pursuant to Art. 23 Sec. 2 lit. a ePrivacy Regulation.[202]
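How strongly a reduction in transmit power shrinks the zone of collection can be estimated, under highly idealised free-space assumptions, with the standard free-space path loss formula; the power and sensitivity figures below are illustrative, and real indoor ranges are far smaller than this upper bound.

```python
import math

def free_space_range_m(tx_power_dbm: float, rx_sensitivity_dbm: float,
                       freq_hz: float) -> float:
    """Idealised free-space estimate (in metres) of the radius within
    which an emission remains capturable by a given receiver."""
    link_budget_db = tx_power_dbm - rx_sensitivity_dbm
    # FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55  (d in metres, f in Hz)
    return 10 ** ((link_budget_db - 20 * math.log10(freq_hz) + 147.55) / 20)

full = free_space_range_m(20, -90, 2.4e9)   # illustrative Wi-Fi parameters
reduced = free_space_range_m(0, -90, 2.4e9) # transmit power cut by 20 dB
assert math.isclose(reduced, full / 10)     # 20 dB less => one tenth the range
```

The sketch also illustrates the practical dilemma described above: because range falls only with the logarithm of power, an operator must sacrifice a large share of signal strength to shrink the coverage zone meaningfully, or else mark a correspondingly larger area with notices.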

Secondly, Art. 8 Sec. 2 lit. c ePrivacy Regulation has been criticised for its lack of practicality with regard to privacy itself: an end-user will only be able to avoid a corresponding setup by either turning off his or her device or avoiding the area of operation altogether. Both options are not feasible in many situations.[203] Rather, end-users will leave their devices switched on and endure the procedure. Mere endurance, however, cannot replace a deliberate decision. Indeed, the European Parliament[204] and the Council[205] sought to tighten the provision compared to earlier, more broadly worded versions. Under the first proposal of 2017, the collection of device-emitted information would have been admissible for establishing a connection to a network or upon setting up information signs alone.[206] The restriction to the purpose of statistical surveys thus represented an important adjustment. However, under the above-mentioned conditions it is hard to actually demarcate a concrete scope of application, given the sweeping practice of tracking.[207]

Finally, it is questionable to what extent these and other forms of private statistical surveys can be justified by this provision. Without a teleological reduction to public entities or persons holding an official permit, the exception could, one might think, sprawl beyond its initial intent. Indeed, any given person could demarcate a random area and conduct surveys at his or her discretion. This practice, as laid out above, would contradict the legislative concept if the privilege were justified particularly by a prevailing public interest.[208] For this reason, the literature has in the past excluded private commercial purposes, such as direct marketing, opinion research or economic risk assessments, from the range of application of Art. 89 GDPR.[209] Yet, in the context of Art. 8 Sec. 2 lit. c ePrivacy Regulation, this finding does not apply. Contrary to the predominantly fundamental-rights-motivated regulation in the context of the GDPR, the ePrivacy Regulation also focuses in particular on economic concerns within telecommunication-related matters.[210] This explains why the explicit wording of Rec. 25 ePrivacy Regulation encompasses the specific example of services tracking physical movements. Since neither Art. 8 Sec. 2 nor Rec. 25 ePrivacy Regulation restricts these services to public entities, it follows that private providers in particular can claim this justification. Thus, in the context of Art. 8 Sec. 2 ePrivacy Regulation, earlier restrictions to public surveys are no longer applicable.

d) Provision of a requested service (Sec. 2 lit. d)

Art. 8 Sec. 2 lit. d ePrivacy Regulation stipulates a justification for the collection of device-emitted information in case it is necessary for providing a service requested by the end-user. This provision takes up the exception in Art. 8 Sec. 1 lit. c ePrivacy Regulation.[211] It is justified by the idea that, when making use of certain functionalities, interferences with individual privacy must be taken into account. Moreover, the term ‘requested by the end-user’ clarifies that a certain element of will is involved, bringing this justification close to the consent requirement in Art. 8 Sec. 2 lit. b ePrivacy Regulation.

e)  Opening clauses to safeguard general public interests

The provision in Art. 8 Sec. 2 ePrivacy Regulation is, just as Art. 8 Sec. 1, subject to the opening clause in Art. 11 Sec. 1 ePrivacy Regulation, which allows for additional legislation by the European Union and the EU Member States. Such legislation may restrict the rights and obligations set forth in Arts. 5 to 8 ePrivacy Regulation only in the public interest. As of now, it is not clear to what extent the national legislators in the EU Member States will make use of this opening clause.

[189] The Council of the European Union clarifies in its latest amendment proposal that data needed to maintain a connection might also be collected and stored, Council of the European Union, ST 6771/19, p. 59.

[190] Hense, in: Sydow, Europäische Datenschutzgrundverordnung, Art. 89 (2018), rec. 6.

[191] Buchner/Tinnefeld, in: Kühling/Buchner, DS-GVO BDSG, Art. 89 DSGO (2020), rec. 15.

[192] Cf. Rec. 162 GDPR.

[193] Rec. 162 GDPR.

[194] Cf. also Rec. 25 ePrivacy Regulation.

[195] Rec. 25 ePrivacy Regulation.

[196] See Voigt/von dem Bussche, GDPR – A Practical Guide (2017), pp. 38-43 for details.

[197] Cf. Rec. 25 ePrivacy Regulation, last sentence.

[198] Rec. 41 ePrivacy Regulation.

[199] Voigt/von dem Bussche, ibid., pp. 143-146 for details.

[200] Cf. for definition Art. 4 Sec. 1 lit. a ePrivacy Regulation in connection with Art. 4 No. 5 GDPR. ‘Pseudonymisation’ needs to be distinguished from the term ‘anonymization’ in the way that the latter, after encrypting respective personal data, cannot be reconnected to the individual.

[201] Critically with respect to practicability: Steinrötter, in: Specht/Mantz, Handbuch Europäisches und deutsches Datenschutzrecht, § 5 ePrivacy (2019), Rec. 36. Accordingly, it must be noted that technical reality is characterised by the effectiveness and reliability of connections to Wi-Fi or Bluetooth networks, rendering both a comprehensive information obligation and a satisfactory reception of this information by the end-user difficult. Indeed, it appears questionable whether (particularly against the backdrop of its rather impractical technical prerequisites) an ‘escape into consent’ or a ‘disguise as statistical survey’ actually justifies the surreptitious collection of information from an end-user.

[202] For details on the sanctioning system of the ePrivacy Regulation see Art. 23 et seq. ePrivacy Regulation.

[203] Engeler/Felber, ZD 2017, 251, 255-256.

[204] European Parliament, LIBE report A8-0324/2017, 20 October 2017, amendments 98 – 100.

[205] Council of the European Union, ST6771 2019 INIT, pp. 59-60.

[206] Art. 8 Sec. 2 of the proposal for a regulation of the European Parliament and of the Council, 10 January 2017, COM(2017) 10 final.

[207] E.g. in any navigation programme or other location-related services.

[208] Cf. in the context of Art. 89 GDPR Hense, in: Sydow, Europäische Datenschutzgrundverordnung, Art. 89 (2018), rec. 6.

[209] Buchner/Tinnefeld, in: Kühling/Buchner, DS-GVO BDSG, Art. 89 DSGO (2020), rec. 15a.

[210] Cf. Rec. 1 – 7 GDPR and Rec. 2a – 6 ePrivacy Regulation.

[211] For more details, see No. I.3.c).