PageFair writes to all EU Member States about the ePrivacy Regulation

This week PageFair wrote to the permanent representatives of all Member States of the European Union in support of the proposed ePrivacy Regulation.
Our remarks were tightly bounded by our expertise in online advertising technology. We do not have an opinion on how the proposed Regulation will impact other areas.
The letter addresses four issues:

  1. PageFair supports the ePrivacy Regulation as a positive contribution to online advertising, provided a minor amendment is made to paragraph 1 of Article 8.
  2. We propose an amendment to Article 8 to allow privacy-by-design advertising. This is because the current drafting of Article 8 will prevent websites from displaying privacy-by-design advertising.
  3. We particularly support the Parliament’s 96th and 99th amendments. These are essential to enable standard Internet Protocol connections to be made in many useful contexts that do not impact privacy.
  4. We show that tracking is not necessary for the online advertising & media industry to thrive. As we note in the letter, behavioural online advertising currently accounts for only a quarter of European publishers’ gross revenue.

[x_button shape=”rounded” size=”regular” float=”none” href=”” info=”none” info_place=”top” info_trigger=”hover”]Read the letter [/x_button]

The digital economy requires a foundation of trust to enable innovation and growth. The enormous growth of adblocking (to 615 million active devices) across the globe proves the terrible cost of not regulating. We are witnessing the collapse of the mechanism by which audiences support the majority of online news reports, entertainment videos, cartoons, blogs, and cat videos that make the Web so valuable and interesting. Self-regulation, lax data protection and enforcement have resulted in business practices that promise a bleak future for European digital publishers.
Therefore, we commend the Commission and Parliament’s work thus far, and wish the Council (of Ministers of the Member States) well in their deliberations.

Risks to brands under new EU regulations

Brands face serious new risks under the GDPR and the ePrivacy Regulation (ePR), and agencies will not be able to shield them. This note explains why, and describes what these risks are. 
When the GDPR and the ePrivacy Regulation apply a year from now, brands that use personal data in their marketing campaigns will become exposed to new legal risks, irrespective of their arrangements with ad agencies. Though the new rules are European, the exposure will be global.
[prompt type=”left” title=”Access the GDPR/ePR repository” message=”A repository of GDPR and ePrivacy Regulation explainers, official docs, and current status.” button_text=”Access Now” href=””]
Brands are directly exposed for two reasons.

Why agencies cannot shield brands

The first reason is legal: the text of the General Data Protection Regulation (GDPR) says that “each controller or processor shall be held liable for the entire damage”, where more than one controller or processor are “involved in the same processing”.[1] In other words, all parties involved in the use of personal data are fully liable.[2] A brand is safe from this liability only if it can prove that it was “not in any way responsible for the event giving rise to the damage”.[3] 
The second reason is financial. The administrative fines provided for in the GDPR (and in the proposed ePR) rise to 4% of “total worldwide annual turnover of the preceding financial year”.[4] Apple, for example, will be exposed to fines of up to €7.6 billion if it or service providers acting on its behalf misuse personal data.[5] For P&G the exposure will be €2.3 billion, based on its 2016 turnover,[6] and for Unilever the exposure will be €2.1 billion.[7] It is not yet known whether data protection authorities will apply these maximum fines, but this new risk is now defined in the text of the Regulations. This potential exposure is so great that agencies cannot adequately indemnify brands against losses.
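As a rough illustration of how the 4% ceiling produces the figures above, here is a minimal sketch. The P&G and Unilever turnover figures come from the 2016 annual reports cited in the footnotes; the USD-to-EUR conversion rate applied to Apple's net sales is an assumption for this sketch.

```python
# The GDPR's maximum administrative fine is 4% of total worldwide annual
# turnover of the preceding financial year (GDPR, Article 83).
GDPR_MAX_FINE_RATE = 0.04

def max_exposure_eur(turnover_eur_bn: float) -> float:
    """Maximum fine exposure, in EUR billions, for a given annual turnover."""
    return turnover_eur_bn * GDPR_MAX_FINE_RATE

# Annual turnover in EUR billions. Apple's USD net sales ($215.639bn) are
# converted at an assumed rate of roughly 0.88 USD->EUR.
companies = {
    "Apple": 215.639 * 0.88,
    "P&G": 58.2,
    "Unilever": 52.7,
}

for name, turnover in companies.items():
    print(f"{name}: exposed to fines of up to EUR {max_exposure_eur(turnover):.1f} billion")
```

Running this reproduces the approximate exposures quoted above (€7.6bn, €2.3bn, and €2.1bn respectively), subject to the assumed exchange rate.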

New pressure on brand-agency relationship

This introduces new pressures to the relationship between brands and agencies. Brand-agency contracts generally include limited indemnities. As brands become aware of the risk to which they are about to be exposed, they will expect agencies to take on far greater liability. As a result, contract negotiations between brands and agencies will become fraught. This has already happened in the cloud services industry, where both service providers and clients are acutely aware of the risks and argue fiercely over indemnity.
Not only will advertising agencies be unwilling to take on vast new liabilities to protect their clients, they may also be unable to do so. Agencies’ insurance cover may be inadequate for them to offer sufficient indemnities to brands. Indeed, the exposure may be so great that the question is not whether an insurer can cover the agency, but whether a re-insurance provider can cover the insurer to do so. The data leakage inherent in the online behavioural advertising system means that insurance and reinsurance companies are likely to take a dim view of the risk involved.

Three causes of brand risk

This leads to the question of what exposes brands to risk. There are three things to consider.
1. The brand’s own personal data 
The first type of exposure comes from how brands directly obtain and use personal data. They are exposed in several ways, including if they use personal data that are not compliant,[8] if their websites leak these data, or if the personal data they hold are otherwise breached. These risks are readily identifiable, and are easily remedied.
2. Augmenting personal data held by the brand with broker data 
The second type of exposure is less obvious. Many brands purchase data to augment the profiles they maintain of their customers, or of advertising targets. This is seductive from a marketer’s perspective. For example, one data seller offers brands the ability to “tie online and offline data across multiple channels back to the consumer … and activate upon them everywhere”.[9] Brands can buy from a data broker the location, real names, contact details, interests, purchasing history, and demographic information of customers for whom they already hold some data, to “accurately identify the precise locations of your customers or prospects”.[10] 
But however attractive this may be to a marketer, these data are fraught with risk because they are derived from personal data, or are themselves personal data.[11] Combining these data of unknown provenance with a brand’s own first party data therefore exposes a brand to risk, irrespective of whether the brand’s first party data are compliant. There are four reasons for this (see box below).

The Four Dangers of Purchased Data 

  1. The purchased data are personal data, or were generated from personal data, the use of which requires informed consent from the data subject.[12] This is very unlikely to have been obtained.
  2. Personal data must be accessible, rectifiable, and portable,[13] and a person has the right to object to profiling for direct marketing.[14] These rights are very unlikely to have been adequately provided.
  3. Personal data cannot include “sensitive data” as defined in the GDPR.[15]
  4. Data subjects must be informed of and able to object to automated decisions that use personal data about them, such as segmentation, where this has a material impact.[16] This is very unlikely to have been provided.

3. Using personal data in online advertising 
The third type of exposure is the most challenging. Under the new rules it will be illegal for companies anywhere in the world to pass a European user’s personal information to another company, or to store these data, without agreeing a formal contract with the “data controller” (normally this is the company that requested the data from the user in the first place) that defines limits on how the data can be used.[17]
This is challenging because the online behavioural advertising system passes personal data among countless parties, including ad exchanges, retargeting systems, media owners, demand side platforms, data management platforms, and potentially many unknown others. We produced this 30-second explanatory video to show how sharing personal data within this system exposes both brands and the agencies acting on their behalf to risk.
A brand that passes personal data to partners within this system, or pays its agency to do so on its behalf, is exposed to risk because it is impossible to conclude the required contracts with all of the parties that might gain access to the data. Even if a brand could conclude contracts with all foreseeable parties, the use of JavaScript on publishers’ websites allows nefarious and unforeseen parties to gain unauthorized access to the personal data.

Snippets of the data discussion at the World Federation of Advertisers’ Global Marketing Week in Toronto 2017


For US-based companies the new rules may seem like an unwarranted European overreaction. But it is important to note that they contain many ideas suggested by American regulators almost a decade ago.[18]
Brands have eleven months to resolve these issues before they become exposed. The first two types of risk – what brands do with their own data, and whether they contaminate these data with purchased data of unknown provenance – are comparatively easy to resolve. The third type of risk, which is inherent to the online behavioural advertising system, is far more difficult to address. But it is addressable. PageFair is now drawing together interested parties to collaborate on a Data Protection Platform that solves this problem.
Please share our call for collaboration on this with any colleagues who might be interested. We are keen to hear from agencies, brands, and publishers.
Thanks to Philip Lee, Partner, Fieldfisher LLP; Anna Buchta, Head of Litigation at the European Data Protection Supervisor; Rachel Glasser, Director of Digital Privacy at GroupM; and Bethan Crocket at GroupM. 
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=””]

See also

See previous PageFair Insider notes on the ePrivacy Regulation and the GDPR:

PageFair statements at the European Parliament


[1] The word “processing” here means an “operation or set of operations” performed on personal data or sets of personal data. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Article 4, paragraph 2.
[2] ibid., Article 82, paragraph 5.
[3] ibid., Article 82, paragraph 3.
[4] ibid., Article 83, paragraph 5. (See also Recital 149, which discusses criminal penalties and recovery of profits under Member State laws).
[5] $215,639 million net sales in the twelve months up to 24 September 2016. Apple Consolidated Financial Statements, 25 October 2016.
[6] $65.3 billion net sales (€58.2 billion). “Financial highlights”, P&G annual report 2016.
[7] €52.7 billion. Unilever Annual Report and Accounts 2016, p. 23.
[8] For example, the brand would have had to inform each data subject of all purposes for which their data would be used, and all types of parties that would receive the data, when it first obtained the data. The GDPR, Article 13, paragraph 1, c, and 2, and Article 14, paragraph 1, c.
[9] “LiveRamp’s identity graph”, LiveRamp (URL:, last accessed 18 May 2017).
[10] Alistair Dent, “Third-Party Data Is Awesome, But Maybe Too Powerful”, Marketing Land, 18 June 2015 (URL:; see also “Micromarketer Xpress”, (URL:; and “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, p. 22.
[11] For example, one global brand’s privacy policy says it may obtain information from commercial sources, including “including name, postal address, email address, date of birth, income level, household information, Your interests such as hobbies and pets, Consumer and market research data, Purchase behaviour, Publicly observed data or activities, such as blogs, videos, internet postings, and user generated content”. The policy says “All of the information we collect about you may be combined …”
P&G Global Consumer Privacy Policy, URL:, last accessed 7 April 2017.
Another top global brand’s privacy policy says the personal data it has on customers “may be combined with [information] … that is publicly available, or that we may otherwise obtain … from providers of demographic and other information, social media platforms and other third parties”.
The Coca-Cola Company Privacy Policy, February 2017, (URL:
Yet another top global brand’s privacy policy tells readers that “we may receive information about you from publicly and commercially available sources (as permitted by law), which we may combine with other information we receive from or about you.” Samsung Privacy Policy & Choices, 10 February 2015 (URL:
[12] The FTC reports that data brokers collect data from sources such as warranty registrations, consumer purchases, and website registrations and cookies. None of these are likely to meet the heightened standard for consent set in the GDPR, in Article 6. “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp iv, v.
Indeed, it would be impossible for them to do so in many cases. Seven of the nine data brokers in the FTC’s 2014 study provided data to each other. “It would be virtually impossible for a consumer to determine how a data broker obtained his or her data; the consumer would have to retrace the path of data through a series of data brokers”. “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, p. iv.
[13] The General Data Protection Regulation, Article 15, 16, 17, 18, 19, 20, and 21. Note that the FTC reported in 2014 that only two brokers allowed people to correct information about them, and four allowed people to ‘suppress’ rather than delete data about themselves. “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp. 42-3.
[14] ibid., Recital 70, and Article 21, paragraph 2.
[15] ibid., Recital 75, and  Article 9, paragraph 1. “…data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited”.
[16] ibid., Recital 71, and Article 13, paragraph 2, f, and Article 14, paragraph 2, g, and Article 15, paragraph 1, h, Article 21, paragraph 1, and Article 22.
[17] ibid., Article 28, paras. 2, 3 and 4, and Article 29.
Here is how this will operate. Current European rules require contracts between data controller and processor that guarantee that the processor handles the personal data only in the manner dictated by the controller. (see Data Protection Directive (95/46/EC) 1995 Article 17, para. 3. (URL: However, this is now backed up by new sanctions, and the GDPR will require that these contracts define the nature and duration of processing (Regulation (EU) 2016/679, Article 28, para. 3). Similar agreements must also be in place when one processor engages another (ibid., Article 28, para. 4), and a processor can only do so with express permission from the controller (ibid., Article 28, para. 2).
[18] The US FTC proposed several of the provisions of the GDPR as long ago as 2009, and again in 2012 and 2014. The US Government Accountability Office made similar calls in 2013. “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp 5-7, 49-52.

PageFair statement at European Parliament rapporteur's ePrivacy Regulation roundtable

Lightly edited transcription of PageFair remarks at rapporteur’s sessions at the European Parliament in Brussels on 29 May 2017, concerning the ePrivacy Regulation. 
Statement at roundtable on Articles 9 and 10. 
Dr Johnny Ryan: Thank you. PageFair is a European adtech company. We are very much in support of the Regulation as proposed, in so far as it relates to online behavioural advertising (OBA). Read more

Reprieve for IT departments as EU court rules on IP addresses

If you run a website, you might want to breathe a sigh of relief. A decision[1. The text of the ruling is available in a range of European languages (excluding English as of the time of writing) at] this morning from the European Court of Justice means that websites can continue to store visitor IP addresses.
The EU Court of Justice (ECJ) ruled that IP addresses are to be considered “personal data”, which are subject to the EU’s data protection rules, but hedged against causing disruption by watering down the ruling.
From the ECJ press release:

The dynamic internet protocol address of a visitor constitutes personal data, with respect to the operator of the website, if that operator has the legal means allowing it to identify the visitor concerned with additional information about him which is held by the internet access provider.

It would have been a shock to many if the ruling had gone the other way.[1. It could have been different, if the additional clause referring to legal means had not been included, as personal data is subject to stringent protection in the EU. Interestingly, the ruling slightly diverges from an opinion of an ECJ Advocate General delivered in May. Cases before the ECJ are considered in advance by an Advocate General, who publishes a (non-binding) opinion with which the Court often agrees. In May, AG Campos Sánchez-Bordona issued an opinion on this case that agreed that dynamic IP addresses constitute personal data, but also said that these data can be processed and stored without consent in cases where this is necessary to ensure a web service’s functionality.]

Why this matters

The immediate impact of a decision stopping the logging of IP addresses would have been disruption to many websites and services. IT departments everywhere would have thrown up their hands in despair at the task of expunging IP addresses from systems and databases that have relied on them.
Web services routinely keep a log of their users’ IP addresses. These logs are used for numerous largely mundane and innocuous purposes, such as to provide customized features to particular users, to prevent or enable access to content, or to blacklist IP addresses involved in “denial of service” attacks against a site.
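One of the mundane uses above, flagging addresses behind a suspected denial-of-service attack, can be sketched as follows. The common-log-format lines and the request threshold are assumptions for illustration only.

```python
from collections import Counter

def flag_abusive_ips(log_lines, threshold=100):
    """Count requests per IP address (first field of each common-log-format
    line) and return the set of addresses at or above the threshold."""
    counts = Counter(line.split()[0] for line in log_lines if line.strip())
    return {ip for ip, n in counts.items() if n >= threshold}

# Example: 150 requests from one address, 3 from another.
log = ['203.0.113.9 - - [19/Oct/2016] "GET / HTTP/1.1" 200'] * 150 \
    + ['198.51.100.7 - - [19/Oct/2016] "GET / HTTP/1.1" 200'] * 3
print(flag_abusive_ips(log))  # -> {'203.0.113.9'}
```

A filter like this depends on retaining IP addresses in logs, which is why a ruling barring such logging would have been so disruptive.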
IP addresses are rather more valuable to other companies. For instance, some adtech companies use IP addresses to identify and target consumers. Netflix and other content providers rely on IP addresses to restrict the use of VPNs to access TV shows and movies in blocked countries.[1. Geolocation can work at just the country-level, making it unnecessary to track individual IP addresses, and there are ways for Netflix et al to prevent VPN abuse, especially as business entities do not enjoy the same protection as individuals, but such workarounds would take time and money.]
While the ruling will probably pass unnoticed by most, it is clear that websites have been granted a very real (although possibly temporary[1. The General Data Protection Regulation could change the rights landscape once it is applied in May 2018, as it includes stringent rules on how websites can handle personal data.]) reprieve, as the EU has been quick to act on ECJ rulings despite potentially devastating effects on companies both in Europe and elsewhere.

Background to the ECJ’s decision

The ECJ was asked to rule on two issues:

  1. whether an IP address is personal data,[1. See (accessed October 11, 2016). Also, this is not the first time that the ECJ has concluded that IP addresses could be considered personal data. In Case C-70/10 Scarlet Extended SA v SABAM, a dispute between an ISP and a company “responsible for authorising the use by third parties of the musical works of authors, composers and editors”, the ECJ ruled that the ISP, Scarlet, could not be compelled to install a filtering system to detect and prevent the unlawful exchange of copyrighted works, as

    …the filtering system would also be liable to infringe the fundamental rights of its customers, namely their right to protection of their personal data and their right to receive or impart information, which are rights safeguarded by the Charter of Fundamental Rights of the EU. It is common ground, first, that the injunction would involve a systematic analysis of all content and the collection and identification of users’ IP addresses from which unlawful content on the network is sent. Those addresses are protected personal data.

    However, while it opened the door to the classifying of IP addresses as personal data and was referenced in the Breyer opinion, AG Campos Sánchez-Bordona noted that the SABAM case was “in a context in which the collection and identification of IP addresses was carried out by the Internet service provider”. Today’s judgement has farther-reaching consequences: the ISPs in the SABAM case already knew who their customers are, whereas the Breyer case affects any and all websites.] and

  2. whether the practice of logging IP addresses without consent was legal.[1. Or, more precisely, in accordance with the relevant provision of the German Telemedia Act, which states that a website provider may collect and process the personal data of users without their consent only to the extent it is necessary to (1) enable the general functionality of the website or (2) arrange payment. In addition, the relevant provision of the Telemedia Act states that enabling the general functionality of the website does not permit user data to be processed after the user closes, or navigates away from, the website.]

This followed eight years of litigation in various German courts[1. From Amtsgericht to Landgericht to Bundesgerichtshof.], which originated in an action taken against the German government by Patrick Breyer, a member of Germany’s Pirate Party.[1. Case C-582/14 Patrick Breyer v Bundesrepublik Deutschland. EUR-Lex. (accessed October 11, 2016).] Breyer argued that government websites did not have an unrestricted right to indefinitely record the IP addresses of visitors without their consent.
Although IP addresses on their own are largely innocuous, Breyer gave two ways that government websites could combine IP addresses with other data to identify an individual.
First, internet service providers (ISPs) record customers’ real names and addresses, and assign them their IP addresses. It is not inconceivable that a government could gain access to these records and connect a person’s real identity to their IP address.
Second, when combined with pages visited or search terms, IP addresses can provide an extensive profile of the visitor’s “political opinion, illnesses, religion, union affiliation” and more.[1. Translated from the original German suit brought by Breyer: (accessed October 12, 2016).]

The ruling

Today’s ruling will probably allow the German Federal Court of Justice to rule against Breyer, as it effectively states that:

  1. a dynamic IP address constitutes personal data for a website operator only if it has the legal means enabling it to identify the visitor with the help of additional information from the ISP
  2. a website operator may collect and store personal data without consent for an indeterminate period so as to ensure the continued functioning of the website



JANUARY 3, 2008
Patrick Breyer asks Berlin local court to stop German government websites logging IP addresses
AUGUST 13, 2008
Local court dismisses case, arguing that an IP address is insufficient to identify an individual
JANUARY 31, 2013
Breyer appeals decision to Berlin district court, which orders German government to cease unrestricted logging of IP addresses
SEPTEMBER 16, 2014
German Federal Court of Justice addresses appeals from both parties
DECEMBER 17, 2014
German Federal Court of Justice refers two questions to European Court of Justice
MAY 12, 2016
Advocate General Sánchez-Bordona delivers his non-binding but influential opinion
OCTOBER 19, 2016
European Court of Justice rules that IP addresses are personal data under some circumstances