Posts

PageFair writes to all EU Member States about the ePrivacy Regulation

This week PageFair wrote to the permanent representatives of all Member States of the European Union in support of the proposed ePrivacy Regulation.
Our remarks were tightly bounded by our expertise in online advertising technology. We do not have an opinion on how the proposed Regulation will impact other areas.
The letter addresses four issues:

  1. PageFair supports the ePrivacy Regulation as a positive contribution to online advertising, provided a minor amendment is made to paragraph 1 of Article 8.
  2. We propose an amendment to Article 8 to allow privacy-by-design advertising. This is because the current drafting of Article 8 will prevent websites from displaying privacy-by-design advertising.
  3. We particularly support the Parliament’s 96th and 99th amendments. These are essential to enable standard Internet Protocol connections to be made in many useful contexts that do not impact privacy.
  4. We show that tracking is not necessary for the online advertising & media industry to thrive. As we note in the letter, behavioural online advertising currently accounts for only a quarter of European publishers’ gross revenue.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/03/PageFair-letter-on-ePrivacy-to-perm-reps-13-March-2018.pdf” info=”none” info_place=”top” info_trigger=”hover”]Read the letter [/x_button]

The digital economy requires a foundation of trust to enable innovation and growth. The enormous growth of adblocking (to 615 million active devices) across the globe proves the terrible cost of not regulating. We are witnessing the collapse of the mechanism by which audiences support the majority of online news reports, entertainment videos, cartoons, blogs, and cat videos that make the Web so valuable and interesting. Self-regulation, lax data protection and enforcement have resulted in business practices that promise a bleak future for European digital publishers.
Therefore, we commend the Commission and Parliament’s work thus far, and wish the Council (of Ministers of the Member States) well in their deliberations.

Overview of how the GDPR impacts websites and adtech (IAPP podcast)

In this podcast, the International Association of Privacy Professionals interviews PageFair’s Dr Johnny Ryan about the challenges and opportunities of new European privacy rules for website operators and brands. 
Update: 3 January 2018: This podcast was the International Association of Privacy Professionals’ most listened to podcast of 2017. 

The conversation begins at 4m 14s, and covers the following issues.

  • Risks for website operators
  • How “consent” is an opportunity for publishers to take the upper hand in online media
  • Brands’ exposure to legal risk, and the agency / brand / insurer conundrum
  • Personal data leakage in RTB / programmatic adtech
  • How the adtech industry should adapt

As we told Wired some months ago, it’s not just that websites might expose themselves to litigation; they might expose their advertisers to litigation too. But this can be fixed.
Click here to view PageFair’s repository of explainers, analysis, and official documents about the new privacy rules.
Elsewhere you can find details about PageFair’s GDPR solutions for website operators.
Note: the IAPP published this podcast this month. The interview was conducted several months ago. 
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Frequency capping and ad campaign measurement under GDPR

This note describes how ad campaigns can be measured and frequency capped without the use of personal data to comply with the GDPR. 
It is likely that most people will not give consent for their personal data to be used for ad targeting purposes by third parties (only a small minority [1] of people online are expected to consent to third party tracking for online advertising). Even so, sophisticated measurement and frequency capping are possible for this audience.
This note briefly outlines how to conduct essential measurement (frequency capping, impression counting, click counting, conversion counting, view through measurement, and viewability measurement) in compliance with the EU’s General Data Protection Regulation. This means that publishers and advertisers can continue to measure the delivery of the ads that sustain their businesses, while simultaneously respecting European citizens’ right to protection of their personal data.
Note that this discussion assumes that the final text of the EU’s ePrivacy Regulation will not inadvertently outlaw non-tracking cookies (i.e., cookies that neither contain nor reveal any personal data, and therefore pose no privacy risks) [2].
Table: cleaner ad tech measurement [3]

 

Frequency capping, without personal data

Most of today’s ad servers implement frequency capping by using a server-side database to store the number of times an ad campaign has been shown to each user. Each user is tracked using a unique ID, which is stored both in the database and in a 3rd party cookie in the user’s browser [4]. Since this user ID could be used to track what websites the user is visiting, and could potentially be matched against other online trackers and offline data, this will be illegal under GDPR (unless the user has specifically consented to it).
A privacy-by-design alternative is to get rid of the user ID and move the counter directly into the cookie. The cookie it is stored in can have an expiry equal to the maximum amount of time the campaign should be capped for, and the name of the cookie can be set to the name of the ad campaign. Variations on this approach have been discussed for years – see Arvind Narayanan and Jonathan Mayer’s approach here. Since the counter does not contain any information specific to a particular user, it is not “personal data” under GDPR, and is not subject to consent.
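As a rough sketch of this counter-in-cookie approach (the campaign name, cap, and capping window below are hypothetical, and a real ad server would integrate this into its request handling), the logic might look like this in Python, using only the standard library:

```python
from http.cookies import SimpleCookie

CAP = 3                 # hypothetical: maximum times this campaign may be shown
CAMPAIGN = "acme2018"   # hypothetical campaign name, used as the cookie name
MAX_AGE = 14 * 86400    # cookie expires with the capping window (14 days)

def serve_ad(request_cookie_header: str):
    """Decide whether to show the ad, using only a per-campaign counter.

    The cookie stores how many times this browser has seen the campaign,
    but contains no user ID, so every capped browser looks identical.
    """
    cookies = SimpleCookie(request_cookie_header)
    shown = int(cookies[CAMPAIGN].value) if CAMPAIGN in cookies else 0
    if shown >= CAP:
        return None  # cap reached: serve nothing (or a different campaign)
    # Increment the counter and send it back with the capping-window expiry
    response = SimpleCookie()
    response[CAMPAIGN] = str(shown + 1)
    response[CAMPAIGN]["max-age"] = MAX_AGE
    return response.output(header="Set-Cookie:")

print(serve_ad("acme2018=2"))  # third impression: counter goes to 3
print(serve_ad("acme2018=3"))  # cap already reached
```

Because the counter is identical for every browser that has seen the campaign the same number of times, it singles out no one, which is the basis for the "not personal data" argument above.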
Two inefficiencies of this approach, storage and bandwidth, are addressed below.
First, how much storage space will all those frequency capping cookies take up in the web browser? So long as the cookie expiry dates are reasonable, this data should be proportional to the number of ad campaigns delivered to a browser over a one- or two-week time window, and should not grow beyond that. Even if a user manages to view a hundred thousand different advertising campaigns in a two-week period, that would still require no more than a few megabytes to store.
Second, how much bandwidth might be consumed by transmitting all these view counters along with every request to the ad server? This can be reduced by encoding the cookie data efficiently. As shown in the inset below, transmitting a frequency counter in the ad server cookie could take as little as 9 bytes of extra bandwidth, which means that thousands of counters could be transmitted without significantly impacting the weight of a modern web page.

Calculating potential bandwidth requirements
The name of the cookie might be used to store a campaign ID, efficiently encoded using all the characters available according to RFC 6265 (i.e., “A-Z”, “a-z”, “0-9” and “!#$%&’*+-.^_`|~”). That’s 77 characters, meaning that just four characters can be used to encode over 35 million (i.e., 77⁴) different IDs, which is easily sufficient for all the ad campaigns that an ad server might be managing in a given period. Meanwhile, the value portion of the cookie need only contain a counter, which can be 2 characters long (to store a value up to 99 in decimal, or about 6,000 in base-77 encoding). With this system in place, a browser request to an ad server might consist of the following HTTP request:

GET /ad HTTP/1.1
Host: acmeadserver.com
Cookie: 2iP&=21; Rz%x=13

The “Cookie:” field concatenates all cookies previously stored by the ad server. We can see that the bandwidth consumed by each frequency counter is only 9 bytes in total: 4 characters for the campaign ID, 2 characters for the counter value, and 3 characters for the equals sign, the semi-colon delimiter and the space. It would take only about 20 KB to transmit over two thousand frequency counters in an ad call.
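The capacity claim is easy to verify. Below is a small, hedged sketch of a base-77 encoder/decoder over the RFC 6265 cookie-name alphabet (the specific alphabet ordering and the sample campaign ID are arbitrary choices for illustration):

```python
# The 77 characters RFC 6265 permits in a cookie name, as listed in the note.
ALPHABET = (
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "abcdefghijklmnopqrstuvwxyz"
    "0123456789"
    "!#$%&'*+-.^_`|~"
)
BASE = len(ALPHABET)  # 77

def encode(n: int, width: int = 4) -> str:
    """Encode an integer campaign ID as a fixed-width base-77 string."""
    chars = []
    for _ in range(width):
        n, r = divmod(n, BASE)
        chars.append(ALPHABET[r])
    return "".join(reversed(chars))

def decode(s: str) -> int:
    """Recover the integer campaign ID from its base-77 string."""
    n = 0
    for c in s:
        n = n * BASE + ALPHABET.index(c)
    return n

print(BASE ** 4)                # 35153041 distinct four-character IDs
print(encode(1234567))          # a compact 4-character campaign-ID token
print(decode(encode(1234567)))  # round-trips to 1234567
```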

There is a further opportunity to optimize bandwidth, with the help of header bidding. Because header bidding sends multiple potential bids to the client, the client can do the work of deciding which ones have not reached a frequency cap and are therefore eligible for display. SSPs are currently moving from second-price auctions to first-price auctions to better support header bidding, and in time are likely to start returning multiple bids to header bidder wrappers, so that more bids can participate in the final auction. This will provide plenty of choice to a client-side frequency capping algorithm, and probably entirely eliminate the need for the frequency capping data to ever leave the browser.
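The client-side filtering step described above can be sketched as follows. Note that the bid, counter, and cap structures here are hypothetical illustrations, not any real header-bidding wrapper’s API:

```python
def eligible_bids(bids, counters, caps):
    """Keep only bids whose campaign has not yet reached its frequency cap.

    bids:     list of (campaign_id, cpm) tuples returned by SSPs
    counters: {campaign_id: times_shown}, read from local cookies/storage
    caps:     {campaign_id: max_impressions}; a campaign with no recorded
              cap is treated as uncapped
    """
    return [
        (cid, cpm) for cid, cpm in bids
        if counters.get(cid, 0) < caps.get(cid, float("inf"))
    ]

bids = [("camp_a", 2.10), ("camp_b", 1.85), ("camp_c", 2.40)]
counters = {"camp_a": 3, "camp_c": 1}
caps = {"camp_a": 3, "camp_c": 5}

# camp_a is capped out; camp_b is uncapped; camp_c is still eligible
print(eligible_bids(bids, counters, caps))
```

Because the decision is made on the client, the frequency data never needs to be transmitted to any server at all.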

Campaign Metrics

Campaign metrics allow advertisers and media owners to see how different advertising campaigns are performing, and to optimize campaigns if necessary. The typical metrics are impression counts, click counts, conversion counts, view-through measurement and viewability measurement.
Fortunately, none of these metrics concern individual people. Therefore, ad servers can avoid tracking information at an individual user level to ensure GDPR compliance. It is likely that many of today’s ad servers have not been so careful, and have implementations that currently depend on counting the same user ID that was used for frequency capping. These ad servers will need to consider providing alternative implementations when serving ads to EU users.
In the paragraphs below we review typical campaign metrics for likely GDPR compliance, and suggest alternative implementations where appropriate.

Impression Counting

Impression counting is normally implemented by incrementing a database counter pertaining to the ad campaign whenever a request is made to the ad server for that campaign, or when an impression pixel specific to that campaign is loaded from the ad server by the web browser. Basic impression counting should not pose a problem under GDPR, as no user-specific information is processed or stored [5].

Click Counting

Click measurement is normally performed by redirecting the browser to the ad server when the ad is clicked on, at which point it registers the click event, and then redirects the browser to the advertiser URL.
Like basic impression counting, click counting should not be problematic under GDPR, as all that is required is an overall counter of the number of times that an ad has been clicked on across all users, without the involvement of any user-specific information.
When the number of clicks is known, the click-through-rate is given by dividing the number of clicks by the number of impressions for any given campaign.

Conversion counting

Conversion counting is normally performed by obtaining a campaign-specific pixel from the ad server, and placing that pixel on a web page the user will be brought to when they complete a transaction with the advertiser (when they “convert”). When a user who has clicked on an ad “converts”, the conversion pixel will be loaded from the ad server, which will increase the conversion count for that campaign by one.
As with impression counting and click counting, there is no specific privacy concern here: only campaign-level data is used, and no user-specific information is processed.

View Through Measurement

Although click-through rates help an advertiser understand the immediate positive reaction of users who see their ad, many are also interested in the indirect response, e.g., how many people who saw the ad went on to buy the product during the subsequent weeks regardless of clicking on the ad?
It is possible that some ad servers currently perform view-through measurement using user-specific information. For example, the ad server might record that an ad was viewed by a particular user ID. When the conversion pixel loads on the advertiser’s post-conversion page, the ad server could look up the details of the last time that user ID viewed that campaign.
The above implementation would be incompatible with GDPR, as it involves tracking the behavior of unique users. Fortunately, there are alternative implementations that are equally effective.
The correct approach is to use a non-tracking cookie to store the fact that the user has viewed the ad campaign. This means that the cookie would contain the ID of the ad campaign, not an ID of the user (and consequently, every user who saw that campaign would have an identical cookie). This cookie should be set to expire automatically when a certain amount of time has passed, beyond which the advertiser is not interested in attributing the visit to the fact the ad was seen. In this system, when the user eventually converts for the advertiser, the cookie containing the campaign ID is transmitted to the advertiser’s ad server, which can then increment the relevant view-through counter.
It is possible that this approach could lead to a lot of cookie data being transmitted and consuming bandwidth. To mitigate this, the view-through pixel should be homed on a domain unique to each advertiser, rather than every advertiser sharing the domain of their ad server.
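A minimal sketch of this non-tracking view-through mechanism might look like the following (the cookie-name prefix, campaign name, and attribution window are hypothetical, and a real deployment would set the cookie on the advertiser-specific domain discussed above):

```python
from collections import Counter
from http.cookies import SimpleCookie

ATTRIBUTION_WINDOW = 30 * 86400   # hypothetical 30-day attribution window
view_through_counts = Counter()   # campaign-level tallies only; no user data

def on_ad_view(campaign_id: str) -> str:
    """Set a cookie that records only *which campaign* was seen.

    Every browser that saw the campaign gets an identical cookie,
    so the cookie cannot single out any individual.
    """
    c = SimpleCookie()
    c["vt_" + campaign_id] = "1"
    c["vt_" + campaign_id]["max-age"] = ATTRIBUTION_WINDOW
    return c.output(header="Set-Cookie:")

def on_conversion(cookie_header: str):
    """On the advertiser's post-conversion page, credit each seen campaign."""
    for name in SimpleCookie(cookie_header):
        if name.startswith("vt_"):
            view_through_counts[name[3:]] += 1

print(on_ad_view("spring_sale"))
on_conversion("vt_spring_sale=1; other=x")
print(view_through_counts["spring_sale"])  # 1
```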

Viewability Measurement

Viewability measurement allows advertisers to understand how many of the ads they pay to serve on a web site are likely to scroll into view and be displayed for long enough for a user to potentially notice them.
Viewability measures a characteristic of websites, not users, and can therefore be implemented in a GDPR compatible fashion. Although some viewability systems today might store per-user information, there is no fundamental need to do so. All that is required is to detect when a view event has occurred for each ad space, and to send that to the server to be counted. The server will then aggregate these events and provide an overall count of the number of times each ad space has been viewable by any user in each time period.
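Such server-side aggregation might be sketched as below; the ad-slot name and the hourly bucketing are illustrative assumptions (any reporting period would work):

```python
from collections import Counter
from datetime import datetime, timezone

# Aggregate viewability events per ad slot per hour, with no per-user state.
viewable_counts = Counter()  # keyed by (ad_slot, hour bucket)

def record_view_event(ad_slot: str, when: datetime):
    """Count one 'ad became viewable' event for a slot in an hourly bucket."""
    bucket = when.strftime("%Y-%m-%dT%H:00Z")
    viewable_counts[(ad_slot, bucket)] += 1

t = datetime(2018, 3, 13, 10, 15, tzinfo=timezone.utc)
record_view_event("homepage_top", t)
record_view_event("homepage_top", t)
print(viewable_counts[("homepage_top", "2018-03-13T10:00Z")])  # 2
```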
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] See “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7 and “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017.
[2] For an overview of the issues see a previous PageFair Insider note, “The Privacy Case for Non-Tracking Cookies: PageFair writes to the European Parliament”, PageFair Insider, 10 August 2017 (URL:  https://pagefair.com/blog/2017/non-tracking-cookies/).
[3] Clearly, a campaign that targeted a single individual without a legal basis for doing so would be illegal. Campaigns must therefore target more than a small set of viewers.
[4] These data are automatically transmitted to the ad server along with every request.
[5] Ad servers may also support the counting of “unique impressions”, meaning the number of unique users who saw the campaign. This mechanism generally relies on tagging each user with a unique identifier, and counting the number of unique identifiers. Therefore, while basic impression counting is practical, counting unique impressions may not be, because the unique identifier it relies on could be misused for tracking.

Adtech consent is meaningless unless one stops data leakage

Websites and advertisers cannot prevent personal data from leaking in programmatic advertising. If not fixed, this will render consent to use personal data meaningless.
The GDPR applies the principle of transparency:[1] People must be able to easily learn who has their personal data, and what they are doing with it.
Equally importantly, people must have surety that no other parties receive these data.
It follows that consent is meaningless without enforcement of data protection: unless a website prevents all data leakage, a visitor who gives consent cannot know where their data may end up.
But the online advertising system leaks data in two ways. This exposes brands, agencies, websites, and adtech companies to legal risk.
How data leakage happens 
If “programmatic” advertising or “real time bidding” was ever a mystery to you, take 43 seconds to watch this PageFair video. It shows the process by which an advertiser decides that a person visiting a website is the right kind of person to show an ad to (click full screen).


This system was not built for data protection. Instead, it was built to enable hundreds of businesses to trade personal data about the people visiting websites, to determine what ads to show them, and what advertisers should pay to show those ads.
The next video shows what happens to personal data in this system. It illustrates each step in the selection and delivery of a single ad. (33 seconds)


The yellow arrows in this video show the parties with whom ad exchanges and other advertising technology services share data about the website visitor.
These data include the ad exchange’s own identifier on the user, the URL the user is visiting, the user’s IP address, and the details of the user’s browser and system.
Hundreds of parties receive these data in the milliseconds before an ad is shown.
To complicate matters, some websites work with more than one ad exchange, conducting a mega auction known as “header bidding” to solicit more bids for their ad units. The following video is 68 seconds long, and shows how this works.


Conclusion: there are two problems 
First, personal data about a website visitor are shared with hundreds of parties every time the website requests an ad through one or more ad exchanges. There is nothing to prevent these hundreds of parties from leaking these data to anyone else. This must be controlled.
Second, the advertisement that the website visitor is shown, once the bidding process concludes, often contains JavaScript. This code can then summon trackers (or worse). This, also, must be controlled.
The forthcoming Open RTB 3.0 specification contains measures to limit data leakage,[2] but not to stop it entirely. Unless publishers can exercise complete enforcement of data protection on their sites, consent is meaningless. Moreover, so long as data can leak, all parties involved are exposed to legal hazard: publishers and their partners too.[3]
PageFair has been developing a solution to this problem.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] The GDPR, Article 5, paragraph 1 (a), Article 12, Recital 39, and 60.
[2] Websites can prevent specific companies from placing bids to buy ad space on their pages. Websites can also reject JavaScript ads from unauthorised domains. The specification also contains scope for “whitelisted/blacklisted JS trackers”. Open RTB 3.0 draft specification, IAB TechLab, September 2017, pp 14, 27, and AdCom draft specification, IAB TechLab, September 2017, p. 14.
[3] The GDPR, Article 82, paras. 1, 3 – 4, Recital 146. After judgement the processors or controllers who have paid full compensation can claim back part of the compensation from processors or controllers also responsible (ibid., Article 82, para. 5).

Research result: what percentage will consent to tracking for advertising?

This note presents the results of a survey of 300+ publishers, adtech, brands, and various others, on whether users will consent to tracking under the GDPR and the ePrivacy Regulation. 
In early August we published a note on consent, and asked whether people would click “yes”. We would like to thank the 300+ colleagues who responded to our research request. Now we present the results.
UPDATE: 9 January 2018, SEE  MOST RECENT PAGEFAIR INSIDER NOTE ON GDPR CONSENT DIALOGUES from 8 January 2018.  

Tracking for a single brand, on a single site.

305 respondents were shown a notice in which a publisher asks permission for a named brand and its analytics partners to track them on the site. A previous note explains the design of this notice.

It is important to note that this is a limited consent notice. It asks to track behaviour on one site only, and for one brand only, in addition to “analytics partners”. This notice would not satisfy regulators if it were used to cover the vast chain of controllers and processors involved in conventional behavioural targeting.
Even so, four fifths (79%) of respondents said they would click “No” to this limited consent request.

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]
Only 21% said they would click “OK”. Moreover, as the chart below shows, only 14% were “very highly” or “highly” confident that the average user would also do so.
Respondents are concerned about how their own behaviour online is tracked for advertising, and would avail of proposed measures in the ePrivacy Regulation to protect themselves. Two thirds (67%) of respondents reported being “very highly” or “highly” concerned about their online behaviour being tracked (and half of these said they were “very highly” concerned).

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]

Device tracking preferences

Respondents were shown a tracking preferences menu of the kind proposed in the latest draft of the ePrivacy Regulation.[1] They were asked what they would select if shown the message on their own device.
This shows “Accept only first party tracking” selected by default, as proposed in the European Parliament rapporteur’s draft report.[2] However, only 20% of respondents said they would select this.
Only 5% were willing to “accept all tracking”. 56% said they would select “Reject tracking unless strictly necessary for services I request”.
The very large majority (81%) of respondents said they would not consent to having their behaviour tracked by companies other than the website they are visiting. Users’ apparent allowance for 1st parties, but objection to 3rd parties, should be heartening for publishers.

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]

The consenting audience will be tiny

Only a very small proportion (3%) believe that the average user will consent to “web-wide” tracking for the purposes of advertising (tracking by any party, anywhere on the web).
However, almost a third believe that users will consent if forced to do so by “tracking walls”, which deny access to a website unless a visitor agrees to be tracked. Tracking walls, however, are prohibited under Article 7 of the GDPR, the rules of which are already formalised and will apply in law from late May 2018.[3]

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]

Publishers see opportunity in adtech need to seek consent 

If adtech companies persist in using old-school personal data, rather than transition to safer non-personal data technologies, then they are likely to have to rely on publishers to facilitate consent requests to their users. A majority (61%) of publishers viewed this as a potential commercial opportunity: 27% of publishers said “yes”, and 34% said “maybe”, when asked if they saw a potential commercial opportunity in their ability to seek consent from data subjects on behalf of adtech companies that have no direct relationship with them.

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]
Most adtech colleagues (74%) also anticipate that they may have to compensate publishers for the opportunity to seek consent from their visitors. (Caveat: the sample included 19 adtech respondents, of the 305 total).


Safer ads and safer data, rather than consent

These results are not definitive, and we have not undertaken a full-scale research project in the production of this note. They do, however, seem to reflect reality. We take heart from the publication of a survey of some 11,000 respondents by GFK, commissioned by trade groups as supporting collateral in their lobbying against measures in the ePrivacy Regulation. The GFK study notes that only “20% would be happy for their data to be shared with third parties for advertising purposes”.[4] This is a remarkable conclusion for a study that argues for, rather than against, old-school behavioural targeting, because it shows that few will opt in. The parties involved in the study should be commended for including it, and not burying it. We do note, however, that the finding was not given the headline status that it warranted.
The results presented in this note should trouble any industry colleagues who plan to tackle GDPR and the ePrivacy Regulation by seeking consent, as a means to process personal data largely in the same way as usual. It appears that consent may not be forthcoming.
Adtech must rapidly transition from using old-school personal data to safer non-personal data technologies. We can draw inspiration from the automobile industry, which is transitioning from high pollution combustion technology to electric in response to regulatory pressure.
Safer data and safer advertising can enable programmatic buying, and sophisticated targeting, without requiring consent. The market parallel with the auto industry should be heartening for publishers, since if all petrol and diesel engines were to be outlawed in 2018 without some special dispensation, then demand for – and prices of – hybrid and electric vehicles would rise.

Read next:

Implications for Google and Facebook

[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] Rapporteur’s draft report on the proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC, June 2017.
[2] ibid., Recital 23. 
[3] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.
See Recital 42’s reference to “without detriment”, and Recital 43’s discussion of “freely given” consent, and Article 7(2) prohibition of conditionality. See also the UK Information Commissioner’s Office’s draft guidance on consent, 31 March 2017, p. 21, which explicitly prohibits “tracking walls”.
[4] “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7. (URL: https://www.iabeurope.eu/wp-content/uploads/2017/09/EuropeOnline_FINAL.pdf).

Why the GDPR ‘legitimate interest’ provision will not save you

The “legitimate interest” provision in the GDPR will not save behavioral advertising and data brokers from the challenge of obtaining consent for personally identifiable data.

As previous PageFair analysis illustrates, personal data will become toxic except where it has been obtained and used with consent once the General Data Protection Regulation is applied in May 2018.
[prompt type=”left” title=”Access the GDPR/ePR repository” message=”A repository of GDPR and ePrivacy Regulation explainers, official docs, and current status.” button_text=”Access Now” href=”https://pagefair.com/datapolicydocs/”]
Even so, many advertising intermediaries believe that they can continue to use personal data without consent because of an apparent carve-out related to “legitimate interest” contained in the GDPR. This is a false hope.

Legitimate interest

The GDPR does indeed provide for “legitimate interest” as a legal basis for using personal data without obtaining consent.[1] A legitimate interest provision was also included in the previous Data Protection Directive 95/46/EC.[2] However, the GDPR now includes an explicit mention of direct marketing as a legitimate interest (in Recital 47),[3] which has lured many adtech businesses into the comfortable but erroneous supposition that they will not have to ask people for permission to use their personal data.

A legitimate interest is a clearly articulated benefit to a single company, or to society as a whole,[4] that can be derived from processing personal data in a lawful way.[5] However, the Article 29 Working Party of data protection authorities of EU countries has already made it clear that merely having a legitimate interest does not entitle one to use personal data.[6]

The objective of the “legitimate interest” provision is to give controllers “necessary flexibility for data controllers for situations where there is no undue impact on data subjects”.[7] The Article 29 Working Party cautioned that it is not to be used “on the basis that it is less constraining than the other grounds”.[8] In other words, it is not a get-out-of-jail-free card.

Under the Data Protection Directive that preceded the GDPR some EU countries viewed it as “an ‘open door’ to legitimize any data processing which does not fit in one of the other legal grounds.”[9] This will end with the GDPR, which harmonizes the approach across all the countries of the European Union.

The balancing test 

Article 6(1)(f) of the GDPR includes the following important caveat: “except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject”.[10] In other words, a business that intends to use personal data must balance its legitimate interest not only against the rights of the data subject, which is a significant test in itself,[11] but also against the data subject’s interests, irrespective of whether those interests are legitimate or not.[12] Any company that hopes to rely on legitimate interest also bears the onus of demonstrating that its interest is favored in such a balancing test.[13]

This is not a figurative exercise. The Article 29 Working Party cautions that the balancing test should be documented in such a way that data subjects, data authorities, and the courts can examine it.[14] It should encompass a broad range of factors,[15] including “any possible (potential or actual) consequences of data processing”.[16] This would include, for example, “broader emotional impacts” and the “chilling effect on … freedom of research or free speech, that may result from continuous monitoring/tracking”.[17]

The test also must consider the manner in which personal data are processed. For example,

“whether large amounts of personal data are processed or combined with other data (e.g. in the case of profiling…). Seemingly innocuous data, when processed on a large scale and combined with other data may lead to inferences about more sensitive data”.[18] 

Europe’s data protection authorities take a dim view of such large scale processing: ­­­­

“Such analysis may lead to uncanny, unexpected, and sometimes also inaccurate predictions, for example, concerning the behavior or personality of the individuals concerned. Depending on the nature and impact of these predictions, this may be highly intrusive to the individual’s privacy”.[19] 

A further factor in the balancing test is mentioned in Recital 47 of the GDPR: “…taking into consideration the reasonable expectation of data subjects based on their relationship to the controller”.[20] A business involved in digital advertising must ask the following question: Is it reasonable to assume that a regular person who peruses the web expects that their behavior is being tracked and measured, consolidated across devices, and that the results of these operations are being traded between different companies that he or she has never heard of, and retained for further trading and consolidation over considerable periods of time?

Behavioral advertising and data-brokering must be based on consent 

The legitimate interest provision in the GDPR sets a high bar. Indeed, the Working Party’s concern about the negative impacts of personal data misuse is so broad as to encompass those that result from many cumulative actions, and where “it may be difficult to identify which processing activity by which controller played a key role”.[21] This is bad news for the cascade of cookie syncing and data trading typical of behavioral advertising.

The Article 29 Working Party has considered what a balancing test would yield where behavioral advertising is concerned. It concluded that “consent should be required, for example, for tracking and profiling for purposes of … behavioral advertising, data-brokering, … [and] tracking-based digital market research”.[22]

The Working Party regards the balance as follows: “the economic interest of business organizations to get to know their customers by tracking and monitoring their activities online and offline” must be balanced “against the (fundamental) rights to privacy and the protection of personal data of these individuals and their interest not to be unduly monitored”.[23]

Consent – and nothing short of it – is the necessary legal basis for processing personally identifiable for behavioral advertising.

Two options  

Therefore, hundreds of adtech companies, who who cannot legitimately obtain the personal data they depend on, are facing a huge challenge. There are two categories of options.

Option 1. Invest heavily in obtaining consent

For the majority of advertising intermediaries this will require reaching an accommodation with publishers who have direct and trusted relationships with end-users. Whatever this accommodation is, it is likely to tip the balance of power away from adtech and back in favor of publishers. Publishers may recover some of the marketing spend that they lost to the many advertising technology companies of the Lumascape in the shift to digital. As we have suggested previously, mergers with, or acquisition of, media properties may be one way for global advertising holding companies to buy trusted first party relationships with end-users, and establishing a means of requesting end-users consent.

Option 2. Avoid the GDPR’s liabilities and regulatory overhead with a no personally identifiable data approach

Programmatic and behavioral advertising are possible without personally identifiable data. A personal data firewall can free brands and intermediaries from the GDPR’s new liabilities and regulatory overhead by anonymizing data while delivering relevant advertising.

We will be writing more about this.

 

Invitation:

RightsCon, Brussels, March 29, 5.15pm – 6.15pm

I will be on the EDRi panel at RightsCon, alongside representatives of the European Data Protection Supervisor and the IAB. Please come and say hello.

[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Article 6, paragraph 1, f.

[2] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Article 7 (f).

[3] “The legitimate interests of a controller, including those of a controller to which the personal data may be disclosed, or of a third party, may provide a legal basis for processing, provided that the interests or the fundamental rights and freedoms of the data subject are not overriding, taking into consideration the reasonable expectations of data subjects based on their relationship with the controller. Such legitimate interest could exist for example where there is a relevant and appropriate relationship between the data subject and the controller in situations such as where the data subject is a client or in the service of the controller. At any rate the existence of a legitimate interest would need careful assessment including whether a data subject can reasonably expect at the time and in the context of the collection of the personal data that processing for that purpose may take place. The interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing. Given that it is for the legislator to provide by law for the legal basis for public authorities to process personal data, that legal basis should not apply to the processing by public authorities in the performance of their tasks. The processing of personal data strictly necessary for the purposes of preventing fraud also constitutes a legitimate interest of the data controller concerned. The processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest.” Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Recital 47.

[4] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, p. 10.

[5] ibid., pp 10-11.

[6] ibid., p. 25.

[7] ibid., p. 10.

[8] ibid,, p. 3.

[9] ibid., p. 5.

[10] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Article 6, para 1 (f).

[11] Data protection is a fundamental right in European Law. Article 8 of The European Charter of Fundamental Rights enshrines the right of every citizen to “the protection of personal data concerning him or her”. The European Union Charter of Fundamental Rights, Article 8, paragraph 1. “Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law”. The European Union Charter of Fundamental Rights, Article 8, paragraph 2.

[12] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, p. 9, 30.

[13] ibid., p. 52.

[14] ibid., p. 43, 53-54.

[15] ibid., pp 33, 50-51, 55-56.

[16] ibid., p. 37.

[17] ibid., p. 37.

[18] ibid., p. 39.

[19] ibid., p. 39.

[20] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Recital 47.

[21] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, p. 37.

[22] ibid., p. 46.

[23] ibid.

Why the GDPR 'legitimate interest' provision will not save you

The “legitimate interest” provision in the GDPR will not save behavioral advertising and data brokers from the challenge of obtaining consent for personally identifiable data.
As previous PageFair analysis illustrates, once the General Data Protection Regulation applies in May 2018, personal data will become toxic except where it has been obtained and used with consent.
[prompt type=”left” title=”Access the GDPR/ePR repository” message=”A repository of GDPR and ePrivacy Regulation explainers, official docs, and current status.” button_text=”Access Now” href=”https://pagefair.com/datapolicydocs/”]
Even so, many advertising intermediaries believe that they can continue to use personal data without consent because of an apparent carve-out related to “legitimate interest” contained in the GDPR. This is a false hope.

Legitimate interest

The GDPR does indeed provide for “legitimate interest” as a legal basis for using personal data without obtaining consent.[1] A legitimate interest provision was also included in the previous Data Protection Directive 95/46/EC.[2] However, the GDPR now includes an explicit mention of direct marketing as a legitimate interest (in Recital 47),[3] which has lured many adtech businesses into the comfortable but erroneous supposition that they will not have to ask people for permission to use their personal data.
A legitimate interest is a clearly articulated benefit to a single company, or to society as a whole,[4] that can be derived from processing personal data in a lawful way.[5] However, the Article 29 Working Party of data protection authorities of EU countries has already made it clear that merely having a legitimate interest does not entitle one to use personal data.[6]
The objective of the “legitimate interest” provision is to provide “necessary flexibility for data controllers for situations where there is no undue impact on data subjects”.[7] The Article 29 Working Party cautioned that it is not to be used “on the basis that it is less constraining than the other grounds”.[8] In other words, it is not a get-out-of-jail-free card.
Under the Data Protection Directive that preceded the GDPR some EU countries viewed it as “an ‘open door’ to legitimize any data processing which does not fit in one of the other legal grounds.”[9] This will end with the GDPR, which harmonizes the approach across all the countries of the European Union.

The balancing test

Article 6(1)(f) of the GDPR includes the following important caveat: “except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject”.[10] In other words, a business that intends to use personal data must balance its legitimate interest not only against the rights of the data subject, which is a significant test in itself,[11] but also against the data subject’s interests, irrespective of whether those interests are legitimate or not.[12] Any company that hopes to rely on legitimate interest also bears the onus of demonstrating that its interest is favored in such a balancing test.[13]
This is not a figurative exercise. The Article 29 Working Party cautions that the balancing test should be documented in such a way that data subjects, data protection authorities, and the courts can examine it.[14] It should encompass a broad range of factors[15] including “any possible (potential or actual) consequences of data processing”.[16] This would include, for example, “broader emotional impacts” and the “chilling effect on … freedom of research or free speech, that may result from continuous monitoring/tracking”.[17]
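As a concrete illustration of what such documentation might look like, the sketch below models a balancing-test record that a controller could keep for auditors, data subjects, and courts to examine. All field names here are our own hypothetical choices; neither the GDPR nor the Working Party prescribes a format, only that the assessment be documented.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BalancingTestRecord:
    """Illustrative record of a legitimate-interest balancing test.

    The field names are hypothetical: the GDPR does not prescribe a
    format, only that the assessment be documented and examinable.
    """
    controller: str
    purpose: str                       # the claimed legitimate interest
    data_categories: List[str]         # which personal data are processed
    potential_consequences: List[str]  # actual or potential impacts on data subjects
    subject_expectations: str          # what a data subject would reasonably expect
    safeguards: List[str] = field(default_factory=list)
    interest_prevails: bool = False    # outcome of the balance; the onus is on the controller

# A worked (invented) example of one documented assessment:
record = BalancingTestRecord(
    controller="Example Media Ltd",
    purpose="Frequency capping of advertising",
    data_categories=["pseudonymous ad identifier"],
    potential_consequences=["profiling across sites", "chilling effect of monitoring"],
    subject_expectations="Visitors do not reasonably expect cross-site tracking",
)
```

The default of `interest_prevails=False` mirrors the allocation of the burden of proof: the controller must affirmatively demonstrate that its interest wins the balance.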
The test also must consider the manner in which personal data are processed. For example,

“whether large amounts of personal data are processed or combined with other data (e.g. in the case of profiling…). Seemingly innocuous data, when processed on a large scale and combined with other data may lead to inferences about more sensitive data”.[18] 

Europe’s data protection authorities take a dim view of such large-scale processing:

“Such analysis may lead to uncanny, unexpected, and sometimes also inaccurate predictions, for example, concerning the behavior or personality of the individuals concerned. Depending on the nature and impact of these predictions, this may be highly intrusive to the individual’s privacy”.[19] 

A further factor in the balancing test is mentioned in Recital 47 of the GDPR: “…taking into consideration the reasonable expectation of data subjects based on their relationship to the controller”.[20] A business involved in digital advertising must ask the following question: Is it reasonable to assume that a regular person who peruses the web expects that their behavior is being tracked and measured, consolidated across devices, and that the results of these operations are being traded between different companies that he or she has never heard of, and retained for further trading and consolidation over considerable periods of time?

Behavioral advertising and data-brokering must be based on consent

The legitimate interest provision in the GDPR sets a high bar. Indeed, the Working Party’s concern about the negative impacts of personal data misuse is so broad as to encompass those that result from many cumulative actions, and where “it may be difficult to identify which processing activity by which controller played a key role”.[21] This is bad news for the cascade of cookie syncing and data trading typical of behavioral advertising.
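To see why responsibility is so hard to trace, consider a simplified model of cookie syncing: one intermediary redirects the browser to a partner, passing its own user ID in the URL so the partner can link the two identifiers. The endpoint, parameter name, and IDs below are invented purely for illustration.

```python
from urllib.parse import urlencode

# Hypothetical match table kept by the receiving adtech partner.
partner_match_table = {}

def sync_redirect(own_user_id: str, partner_endpoint: str) -> str:
    """Build the kind of redirect URL used in cookie syncing: company A
    sends its browser ID to company B in a query parameter."""
    return f"{partner_endpoint}?{urlencode({'partner_uid': own_user_id})}"

def partner_receives(partner_cookie_id: str, url: str) -> None:
    """Company B links its own cookie ID for this browser to company A's ID."""
    uid = url.split("partner_uid=")[1]
    partner_match_table[partner_cookie_id] = uid

url = sync_redirect("A-12345", "https://match.example-dsp.com/sync")
partner_receives("B-98765", url)
# Both companies can now trade data keyed to the same browser, and the
# user has no visibility of which of them "played a key role".
```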
The Article 29 Working Party has considered what a balancing test would yield where behavioral advertising is concerned. It concluded that “consent should be required, for example, for tracking and profiling for purposes of … behavioral advertising, data-brokering, … [and] tracking-based digital market research”.[22]
The Working Party regards the balance as follows: “the economic interest of business organizations to get to know their customers by tracking and monitoring their activities online and offline” must be balanced “against the (fundamental) rights to privacy and the protection of personal data of these individuals and their interest not to be unduly monitored”.[23]
Consent – and nothing short of it – is the necessary legal basis for processing personally identifiable data for behavioral advertising.

Two options

Therefore, hundreds of adtech companies, who cannot legitimately obtain the personal data they depend on, are facing a huge challenge. There are two categories of options.

Option 1. Invest heavily in obtaining consent

For the majority of advertising intermediaries this will require reaching an accommodation with publishers who have direct and trusted relationships with end-users. Whatever this accommodation is, it is likely to tip the balance of power away from adtech and back in favor of publishers. Publishers may recover some of the marketing spend that they lost to the many advertising technology companies of the Lumascape in the shift to digital. As we have suggested previously, mergers with, or acquisition of, media properties may be one way for global advertising holding companies to buy trusted first-party relationships with end-users, and to establish a means of requesting end-users’ consent.

Option 2. Avoid the GDPR’s liabilities and regulatory overhead by using no personally identifiable data

Programmatic and behavioral advertising are possible without personally identifiable data. A personal data firewall can free brands and intermediaries from the GDPR’s new liabilities and regulatory overhead by anonymizing data while delivering relevant advertising.
We will be writing more about this.
 
Invitation:

RightsCon, Brussels, March 29, 5.15pm – 6.15pm

I will be on the EDRi panel at RightsCon, alongside representatives of the European Data Protection Supervisor and the IAB. Please come and say hello.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Article 6, paragraph 1, f.
[2] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Article 7 (f).
[3] “The legitimate interests of a controller, including those of a controller to which the personal data may be disclosed, or of a third party, may provide a legal basis for processing, provided that the interests or the fundamental rights and freedoms of the data subject are not overriding, taking into consideration the reasonable expectations of data subjects based on their relationship with the controller. Such legitimate interest could exist for example where there is a relevant and appropriate relationship between the data subject and the controller in situations such as where the data subject is a client or in the service of the controller. At any rate the existence of a legitimate interest would need careful assessment including whether a data subject can reasonably expect at the time and in the context of the collection of the personal data that processing for that purpose may take place. The interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing. Given that it is for the legislator to provide by law for the legal basis for public authorities to process personal data, that legal basis should not apply to the processing by public authorities in the performance of their tasks. The processing of personal data strictly necessary for the purposes of preventing fraud also constitutes a legitimate interest of the data controller concerned. The processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest.” Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Recital 47.
[4] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, p. 10.
[5] ibid., pp 10-11.
[6] ibid., p. 25.
[7] ibid., p. 10.
[8] ibid., p. 3.
[9] ibid., p. 5.
[10] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Article 6, para 1 (f).
[11] Data protection is a fundamental right in European Law. Article 8 of The European Charter of Fundamental Rights enshrines the right of every citizen to “the protection of personal data concerning him or her”. The European Union Charter of Fundamental Rights, Article 8, paragraph 1. “Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law”. The European Union Charter of Fundamental Rights, Article 8, paragraph 2.
[12] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, pp. 9, 30.
[13] ibid., p. 52.
[14] ibid., pp. 43, 53-54.
[15] ibid., pp 33, 50-51, 55-56.
[16] ibid., p. 37.
[17] ibid., p. 37.
[18] ibid., p. 39.
[19] ibid., p. 39.
[20] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Recital 47.
[21] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, p. 37.
[22] ibid., p. 46.
[23] ibid.

European Commission proposal will kill 3rd party cookies

The 3rd-party cookie – the lifeblood of online advertising – may be about to die. 
A proposal this month from the European Commission to reform the ePrivacy Directive (ePD) requires mandatory privacy options and educates users to distinguish between 1st and 3rd-parties in a way that will make 3rd-party cookies extinct.
[prompt type=”left” title=”Access the GDPR/ePR repository” message=”A repository of GDPR and ePrivacy Regulation explainers, official docs, and current status.” button_text=”Access Now” href=”https://pagefair.com/datapolicydocs/”]
The Commission’s proposal also applies beyond cookies. The proposed reform of the ePD will further add to the disruption that Europe’s new regulatory regime for privacy – the GDPR – will wreak upon the media and advertising landscape when it applies in May 2018.
Caveat: the proposal is subject to negotiation between the Commission, the European Parliament, and the Council of Ministers. Its text may change before it becomes a regulation across the European Union.

Mandatory and binding privacy settings 

Web browsers (and similar software) will be required to prompt users with a menu of privacy options when they are installed for the first time.[1] The menu options will range from an extreme ban on all cookies to acceptance of all cookies, and will include intermediate options such as “reject third-party cookies” or “only accept first-party cookies”.[2] This is mandatory – a user must select one option from the menu in order to continue with the installation.[3] And unlike previous initiatives such as the Do Not Track standard, the Commission says that the user’s privacy menu choice will be “binding on, and enforceable against, any third-parties”.[4] This applies not only to newly installed software, but also to browsers already in operation before the new rules are introduced (these must be updated to comply no later than 25 August 2018).[5]
In other words, at a point in 2018 there will be no browser installed in Europe that does not have legally binding privacy settings that have been selected by a user.
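A rough model of what a compliant browser would have to enforce is sketched below: classify each cookie as first- or third-party relative to the page being visited, then apply the user's stored choice. The setting names are illustrative labels based on the options the proposal describes, and the two-label domain comparison is a deliberate simplification (real browsers consult the Public Suffix List).

```python
def registrable_domain(host: str) -> str:
    """Naive eTLD+1 approximation: the last two labels of the hostname.
    Real browsers use the Public Suffix List instead of this shortcut."""
    return ".".join(host.split(".")[-2:])

def cookie_allowed(setting: str, page_host: str, cookie_host: str) -> bool:
    """Apply a user's privacy-menu choice to a single cookie.

    `setting` is one of the illustrative options:
    'reject-all', 'reject-third-party', 'accept-all'.
    """
    if setting == "reject-all":
        return False
    if setting == "accept-all":
        return True
    # 'reject-third-party': allow only cookies set by the site itself.
    return registrable_domain(page_host) == registrable_domain(cookie_host)

# A tracker's cookie on a news site is blocked under 'reject-third-party',
# while the site's own subdomain is allowed:
print(cookie_allowed("reject-third-party", "news.example.com", "tracker.adnetwork.com"))  # False
print(cookie_allowed("reject-third-party", "news.example.com", "static.example.com"))     # True
```

The key point the proposal adds is not the mechanism, which browsers can already implement, but that the selected setting becomes legally binding on third parties.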

Users will distinguish between 1st and 3rd-parties

The Commission had previously considered forcing web browser vendors to reject all 3rd-party cookies by default, and giving users the ability to opt in to 3rd-party cookies if they wished.[6] The approach adopted in its final proposal appears less severe than this privacy-by-default approach, but will probably have the same consequence. The mandatory menu will educate users in a way that will cause a widespread rejection of 3rd-party cookies.
There are several measures in the proposal that will cause users to distinguish between 1st and 3rd-party tracking.
First, web browsers will be required to present the privacy settings options to users in a manner that educates them about “the compilation of long-term records of individuals browsing histories and the use of such records to send targeted advertising”.[7] Few users could be expected to willingly opt in to this. The inclination to say no will be compounded by users’ dawning awareness of the data collected about them, the uses to which these data are put, and the extent to which these data are breached, that will result from transparency requirements in the General Data Protection Regulation.[8]
Second, the Commission Proposal requires web browsers to present the higher privacy settings in a manner that does not dissuade users from selecting them.[9] 
Third, users who select lower privacy settings at first installation of a web browser will have controls that allow them to apply privacy controls on specific websites if they wish,[10] so there could be a gradual fall-off over time.
Finally, the proposal requires that users who have consented to their data being processed are reminded every six months that they can withdraw their consent at any time.[11]

Beyond cookies 

The proposal encompasses tracking measures beyond cookies. The Commission regards people’s devices, and data flows to and from those devices, as part of the private sphere.[12] As a result, users’ prior consent is required for tracking cookies, hidden identifiers, and “other similar unwanted tracking tools [that] can enter end-user’s terminal without their knowledge in order to gain access to information, to store hidden information and to trace the activities of the user or to instigate certain technical operations or tasks…”.[13] Remote collection of data in order to identify and track users, such as device fingerprinting, is explicitly included among the proposal’s prohibitions.[14]
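Device fingerprinting illustrates why the proposal reaches beyond cookies: a stable identifier can be derived from attributes a device reveals anyway, with nothing stored on the terminal. The sketch below is a minimal illustration; the attributes shown are typical examples only, and the signals actually used vary by vendor.

```python
import hashlib

def device_fingerprint(user_agent: str, screen: str, timezone: str, fonts: list) -> str:
    """Hash observable device attributes into a stable identifier.

    Nothing is written to the device, which is why the proposal addresses
    this technique explicitly rather than regulating only cookies."""
    raw = "|".join([user_agent, screen, timezone, *sorted(fonts)])
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

# The same device attributes yield the same identifier on every site,
# regardless of the order in which fonts are enumerated:
fp1 = device_fingerprint("Mozilla/5.0 ...", "1920x1080", "Europe/Brussels", ["Arial", "Georgia"])
fp2 = device_fingerprint("Mozilla/5.0 ...", "1920x1080", "Europe/Brussels", ["Georgia", "Arial"])
assert fp1 == fp2
```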

Where and how this applies: litigation and penalties

Once negotiated between the European Commission, the European Parliament, and the European Council, the overhauled ePrivacy rules will, as a regulation, apply directly in every EU Member State without the need for national transposition.
The EU is the world’s largest single market. The proposal’s territorial scope covers communications within the EU and services provided to end-users in the EU, irrespective of where in the world the processing occurs.[15] It will impact much of the martech and adtech sector because businesses outside the EU that provide services to users in the EU will have to have a representative in the EU[16] who will be addressable by supervisory authorities.[17] However, beyond the EU the ePD may have less impact than the GDPR.
The range of fines for infringements is similar to the GDPR.[18] For example, failure to comply with an order from a supervisory authority will be subject to a fine of up to €20 million or 4% of global annual turnover, whichever is higher.[19]
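The “€20 million or 4% of global turnover, whichever is higher” construction means the cap scales with company size: for any business with annual turnover above €500 million, the percentage limb is the binding one. A one-line sketch of the arithmetic:

```python
def max_fine(global_turnover_eur: float) -> float:
    """Upper fine limit under the top tier: the higher of a flat
    EUR 20 million and 4% of total global annual turnover."""
    return max(20_000_000, 0.04 * global_turnover_eur)

print(max_fine(100_000_000))    # 20000000 (the flat EUR 20m floor applies)
print(max_fine(2_000_000_000))  # 80000000.0 (4% of EUR 2bn dominates)
```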
There is also reason to anticipate litigation. As with the GDPR, end-users can both complain to a regulator and seek redress in court against an infringement,[20] and users have a right to receive compensation for damage.[21] Users can also take the regulator to court if unhappy with its action or inaction. Users can also mandate privacy organizations to lodge a complaint and to seek redress in court on their behalf (and Member States may decide that such bodies can do so without a user’s mandate). Any other party that is adversely affected by infringements can bring legal proceedings.[22]
In short, the proposed ePD has teeth.

Read next:
Europe’s new privacy regime will disrupt the adtech Lumascape


Timeline: what happens next?

  • Negotiation between Commission, Parliament and Council for an unknown duration. The Commission proposes that negotiations will be rapid, and that the ePD will apply on the same date as the GDPR on 25 May 2018.
  • Browsers installed prior to 25 May 2018 will have to require users to choose a privacy setting by 25 August 2018.
  • By 1 January 2018 the Commission will establish a monitoring programme to review the effectiveness of the ePD.
  • Three years after the application of the Regulation the Commission will evaluate its effectiveness.

[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

NOTES

[1] Proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), COM/2017/010 final – 2017/03 (COD), Recital 23 and 24 and Article 10 paragraphs 1 and 2.
[2] ibid., Recital 23.
[3] ibid., Article 10 paragraph 2.
[4] ibid., Recital 22.
[5] ibid., Article 10 paragraph 3.
[6] See Article 10 paragraphs 1 and 2 of a leaked draft of the proposal, available from Politico’s website. The leaked draft included a requirement that “all components of terminal equipment and all software that permits electronic communications on the market will be configured to refuse third-parties from storing or processing data on the terminal equipment of the end-user by default, and will prevent third-parties from using the equipment’s processing capabilities”
[7] Proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), COM/2017/010 final – 2017/03 (COD), Recital 24.
[8] See related PageFair Insider post “Europe’s new privacy regime will disrupt the adtech Lumascape”, see also Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Recitals 39, 58, 60-63, and Article 13 paras. 1-2, and Article 13 and Article 14
[9] Proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), COM/2017/010 final – 2017/03 (COD), Recital 24.
[10]  ibid., Recital 24.
[11] ibid., Article 9 paragraph 3.
[12] ibid., Recital 20.
[13] ibid., Recital 20.
[14] ibid., Recital 20.
[15] ibid., Recital 9 and Article 3.
[16] ibid., Article 3 paragraph 2 and 3.
[17] ibid., Article 3 paragraph 4.
[18] Member States have to determine penalties for infringements of Articles 12, 13, 14 and 17 (see ibid., Article 23 paragraph 4 and Article 24). Fines of €10 million or 2% of total global annual turnover (see ibid., Article 23 paragraph 2), whichever is higher, apply to infringements of Article 8, Article 10, Article 15, and Article 16. Fines of €20 million or 4% of total global annual turnover apply to infringements of the principle of confidentiality of communications, permitted processing of electronic communications data, and time limits for erasure (see ibid., Article 23 paragraph 3).
[19] ibid., Article 23 paragraph 4.
[20] ibid., Article 21 paragraph 1.
[21] ibid., Article 22.
[22] ibid., Article 21 paragraph 2.

Europe's new privacy regime will disrupt the adtech Lumascape

In a year and a half new European rules on the use of personal information will disrupt advertising and media across the globe. Here are the three biggest impacts. 
Since 1996 when cookies were first repurposed to track users around the Web there has been an assumption that gathering and trading users’ personal information is the essence of advertising online. This is about to change.
[prompt type=”left” title=”Access the GDPR/ePR repository” message=”A repository of GDPR and ePrivacy Regulation explainers, official docs, and current status.” button_text=”Access Now” href=”https://pagefair.com/datapolicydocs/”]
The General Data Protection Regulation (GDPR) is the most significant update to privacy regulation in two decades.[1] Companies across the globe will have to comply with the GDPR if they want to serve any of the EU’s 500 million people, or handle data for any European companies.[2] European regulators will have the power to fine up to 4% of a company’s global annual turnover. Among the many industries affected, it is media and advertising that most directly affect people’s privacy.
The GDPR’s application sixteen months from now (25 May 2018) is likely to lower the valuations of adtech and martech companies, change user behavior, and prompt a consolidation in media and advertising that favors publishers who have trusted relationships with users.
This note describes the three biggest impacts that the online advertising and media industry should prepare for.

1. Bad news for third-party tracking. 

The Regulation establishes a chain of responsibility for data and a new approach to consent that will disrupt the adtech complex, known in the industry as the “Lumascape”. Lumascapes are maps of the companies that form the bewilderingly complex digital media industry, regularly published by Luma Partners, a specialist investment bank.
Under the new rules it will be illegal for companies anywhere in the world to pass a European user’s personal information to another company, or to store these data, without agreeing a formal contract with the “data controller” (normally this is the company that requested the data from the user in the first place) that defines limits on how the data can be used.[3]
A company that uses personal information beyond these limits will have to obtain consent from users to do so, or in the specific case of direct marketing will have to inform users about what it does with the data, and of the fact that the user can object at any time to their data being used in this way.[4]  Users must be informed “clearly and separately from any other information”.[5]
This will be difficult – perhaps impossible – for most Lumascape adtech and martech companies to comply with because they do not have direct relationships with users. While it is conceivable that regulators may regard this as a permissible reason not to inform users, we think it unlikely.[6]
Instead, the third parties’ lack of relationships with users will make the direct relationships that publishers enjoy with users enormously valuable. This may prompt mergers and acquisitions between the media and adtech industries. Facebook is already vertically integrated in this manner. It has both a direct relationship with its users and the infrastructure to target and deliver ads. Unlike Lumascape companies, it does not need to pass personal information to third parties in order to make money.
It is also likely that publishers and service providers will become extremely cautious about permitting tracking pixels and third-party JavaScript on their webpages, because they could be liable for any infringements that result. This is also likely to end the current practice of introducing unexpected parties to the chain of data sharing. As a result, the number of trackers on sites, cookie syncing, pixel dropping, fingerprinting, and so on is likely to decrease.
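One blunt instrument available to a publisher that wants to lock out unexpected third parties is a Content-Security-Policy header that whitelists the origins allowed to serve scripts and images. The sketch below is a minimal illustration only, not a complete or recommended policy; the domain names are placeholders we invented.

```python
# Sketch: build a Content-Security-Policy header value that only permits
# scripts and images from an explicit whitelist of origins.
# The domains below are illustrative placeholders, not recommendations.

def build_csp(script_origins, img_origins):
    """Return a CSP header value restricting script and image sources."""
    directives = [
        "default-src 'self'",
        "script-src 'self' " + " ".join(script_origins),
        # Tracking pixels are loaded as images, so img-src matters too.
        "img-src 'self' " + " ".join(img_origins),
    ]
    return "; ".join(directives)

header = build_csp(
    script_origins=["https://cdn.example-publisher.com"],
    img_origins=["https://media.example-publisher.com"],
)
# A server would then send this as: Content-Security-Policy: <header>
```

Any third-party script or pixel from an origin not in the whitelist would simply fail to load, which is one way a publisher could limit its exposure to infringements by parties it never knowingly invited.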

2. Lawsuits & fines. 

By changing the rules governing who can use personal information, and how they can use it, the GDPR sets the stage for a wave of lawsuits against adtech, martech, and publishing companies.
Misbehavior will be discoverable – to an extent – because users will have the right to trace data back to its source. For example, a person who receives a marketing communication from a company is now entitled to find out where the sender obtained the data about them, and may then take legal action or complain to a regulator.[7]
Such cases may be significant because multiple companies “involved in the same processing” of a user’s personal data can each be held liable for the entire damages awarded in a case.[8] The Regulation allows non-profit privacy groups to take legal action on behalf of many users, which raises the prospect that many such cases will be taken.[9] According to TJ McIntyre of Digital Rights Ireland, “The fact that representative bodies can act on behalf of individuals will, practically speaking, be very important where actions require either specialist knowledge or deep pockets”.
The GDPR also gives regulators in each European country powers to impose severe sanctions, and each European country has the option to also impose additional clawbacks on profits obtained through infringements of the Regulation.[10] Regulators will be under pressure to act decisively against companies that infringe the GDPR because the Regulation gives consumers the ability to take regulators to court for not properly responding to complaints.[11]

3. User behavior will change. 

The average user is unaware of how parties across the Lumascape handle their personal information. This is likely to change as a result of two measures contained in the GDPR. First, the Regulation requires in most (perhaps all) cases[12] that an exhaustive level of detail be provided to users on how their personal information is used by every party that wants to use it, and envisages the establishment of iconography to concisely communicate data use, risks, and rights in plain language.[13] Second, the Regulation enshrines the right to access all personal information held by any company about a user.
The box below outlines the new details required in user notices, which go far beyond current practice.
What a user must be informed about under GDPR:[14]

  • Who is collecting the data, and how to contact them or their European representative.
  • What the personal information is being used for, and the legal basis of the data processing.
  • The “legitimate interest” claimed by the user of the data (this refers to a legal basis that may be relied on by direct marketing companies).
  • With whom the data will be shared.
  • Whether the controller intends to transfer data to a third country, and if so, whether the European Commission has deemed that country’s protections adequate, or what alternative safeguards or rules are in place.
  • The duration of storage, or the criteria used to determine duration.
  • That the user has the right to request rectification of mistakes in this personal information.
  • That the user has the right to withdraw consent.
  • How the user can lodge a complaint with the supervisory authority.
  • What the consequences of not giving consent might be.
  • In cases of automated decision-making, including profiling, what the logic of this process is, and what the significance of the outcomes may be.
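As a rough illustration of how a development team might audit its own user notice against the checklist above, here is a small sketch. The item names are our own shorthand labels for the bullet points, not GDPR terms of art, and the check is purely structural: it cannot judge whether the wording of a notice is legally adequate.

```python
# Sketch: check that a privacy-notice record covers the disclosure
# checklist above. The keys are illustrative labels we chose, not legal terms.

REQUIRED_ITEMS = {
    "controller_identity", "purposes_and_legal_basis", "legitimate_interest",
    "recipients", "third_country_transfers", "storage_duration",
    "right_to_rectification", "right_to_withdraw_consent",
    "complaint_procedure", "consequences_of_refusal", "automated_decision_logic",
}

def missing_disclosures(notice: dict) -> set:
    """Return the checklist items that are absent from, or empty in, the notice."""
    return {item for item in REQUIRED_ITEMS if not notice.get(item)}

notice = {"controller_identity": "Example Ltd, dpo@example.com"}
gaps = missing_disclosures(notice)  # every item except controller_identity
```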

In addition, the Regulation introduces a new focus on security that will further contribute to user fears. All parties that handle data are now required to protect personal information from misuse and leakage.[15] In addition, data controllers have to tell users when their personal data have been stolen in a data breach.[16] The practice of covering up data breaches will end, and users will learn how often their data are exposed.
Thus far most users have tended to favor convenience over privacy. This may change. The GDPR will confront users with the extent to which their behavior across the web is tracked, how these data are used, and how often they are stolen. The result will be a wave of paranoia about personal information. One can anticipate that users will react particularly negatively to businesses that hold their data but have no direct relationship with them.
If such a backlash does occur it will prompt users to exploit the new opportunities to opt out of tracking and other data disclosure that are also created by the Regulation. A suite of user rights are enshrined within GDPR that include access to personal information concerning them, rectification of those data, erasure or restriction of processing of those data, portability of those data from one to another service, and the right to object to automated decision making.[17] In addition, the Regulation requires that it must be as easy for a data subject to withdraw consent as it was to give it at any time.[18] The user will be able to significantly disrupt behavioral advertising as a result.

What this means  

A year and a half from now the General Data Protection Regulation will be applied across the EU, the world’s largest market. It will challenge the assumptions that drove the last twenty years of behavioral advertising. The trust that publishers earn from their users will become a precious asset.
Lumascape companies will suffer unless they adapt or integrate with publishers and platforms that have built trusted relationships with users.
Global CMOs will find it easier to apply common global standards that conform to the high bar set by Europe rather than carve out data concerning the world’s biggest market from all other territories. This is not the first time that global players have had to bow to European regulators.
Personal privacy is about to receive a long overdue upgrade.

Read next:
Why clickbait can’t last

[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]


Timeline: what happens next?

  • In December 2015 – nearly twenty years after the adoption of the Data Protection Directive – the European Commission, Parliament, and Council (of Ministers of Member State Governments) agreed its replacement: the General Data Protection Regulation.
  • The GDPR was ratified by the Parliament and Council in April 2016.
  • The Article 29 Working Party of data protection authorities, soon to become the European Data Protection Board, is to issue guidance on specifics to the industry.[19] By the end of 2016 it will issue guidance on the role of the Data Protection Officer, the new right of data portability and how to identify an organisation’s main establishment and lead supervisory authority. By February 2017 it will issue guidance on the concept of risk and how to carry out a data protection impact assessment.

    Update (16 December 2016): The Article 29 Working Party has released three guidance documents concerning GDPR
    Guidelines for identifying a controller or processor’s lead supervisory authority
    Guidelines on the right to data portability
    Guidelines on Data Protection Officers (‘DPOs’)

  • The Regulation will have direct applicability in every European Member State from 25 May 2018, without the need for transposition into national laws.
  • The Data Protection Directive will be repealed.
  • Data processing already underway at that point will have to be brought into conformity with the new Regulation within two years. Consent given under the e-Privacy Directive will be acceptable under the GDPR provided that the consent was given in line with the conditions of the new Regulation.[20]
  • The e-Privacy Directive will be refreshed.

Sign up to PageFair Insider to get updates

NOTES

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.
[2] ibid., Recitals 22, 23, 24, and 101-116, and Articles 3 and 27, paras. 1 and 3.
The Regulation requires that controllers or processors outside the EU who monitor or offer services to users in the EU must establish a representative in one EU member state who shall be addressed by supervisory authorities in relation to the Regulation.
[3] ibid., Article 28, paras. 2, 3 and 4, and Article 29.
Here is how this will operate. Current European rules require contracts between data controller and processor that guarantee that the processor handles the personal data only in the manner dictated by the controller. (see Data Protection Directive (95/46/EC) 1995 Article 17, para. 3. (URL: http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:31995L0046)) However, this is now backed up by new sanctions, and the GDPR will require that these contracts define the nature and duration of processing (Regulation (EU) 2016/679, Article 28, para. 3). Similar agreements must also be in place when one processor engages another (ibid., Article 28, para. 4), and a processor can only do so with express permission from the controller (ibid., Article 28, para. 2).
[4] See ibid., Article 6 (f) and Recital 47, regarding the particular focus on direct marketing. See Article 21 paras. 1 and 2 regarding the right to object at any time.
[5] ibid., Article 13 para. 1.
[6] Perhaps with reference to ibid., Article 15, para. 5, b.
[7] ibid., Article 15, para. 1 (g).
[8] ibid., Article 82, paras. 1, 3 – 4, Recital 146.
After judgement the processors or controllers who have paid full compensation can claim back part of the compensation from processors or controllers also responsible (ibid., Article 82, para. 5).
[9] ibid., Article 80, para. 1. Indeed, individual EU countries may also allow bodies to do this even without the involvement of individuals, according to ibid., Article 80, para. 2.
[10] ibid., Recital 149
[11] ibid., Article 78, para. 1 and 2, and Recital 143.
[12] There is a caveat. The European Data Protection Board may decide to issue guidance interpreting Article 15, para. 5, (b) as making this unnecessary for certain categories of marketing companies.
[13] ibid., Article 12, para. 7, and Article 70, para. 1, (r).
[14] ibid., Recitals 39, 58, 60-63, Article 13 paras. 1-2, and Article 14.
[15] ibid., Article 32.
[16] ibid., Article 33 and Article 34.
[17] ibid., Article 15 to Article 21.
[18] ibid., Article 7, para. 3, and Article 21, paras. 1 and 2.
[19] ibid., Article 70.
[20] ibid., Recital 171.

Reprieve for IT departments as EU court rules on IP addresses

If you run a website, you might want to breathe a sigh of relief. A decision[1. The text of the ruling is available in a range of European languages (excluding English as of the time of writing) at http://curia.europa.eu/juris/document/document.jsf?text=&docid=184668] this morning from the European Court of Justice means that websites can continue to store visitor IP addresses.
The EU Court of Justice (ECJ) ruled that IP addresses are to be considered “personal data”, which are subject to the EU’s data protection rules, but hedged against causing disruption by watering down the ruling.
From the ECJ press release:

The dynamic internet protocol address of a visitor constitutes personal data, with respect to the operator of the website, if that operator has the legal means allowing it to identify the visitor concerned with additional information about him which is held by the internet access provider.

It would have been a shock to many if the ruling had gone the other way.[1. It could have been different, if the additional clause referring to legal means had not been included, as personal data is subject to stringent protection in the EU. Interestingly, the ruling slightly diverges from an opinion of an ECJ Advocate General delivered in May. Cases before the ECJ are considered in advance by an Advocate General, who publishes a (non-binding) opinion with which the Court often agrees. In May, AG Campos Sánchez-Bordona issued an opinion on this case that agreed that dynamic IP addresses constitute personal data, but also said that these data can be processed and stored without consent in cases where this is necessary to ensure a web service’s functionality.]

Why this matters

The immediate impact of a decision stopping the logging of IP addresses would have been disruption to many websites and services. IT departments everywhere would have thrown up their hands in despair at the task of expunging IP addresses from systems and databases that have relied on them.
Web services routinely keep a log of their users’ IP addresses. These logs are used for numerous largely mundane and innocuous purposes, such as to provide customized features to particular users, to prevent or enable access to content, or to blacklist IP addresses involved in “denial of service” attacks against a site.
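A toy version of the blacklisting use case mentioned above might look like the sketch below. It flags any IP address whose request count in a log window exceeds a threshold; the threshold value is arbitrary and chosen purely for illustration.

```python
from collections import Counter

# Toy denial-of-service blacklist: flag any IP address that exceeds a
# request threshold within the logged window. The threshold is arbitrary.

def blacklist_ips(request_log, threshold=1000):
    """Return the set of IPs appearing more than `threshold` times in the log."""
    counts = Counter(ip for ip, _path in request_log)
    return {ip for ip, n in counts.items() if n > threshold}

# Example: one address hammering the site, one ordinary visitor.
log = [("203.0.113.7", "/index.html")] * 1500 + [("198.51.100.2", "/about")]
flagged = blacklist_ips(log)  # only 203.0.113.7 crosses the threshold
```

Even this trivial defence depends on retaining visitor IP addresses, which is why a ruling against logging them would have been so disruptive.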
IP addresses are rather more valuable to other companies. For instance, some adtech companies use IP addresses to identify and target consumers. Netflix and other content providers rely on IP addresses to prevent the use of VPNs to access TV shows and movies that are blocked in the viewer’s country.[1. Geolocation can work at just the country level, making it unnecessary to track individual IP addresses, and there are ways for Netflix et al. to prevent VPN abuse, especially as business entities do not enjoy the same protection as individuals, but such workarounds would take time and money.]
While the ruling will probably pass by unnoticed, it is clear that websites have been granted a very real (although possibly temporary[1. The General Data Protection Regulation could change the rights landscape once it is applied in May 2018, as it includes stringent rules on how websites can handle personal data.]) reprieve, as the EU has been quick to act on ECJ rulings despite potentially devastating effects on companies both in Europe and elsewhere.
[share title=”Share this Post” facebook=”true” twitter=”true” google_plus=”true” linkedin=”true” pinterest=”true” reddit=”true” email=”true”]

Background to the ECJ’s decision

The ECJ was asked to rule on two issues:

  1. whether an IP address is personal data,[1. See http://curia.europa.eu/juris/document/document.jsf?docid=162555&doclang=EN&mode=req&occ=first (accessed October 11, 2016). Also, this is not the first time that the ECJ has concluded that IP addresses could be considered personal data. In Case C-70/10 Scarlet Extended SA v SABAM, a dispute between an ISP and a company “responsible for authorising the use by third parties of the musical works of authors, composers and editors”, the ECJ ruled that the ISP, Scarlet, could not be compelled to install a filtering system to detect and prevent the unlawful exchange of copyrighted works, as

    …the filtering system would also be liable to infringe the fundamental rights of its customers, namely their right to protection of their personal data and their right to receive or impart information, which are rights safeguarded by the Charter of Fundamental Rights of the EU. It is common ground, first, that the injunction would involve a systematic analysis of all content and the collection and identification of users’ IP addresses from which unlawful content on the network is sent. Those addresses are protected personal data.

    However, while it opened the door to the classifying of IP addresses as personal data and was referenced in the Breyer opinion, AG Campos Sánchez-Bordona noted that the SABAM case was “in a context in which the collection and identification of IP addresses was carried out by the Internet service provider”. Today’s judgement has further-reaching consequences: the ISPs in the SABAM case already knew who their customers were, whereas the Breyer case affects any and all websites.] and

  2. whether the practice of logging IP addresses without consent was legal.[1. Or, more precisely, in accordance with the relevant provision of the German Telemedia Act, which states that a website provider may collect and process the personal data of users without their consent only to the extent it is necessary to (1) enable the general functionality of the website or (2) arrange payment. In addition, the relevant provision of the Telemedia Act states that enabling the general functionality of the website does not permit user data to be processed after the user closes, or navigates away from, the website.]

This followed eight years of litigation in various German courts[1. From Amtsgericht to Landgericht to Bundesgerichtshof.], which originated in an action taken against the German government by Patrick Breyer, a member of Germany’s Pirate Party.[1. Case C-582/14 Patrick Breyer v Bundesrepublik Deutschland. EUR-Lex. http://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1476178592484&uri=CELEX:62014CN0582 (accessed October 11, 2016).] Breyer argued that government websites did not have an unrestricted right to indefinitely record the IP addresses of visitors without their consent.
Although IP addresses on their own are largely innocuous, Breyer gave two ways that government websites could combine IP addresses with other data to identify an individual.
First, internet service providers (ISPs) record customers’ real names and addresses, and assign their IP addresses. It is not inconceivable that a government could gain access to these records and connect a person’s real identity to their IP address.
Second, when combined with pages visited or search terms, IP addresses can provide an extensive profile of the visitor’s “political opinion, illnesses, religion, union affiliation” and more.[1. Translated from the original German suit brought by Breyer: http://www.daten-speicherung.de/wp-content/uploads/Surfprotokollierung_2008-01-03_Klageschrift_Kl_an_AG.pdf (accessed October 12, 2016).]

The ruling

Today’s ruling will probably allow the German Federal Court of Justice to rule against Breyer, as it effectively states that:

  1. a dynamic IP address constitutes personal data for a website operator only if it has the legal means enabling it to identify the visitor with the help of additional information from the ISP
  2. a website operator may collect and store personal data without consent for an indeterminate period so as to ensure the continued functioning of the website

[x_promo image=”https://pagefair.com/wp-content/uploads/2016/10/keyes-bigip-10pxstroke.png”]

CASE TIMELINE

JANUARY 3, 2008
Patrick Breyer asks Berlin local court to stop German government websites logging IP addresses
AUGUST 13, 2008
Local court dismisses case, arguing that an IP address is insufficient to identify an individual
JANUARY 31, 2013
Breyer appeals decision to Berlin district court, which orders German government to cease unrestricted logging of IP addresses
SEPTEMBER 16, 2014
German Federal Court of Justice addresses appeals from both parties
DECEMBER 17, 2014
German Federal Court of Justice refers two questions to European Court of Justice
MAY 12, 2016
Advocate General Sánchez-Bordona delivers his non-binding but influential opinion
OCTOBER 19, 2016
European Court of Justice rules that IP addresses are personal data under some circumstances
[/x_promo]
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]