
Facebook and adtech face a turbulent time in Europe's courts: the Brussels case.

This note examines a Belgian court ruling against Facebook’s tracking and approach to consent. Facebook and adtech companies should expect tough sanctions when they find themselves before European courts – unless they change their current approach to data protection and the GDPR. 
Facebook is playing a dangerous game of “chicken” with the regulators. First, it has begun to confront users in the EU with a new “terms of service” dialogue, which denies access to Facebook until a user opts in to tracking for ad targeting, and various other data processing purposes.[1]

This dialogue appears to breach several important principles of the GDPR, including the principles of purpose limitation,[2] of freely given, non-conditional consent,[3] and of transparency.[4] In other words, if Facebook attempts to collect consent in this manner, that consent will be unlawful. European regulators have been very clear on this point.[5]
Second, on 1 May 2018, a mere twenty-four days before the application date of the GDPR, Facebook’s head of privacy announced plans to build “Clear History”, a feature with which users can opt out of Facebook collecting data about their visits to other websites and apps.[6] But the GDPR demands not an opt-out, but an opt-in.[7] Nor is Clear History available to non-Facebook users. And as a further sign of Facebook’s brinksmanship, it said “it will take a few months to build Clear History”,[8] which means that the feature will not be available to users until long after the GDPR applies later this month.
Facebook’s approach puts it on a collision course with European courts. This note examines one recent decision in which the Brussels Court of First Instance ruled that Facebook’s tracking of people on other websites is illegal, and that its approach to consent is invalid.[9] The immediate result was a financial penalty, and an order that Facebook must submit to having an independent expert supervise its deletion of all the personal data it illegally amassed.
The implications of the ruling are far wider. It is an insight into the hazard for digital publishers and adtech vendors of failing to heed the warnings of the Article 29 Working Party.

Important lessons for RTB/programmatic

Belgium’s data protection authority, the Belgian Privacy Commission,[10] argued that Facebook’s “Like” buttons and trackers on websites all over the web enable it to look “over the shoulders of persons while they are browsing from one website to the next … without sufficiently informing the relevant parties and obtaining their valid consent”.[11]
The Court agreed, and summarized Facebook’s tracking in its ruling:

When someone visits a website with such a Facebook social plug-in, his browser will automatically establish a connection with (by sending an http request to) the Facebook server, after which the visitor’s browser directly loads the “plug-in” function from the Facebook server.[12]

The Court’s ruling outlined what data are received by Facebook from its social plugins installed on other websites:

1. The IP address;
2. The URL of the page of the website requested by the user;
3. The operating system;
4. The type of browser; and
5. The cookies (previously) placed by the third party from which the browser requests the third-party content.[13]
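The mechanics the Court describes can be sketched roughly as follows. This is an illustrative model of the data any browser volunteers when it fetches a third-party “plug-in” resource, not Facebook’s actual implementation; the header names are standard HTTP, while the page URL and cookie values are hypothetical (though “datr” is the cookie discussed later in this note).

```python
# Sketch: the request a browser makes to a third-party plug-in server.
# Note that the IP address (item 1) is visible to the server from the
# TCP connection itself, so it never appears as a header at all.

def plugin_request(page_url: str, plugin_cookies: dict) -> dict:
    """Model the headers sent when a page embeds a third-party plug-in."""
    return {
        "Referer": page_url,  # item 2: the URL of the page being read
        # items 3-4: operating system and browser type, in one header
        "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/60.0",
        # item 5: cookies previously placed by the plug-in's domain
        "Cookie": "; ".join(f"{k}={v}" for k, v in plugin_cookies.items()),
    }

req = plugin_request(
    "https://example-news-site.test/articles/health-advice",
    {"datr": "abc123"},  # hypothetical identifier cookie
)
```

The point of the sketch is that no special spyware is needed: the sensitive fact that this person read a health article travels in the ordinary `Referer` header, tied to their identity by the cookie.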

In a previous judgement, in 2015, the Court observed that these browsing data are “frequently of a very sensitive nature, allowing, for example, health-related, sexual and political preferences to be gauged”.[14]
This should give pause to digital publishers and adtech vendors, because these data, which reveal special categories of personal data, are exactly the same data that websites routinely broadcast to tens – if not hundreds – of companies in RTB bid requests.[15] This happens every time an advertisement is served.
The Court noted that the scale of Facebook’s presence across the web makes this tracking “practically unavoidable”.[16] The February 2018 ruling reiterated the Court’s previous ruling in 2015 that “the extent of the violations in question is massive: they do not only concern the violation of the fundamental rights of a single person, but of an enormous group of persons.”[17]
This too should give the online media and advertising industry pause, because the same applies to the broadcasting of personal data in RTB bid requests by the majority of major websites across the globe, and to the creation of profiles based on these personal data by DMPs and other adtech vendors.

Facebook’s notification fig leaf ruled unlawful

Facebook provided the following notice to users about this tracking:

We use cookies to help personalise content, to target and measure advertising and to provide you with a safer experience…[18]

Unsurprisingly, the Court ruled that this is utterly inadequate:

The court has come to the decision that in all the cases described, Facebook does not obtain any legally valid consent in the sense of Article 5 (a) Privacy Act[19] and Article 129 ECA[20] for the disputed data processing.[21]

As a result, the Court ruled that Facebook does not have a legal basis for tracking Internet users as they browse the web. Nor does Facebook have a legal basis for tracking logged-in users around the web.[22]
Several of the Court’s admonitions are worth including here, because they are directly relevant to Facebook and other online media and adtech companies’ approaches to the GDPR.
First, the Court found that non-Facebook users are never told that their behavior on websites across the web is being profiled by Facebook:

When non-users visit a website of a third party that includes an (invisible) Facebook pixel that allows for tracking of browsing behavior, without indicating that they wish to make use of the Facebook service, no information mechanism (such as a banner) is displayed.[23]

This remains a legal risk for Facebook, and “Clear History” does not adequately mitigate this risk.
Second, the Court ruled that Facebook’s request for consent was not specific, and that any consent that it received was unlawful as a result:

‘Specific’ means that the expression of will must relate to a specific instance or category of data processing and can thus not be obtained on the basis of a general authorization for an open series of processing activities.[24]

This part of the ruling was based on Article 1, section 8, of the Belgian Privacy Act, which uses the same formula of words as Article 4, paragraph 11, of the GDPR (“freely given, specific, informed…”). In other words, the Court is upholding a standard that is virtually identical to the standard that will apply under the GDPR. Facebook’s new GDPR consent dialogue faces the same problem, and is unlawful for the same reason.
Third, the Court found that Facebook users are not clearly told what “purposes” Facebook processes the personal data for. Nor does it clearly explain its use of sensitive data including any personal data that could reveal religious belief, sexual orientation, etc.:

the cookie banner, makes it insufficiently clear for which exact purposes the personal data – which indeed also include “sensitive data” (e.g. regarding religious beliefs or sexual orientation) – are being collected, while the following layers (including the cookie policy, data policy) also do not explain this in an easily comprehensible and accessible manner.[25]

Facebook has recently gone some way to inform users about the use of personal data concerning their political interests, but this is only a partial solution to a far broader risk for the company. Its handling of sensitive categories of personal data will be a major challenge, which it has yet to show any ability to resolve.[26] 
Fourth, and unsurprisingly in the aftermath of the Cambridge Analytica scandal, the Court found that Facebook did not properly disclose who it was sharing the data with. Nor did it provide any information about “the existence of a right to access and correction of the personal data concerning him”.[27] This is likely to remain a significant challenge.[28]
Fifth, the Court found that Facebook was not even complying with its own self-regulatory system. Whatever one’s view of the “adchoices” self-regulatory system, it is quite remarkable that Facebook continued to track people even if they had already used it to opt out.[29]

Facebook forced to delete data (and fined)

The Brussels Court ordered Facebook to pay €250,000 per day,[30] up to a maximum of €100 million, until it stopped its unlawful behavior.
This was a strong statement. To put this fine into perspective, consider that Belgium has a population of 11.35 million people,[31] which is only 2% of the population of the EU.[32] At the same value per person, the EU equivalent would be €12.5 million per day, up to a maximum of €5 billion.
In addition, Facebook was ordered to submit to an independent expert supervising its deletion of all illegal data that it had amassed about every user on Belgian soil.[33] It also had to make sure that third parties to whom it provided illegal data do the same.
The Cambridge Analytica scandal shows that this last point about ensuring that third parties delete their copies of Facebook’s illegally accumulated data will be impossible for Facebook to comply with, because of its lax data sharing standards. Recall that Mark Zuckerberg told US lawmakers:

When developers told us they weren’t going to sell data, we thought that was a good representation. But one of the big lessons we’ve learned is that clearly, we cannot just take developer’s word for it.[34]

In other words, Facebook was sharing personal data without any control whatsoever, much as websites do when they send visitors’ personal data in RTB bid requests. Even if the original collection of the data had been lawful, this uncontrolled distribution certainly is not. Again, the parallel with RTB bid requests should give publishers and adtech vendors pause.

What the Article 29 Working Party says, goes

Many of our colleagues in adtech have been unwilling to heed the counsel of the Article 29 Working Party (a roundtable of European regulators). The Brussels Court’s ruling illustrates the Working Party’s importance and authority. Although the Court is the arbiter, it relied on the Working Party’s authoritative opinions throughout its ruling. (The ruling cited the Working Party’s 2011 opinion on consent (15/2011),[35] its 2010 opinion on online behavioral advertising (2/2010)[36], its 2013 opinion on purpose limitation (2/2013)[37], and its 2010 opinion on the concepts of data controller and data processor (1/2010)[38].)
Whether or not businesses take the Working Party seriously, judges do, which is what matters when businesses find themselves facing sanctions for data misuse. This should demonstrate the value of closely abiding by the opinions of the Working Party. The requirements of European data protection law have been well illuminated by the public guidance of the Article 29 Working Party for over two decades, and provide an invaluable guide to businesses scrambling to comply with a body of law largely neglected hitherto.

Facebook cannot reject users who refuse non-essential tracking

The Court ruled that Facebook cannot reject users who refuse to agree to tracking – unless the tracking in question is necessary for the service that a user explicitly requests from Facebook.[39] Instead, the Court ruled that users should be

given the option of refusing the placement of these cookies, in as far as this is not strictly necessary for a service explicitly requested by him, without his access to the Facebook.com domain being hereby limited or rendered more difficult.[40]

In December 2015, Facebook had blocked access to all Belgian users, following a court injunction that forbade it to place a (“Datr”) cookie without properly informing users.[41] Facebook attempted to justify this denial of service in a notice to users that claimed it could not provide service because it was prohibited from taking measures (unlawful tracking) to prevent unauthorized access to users’ Facebook accounts. The Court took a dim view of this:

The court concurs … that the systematic collection of the personal data of users and non-users via social plug-ins on the websites of third parties is not essential (let alone “strictly essential” in the sense of Article 129 ECA), or at least not proportional to the achievement of the safeguarding objective.[42]

The Court believed that Facebook’s purported fraud detection was insufficient in any case:

the systematic collection of safeguarding cookies is inadequate as a means of safeguarding, as it is easy to circumvent by persons with malicious intentions.[43]

Conclusion: fewer data, not more, will help Facebook in the EU

This ruling is one of several defeats Facebook has suffered in European courts in recent months. In January, the Berlin Regional Court ruled that Facebook’s approach to consent and terms are unlawful.[44] In April, the Irish High Court referred important aspects of Facebook’s trans-Atlantic transfers of personal data to the European Court of Justice, once again, for scrutiny.[45] It is likely that worse is to come, unless Facebook significantly changes its approach to data protection within the EU.
However, the company has options. As unlikely as it may seem now, one can foresee that Facebook will introduce ad targeting based on non-personal data in the newsfeed. This is likely to be necessary because Facebook will be unable to win lawful consent for some of its data processing purposes for sensitive personal data (or data processing purposes for regular personal data that are not “compatible” with purposes the user has already agreed to).[46]
It seems likely that this problem encompasses all personalized advertising on the newsfeed, custom audiences, and social share buttons on other websites. Therefore, Facebook must have a way of targeting ads to non-consenting users. Non-personal data would allow this.
It may also become important for Facebook to be able to participate in a clean and safe data supply chain, which major advertisers are beginning to show concern about.[47]
In addition, Facebook will have to limit the use of custom audiences to situations where it is certain that the advertiser has a valid legal basis.
There is a broader lesson. Digital publishers and adtech vendors need to urgently reassess the use of personal data in programmatic advertising, and reflect on how adtech’s shaky consent systems will fare in Europe’s courts.

Notes

[1] The new terms mention personalization of ads. See “Terms of service”, Facebook (URL: https://www.facebook.com/legal/terms/update), accessed 2 May 2018.
The Terms also refer to the data policy, which elaborates that “we use the information we have about you – including information about your interests, actions and connections – to select and personalise ads, offers and other sponsored content that we show you.” The data policy also says “We use the information [including] the websites you visit and ads you see … to help advertisers and other partners measure the effectiveness and distribution of their ads and services, and understand the types of people who use their services and how people interact with their websites, apps and services”. “Data policy”, Facebook (URL: https://www.facebook.com/about/privacy/update), accessed 2 May 2018.
[2] The GDPR, Article 5, paragraph 1.
[3] The GDPR, Article 7, paragraph 2.
[4] The GDPR, Article 13, paragraph 1 and paragraph 2.
[5] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[6] “Getting feedback on new tools to protect people’s privacy”, Facebook, 1 May 2018 (URL: https://newsroom.fb.com/news/2018/05/clear-history-2/).
[7] See the GDPR, Article 6, Article 8, and Article 9.
[8] “Getting feedback on new tools to protect people’s privacy”, Facebook.
[9] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., Dutch-language Brussels Court of First Instance (Nederlandstalige Rechtbank van Eerste Aanleg te Brussel/Tribunal de Première Instance néerlandophone de Bruxelles – the “Court”), 16 February 2018, 2016/153/A (URL: https://pagefair.com/wp-content/uploads/2018/04/Belgian-Court-judgement.pdf).
[10] It has since changed its name to the Belgian Data Protection Authority.
[11] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 12.
[12] ibid., p. 9. See more detail on pp. 49-51.
[13] ibid., p. 9.
[14] “Data leakage in online advertising”, PageFair (URL: https://pagefair.com/data-leakage-in-online-behavioural-advertising/).
[15] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 69.
[16] ibid., p. 69.
[17] ibid.
Note that this raises the competition (antitrust) question, as Germany’s competition regulator, Andreas Mundt, has pointed out: “If Facebook has a dominant market position, then the consent that the user gives for his data to be used is no longer voluntary” (see https://www.reuters.com/article/us-facebook-privacy-germany/facebooks-hidden-data-haul-troubles-german-cartel-regulator-idUSKBN1HU108).
[18] ibid., p. 8.
[19] Which implemented the Data Protection Directive.
[20] Electronic Communications Act of 20 June 2005, which implemented the ePrivacy Directive.
[21] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 64.
[22] ibid., pp. 73-4.
[23] ibid., p. 57.
[24] ibid., p. 61.
[25] ibid., p. 58.
[26] See discussion of special categories of data in the newsfeed in “How the GDPR will disrupt Google and Facebook”, PageFair, 30 August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[27] ibid., p. 59.
[28] See testimony by Chris Vickery at the UK Parliament Digital, Culture, Media and Sport Committee, Wednesday 2 May 2018 (URL: https://www.parliamentlive.tv/Event/Index/0cf92dd0-f484-4699-9e01-81c86acb880c).
[29] ibid., p. 63.
[30] ibid., p. 70.
[31] World Bank, 2016.
[32] Eurostat, population on 1 January 2017 (URL: ec.europa.eu/eurostat/tgm/table.do?tab=table&plugin=1&language=en&pcode=tps00001)
[33] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., pp. 14, 70.
[34] Testimony of Mark Zuckerberg Chairman and Chief Executive Officer, Facebook, Hearing before the United States House of Representatives Committee on Energy and Commerce, 11 April 2018 (URL: https://www.c-span.org/video/?443490-1/facebook-ceo-mark-zuckerberg-testifies-data-protection&live&start=4929#).
[35] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., pp. 57-8, 61.
[36] ibid., p. 59.
[37] ibid., p. 60.
[38] ibid., p. 70.
[39] ibid., pp. 13, 72.
[40] ibid., p. 72. See the Privacy Commission’s argument for this on p. 13.
[41] After an order from the Privacy Commission, which was backed up by a Court injunction. In 2015, the Privacy Commission ordered Facebook to, among other things, stop tracking non-users with cookies and social plugins without consent, and to do the same for users unless “strictly necessary for a service explicitly requested by the user” or unless it obtained “unequivocal, specific consent”. It was also ordered to use consent requests that are unequivocal and specific. When Facebook failed to comply, this was followed by a court order in November 2015. Facebook responded by blocking access to users. See ibid., pp. 4-7.
[42] ibid., pp. 65-6.
[43] ibid., p. 67.
[44] Judgment of the Berlin Regional Court dated 16 January 2018, Case no. 16 O 341/15 (URL: https://pagefair.com/wp-content/uploads/2018/04/Berlin-Court-judgement-German.pdf)
[45] The High Court, Commercial, 2016, N. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximilian Schrems, Request for a preliminary ruling, Article 267 TFEU, 12 April 2018.
See also Judgement of Ms Justice Costello, The High Court, Commercial, 2016, No. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximillian Schrems, 3 October 2017.
Note, this is the second “Schrems” case. The first caused the end of the EU-US Safe Harbor agreement.
[46] See a discussion on Facebook and purpose limitation in “How the GDPR will disrupt Google and Facebook”, PageFair, 30 August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[47] “WFA Manifesto for Online Data Transparency”, World Federation of Advertisers, 20 April 2018 (URL: https://www.wfanet.org/news-centre/wfa-manifesto-for-online-data-transparency/). See also Stephan Loerke, WFA CEO, “GDPR data-privacy rules signal a welcome revolution”, AdAge, 25 January 2018 (URL: adage.com/article/cmo-strategy/gdpr-signals-a-revolution/312074/).

Google adopts non-personal ad targeting for the GDPR

This note examines Google’s recent announcement on the GDPR. Google has sensibly adopted non-personal ad targeting. This is a very significant step forward and signals a change in the online advertising market. But Google has also taken a new and problematic approach to consent for personal data use in advertising that publishers will find hard to accept.

Google decides to use non-personal ad targeting to comply with the GDPR 

Last Thursday Google sent a policy update to business partners across the Internet announcing that it would launch an advertising service based on non-personal data in order to comply with the GDPR.[1]
PageFair has advocated a non-personal approach to advertising for some time, and commends Google for taking this position. As we noted six months ago,[2] Google AdWords, for example, can operate without consent if it discards personalized targeting features (and unique IDs). In this case, advertisers can continue to target advertisements to people based on what they search for.
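Query-based (contextual) targeting of this kind can be sketched in a few lines. The campaign names and keywords below are invented for illustration; real systems are of course far more sophisticated, but the essential point holds: the ad is chosen from the current search terms alone, with no user identifier or history involved.

```python
# Hypothetical campaigns keyed by the search keywords they bid on.
CAMPAIGNS = {
    "running shoes": "ad-sports-retailer",
    "car insurance": "ad-insurer",
}

def select_ad(search_query):
    """Pick an ad purely from the query text: no user ID, no profile."""
    q = search_query.lower()
    for keywords, ad in CAMPAIGNS.items():
        # Match when every campaign keyword appears in the query.
        if all(word in q for word in keywords.split()):
            return ad
    return None  # no contextual match: show a house ad or none
```

Because nothing about the person is processed, a function like this needs no consent under the GDPR; the trade-off is that it cannot retarget or build audiences.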
This may be part of a trend for Google, which announced in mid-2017 that it would stop mining personal e-mails in Gmail to inform its ad targeting. Clearly, few users would have given consent for this.[3] Google’s latest announcement has signaled to advertisers the importance of buying targeted advertising without personalization.
Although Google’s “non-personalized ads” may seem promising to advertisers and publishers who are concerned about GDPR liability, more work must be done before they can be considered safe.
Unique tracking IDs are currently vital to Google’s ability to perform frequency capping and bot detection.[4] Meanwhile, data leakage is a problem caused by third-party ad creatives liberally loading numerous tracking pixels. Google has been silent on fixing these problems. Therefore, it may be that Google will merely target ads with non-personal data, but will continue to perform tracking as usual. Clarity on this point will be important for advertisers seeking safe inventory.
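To make the frequency-capping point concrete, here is a minimal sketch of one possible direction: hold the impression counter on the client (for example, in browser storage) so the ad server never needs a cross-site unique ID. This is our illustration of the idea, not anything Google has announced, and the class and campaign names are hypothetical.

```python
# Sketch: frequency capping without a cross-site user ID. The counter
# lives with the client; the server only learns "may I show campaign X?"

class LocalFrequencyCap:
    def __init__(self, cap):
        self.cap = cap
        self.seen = {}  # campaign_id -> impressions, kept client-side

    def should_show(self, campaign_id):
        """True while this campaign is under its impression cap."""
        return self.seen.get(campaign_id, 0) < self.cap

    def record_impression(self, campaign_id):
        self.seen[campaign_id] = self.seen.get(campaign_id, 0) + 1

cap = LocalFrequencyCap(cap=2)
shown = []
for _ in range(4):
    if cap.should_show("campaign-42"):
        cap.record_impression("campaign-42")
        shown.append(True)
    else:
        shown.append(False)  # cap reached: decline further impressions
```

The obvious weakness, as the note says of bot detection, is that a counter the client controls is easy to reset; that is precisely why the industry has leaned on server-side unique IDs.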

Problems with Google’s approach to consent for personal data

Despite its new non-personalized ads, Google is also attempting to build a legal basis under the GDPR for its existing personal data advertising business. It has told publishers that it wants them to obtain their visitors’ consent to “the collection, sharing, and use of personal data for personalization of ads or other services”.[5]
Note that the purpose here is “personalization of ads or other services”. This appears to be a severe conflation of the many separate processing purposes involved in advertising personalization.[6] The addition of “other services” makes the conflation even more egregious. As we previously observed in our note on the approach proposed by IAB Europe, this appears to be a severe breach of Article 5, which requires that consent be requested in a granular manner for “specified, explicit” purposes.[7] As noted in a previous PageFair note, European regulators have explicitly warned against conflating purposes in this way:

“If the controller has conflated several purposes for processing and has not attempted to seek separate consent for each purpose, there is a lack of freedom. This granularity is closely related to the need of consent to be specific …. When data processing is done in pursuit of several purposes, the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[8] 

Controller-controller 

Google is asking publishers to obtain consent from their visitors for it to be an independent controller of those users’ personal data.[9] Confusingly, Google has called this a “controller-controller” policy. This evokes “joint-controllership”, a concept in the GDPR that would require both Google and the publisher to jointly determine the purposes and means of processing, and to be transparent with each other.[10] However, what Google proposes is not joint-controllership, but rather independent controllership for the publisher on the one hand, and for Google on the other. Google’s “controller-controller” terms to publishers define each party as

“an independent controller of Controller Personal Data under the Data Protection Legislation; [that] will individually determine the purposes and means of its processing of Controller Personal Data”.[11]

It is not clear why a publisher would choose to do this, since it would enable Google to leverage that publisher’s audience across the entire web (severe conflation of purposes notwithstanding). The head of Digital Content Next, a publisher trade body that represents Disney, New York Times, CBS, and so forth, has already announced that there is “no way in hell” Google will be “co-controller” across publishers’ sites.[12]

Further problems with Google’s new approach to consent

Even if publishers did accept that Google could be a controller of their visitors’ data for its own purposes, it is unlikely that many visitors would give their consent for this.[13]
If, however, both a publisher and a visitor were to agree to Google’s controller-controller proposal, two further problems arise. First, when a publisher shares third party personal data with Google, Google’s terms require that the publisher “must use commercially reasonable efforts to ensure the operator of the third party property complies with the above duties [of obtaining adequate consent]”.[14] This phrase “commercially reasonable efforts” is not a meaningful defence in the event that personal data are unlawfully processed.
As one expert from a European data protection authority retorted when I researched this point: “Imagine this as legal defence line: ‘We did not obtain consent because it wasn’t possible with commercially reasonable efforts’?” The Regulation is clear that “each controller or processor shall be held liable for the entire damage”, where more than one controller or processor are “involved in the same processing”.[15]
Second, Google’s policy puts what appears to be an impossible burden on the publisher. It requires that the publisher accurately inform the visitor about how their data will be used if they give consent.

“You must clearly identify each party that may collect, receive, or use end users’ personal data as a consequence of your use of a Google product. You must also provide end users with prominent and easily accessible information about that party’s use of end users’ personal data”.[16]

However, the publisher does not know what personal data Google shares with its own business partners. Nor does it know for what purposes these parties process data about its visitors. So long as this continues, a publisher cannot be in a position to inform its visitors of what will be done with their data. The result is very likely to be a breach of Article 6[17] and Article 13[18] of the GDPR.
Giving Google the benefit of the doubt, this may change before 25 May. Google plans to publish some information about its “uses of information and we are asking other ad technology providers with which Google’s products integrate to make available information about their own uses of personal data.”[19] Publishers will not be well served by any further delay in the provision of this information.

Risks for Google 

Google’s decision to rely on non-personal data for ad targeting is highly significant, and will enable the company and advertisers that work with it to operate under the GDPR. However, Google’s new consent policy is fraught with issues that make it impossible for publishers to adopt. Our GDPR risk scale, first published for Google in August 2017, remains unchanged.


Perimeter is a robust regulatory firewall. It preemptively blocks unauthorized requests from 3rd parties, and tightly controls personal data on your website and app. It protects you, your advertising business, and your users. Perimeter makes sure that consent means something.


Notes

[1] “Changes to our ad policies to comply with the GDPR”, Google Inside AdWords, 22 March 2018 (URL: https://adwords.googleblog.com/2018/03/changes-to-our-ad-policies-to-comply-with-the-GDPR.html).
[2] “How the GDPR will disrupt Google and Facebook”, PageFair Insider, August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[3] ibid.
[4] For alternative methods of performance measurement and reporting see “Frequency capping and ad campaign measurement under GDPR”, PageFair Insider, November 2017 (URL: https://pagefair.com/blog/2017/gdpr-measurement1/).
[5] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[6] See discussion of data processing purposes in online behavioural advertising, and the degree of granularity required in consent, in “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[7] The GDPR, Article 5, paragraph 1, b, and note reference to the principle of “purpose limitation”. See also Recital 43. For more on the purpose limitation principle see “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013.
[8] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[9] “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[10] See The GDPR, Article 26.
[11] Clause 4.1 of “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[12] Jason Kint, Twitter, 22 March 2018 (URL: https://twitter.com/jason_kint/status/976928024011726848)
[13] “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[14] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[15] The GDPR, Article 82, paragraph 4.
[16] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[17] The GDPR, Article 6, paragraph 1, a.
[18] The GDPR, Article 13, paragraph 2, f, and Recital 60.
[19] “Help with the EU user consent policy”, Google (URL: https://www.google.com/about/company/consenthelpstaging.html).

Risks in IAB Europe’s proposed consent mechanism

This note examines the recently published IAB “transparency and consent” proposal. Major flaws render the system unworkable. The real issue is what should be done with the vast majority of the audience who will not give consent. 

Publishers would have no control (and are expected to blindly trust 2,000+ adtech companies)

The adtech companies[1] who drafted the IAB Europe proposal claim that “publishers have full control over who they partner with, who they disclose to their users and who they obtain consent for.”[2] But the IAB Europe documentation shows that adtech companies would remain entirely free to trade personal data with their business partners if they wish. The proposed system would share a unique[3] consent record “throughout the online advertising ecosystem” every time an ad is loaded on a website:[4]

“the OpenRTB request [from a website to an ad exchange] will contain the entire DaisyBit [a persistent cookie],[5] allowing a vendor to see which other vendors are an approved vendor or a publisher and whether they have obtained consent (and for which purposes) and which have not.”[6]

There would be no control over what happens to personal data once they enter the RTB system: “[adtech] vendors may choose not to pass bid requests containing personal data to other vendors who do not have consent”.[7] This is a critical problem, because the overriding commercial incentive for many of the companies involved is to share as many data with as many partners as possible, including parent companies that run data brokerages. In addition, publishers are expected to trust that JavaScript in “ad creatives” is not dropping trackers, even though the proposal includes no tools to police this.
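A minimal Python sketch illustrates the data flow described above. This is an illustrative model only: the `regs.ext.gdpr` and `user.ext.consent` fields follow the draft OpenRTB GDPR advisory, and the vendor names are invented.

```python
import json

# Hypothetical consent record (the "DaisyBit"), already encoded as a string.
CONSENT_STRING = "BOEFEAyOEFEAyAHABDENAI4AAAAB9v"

def build_bid_request(page_url, consent_string):
    """Assemble a minimal OpenRTB-style bid request carrying the consent record."""
    return {
        "id": "bid-123",
        "site": {"page": page_url},
        "regs": {"ext": {"gdpr": 1}},                  # GDPR applies to this user
        "user": {"ext": {"consent": consent_string}},  # consent record rides along
    }

def forward_to_vendors(bid_request, vendors):
    """Every vendor in the chain receives the full request, consent record
    included; nothing in the protocol itself prevents onward sharing."""
    payload = json.dumps(bid_request)
    return {vendor: payload for vendor in vendors}

request = build_bid_request("https://publisher.example/article", CONSENT_STRING)
received = forward_to_vendors(request, ["dsp-a", "dsp-b", "data-broker-c"])
# All three recipients hold the identical payload, consent string included.
assert all(CONSENT_STRING in payload for payload in received.values())
```

Once the request leaves the publisher's page, enforcement depends entirely on each recipient's goodwill: the record travels with the data, but nothing technical stops a recipient passing both onward.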
IAB Europe is asking publishers and brands to expose themselves to the legal risk of routinely sharing these personal data with several thousand adtech companies. What publishers and brands need is a “trust no one” approach. IAB Europe is proposing a “trust everyone” approach. Indeed, the proposed system looks like the GDPR’s description of a data breach:

“a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”.[8]

Publishers have no control over personal data once they send them into the RTB system. All publishers have is liability.

“OK to everything” jeopardises the publisher’s own opt-ins

The proposed system would also jeopardise the chance of websites obtaining essential opt-ins for their own data processing purposes, such as commenting widgets and video players. IAB Europe proposes that websites bundle all consent under a single “OK”/”Accept all” button. Our wireframe below shows the text and buttons recommended by IAB Europe.[9]

Broadly speaking, websites might expect to receive consent from four out of every five users for their own data processing.[10] The opt-in rate for adtech tracking is tiny by comparison. Our research found that only 3% of people say they would opt in to 3rd parties tracking them across the web for the purposes of advertising.[11] IAB Europe’s commissioned research found that only 20% would do so.[12] The adtech vendors who drafted the IAB Europe proposal have an incentive to ask publishers to take risk on their behalf: they must realize that there is no chance that Internet users will agree to the cascade of opt-ins that the GDPR requires.[13] A website would be ill-advised to jeopardise its own consent requests in a vain effort to get consent for adtech companies, particularly if those adtech companies plan to use that same consent to work with the website’s competitors.
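A back-of-the-envelope calculation makes the commercial gap concrete. The opt-in rates are the figures cited above; the audience size is hypothetical.

```python
audience = 100_000                # hypothetical monthly visitors

first_party_rate = 0.80           # ~4 in 5 consent to the site's own processing
adtech_rate_pagefair = 0.03       # PageFair research
adtech_rate_iab = 0.20            # IAB Europe's commissioned research

first_party_optins = round(audience * first_party_rate)
adtech_optins_low = round(audience * adtech_rate_pagefair)
adtech_optins_high = round(audience * adtech_rate_iab)

# Even on IAB Europe's own figures, 80% of the audience cannot be reached
# with consent-based tracking; on PageFair's figures, 97%.
print(first_party_optins, adtech_optins_low, adtech_optins_high)  # 80000 3000 20000
```

In other words, a site that bundles its own consent with adtech consent gambles the 80,000 opt-ins it could have had for its own processing against, at best, 20,000 for tracking.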

Conflation and other matters of presentation

The proposal appears to breach Article 5, Article 6, and Article 13 of the GDPR, for several reasons.
First, Article 5 requires that consent be requested in a granular manner for “specified, explicit” purposes.[14] Instead, IAB Europe’s proposed design bundles together a host of separate data processing purposes under a single opt-in. A user must click the “Manage use of your Data” button in order to view four slightly less general opt-ins, and the companies[15] requesting consent. These opt-ins also appear to breach Article 5, because they too conflate multiple data processing purposes into a very small number of ill-defined consent requests. For example, a large array of separate adtech consent requests[16] is bundled together in a single “advertising personalisation” opt-in.[17] European regulators have explicitly warned against conflating purposes:

“If the controller has conflated several purposes for processing and has not attempted to seek separate consent for each purpose, there is a lack of freedom. This granularity is closely related to the need of consent to be specific …. When data processing is done in pursuit of several purposes, the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose.”[18]

Second, the text that IAB Europe proposes publishers display for the “advertising personalisation” opt-in appears to be a severe breach of Article 6[19] and Article 13[20] of the GDPR. In a single 49-word sentence, the text conflates several distinct purposes, and gives virtually no indication of what will be done with the reader’s personal data.

“Advertising personalisation allow processing of a user’s data to provide and inform personalised advertising (including delivery, measurement, and reporting) based on a user’s preferences or interests known or inferred from data collected across multiple sites, apps, or devices; and/or accessing or storing information on devices for that purpose.”[21]

This fails to disclose that hundreds, and perhaps thousands, of companies will be sent your personal data. Nor does it say that some of these companies will combine these data with a profile they have already built about you. Nor are you told that this profile includes things like your income bracket, age and gender, habits, social media influence, ethnicity, sexual orientation, religion, and political leaning. Nor do you know whether some of these companies will sell their data about you to other companies, perhaps to online marketers, credit scorers, insurers, background checking services, or law enforcement.
Third, a person must say yes or no for all or none of the companies listed as data controllers.[22] Since one should not be expected to trust all controllers equally, and since it is unlikely that all controllers apply equal safeguards of personal data, we suspect that this “take it or leave it” choice will not satisfy regulatory authorities.
Fourth, there appears to be no way to easily decline the consent request that IAB Europe proposes, which would also breach the GDPR.[23] It is possible that this last point is simply an accidental oversight in the drafting of IAB Europe’s documentation.
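Returning to the third point: IAB Europe's documentation attributes the all-or-nothing design to payload-size concerns. A rough calculation shows the scale involved (the vendor and purpose counts are assumptions, in line with the figures discussed in this note).

```python
vendors = 2000    # assumed number of adtech vendors in the ecosystem
purposes = 10     # assumed number of distinct data processing purposes

# A fully granular consent record needs one bit per (vendor, purpose) pair.
granular_bits = vendors * purposes
granular_bytes = granular_bits // 8     # 2500 bytes before any encoding overhead

# The proposed coarse record needs one bit per vendor plus one bit per purpose.
coarse_bits = vendors + purposes
coarse_bytes = -(-coarse_bits // 8)     # ceiling division: 252 bytes

assert granular_bytes == 2500 and coarse_bytes == 252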

Conclusion: What about the people (80%-97%) who don’t opt-in?

The proposed system has no plan to make consent meaningful by giving publishers and data subjects control over what happens to personal data. Nor does it have a plan for what happens when users do not give consent. It is time for the discussion to move on.
As the CEO of Digital Content Next, a major publisher trade body, recently told members, “GDPR will create opportunity for audience selection based on cohorts and context”.[24] Non-personal data such as these are the only way for the industry to approach the GDPR.
PageFair has recently announced Perimeter, a regulatory firewall that enables websites (and apps) to protect their ad business, run direct campaigns, and use RTB without risk under the GDPR. It prevents unauthorized connections from 3rd parties, so that personal data can not leak through the RTB system, or anywhere else. (For extra peace of mind, PageFair’s SSP delivers guaranteed compliant programmatic display advertising.) This is the consent-free approach.
We also believe that consent has a role. The next chapter for online advertising will be written by publishers who use consent-free RTB, and build up consenting audiences for premium advertising too.
Note: thanks to Andrew Shaw at PageFair. 


[x_alert heading=”Feedback Wanted” type=”success” close=”true”]Note: PageFair has just updated its online overview of Perimeter. Please review http://pagefair.com/perimeter and give us your feedback.[/x_alert]

Perimeter is a robust regulatory firewall. It preemptively blocks unauthorized requests from 3rd parties, and tightly controls personal data on your website and app. It protects you, your advertising business, and your users. Perimeter makes sure that consent means something.
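Conceptually, a regulatory firewall of this kind behaves like an allowlist at the page and network level. The sketch below is a simplified illustration, not Perimeter's actual implementation; the host names are invented.

```python
from urllib.parse import urlparse

# Hypothetical allowlist of vetted, non-tracking partners.
TRUSTED_HOSTS = {"cdn.publisher.example", "ads.trusted-partner.example"}

def allow_request(url):
    """Permit an outbound third-party request only if its host is allowlisted."""
    return urlparse(url).hostname in TRUSTED_HOSTS

# A creative from a vetted partner is served; an unknown vendor's
# cookie-sync pixel is blocked before any personal data can leak.
assert allow_request("https://ads.trusted-partner.example/creative.js")
assert not allow_request("https://sync.unknown-vendor.example/pixel?uid=abc123")
```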

[x_button shape=”rounded” size=”regular” float=”none” href=”https://pagefair.com/perimeter” info=”none” info_place=”top” info_trigger=”hover”]Learn more[/x_button]

Notes

[1] AppNexus Inc.; Conversant, LLC; DMG Media Limited; Index Exchange, Inc.; MediaMath, Inc.; Oath, Inc.; Quantcast Corp.; and, Sizmek, Inc. are named in the copyright notice of “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe (URL: URL-shortened), p. 3.
Note: PageFair is a member of IAB TechLab, and IAB UK.
[2] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018 (URL: http://advertisingconsent.eu/wp-content/uploads/2018/03/Transparency_Consent_Framework_FAQ_Formatted_v1_8-March-2018.pdf), p. 8.
[3] Our statistical examination of the data in the cookie showed a very high degree of uniqueness. The proposed cookie is itself a tracking cookie. See the specification of the cookie in “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe, pp 8 – 10.
[4] ibid., p. 3
[5] ibid., p. 8.
To see the content of the proposed consent cookie, see http://gdpr-demo.labs.quantcast.com/user-examples/cookie-workshop.html.
It is envisaged that the record may be server-based in the future, because this will work better. See p. 7.
[6] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 9.
[7] ibid., p. 10. And from the same page, when an adtech company gets personal data without consent, IAB Europe asks it “to only act upon that data if it has another applicable legal basis for doing so”.
[8] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Article 4, paragraph 12.
[9] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 13.
[10] 20% would accept first party tracking only. An additional 56% would accept tracking that is strictly necessary for services they have requested. 5% say they would accept all tracking.
See “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[11] ibid.
[12] “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7. (URL: https://www.iabeurope.eu/wp-content/uploads/2017/09/EuropeOnline_FINAL.pdf).
[13] “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[14] The GDPR, Article 5, paragraph 1, b, and note reference to the principle of “purpose limitation”. See also Recital 43. For more on the purpose limitation principle see “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013.
[15] Note that the Article 29 Working Party very recently warned that this alone might be enough to render consent invalid: “when the identity of the controller or the purpose of the processing is not apparent from the first information layer of the layered privacy notice (and are located in further sub-layers), it will be difficult for the data controller to demonstrate that the data subject has given informed consent, unless the data controller can show that the data subject in question accessed that information prior to giving consent”.
Quote from “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017 (URL: https://pagefair.com/wp-content/uploads/2017/12/wp259_enpdf.pdf), p. 15, footnote 39.
[16] See discussion of data processing purposes in online behavioural advertising, and the degree of granularity required in consent, in “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[17] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 18.
[18] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[19] The GDPR, Article 6, paragraph 1, a.
[20] The GDPR, Article 13, paragraph 2, f, and Recital 60.
[21] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 18.
[22] “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe, p. 5.
The documentation explains that “due to concerns of payload size and negatively impacting the consumer experience, a per-vendor AND per-purpose option is not available”, p. 22.
[23] The Regulation is clear that “consent should not be regarded as freely given if the data subject has no genuine or free choice”. The GDPR, Recital 42. See also, Article 4, paragraph 11.
[24] Jason Kint, “Why the IAB GDPR Transparency and Consent Framework is a non-starter for publishers”, Digital Content Next, 19 March 2018 (URL: https://digitalcontentnext.org/blog/2018/03/19/iab-gdpr-consent-framework-non-starter-publishers/)

Adtech must change to protect publishers under the GDPR (IAPP podcast)

The follow-up to the International Association of Privacy Professionals’ most-listened-to podcast of 2017. 
Angelique Carson of the International Association of Privacy Professionals quizzes PageFair’s Dr Johnny Ryan on the crisis facing publishers, as they grapple with adtech vendors and attendant risks ahead of the GDPR. The podcast covers:

  • Why personal data can not be used without risk in the RTB/programmatic system under the GDPR.
  • Where consent falls short for publishers.
  • How vulnerable the online advertising system is, because of central points of legal failure.
  • How the GDPR is part of a global trend: new privacy standards are on the way in other massive markets, including China, and in important tech ecosystems such as Apple iOS and Firefox.

This is the follow-up to an earlier IAPP and PageFair podcast discussion (which was the International Association of Privacy Professionals’ most-listened-to podcast of 2017).

[x_button shape=”rounded” size=”regular” float=”none” href=”https://iapp.org/news/a/the-privacy-advisor-podcast-johnny-ryan-on-the-continuing-crisis-ad-tech-faces/” info=”none” info_place=”top” info_trigger=”hover”]Listen at IAPP[/x_button]

[x_button shape=”rounded” size=”regular” float=”none” href=”https://itunes.apple.com/us/podcast/the-privacy-advisor-podcast/id1095382766?mt=2#” info=”none” info_place=”top” info_trigger=”hover”]Listen on iTunes[/x_button]

Click here to view PageFair’s explainers and official documents about the changes websites and apps must make under the new privacy rules. Elsewhere you can find details about Perimeter, PageFair’s GDPR solution for publishers.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

PageFair's long letter to the Article 29 Working Party

This note discusses a letter that PageFair submitted to the Article 29 Working Party. The answers may shape the future of the adtech industry. 
Eventually the data protection authorities of Europe will gain a thorough understanding of the adtech industry, and enforce data protection upon it. This will change how the industry works. Until then, we are in a period of uncertainty. Industry can not move forward, and business can not flourish. Limbo does not serve the interests of publishers. Therefore we press for certainty.
This week PageFair wrote a letter to the Article 29 Working Party presenting insight on the inner workings of adtech, warts and all.
Our letter asked the working party to consider five questions. We suspect that the answers may shape the future of the adtech industry.

  1. We asked for further guidance about two issues that determine the granularity of consent required. First, we asked what the scope of a single “purpose” for processing personal data is. Since one must have a legal basis for each purpose, a clear understanding of the scope of an individual purpose is important to determine the number of purposes, and thus the number of granular opt-ins required.
  2. The second question about granularity of consent asked whether multiple controllers that pursue identical purposes should be unbundled from each other. In other words, should consent be requested not only per purpose, but per controller too. This is important because it should not be assumed that a person trusts all data controllers equally. Nor is it likely that all controllers apply equal safeguards of personal data. Therefore, we asked whether it was appropriate to bundle multiple controllers together in a single consent request without the opportunity to accept some, and not all.
  3. We asked for guidance on how explicit consent operates for websites and apps, where a controller wishes to process special categories of personal data. Previously the Working Party cited the double opt-in as a method of explicit consent for e-mail marketing. We presented wireframes of how this might operate on web and mobile.
  4. We asked for clarification that all unique identifiers are personal data. This is important because the presence of a unique ID enables the combining of data about the person associated with that unique ID, even if the party that originally assigned the unique ID did so randomly, without any understanding of who the data subject is.
  5. We asked for guidance on how Article 13 of the GDPR applies to non-tracking cookies (without personal data) as opposed to personal data. This is important because some paragraphs of this article were intended to apply to personal data and are not appropriate for non-personal data.

In addition to these questions we made three statements.

  1. Websites, apps, and adtech vendors leak personal data to unknown parties in routine advertising operation (via “RTB” bid requests, cookie syncs, JavaScript ad units, mobile SDKs, and other 3rd party integrations). This is preventable.
  2. We noted our support for the Working Party’s view that the GDPR forbids the demanding of consent for 3rd party tracking that is unrelated to the provision of an online service.
  3. It is untenable for any publisher, adtech vendor, or trade body to claim that they must use personal data for online advertising. As we and others have shown, sophisticated adtech can work without personal data.

The full letter is available here.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

GDPR's non-tracking cookie banners

This note outlines how an anomaly in European law will impact cookie storage and presents wireframes of permission requests for non-tracking cookies. 
Online media will soon find itself in an anomalous position. It will be necessary to apply the GDPR’s consent requirements to cookies that reveal no personal data, even though the GDPR was not intended to be applied in this way.[1]
Recital 26 of the GDPR says that “the principles of data protection should … not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person…”.[2]
Even so, a hiccup in the choreography of European law-making is creating an unexpected situation in which the GDPR’s conditions will apply to cookies that reveal or contain no personal data.
The Data Protection Directive currently sets out the conditions under which consent should be sought for the storage of cookies.[3] However, this Directive will be repealed on 25 May 2018, before the forthcoming ePrivacy Regulation introduces new conditions for cookie consent.[4]
The Commission had intended that both the GDPR (which repeals the Data Protection Directive) and the ePrivacy Regulation (which updates cookie consent conditions) would be applied on the same date. But now that the ePrivacy Regulation is considerably delayed, a provision of the GDPR that says references to the Data Protection Directive “shall be construed as references to this Regulation” will apply to non-personal data in cookies also.[5]
Non-personal data are data that can not be related to an identifiable person. For example, there is no unique identifier, the data could relate to many people, and could not be used to single out an individual. As the European Court of Justice said in 2016, data are not personal “if the identification of the data subject was prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and manpower, so that the risk of identification appears in reality to be insignificant”.[6]
The GDPR way of asking for consent does not neatly apply to data such as these, that are not personal. For example, the language of the GDPR’s requirements for consent refers explicitly to personal data concepts. Consider some of the important terms: “processing” is “any operation or set of operations which is performed on personal data or on sets of personal data…”.[7] The word “processing” does not have this meaning where personal data are absent. Nor does the word “controller”, because a controller is “the natural or legal person … which … determines the purposes and means of the processing of personal data…”. [8] Similarly, “profiling” is “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects…”[9].

Less friction

Therefore, although the GDPR provides for a very high standard of information to be presented with consent requests, as elaborated in a previous PageFair Insider note,[10] there is considerably less friction when using the GDPR requirements to request storage permission for data that are not personal.
The following table shows which elements are relevant when the GDPR’s requirements for consent are applied to cookies that neither contain nor reveal personal data, as opposed to when they are applied to any processing of personal data.

Information to accompany consent requests
GDPR consent requirements (items listed in Article 13) | Cookies where there are no personal data | Any processing of personal data
the identity and the contact details of the controller[11] and, where applicable, of the controller’s representative;[12] | N/A (there is no controller) | Yes (where applicable)
the contact details of the data protection officer, where applicable;[13] | N/A (there are no personal data) | Yes (where applicable)
the purposes of the processing for which the personal data are intended as well as the legal basis for the processing;[14] | N/A (there are no personal data being processed) | Yes
where the processing is based on point (f) of Article 6(1), the legitimate interests pursued by the controller or by a third party; | N/A | N/A
the recipients or categories of recipients of the personal data, if any;[15] | N/A (there are no personal data being shared) | Yes (where applicable)
where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available.[16] | N/A (there are no transfers of personal data) | Yes (where applicable)
the period for which the personal data will be stored, or if that is not possible, the criteria used to determine that period;[17] | N/A (there is no storage of personal data) | Yes
the existence of the right to request from the controller access to and rectification or erasure of personal data or restriction of processing concerning the data subject or to object to processing as well as the right to data portability;[18] | N/A (there are no personal data) | Yes
where the processing is based on point (a) of Article 6(1) or point (a) of Article 9(2), the existence of the right to withdraw consent at any time, without affecting the lawfulness of processing based on consent before its withdrawal;[19] | N/A (there is no processing of personal data) | Yes
the right to lodge a complaint with a supervisory authority;[20] | Yes | Yes
whether the provision of personal data is a statutory or contractual requirement, or a requirement necessary to enter into a contract, as well as whether the data subject is obliged to provide the personal data and of the possible consequences of failure to provide such data;[21] | N/A (there are no personal data) | Yes (where applicable)
the existence of automated decision-making, including profiling,[22] referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.[23] | N/A (there are no personal data) | Yes (where applicable)

As the table shows, the requirements for consent are considerably less demanding when used to request storage permission for non-personal data, such as non-tracking cookies. This is because the GDPR was not intended to be applied in this manner. Below is a wireframe of a “storage permission” dialogue.
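The table's logic can be condensed into a small filter. This is an illustrative sketch, not legal advice: the item descriptions are abbreviated, and the flag on each item simply records whether, per the table above, the item presupposes personal data.

```python
# Article 13 items (abbreviated), flagged by whether each presupposes personal data.
ARTICLE_13_ITEMS = [
    ("identity and contact details of the controller", True),
    ("contact details of the data protection officer", True),
    ("purposes and legal basis of the processing", True),
    ("recipients of the personal data", True),
    ("transfers to third countries or international organisations", True),
    ("storage period or criteria used to determine it", True),
    ("rights of access, rectification, erasure, restriction, portability", True),
    ("right to withdraw consent at any time", True),
    ("right to lodge a complaint with a supervisory authority", False),
    ("whether providing the data is a statutory or contractual requirement", True),
    ("existence of automated decision-making, including profiling", True),
]

def required_disclosures(processes_personal_data):
    """Return the Article 13 items relevant to this consent request."""
    return [item for item, needs_personal_data in ARTICLE_13_ITEMS
            if processes_personal_data or not needs_personal_data]

# A non-tracking cookie triggers only the complaint-rights notice;
# any processing of personal data triggers the full list.
assert required_disclosures(False) == [
    "right to lodge a complaint with a supervisory authority"]
assert len(required_disclosures(True)) == len(ARTICLE_13_ITEMS)
```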

Storage permission

In this simple wireframe the question mark button reveals two informational buttons. 
The “my data rights” button provides information about how to lodge a complaint with the supervisory authorities, which is required under Article 13, paragraph 2, d. The “What is stored” button describes the non-personal data stored on the device, providing assurance to the user that their consent will not impact their fundamental right to privacy or their fundamental right to data protection. 
Note that this only applies where publishers and their adtech vendors scrupulously avoid the collection and any other processing of personal data, including all unique identifiers, as Perimeter Trusted Partners do. Otherwise, the GDPR’s consent requirements apply as normal.

The future

This anomalous situation will change when the ePrivacy Regulation is applied at some point in 2018 or later. The question is whether enough sensible pro-privacy businesses and NGOs will make the case for non-tracking cookies in the new Regulation. In late 2017 PageFair wrote to Members of the European Parliament to argue the case for permitting non-tracking cookies under the ePrivacy Regulation.[24] Our argument was that websites need a means to store information to operate, even for ancillary operations that their visitors do not request (such as A/B testing), without bothering their users. Certainly, consent is essential where personal data are concerned, or where there exists the possibility to access communications information, for example, or private photo albums. But where non-tracking cookies are concerned, there must be an easier way. Unless there is some provision for protecting the humble non-tracking cookie, websites’ ability to smoothly transition to privacy-by-design advertising will be harmed.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Article 2, paragraph 1, notes the material scope of the Regulation: “This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.”
[2] ibid., Recital 26.
[3] This is because the ePrivacy Directive, Article 2, paragraph f, and Recital 17, say that consent under the ePrivacy Directive should have the same meaning as previously defined in the Data Protection Directive.
[4] Article 94 of the GDPR repeals Directive 95/46/EC (the Data Protection Directive).
The ePrivacy Directive, Recital 17, says that “For the purposes of this Directive, consent of a user or subscriber, regardless of whether the latter is a natural or a legal person, should have the same meaning as the data subject’s consent as defined and further specified in Directive 95/46/EC. Consent may be given by any appropriate method enabling a freely given specific and informed indication of the user’s wishes, including by ticking a box when visiting an Internet website.”
The ePD Article 2, (f) says “‘consent’ by a user or subscriber corresponds to the data subject’s consent in Directive 95/46/EC”.
[5] The GDPR, Article 94, paragraph 2, says that references to the Data Protection Directive “shall be construed as references to this Regulation [the GDPR]”.
[6] Judgment of the Court (Second Chamber) Patrick Breyer v Bundesrepublik Deutschland, Case C-582/14, 19 October 2016.
[7] ibid., Article 4, paragraph 2.
[8] ibid., Article 4, paragraph 7.
[9] ibid., Article 4, paragraph 4.
[10] “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, 8 January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[11] Note that the GDPR defines “controller” as an entity concerned with personal data. The definition in Article 4, paragraph 7, begins: “the natural or legal person … which … determines the purposes and means of the processing of personal data…”.
[12] The GDPR, Article 13, paragraph 1, a.
[13] ibid., Article 13, paragraph 1, b.
[14] ibid., Article 13, paragraph 1, c.
[15] ibid., Article 13, paragraph 1, e.
[16] ibid., Article 13, paragraph 1, f.
[17] ibid., Article 13, paragraph 2, a.
[18] ibid., Article 13, paragraph 2, b.
[19] ibid., Article 13, paragraph 2, c.
[20] ibid., Article 13, paragraph 2, d.
[21] ibid., Article 13, paragraph 2, e.
[22] Note that “profiling” is defined in the GDPR as a processing of personal data. The definition in Article 4, paragraph 4 begins: “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects…”
[23] ibid., Article 13, paragraph 2, f.
[24] PageFair to European Parliament ePrivacy rapporteurs, 5 July 2017, re “non-tracking cookies in the ePrivacy Regulation” (URL: https://pagefair.com/blog/2017/non-tracking-cookies/).
[25] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 20.

GDPR consent design: how granular must adtech opt-ins be?

This note examines the range of distinct adtech data processing purposes that will require opt-in under the GDPR.[1]
In late 2017 the Article 29 Working Party cautioned that “data subjects should be free to choose which purpose they accept, rather than having to consent to a bundle of processing purposes”.[2] Consent requests for multiple purposes should “allow users to give specific consent for specific purposes”.[3]  Rather than conflate several purposes for processing, Europe’s regulators caution that “the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[4] This draws upon GDPR, Recital 32.[5]
In short, consent requests must be granular, showing opt-ins for each distinct purpose.

How granular must consent opt-ins be?

In its 2013 opinion on “purpose limitation”, the Article 29 Working Party went some way toward defining the scope of a single purpose: a purpose must be “sufficiently defined to enable the implementation of any necessary data protection safeguards,” and must be “sufficiently unambiguous and clearly expressed.”[6]
The test is “If a purpose is sufficiently specific and clear, individuals will know what to expect: the way data are processed will be predictable.”[7] The objective is to prevent “unanticipated use of personal data by the controller or by third parties and in loss of data subject control [of these personal data]”.[8]
In short, a purpose must be specific, transparent and predictable.[9] It must be describable to the extent that the processing undertaken for it would not surprise the person who gave consent for it.
The process of showing an ad to a single person (in online behavioral advertising) involves the processing of personal data for several distinct purposes, by hundreds of different companies.
[accordion id=”video”] [accordion_item title=”Video: how personal data passes between companies in online behavioral advertising” parent_id=”video”]


[/accordion_item][/accordion]
Therefore, a broad, all-encompassing sentence such as “to show you relevant advertising” does not make it possible to grasp how one’s data will be used by a large number of companies. It would not be possible to understand from this sentence, for example, that inferences would be drawn about one’s characteristics, or what types of consequences might result.
The following table shows an indicative list of ten purposes for which personal data are currently processed in the online behavioral advertising system. In practice, there may be more purposes at play. The table also generalizes the types of company involved in the selection and display of an ad.
[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/purposes.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download high resolution PDF [/x_button]
A spreadsheet version of this table is available here.
(Refer to footnote 10 for a discussion of the challenges these purposes present for all businesses involved.[10])

Pre-consent naming of each controller, and granular post-consent controller consent withdrawal

Recital 42 of the GDPR notes that “For consent to be informed, the data subject should be aware at least of the identity of the controller and the purposes of the processing”.[11] All controllers (including “joint controllers” that “jointly determine the purposes and means of processing”[12]) must be named.[13]
Each purpose must be very clear, and each opt-in requires a “clear affirmative action” that is both “specific”, and “unambiguous”.[14] There can be no pre-ticked boxes,[15] and “consent based on silence” is not permitted.[16]
Therefore, a consent request should present granular options for each of these purposes, and the names of each controller that processes personal data for each purpose. For example:

Specific purpose 1 | controllers A, B, C | options: Accept / Refuse 
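As a rough illustration of this granularity, such a request might be represented in software as one record per specific purpose, naming the controllers that pursue it, with refusal as the default (since pre-ticked boxes are prohibited). A minimal sketch; all names and fields below are hypothetical:

```python
# Hypothetical sketch: one opt-in record per specific purpose, naming the
# controllers that pursue it. All purpose and controller names are illustrative.
from dataclasses import dataclass
from typing import List

@dataclass
class PurposeConsent:
    purpose: str             # one specific, clearly described purpose
    controllers: List[str]   # every controller processing data for this purpose
    accepted: bool = False   # default is refusal: no pre-ticked boxes

request = [
    PurposeConsent("Specific purpose 1", ["Controller A", "Controller B", "Controller C"]),
    PurposeConsent("Specific purpose 2", ["Controller A", "Controller D"]),
]

# Nothing may be processed for a purpose unless its record was explicitly accepted.
granted = [r for r in request if r.accepted]
```

The default of `accepted=False` encodes the rule that silence or inaction can never count as consent.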

There are two different scenarios for how consent for these purposes will be presented: the best case, and the more likely worst case.

The best scenario

At a minimum, then, assuming that all websites, SSPs, ad exchanges, DSPs, DMPs, and advertisers could align to pursue only these purposes, a consent request would include granular opt-in controls for a wide range of diverse purposes, the categories of processor involved in each, and a very long list of the controllers pursuing each.
The language and presentation of the request must be simple and clear, ideally the result of user testing.[17]
A consent request for a single purpose, on behalf of many controllers, might look like this.

Specific processing purpose consent, for multiple controllers,
with “next” button for multiple processing purpose opt-ins

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

What is presented when?

The Article 29 Working Party suggests that consent notices should have layers of information so that they do not overload viewers with information, but make necessary details easily available.[18] This is adopted in the design above using “View details”, “Learn about your data rights here”, and similar buttons and links.
When a user clicks “view details” to see the next layer of information about a controller

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

While some details, such as contact details for a company’s data protection officer, can be placed in a secondary layer, the primary layer must include “all basic details of the controller and the data processing activities envisaged”.[19]
Elements presented in this layer

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

The likely scenario

The scenario above assumes that all businesses in online behavioral advertising can agree to pursue tightly defined purposes without deviation. However, it is more likely that controllers will need granular opt-ins, because their purposes are unique.
Any individual controllers who intend to process data for their own unique purposes will need further granular opt-ins for these purposes. Since adtech companies tend to deviate from the common purposes outlined above, it is likely that most or all of them would ultimately require granular purpose consent for each controller.
However, even if all controllers pursued an identical set of purposes so that they could all receive consent via a single consent dialogue that contained a series of opt-ins, there would need to be a granular set of consent withdrawal controls that covered every single controller once consent had been given. The GDPR says that “the data subject may exercise his or her rights under this Regulation in respect of and against each of the controllers”.[20]

A higher bar: “explicit consent”

Processing of personal data in online behavioral advertising (for example, purposes 2, 3, 5, 8, and 10 in the table above) is highly likely to produce special categories of data by inference.[21] Where this occurs, these purposes require “explicit” consent.[22]
Special categories of data reveal “racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, … [and] data concerning health or data concerning a natural person’s sex life or sexual orientation”.[23] 
Making consent explicit requires an additional act of confirmation. For example, the Article 29 Working Party suggests that two-stage verification is a suitable means of obtaining explicit consent.[24] One possible approach to this is suggested in PageFair’s design below.
Suggested mechanism for “explicit consent” 

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

One confirms one’s opt-in with a second movement of the finger, or of the cursor and a click. It is unlikely that a person could confirm using this interface unintentionally.

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

Note that even this high bar, however, may not be permitted in some Member States. The GDPR gives European Member States the latitude to enact national legislation that prohibits consent as a legal basis for processing of special categories of data.[25] Therefore, it may not be legal to process any special categories of personal data in some EU Member States.

Conclusion 

Consent for website and app publishers is certainly an important objective, but the personal data it provides must only be processed after data leakage has been stopped. Data leakage (through RTB bid requests, cookie syncs, JavaScript ad units, and mobile SDKs) exposes publishers as the most obviously culpable parties that regulators and privacy NGOs can target. It also exposes their adtech vendors and advertisers to large fines and legal actions.[26]
Websites, apps, and adtech vendors, should switch from using personal data to monetize direct and RTB advertising to “non-personal data”.[27] Using non-personal, rather than personal, data neutralizes the risks of the GDPR for advertisers, publishers, and adtech vendors. And it enables them to address the majority (80%-97%) of the audience that will not give consent for 3rd party tracking across the web.[28]
We recently revealed PageFair Perimeter, a regulatory firewall that blocks data leakage, and enables publishers and adtech partners to use non-personal data for direct and RTB monetization when consent is absent (and to leverage personal data when adequate consent has been given). You can learn more about Perimeter here. Publishers using Perimeter do not need people’s personal data (nor the consent required to process it) to monetize websites and apps.

[x_button shape=”rounded” size=”regular” float=”none” href=”http://pagefair.com/perimeter/” info=”none” info_place=”top” info_trigger=”hover”]Learn about Perimeter[/x_button]

Postscript

A hiccup in the choreography of the European Commission’s legislative proposals means that non-tracking cookies will need storage consent, at least until the application of the forthcoming ePrivacy Regulation. These cookies, however, contain no personal data, and obtaining consent for their storage is significantly less burdensome than obtaining consent to process personal data for multiple purposes and multiple controllers. Update: 16 January 2018: See PageFair Insider note on storage consent for non-tracking cookies.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]


Notes:

[1] See our discussion of why consent is the appropriate legal basis for online behavioral advertising in “Why the GDPR ‘legitimate interest’ provision will not save you” , PageFair Insider, 13 March 2017 (URL: https://pagefair.com/blog/2017/gdpr-legitimate-interest/).
[2] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 11.
[3] ibid., p. 13.
[4] ibid., p. 11.
[5] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Recital 32. “…Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. …”
[6] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 12.
Curiously, the Spanish Data Protection Authority has issued guidance that contains a sentence suggesting that continuing to browse a website might constitute consent, which is at odds with the Article 29 Working Party guidance on consent and appears to be entirely at odds with the text of the Regulation. See “Guía del Reglamento General de Protección de Datos para responsables de tratamiento”, Agencia Española de Protección de Datos, November 2017, p. 6.
[7] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[8] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 12.
[9] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[10] None of these purposes would be permissible unless data leakage were first addressed. See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/). Furthermore,

  • Purpose 3 could not be permissible in any situation.
  • Purposes 2, 3, 5, 8, and 10 are highly likely to produce special categories of data by inference. See discussion of “explicit consent” in this note.
  • Regarding the purposes for which data have been sold, and to what category of customer, see “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp 39-40, and B3-B

[11] The GDPR, Recital 42.
[12] The GDPR, Article 26, paragraph 1.
[13] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[14] The GDPR, Article 4, paragraph 11.
[15] ibid., Recital 32.
[16] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 16.
[17] “Guidelines on transparency under Regulation 216/679” Article 29 Working Party, November 2017, pp 8, 13.
[18] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[19] ibid., p. 15.
[20] The GDPR, Article 26, paragraph 3.
[21] “Informing data subjects is particularly important in the case of inferences about sensitive preferences and characteristics. The controller should make the data subject aware that not only do they process (non-special category) personal data collected from the data subject or other sources but also that they derive from such data other (and special) categories of personal data relating to them.” See “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679”, Article 29 Working Party, 3 October 2017, p. 22.
[22] The GDPR, Article 9, paragraph 2, a.
[23] ibid., Article 9, paragraph 1.
[24] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 19.
[25] The GDPR, Article 9, paragraph 2, a.
[26] See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/).
[27] Non-personal data are any data that can not be related to an identifiable person. As Recital 26 of the GDPR observes, “the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable”. This recital reflects the finding of the European Court of Justice in 2016 that data are not personal “if the identification of the data subject was prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and manpower, so that the risk of identification appears in reality to be insignificant”. Judgment of the Court (Second Chamber) Patrick Breyer v Bundesrepublik Deutschland, Case C-582/14, 19 October 2016.
Non-tracking cookies, which contain no personal data, are useful for privacy-friendly advertising, and for other functions where an individual does not need to be identified such as A/B testing.
[28] See “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
The granularity of consent required for online behavioral advertising will make the consenting audience even smaller. Moreover, consent for adtech will not only be hard to get, it will also be easy to lose. Consent can be withdrawn with the same degree of ease as it was given, under The GDPR, Article 7, paragraph 3.
The Article 29 Working Party demonstrates what this means in practice: “When consent is obtained … through only one mouse-click, swipe, or keystroke, data subjects must … be able to withdraw that consent equally as easily”.
“Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 21.
The guidance also says that “Where consent is obtained through use of a service specific user interface (for example, via a website, an app, a log-on account, the interface of an IoT device or by e-mail), there is no doubt a data subject must be able to withdraw consent via the same electronic interface, as switching to another interface for the sole reason of withdrawing consent would require undue effort”.

The regulatory firewall for online media and adtech

This note announces Perimeter, a regulatory firewall to enable online advertising under the GDPR. It fixes data leakage from adtech and allows publishers to monetize RTB and direct ads, while respecting people’s data. 
PageFair takes a strict interpretation of the GDPR. To comply, all media owners need to protect their visitors’ personal data, or else find themselves liable for significant fines and court actions. In European law, personal data includes not only personally identifiable information (PII), but also visitor IP addresses, unique IDs, and browsing history.[1] The problem is that today’s online ads operate by actively disseminating this kind of personal data to countless 3rd parties via header bidding, RTB bid requests, tracking pixels, cookie syncs, mobile SDKs, and JavaScript in ad creatives. This exposes everyone, from the publisher to the advertiser, to potential fines, litigation and brand damage.[2]
Perimeter fixes this. It enables publishers to securely protect and control the tracking activities of the entire advertising supply chain in their websites and apps, by strictly blocking all third parties unless both publisher and data subject have given their consent.

[x_button shape=”rounded” size=”large” float=”none” href=” https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter [/x_button]

Revenue with or without tracking and consent

Publishers using Perimeter do not need people’s personal data (nor the consent required to process it) to monetize websites and apps. This is critically important, because only a small minority of people online are expected to consent to third party tracking for online advertising.[3]
Even without personal data, Perimeter enables interoperation with GDPR-compliant ad tech vendors so that frequency capping, attribution, impression counting, click counting, view-through counting, conversion counting, and fraud mitigation all work without personal data. The list of compliant adtech vendors that PageFair works with to do this is growing.
Perimeter will also re-enable audience targeting by using non-personal segments that can also interoperate with consent-bearing DMPs.
When adequate consent is present, publishers, adtech vendors and advertisers can use personal data, and Perimeter will interoperate with other compliant consent management platforms. Indeed, Perimeter is a necessary partner to make consent meaningful.[4] Adtech vendors are eager for publishers to collect consent on behalf of 3rd parties – but publishers must simultaneously block all parties who do not have consent, or else remain exposed to liabilities.

Take Control of Data Leakage in all your Digital Properties

Perimeter brings privacy and data protection to the RTB/programmatic advertising ecosystem in the following ways:

  • Automatic removal of data leaking scripts from ads before they are rendered.
  • Prevention of unauthorized 3rd parties from accessing personal data.
  • Enforcement of data protection in RTB bid requests.
  • Enforcement of data protection in mobile SDKs.

[x_button shape=”rounded” size=”large” float=”none” href=” https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter [/x_button]

Four Components

Perimeter provides four components that protect website and app publishers, and all of their advertising partners.

  1. Server side ad rendering
    Controls data in bid requests and ad creatives
  2. Policy Manager
    Empowers publishers to decide what 3rd parties are permitted to run in their websites and apps.
  3. User Consent Manager
    Enables granular consent to be obtained, communicated, and withdrawn by users.
  4. Privacy-by-design adtech interoperation
    Perimeter is partnering with other adtech vendors who are innovating to be compatible with strict enforcement of privacy regulations. This re-enables all core campaign management, measurement and targeting features without depending on legally toxic tracking IDs or other personal data.

[x_alert heading=”Resource to check your adtech vendors’ compliance ” type=”success”]Here is a resource for publishers to check whether your adtech vendors are compliant.[/x_alert]

Ethical data

We built Perimeter to enable websites and apps to transition from the old adtech industry to a more ethical one. This is why we are openly sharing the measurement and capping techniques for non-personal data adtech.
Perimeter is the result of 24 months of intensive technical and policy research and development, and combines the feedback of many app developers, advertisers, adtech vendors, privacy NGOs, regulators, and lawmakers.
It enables publishers, and their advertising partners, to operate within a clean and ethical data/media industry.

[x_button shape=”rounded” size=”large” float=”none” href=” https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter [/x_button]

Are you an ad tech vendor?

Let us know if you would like to find out more about GDPR compliance requirements and getting whitelisted on the Perimeter partner program. You can read about non-personal data methods here.
 
[x_line]

Notes

[1] See the definition of personal data in Article 4, (1), Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
[2] ibid., Article 4, paragraph 2.
[3] See “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7 and “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017.
[4] Because without Perimeter’s mitigation of data leakage, consent has no value. See “Consent to use personal data has no value unless one prevents all data leakage”, October 2017, PageFair Insider (URL: https://pagefair.com/blog/2017/understanding-data-leakage/)

Overview of how the GDPR impacts websites and adtech (IAPP podcast)

In this podcast, the International Association of Privacy Professionals interviews PageFair’s Dr Johnny Ryan about the challenges and opportunities of new European privacy rules for website operators and brands. 
Update: 3 January 2018: This podcast was the International Association of Privacy Professionals’ most listened to podcast of 2017. 

The conversation begins at 4m 14s, and covers the following issues.

  • Risks for website operators
  • How “consent” is an opportunity for publishers to take the upper hand in online media
  • Brands’ exposure to legal risk, and the agency / brand / insurer conundrum
  • Personal data leakage in RTB / programmatic adtech
  • How the adtech industry should adapt

As we told Wired some months ago, it’s not just that websites might expose themselves to litigation; they might expose their advertisers to litigation too. But this can be fixed.
Click here to view PageFair’s repository of explainers, analysis, and official documents about the new privacy rules.
Elsewhere you can find details about PageFair’s GDPR solutions for website operators.
Note: the IAPP published this podcast this month. The interview was conducted several months ago. 
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Frequency capping and ad campaign measurement under GDPR

This note describes how ad campaigns can be measured and frequency capped without the use of personal data to comply with the GDPR. 
It is likely that most people will not give consent for their personal data to be used for ad targeting purposes by third parties (only a small minority [1] of people online are expected to consent to third party tracking for online advertising). Even so, sophisticated measurement and frequency capping are possible for this audience.
This note briefly outlines how to conduct essential measurement (frequency capping, impression counting, click counting, conversion counting, view through measurement, and viewability measurement) in compliance with the EU’s General Data Protection Regulation. This means that publishers and advertisers can continue to measure the delivery of the ads that sustain their businesses, while simultaneously respecting European citizens’ right to protection of their personal data.
Note that this discussion assumes that the final text of the EU’s ePrivacy Regulation will not incidentally outlaw non-tracking cookies (i.e., cookies that neither contain nor reveal any personal data, and therefore pose no privacy risks) [2].
Table: cleaner ad tech measurement [3]

 

Frequency capping, without personal data

Most of today’s ad servers implement frequency capping by using a server-side database to store the number of times an ad campaign has been shown to each user. Each user is tracked using a unique ID, which is stored both in the database and in a 3rd party cookie in the user’s browser [4]. Since this user ID could be used to track what websites the user is visiting, and could potentially be matched against other online trackers and offline data, this will be illegal under GDPR (unless the user has specifically consented to it).
A privacy-by-design alternative is to get rid of the user ID and move the counter directly into the cookie. The cookie can be given an expiry equal to the maximum period for which the campaign should be capped, and its name can be set to the name of the ad campaign. Variations on this approach have been discussed for years – see Arvind Narayanan and Jonathan Mayer’s approach here. Since the counter does not contain any information specific to a particular user, it is not “personal data” under GDPR, and is not subject to consent.
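A minimal sketch of this counter-in-cookie approach, using Python’s standard http.cookies module. The cap, campaign name, and capping window below are illustrative assumptions, not a definitive implementation:

```python
# Hypothetical sketch: frequency capping with a campaign-named cookie and no
# user ID. CAP, CAMPAIGN, and TTL are assumed values for illustration.
from http import cookies

CAP = 5                    # assumed maximum impressions per campaign
CAMPAIGN = "spring_sale"   # hypothetical campaign name, used as the cookie name
TTL = 14 * 24 * 3600       # assumed capping window: two weeks, in seconds

def should_serve(cookie_header):
    """Return (serve?, Set-Cookie value). The cookie stores only a counter."""
    jar = cookies.SimpleCookie(cookie_header)
    count = int(jar[CAMPAIGN].value) if CAMPAIGN in jar else 0
    if count >= CAP:
        return False, None            # cap reached: do not serve this campaign
    jar[CAMPAIGN] = str(count + 1)    # increment the per-campaign counter
    jar[CAMPAIGN]["max-age"] = TTL    # cookie expires with the capping window
    return True, jar[CAMPAIGN].OutputString()
```

Because every user who has seen the campaign the same number of times carries an identical cookie, nothing in it identifies an individual.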
Two inefficiencies of this approach, storage and bandwidth, are addressed below.
First, how much storage space will all those frequency capping cookies take up in the web browser? So long as the cookie expiry dates are reasonable, this data should be proportional to the number of ad campaigns delivered to a browser over a one- or two-week time window, and should not grow beyond that. Even if a user manages to view a hundred thousand different advertising campaigns in a two week period, that would still require no more than a few megabytes to store.
Second, how much bandwidth might be consumed by transmitting all these view counters along with every request to the ad server? This can be reduced by being efficient in the encoding of the cookie data. As shown in the inset below,  transmitting a frequency counter in the ad server cookie could take as little as 9 bytes of extra bandwidth, which means that thousands of counters could be transmitted without significantly impacting the weight of a modern web page.

Calculating potential bandwidth requirements
The name of the cookie might be used to store a campaign ID, efficiently encoded using all the characters available according to RFC 6265 (i.e., “A-Z”, “a-z”, “0-9” and “!#$%&’*+-.^_`|~”). That’s 77 characters, meaning that just four characters can encode over 35 million (i.e., 77^4) different IDs, which is easily sufficient for all ad campaigns that an ad server might be managing in a given period. Meanwhile, the value portion of the cookie need only contain a counter, which can be 2 characters long (to store a value up to 99 in decimal, or about 6,000 in base-77 encoding). With this system in place, a browser request to an ad server might consist of the following HTTP request:

GET /ad HTTP/1.1
Host: acmeadserver.com
Cookie: 2iP&=21; Rz%x=13

The “Cookie:” field concatenates all cookies previously stored by the ad server. The bandwidth consumed by each frequency counter is only 9 bytes in total: 4 characters for the campaign ID, 2 characters for the counter value, and 3 characters for the equals sign, the semi-colon delimiter, and the space. It would take only around 20 KB to transmit over two thousand frequency counters in an ad call.
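The base-77 encoding described above can be sketched in a few lines. This is a toy illustration; a real ad server would likely differ:

```python
# Hypothetical sketch: encode/decode campaign IDs in base 77, using the
# 77 characters permitted in cookie names (26 + 26 + 10 + 15 = 77).
ALPHABET = (
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "abcdefghijklmnopqrstuvwxyz"
    "0123456789"
    "!#$%&'*+-.^_`|~"
)
assert len(ALPHABET) == 77

def encode_id(n, width=4):
    """Encode a non-negative integer as a fixed-width base-77 string."""
    chars = []
    for _ in range(width):
        n, r = divmod(n, 77)
        chars.append(ALPHABET[r])
    return "".join(reversed(chars))

def decode_id(s):
    """Decode a base-77 string back to an integer."""
    n = 0
    for c in s:
        n = n * 77 + ALPHABET.index(c)
    return n
```

Four characters cover every ID up to 77^4 − 1 = 35,153,040, consistent with the 35 million figure above.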

There is a further opportunity to optimize bandwidth, with the help of header bidding. Because header bidding sends multiple potential bids to the client, the client can do the work of deciding which ones have not reached a frequency cap and are therefore eligible for display. SSPs are currently moving from second-price auctions to first-price auctions to better support header bidding, and in time are likely to start returning multiple bids to header bidder wrappers, so that more bids can participate in the final auction. This will provide plenty of choice to a client-side frequency capping algorithm, and probably entirely eliminate the need for the frequency capping data to ever leave the browser.
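Under that assumption, the client-side selection could be as simple as filtering out bids whose campaign counter has reached its cap and taking the highest remaining bid. A hypothetical sketch, with field names assumed:

```python
# Hypothetical sketch: client-side frequency capping over header-bidding bids.
# `bids` is a list of (campaign_id, price); `counters` maps campaign_id to the
# number of times that campaign has already been shown in this browser.
def pick_bid(bids, counters, cap):
    eligible = [b for b in bids if counters.get(b[0], 0) < cap]
    # Highest-priced eligible bid wins; None if every campaign is capped out.
    return max(eligible, key=lambda b: b[1], default=None)
```

Because the decision happens in the browser, the counters never need to be transmitted to the ad server at all.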

Campaign Metrics

Campaign metrics allow advertisers and media owners to see how different advertising campaigns are performing, and to optimize campaigns if necessary. The typical metrics are impression counts, click counts, conversion counts, view-through measurement and viewability measurement.
Fortunately, none of these metrics concern individual people. Therefore, ad servers can avoid tracking information at an individual user level to ensure GDPR compliance. It is likely that many of today’s ad servers have not been so careful, and have implementations that currently depend on counting the same user ID that was used for frequency capping. These ad servers will need to consider providing alternative implementations when serving ads to EU users.
In the paragraphs below we review typical campaign metrics for likely GDPR compliance, and suggest alternative implementations where appropriate.

Impression Counting

Impression counting is normally implemented by incrementing a database counter pertaining to the ad campaign whenever a request is made to the ad server for that campaign, or when an impression pixel specific to that campaign is loaded from the ad server by the web browser. Basic impression counting should not pose a problem under GDPR, as no user-specific information is processed or stored [5].

Click Counting

Click measurement is normally performed by redirecting the browser to the ad server when the ad is clicked on, at which point it registers the click event, and then redirects the browser to the advertiser URL.
Like basic impression counting, click counting should not be problematic under GDPR, as all that is required is an overall counter of the number of times that an ad has been clicked on across all users, without the involvement of any user-specific information.
Once the number of clicks is known, the click-through rate for any given campaign is simply the number of clicks divided by the number of impressions.
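The redirect flow and the click-through rate calculation can be sketched as below. This is an illustrative outline, not a real framework handler: the return value stands in for an HTTP 302 response, and the `impressions` counter is assumed to be maintained by the impression-counting path.

```python
from collections import Counter

clicks = Counter()
impressions = Counter()  # maintained by the impression-counting path

def handle_click(campaign_id, advertiser_url):
    """Count the click, then redirect the browser to the advertiser.
    No user-specific information is stored."""
    clicks[campaign_id] += 1
    return ("302 Found", {"Location": advertiser_url})

def click_through_rate(campaign_id):
    shown = impressions[campaign_id]
    return clicks[campaign_id] / shown if shown else 0.0
```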

Conversion Counting

Conversion counting is normally performed by obtaining a campaign-specific pixel from the ad server, and placing that pixel on a web page the user will be brought to when they complete a transaction with the advertiser (when they “convert”). When a user who has clicked on an ad “converts”, the conversion pixel will be loaded from the ad server, which will increase the conversion count for that campaign by one.
As with impression counting and click counting, there is no specific privacy concern here: only campaign-level data is used, and no user-specific information is processed.
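The conversion-pixel endpoint can be sketched as follows. This is an illustrative outline: the endpoint increments the campaign’s conversion counter and responds with a transparent 1×1 GIF (the standard payload for tracking pixels), and the handler name is hypothetical.

```python
import base64
from collections import Counter

conversions = Counter()

# The canonical 1x1 transparent GIF, base64-encoded.
TRANSPARENT_GIF = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

def handle_conversion_pixel(campaign_id):
    """Called when the campaign-specific pixel loads on the advertiser's
    post-conversion page; only campaign-level data is touched."""
    conversions[campaign_id] += 1
    return ("200 OK", {"Content-Type": "image/gif"}, TRANSPARENT_GIF)
```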

View-Through Measurement

Although click-through rates help an advertiser understand the immediate positive reaction of users who see their ad, many are also interested in the indirect response, e.g., how many people who saw the ad went on to buy the product during the subsequent weeks regardless of clicking on the ad?
It is possible that some ad servers currently perform view-through measurement using user-specific information. For example, the ad server might record that an ad was viewed by a particular user ID. When the conversion pixel loads on the advertiser’s post-conversion page, the ad server could look up the details of the last time that user ID viewed that campaign.
The above implementation would be incompatible with GDPR, as it involves tracking the behavior of unique users. Fortunately, there are alternative implementations that are equally effective.
The correct approach is to use a non-tracking cookie to store the fact that the user has viewed the ad campaign. This means that the cookie would contain the ID of the ad campaign, not an ID of the user (and consequently, every user who saw that campaign would have an identical cookie). This cookie should be set to expire automatically when a certain amount of time has passed, beyond which the advertiser is not interested in attributing the visit to the fact the ad was seen. In this system, when the user eventually converts for the advertiser, the cookie containing the campaign ID is transmitted to the advertiser’s ad server, which can then increment the relevant view-through counter.
It is possible that this approach could lead to a lot of cookie data being transmitted and consuming bandwidth. To mitigate this, the view-through pixel should be homed on a domain unique to each advertiser, rather than every advertiser sharing the domain of their ad server.
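The non-tracking cookie scheme above can be sketched as follows. This is a simplified illustration: the attribution window, cookie name, and function names are assumptions, and a production system would set the cookie via its web framework rather than raw header strings.

```python
from collections import Counter

# Illustrative attribution window: a one-week cookie lifetime.
ATTRIBUTION_WINDOW_SECONDS = 7 * 24 * 3600

view_throughs = Counter()

def ad_view_response_headers(campaign_id):
    """Headers for the ad response: the cookie stores only the campaign ID,
    so every user who saw the campaign carries an identical cookie."""
    cookie = (
        f"viewed_campaign={campaign_id}; "
        f"Max-Age={ATTRIBUTION_WINDOW_SECONDS}; Secure"
    )
    return {"Set-Cookie": cookie}

def handle_conversion_pixel_request(cookie_header):
    """Called when the view-through pixel loads on the post-conversion page:
    read the campaign ID from the cookie and count the view-through."""
    for part in cookie_header.split(";"):
        name, _, value = part.strip().partition("=")
        if name == "viewed_campaign" and value:
            view_throughs[value] += 1
```

Because the cookie expires automatically after the attribution window, no record of the view persists beyond the period the advertiser cares about.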

Viewability Measurement

Viewability measurement allows advertisers to understand how many of the ads they pay to serve on a web site are likely to scroll into view and be displayed for long enough for a user to potentially notice them.
Viewability measures a characteristic of websites, not users, and can therefore be implemented in a GDPR-compatible fashion. Although some viewability systems today might store per-user information, there is no fundamental need to do so. All that is required is to detect when a view event has occurred for each ad space, and to send that event to the server to be counted. The server then aggregates these events and provides an overall count of the number of times each ad space has been viewable by any user in each time period.
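The server-side aggregation can be sketched as below. This is a minimal illustration, assuming hourly buckets; the bucketing granularity and names are assumptions, not a particular vendor’s scheme.

```python
from collections import Counter
from datetime import datetime, timezone

# Counts keyed by (ad space, hour bucket) -- no user identifier anywhere.
viewable_events = Counter()

def record_viewable(ad_space_id, when=None):
    """Called when the client reports that an ad slot scrolled into view
    and stayed visible long enough to count as viewable."""
    when = when or datetime.now(timezone.utc)
    hour_bucket = when.strftime("%Y-%m-%dT%H")  # aggregate per hour
    viewable_events[(ad_space_id, hour_bucket)] += 1
```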

Notes

[1] See “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7 and “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017.
[2] For an overview of the issues see a previous PageFair Insider note, “The Privacy Case for Non-Tracking Cookies: PageFair writes to the European Parliament”, PageFair Insider, 10 August 2017 (URL: https://pagefair.com/blog/2017/non-tracking-cookies/).
[3] Clearly, a campaign that targeted a single individual without a legal basis for doing so would be illegal. It is important that campaigns must target more than a small set of viewers.
[4] These data are automatically transmitted to the ad server along with every request.
[5] Ad servers may also support the counting of “unique impressions”, meaning the number of unique users who saw the campaign. This mechanism generally relies on tagging each user with a unique identifier and counting the number of unique identifiers. Therefore, while basic impression counting is unproblematic, counting unique impressions may not be, because the unique identifier it requires could be misused for tracking.