GDPR and ePR

Facebook and adtech face a turbulent time in Europe's courts: the Brussels case.

This note examines a Belgian court ruling against Facebook’s tracking and approach to consent. Facebook and adtech companies should expect tough sanctions when they find themselves before European courts – unless they change their current approach to data protection and the GDPR. 
Facebook is playing a dangerous game of “chicken” with the regulators. First, it has begun to confront users in the EU with a new “terms of service” dialogue, which denies access to Facebook until a user opts in to tracking for ad targeting and various other data processing purposes.[1]

This dialogue appears to breach several important principles of the GDPR, including the principles of purpose limitation,[2] of freely given, non-conditional consent,[3] and of transparency.[4] In other words, if Facebook attempts to collect consent in this manner, that consent will be unlawful. European regulators have been very clear on this point.[5]
Second, on 1 May 2018, a mere twenty-four days before the application date of the GDPR, Facebook’s head of privacy announced plans to build “Clear History”, a feature with which users can opt out of Facebook collecting data about their visits to other websites and apps.[6] But the GDPR demands an opt-in, not an opt-out.[7] Nor is Clear History available to non-Facebook users. And as a further sign of Facebook’s brinksmanship, it said “it will take a few months to build Clear History”,[8] which means that the feature will not be available to users until long after the GDPR starts to apply later this month.
Facebook’s approach puts it on a collision course with European courts. This note examines one recent decision in which the Brussels Court of First Instance ruled that Facebook’s tracking of people on other websites is illegal, and that its approach to consent is invalid.[9] The immediate result was a financial penalty, and an order that Facebook must submit to having an independent expert supervise its deletion of all the personal data it illegally amassed.
The implications of the ruling are far wider. It is an insight into the hazard for digital publishers and adtech vendors of failing to heed the warnings of the Article 29 Working Party.

Important lessons for RTB/programmatic

Belgium’s data protection authority, the Belgian Privacy Commission,[10] argued that Facebook’s “Like” buttons and trackers on websites all over the web enable it to look “over the shoulders of persons while they are browsing from one website to the next … without sufficiently informing the relevant parties and obtaining their valid consent”.[11]
The Court agreed, and summarized Facebook’s tracking in its ruling:

When someone visits a website with such a Facebook social plug-in, his browser will automatically establish a connection with (by sending an http request to) the Facebook server, after which the visitor’s browser directly loads the “plug-in” function from the Facebook server.[12]

The Court’s ruling outlined what data are received by Facebook from its social plugins installed on other websites:

1. IP address;
2. URL of the page of the website requested by the user;
3. The browser management system;
4. The type of browser; and
5. The cookies (previously) placed by the third-party website from which the browser requests the third-party content.[13]
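
To make the mechanism concrete, the sketch below shows what the operator of such a plug-in can log, server-side, from every page that embeds it. It is a minimal illustration using Flask and a hypothetical endpoint, not Facebook’s actual implementation; the point is that each item in the Court’s list arrives automatically with the browser’s request.

```python
# Minimal sketch (hypothetical endpoint, not Facebook's code): what a
# third-party "plug-in" server can record from each embedding page.
from flask import Flask, request

app = Flask(__name__)

@app.route("/plugin/like-button")
def like_button():
    record = {
        "ip": request.remote_addr,                        # 1. IP address
        "page": request.headers.get("Referer"),           # 2. URL of the page being visited
        "user_agent": request.headers.get("User-Agent"),  # 3/4. operating system and browser type
        "cookies": dict(request.cookies),                 # 5. cookies previously set for this domain
    }
    print(record)  # in practice this would be written to a profile store
    return ""      # the visible button/widget would be returned here

if __name__ == "__main__":
    app.run()
```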

In a previous judgement in 2015 the Court observed that these browsing data are “frequently of a very sensitive nature, allowing, for example, health-related, sexual and political preferences to be gauged”.[14]
This should give pause to digital publishers and adtech vendors, because these data, which reveal special categories of personal data, are exactly the same data that websites routinely broadcast to tens – if not hundreds – of companies in RTB bid requests.[15] This happens every time an advertisement is served.
The Court noted that the scale of Facebook’s presence across the web makes this tracking “practically unavoidable”.[16] The February 2018 ruling reiterated the Court’s previous ruling in 2015 that “the extent of the violations in question is massive: they do not only concern the violation of the fundamental rights of a single person, but of an enormous group of persons.”[17]
This too should give the online media and advertising industry pause, because the same applies to the broadcasting of personal data in RTB bid requests by the majority of major websites across the globe, and to the creation of profiles based on these personal data by DMPs and other adtech vendors.

Facebook’s notification fig leaf ruled unlawful

Facebook provided the following notice to users about this tracking:

We use cookies to help personalise content, to target and measure advertising and to provide you with a safer experience…[18]

Unsurprisingly, the Court ruled that this is utterly inadequate:

The court has come to the decision that in all the cases described, Facebook does not obtain any legally valid consent in the sense of Article 5 (a) Privacy Act[19] and Article 129 ECA[20] for the disputed data processing.[21]

As a result, the Court ruled that Facebook does not have a legal basis for tracking Internet users as they browse the web. Nor does Facebook have a legal basis for tracking logged-in users around the web.[22]
Several of the Court’s admonitions are worth including here, because they are directly relevant to Facebook and other online media and adtech companies’ approaches to the GDPR.
First, the Court found that non-Facebook users are never told that their behavior on websites across the web is being profiled by Facebook:

When non-users visit a website of a third party that includes an (invisible) Facebook pixel that allows for tracking of browsing behavior, without indicating that they wish to make use of the Facebook service, no information mechanism (such as a banner) is displayed.[23]

This remains a legal risk for Facebook, and “Clear History” does not adequately mitigate this risk.
Second, the Court ruled that Facebook’s request for consent was not specific, and that any consent that it received was unlawful as a result:

‘Specific’ means that the expression of will must relate to a specific instance or category of data processing and can thus not be obtained on the basis of a general authorization for an open series of processing activities.[24]

This part of the ruling was based on Article 1, section 8, of the Belgian Privacy Act, which uses the same formula of words as Article 4, paragraph 11, of the GDPR (“freely given, specific, informed…”). In other words, the Court is upholding a standard that is virtually identical to the standard that will apply under the GDPR. Facebook’s new GDPR consent dialogue faces the same problem, and is unlawful for the same reason.
Third, the Court found that Facebook users are not clearly told what “purposes” Facebook processes the personal data for. Nor does it clearly explain its use of sensitive data including any personal data that could reveal religious belief, sexual orientation, etc.:

the cookie banner, makes it insufficiently clear for which exact purposes the personal data – which indeed also include “sensitive data” (e.g. regarding religious beliefs or sexual orientation) – are being collected, while the following layers (including the cookie policy, data policy) also do not explain this in an easily comprehensible and accessible manner.[25]

Facebook has recently gone some way to inform users about the use of personal data concerning their political interests, but this is only a partial solution to a far broader risk for the company. Its handling of sensitive categories of personal data will be a major challenge, which it has yet to show any ability to resolve.[26] 
Fourth, and unsurprisingly in the aftermath of the Cambridge Analytica scandal, the Court found that Facebook did not properly disclose who it was sharing the data with. Nor did it provide any information about “the existence of a right to access and correction of the personal data concerning him”.[27] This is likely to remain a significant challenge.[28]
Fifth, the Court found that Facebook was not even complying with its own self-regulatory system. Whatever one’s view of the “AdChoices” self-regulatory system, it is quite remarkable that Facebook continued to track people even if they had already used it to opt out.[29]

Facebook forced to delete data (and fined)

The Brussels Court ordered Facebook to pay €250,000 per day,[30] up to a maximum of €100 million, until it stopped its unlawful behavior.
This was a strong statement. To put this fine into perspective, consider that Belgium has a population of 11.35 million people,[31] which is only 2% of the population of the EU.[32] At the same value per person, the EU equivalent would be €12.5 million per day, up to a maximum of €5 billion.
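The arithmetic is a simple scaling by that rounded 2% share: €250,000 ÷ 0.02 = €12.5 million per day, and €100 million ÷ 0.02 = €5 billion.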
In addition, Facebook was ordered to submit to an independent expert supervising its deletion of all illegal data that it had amassed about every user on Belgian soil.[33] It also had to make sure that third parties to whom it provided illegal data did the same.
The Cambridge Analytica scandal shows that this last point, about ensuring that third parties delete their copies of Facebook’s illegally accumulated data, will be impossible for Facebook to comply with, because of its lax data sharing standards. Recall that Mark Zuckerberg told US lawmakers:

When developers told us they weren’t going to sell data, we thought that was a good representation. But one of the big lessons we’ve learned is that clearly, we cannot just take developers’ word for it.[34]

In other words, Facebook was sharing personal data without any control whatsoever, much as websites do when they send visitors’ personal data in RTB bid requests. Even if the original collection of the data had been lawful, this uncontrolled distribution certainly would not be. Again, the parallel with RTB bid requests should give publishers and adtech vendors pause.

What the Article 29 Working Party says, goes

Many of our colleagues in adtech have been unwilling to heed the counsel of the Article 29 Working Party (a roundtable of European regulators). The Brussels Court’s ruling illustrates the Working Party’s importance and authority. Although the Court is the arbiter, it relied on the Working Party’s authoritative opinions throughout its ruling. (The ruling cited the Working Party’s 2011 opinion on consent (15/2011),[35] its 2010 opinion on online behavioral advertising (2/2010)[36], its 2013 opinion on purpose limitation (2/2013)[37], and its 2010 opinion on the concepts of data controller and data processor (1/2010)[38].)
Whether or not businesses take the Working Party seriously, judges do, which is what matters when businesses find themselves facing sanctions for data misuse. This should demonstrate the value of closely abiding by the opinions of the Working Party. The requirements of European data protection law have been well illuminated by the public guidance of the Article 29 Working Party for over two decades, and that guidance is invaluable to businesses scrambling to comply with a body of law they have largely neglected hitherto.

Facebook can not reject users who refuse non-essential tracking

The Court ruled that Facebook cannot reject users who refuse to agree to tracking – unless the tracking in question is necessary for the service that a user explicitly requests from Facebook.[39] Instead, the Court ruled that users should be

given the option of refusing the placement of these cookies, in as far as this is not strictly necessary for a service explicitly requested by him, without his access to the Facebook.com domain being hereby limited or rendered more difficult.[40]

In December 2015, Facebook had blocked access to all Belgian users, following a court injunction that forbade it to place a (“Datr”) cookie without properly informing users.[41] Facebook attempted to justify this denial of service in a notice to users, which claimed that it could not provide the service because it was prohibited from taking measures (the unlawful tracking) to prevent unauthorized access to users’ Facebook accounts. The Court took a dim view of this:

The court concurs … that the systematic collection of the personal data of users and non-users via social plug-ins on the websites of third parties is not essential (let alone “strictly essential” in the sense of Article 129 ECA), or at least not proportional to the achievement of the safeguarding objective.[42]

The Court believed that Facebook’s purported fraud detection was insufficient in any case:

the systematic collection of safeguarding cookies is inadequate as a means of safeguarding, as it is easy to circumvent by persons with malicious intentions.[43]

Conclusion: fewer data, not more, will help Facebook in the EU

This ruling is one of several defeats Facebook has suffered in European courts in recent months. In January, the Berlin Regional Court ruled that Facebook’s approach to consent and terms is unlawful.[44] In April, the Irish High Court once again referred important aspects of Facebook’s trans-Atlantic transfers of personal data to the European Court of Justice for scrutiny.[45] It is likely that worse is to come, unless Facebook significantly changes its approach to data protection within the EU.
However, the company has options. As unlikely as it may seem now, one can foresee Facebook introducing ad targeting based on non-personal data to the newsfeed. This is likely to be necessary because Facebook will be unable to win lawful consent for some of its data processing purposes involving sensitive personal data (or for processing purposes involving regular personal data that are not “compatible” with purposes the user has already agreed to).[46]
This problem is likely to encompass all personalized advertising on the newsfeed, custom audiences, and social share buttons on other websites. Facebook must therefore have a way of targeting ads to non-consenting users. Non-personal data would allow this.
It may also become important for Facebook to be able to participate in a clean and safe data supply chain, which major advertisers are beginning to show concern about.[47]
In addition, Facebook will have to limit the use of custom audiences to situations where it is certain that the advertiser has a valid legal basis.
There is a broader lesson. Digital publishers and adtech vendors need to urgently reassess the use of personal data in programmatic advertising, and reflect on how adtech’s shaky consent systems will fare in Europe’s courts.

Notes

[1] The new terms mention personalization of ads. See “Terms of service”, Facebook (URL: https://www.facebook.com/legal/terms/update), accessed 2 May 2018.
The Terms also refer to the data policy, which elaborates that “we use the information we have about you – including information about your interests, actions and connections – to select and personalise ads, offers and other sponsored content that we show you.” The data policy also says “We use the information [including] the websites you visit and ads you see … to help advertisers and other partners measure the effectiveness and distribution of their ads and services, and understand the types of people who use their services and how people interact with their websites, apps and services”. “Data policy”, Facebook (URL: https://www.facebook.com/about/privacy/update), accessed 2 May 2018.
[2] The GDPR, Article 5, paragraph 1.
[3] The GDPR, Article 7, paragraph 2.
[4] The GDPR, Article 13, paragraph 1 and paragraph 2.
[5] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[6] “Getting feedback on new tools to protect people’s privacy”, Facebook, 1 May 2018 (URL: https://newsroom.fb.com/news/2018/05/clear-history-2/).
[7] See the GDPR, Article 6, Article 8, and Article 9.
[8] “Getting feedback on new tools to protect people’s privacy”, Facebook.
[9] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., Dutch-language Brussels Court of First Instance (Nederlandstalige Rechtbank van Eerste Aanleg te Brussel/Tribunal de Première Instance néerlandophone de Bruxelles – the “Court”), 16 February 2018, 2016/153/A (URL: https://pagefair.com/wp-content/uploads/2018/04/Belgian-Court-judgement.pdf).
[10] It has since changed its name to the Belgian Data Protection Authority.
[11] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 12.
[12] ibid., p. 9. See more detail on pp 49-51.
[13] ibid., p. 9.
[14] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 69.
[15] “Data leakage in online advertising”, PageFair (URL: https://pagefair.com/data-leakage-in-online-behavioural-advertising/).
[16] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 69.
[17] ibid.
Note that this raises the competition (antitrust) question, as Germany’s competition regulator, Andreas Mundt, has pointed out: “If Facebook has a dominant market position, then the consent that the user gives for his data to be used is no longer voluntary” (see https://www.reuters.com/article/us-facebook-privacy-germany/facebooks-hidden-data-haul-troubles-german-cartel-regulator-idUSKBN1HU108).
[18] ibid., p. 8.
[19] Which implemented the Data Protection Directive.
[20] Electronic Communications Act of 20 June 2005, which implemented the ePrivacy Directive.
[21] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 64.
[22] ibid., p. 73-4.
[23] ibid., p. 57
[24] ibid., p. 61.
[25] ibid., p. 58.
[26] See discussion of special categories of data in the newsfeed in “How the GDPR will disrupt Google and Facebook”, PageFair, 30 August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[27] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 59.
[28] See testimony by Chris Vickery at the UK Parliament Digital, Culture, Media and Sport Committee, Wednesday 2 May 2018 (URL: https://www.parliamentlive.tv/Event/Index/0cf92dd0-f484-4699-9e01-81c86acb880c).
[29] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 63.
[30] ibid., p. 70.
[31] World Bank, 2016.
[32] Eurostat, population on 1 January 2017 (URL: ec.europa.eu/eurostat/tgm/table.do?tab=table&plugin=1&language=en&pcode=tps00001)
[33] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 14, 70.
[34] Testimony of Mark Zuckerberg Chairman and Chief Executive Officer, Facebook, Hearing before the United States House of Representatives Committee on Energy and Commerce, 11 April 2018 (URL: https://www.c-span.org/video/?443490-1/facebook-ceo-mark-zuckerberg-testifies-data-protection&live&start=4929#).
[35] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., pp. 57-8, 61.
[36] ibid., p. 59.
[37] ibid., p. 60.
[38] ibid., p. 70.
[39] ibid., p. 13, 72.
[40] ibid., p. 72. See the Privacy Commission’s argument for this on p. 13.
[41] After an order from the Privacy Commission, which was backed up by a Court injunction. In 2015, the Privacy Commission ordered Facebook to, among other things, stop tracking non-users, using cookies and social plugins, without consent, and to do the same for users unless “strictly necessary for a service explicitly requested by the user” or unless it gets “unequivocal, specific consent”. It was also ordered to use consent requests that are unequivocal and specific. When Facebook failed to comply, this was followed by a court order in November 2015. Facebook responded by blocking access to users. See ibid., pp 4-7.
[42] ibid., p. 65-6.
[43] ibid., p. 67.
[44] Judgment of the Berlin Regional Court dated 16 January 2018, Case no. 16 O 341/15 (URL: https://pagefair.com/wp-content/uploads/2018/04/Berlin-Court-judgement-German.pdf)
[45] The High Court, Commercial, 2016, N. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximilian Schrems, Request for a preliminary ruling, Article 267 TFEU, 12 April 2018.
See also Judgement of Ms Justice Costello, The High Court, Commercial, 2016, No. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximillian Schrems, 3 October 2017.
Note, this is the second “Schrems” case. The first caused the end of the EU-US Safe Harbor agreement.
[46] See a discussion on Facebook and purpose limitation in “How the GDPR will disrupt Google and Facebook”, PageFair, 30 August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[47] “WFA Manifesto for Online Data Transparency”, World Federation of Advertisers, 20 April 2018 (URL: https://www.wfanet.org/news-centre/wfa-manifesto-for-online-data-transparency/). See also Stephan Loerke, WFA CEO, “GDPR data-privacy rules signal a welcome revolution”, AdAge, 25 January 2018 (URL: adage.com/article/cmo-strategy/gdpr-signals-a-revolution/312074/).

Google adopts non-personal ad targeting for the GDPR

This note examines Google’s recent announcement on the GDPR. Google has sensibly adopted non-personal ad targeting. This is a very significant step forward and signals a change in the online advertising market. But Google has also taken a new and problematic approach to consent for personal data use in advertising that publishers will find hard to accept.

Google decides to use non-personal ad targeting to comply with the GDPR 

Last Thursday Google sent a policy update to business partners across the Internet announcing that it would launch an advertising service based on non-personal data in order to comply with the GDPR.[1]
PageFair has advocated a non-personal approach to advertising for some time, and commends Google for taking this position. As we noted six months ago,[2] Google AdWords, for example, can operate without consent if it discards personalized targeting features (and unique IDs). In this case, advertisers can continue to target advertisements to people based on what they search for.
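To illustrate what search-based targeting without personal data can look like, here is a minimal sketch of keyword-matched ad selection. The campaign data and matching rule are invented for illustration, and this is not a description of AdWords internals; the point is that the decision depends only on the current query, never on a user ID or behavioural profile.

```python
# Minimal sketch of contextual, non-personal ad selection: the choice of ad
# depends only on the current search query, not on any user ID or profile.
CAMPAIGNS = {
    ("running", "shoes"): "https://example.com/ads/shoe-sale",
    ("car", "insurance"): "https://example.com/ads/insurance-quote",
}

def select_ad(search_query):
    """Return the first campaign whose keywords all appear in the query."""
    words = set(search_query.lower().split())
    for keywords, ad_url in CAMPAIGNS.items():
        if set(keywords) <= words:
            return ad_url
    return None  # no contextual match: fall back to a house ad, or show nothing

print(select_ad("best running shoes for winter"))  # -> the shoe-sale ad
```
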
This may be part of a trend for Google, which announced in mid 2017 that it would stop mining personal e-mails in Gmail to inform its ad targeting. Clearly, few users would have given consent for this.[3] Google’s latest announcement has signaled to advertisers the importance of buying targeted advertising without personalization.
Although Google’s “non-personalized ads” may seem promising to advertisers and publishers who are concerned about GDPR liability, more work must be done before they can be considered safe.
Unique tracking IDs are currently vital to Google’s ability to perform frequency capping and bot detection.[4] Meanwhile, data leakage is a problem caused by 3rd party ad creatives liberally loading numerous tracking pixels. Google has been silent on fixing these problems. Therefore, it may be that Google will merely target ads with non-personal data, but will continue to perform tracking as usual. Clarity on this point will be important for advertisers seeking safe inventory.

Problems with Google’s approach to consent for personal data

Despite its new non-personalized ads, Google is also attempting to build a legal basis under the GDPR for its existing personal data advertising business. It has told publishers that it wants them to obtain their visitors’ consent to “the collection, sharing, and use of personal data for personalization of ads or other services”.[5]
Note that the purpose here is “personalization of ads or other services”. This appears to be a severe conflation of the many separate processing purposes involved in advertising personalization.[6] The addition of “other services” makes the conflation even more egregious. As we previously observed in our note on the approach proposed by IAB Europe, this appears to be a severe breach of Article 5, which requires that consent be requested in a granular manner for “specified, explicit” purposes.[7] As noted in a previous PageFair note, European regulators have explicitly warned against conflating purposes in this way:

“If the controller has conflated several purposes for processing and has not attempted to seek separate consent for each purpose, there is a lack of freedom. This granularity is closely related to the need of consent to be specific …. When data processing is done in pursuit of several purposes, the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[8] 

Controller-controller 

Google is asking publishers to obtain consent from their visitors for it to be an independent controller of those users’ personal data.[9] Confusingly, Google has called this a “controller-controller” policy. This evokes “joint-controllership”, a concept in the GDPR that would force both Google and the publisher to jointly determine the purposes and means of processing, and to be transparent with each other.[10] However, what Google proposes is not joint-controllership, but rather independent controllership for the publisher on the one hand, and for Google on the other. Google’s “controller-controller” terms to publishers define each party as

“an independent controller of Controller Personal Data under the Data Protection Legislation; [that] will individually determine the purposes and means of its processing of Controller Personal Data”.[11]

It is not clear why a publisher would choose to do this, since it would enable Google to leverage that publisher’s audience across the entire web (severe conflation of purposes notwithstanding). The head of Digital Content Next, a publisher trade body that represents Disney, New York Times, CBS, and so forth, has already announced that there is “no way in hell” Google will be “co-controller” across publishers’ sites.[12]

Further problems with Google’s new approach to consent

Even if publishers did accept that Google could be a controller of their visitors’ data for its own purposes, it is unlikely that many visitors would give their consent for this.[13]
If, however, both a publisher and a visitor were to agree to Google’s controller-controller proposal, two further problems arise. First, when a publisher shares third party personal data with Google, Google’s terms require that the publisher “must use commercially reasonable efforts to ensure the operator of the third party property complies with the above duties [of obtaining adequate consent]”.[14] This phrase “commercially reasonable efforts” is not a meaningful defence in the event that personal data are unlawfully processed.
As one expert from a European data protection authority retorted when I researched this point: “Imagine this as legal defence line: ‘We did not obtain consent because it wasn’t possible with commercially reasonable efforts’?” The Regulation is clear that “each controller or processor shall be held liable for the entire damage”, where more than one controller or processor are “involved in the same processing”.[15]
Second, Google’s policy puts what appears to be an impossible burden on the publisher. It requires that the publisher accurately inform the visitor about how their data will be used if they give consent.

“You must clearly identify each party that may collect, receive, or use end users’ personal data as a consequence of your use of a Google product. You must also provide end users with prominent and easily accessible information about that party’s use of end users’ personal data”.[16]

However, the publisher does not know what personal data Google shares with its own business partners. Nor does it know for what purposes these parties process data about its visitors. So long as this continues, a publisher cannot be in a position to inform its visitors of what will be done with their data. The result is very likely to be a breach of Article 6[17] and Article 13[18] of the GDPR.
Giving Google the benefit of the doubt, this may change before 25 May. Google says it plans to publish some information about its own uses of information, and that it is “asking other ad technology providers with which Google’s products integrate to make available information about their own uses of personal data.”[19] Publishers will not be well served by any further delay in the provision of this information.

Risks for Google 

Google’s decision to rely on non-personal data for ad targeting is highly significant, and will enable the company and advertisers that work with it to operate under the GDPR. However, Google’s new consent policy is fraught with issues that make it impossible for publishers to adopt. Our GDPR risk scale, first published for Google in August 2017, remains unchanged.


Perimeter is a robust regulatory firewall. It preemptively blocks unauthorized requests from 3rd parties, and tightly controls personal data on your website and app. It protects you, your advertising business, and your users. Perimeter makes sure that consent means something.


Notes

[1] “Changes to our ad policies to comply with the GDPR”, Google Inside AdWords, 22 March 2018 (URL: https://adwords.googleblog.com/2018/03/changes-to-our-ad-policies-to-comply-with-the-GDPR.html).
[2] “How the GDPR will disrupt Google and Facebook”, PageFair Insider, August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[3] ibid.
[4] For alternative methods of performance measurement and reporting see “Frequency capping and ad campaign measurement under GDPR”, PageFair Insider, November 2017 (URL: https://pagefair.com/blog/2017/gdpr-measurement1/).
[5] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[6] See discussion of data processing purposes in online behavioural advertising, and the degree of granularity required in consent, in “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[7] The GDPR, Article 5, paragraph 1, b, and note reference to the principle of “purpose limitation”. See also Recital 43. For more on the purpose limitation principle see “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013.
[8] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[9] “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[10] See The GDPR, Article 26.
[11] Clause 4.1 of “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[12] Jason Kint, Twitter, 22 March 2018 (URL: https://twitter.com/jason_kint/status/976928024011726848)
[13] “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[14] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[15] The GDPR, Article 82, paragraph 4.
[16] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[17] The GDPR, Article 6, paragraph 1, a.
[18] The GDPR, Article 13, paragraph 2, f, and Recital 60.
[19] “Help with the EU user consent policy”, Google (URL:https://www.google.com/about/company/consenthelpstaging.html)

Risks in IAB Europe’s proposed consent mechanism

This note examines the recently published IAB “transparency and consent” proposal. Major flaws render the system unworkable. The real issue is what should be done with the vast majority of the audience who will not give consent. 

Publishers would have no control (and are expected to blindly trust 2,000+ adtech companies)

The adtech companies[1] who drafted the IAB Europe proposal claim that “publishers have full control over who they partner with, who they disclose to their users and who they obtain consent for.”[2] But the IAB Europe documentation shows that adtech companies would remain entirely free to trade the personal data with their business partners if they wish. The proposed system would share a unique[3] consent record “throughout the online advertising ecosystem”, every time an ad is loaded on a website:[4]

“the OpenRTB request [from a website to an ad exchange] will contain the entire DaisyBit [a persistent cookie],[5] allowing a vendor to see which other vendors are an approved vendor or a publisher and whether they have obtained consent (and for which purposes) and which have not.”[6]

There would be no control over what happens to personal data once they enter the RTB system: “[adtech] vendors may choose not to pass bid requests containing personal data to other vendors who do not have consent”.[7] This is a critical problem, because the overriding commercial incentive for many of the companies involved is to share as many data as possible with as many partners as possible, and to share them with parent companies that run data brokerages. In addition, publishers are expected to trust that JavaScript in “ad creatives” is not dropping trackers, even though the proposal offers no tools to police this.
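To make the mechanism concrete, a bid request of this kind might look roughly as follows. The field names follow the public OpenRTB 2.x specification and the IAB’s GDPR advisory, but the values are invented; the point is that the consent record travels in the same broadcast as the IP address, page URL, device details, and user ID.

```python
# Illustrative OpenRTB-style bid request (invented values; field names follow
# the public OpenRTB 2.x spec and the IAB GDPR advisory). Every vendor that
# receives the request sees both the personal data and the consent record.
bid_request = {
    "id": "b7c9e0f2-1234-4d5e-8f90-abcdef012345",
    "site": {"domain": "news-example.com", "page": "https://news-example.com/health/article"},
    "device": {
        "ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",  # browser and OS
        "ip": "203.0.113.42",
        "geo": {"country": "BEL"},
    },
    "user": {
        "id": "7d1e9a2c55f04b1e",                    # pseudonymous user ID
        "ext": {"consent": "BOEFEAyOEFEAyAHAB..."},  # the encoded consent record ("DaisyBit")
    },
    "regs": {"ext": {"gdpr": 1}},                    # flags that the GDPR applies
    "imp": [{"id": "1", "banner": {"w": 300, "h": 250}}],
}
```
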
IAB Europe is asking publishers and brands to expose themselves to the legal risk of routinely sharing these personal data with several thousand adtech companies. What publishers and brands need is a “trust no one” approach. IAB Europe is proposing a “trust everyone” approach. Indeed, the proposed system looks like the GDPR’s description of a data breach:

“a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”.[8]

Publishers have no control over personal data once they send them into the RTB system. All publishers have is liability.

“OK to everything” jeopardises the publisher’s own opt-ins

The proposed system would also jeopardise the chance of websites obtaining essential opt-ins for their own data processing purposes, such as commenting widgets and video players. IAB Europe proposes that websites bundle all consent under a single “OK”/”Accept all” button. Our wireframe below shows the text and buttons recommended by IAB Europe.[9]

Broadly speaking, websites might expect to receive consent from four out of every five users for their own data processing.[10] The opt-in rate for ad tech tracking is tiny in comparison: our research found that only 3% of people say they would opt in to 3rd parties tracking them across the web for the purposes of advertising,[11] and IAB Europe’s commissioned research found that only 20% would do so.[12] The ad tech vendors who drafted the IAB Europe proposal have an incentive to ask publishers to take risk on their behalf: they must realize that there is no chance that Internet users will agree to the cascade of opt-ins that the GDPR requires.[13] A website would be ill-advised to jeopardise its own consent requests in a vain effort to get consent for ad tech companies, particularly if those ad tech companies plan to use that same consent to work with the website’s competitors.

Conflation and other matters of presentation

The proposal appears to breach Article 5, Article 6, and Article 13 of the GDPR, for several reasons.
First, Article 5 requires that consent be requested in a granular manner for “specified, explicit” purposes.[14] Instead, IAB Europe’s proposed design bundles together a host of separate data processing purposes under a single opt-in. A user must click the “Manage use of your Data” button in order to view four slightly less general opt-ins, and the companies[15] requesting consent. These opt-ins also appear to breach Article 5, because they too conflate multiple data processing purposes into a very small number of ill-defined consent requests. For example, a large array of separate ad tech consent requests[16] are bundled together in a single “advertising personalisation” opt-in.[17] European regulators have explicitly warned against conflating purposes:

“If the controller has conflated several purposes for processing and has not attempted to seek separate consent for each purpose, there is a lack of freedom. This granularity is closely related to the need of consent to be specific …. When data processing is done in pursuit of several purposes, the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose.”[18]

Second, the text that IAB Europe proposes publishers display for the “advertising personalisation” opt-in appears to be a severe breach of Article 6[19] and Article 13[20] of the GDPR. In a single 49-word sentence, the text conflates several distinct purposes, and gives virtually no indication of what will be done with the reader’s personal data.

“Advertising personalisation allow processing of a user’s data to provide and inform personalised advertising (including delivery, measurement, and reporting) based on a user’s preferences or interests known or inferred from data collected across multiple sites, apps, or devices; and/or accessing or storing information on devices for that purpose.”[21]

This fails to disclose that hundreds, and perhaps thousands, of companies will be sent your personal data. Nor does it say that some of these companies will combine these data with a profile they have already built about you. Nor are you told that this profile includes things like your income bracket, age and gender, habits, social media influence, ethnicity, sexual orientation, religion, political leaning, etc. Nor do you know whether some of these companies will sell their data about you to other parties, such as online marketers, credit scoring agencies, insurance companies, background checking services, and law enforcement.
Third, a person must say yes or no for all or none of the companies listed as data controllers.[22] Since one should not be expected to trust all controllers equally, and since it is unlikely that all controllers apply equal safeguards of personal data, we suspect that this “take it or leave it” choice will not satisfy regulatory authorities.
Fourth, there appears to be no way to easily refuse to opt-in to the consent request that IAB Europe proposes, which would also breach the GDPR.[23] It is possible that this last point is simply an accidental oversight in the drafting of IAB Europe’s documentation.

Conclusion: What about the people (80%-97%) who don’t opt in?

The proposed system has no plan to make consent meaningful, by giving publishers and data subjects control over what happens to personal data. Nor does it have a plan for what happens when users do not give consent. It is time for the discussion to move on.
As the CEO of Digital Content Next, a major publisher trade body, recently told members, “GDPR will create opportunity for audience selection based on cohorts and context”.[24] Non-personal data such as these are the only way for the industry to approach the GDPR.
PageFair has recently announced Perimeter, a regulatory firewall that enables websites (and apps) to protect their ad business, run direct campaigns, and use RTB without risk under the GDPR. It prevents unauthorized connections from 3rd parties, so that personal data can not leak through the RTB system, or anywhere else. (For extra peace of mind, PageFair’s SSP delivers guaranteed compliant programmatic display advertising.) This is the consent-free approach.
We also believe that consent has a role. The next chapter for online advertising will be written by publishers who use consent-free RTB, and build up consenting audiences for premium advertising too.
Note: thanks to Andrew Shaw at PageFair. 




Notes

[1] AppNexus Inc.; Conversant, LLC; DMG Media Limited; Index Exchange, Inc.; MediaMath, Inc.; Oath, Inc.; Quantcast Corp.; and, Sizmek, Inc. are named in the copyright notice of “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe (URL: URL-shortened), p. 3.
Note: PageFair is a member of IAB TechLab, and IAB UK.
[2] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018 (URL: http://advertisingconsent.eu/wp-content/uploads/2018/03/Transparency_Consent_Framework_FAQ_Formatted_v1_8-March-2018.pdf), p. 8.
[3] Our statistical examination of the data in the cookie showed a very high degree of uniqueness. The proposed cookie is itself a tracking cookie. See the specification of the cookie in “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe, pp 8 – 10.
[4] ibid., p. 3
[5] ibid., p. 8.
To see the content of the proposed consent cookie, see http://gdpr-demo.labs.quantcast.com/user-examples/cookie-workshop.html.
It is envisaged that the record may be server-based in the future, because this will work better. See p. 7.
[6] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 9.
[7] ibid., p. 10. And from the same page, when an adtech company gets personal data without consent, IAB Europe asks it “to only act upon that data if it has another applicable legal basis for doing so”.
[8] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Article 4, paragraph 12.
[9] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 13.
[10] 20% would accept first party tracking only. An additional 56% would accept tracking that is strictly necessary for services they have requested. 5% say they would accept all tracking.
See “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[11] ibid.
[12] “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7. (URL: https://www.iabeurope.eu/wp-content/uploads/2017/09/EuropeOnline_FINAL.pdf).
[13] “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[14] The GDPR, Article 5, paragraph 1, b, and note reference to the principle of “purpose limitation”. See also Recital 43. For more on the purpose limitation principle see “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013.
[15] Note that the Article 29 Working Party very recently warned that this alone might be enough to render consent invalid: “when the identity of the controller or the purpose of the processing is not apparent from the first information layer of the layered privacy notice (and are located in further sub-layers), it will be difficult for the data controller to demonstrate that the data subject has given informed consent, unless the data controller can show that the data subject in question accessed that information prior to giving consent”.
Quote from “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017 (URL: https://pagefair.com/wp-content/uploads/2017/12/wp259_enpdf.pdf), p. 15, footnote 39.
[16] See discussion of data processing purposes in online behavioural advertising, and the degree of granularity required in consent, in “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[17] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 18.
[18] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[19] The GDPR, Article 6, paragraph 1, a.
[20] The GDPR, Article 13, paragraph 2, f, and Recital 60.
[21] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 18.
[22] “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe, p. 5.
This is apparently “due to concerns of payload size and negatively impacting the consumer experience, a per-vendor AND per-purpose option is not available”, p. 22.
[23] The Regulation is clear that “consent should not be regarded as freely given if the data subject has no genuine or free choice”. The GDPR, Recital 42. See also, Article 4, paragraph 11.
[24] Jason Kint, “Why the IAB GDPR Transparency and Consent Framework is a non-starter for publishers”, Digital Content Next, 19 March 2018 (URL: https://digitalcontentnext.org/blog/2018/03/19/iab-gdpr-consent-framework-non-starter-publishers/)

PageFair writes to all EU Member States about the ePrivacy Regulation

This week PageFair wrote to the permanent representatives of all Member States of the European Union in support of the proposed ePrivacy Regulation.
Our remarks were tightly bounded by our expertise in online advertising technology. We do not have an opinion on how the proposed Regulation will impact other areas.
The letter addresses four issues:

  1. PageFair supports the ePrivacy Regulation as a positive contribution to online advertising, provided a minor amendment is made to paragraph 1 of Article 8.
  2. We propose an amendment to Article 8 to allow privacy-by-design advertising. This is because the current drafting of Article 8 will prevent websites from displaying privacy-by-design advertising.
  3. We particularly support the Parliament’s 96th and 99th amendments. These are essential to enable standard Internet Protocol connections to be made in many useful contexts that do not impact privacy.
  4. We show that tracking is not necessary for the online advertising & media industry to thrive. As we note in the letter, behavioural online advertising currently accounts for only a quarter of European publishers’ gross revenue.

The letter is available at https://pagefair.com/wp-content/uploads/2018/03/PageFair-letter-on-ePrivacy-to-perm-reps-13-March-2018.pdf.

The digital economy requires a foundation of trust to enable innovation and growth. The enormous growth of adblocking (to 615 million active devices) across the globe proves the terrible cost of not regulating. We are witnessing the collapse of the mechanism by which audiences support the majority of online news reports, entertainment videos, cartoons, blogs, and cat videos that make the Web so valuable and interesting. Self-regulation, lax data protection and enforcement have resulted in business practices that promise a bleak future for European digital publishers.
Therefore, we commend the Commission and Parliament’s work thus far, and wish the Council (of Ministers of the Member States) well in their deliberations.

Adtech must change to protect publishers under the GDPR (IAPP podcast)

The follow-up to the International Association of Privacy Professionals’ most listened-to podcast of 2017.
Angelique Carson of the International Association of Privacy Professionals quizzes PageFair’s Dr Johnny Ryan on the crisis facing publishers, as they grapple with adtech vendors and attendant risks ahead of the GDPR. The podcast covers:

  • Why personal data can not be used without risk in the RTB/programmatic system under the GDPR.
  • Where consent falls short for publishers.
  • How vulnerable the online advertising system is, because of central points of legal failure.
  • The GDPR is part of a global trend. New privacy standards are on the way in other massive markets including China (and in important tech ecosystems such as Apple iOS, Firefox).

This is the follow up to an earlier IAPP and PageFair podcast discussion (which was the International Association of Privacy Professionals’ most listened to podcast of 2017).

Listen at IAPP: https://iapp.org/news/a/the-privacy-advisor-podcast-johnny-ryan-on-the-continuing-crisis-ad-tech-faces/

Listen on iTunes: https://itunes.apple.com/us/podcast/the-privacy-advisor-podcast/id1095382766?mt=2#


PageFair's long letter to the Article 29 Working Party

This note discusses a letter that PageFair submitted to the Article 29 Working Party. The answers may shape the future of the adtech industry. 
Eventually the data protection authorities of Europe will gain a thorough understanding of the adtech industry, and enforce data protection upon it. This will change how the industry works. Until then, we are in a period of uncertainty. Industry can not move forward, and business can not flourish. Limbo does not serve the interests of publishers. Therefore we press for certainty.
This week PageFair wrote a letter to the Article 29 Working Party presenting insight on the inner workings of adtech, warts and all.
Our letter asked the working party to consider five questions. We suspect that the answers may shape the future of the adtech industry.

  1. We asked for further guidance about two issues that determine the granularity of consent required. First, we asked what the scope of a single “purpose” for processing personal data is. Since one must have a legal basis for each purpose, a clear understanding of the scope of an individual purpose is important to determine the number of purposes, and thus the number of granular opt-ins required.
  2. The second question about granularity of consent asked whether multiple controllers that pursue identical purposes should be unbundled from each other. In other words, should consent be requested not only per purpose, but per controller too? This is important because it should not be assumed that a person trusts all data controllers equally. Nor is it likely that all controllers apply equal safeguards of personal data. Therefore, we asked whether it was appropriate to bundle multiple controllers together in a single consent request, without the opportunity to accept some but not all.
  3. We asked for guidance on how explicit consent operates for websites and apps, where a controller wishes to process special categories of personal data. Previously the Working Party cited the double opt-in as a method of explicit consent for e-mail marketing. We presented wireframes of how this might operate on web and mobile.
  4. We asked for clarification that all unique identifiers are personal data. This is important because the presence of a unique ID enables the combining of data about the person associated with that unique ID, even if the party that originally assigned the unique ID did so randomly, without any understanding of who the data subject is. (A brief illustration follows this list.)
  5. We asked for guidance on how Article 13 of the GDPR applies to non-tracking cookies (which contain no personal data), as opposed to cookies that do contain personal data. This is important because some paragraphs of this article were intended to apply to personal data and are not appropriate for non-personal data.
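
As a brief, invented illustration of the point in question 4: a unique identifier that was assigned entirely at random is still enough to join separately collected datasets into a richer profile of the same person.

```python
# Invented data: the identifier "a81f3c" was assigned at random, yet it is
# enough to join two separately collected datasets about the same person.
browsing_log = {"a81f3c": ["diabetes-symptoms.example", "loan-refinance.example"]}
purchase_log = {"a81f3c": ["pregnancy test", "nicotine patches"]}

profiles = {
    uid: {"sites": sites, "purchases": purchase_log.get(uid, [])}
    for uid, sites in browsing_log.items()
}
print(profiles["a81f3c"])
# The ID never needs to "mean" anything for the combined record to be revealing.
```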

In addition to these questions we made three statements.

  1. Websites, apps, and adtech vendors leak personal data to unknown parties in routine advertising operations (via “RTB” bid requests, cookie syncs, JavaScript ad units, mobile SDKs, and other 3rd party integrations). This is preventable.
  2. We noted our support for the Working Party’s view that the GDPR forbids the demanding of consent for 3rd party tracking that is unrelated to the provision of an online service.
  3. It is untenable for any publisher, adtech vendor, or trade body, to claim that they must use personal data for online advertising. As we and others have shown, sophisticated adtech can work without personal data.

The full letter is available here.

PageFair Trusted Partners To Join GDPR Compliance Initiative

This note announces an initiative among adtech companies to keep online advertising operations outside the scope of the GDPR by using no personal data.
Dublin, Ireland (24 January, 2018) – PageFair has announced a joint initiative with eight other advertising companies to help equip website and app publishers with new ways of advertising that fully comply with Europe’s new GDPR. Among the members are Adzerk, Bannerflow, Bydmath, Clearcode, Converge Digital, Digitize, SegmentIQ, and Velocidi. The EU’s new privacy regulations will prohibit the kind of online tracking that has powered advertising up to now, unless every user gives explicit consent to the companies that track them. Publishers, advertisers and tech companies who ignore the regulation could face fines of up to €20 million or 4% of their global turnover. According to PageFair, a growing number of ad tech companies are realising that tracking individual users is a convenience they can live without, and are preparing tracking-free versions of their products for the European market.
In December, PageFair announced a related product called “Perimeter”, which acts as a regulatory firewall for websites and apps, blocking potential tracking by default, while whitelisting approved “Trusted Partners”. PageFair CEO Sean Blanchfield said “This initial group is just a small sample of the companies who are committed to rewiring the advertising supply chain to work without personal data. Together, we have already designed alternative ways to provide advertisers with frequency capping, measurement and targeting, all without tracking individuals. We look forward to welcoming new members to our initiative, more privacy innovation, and to whitelisting all GDPR-compliant partners by default in the PageFair Perimeter platform”.
PageFair’s zero-tracking strategy contrasts with many other advertising technology companies, who are focusing on obtaining consent from each of Europe’s 512 million consumers, or who are hoping for leniency in how the law will be enforced. According to Blanchfield, “Consent has a role to play, but we already know that it will be hard-won and easily lost, and that the majority of online media revenue in Europe now depends on finding ways to advertise without depending on personal data in the first place. Anyone who thinks that it’s business-as-usual this Summer will find that leniency is even harder to come by than consent.”
James Avery, CEO of Adzerk, said “What Adzerk is doing with PageFair means that advertisers will now be able to run digital ads safely, free from the large legal risks introduced by the GDPR”.
Nicholas Höglund, CEO of Bannerflow, said “At Bannerflow we’re committed to delivering a non-personal data solution for our clients and respecting the privacy of EU citizens. Serving millions of ads every week, this is core for us. We are working hard to develop a platform that can deliver valuable insight without any need to track individuals.”
Maciej Zawadziński, CEO of Clearcode, said “Clearcode views PageFair’s Perimeter as a much-needed solution for publishers to comply with the GDPR and an end to the out-of-control data collection processes in the current AdTech ecosystem. As a Perimeter Trusted Partner, Clearcode is committed to helping AdTech vendors and publishers become whitelisted by PageFair’s Perimeter and comply with other areas of the GDPR through our custom AdTech development services”.
David Dunne, CEO of Velocidi, said “GDPR is forcing brands, agencies and publishers to take greater control of their data. PageFair has created an ingenious way to facilitate GDPR compliance. Perimeter helps publishers manage data without leaving compliance entirely up to consumer consent”.
Publishers, advertisers and technology companies who want to find out more about Perimeter, or how to get involved, can do so at http://pagefair.com/perimeter
https://adzerk.com
https://www.bannerflow.com
https://www.bydmath.com
https://clearcode.cc
http://converge-digital.com
http://www.digitize.ie
http://segmentiq.com
https://www.velocidi.com
 

Further detail

1. Perimeter integrates with websites and apps, and acts as a regulatory firewall. It blocks all 3rd party personal data access. No unique IDs, or any other personal data, are allowed unless adequate consent has been given. This puts the publisher and their advertisers in a zero-risk position because they are processing no personal data.
2. Adtech partners can be automatically whitelisted by Perimeter, provided they use no unique IDs or any other personal data unless adequate consent has been given.
3. Perimeter Trusted Partners can include SSPs, ad servers, analytics, DMPs, DSPs, SDKs in mobile, etc., provided they do not expose the publisher and advertiser to risk under the GDPR by using unique IDs or any other personal data unless adequate consent has been given.
4. PageFair is sharing know-how about how to perform adtech functions without personal data. (For example, RTB bid requests based on non-personal segments, frequency capping using campaign IDs rather than unique IDs, etc.; see the sketch after this list.)
5. Avoiding the use of personal data means that no consent is required for the processing of personal data. This ultra privacy-by-design adtech places publishers, advertisers, and Perimeter Trusted Partners outside the scope of the GDPR.
6. If the publisher does seek and get appropriate consent, then the Perimeter system will permit personal data to be processed.
7. What this means for publishers and buyers is that they can sell and buy online advertising in direct sold campaigns and in programmatic without any risk under the GDPR. This is the start of a clean data adtech stack.
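By way of illustration only, and not as a description of how Perimeter is actually built, the TypeScript sketch below shows how frequency capping might be keyed on a campaign ID held in first-party browser storage, so that no unique user identifier is ever created or transmitted. The cap value, storage key format, campaign name, and function names are assumptions made for the example.

// Hypothetical sketch: frequency capping without unique user IDs.
// The counter is keyed by campaign ID and kept in the browser's own
// first-party storage, so no identifier ever leaves the device.
const IMPRESSION_CAP = 3; // assumed maximum impressions per campaign per browser

function canShowAd(campaignId: string): boolean {
  const seen = Number(localStorage.getItem("freq_" + campaignId) ?? "0");
  return seen < IMPRESSION_CAP;
}

function recordImpression(campaignId: string): void {
  const seen = Number(localStorage.getItem("freq_" + campaignId) ?? "0");
  localStorage.setItem("freq_" + campaignId, String(seen + 1));
}

// Usage: before requesting a creative for the (hypothetical) campaign "spring-sale".
if (canShowAd("spring-sale")) {
  recordImpression("spring-sale");
  // ...request and render the ad...
}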
 
***
 

Media Contact

Dr Johnny Ryan
press@pagefair.com

GDPR's non-tracking cookie banners

This note outlines how an anomaly in European law will impact cookie storage and presents wireframes of permission requests for non-tracking cookies. 
Online media will soon find itself in an anomalous position. It will be necessary to apply the GDPR’s consent requirements to cookies that reveal no personal data, even though the GDPR was not intended to be applied in this way.[1]
Recital 26 of the GDPR says that “the principles of data protection should … not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person…”.[2]
Even so, a hiccup in the choreography of European law-making is creating an unexpected situation in which the GDPR’s conditions will apply to cookies that reveal or contain no personal data.
The Data Protection Directive currently sets out the conditions under which consent should be sought for the storage of cookies.[3] However, this Directive will be repealed on 25 May 2018, before the forthcoming ePrivacy Regulation introduces new conditions for cookie consent.[4]
The Commission had intended that both the GDPR (which repeals the Data Protection Directive) and the ePrivacy Regulation (which updates cookie consent conditions) would be applied on the same date. But now that the ePrivacy Regulation is considerably delayed, the GDPR provision stating that references to the Data Protection Directive “shall be construed as references to this Regulation” means that the GDPR’s consent conditions will also apply to cookies containing only non-personal data.[5]
Non-personal data are data that can not be related to an identifiable person. For example, there is no unique identifier, the data could relate to many people, and could not be used to single out an individual. As the European Court of Justice said in 2016, data are not personal “if the identification of the data subject was prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and manpower, so that the risk of identification appears in reality to be insignificant”.[6]
The GDPR way of asking for consent does not neatly apply to data such as these, that are not personal. For example, the language of the GDPR’s requirements for consent refers explicitly to personal data concepts. Consider some of the important terms: “processing” is “any operation or set of operations which is performed on personal data or on sets of personal data…”.[7] The word “processing” does not have this meaning where personal data are absent. Nor does the word “controller”, because a controller is “the natural or legal person … which … determines the purposes and means of the processing of personal data…”. [8] Similarly, “profiling” is “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects…”[9].

Less friction

Therefore, although the GDPR provides for a very high standard of information to be presented with consent requests, as elaborated in a previous PageFair Insider note,[10] there is considerably less friction when using the GDPR requirements to request storage permission for data that are not personal.
The following table shows what elements are relevant when the GDPR’s requirements for consent are applied to cookies that neither contain nor reveal personal data, as opposed to when they are applied to any processing of personal data.

Information to accompany consent requests
GDPR consent requirements – items listed in Article 13 | Cookies where there are no personal data | Any processing of personal data
the identity and the contact details of the controller[11] and, where applicable, of the controller’s representative;[12] | N/A (there is no controller) | Yes (where applicable)
the contact details of the data protection officer, where applicable;[13] | N/A (there are no personal data) | Yes (where applicable)
the purposes of the processing for which the personal data are intended as well as the legal basis for the processing;[14] | N/A (there are no personal data being processed) | Yes
where the processing is based on point (f) of Article 6(1), the legitimate interests pursued by the controller or by a third party; | N/A | N/A
the recipients or categories of recipients of the personal data, if any;[15] | N/A (there are no personal data being shared) | Yes (where applicable)
where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available.[16] | N/A (there are no transfers of personal data) | Yes (where applicable)
the period for which the personal data will be stored, or if that is not possible, the criteria used to determine that period;[17] | N/A (there is no storage of personal data) | Yes
the existence of the right to request from the controller access to and rectification or erasure of personal data or restriction of processing concerning the data subject or to object to processing as well as the right to data portability;[18] | N/A (there are no personal data) | Yes
where the processing is based on point (a) of Article 6(1) or point (a) of Article 9(2), the existence of the right to withdraw consent at any time, without affecting the lawfulness of processing based on consent before its withdrawal;[19] | N/A (there is no processing of personal data) | Yes
the right to lodge a complaint with a supervisory authority;[20] | Yes | Yes
whether the provision of personal data is a statutory or contractual requirement, or a requirement necessary to enter into a contract, as well as whether the data subject is obliged to provide the personal data and of the possible consequences of failure to provide such data;[21] | N/A (there are no personal data) | Yes (where applicable)
the existence of automated decision-making, including profiling,[22] referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.[23] | N/A (there are no personal data) | Yes (where applicable)

As the table shows, the requirements for consent are considerably less demanding when used to request storage permission for non-personal data, such as non-tracking cookies. This is because the GDPR was not intended to be applied in this manner. Below is a wireframe of a “storage permission” dialogue.

Storage permission

In this simple wireframe the question mark button reveals two informational buttons. 
The “my data rights” button provides information about how to lodge a complaint with the supervisory authorities, which is required under Article 13, paragraph 2, d. The “What is stored” button describes the non-personal data stored on the device, providing assurance to the user that their consent will not impact their fundamental right to privacy or their fundamental right to data protection. 
Note that this only applies where publishers and their adtech vendors scrupulously avoid the collection and any other processing of personal data, including all unique identifiers, as Perimeter Trusted Partners do. Otherwise, the GDPR’s consent requirements apply as normal.
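As a rough sketch only, and not a prescribed design, the storage permission dialogue described above might be wired up along the following lines in TypeScript; the copy, element IDs, and the contents of the two informational panels are assumptions for the example.

// Hypothetical storage permission dialogue, following the wireframe above.
function showStoragePermissionDialogue(onDecision: (granted: boolean) => void): void {
  const box = document.createElement("div");
  box.innerHTML = `
    <p>May this site store a small, non-tracking cookie on your device?
       It contains no personal data and cannot identify you.</p>
    <button id="sp-accept">Accept</button>
    <button id="sp-refuse">Refuse</button>
    <button id="sp-what">What is stored?</button>
    <button id="sp-rights">My data rights</button>`;
  document.body.appendChild(box);
  box.querySelector("#sp-accept")!.addEventListener("click", () => onDecision(true));
  box.querySelector("#sp-refuse")!.addEventListener("click", () => onDecision(false));
  // "What is stored" describes the non-personal data kept on the device.
  box.querySelector("#sp-what")!.addEventListener("click", () =>
    alert("Stored: an A/B test group label and a frequency-capping counter. No identifiers."));
  // "My data rights" explains the right to lodge a complaint with a supervisory authority (Article 13, paragraph 2, d).
  box.querySelector("#sp-rights")!.addEventListener("click", () =>
    alert("You may lodge a complaint with your national data protection authority."));
}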

The future

This anomalous situation will change when the ePrivacy Regulation is applied at some point in 2018 or later. The question is whether enough sensible pro-privacy businesses and NGOs will make the case for non-tracking cookies in the new Regulation. In late 2017 PageFair wrote to Members of the European Parliament to argue the case for permitting non-tracking cookies under the ePrivacy Regulation.[24] Our argument was that websites need a means to store information in order to operate, even for ancillary operations that their visitors do not request (such as A/B testing), without bothering their users. Certainly, consent is essential where personal data are concerned, or where there exists the possibility to access communications information, for example, or private photo albums. But where non-tracking cookies are concerned, there must be an easier way. Unless there is some provision for protecting the humble non-tracking cookie, websites’ ability to smoothly transition to privacy-by-design advertising will be harmed.

Notes

[1] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Article 2, paragraph 1, notes the material scope of the Regulation: “This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.”
[2] ibid., Recital 26.
[3] This is because the ePrivacy Directive, Article 2, paragraph f, and Recital 17, say that consent under the ePrivacy Directive should have the same meaning as previously defined in the Data Protection Directive.
[4] Article 94 of the GDPR repeals Directive 95/46/EC (the Data Protection Directive).
The ePrivacy Directive, Recital 17, says that “For the purposes of this Directive, consent of a user or subscriber, regardless of whether the latter is a natural or a legal person, should have the same meaning as the data subject’s consent as defined and further specified in Directive 95/46/EC. Consent may be given by any appropriate method enabling a freely given specific and informed indication of the user’s wishes, including by ticking a box when visiting an Internet website.”
The ePD Article 2, (f) says “‘consent’ by a user or subscriber corresponds to the data subject’s consent in Directive 95/46/EC”.
[5] The GDPR, Article 94, paragraph 2, says that references to the Data Protection Directive “shall be construed as references to this Regulation [the GDPR]”.
[6] Judgment of the Court (Second Chamber) Patrick Breyer v Bundesrepublik Deutschland, Case C-582/14, 19 October 2016.
[7] The GDPR, Article 4, paragraph 2.
[8] ibid., Article 4, paragraph 7.
[9] ibid., Article 4, paragraph 4.
[10] “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, 8 January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[11] Note that the GDPR defines “controller” as an entity concerned with personal data. The definition in Article 4, paragraph 7, begins: “the natural or legal person … which … determines the purposes and means of the processing of personal data…”.
[12] The GDPR, Article 13, paragraph 1, a.
[13] ibid., Article 13, paragraph 1, b.
[14] ibid., Article 13, paragraph 1, c.
[15] ibid., Article 13, paragraph 1, e.
[16] ibid., Article 13, paragraph 1, f.
[17] ibid., Article 13, paragraph 2, a.
[18] ibid., Article 13, paragraph 2, b.
[19] ibid., Article 13, paragraph 2, c.
[20] ibid., Article 13, paragraph 2, d.
[21] ibid., Article 13, paragraph 2, e.
[22] Note that “profiling” is defined in the GDPR as a processing of personal data. The definition in Article 4, paragraph 4 begins: “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects…”
[23] ibid., Article 13, paragraph 2, f.
[24] PageFair to European Parliament ePrivacy rapporteurs, 5 July 2017, re “non-tracking cookies in the ePrivacy Regulation” (URL: https://pagefair.com/blog/2017/non-tracking-cookies/).
[25] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 20.

How to audit your adtech vendors' GDPR readiness (and a call to adtech vendors to get whitelisted as Trusted Partners)

This note describes how publishers can audit their adtech vendors’ readiness for the GDPR, and opens with a call for adtech vendors to collaborate with PageFair so that they can be whitelisted as Trusted Partners by PageFair Perimeter. 

How adtech and media will work under the GDPR

We anticipate that the GDPR will indeed be enforced, whether by national regulators or by NGOs or individuals in the courts. We also realise that consent is the only applicable legal basis for online behavioural advertising (See analysis). Personal data can not be processed for OBA in the absence of consent.

However, consent dialogues for adtech need a “next” button (or a very long scroll bar) because online behavioural advertising requires many different opt-ins to accommodate many distinct personal data processing purposes. How many people will click OK 10+ times? (See analysis).

Even if people do repeatedly opt in for various adtech processing purposes, that consent will be inconsequential if there is continued widespread leakage of personal data through RTB bid requests, JavaScript in ads, mobile SDKs, and assets loaded from 3rd parties. Consent only has meaning if one prevents the personal data from falling into the hands of parties that do not have consent (See discussion).

Therefore, it is essential to plan for the eventuality that very few, if any, people will provide the ten or more opt-ins required to cover the diverse range of data processing purposes conducted by today’s online behavioral advertising ecosystem.

The only way to remove all risk of fines and lawsuits for publishers, advertisers, and adtech vendors is to use no personal data at all, unless one has consent. This would put routine advertising outside the scope of the GDPR, with no controllers, processors, or personal data breaches.

PageFair Perimeter is working with a group of adtech vendors who can provide publishers with direct and programmatic adtech that uses only non-personal data (See footnote for a discussion of non-personal data).[1] Aside from non-personal cookie storage, there will be no need to seek consent because there will be no processing of personal data. 

In the minority of cases where valid consent is present, the corresponding vendors will be free to add incremental value by consensually using personal data.

Perimeter will block all 3rd parties that process personal data on its client publishers’ websites and apps, unless consent is present. Trusted Partner adtech vendors, who can operate programmatic and direct advertising without personal data, will be whitelisted (and promoted to publishers).

This note proceeds with two sections. The first describes Trusted Partner adtech vendors. The second outlines what questions publishers should ask their adtech vendors to audit their use of personal data and GDPR-readiness.

PageFair is calling for “Trusted Partner” Adtech vendors

PageFair is working with a group of adtech vendors who can provide publishers with direct and programmatic adtech that uses only non-personal data. (see footnote for a discussion of non-personal data). And of course, if appropriate consent is present, then these adtech vendors can process personal data.

Subject to verification that personal data is not processed without consent, PageFair Perimeter will whitelist Trusted Partners’ technology, while continuing to block all other 3rd parties.
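As a purely illustrative sketch of this block-by-default, whitelist-by-exception behaviour (not a description of Perimeter's internals), a request filter might look like the following in TypeScript; the domain list, function name, and consent flag are assumptions.

// Hypothetical "regulatory firewall" filter: 3rd party requests are blocked by
// default, and only whitelisted Trusted Partner domains are allowed unless
// valid consent is present.
const TRUSTED_PARTNERS = new Set(["adzerk.com", "bannerflow.com"]); // illustrative whitelist

function allowRequest(requestHost: string, hasConsent: boolean): boolean {
  if (hasConsent) return true;              // with valid consent, personal-data vendors may load
  return TRUSTED_PARTNERS.has(requestHost); // otherwise only non-personal-data partners are allowed
}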

We invite adtech vendors to work with us as Trusted Partners. Trusted Partners provide versions of their services that scrupulously prevent the collection or other processing of Personal Data, except where suitable consent has been obtained from the data subject and data protection requirements have been satisfied in a manner consistent with the GDPR.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://goo.gl/forms/2RmmTbSiOy89Dx0l2″ info=”none” info_place=”top” info_trigger=”hover”]Become a Trusted Partner[/x_button]

PageFair will share methods for performing essential adtech functions (bid requests, frequency capping, measurement and reporting, etc.) with partners.[2]

How publishers can examine their adtech vendors’ personal data processing and GDPR-readiness

The following section is a questionnaire for web and mobile publishers to use when exploring whether their adtech vendors are safe under the GDPR.

Questionnaire for adtech vendors

1. Unique identifiers

For each unique user identifier that you use, or introduce into the page, please list the primary purpose, the type, the duration of the identifier, what other companies might receive it, and what secondary purposes it might be used for.

Identifier name | Primary purpose | Secondary purposes | Type | Lifetime of ID | Other recipients
(Example types: 1st party cookie, 3rd party cookie, localStorage cookie, eTag supercookie, Flash supercookie, HSTS supercookie, device fingerprint / statistical ID, IP stack fingerprint, other.)

2. Other personal data

Do you use any other personal data (e.g., IP address, name, address, social security numbers, credit card numbers, email addresses or email address hashes)?

  1. What are their purposes?
  2. Where do you obtain this data from?
  3. Is there auditable consent from the user for the use of this personal data for this purpose?
  4. How do you match this data to unique user IDs?

3. Adtech server-to-server calls

What other advertising systems do you make server-to-server calls to that may communicate user IDs or other personal information, for example RTB bid requests, or automated transfer of RTB or ad call logs?

4. Cookie syncing / user matching

What other domains do you perform cookie syncing / user matching with?

5. Frequency capping

Do you perform frequency capping using unique user IDs? How?

6. Impression counting

Do you depend on any unique user identifiers when you perform impression counting? (for example, to count “unique impressions”)

7. Conversion counting

If you perform conversion counting, does this depend on unique user identifiers to track the user from click to post-conversion?

8. View through counting

If you perform view-through counting, does this depend on unique user identifiers to track the user from view to post-conversion?

9. Viewability measurement

If you perform viewability measurement, does this depend on any unique user identifiers?

10. Cross-device identification

If you perform any cross-device identification of users, what IDs do you use, and how do you match mobile device IDs with other IDs?

11. Fraud detection

If you perform fraud detection, do you use unique identifiers to track devices between websites, or perform other per-user analytics to detect the possibility of bot traffic?

Conclusion

It is inevitable that the audit above will find that the processing of personal data is the norm among adtech vendors. This exposes the vendors, and their clients, to liability under the GDPR.

PageFair invites all adtech vendors to provide versions of their service that operate outside the scope of the GDPR. We intend to share insights on the tweaks required to take personal data out of adtech (where consent is absent) with Trusted Partners, and to promote vendors who operate as Trusted Partners to publishers who use Perimeter.

Our objective is to minimize the number of vendors that publishers must block to protect themselves from GDPR liabilities.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://goo.gl/forms/2RmmTbSiOy89Dx0l2″ info=”none” info_place=”top” info_trigger=”hover”]Become a Trusted Partner[/x_button]


Notes

[1] Non-personal data are any data that can not be related to an identifiable person. As Recital 26 of the GDPR observes, “the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable”. This recital reflects the finding of the European Court of Justice in 2016 that data are not personal “if the identification of the data subject was prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and manpower, so that the risk of identification appears in reality to be insignificant”. Judgment of the Court (Second Chamber) Patrick Breyer v Bundesrepublik Deutschland, Case C-582/14, 19 October 2016.

[2] For methods of performing frequency capping, impression counting, click counting, conversion counting, view through measurement, and viewability measurement see PageFair note at https://pagefair.com/blog/2017/gdpr-measurement1/.

GDPR consent design: how granular must adtech opt-ins be?

This note examines the range of distinct adtech data processing purposes that will require opt-in under the GDPR.[1]
In late 2017 the Article 29 Working Party cautioned that “data subjects should be free to choose which purpose they accept, rather than having to consent to a bundle of processing purposes”.[2] Consent requests for multiple purposes should “allow users to give specific consent for specific purposes”.[3]  Rather than conflate several purposes for processing, Europe’s regulators caution that “the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[4] This draws upon GDPR, Recital 32.[5]
In short, consent requests must be granular, showing opt-ins for each distinct purpose.

How granular must consent opt-ins be?

In its 2013 opinion on “purpose limitation”, the Article 29 Working Party went some way toward defining the scope of a single purpose: a purpose must be “sufficiently defined to enable the implementation of any necessary data protection safeguards,” and must be “sufficiently unambiguous and clearly expressed.”[6]
The test is “If a purpose is sufficiently specific and clear, individuals will know what to expect: the way data are processed will be predictable.”[7] The objective is to prevent “unanticipated use of personal data by the controller or by third parties and in loss of data subject control [of these personal data]”.[8]
In short, a purpose must be specific, transparent and predictable.[9] It must be describable to the extent that the processing undertaken for it would not surprise the person who gave consent for it.
The process of showing an ad to a single person (in online behavioral advertising) involves the processing of personal data for several distinct purposes, by hundreds of different companies.
[accordion id=”video”] [accordion_item title=”Video: how personal data passes between companies in online behavioral advertising” parent_id=”video”]


[/accordion_item][/accordion]
Therefore, a broad, all-encompassing sentence such as “to show you relevant advertising” does not make it possible for one to grasp how one’s data will be used by a large number of companies. It would not be possible to understand from this sentence, for example, that inferences about one’s characteristics would be drawn, or what types of consequences may result.
The following table shows an indicative list of ten purposes for which personal data are currently processed in the online behavioral advertising system. In practice, there may be more purposes at play. The table also generalizes the types of company involved in the selection and display of an ad.
[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/purposes.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download high resolution PDF [/x_button]
A spreadsheet version of this table is available here.
(Refer to footnote 10 for a discussion of the challenges presented by these purposes for all businesses involved.[10])

Pre-consent naming of each controller, and granular post-consent withdrawal per controller

Recital 42 of the GDPR notes that “For consent to be informed, the data subject should be aware at least of the identity of the controller and the purposes of the processing”.[11] All controllers (including “joint controllers” that “jointly determine the purposes and means of processing”[12]) must be named.[13]
Each purpose must be very clear, and each opt-in requires a “clear affirmative action” that is both “specific”, and “unambiguous”.[14] There can be no pre-ticked boxes,[15] and “consent based on silence” is not permitted.[16]
Therefore, a consent request should be made with granular options for each of these purposes, and should name each controller that processes personal data for each of these purposes. For example:

Specific purpose 1 | controllers A, B, C | options: Accept / Refuse 
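Purely as an illustration, such a granular consent record might be modelled as follows; the field names, the example purpose, and the withdrawal helper are assumptions rather than a prescribed format.

// Hypothetical record of granular, per-purpose consent.
interface PurposeConsent {
  purposeId: string;      // e.g. "frequency-capping" (illustrative)
  description: string;    // the specific, plain-language purpose shown to the user
  controllers: string[];  // every controller named for this purpose, e.g. ["A", "B", "C"]
  granted: boolean;       // the affirmative opt-in; never pre-ticked, so it defaults to false
  grantedAt?: string;     // when the opt-in was given, useful for demonstrating consent
}

// Withdrawal must also work against each controller individually (see the discussion
// of the GDPR, Article 26, paragraph 3, below).
function withdrawFromController(consents: PurposeConsent[], controller: string): PurposeConsent[] {
  return consents.map(c => {
    const remaining = c.controllers.filter(name => name !== controller);
    return { ...c, controllers: remaining, granted: c.granted && remaining.length > 0 };
  });
}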

There are two different scenarios for how consent for these purposes will be presented: the best case, and the more likely worst case.

The best scenario

At a minimum, then, assuming that all websites, SSPs, Ad Exchanges, DSPs, DMPs, and advertisers could align to pursue only these purposes, a consent request for this would include granular opt-in controls for a wide range of diverse purposes, the categories of processor pursuing each, and a very long list of controller names pursuing each.
The language and presentation of the request must be simple and clear, ideally the result of user testing.[17]
A consent request for a single purpose, on behalf of many controllers, might look like this.

Specific processing purpose consent, for multiple controllers,
with “next” button for multiple processing purpose opt-ins

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

What is presented when?

The Article 29 Working Party suggests that consent notices should have layers of information so that they do not overload viewers with information, but make necessary details easily available.[18] This is adopted in the design above using “View details”, “Learn about your data rights here”, and similar buttons and links.
When a user clicks “view details” to see the next layer of information about a controller

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

While some details, such as contact details for a company’s data protection officer, can be placed in a secondary layer, the primary layer must include “all basic details of the controller and the data processing activities envisaged”.[19]
Elements presented in this layer

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

The likely scenario

The scenario above assumes that all businesses in online behavioral advertising can agree to pursue tightly defined purposes without deviation. However, it is more likely that controllers will need granular opt-ins, because their purposes are unique.
Any individual controllers who intend to process data for their own unique purposes will need further granular opt-ins for these purposes. Since adtech companies tend to deviate from the common purposes outlined above, it is likely that most or all of them would ultimately require granular purpose consent for each controller.
However, even if all controllers pursued an identical set of purposes so that they could all receive consent via a single consent dialogue that contained a series of opt-ins, there would need to be a granular set of consent withdrawal controls that covered every single controller once consent had been given. The GDPR says that “the data subject may exercise his or her rights under this Regulation in respect of and against each of the controllers”.[20]

A higher bar: “explicit consent”

Processing of personal data in online behavioral advertising (for example, purposes 2, 3, 5, 8, and 10 in the table above) is highly likely to produce special categories of data by inference.[21] Where this occurs, these purposes require “explicit” consent.[22]
Special categories of data reveal “racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, … [and] data concerning health or data concerning a natural person’s sex life or sexual orientation”.[23] 
To make consent explicit requires more confirmation. For example, the Article 29 Working Party suggests that two-stage verification is a suitable means of obtaining explicit consent.[24] One possible approach to this is suggested in PageFair’s design below.
Suggested mechanism for “explicit consent” 

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

One can confirm one’s opt-in in a second movement of the finger, or cursor and click. It is unlikely that a person could confirm using this interface unless it was their intention.  

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]
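A minimal sketch of this kind of two-step confirmation is given below; the wording and the use of the browser's built-in confirm() prompt are assumptions for the illustration, whereas a real implementation would use a dedicated control such as the wireframe above.

// Hypothetical two-stage confirmation for "explicit consent".
function requestExplicitConsent(purposeDescription: string): boolean {
  // First affirmative action: the initial opt-in.
  const optedIn = window.confirm("Opt in: " + purposeDescription + "?");
  if (!optedIn) return false;
  // Second, separate affirmative action confirming the first.
  return window.confirm("Please confirm that you agree to: " + purposeDescription);
}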

Note that even this high bar, however, may not be permitted in some Member States. The GDPR gives European Member States the latitude to enact national legislation that prohibits consent as a legal basis for processing of special categories of data.[25] Therefore, it may not be legal to process any special categories of personal data in some EU Member States.

Conclusion 

Consent for website and app publishers is certainly an important objective, but the personal data it provides must only be processed after data leakage has been stopped. Data leakage (through RTB bid requests, cookie syncs, JavaScript ad units, and mobile SDKs) exposes publishers as the most obviously culpable parties that regulators and privacy NGOs can target. At the same time, it also exposes their adtech vendors and advertisers to large fines and legal actions.[26]
Websites, apps, and adtech vendors, should switch from using personal data to monetize direct and RTB advertising to “non-personal data”.[27] Using non-personal, rather than personal, data neutralizes the risks of the GDPR for advertisers, publishers, and adtech vendors. And it enables them to address the majority (80%-97%) of the audience that will not give consent for 3rd party tracking across the web.[28]
We recently revealed PageFair Perimeter, a regulatory firewall that blocks 3rd party data leakage, and enables publishers and adtech partners to use non-personal data for direct and RTB monetization when consent is absent (and to leverage personal data when adequate consent has been given). You can learn more about Perimeter here. Publishers using Perimeter do not need people’s personal data (nor the consent required to process it) to monetize websites and apps.

[x_button shape=”rounded” size=”regular” float=”none” href=”http://pagefair.com/perimeter/” info=”none” info_place=”top” info_trigger=”hover”]Learn about Perimeter[/x_button]

Postscript

A hiccup in the choreography of the European Commission’s legislative proposals means that non-tracking cookies will need storage consent, at least until the application of the forthcoming ePrivacy Regulation. These cookies, however, contain no personal data, and obtaining consent for their storage is significantly less burdensome than obtaining consent to process personal data for multiple purposes and multiple controllers. Update, 16 January 2018: See PageFair Insider note on storage consent for non-tracking cookies.


Notes

[1] See our discussion of why consent is the appropriate legal basis for online behavioral advertising in “Why the GDPR ‘legitimate interest’ provision will not save you” , PageFair Insider, 13 March 2017 (URL: https://pagefair.com/blog/2017/gdpr-legitimate-interest/).
[2] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 11.
[3] ibid., p. 13.
[4] ibid., p. 11.
[5] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Recital 32. “…Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. …”
[6] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 12.
Curiously, the Spanish Data Protection Authority has issued guidance that contains a sentence suggesting that continuing to browse a website might constitute consent, which is at odds with the Article 29 Working Party guidance on consent and appears to be entirely at odds with the text of the Regulation. See “Guía del Reglamento General de Protección de Datos para responsables de tratamiento”, Agencia Española de Protección de Datos, November 2017, p. 6.
[7] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[8] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 12.
[9] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[10] None of these purposes would be permissible unless data leakage were first addressed. See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/). Furthermore,

  • Purpose 3 could not be permissible in any situation.
  • Purposes 2, 3, 5, 8, and 10 are highly likely to produce special categories of data by inference. See discussion of “explicit consent” in this note.
  • Regarding the purposes for which data have been sold, and to what category of customer, see “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp 39-40, and B3-B

[11] The GDPR, Recital 42.
[12] The GDPR, Article 26, paragraph 1.
[13] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[14] The GDPR, Article 4, paragraph 11.
[15] ibid., Recital 32.
[16] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 16.
[17] “Guidelines on transparency under Regulation 216/679” Article 29 Working Party, November 2017, pp 8, 13.
[18] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[19] ibid., p. 15.
[20] The GDPR, Article 26, paragraph 3.
[21] “Informing data subjects is particularly important in the case of inferences about sensitive preferences and characteristics. The controller should make the data subject aware that not only do they process (non-special category) personal data collected from the data subject or other sources but also that they derive from such data other (and special) categories of personal data relating to them.” See “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679”, Article 29 Working Party, 3 October 2017, p. 22.
[22] The GDPR, Article 9, paragraph 2, a.
[23] ibid., Article 9, paragraph 1.
[24] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 19.
[25] The GDPR, Article 9, paragraph 2, a.
[26] See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/).
[27] Non-personal data are any data that can not be related to an identifiable person. As Recital 26 of the GDPR observes, “the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable”. This recital reflects the finding of the European Court of Justice in 2016 that data are not personal “if the identification of the data subject was prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and manpower, so that the risk of identification appears in reality to be insignificant”. Judgment of the Court (Second Chamber) Patrick Breyer v Bundesrepublik Deutschland, Case C-582/14, 19 October 2016.
Non-tracking cookies, which contain no personal data, are useful for privacy-friendly advertising, and for other functions where an individual does not need to be identified such as A/B testing.
[28] See “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
The granularity of consent required for online behavioral advertising will make the consenting audience even smaller. Moreover, consent for adtech will not only be hard to get, it will also be easy to lose. Consent can be withdrawn with the same degree of ease as it was given, under The GDPR, Article 7, paragraph 3.
The Article 29 Working Party demonstrates what this means in practice: “When consent is obtained … through only one mouse-click, swipe, or keystroke, data subjects must … be able to withdraw that consent equally as easily”.
“Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 21.
The guidance also says that “Where consent is obtained through use of a service specific user interface (for example, via a website, an app, a log-on account, the interface of an IoT device or by e-mail), there is no doubt a data subject must be able to withdraw consent via the same electronic interface, as switching to another interface for the sole reason of withdrawing consent would require undue effort”.