
Facebook and adtech face a turbulent time in Europe's courts: the Brussels case.

This note examines a Belgian court ruling against Facebook’s tracking and approach to consent. Facebook and adtech companies should expect tough sanctions when they find themselves before European courts – unless they change their current approach to data protection and the GDPR. 
Facebook is playing a dangerous game of “chicken” with the regulators. First, it has begun to confront users in the EU with a new “terms of service” dialogue, which denies access to Facebook until a user opts in to tracking for ad targeting and various other data processing purposes.[1]

This dialogue appears to breach several important principles of the GDPR, including the principle of purpose limitation,[2] freely given, non-conditional consent,[3] and of transparency.[4] In other words, if Facebook attempts to collect consent in this manner, that consent will be unlawful. European Regulators have been very clear on this point.[5]
Second, on 1 May 2018, a mere twenty-four days before the application date of the GDPR, Facebook’s head of privacy announced plans to build “Clear History”, a feature with which users can opt out of Facebook collecting data about their visits to other websites and apps.[6] But the GDPR demands not an opt-out, but an opt-in.[7] Nor is Clear History available to non-Facebook users. And as a further sign of Facebook’s brinksmanship, it said “it will take a few months to build Clear History”,[8] which means that the feature will not be available to users until long after the GDPR applies later this month.
Facebook’s approach puts it on a collision course with European courts. This note examines one recent decision in which the Brussels Court of First Instance ruled that Facebook’s tracking of people on other websites is illegal, and that its approach to consent is invalid.[9] The immediate result was a financial penalty, and an order that Facebook must submit to having an independent expert supervise its deletion of all the personal data it illegally amassed.
The implications of the ruling are far wider. It is an insight into the hazard for digital publishers and adtech vendors of failing to heed the warnings of the Article 29 Working Party.

Important lessons for RTB/programmatic

Belgium’s data protection authority, the Belgian Privacy Commission,[10] argued that Facebook’s “Like” buttons and trackers on websites all over the web enable it to look “over the shoulders of persons while they are browsing from one website to the next … without sufficiently informing the relevant parties and obtaining their valid consent”.[11]
The Court agreed, and summarized Facebook’s tracking in its ruling:

When someone visits a website with such a Facebook social plug-in, his browser will automatically establish a connection with (by sending an http request to) the Facebook server, after which the visitor’s browser directly loads the “plug-in” function from the Facebook server.[12]

The Court’s ruling outlined what data are received by Facebook from its social plugins installed on other websites:

1. IP address;
2. URL of the page of the website requested by the user;
3. The operating system;
4. The type of browser, and
5. the cookies (previously) placed by the third-party website from which the browser requests the third-party content.[13]
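The mechanics can be sketched in a few lines. The five items above reach the plugin host through ordinary HTTP machinery: the page URL in the Referer header, operating system and browser type in the User-Agent, prior cookies in the Cookie header, and the IP address from the connection itself. The model below is hypothetical; the endpoint, cookie names, and values are invented for illustration and are not Facebook’s actual ones:

```python
# Hypothetical sketch of the request a browser issues when a page embeds
# a third-party social plugin. Endpoint, cookie names, and values are
# invented; they are not Facebook's actual ones.

def plugin_request(page_url, user_agent, cookies):
    """Model the HTTP request triggered by a social plugin embed."""
    return {
        "method": "GET",
        # The plugin host also sees the request's source IP (item 1).
        "url": "https://plugin.example.com/like-button",
        "headers": {
            "Referer": page_url,       # item 2: URL of the page being read
            "User-Agent": user_agent,  # items 3-4: operating system and browser type
            "Cookie": "; ".join(f"{k}={v}" for k, v in cookies.items()),  # item 5
        },
    }

req = plugin_request(
    page_url="https://health-forum.example/support-thread",
    user_agent="Mozilla/5.0 (Windows NT 10.0) Firefox/60.0",
    cookies={"uid": "abc123"},  # a previously placed identifying cookie
)
```

The point is structural: merely embedding a third-party widget causes the browser to volunteer all of this automatically, with no action by the user.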

In a previous judgement in 2015 the Court observed that these browsing data are “frequently of a very sensitive nature, allowing, for example, health-related, sexual and political preferences to be gauged”.[14]
This should give pause to digital publishers and adtech vendors, because these data, which reveal special categories of personal data, are exactly the same data that websites routinely broadcast to tens – if not hundreds – of companies in RTB bid requests.[15] This happens every time an advertisement is served.
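For readers unfamiliar with RTB, a bid request is a structured message describing the impression and the user, broadcast to every party bidding on the ad slot. The sketch below is hedged: field names follow the public OpenRTB 2.x specification, but the values are invented and real requests carry many more fields:

```python
import json

# Simplified OpenRTB-style bid request (illustrative values only).
# Field names follow the public OpenRTB 2.x specification.
bid_request = {
    "id": "req-001",
    "site": {"page": "https://health-forum.example/support-thread"},  # URL visited
    "device": {
        "ip": "203.0.113.7",                        # IP address
        "ua": "Mozilla/5.0 (Windows NT 10.0) ...",  # browser and OS
    },
    "user": {"id": "abc123"},  # unique, cookie-derived user ID
}

# This payload is sent to every exchange and bidder competing for the
# impression -- whether or not they win, they receive the data.
payload = json.dumps(bid_request)
```

Note how the fields map one-to-one onto the list of data the Court found Facebook’s plugins collecting.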
The Court noted that the scale of Facebook’s presence across the web makes this tracking “practically unavoidable”.[16] The February 2018 ruling reiterated the Court’s previous ruling in 2015 that “the extent of the violations in question is massive: they do not only concern the violation of the fundamental rights of a single person, but of an enormous group of persons.”[17]
This too should give the online media and advertising industry pause, because the same applies to the broadcasting of personal data in RTB bid requests by the majority of major websites across the globe, and to the creation of profiles based on these personal data by DMPs and other adtech vendors.

Facebook’s notification fig leaf ruled unlawful

Facebook provided the following notice to users about this tracking:

We use cookies to help personalise content, to target and measure advertising and to provide you with a safer experience…[18]

Unsurprisingly, the Court ruled that this is utterly inadequate:

The court has come to the decision that in all the cases described, Facebook does not obtain any legally valid consent in the sense of Article 5 (a) Privacy Act[19] and Article 129 ECA[20] for the disputed data processing.[21]

As a result, the Court ruled that Facebook does not have a legal basis for tracking Internet users as they browse the web. Nor does Facebook have a legal basis for tracking logged-in users around the web.[22]
Several of the Court’s admonitions are worth including here, because they are directly relevant to Facebook and other online media and adtech companies’ approaches to the GDPR.
First, the Court found that non-Facebook users are never told that their behavior on websites across the web is being profiled by Facebook:

When non-users visit a website of a third party that includes an (invisible) Facebook pixel that allows for tracking of browsing behavior, without indicating that they wish to make use of the Facebook service, no information mechanism (such as a banner) is displayed.[23]

This remains a legal risk for Facebook, and “Clear History” does not adequately mitigate this risk.
Second, the Court ruled that Facebook’s request for consent was not specific, and that any consent that it received was unlawful as a result:

‘Specific’ means that the expression of will must relate to a specific instance or category of data processing and can thus not be obtained on the basis of a general authorization for an open series of processing activities.[24]

This part of the ruling was based on Article 1, section 8, of the Belgian Privacy Act, which uses the same formula of words as Article 4, paragraph 11, of the GDPR (“freely given, specific, informed…”). In other words, the Court is upholding a standard that is virtually identical to the standard that will apply under the GDPR. Facebook’s new GDPR consent dialogue faces the same problem, and is unlawful for the same reason.
Third, the Court found that Facebook users are not clearly told what “purposes” Facebook processes the personal data for. Nor does it clearly explain its use of sensitive data including any personal data that could reveal religious belief, sexual orientation, etc.:

the cookie banner, makes it insufficiently clear for which exact purposes the personal data – which indeed also include “sensitive data” (e.g. regarding religious beliefs or sexual orientation) – are being collected, while the following layers (including the cookie policy, data policy) also do not explain this in an easily comprehensible and accessible manner.[25]

Facebook has recently gone some way to inform users about the use of personal data concerning their political interests, but this is only a partial solution to a far broader risk for the company. Its handling of sensitive categories of personal data will be a major challenge, which it has yet to show any ability to resolve.[26] 
Fourth, and unsurprisingly in the aftermath of the Cambridge Analytica scandal, the Court found that Facebook did not properly disclose who it was sharing the data with. Nor did it provide any information about “the existence of a right to access and correction of the personal data concerning him”.[27] This is likely to remain a significant challenge.[28]
Fifth, the Court found that Facebook was not even complying with its own self-regulatory system. Whatever one’s view of the “adchoices” self-regulatory system, it is quite remarkable that Facebook continued to track people even if they had already used it to opt out.[29]

Facebook forced to delete data (and fined)

The Brussels Court ordered Facebook to pay €250,000 per day,[30] up to a maximum of €100 million, until it stopped its unlawful behavior.
This was a strong statement. To put this fine into perspective, consider that Belgium has a population of 11.35 million people,[31] which is only 2% of the population of the EU.[32] At the same value per person, the EU equivalent would be €12.5 million per day, up to a maximum of €5 billion.
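The scaling in this paragraph is simple arithmetic, shown here for transparency (the text rounds Belgium’s share of the EU population to 2%, i.e. a factor of 50):

```python
# Worked check of the per-person scaling described in the text.
belgium_pop = 11_350_000   # World Bank, 2016
daily_fine = 250_000       # euro per day
max_fine = 100_000_000     # euro cap

scale = 50                 # Belgium ~2% of the EU population, so 1 / 0.02 = 50

eu_equiv_daily = daily_fine * scale  # 12.5 million euro per day
eu_equiv_max = max_fine * scale      # 5 billion euro
```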
In addition, Facebook was ordered to submit to an independent expert supervising its deletion of all illegal data that it had amassed about every user on Belgian soil.[33] It also had to make sure that third parties to whom it provided illegal data do the same.
The Cambridge Analytica scandal shows that this last point, about ensuring that third parties delete their copies of Facebook’s illegally accumulated data, will be impossible for Facebook to comply with, because of its lax data sharing standards. Recall that Mark Zuckerberg told US lawmakers:

When developers told us they weren’t going to sell data, we thought that was a good representation. But one of the big lessons we’ve learned is that clearly, we cannot just take developer’s word for it.[34]

In other words, Facebook was sharing personal data without any control whatsoever, much as websites do when they send visitors’ personal data in RTB bid requests. Even if the original collection of the data had been lawful, this uncontrolled distribution certainly would not be. Again, the parallel with RTB bid requests should give publishers and adtech vendors pause.

What the Article 29 Working Party says, goes

Many of our colleagues in adtech have been unwilling to heed the counsel of the Article 29 Working Party (a roundtable of European regulators). The Brussels Court’s ruling illustrates the Working Party’s importance and authority. Although the Court is the arbiter, it relied on the Working Party’s authoritative opinions throughout its ruling. (The ruling cited the Working Party’s 2011 opinion on consent (15/2011),[35] its 2010 opinion on online behavioral advertising (2/2010)[36], its 2013 opinion on purpose limitation (2/2013)[37], and its 2010 opinion on the concepts of data controller and data processor (1/2010)[38].)
Whether or not businesses take the Working Party seriously, judges do, which is what matters when businesses find themselves facing sanctions for data misuse. This should demonstrate the value of closely abiding by the opinions of the Working Party. The requirements of European data protection law have been well illuminated by the public guidance of the Article 29 Working Party for over two decades, and provide an invaluable guide to businesses scrambling to comply with a body of law largely neglected hitherto.

Facebook cannot reject users who refuse non-essential tracking

The Court ruled that Facebook cannot reject users who refuse to agree to tracking – unless the tracking in question is necessary for the service that a user explicitly requests from Facebook.[39] Instead, the Court ruled that users should be

given the option of refusing the placement of these cookies, in as far as this is not strictly necessary for a service explicitly requested by him, without his access to the Facebook.com domain being hereby limited or rendered more difficult.[40]

In December 2015, Facebook had blocked access to all Belgian users, following a court injunction that forbade it to place a (“Datr”) cookie without properly informing users.[41] Facebook attempted to justify this denial of service in a notice to users that claimed it could not provide service because it was prohibited from taking measures (unlawful tracking) to prevent unauthorized access to users’ Facebook accounts. The Court took a dim view of this:

The court concurs … that the systematic collection of the personal data of users and non-users via social plug-ins on the websites of third parties is not essential (let alone “strictly essential” in the sense of Article 129 ECA), or at least not proportional to the achievement of the safeguarding objective.[42]

The Court believed that Facebook’s purported fraud detection was insufficient in any case:

the systematic collection of safeguarding cookies is inadequate as a means of safeguarding, as it is easy to circumvent by persons with malicious intentions.[43]

Conclusion: fewer data, not more, will help Facebook in the EU

This ruling is one of several defeats Facebook has suffered in European courts in recent months. In January, the Berlin Regional Court ruled that Facebook’s approach to consent and terms are unlawful.[44] In April, the Irish High Court referred important aspects of Facebook’s trans-Atlantic transfers of personal data to the European Court of Justice, once again, for scrutiny.[45] It is likely that worse is to come, unless it significantly changes its approach to data protection within the EU.
However, the company has options. As unlikely as it may seem now, one can foresee that Facebook will introduce ad targeting based on non-personal data to the Newsfeed. This is likely to be necessary because Facebook will be unable to win lawful consent for some of its data processing purposes for sensitive personal data (or data processing purposes for regular personal data that are not “compatible” with purposes that the user has already agreed to).[46]
It seems likely that this problem encompasses all personalized advertising on the newsfeed, custom audiences, and social share buttons on other websites. Therefore, Facebook must have a way of targeting ads to non-consenting users. Non-personal data would allow this.
It may also become important for Facebook to be able to participate in a clean and safe data supply chain, which major advertisers are beginning to show concern about.[47]
In addition, Facebook will have to limit the use of custom audiences to situations where it is certain that the advertiser has a valid legal basis.
There is a broader lesson. Digital publishers and adtech vendors need to urgently reassess the use of personal data in programmatic advertising, and reflect on how adtech’s shaky consent systems will fare in Europe’s courts.

Notes

[1] The new terms mention personalization of ads. See “Terms of service”, Facebook (URL: https://www.facebook.com/legal/terms/update), accessed 2 May 2018.
The Terms also refer to the data policy, which elaborates that “we use the information we have about you – including information about your interests, actions and connections – to select and personalise ads, offers and other sponsored content that we show you.” The data policy also says “We use the information [including] the websites you visit and ads you see … to help advertisers and other partners measure the effectiveness and distribution of their ads and services, and understand the types of people who use their services and how people interact with their websites, apps and services”. “Data policy”, Facebook (URL: https://www.facebook.com/about/privacy/update), accessed 2 May 2018.
[2] The GDPR, Article 5, paragraph 1.
[3] The GDPR, Article 7, paragraph 2.
[4] The GDPR, Article 13, paragraph 1 and paragraph 2.
[5] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[6] “Getting feedback on new tools to protect people’s privacy”, Facebook, 1 May 2018 (URL: https://newsroom.fb.com/news/2018/05/clear-history-2/).
[7] See the GDPR, Article 6, Article 8, and Article 9.
[8] “Getting feedback on new tools to protect people’s privacy”, Facebook.
[9] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., Dutch-language Brussels Court of First Instance (Nederlandstalige Rechtbank van Eerste Aanleg te Brussel/Tribunal de Première Instance néerlandophone de Bruxelles – the “Court”), 16 February 2018, 2016/153/A (URL: https://pagefair.com/wp-content/uploads/2018/04/Belgian-Court-judgement.pdf).
[10] It has since changed its name to the Belgian Data Protection Authority.
[11] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 12.
[12] ibid., p. 9. See more detail on pp. 49-51.
[13] ibid., p. 9.
[14] “Data leakage in online advertising”, PageFair (URL: https://pagefair.com/data-leakage-in-online-behavioural-advertising/).
[15] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 69.
[16] ibid., p. 69.
[17] ibid.
Note that this raises the competition (antitrust) question, as Germany’s competition regulator, Andreas Mundt, has pointed out: “If Facebook has a dominant market position, then the consent that the user gives for his data to be used is no longer voluntary” (see https://www.reuters.com/article/us-facebook-privacy-germany/facebooks-hidden-data-haul-troubles-german-cartel-regulator-idUSKBN1HU108).
[18] ibid., p. 8.
[19] Which implemented the Data Protection Directive.
[20] Electronic Communications Act of 20 June 2005, which implemented the ePrivacy Directive.
[21] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 64.
[22] ibid., p. 73-4.
[23] ibid., p. 57.
[24] ibid., p. 61.
[25] ibid., p. 58.
[26] See discussion of special categories of data in the newsfeed in “How the GDPR will disrupt Google and Facebook”, PageFair, 30 August, (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[27] ibid., p. 59.
[28] See testimony by Chris Vickery at the UK Parliament Digital, Culture, Media and Sport Committee, Wednesday 2 May 2018 (URL: https://www.parliamentlive.tv/Event/Index/0cf92dd0-f484-4699-9e01-81c86acb880c).
[29] ibid., p. 63.
[30] ibid., p. 70.
[31] World Bank, 2016.
[32] Eurostat, population on 1 January 2017 (URL: ec.europa.eu/eurostat/tgm/table.do?tab=table&plugin=1&language=en&pcode=tps00001)
[33] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 14, 70.
[34] Testimony of Mark Zuckerberg Chairman and Chief Executive Officer, Facebook, Hearing before the United States House of Representatives Committee on Energy and Commerce, 11 April 2018 (URL: https://www.c-span.org/video/?443490-1/facebook-ceo-mark-zuckerberg-testifies-data-protection&live&start=4929#).
[35] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., pp. 57-8, 61.
[36] ibid., p. 59.
[37] ibid., p. 60.
[38] ibid., p. 70.
[39] ibid., p. 13, 72.
[40] ibid., p. 72. See the Privacy Commission’s argument for this on p. 13.
[41] After an order from the Privacy Commission, which was backed up by a court injunction. In 2015, the Privacy Commission ordered Facebook, among other things, to stop tracking non-users via cookies and social plugins without consent, and to do the same for users unless “strictly necessary for a service explicitly requested by the user” or unless it obtained “unequivocal, specific consent”. It was also ordered to use consent requests that are unequivocal and specific. When Facebook failed to comply, a court order followed in November 2015. Facebook responded by blocking access to users. See ibid., pp 4-7.
[42] ibid., p. 65-6.
[43] ibid., p. 67.
[44] Judgment of the Berlin Regional Court dated 16 January 2018, Case no. 16 O 341/15 (URL: https://pagefair.com/wp-content/uploads/2018/04/Berlin-Court-judgement-German.pdf)
[45] The High Court, Commercial, 2016, N. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximilian Schrems, Request for a preliminary ruling, Article 267 TFEU, 12 April 2018.
See also Judgement of Ms Justice Costello, The High Court, Commercial, 2016, No. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximillian Schrems, 3 October 2017.
Note, this is the second “Schrems” case. The first caused the end of the EU-US Safe Harbor agreement.
[46] See a discussion on Facebook and purpose limitation in “How the GDPR will disrupt Google and Facebook”, PageFair, 30 August, (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[47] “WFA Manifesto for Online Data Transparency”, World Federation of Advertisers, 20 April 2018 (URL: https://www.wfanet.org/news-centre/wfa-manifesto-for-online-data-transparency/). See also Stephan Loerke, WFA CEO, “GDPR data-privacy rules signal a welcome revolution”, AdAge, 25 January 2018 (URL: adage.com/article/cmo-strategy/gdpr-signals-a-revolution/312074/).

Google adopts non-personal ad targeting for the GDPR

This note examines Google’s recent announcement on the GDPR. Google has sensibly adopted non-personal ad targeting. This is a very significant step forward and signals a change in the online advertising market. But Google has also taken a new and problematic approach to consent for personal data use in advertising that publishers will find hard to accept.

Google decides to use non-personal ad targeting to comply with the GDPR 

Last Thursday Google sent a policy update to business partners across the Internet announcing that it would launch an advertising service based on non-personal data in order to comply with the GDPR.[1]
PageFair has advocated a non-personal approach to advertising for some time, and commends Google for taking this position. As we noted six months ago,[2] Google AdWords, for example, can operate without consent if it discards personalized targeting features (and unique IDs). In this case, advertisers can continue to target advertisements to people based on what they search for.
This may be part of a trend for Google, which announced in mid-2017 that it would stop mining personal e-mails in Gmail to inform its ad targeting. Clearly, few users would have given consent for this.[3] Google’s latest announcement has signaled to advertisers the importance of buying targeted advertising without personalization.
Although Google’s “non-personalized ads” may seem promising to advertisers and publishers who are concerned about GDPR liability, more work must be done before they can be considered safe.
Unique tracking IDs are currently vital to Google’s ability to perform frequency capping and bot detection.[4] Meanwhile, data leakage is a problem caused by 3rd party ad creatives liberally loading numerous tracking pixels. Google has been silent on fixing these problems. Therefore, it may be that Google will merely target ads with non-personal data, but will continue to perform tracking as usual. Clarity on this point will be important for advertisers seeking safe inventory.
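To make the gap concrete, here is a hedged sketch of how frequency capping could work without a server-side unique ID, by keeping the impression counter on the client (think localStorage). This illustrates the design space only; it is not Google’s (undisclosed) implementation:

```python
# Sketch: frequency capping without a server-side unique ID.
# The impression counter stays on the device; the server never needs
# a cross-site identifier. Illustrative only -- not any vendor's
# actual implementation.

class LocalFrequencyCap:
    def __init__(self, cap):
        self.cap = cap
        self.seen = {}  # campaign_id -> impressions, held client-side

    def should_show(self, campaign_id):
        """Return True and increment if the campaign is under its cap."""
        count = self.seen.get(campaign_id, 0)
        if count >= self.cap:
            return False
        self.seen[campaign_id] = count + 1
        return True

cap = LocalFrequencyCap(cap=3)
results = [cap.should_show("summer-sale") for _ in range(5)]
# results == [True, True, True, False, False]
```

The trade-off is that the counter does not follow the user across devices, which is part of why vendors have preferred server-side IDs.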

Problems with Google’s approach to consent for personal data

Despite its new non-personalized ads, Google is also attempting to build a legal basis under the GDPR for its existing personal data advertising business. It has told publishers that it wants them to obtain their visitors’ consent to “the collection, sharing, and use of personal data for personalization of ads or other services”.[5]
Note that the purpose here is “personalization of ads or other services”. This appears to be a severe conflation of the many separate processing purposes involved in advertising personalization.[6] The addition of “other services” makes the conflation even more egregious. As we previously observed in our note on the approach proposed by IAB Europe, this appears to be a severe breach of Article 5, which requires that consent be requested in a granular manner for “specified, explicit” purposes.[7] As noted in a previous PageFair note, European regulators have explicitly warned against conflating purposes in this way:

“If the controller has conflated several purposes for processing and has not attempted to seek separate consent for each purpose, there is a lack of freedom. This granularity is closely related to the need of consent to be specific …. When data processing is done in pursuit of several purposes, the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[8] 
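In data terms, the contrast between the conflated request and the granularity the regulators describe looks roughly like this (the purpose names are illustrative, not a real taxonomy):

```python
# Conflated consent: a single yes/no covering an open-ended bundle.
conflated = {"personalization of ads or other services": True}

# Granular consent: one specific, separate choice per processing purpose.
# (Purpose names are illustrative, not a real taxonomy.)
granular = {
    "ad_personalization": True,
    "ad_measurement": False,
    "audience_profiling": False,
    "cross_site_tracking": False,
}

def consented_purposes(record):
    """List only the purposes the user actually agreed to."""
    return sorted(purpose for purpose, ok in record.items() if ok)

# Under the granular record, processing is lawful for one purpose only.
lawful = consented_purposes(granular)  # ['ad_personalization']
```

A single bit cannot express the second structure, which is the regulators’ point: granularity requires a separate consent per purpose.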

Controller-controller 

Google is asking publishers to obtain consent from their visitors for it to be an independent controller of those users’ personal data.[9] Confusingly, Google has called this a “controller-controller” policy. This evokes “joint-controllership”, a concept in the GDPR that would require both Google and the publisher to jointly determine the purposes and means of processing, and to be transparent with each other.[10] However, what Google proposes is not joint-controllership, but rather independent controllership for the publisher on the one hand, and for Google on the other. Google’s “controller-controller” terms to publishers define each party as

“an independent controller of Controller Personal Data under the Data Protection Legislation; [that] will individually determine the purposes and means of its processing of Controller Personal Data”.[11]

It is not clear why a publisher would choose to do this, since it would enable Google to leverage that publisher’s audience across the entire web (severe conflation of purposes notwithstanding). The head of Digital Content Next, a publisher trade body that represents Disney, New York Times, CBS, and so forth, has already announced that there is “no way in hell” Google will be “co-controller” across publishers’ sites.[12]

Further problems with Google’s new approach to consent

Even if publishers did accept that Google could be a controller of their visitors’ data for its own purposes, it is unlikely that many visitors would give their consent for this.[13]
If, however, both a publisher and a visitor were to agree to Google’s controller-controller proposal, two further problems arise. First, when a publisher shares third party personal data with Google, Google’s terms require that the publisher “must use commercially reasonable efforts to ensure the operator of the third party property complies with the above duties [of obtaining adequate consent]”.[14] This phrase “commercially reasonable efforts” is not a meaningful defence in the event that personal data are unlawfully processed.
As one expert from a European data protection authority retorted when I researched this point: “Imagine this as legal defence line: ‘We did not obtain consent because it wasn’t possible with commercially reasonable efforts’?” The Regulation is clear that “each controller or processor shall be held liable for the entire damage”, where more than one controller or processor are “involved in the same processing”.[15]
Second, Google’s policy puts what appears to be an impossible burden on the publisher. It requires that the publisher accurately inform the visitor about how their data will be used if they give consent.

“You must clearly identify each party that may collect, receive, or use end users’ personal data as a consequence of your use of a Google product. You must also provide end users with prominent and easily accessible information about that party’s use of end users’ personal data”.[16]

However, the publisher does not know what personal data Google shares with its own business partners. Nor does it know what purposes these parties process data about its visitors for. So long as this continues, a publisher cannot be in a position to inform its visitors of what will be done with their data. The result is very likely to be a breach of Article 6[17] and Article 13[18] of the GDPR.
Giving Google the benefit of the doubt, this may change before 25 May. Google plans to publish some information about its “uses of information and we are asking other ad technology providers with which Google’s products integrate to make available information about their own uses of personal data.”[19] Publishers will not be well served by any further delay in the provision of this information.

Risks for Google 

Google’s decision to rely on non-personal data for ad targeting is highly significant, and will enable the company and advertisers that work with it to operate under the GDPR. However, Google’s new consent policy is fraught with issues that make it impossible for publishers to adopt. Our GDPR risk scale, first published for Google in August 2017, remains unchanged.


Perimeter is a robust regulatory firewall. It preemptively blocks unauthorized requests from 3rd parties, and tightly controls personal data on your website and app. It protects you, your advertising business, and your users. Perimeter makes sure that consent means something.


Notes

[1] “Changes to our ad policies to comply with the GDPR”, Google Inside AdWords, 22 March 2018 (URL: https://adwords.googleblog.com/2018/03/changes-to-our-ad-policies-to-comply-with-the-GDPR.html).
[2] “How the GDPR will disrupt Google and Facebook”, PageFair Insider, August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[3] ibid.
[4] For alternative methods of performance measurement and reporting see “Frequency capping and ad campaign measurement under GDPR”, PageFair Insider, November 2017 (URL: https://pagefair.com/blog/2017/gdpr-measurement1/).
[5] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[6] See discussion of data processing purposes in online behavioural advertising, and the degree of granularity required in consent, in “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[7] The GDPR, Article 5, paragraph 1, b, and note reference to the principle of “purpose limitation”. See also Recital 43. For more on the purpose limitation principle see “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013.
[8] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[9] “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[10] See The GDPR, Article 26.
[11] Clause 4.1 of “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[12] Jason Kint, Twitter, 22 March 2018 (URL: https://twitter.com/jason_kint/status/976928024011726848)
[13] “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[14] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[15] The GDPR, Article 82, paragraph 4.
[16] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[17] The GDPR, Article 6, paragraph 1, a.
[18] The GDPR, Article 13, paragraph 2, f, and Recital 60.
[19] “Help with the EU user consent policy”, Google (URL: https://www.google.com/about/company/consenthelpstaging.html).

Risks in IAB Europe’s proposed consent mechanism

This note examines the recently published IAB “transparency and consent” proposal. Major flaws render the system unworkable. The real issue is what should be done with the vast majority of the audience who will not give consent. 

Publishers would have no control (and are expected to blindly trust 2,000+ adtech companies)

The adtech companies[1] who drafted the IAB Europe proposal claim that “publishers have full control over who they partner with, who they disclose to their users and who they obtain consent for.”[2] But the IAB Europe documentation shows that adtech companies would remain entirely free to trade personal data with their business partners if they wish. The proposed system would share a unique[3] consent record “throughout the online advertising ecosystem”, every time an ad is loaded on a website:[4]

“the OpenRTB request [from a website to an ad exchange] will contain the entire DaisyBit [a persistent cookie],[5] allowing a vendor to see which other vendors are an approved vendor or a publisher and whether they have obtained consent (and for which purposes) and which have not.”[6]

There would be no control over what happens to personal data once they enter the RTB system: “[adtech] vendors may choose not to pass bid requests containing personal data to other vendors who do not have consent”.[7] This is a critical problem, because the overriding commercial incentive for many of the companies involved is to share data with as many partners as possible, and to pass them to parent companies that run data brokerages. In addition, publishers are expected to trust that JavaScript in “ad creatives” is not dropping trackers, even though no tools to police this are proposed.
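To make the mechanics concrete, here is a minimal, hypothetical sketch of the kind of per-vendor consent record the proposal describes: one bit per vendor ID, packed into a string that travels with every bid request. The function names and layout are illustrative, not the IAB specification; the point is that every recipient of the request can read the full consent map, and nothing in the record itself prevents a vendor from forwarding the personal data regardless of what the bits say.

```python
# Simplified, hypothetical sketch of a per-vendor consent record:
# one bit per vendor ID, packed into a URL-safe string and attached
# to every bid request. Not the actual IAB cookie format.
import base64

def encode_consent(consented_vendor_ids, max_vendor_id):
    """Pack per-vendor consent bits into a URL-safe string."""
    bits = bytearray((max_vendor_id + 7) // 8)
    for vid in consented_vendor_ids:
        bits[vid // 8] |= 1 << (7 - vid % 8)
    return base64.urlsafe_b64encode(bytes(bits)).decode()

def decode_consent(record, max_vendor_id):
    """Any recipient can recover the full consent map from the record."""
    bits = base64.urlsafe_b64decode(record)
    return {vid for vid in range(max_vendor_id)
            if bits[vid // 8] & (1 << (7 - vid % 8))}

record = encode_consent({1, 7, 42}, max_vendor_id=100)
assert decode_consent(record, 100) == {1, 7, 42}
```

Note that the record only *describes* consent; enforcement is left to the goodwill of each of the thousands of recipients.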
IAB Europe is asking publishers and brands to expose themselves to the legal risk of routinely sharing these personal data with several thousand adtech companies. What publishers and brands need is a “trust no one” approach. IAB Europe is proposing a “trust everyone” approach. Indeed, the proposed system looks like the GDPR’s description of a data breach:

“a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”.[8]

Publishers have no control over personal data once they send them into the RTB system. All publishers have is liability.

“OK to everything” jeopardises the publisher’s own opt-ins

The proposed system would also jeopardise the chance of websites obtaining essential opt-ins for their own data processing purposes, such as commenting widgets and video players. IAB Europe proposes that websites bundle all consent under a single “OK”/”Accept all” button. Our wireframe below shows the text and buttons recommended by IAB Europe.[9]

Broadly speaking, websites might expect to receive consent from four out of every five users for their own data processing.[10] The opt-in rate for adtech tracking, by contrast, is tiny. Our research found that only 3% of people say they would opt in to 3rd parties tracking them across the web for the purposes of advertising.[11] IAB Europe’s commissioned research found that only 20% would do so.[12] The adtech vendors who drafted the IAB Europe proposal have an incentive to ask publishers to take on risk on their behalf: they must realize that there is no chance that Internet users will agree to the cascade of opt-ins that the GDPR requires.[13] A website would be ill-advised to jeopardise its own consent requests in a vain effort to get consent for adtech companies, particularly if those adtech companies plan to use that same consent to work with the website’s competitors.

Conflation and other matters of presentation

The proposal appears to breach Article 5, Article 6, and Article 13 of the GDPR, for several reasons.
First, Article 5 requires that consent be requested in a granular manner for “specified, explicit” purposes.[14] Instead, IAB Europe’s proposed design bundles together a host of separate data processing purposes under a single opt-in. A user must click the “Manage use of your Data” button in order to view four slightly less general opt-ins, and the companies[15] requesting consent. These opt-ins also appear to breach Article 5, because they too conflate multiple data processing purposes into a very small number of ill-defined consent requests. For example, a large array of separate adtech consent requests[16] are bundled together in a single “advertising personalisation” opt-in.[17] European regulators explicitly warned against conflating purposes:

“If the controller has conflated several purposes for processing and has not attempted to seek separate consent for each purpose, there is a lack of freedom. This granularity is closely related to the need of consent to be specific …. When data processing is done in pursuit of several purposes, the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose.”[18]

Second, the text that IAB Europe proposes publishers display for the “advertising personalisation” opt-in appears to severely breach Article 6[19] and Article 13[20] of the GDPR. In a single 49-word sentence, the text conflates several distinct purposes, and gives virtually no indication of what will be done with the reader’s personal data.

“Advertising personalisation allow processing of a user’s data to provide and inform personalised advertising (including delivery, measurement, and reporting) based on a user’s preferences or interests known or inferred from data collected across multiple sites, apps, or devices; and/or accessing or storing information on devices for that purpose.”[21]

This fails to disclose that hundreds, and perhaps thousands, of companies will be sent your personal data. Nor does it say that some of these companies will combine these data with a profile they have already built about you. Nor are you told that this profile includes things like your income bracket, age and gender, habits, social media influence, ethnicity, sexual orientation, religion, and political leaning. Nor do you know whether some of these companies will sell their data about you to other companies, perhaps for online marketing, credit scoring, insurance, background checking, or law enforcement purposes.
Third, a person must say yes or no for all or none of the companies listed as data controllers.[22] Since one should not be expected to trust all controllers equally, and since it is unlikely that all controllers apply equal safeguards of personal data, we suspect that this “take it or leave it” choice will not satisfy regulatory authorities.
Fourth, there appears to be no easy way to refuse the consent request that IAB Europe proposes, which would also breach the GDPR.[23] It is possible that this last point is simply an accidental oversight in the drafting of IAB Europe’s documentation.

Conclusion: What about the people (80%-97%) who don’t opt-in?

The proposed system has no plan to make consent meaningful by giving publishers and data subjects control over what happens to personal data. Nor does it have a plan for what happens when users do not give consent. It is time for the discussion to move on.
As the CEO of Digital Content Next, a major publisher trade body, recently told members, “GDPR will create opportunity for audience selection based on cohorts and context”.[24] Non-personal data such as these are the only way for the industry to approach the GDPR.
PageFair has recently announced Perimeter, a regulatory firewall that enables websites (and apps) to protect their ad business, run direct campaigns, and use RTB without risk under the GDPR. It prevents unauthorized connections from 3rd parties, so that personal data cannot leak through the RTB system, or anywhere else. (For extra peace of mind, PageFair’s SSP delivers guaranteed compliant programmatic display advertising.) This is the consent-free approach.
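The core idea of a “regulatory firewall” can be sketched in a few lines: only requests to the first party and to explicitly authorized hosts are allowed out. This is a hedged illustration of the general allowlisting technique, not Perimeter’s actual implementation; all hostnames are made up.

```python
# Hypothetical sketch of an allowlist-based "regulatory firewall":
# requests from a page may only go to first-party hosts and explicitly
# authorized partners. Hostnames are illustrative.
from urllib.parse import urlparse

AUTHORIZED_HOSTS = {"example-publisher.com", "cdn.example-publisher.com"}

def is_request_allowed(url):
    """Allow a request only if its host, or a parent domain, is authorized."""
    host = urlparse(url).hostname or ""
    return any(host == h or host.endswith("." + h) for h in AUTHORIZED_HOSTS)

assert is_request_allowed("https://cdn.example-publisher.com/ad.js")
assert not is_request_allowed("https://tracker.example-adtech.net/sync?uid=123")
```

The design choice is “default deny”: an unknown third party is blocked unless the publisher has deliberately authorized it, which inverts the “trust everyone” posture criticised above.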
We also believe that consent has a role. The next chapter for online advertising will be written by publishers who use consent-free RTB, and build up consenting audiences for premium advertising too.
Note: thanks to Andrew Shaw at PageFair. 


[x_alert heading=”Feedback Wanted” type=”success” close=”true”]Note: PageFair has just updated its online overview of Perimeter. Please review http://pagefair.com/perimeter and give us your feedback.[/x_alert]

Perimeter is a robust regulatory firewall. It preemptively blocks unauthorized requests from 3rd parties, and tightly controls personal data on your website and app. It protects you, your advertising business, and your users. Perimeter makes sure that consent means something.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://pagefair.com/perimeter” info=”none” info_place=”top” info_trigger=”hover”]Learn more[/x_button]

Notes

[1] AppNexus Inc.; Conversant, LLC; DMG Media Limited; Index Exchange, Inc.; MediaMath, Inc.; Oath, Inc.; Quantcast Corp.; and, Sizmek, Inc. are named in the copyright notice of “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe (URL: URL-shortened), p. 3.
Note: PageFair is a member of IAB TechLab, and IAB UK.
[2] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018 (URL: http://advertisingconsent.eu/wp-content/uploads/2018/03/Transparency_Consent_Framework_FAQ_Formatted_v1_8-March-2018.pdf), p. 8.
[3] Our statistical examination of the data in the cookie showed a very high degree of uniqueness. The proposed cookie is itself a tracking cookie. See the specification of the cookie in “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe, pp 8 – 10.
[4] ibid., p. 3
[5] ibid., p. 8.
To see the content of the proposed consent cookie, see http://gdpr-demo.labs.quantcast.com/user-examples/cookie-workshop.html.
It is envisaged that the record may be server-based in the future, because this will work better. See p. 7.
[6] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 9.
[7] ibid., p. 10. And from the same page, when an adtech company gets personal data without consent, IAB Europe asks it “to only act upon that data if it has another applicable legal basis for doing so”.
[8] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Article 4, paragraph 12.
[9] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 13.
[10] 20% would accept first party tracking only. An additional 56% would accept tracking that is strictly necessary for services they have requested. 5% say they would accept all tracking.
See “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[11] ibid.
[12] “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7. (URL: https://www.iabeurope.eu/wp-content/uploads/2017/09/EuropeOnline_FINAL.pdf).
[13] “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[14] The GDPR, Article 5, paragraph 1, b, and note reference to the principle of “purpose limitation”. See also Recital 43. For more on the purpose limitation principle see “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013.
[15] Note that the Article 29 Working Party very recently warned that this alone might be enough to render consent invalid: “when the identity of the controller or the purpose of the processing is not apparent from the first information layer of the layered privacy notice (and are located in further sub-layers), it will be difficult for the data controller to demonstrate that the data subject has given informed consent, unless the data controller can show that the data subject in question accessed that information prior to giving consent”.
Quote from “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017 (URL: https://pagefair.com/wp-content/uploads/2017/12/wp259_enpdf.pdf), p. 15, footnote 39.
[16] See discussion of data processing purposes in online behavioural advertising, and the degree of granularity required in consent, in “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[17] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 18.
[18] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[19] The GDPR, Article 6, paragraph 1, a.
[20] The GDPR, Article 13, paragraph 2, f, and Recital 60.
[21] “Transparency & Consent Framework FAQ”, IAB Europe, 8 March 2018, p. 18.
[22] “Transparency & Consent Framework, Cookie and Vendor List Format, Draft for Public Comment, v1.a”, IAB Europe, p. 5.
Apparently, “due to concerns of payload size and negatively impacting the consumer experience, a per-vendor AND per-purpose option is not available” (p. 22).
[23] The Regulation is clear that “consent should not be regarded as freely given if the data subject has no genuine or free choice”. The GDPR, Recital 42. See also, Article 4, paragraph 11.
[24] Jason Kint, “Why the IAB GDPR Transparency and Consent Framework is a non-starter for publishers”, Digital Content Next, 19 March 2018 (URL: https://digitalcontentnext.org/blog/2018/03/19/iab-gdpr-consent-framework-non-starter-publishers/)

Adtech must change to protect publishers under the GDPR (IAPP podcast)

The follow-up to the International Association of Privacy Professionals’ most listened-to podcast of 2017.
Angelique Carson of the International Association of Privacy Professionals quizzes PageFair’s Dr Johnny Ryan on the crisis facing publishers, as they grapple with adtech vendors and attendant risks ahead of the GDPR. The podcast covers:

  • Why personal data cannot be used without risk in the RTB/programmatic system under the GDPR.
  • Where consent falls short for publishers.
  • How vulnerable the online advertising system is, because of central points of legal failure.
  • How the GDPR is part of a global trend, with new privacy standards on the way in other massive markets including China (and in important tech ecosystems such as Apple iOS and Firefox).

This is the follow-up to an earlier IAPP and PageFair podcast discussion (which was the International Association of Privacy Professionals’ most listened-to podcast of 2017).

[x_button shape=”rounded” size=”regular” float=”none” href=”https://iapp.org/news/a/the-privacy-advisor-podcast-johnny-ryan-on-the-continuing-crisis-ad-tech-faces/” info=”none” info_place=”top” info_trigger=”hover”]Listen at IAPP[/x_button]

[x_button shape=”rounded” size=”regular” float=”none” href=”https://itunes.apple.com/us/podcast/the-privacy-advisor-podcast/id1095382766?mt=2#” info=”none” info_place=”top” info_trigger=”hover”]Listen on iTunes[/x_button]

Click here to view PageFair’s explainers and official documents about the changes websites and apps must make under the new privacy rules. Elsewhere you can find details about Perimeter, PageFair’s GDPR solution for publishers.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

PageFair's long letter to the Article 29 Working Party

This note discusses a letter that PageFair submitted to the Article 29 Working Party. The answers may shape the future of the adtech industry. 
Eventually the data protection authorities of Europe will gain a thorough understanding of the adtech industry, and enforce data protection upon it. This will change how the industry works. Until then, we are in a period of uncertainty: industry cannot move forward, and business cannot flourish. Limbo does not serve the interests of publishers. Therefore we press for certainty.
This week PageFair wrote a letter to the Article 29 Working Party presenting insight on the inner workings of adtech, warts and all.
Our letter asked the working party to consider five questions. We suspect that the answers may shape the future of the adtech industry.

  1. We asked for further guidance about two issues that determine the granularity of consent required. First, we asked what the scope of a single “purpose” for processing personal data is. Since one must have a legal basis for each purpose, a clear understanding of scope of an individual purpose is important to determine the number of purposes, and thus the number of granular opt-ins required.
  2. The second question about granularity of consent asked whether multiple controllers that pursue identical purposes should be unbundled from each other. In other words, should consent be requested not only per purpose, but per controller too? This is important because it should not be assumed that a person trusts all data controllers equally. Nor is it likely that all controllers apply equal safeguards to personal data. Therefore, we asked whether it was appropriate to bundle multiple controllers together in a single consent request, without the opportunity to accept some, but not all.
  3. We asked for guidance on how explicit consent operates for websites and apps, where a controller wishes to process special categories of personal data. Previously the Working Party cited the double opt-in as a method of explicit consent for e-mail marketing. We presented wireframes of how this might operate on web and mobile.
  4. We asked for clarification that all unique identifiers are personal data. This is important because the presence of a unique ID enables the combining of data about the person associated with that unique ID, even if the party that originally assigned the unique ID did so randomly, without any understanding of who the data subject is.
  5. We asked for guidance on how Article 13 of the GDPR applies to non-tracking cookies (without personal data) as opposed to personal data. This is important because some paragraphs of this article were intended to apply to personal data and are not appropriate for non-personal data.
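Point 4 above can be illustrated with a toy example: even an ID assigned at random becomes personal data the moment it lets separate datasets about the same person be joined. All identifiers and values below are hypothetical.

```python
# A "random" unique ID is the join key that links otherwise separate
# datasets about one person. All values are hypothetical.
ad_views = {"uid-8f3a": ["sports-site", "health-site"]}          # tracker A
data_broker = {"uid-8f3a": {"age_bracket": "35-44", "income": "high"}}  # broker B

# Combining on the shared ID yields a richer profile than either
# party held alone -- which is why the ID itself is personal data.
profile = {uid: {"pages": pages, **data_broker.get(uid, {})}
           for uid, pages in ad_views.items()}

assert profile["uid-8f3a"]["income"] == "high"
assert "health-site" in profile["uid-8f3a"]["pages"]
```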

In addition to these questions we made three statements.

  1. Websites, apps, and adtech vendors leak personal data to unknown parties in routine advertising operation (via “RTB” bid requests, cookie syncs, JavaScript ad units, mobile SDKs, and other 3rd party integrations). This is preventable.
  2. We noted our support for the Working Party’s view that the GDPR forbids the demanding of consent for 3rd party tracking that is unrelated to the provision of an online service.
  3. It is untenable for any publisher, adtech vendor, or trade body, to claim that they must use personal data for online advertising. As we and others have shown, sophisticated adtech can work without personal data.
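To illustrate statement 1 above, here is a simplified sketch of the kinds of fields that travel in an RTB bid request, loosely modelled on OpenRTB-style requests; all values are hypothetical. Each of these fields is personal data under the GDPR, and each is broadcast to every bidder whether or not that bidder wins the auction.

```python
# A simplified, hypothetical bid request (loosely OpenRTB-shaped).
# Every recipient -- winner or not -- sees all of these fields.
bid_request = {
    "user": {"id": "partner-uid-41f2"},       # persistent pseudonymous ID
    "device": {
        "ip": "203.0.113.7",                  # IP address
        "ifa": "aabbccdd-0000-0000-0000-000000000000",  # mobile ad ID
        "geo": {"lat": 50.85, "lon": 4.35},   # location
    },
    # The page URL can itself reveal sensitive interests.
    "site": {"page": "https://example.com/health/condition-x"},
}

personal_fields = [
    bid_request["user"]["id"],
    bid_request["device"]["ip"],
    bid_request["device"]["ifa"],
]
assert all(personal_fields)
```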

The full letter is available here.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

GDPR consent design: how granular must adtech opt-ins be?

This note examines the range of distinct adtech data processing purposes that will require opt-in under the GDPR.[1]
In late 2017 the Article 29 Working Party cautioned that “data subjects should be free to choose which purpose they accept, rather than having to consent to a bundle of processing purposes”.[2] Consent requests for multiple purposes should “allow users to give specific consent for specific purposes”.[3]  Rather than conflate several purposes for processing, Europe’s regulators caution that “the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[4] This draws upon GDPR, Recital 32.[5]
In short, consent requests must be granular, showing opt-ins for each distinct purpose.

How granular must consent opt-ins be?

In its 2013 opinion on “purpose limitation”, the Article 29 Working Party went some way toward defining the scope of a single purpose: a purpose must be “sufficiently defined to enable the implementation of any necessary data protection safeguards,” and must be “sufficiently unambiguous and clearly expressed.”[6]
The test is “If a purpose is sufficiently specific and clear, individuals will know what to expect: the way data are processed will be predictable.”[7] The objective is to prevent “unanticipated use of personal data by the controller or by third parties and in loss of data subject control [of these personal data]”.[8]
In short, a purpose must be specific, transparent and predictable.[9] It must be describable to the extent that the processing undertaken for it would not surprise the person who gave consent for it.
The process of showing an ad to a single person (in online behavioral advertising) involves the processing of personal data for several distinct purposes, by hundreds of different companies.
[accordion id=”video”] [accordion_item title=”Video: how personal data passes between companies in online behavioral advertising” parent_id=”video”]


[/accordion_item][/accordion]
Therefore, a broad, all-encompassing sentence such as “to show you relevant advertising” does not make it possible for one to grasp how one’s data will be used by a large number of companies. It would not be possible to understand from this sentence, for example, that inferences about one’s characteristics would be drawn, or what types of consequences may result.
The following table shows an indicative list of ten purposes for which personal data are currently processed in the online behavioral advertising system. In practice, there may be more purposes at play. The table also generalizes the types of company involved in the selection and display of an ad.
[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/purposes.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download high resolution PDF [/x_button]
A spreadsheet version of this table is available here.
(Refer to footnote 10 for a discussion of the challenges these purposes present for all businesses involved.[10])

Pre-consent naming of each controller, and granular post-consent controller consent withdrawal

Recital 42 of the GDPR notes that “For consent to be informed, the data subject should be aware at least of the identity of the controller and the purposes of the processing”.[11] All controllers (including “joint controllers” that “jointly determine the purposes and means of processing”[12]) must be named.[13]
Each purpose must be very clear, and each opt-in requires a “clear affirmative action” that is both “specific”, and “unambiguous”.[14] There can be no pre-ticked boxes,[15] and “consent based on silence” is not permitted.[16]
Therefore, a consent request should be made with granular options for each of these purposes, naming each controller that processes personal data for each purpose. For example:

Specific purpose 1 | controllers A, B, C | options: Accept / Refuse 
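The record behind such a request can be sketched as a simple mapping with one opt-in per (purpose, controller) pair, defaulting to refused so that there are no pre-ticked boxes, and supporting withdrawal against each controller individually. This is a hedged illustration; all names are made up.

```python
# Hedged sketch of a granular consent record: one entry per
# (purpose, controller) pair, nothing pre-ticked. Names illustrative.
consent = {}  # (purpose, controller) -> True / False

def record_opt_in(purpose, controller, accepted):
    """A 'clear affirmative action' sets exactly one granular choice."""
    consent[(purpose, controller)] = accepted

def withdraw(controller):
    """Rights are exercisable against each controller individually."""
    for key in list(consent):
        if key[1] == controller:
            consent[key] = False

record_opt_in("Specific purpose 1", "Controller A", True)
record_opt_in("Specific purpose 1", "Controller B", True)
withdraw("Controller B")  # withdrawal touches only Controller B

assert consent[("Specific purpose 1", "Controller A")] is True
assert consent[("Specific purpose 1", "Controller B")] is False
```

Anything absent from the mapping is simply unconsented, which matches the rule that silence cannot constitute consent.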

There are two different scenarios for how consent for these purposes will be presented: the best case, and the more likely worst case.

The best scenario

At a minimum, then, assuming that all websites, SSPs, ad exchanges, DSPs, DMPs, and advertisers could align to pursue only these purposes, a consent request would include granular opt-in controls for a wide range of diverse purposes, the categories of processor pursuing each, and a very long list of the controllers pursuing each.
The language and presentation of the request must be simple and clear, ideally the result of user testing.[17]
A consent request for a single purpose, on behalf of many controllers, might look like this.

Specific processing purpose consent, for multiple controllers,
with “next” button for multiple processing purpose opt-ins

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

What is presented when?

The Article 29 Working Party suggests that consent notices should have layers of information so that they do not overload viewers with information, but make necessary details easily available.[18] This is adopted in the design above using “View details”, “Learn about your data rights here”, and similar buttons and links.
When a user clicks “view details” to see the next layer of information about a controller

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

While some details, such as contact details for a company’s data protection officer, can be placed in a secondary layer, the primary layer must include “all basic details of the controller and the data processing activities envisaged”.[19]
Elements presented in this layer

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

The likely scenario

The scenario above assumes that all businesses in online behavioral advertising can agree to pursue tightly defined purposes without deviation. However, it is more likely that controllers will need granular opt-ins, because their purposes are unique.
Any individual controllers who intend to process data for their own unique purposes will need further granular opt-ins for these purposes. Since adtech companies tend to deviate from the common purposes outlined above, it is likely that most or all of them would ultimately require granular purpose consent for each controller.
However, even if all controllers pursued an identical set of purposes so that they could all receive consent via a single consent dialogue that contained a series of opt-ins, there would need to be a granular set of consent withdrawal controls that covered every single controller once consent had been given. The GDPR says that “the data subject may exercise his or her rights under this Regulation in respect of and against each of the controllers”.[20]

A higher bar: “explicit consent”

Processing of personal data in online behavioral advertising (for example, purposes 2, 3, 5, 8, and 10 in the table above) is highly likely to produce special categories of data by inference.[21] Where this occurs, these purposes require “explicit” consent.[22]
Special categories of data reveal “racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, … [and] data concerning health or data concerning a natural person’s sex life or sexual orientation”.[23] 
To make consent explicit requires more confirmation. For example, the Article 29 Working Party suggests that two-stage verification is a suitable means of obtaining explicit consent.[24] One possible approach to this is suggested in PageFair’s design below.
Suggested mechanism for “explicit consent” 

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

One can confirm one’s opt-in in a second movement of the finger, or cursor and click. It is unlikely that a person could confirm using this interface unless it was their intention.  
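The logic of this two-stage flow can be sketched as a small state machine: consent is only recorded after two distinct affirmative actions, so a single accidental click cannot grant it. This is an illustrative sketch of the technique, not PageFair’s actual implementation; all names are hypothetical.

```python
# Hedged sketch of a two-stage ("explicit consent") confirmation flow:
# consent only takes effect after two separate affirmative actions.
class ExplicitConsent:
    def __init__(self):
        self.first_action = False   # stage 1: initial toggle or click
        self.confirmed = False      # stage 2: deliberate confirmation

    def opt_in(self):
        """Stage 1: the initial affirmative action."""
        self.first_action = True

    def confirm(self):
        """Stage 2: only valid after stage 1 has occurred."""
        if self.first_action:
            self.confirmed = True

c = ExplicitConsent()
c.confirm()                  # confirming without stage 1 does nothing
assert c.confirmed is False
c.opt_in()
c.confirm()                  # both actions -> explicit consent recorded
assert c.confirmed is True
```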

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

Note that even this high bar, however, may not be permitted in some Member States. The GDPR gives European Member States the latitude to enact national legislation that prohibits consent as a legal basis for processing of special categories of data.[25] Therefore, it may not be legal to process any special categories of personal data in some EU Member States.

Conclusion 

Consent for website and app publishers is certainly an important objective, but the personal data it provides must only be processed after data leakage has been stopped. Data leakage (through RTB bid requests, cookie syncs, JavaScript ad units, and mobile SDKs) exposes publishers as the most obviously culpable parties that regulators and privacy NGOs can target. It also exposes their adtech vendors, and advertisers, to large fines and legal actions.[26]
Websites, apps, and adtech vendors, should switch from using personal data to monetize direct and RTB advertising to “non-personal data”.[27] Using non-personal, rather than personal, data neutralizes the risks of the GDPR for advertisers, publishers, and adtech vendors. And it enables them to address the majority (80%-97%) of the audience that will not give consent for 3rd party tracking across the web.[28]
We recently revealed PageFair Perimeter, a regulatory firewall that blocks 3rd party data leakage, and enables publishers and adtech partners to use non-personal data for direct and RTB monetization when consent is absent (and to leverage personal data when adequate consent has been given). You can learn more about Perimeter here. Publishers using Perimeter do not need people’s personal data (nor the consent required to process it) to monetize websites and apps.

[x_button shape=”rounded” size=”regular” float=”none” href=”http://pagefair.com/perimeter/” info=”none” info_place=”top” info_trigger=”hover”]Learn about Perimeter[/x_button]

Postscript

A hiccup in the choreography of the European Commission’s legislative proposals means that non-tracking cookies will need storage consent, at least until the application of the forthcoming ePrivacy Regulation. These cookies, however, contain no personal data, and obtaining consent for their storage is significantly less burdensome than obtaining consent to process personal data for multiple purposes and multiple controllers. Update, 16 January 2018: see the PageFair Insider note on storage consent for non-tracking cookies.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]


Notes:

[1] See our discussion of why consent is the appropriate legal basis for online behavioral advertising in “Why the GDPR ‘legitimate interest’ provision will not save you” , PageFair Insider, 13 March 2017 (URL: https://pagefair.com/blog/2017/gdpr-legitimate-interest/).
[2] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 11.
[3] ibid., p. 13.
[4] ibid., p. 11.
[5] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Recital 32. “…Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. …”
[6] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 12.
Curiously, the Spanish Data Protection Authority has issued guidance that contains a sentence suggesting that continuing to browse a website might constitute consent, which is at odds with the Article 29 Working Party guidance on consent and appears to be entirely at odds with the text of the Regulation. See “Guía del Reglamento General de Protección de Datos para responsables de tratamiento”, Agencia Española de Protección de Datos, November 2017, p. 6.
[7] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[8] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 12.
[9] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[10] None of these purposes would be permissible unless data leakage were first addressed. See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/). Furthermore,

  • Purpose 3 could not be permissible in any situation.
  • Purposes 2, 3, 5, 8, and 10 are highly likely to produce special categories of data by inference. See discussion of “explicit consent” in this note.
  • Regarding the purposes for which data have been sold, and to what category of customer, see “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp 39-40, and B3-B

[11] The GDPR, Recital 42.
[12] The GDPR, Article 26, paragraph 1.
[13] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[14] The GDPR, Article 4, paragraph 11.
[15] ibid., Recital 32.
[16] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 16.
[17] “Guidelines on transparency under Regulation 2016/679”, Article 29 Working Party, November 2017, pp. 8, 13.
[18] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[19] ibid., p. 15.
[20] The GDPR, Article 26, paragraph 3.
[21] “Informing data subjects is particularly important in the case of inferences about sensitive preferences and characteristics. The controller should make the data subject aware that not only do they process (non-special category) personal data collected from the data subject or other sources but also that they derive from such data other (and special) categories of personal data relating to them.” See “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679”, Article 29 Working Party, 3 October 2017, p. 22.
[22] The GDPR, Article 9, paragraph 2, a.
[23] ibid., Article 9, paragraph 1.
[24] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 19.
[25] The GDPR, Article 9, paragraph 2, a.
[26] See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/).
[27] Non-personal data are any data that cannot be related to an identifiable person. As Recital 26 of the GDPR observes, “the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable”. This recital reflects the finding of the European Court of Justice in 2016 that data are not personal “if the identification of the data subject was prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and manpower, so that the risk of identification appears in reality to be insignificant”. Judgment of the Court (Second Chamber), Patrick Breyer v Bundesrepublik Deutschland, Case C-582/14, 19 October 2016.
Non-tracking cookies, which contain no personal data, are useful for privacy-friendly advertising, and for other functions where an individual does not need to be identified such as A/B testing.
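As a deliberately minimal sketch of this idea (the names `assign_variant` and `ab_test` are our own illustration), an A/B-testing cookie need only store a bucket label shared by many visitors, rather than a unique identifier, so no individual can be singled out from it:

```python
# Illustrative sketch only: an A/B-testing cookie that carries no
# personal data. It stores a shared bucket label rather than a unique
# identifier, so the visitor cannot be singled out from its value.
import random

def assign_variant() -> str:
    """Randomly place a new visitor into bucket A or B."""
    return random.choice(["A", "B"])

cookie_value = f"ab_test=variant-{assign_variant()}"
# The value is one of two labels shared by roughly half of all visitors.
```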
[28] See “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
The granularity of consent required for online behavioral advertising will make the consenting audience even smaller. Moreover, consent for adtech will not only be hard to get, it will also be easy to lose. Consent can be withdrawn with the same degree of ease as it was given, under The GDPR, Article 7, paragraph 3.
The Article 29 Working Party demonstrates what this means in practice: “When consent is obtained … through only one mouse-click, swipe, or keystroke, data subjects must … be able to withdraw that consent equally as easily”.
“Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 21.
The guidance also says that “Where consent is obtained through use of a service specific user interface (for example, via a website, an app, a log-on account, the interface of an IoT device or by e-mail), there is no doubt a data subject must be able to withdraw consent via the same electronic interface, as switching to another interface for the sole reason of withdrawing consent would require undue effort”.

The regulatory firewall for online media and adtech

This note announces Perimeter, a regulatory firewall to enable online advertising under the GDPR. It fixes data leakage from adtech and allows publishers to monetize RTB and direct ads, while respecting people’s data. 
PageFair takes a strict interpretation of the GDPR. To comply, all media owners need to protect their visitors’ personal data, or else find themselves liable for significant fines and court actions. In European law, personal data includes not only personally identifiable information (PII), but also visitor IP addresses, unique IDs, and browsing history.[1] The problem is that today’s online ads operate by actively disseminating this kind of personal data to countless 3rd parties via header bidding, RTB bid requests, tracking pixels, cookie syncs, mobile SDKs, and JavaScript in ad creatives. This exposes everyone, from the publisher to the advertiser, to potential fines, litigation and brand damage.[2]
Perimeter fixes this. It enables publishers to securely protect and control the tracking activities of the entire advertising supply chain in their websites and apps, by strictly blocking all third parties unless both publisher and data subject have given their consent.

[x_button shape=”rounded” size=”large” float=”none” href=”https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter[/x_button]

Revenue with or without tracking and consent

Publishers using Perimeter do not need people’s personal data (nor the consent required to process it) to monetize websites and apps. This is critically important, because only a small minority of people online are expected to consent to third party tracking for online advertising.[3]
Even without personal data, Perimeter enables interoperation with GDPR-compliant adtech vendors so that frequency capping, attribution, impression counting, click counting, view-through counting, conversion counting, and fraud mitigation all work without personal data. The list of compliant adtech vendors that PageFair works with to do this is growing.
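To illustrate the principle behind one of these features (a toy sketch, not PageFair’s actual mechanism, with hypothetical names `should_show` and `FREQUENCY_CAP`), frequency capping can be enforced with a counter that lives entirely in the visitor’s browser. The stored value is a per-campaign count with no user identifier, so no personal data is needed:

```python
# Toy sketch (not PageFair's actual mechanism): a frequency cap enforced
# with a counter held entirely client-side. The stored value is a
# per-campaign impression count with no user identifier attached.

FREQUENCY_CAP = 3  # maximum impressions per campaign

def should_show(cookie: dict, campaign_id: str) -> bool:
    """Decide whether to serve an ad, updating the local counter."""
    shown = cookie.get(campaign_id, 0)
    if shown >= FREQUENCY_CAP:
        return False
    cookie[campaign_id] = shown + 1
    return True

browser_state = {}  # what a non-identifying cookie would hold
decisions = [should_show(browser_state, "campaign-42") for _ in range(5)]
# The first three calls serve the ad; subsequent calls are capped.
```

Because the counter never leaves the device and names no individual, the cap works even for the majority of visitors who decline tracking consent.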
Perimeter will also re-enable audience targeting by using non-personal segments that can also interoperate with consent-bearing DMPs.
When adequate consent is present, publishers, adtech vendors and advertisers can use personal data, and Perimeter will interoperate with other compliant consent management platforms. Indeed, Perimeter is a necessary partner to make consent meaningful.[4] Adtech vendors are eager for publishers to collect consent on behalf of 3rd parties – but publishers must simultaneously block all parties who do not have consent, or else remain exposed to liabilities.

Take Control of Data Leakage in all your Digital Properties

Perimeter brings privacy and data protection to the RTB/programmatic advertising ecosystem in the following ways:

  • Automatic removal of data leaking scripts from ads before they are rendered.
  • Prevention of unauthorized 3rd-party access to personal data.
  • Enforcement of data protection in RTB bid requests.
  • Enforcement of data protection in mobile SDKs.
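The first of these points can be illustrated with a toy filter. The sketch below (our hypothetical `strip_unauthorized_scripts` with an invented allowlist, not Perimeter’s actual implementation) removes any `<script>` tag from an ad creative whose host is not publisher-approved, before the creative would be rendered:

```python
# Toy illustration of removing data-leaking scripts from ad markup.
# This is NOT Perimeter's actual implementation; the allowlist and
# domains are invented for the example.
import re
from urllib.parse import urlparse

ALLOWED_DOMAINS = {"cdn.trusted-adserver.example"}  # hypothetical allowlist

SCRIPT_TAG = re.compile(
    r'<script[^>]*\bsrc=["\']([^"\']+)["\'][^>]*>.*?</script>',
    re.IGNORECASE | re.DOTALL,
)

def strip_unauthorized_scripts(creative_html: str) -> str:
    """Remove external scripts whose host is not on the allowlist."""
    def keep_or_drop(match: re.Match) -> str:
        host = urlparse(match.group(1)).hostname or ""
        return match.group(0) if host in ALLOWED_DOMAINS else ""
    return SCRIPT_TAG.sub(keep_or_drop, creative_html)

ad = (
    '<div class="ad">'
    '<script src="https://cdn.trusted-adserver.example/render.js"></script>'
    '<script src="https://tracker.dataleak.example/pixel.js"></script>'
    '<img src="banner.png"></div>'
)
cleaned = strip_unauthorized_scripts(ad)
# The allowlisted script survives; the data-leaking tracker is removed.
```

A production system would parse the markup properly rather than use a regular expression, but the design choice is the same: block every third party by default, and permit only those both publisher and data subject have approved.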

[x_button shape=”rounded” size=”large” float=”none” href=”https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter[/x_button]

Four Components

Perimeter provides four components that protect website and app publishers, and all of their advertising partners.

  1. Server side ad rendering
    Controls data in bid requests and ad creatives
  2. Policy Manager
    Empowers publishers to decide which 3rd parties are permitted to run in their websites and apps.
  3. User Consent Manager
    Enables granular consent to be obtained, communicated, and withdrawn by users.
  4. Privacy-by-design adtech interoperation
    Perimeter is partnering with other adtech vendors who are innovating to be compatible with strict enforcement of privacy regulations. This re-enables all core campaign management, measurement and targeting features without depending on legally toxic tracking IDs or other personal data.
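To illustrate what “granular” consent in component 3 means in practice (a hypothetical data model with our own invented `ConsentStore`, not the actual User Consent Manager), each grant can be recorded per purpose and per vendor, and withdrawal must be as easy as the original grant:

```python
# Hypothetical data model for granular consent: one record per
# (purpose, vendor) pair, withdrawable individually. This is an
# illustration of granularity, not the actual User Consent Manager.

class ConsentStore:
    def __init__(self) -> None:
        self._grants: set = set()  # {(purpose, vendor)}

    def grant(self, purpose: str, vendor: str) -> None:
        self._grants.add((purpose, vendor))

    def withdraw(self, purpose: str, vendor: str) -> None:
        # Withdrawal is a single call, mirroring the single-click grant.
        self._grants.discard((purpose, vendor))

    def has_consent(self, purpose: str, vendor: str) -> bool:
        return (purpose, vendor) in self._grants

store = ConsentStore()
store.grant("ad-measurement", "vendor-a.example")
store.grant("personalisation", "vendor-a.example")
store.withdraw("personalisation", "vendor-a.example")
# Only the measurement grant for vendor-a remains.
```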

[x_alert heading=”Resource to check your adtech vendors’ compliance ” type=”success”]Here is a resource for publishers to check whether your adtech vendors are compliant.[/x_alert]

Ethical data

We built Perimeter to enable websites and apps to transition from the old adtech industry to a more ethical one. This is why we are openly sharing the measurement and capping techniques for non-personal data adtech.
Perimeter is the result of 24 months of intensive technical and policy research and development, and combines the feedback of many app developers, advertisers, adtech vendors, privacy NGOs, regulators, and lawmakers.
It enables publishers, and their advertising partners, to operate within a clean and ethical data/media industry.

[x_button shape=”rounded” size=”large” float=”none” href=”https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter[/x_button]

Are you an ad tech vendor?

Let us know if you would like to find out more about GDPR compliance requirements and getting whitelisted on the Perimeter partner program. You can read about non-personal data methods here.
 
[x_line]

Notes

[1] See the definition of personal data in Article 4, (1), Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
[2] ibid., Article 4, paragraph 2.
[3] See “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7 and “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017.
[4] Because without Perimeter’s mitigation of data leakage, consent has no value. See “Consent to use personal data has no value unless one prevents all data leakage”, October 2017, PageFair Insider (URL: https://pagefair.com/blog/2017/understanding-data-leakage/).

Can websites use "tracking walls" to force consent under GDPR?

This note examines whether websites can use “tracking walls” under the GDPR, and challenges the recent guidance on this issue from IAB Europe. 
This week, IAB Europe published a paper that advises website owners that tracking walls (i.e., modal dialogs that require people to give consent to be tracked in order to access a website) will be permissible under the GDPR. Our view is different.
Several months ago we provided feedback to IAB Europe on what we regarded as serious mistakes in a preliminary draft of this paper, which we believe will be very detrimental to publishers who follow the paper’s advice. As it appears that our feedback did not make it into the published version of the paper, we want to put our opinion on the record, so that publishers can take it into account when deciding what course to follow under the GDPR.
We provide an analysis below, and have published our original feedback to the IAB here, for those who want to dig into it.
The GDPR forbids tracking walls.[1] This prohibition may seem curious to adtech colleagues working outside the European Union, who may view personal data as a valid payment for online content and services. It must be borne in mind that many of Europe’s nations have strong historical motivations for protecting privacy, and have protected the right to privacy and the right to protection of one’s data as fundamental rights in the European Charter.[2] To understand how European regulators have viewed these rights in the context of tracking walls, consider the following, from the European Data Protection Supervisor:

“There might well be a market for personal data, just like there is, tragically, a market for live human organs, but that does not mean that we can or should give that market the blessing of legislation. One cannot monetise and subject a fundamental right to a simple commercial transaction, even if it is the individual concerned by the data who is a party to the transaction.”[3]

We believe that publishers who implement tracking walls on their websites could shoulder significant risk of fines and legal action arising from the adtech companies that track users on their websites. As we show below, the defenses set forth in the IAB Europe paper are unlikely to convince a judge when the first publisher is sued for breaching the Regulation.
To be clear, we do believe that freely-given consent can help monetise a loyal minority of a publisher’s audience. But, to monetise the majority for whom personal data will not be available,[4] we must join together to build ads that work without personal data.[5] PageFair is partnering with publishers and adtech companies who share a commitment to building a safe adtech stack that is compliant with a strict interpretation of the regulations. This safe adtech can monetise the majority of the audience who will not freely consent to hundreds of 3rd party technology vendors, and interoperate with consent wherever it is available.

Errors in the IAB Europe paper

The IAB Europe paper advises websites that:

“Private companies are allowed to make access to their services conditional upon the consent of data subjects. The GDPR provides that account has to be taken of this when determining whether consent has been freely given, but does not prohibit the practice. Moreover, the ePrivacy Directive similarly explains that services may be made conditional on consent.”[6]

The following section details serious errors in this guidance. Here is a summary: the paper misreads Article 95 in the GDPR to mean that websites can ignore the GDPR’s prohibition on tracking walls, and that they can instead rely on a narrow allowance provided for in Recital 25 of the ePrivacy Directive. In a further misreading, the paper mistakenly suggests that Recital 25’s allowance can be applied to all website content. The problems with this are outlined below.
What the GDPR Article 95 says
The IAB Europe paper refers to Article 95 of the GDPR to say that “the ePrivacy Directive’s more specific rules prevail over the rules of the GDPR”. There are two important mistakes in this sentence. First, the actual text of the Article is:

“This Regulation shall not impose additional obligations on natural or legal persons in relation to processing in connection with the provision of publicly available electronic communications services in public communication networks in the Union in relation to matters for which they are subject to specific obligations with the same objective set out in Directive 2002/58/EC.”[7]

The paper mistakenly reads this to mean that website owners can ignore the GDPR and refer instead to the ePrivacy Directive’s narrow allowance for tracking walls. This is wrong for two reasons.
First, Article 95 does not cover websites. Rather, it covers “electronic communications services”, which are defined in European telecommunications law as transmission services, not content. (In fact, the definition of electronic communications services explicitly excludes services “providing, or exercising editorial control over, content” such as websites).[8]
Second, the paper mistakenly suggests that Article 95 is applicable to Recital 25 in the ePrivacy Directive. As the next section shows, this is important because the paper mistakenly claims that Recital 25 of the ePrivacy Directive permits tracking walls. But Article 95 of the GDPR would only apply to Recital 25 of the ePrivacy Directive if “specific obligations” were defined in Recital 25 that the GDPR was now adding additional obligations to. This is not the case: Recital 25 does not impose obligations. In fact, it provides narrow allowances, which is quite the opposite.
What the ePrivacy Directive Recital 25 says
The paper makes several incorrect assumptions about Recital 25 in the ePrivacy Directive. It cites part of a sentence from Recital 25 to suggest that tracking walls are permissible for all websites:

“website content may still be made conditional on the well-informed acceptance of cookies”.

However, the complete sentence has a different meaning. Here is the full sentence:

“Access to specific website content[9] may still be made conditional on the well-informed acceptance of a cookie or similar device, if it is used for a legitimate purpose.”[10]

The complete sentence includes two important concepts that the paper does not address: “specific website content” and “legitimate purpose”.
This reference to “specific website content” in Recital 25, as European data protection authorities noted in 2013, means that “websites should not make conditional ‘general access’ to the site on acceptance of all cookies but can only limit certain content if the user does not consent to cookies”.[11]
Furthermore, limiting access to specific content is permissible only for a “legitimate purpose”. As Recital 25 notes, this relates to purposes such as to “facilitate the provision of information society services”. The term “information society services” is defined in European Law to mean services explicitly requested by users.[12] Clearly, ads that require tracking are not the service that the user has requested.

Conclusion

To summarise, we believe the paper currently misreads Article 95 in the GDPR, and incorrectly assumes that this article is applicable to Recital 25 of the ePrivacy Directive, which the paper then mistakenly concludes can be applied to all website content.
We suggest no bad faith on the part of IAB Europe, or on the part of the adtech companies that led its drafting process. Nevertheless, we fear that website owners may expose themselves to risk as a result of following the guidance in this paper.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] See for example Recital 43, Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). “…Consent is presumed not to be freely given if it does not allow separate consent to be given to different personal data processing operations despite it being appropriate in the individual case, or if the performance of a contract, including the provision of a service, is dependent on the consent despite such consent not being necessary for such performance”. See also Recital 32 and 42.
[2] Article 7 and Article 8 of the Charter of Fundamental Rights of The European Union.
[3] Opinion 4/2017 on the Proposal for a Directive on certain aspects concerning contracts for the supply of digital content, European Data Protection Supervisor, 14 March 2017 (URL: https://edps.europa.eu/sites/edp/files/publication/17-03-14_opinion_digital_content_en.pdf).
[4] See “Europe Online: an experience driven by advertising”, GFK, 2017 (URL: https://www.iabeurope.eu/wp-content/uploads/2017/09/EuropeOnline_FINAL.pdf), p. 7 and “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[5] See for example “Frequency capping and ad campaign measurement under GDPR”, PageFair Insider, 7 November 2017 (URL: https://pagefair.com/blog/2017/gdpr-measurement1/).
[6] “Consent, Working Paper 03/2017”, IAB Europe, 28 November 2017, p. 4 (URL: https://www.iabeurope.eu/wp-content/uploads/2017/11/20171128-Working_Paper03_Consent.pdf).
[7] Article 95, General Data Protection Regulation.
[8] “Electronic communications service means a service normally provided for remuneration which consists wholly or mainly in the conveyance of signals on electronic communications networks, including telecommunications services and transmission services in networks used for broadcasting, but exclude services providing, or exercising editorial control over, content transmitted using electronic communications networks and services; it does not include information society services, as defined in Article 1 of Directive 98/34/EC, which do not consist wholly or mainly in the conveyance of signals on electronic communications networks”. Article 2, paragraph c, of Directive 2002/21/EC of The European Parliament and of The Council of 7 March 2002 on a common regulatory framework for electronic communications networks and services (Framework Directive).
[9] As the Article 29 Working Party’s Opinion of 2013 notes: “The emphasis on “specific website content” clarifies that websites should not make conditional “general access” to the site on acceptance of all cookies but can only limit certain content if the user does not consent to cookies (e.g.: for e-commerce websites, whose main purpose is to sell products, not accepting (non-functional) cookies should not prevent a user from buying products on this website).” Working Document 02/2013 providing guidance on obtaining consent for cookies, Article 29 Working Party, (URL: http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp208_en.pdf), p. 5.
[10] Recital 25, Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications).
[11] Working Document 02/2013 providing guidance on obtaining consent for cookies, Article 29 Working Party, (URL: http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp208_en.pdf), p. 5.
[12] “..any Information Society service, that is to say, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services. For the purposes of this definition: …  “at the individual request of a recipient of services” means that the service is provided through the transmission of data on individual request.” Article 1, paragraph 2 of Directive 98/48/EC of The European Parliament and of The Council of 20 July 1998 amending directive 98/34/EC laying down a procedure for the provision of information in the field of technical standards and regulations.

Research result: what percentage will consent to tracking for advertising?

This note presents the results of a survey of more than 300 publishers, adtech companies, brands, and others, on whether users will consent to tracking under the GDPR and the ePrivacy Regulation.
In early August we published a note on consent, and asked whether people would click “yes”. We would like to thank the 300+ colleagues who responded to our research request. Now we present the results.
Update (9 January 2018): see the most recent PageFair Insider note on GDPR consent dialogues, published on 8 January 2018.

Tracking for a single brand, on a single site.

305 respondents were shown a notice in which a publisher asked them to permit a named brand and its analytics partners to track them on the site. A previous note explains the design of this notice.

It is important to note that this is a limited consent notice. It asks to track behaviour on one site only, and for one brand only, in addition to “analytics partners”. This notice would not satisfy regulators if it were used to cover the vast chain of controllers and processors involved in conventional behavioural targeting.
Even so, four fifths (79%) of respondents said they would click “No” to this limited consent request.

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]
Only 21% said they would click “OK”. Moreover, as the chart below shows, only 14% were “very highly” or “highly” confident that the average user would also do so.
Respondents are concerned about how their own behaviour online is tracked for advertising, and would avail themselves of the measures proposed in the ePrivacy Regulation to protect themselves. Two thirds (67%) of respondents reported being “very highly” or “highly” concerned about their online behaviour being tracked (and half of these said they were “very highly” concerned).

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]

Device tracking preferences

Respondents were shown a tracking preferences menu of the kind proposed in the latest draft of the ePrivacy Regulation.[1] They were asked what they would select if shown the message on their own device.
The menu shows “Accept only first party tracking” selected by default, as proposed in the European Parliament rapporteur’s draft report.[2] However, only 20% of respondents said they would select this.
Only 5% were willing to “accept all tracking”. 56% said they would select “Reject tracking unless strictly necessary for services I request”.
The very large majority (81%) of respondents said they would not consent to having their behaviour tracked by companies other than the website they are visiting. Users’ apparent allowance for 1st parties, but objection to 3rd parties, should be heartening for publishers.

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]

The consenting audience will be tiny

Only a very small proportion (3%) believe that the average user will consent to “web-wide” tracking for the purposes of advertising (tracking by any party, anywhere on the web).
However, almost a third believe that users will consent if forced to do so by “tracking walls”, which deny access to a website unless a visitor agrees to be tracked. Tracking walls, however, are prohibited under Article 7(4) of the GDPR, the rules of which are already formalised and will apply in law from late May 2018.[3]

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]

Publishers see opportunity in adtech’s need to seek consent

If adtech companies persist in using old-school personal data, rather than transition to safer non-personal data technologies, then they are likely to have to rely on publishers to facilitate consent requests to their users. A majority of publishers viewed this as a potential commercial opportunity: 27% said “yes”, and a further 34% said “maybe”, when asked if they saw a potential commercial opportunity in their ability to seek consent from data subjects on behalf of adtech companies that have no direct relationship with them.

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2017/09/PAGEFAIR-consent-survey-charts.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download PDF of all charts (high resolution) [/x_button]
Most adtech colleagues (74%) also anticipate that they may have to compensate publishers for the opportunity to seek consent from their visitors. (Caveat: the sample included 19 adtech respondents, of the 305 total).


Safer ads, and safer data, rather than consent.

These results are not definitive, and we have not undertaken a full-scale research project in the production of this note. They do, however, seem to reflect reality. We take heart from the publication of a survey of some 11,000 respondents by GFK, commissioned by trade groups as supporting collateral in their lobbying against measures in the ePrivacy Regulation. The GFK study notes that only “20% would be happy for their data to be shared with third parties for advertising purposes”.[4] This is a remarkable conclusion for a study that argues for, rather than against, old-school behavioural targeting, because it shows that few will opt in. The parties involved in the study should be commended for including it, and not burying it. We do note, however, that the finding was not given the headline status that it warranted.
The results presented in this note should trouble any industry colleagues who plan to tackle GDPR and the ePrivacy Regulation by seeking consent, as a means to process personal data largely in the same way as usual. It appears that consent may not be forthcoming.
Adtech must rapidly transition from using old-school personal data to safer non-personal data technologies. We can draw inspiration from the automobile industry, which is transitioning from high pollution combustion technology to electric in response to regulatory pressure.
Safer data and safer advertising can enable programmatic buying, and sophisticated targeting, without requiring consent. The market parallel with the auto industry should be heartening for publishers, since if all petrol and diesel engines were to be outlawed in 2018 without some special dispensation, then demand for – and prices of – hybrid and electric vehicles would rise.

Read next:

Implications for Google and Facebook

[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] Rapporteur’s draft report on the proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC, June 2017.
[2] ibid., Recital 23. 
[3] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.
See Recital 42’s reference to “without detriment”, and Recital 43’s discussion of “freely given” consent, and Article 7(2) prohibition of conditionality. See also the UK Information Commissioner’s Office’s draft guidance on consent, 31 March 2017, p. 21, which explicitly prohibits “tracking walls”.
[4] “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7. (URL: https://www.iabeurope.eu/wp-content/uploads/2017/09/EuropeOnline_FINAL.pdf).

How the GDPR will disrupt Google and Facebook

Google and Facebook will be disrupted by the new European data protection rules that are due to apply in May 2018. This note explains how. 
Google and Facebook will be unable to use the personal data they hold for advertising purposes without user permission. This is an acute challenge because, contrary to what some commentators have assumed, they cannot use a “service-wide” opt-in for everything. Nor can they deny access to their services to users who refuse to opt-in to tracking.[1] Some parts of their businesses are likely to be disrupted more than others.

The GDPR Scale

When one uses Google or Facebook.com one willingly discloses personal data. These businesses have the right to process these data to provide their services when one asks them to. However, the application of the GDPR will prevent them from using these personal data for any further purpose unless the user permits. The GDPR applies the principle of “purpose limitation”, under which personal data must only be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”.[2]
Google and Facebook cannot confront their users with broad, non-specific, consent requests that cover the entire breadth of their activities. Data protection regulators across the EU have made clear what they expect:

“A purpose that is vague or general, such as for instance ‘Improving users’ experience’, ‘marketing purposes’, or ‘future research’ will – without further detail – usually not meet the criteria of being ‘specific’”.[3]

A business cannot, for example, collect more data for a purpose than it needs and then retroactively ask to use those data for additional purposes.[4]
It will be necessary to ask for consent, or present an opt-out choice, at different times, and for different things. This creates varying levels of risk. We estimate these risks on the “GDPR scale”, shown below.

The scale ranges from zero to five. Five, at the high end of the scale, describes the circumstances of the many adtech companies that have no direct relationship with Internet users. They need the consent of the people whose data they rely on, but they have no channel of communication through which to ask for it.
Four, next highest on the scale, refers to companies that have direct relationships with users, and can use those relationships to ask for consent. However, users have little incentive to “opt-in” to being tracked for advertising. Whereas a user might opt-in to some form of profiling that comes with tangible benefits, such as a loyalty scheme, the same user might not be willing to opt-in to more extensive profiling that yields no benefit. The extensiveness of the profiling matters because, as the note at the bottom of this page shows, users will be made aware of the uses of their data when consent is sought. Thus adtech tracking across the web might rank as four, while a loyalty scheme might rank as three on the GDPR scale.
A slightly more attractive prospect, from Google and Facebook’s perspective, is to inform a user about what they want to do with the personal data, and give the user a chance to “opt-out” beforehand.[5] This is two on the scale. This opt-out approach has the benefit – from the company’s perspective – that some users’ inaction may allow their data to be used. The GDPR permits the opt-out approach when the purposes that the companies want to use the data for are “compatible” with the original purpose for which personal data were shared by users.[6] In addition to the opt-out notice, users also have to be told of their right to object at any time to the use of their data for direct marketing.[7]
One on the scale refers to activities that currently involve the processing of personal data, but that do not need to do so. With modification, these activities could be put beyond the scope of the Regulation.
Activities at the zero end of the scale are outside the scope of the Regulation, because they use no personal data.
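The scale described above can be reduced to a simple lookup. The following sketch is purely illustrative — the scores and descriptions are this note’s own estimates, and the `requires_opt_in` rule is our shorthand for the analysis above, not a legal test:

```python
# Illustrative sketch of the "GDPR scale" described above.
# Scores and descriptions are the note's own estimates, not legal categories.
GDPR_SCALE = {
    5: "No direct user relationship; consent needed but no channel to ask for it",
    4: "Direct relationship; opt-in needed, but users have little incentive",
    3: "Direct relationship; opt-in needed, with a tangible benefit (e.g. loyalty scheme)",
    2: "Compatible purpose; an opt-out notice may suffice",
    1: "Currently uses personal data, but could be modified to use none",
    0: "No personal data processed; outside the scope of the GDPR",
}

def requires_opt_in(score: int) -> bool:
    """Scores 3-5 need freely given, specific opt-in consent;
    2 may rely on an opt-out notice; 0-1 need neither."""
    return score >= 3

# Examples from the text: cross-web adtech tracking vs. a compatible-purpose opt-out.
assert requires_opt_in(4)
assert not requires_opt_in(2)
```

The point of the sketch is that the decision hinges on where an activity sits on the scale: only a “compatible” further purpose (two) escapes the opt-in requirement, and only modified or contextual activities (one and zero) escape the Regulation altogether.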

Google

Our estimate of where Google’s products fall on this scale shows a significant range of products at four, with the proviso that some of those products could be modified, which would lower their score from four to one. (Download the chart as a PDF: https://pagefair.com/wp-content/uploads/2017/08/GDPR-scale-Google-Facebook-2.pdf)

All personalized[8] advertising on Google sites such as Search, YouTube, Maps, and the websites where Google provides advertising is scored four because it will require that users opt-in to extensive tracking.
If, however, users have already “signed in” to Google Search or Chrome, Google may argue that the purpose of these technologies is “compatible” with purposes users agreed to, and hope to use an opt-out rather than an opt-in. Whether this would be successful, however, remains to be seen.
The technologies that will be affected include:

  • Certain targeting features of AdWords such as “remarketing”,[9] “affinity audiences”,[10] “custom affinity audiences”,[11] “in-market audiences”,[12] “similar audiences”,[13] “demographic targeting”,[14] “Floodlight” cross-device tracking.[15]
  • “Customer Match”, which targets users and similar users based on personal data contributed by an advertiser.[16] A prospect would have had to give their consent to the advertiser for this to occur.
  • “Remarketing lists for search ads (RLSA)”, which retargets site visitors using Google Analytics, is likely to be prevented by the ePR.[20]

Gmail, the most popular e-mail service in the world, will also be affected. Google mines the content and metadata of each email message sent and received in Gmail to target advertising. This could not continue under the GDPR and ePR without each sender and recipient giving their consent. Clearly, few would do so, and Gmail is therefore at four on the scale. This may be the real reason, or at least a contributing one, why Google recently announced that it will stop mining people’s emails for ads.[21]
In addition, the “programmatic” advertising services that Google provides to advertisers and publishers under its DoubleClick business will be affected. Operating these under the GDPR would require not only that a user consent to Google’s use of data for advertising targeting purposes, but also to the processing of these data by the many other companies involved, such as DMPs (data management platforms) and DSPs (demand side platforms). The DoubleClick business is therefore at four on the scale.
At two on the scale are “location targeting”[22] and “location extensions”, technologies in Google Maps that enable advertising to target users based on geographical proximity. This score, however, is based on the assumption that advertising in map search results is accepted as a purpose compatible with the original purpose for which location data were shared by users.
Google’s AdWords product has the benefit that it can be modified to operate entirely outside the scope of the GDPR and ePR. This is why it appears at four on the scale, and at one. If Google discards personalized targeting features from AdWords, then it can continue to target advertisements to people based on what they search for.
Finally, at zero on the scale are Google’s “placement-targeted” advertisements.[23] These target only by the context of the pages they appear on, rather than by using personal data, and are therefore outside the scope of the GDPR.
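The contrast between placement targeting (zero on the scale) and personalised targeting (four) can be made concrete with two hypothetical ad requests. All field names below are invented for illustration; they are not Google’s actual request format:

```python
# Hypothetical ad requests contrasting contextual/placement targeting
# (zero on the scale) with personalised targeting (four on the scale).
# All field names are invented for illustration.
contextual_request = {
    "page_url": "https://example-news.com/motoring/ev-review",
    "page_topics": ["motoring", "electric vehicles"],
    # No user identifier, no cookie, no cross-site history:
    # with no personal data, the GDPR does not apply.
}

personalised_request = {
    "page_url": "https://example-news.com/motoring/ev-review",
    "user_id": "cookie:8f3a",  # a persistent identifier is personal data
    "interest_segments": ["in-market:suv", "affinity:travel"],
    "browsing_history_30d": ["example-shop.com", "example-travel.com"],
}

def uses_personal_data(request: dict) -> bool:
    """Crude check: does the request carry any user-linked field?"""
    personal_keys = {"user_id", "interest_segments", "browsing_history_30d"}
    return bool(personal_keys & request.keys())

assert not uses_personal_data(contextual_request)
assert uses_personal_data(personalised_request)
```

The design point is that the contextual request is built entirely from the page being served, so removing the user-linked fields is what moves a product from four (or one) down to zero on the scale.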

Facebook

Significant parts of Facebook’s business are at two and four on the scale. (Download the chart as a PDF: https://pagefair.com/wp-content/uploads/2017/08/GDPR-scale-Google-Facebook-2.pdf)
The Facebook Audience Network is scored four because it requires the processing of personal data from Facebook users to target them on other websites. It is unlikely that this will be regarded as a compatible use. If it is, Facebook will have to convince users not to opt-out.
WhatsApp advertising is also scored four on the scale, because users will have to give their consent (an opt-in, rather than an opt-out) before their WhatsApp personal data can be processed, on Facebook properties other than WhatsApp, for purposes unrelated to WhatsApp functionality.[24]
Farther down the scale, at two, is Facebook’s Newsfeed, which may be able to use an opt-out approach to get some users to permit the processing of these personal data.
However, the nature of the content in the Newsfeed may limit the range of data it can process. Any information that reveals a person’s race, ethnicity, political opinion, religious or philosophical beliefs, or trade union membership, or that relates to a person’s sex life or sexual orientation, falls within the “special categories of data”. These data cannot be used without explicit consent, or unless they have been “manifestly made public by the data subject”.[25] Facebook may not be able to mine posts in the Newsfeed that are not marked “public” (or, perhaps, “friends of friends”[26]). It may even be that determining which posts contain “special categories” of data, and which do not, is itself processing that goes too far.
The use of personal data from Instagram for advertising on Instagram may be accepted as a compatible purpose, which would enable Instagram to use an opt-out notice rather than request an opt-in.

Conclusion 

Both Google and Facebook have direct relationships with their users, and well-thought-out designs for their current privacy requests. However, they are not immune to disruption when the new regulations apply; indeed, some parts of their businesses may be particularly susceptible. While they can process the personal data necessary to provide the services their users request, using these data for any other purpose requires user permission (or user inaction, in the case of opt-outs). The critical question for both businesses is whether users will click “yes” when asked to consent.

PageFair Research

We are surveying industry insiders’ views on this question. Your insights may illuminate this issue. The survey (URL: https://docs.google.com/forms/d/e/1FAIpQLSfTiphQfdMtZXpXvhQoeLkmRX6d3HST71Q7KXwmhj3-zhaGxg/viewform?entry.74111772=not_applicable) is designed to take 70 seconds to complete. Thank you for your input – we will share the results.

Notes

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1. See Recital 42’s reference to “without detriment”, Recital 43’s discussion of “freely given” consent, and Article 7(2) prohibition of conditionality. See also the UK Information Commissioner’s Office’s draft guidance on consent, 31 March 2017, p. 21, which clearly prohibits so-called “tracking walls”.
[2] The GDPR, Article 5, paragraph 1, b.
[3] Article 29 Working Party, Opinion 03/2013 on purpose limitation, 2 April 2013, p. 16. This is evident in GDPR, Article 13, paragraph 1, c.
[4] The GDPR, Recital 32 notes that “When the processing has multiple purposes, consent should be given for all of them”. Recital 39 notes that “specific purposes for which personal data are processed should be explicit and legitimate and determined at the time of the collection of the personal data. The personal data should be adequate, relevant and limited to what is necessary for the purposes for which they are processed. This requires, in particular, ensuring that the period for which the personal data are stored is limited to a strict minimum”.
[5] ibid., Recital 61.
[6] ibid., Article 6, paragraph 4, and Recital 50.
The Article 29 Working Party has provided some guidance on how one should determine whether purposes are compatible. Among the issues to consider are “the impact of the further processing on the data subjects”. Article 29 Working Party, Opinion 03/2013 on purpose limitation, 2 April 2013, p. 3.
This may be a challenge for social platforms. Facebook, for example, was the subject of a scandal in May and April 2017 when a document leaked from its Australian business that described its capabilities to identify “moments when young people need a confidence boost”, or feel “worthless” or “insecure”, for marketing purposes. “Facebook targets insecure young people to sell ads”, The Australian, 1 May 2017 (URL: http://www.theaustralian.com.au/business/media/digital/facebook-targets-insecure-young-people-to-sell-ads/news-story/a89949ad016eee7d7a61c3c30c909fa6); see Facebook’s reply of 30 April 2017 (URL: https://newsroom.fb.com/news/h/comments-on-research-and-ad-targeting/).
[7] The GDPR, Article 21, paragraph 2 and 3; see also Recital 70 on the manner in which the user is to be informed of this right.
[8] “Personalized advertising”, Google Advertising Policies Help (URL: https://support.google.com/adwordspolicy/answer/143465?hl=en). Note that even users who are signed out receive personalised search results, as described in Brian Horling and Matthew Kulick, “Personalized Search for everyone”, 4 December 2009, Google Blog (URL: https://googleblog.blogspot.ie/2009/12/personalized-search-for-everyone.html).
[9] “About remarketing lists for search ads”, Google AdWords Help, (URL: https://support.google.com/adwords/answer/2701222?hl=en).
[10] According to Google, this is “based on their specific interests as they browse pages, apps, channels, videos, and content across YouTube and the Google Display Network as well as on YouTube search results”. See “About targeting your ads by audience interests”, Google AdWords Help, (URL:https://support.google.com/adwords/answer/2497941?hl=en).
[11] “About targeting your ads by audience interests”, Google AdWords Help (URL: https://support.google.com/adwords/answer/2497941?hl=en)
[12] “In-Market Audiences”, Think with Google (URL: https://www.thinkwithgoogle.com/products/in-market-audiences/).
[13] “AdWords looks at browsing activity on Display Network sites over the last 30 days, and uses this, along with its contextual engine, to understand the shared interests and characteristics of the people in your remarketing list.” “About similar audiences on the Display Network”, Google AdWords Help (URL: https://support.google.com/adwords/answer/2676774?hl=en).
[14] “When people are signed in from their Google Account, we may use demographics derived from their settings or activity on Google properties, depending on their account status”, “About demographic targeting”, AdWords Help (URL: https://support.google.com/adwords/answer/2580383?co=ADWORDS.IsAWNCustomer%3Dfalse&hl=en).
[15] “About Floodlight”, DoubleClick Digital Marketing Partners Help (URL: https://support.google.com/dcm/partner/answer/4304205?hl=en&ref_topic=4241549).
[16] “About Customer Match”, Google AdWords Help (URL: https://support.google.com/adwords/answer/6379332?hl=en).
[17] “About remarketing lists for search ads”, Google AdWords Help (URL: https://support.google.com/adwords/answer/2701222?hl=en).
[21] “Consumer Gmail content will not be used or scanned for any ads personalization after this change.” Diane Greene, 23 June 2017 (URL: https://www.blog.google/products/gmail/g-suite-gains-traction-in-the-enterprise-g-suites-gmail-and-consumer-gmail-to-more-closely-align/).
[22] “Target customers near an address with location extensions”, Google AdWords Help (URL:https://support.google.com/adwords/answer/2914785?hl=en&ref_topic=3119074).
[23] “Add, edit, and remove managed placements”, Google AdWords Help (URL: https://support.google.com/adwords/answer/2471182).
[24] See the recent correspondence between the Irish regulator and Facebook “Data Protection Commissioner’s Statement on the Frequently Asked Questions published by WhatsApp”, 16 August 2017 (URL: https://www.dataprotection.ie/documents/press/16-08-17_whatapp_DPC_Statement.pdf).
[25] The prohibition is in the GDPR, Article 9. See also Article 6, paragraph 4, c. The exception is Article 9, paragraph 2, e. See also Recital 71.
[26] An average user has 40,000 friends of friends, though the 99th percentile has 800,000. See Lars Backstrom, “People you may know”, 12 July 2010 (URL: www.graphanalysis.org/SIAM-AN10/01_Backstrom.pdf).
 

Businesses will have to provide the following information to internet users when seeking their consent.

  • Who is collecting the data, and how to contact them or their European representative. 
  • What the personal information is being used for, and the legal basis of the data processing.
  • The “legitimate interest” pursued by the user of the data (this refers to a legal basis that may be relied upon by direct marketing companies).
  • With whom the data will be shared.
  • Whether the controller intends to transfer data to a third country, and if so has the European Commission deemed this country’s protections adequate or what alternative safeguards or rules are in place.
  • The duration of storage, or the criteria used to determine duration.
  • That the user has the right to request rectification of mistakes in this personal information.
  • That the user has the right to withdraw consent.
  • How the user can lodge a complaint with the supervisory authority.
  • What the consequences of not giving consent might be.
  • In cases of automated decision-making, including profiling, what the logic of this process is, and what the significance of the outcomes may be.
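The checklist above can be treated as a pre-flight test for a consent notice. A minimal sketch follows; the field names are our own shorthand for the disclosures listed above, not statutory terms:

```python
# Sketch of a pre-flight check that a consent notice covers the
# disclosures listed above. Field names are this note's own shorthand
# for the required information, not statutory terms.
REQUIRED_DISCLOSURES = {
    "controller_identity_and_contact",
    "purposes_and_legal_basis",
    "legitimate_interests_if_relied_on",
    "recipients_of_the_data",
    "third_country_transfers_and_safeguards",
    "storage_duration_or_criteria",
    "right_to_rectification",
    "right_to_withdraw_consent",
    "right_to_complain_to_supervisory_authority",
    "consequences_of_not_consenting",
    "automated_decision_making_logic_and_significance",
}

def missing_disclosures(notice: set) -> set:
    """Return the disclosures a draft consent notice still has to add."""
    return REQUIRED_DISCLOSURES - notice

# A typical draft notice covers only identity and purposes,
# leaving nine of the eleven disclosures outstanding.
draft_notice = {"controller_identity_and_contact", "purposes_and_legal_basis"}
outstanding = missing_disclosures(draft_notice)
assert len(outstanding) == 9
```

A notice that passes this check is merely complete, not lawful: the substance of each disclosure, and the quality of the consent itself, still have to meet the standards discussed earlier in this note.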