Posts

Google adopts non-personal ad targeting for the GDPR

This note examines Google’s recent announcement on the GDPR. Google has sensibly adopted non-personal ad targeting. This is a very significant step forward and signals a change in the online advertising market. But Google has also taken a new and problematic approach to consent for personal data use in advertising that publishers will find hard to accept.

Google decides to use non-personal ad targeting to comply with the GDPR 

Last Thursday Google sent a policy update to business partners across the Internet announcing that it would launch an advertising service based on non-personal data in order to comply with the GDPR.[1]
PageFair has advocated a non-personal approach to advertising for some time, and commends Google for taking this position. As we noted six months ago,[2] Google AdWords, for example, can operate without consent if it discards personalized targeting features (and unique IDs). In this case, advertisers can continue to target advertisements to people based on what they search for.
This may be part of a trend for Google, which announced in mid 2017 that it would stop mining personal e-mails in Gmail to inform its ad targeting. Clearly, few users would have given consent for this.[3] Google’s latest announcement has signaled to advertisers the importance of buying targeted advertising without personalization.
Although Google’s “non-personalized ads” may seem promising to advertisers and publishers who are concerned about GDPR liability, more work must be done before they can be considered safe.
Unique tracking IDs are currently vital to Google’s ability to perform frequency capping and bot detection.[4] Meanwhile, data leakage is a problem caused by 3rd party ad creatives liberally loading numerous tracking pixels. Google has been silent on fixing these problems. Therefore, it may be that Google will merely target ads with non-personal data, but will continue to perform tracking as usual. Clarity on this point will be important for advertisers seeking safe inventory.

Problems with Google’s approach to consent for personal data

Despite its new non-personalized ads, Google is also attempting to build a legal basis under the GDPR for its existing personal data advertising business. It has told publishers that it wants them to obtain their visitors’ consent to “the collection, sharing, and use of personal data for personalization of ads or other services”.[5]
Note that the purpose here is “personalization of ads or other services”. This appears to be a severe conflation of the many separate processing purposes involved in advertising personalization.[6] The addition of “other services” makes the conflation even more egregious. As we previously observed in our note on the approach proposed by IAB Europe, this appears to be a severe breach of Article 5, which requires that consent be requested in a granular manner for “specified, explicit” purposes.[7] As noted in a previous PageFair note, European regulators have explicitly warned against conflating purposes in this way:

“If the controller has conflated several purposes for processing and has not attempted to seek separate consent for each purpose, there is a lack of freedom. This granularity is closely related to the need of consent to be specific …. When data processing is done in pursuit of several purposes, the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[8] 

Controller-controller 

Google is asking publishers to obtain consent from their visitors for it to be an independent controller of those users’ personal data.[9] Confusingly, Google has called this a “controller-controller” policy. This evokes “joint-controllership”, a concept in the GDPR that would require both Google and the publisher to jointly determine the purposes and means of processing, and to be transparent with each other.[10] However, what Google proposes is not joint-controllership, but rather independent controllership for the publisher on the one hand, and for Google on the other. Google’s “controller-controller” terms to publishers define each party as

“an independent controller of Controller Personal Data under the Data Protection Legislation; [that] will individually determine the purposes and means of its processing of Controller Personal Data”.[11]

It is not clear why a publisher would choose to do this, since it would enable Google to leverage that publisher’s audience across the entire web (severe conflation of purposes notwithstanding). The head of Digital Content Next, a publisher trade body that represents Disney, New York Times, CBS, and so forth, has already announced that there is “no way in hell” Google will be “co-controller” across publishers’ sites.[12]

Further problems with Google’s new approach to consent

Even if publishers did accept that Google could be a controller of their visitors’ data for its own purposes, it is unlikely that many visitors would give their consent for this.[13]
If, however, both a publisher and a visitor were to agree to Google’s controller-controller proposal, two further problems arise. First, when a publisher shares third party personal data with Google, Google’s terms require that the publisher “must use commercially reasonable efforts to ensure the operator of the third party property complies with the above duties [of obtaining adequate consent]”.[14] This phrase “commercially reasonable efforts” is not a meaningful defence in the event that personal data are unlawfully processed.
As one expert from a European data protection authority retorted when I researched this point: “Imagine this as legal defence line: ‘We did not obtain consent because it wasn’t possible with commercially reasonable efforts’?” The Regulation is clear that “each controller or processor shall be held liable for the entire damage”, where more than one controller or processor are “involved in the same processing”.[15]
Second, Google’s policy puts what appears to be an impossible burden on the publisher. It requires that the publisher accurately inform the visitor about how their data will be used if they give consent.

“You must clearly identify each party that may collect, receive, or use end users’ personal data as a consequence of your use of a Google product. You must also provide end users with prominent and easily accessible information about that party’s use of end users’ personal data”.[16]

However, the publisher does not know what personal data Google shares with its own business partners. Nor does it know for what purposes these parties process data about its visitors. So long as this continues, a publisher cannot be in a position to inform its visitors of what will be done with their data. The result is very likely to be a breach of Article 6[17] and Article 13[18] of the GDPR.
Giving Google the benefit of the doubt, this may change before 25 May. Google plans to publish information about its “uses of information”, and says it is “asking other ad technology providers with which Google’s products integrate to make available information about their own uses of personal data”.[19] Publishers will not be well served by any further delay in the provision of this information.

Risks for Google 

Google’s decision to rely on non-personal data for ad targeting is highly significant, and will enable the company and advertisers that work with it to operate under the GDPR. However, Google’s new consent policy is fraught with issues that make it impossible for publishers to adopt. Our GDPR risk scale, first published for Google in August 2017, remains unchanged.


Perimeter is a robust regulatory firewall. It preemptively blocks unauthorized requests from 3rd parties, and tightly controls personal data on your website and app. It protects you, your advertising business, and your users. Perimeter makes sure that consent means something.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://pagefair.com/perimeter” info=”none” info_place=”top” info_trigger=”hover”]Learn more[/x_button]

Notes

[1] “Changes to our ad policies to comply with the GDPR”, Google Inside AdWords, 22 March 2018 (URL: https://adwords.googleblog.com/2018/03/changes-to-our-ad-policies-to-comply-with-the-GDPR.html).
[2] “How the GDPR will disrupt Google and Facebook”, PageFair Insider, August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[3] ibid.
[4] For alternative methods of performance measurement and reporting see “Frequency capping and ad campaign measurement under GDPR”, PageFair Insider, November 2017 (URL: https://pagefair.com/blog/2017/gdpr-measurement1/).
[5] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[6] See discussion of data processing purposes in online behavioural advertising, and the degree of granularity required in consent, in “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[7] The GDPR, Article 5, paragraph 1, b, and note reference to the principle of “purpose limitation”. See also Recital 43. For more on the purpose limitation principle see “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013.
[8] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[9] “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[10] See The GDPR, Article 26.
[11] Clause 4.1 of “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[12] Jason Kint, Twitter, 22 March 2018 (URL: https://twitter.com/jason_kint/status/976928024011726848)
[13] “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[14] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[15] The GDPR, Article 82, paragraph 4.
[16] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[17] The GDPR, Article 6, paragraph 1, a.
[18] The GDPR, Article 13, paragraph 2, f, and Recital 60.
[19] “Help with the EU user consent policy”, Google (URL: https://www.google.com/about/company/consenthelpstaging.html)

PageFair writes to all EU Member States about the ePrivacy Regulation

This week PageFair wrote to the permanent representatives of all Member States of the European Union in support of the proposed ePrivacy Regulation.
Our remarks were tightly bounded by our expertise in online advertising technology. We do not have an opinion on how the proposed Regulation will impact other areas.
The letter addresses four issues:

  1. PageFair supports the ePrivacy Regulation as a positive contribution to online advertising, provided a minor amendment is made to paragraph 1 of Article 8.
  2. We propose an amendment to Article 8 to allow privacy-by-design advertising. This is because the current drafting of Article 8 will prevent websites from displaying privacy-by-design advertising.
  3. We particularly support the Parliament’s 96th and 99th amendments. These are essential to enable standard Internet Protocol connections to be made in many useful contexts that do not impact privacy.
  4. We show that tracking is not necessary for the online advertising & media industry to thrive. As we note in the letter, behavioural online advertising currently accounts for only a quarter of European publishers’ gross revenue.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/03/PageFair-letter-on-ePrivacy-to-perm-reps-13-March-2018.pdf” info=”none” info_place=”top” info_trigger=”hover”]Read the letter [/x_button]

The digital economy requires a foundation of trust to enable innovation and growth. The enormous growth of adblocking (to 615 million active devices) across the globe proves the terrible cost of not regulating. We are witnessing the collapse of the mechanism by which audiences support the majority of online news reports, entertainment videos, cartoons, blogs, and cat videos that make the Web so valuable and interesting. Self-regulation, lax data protection, and weak enforcement have resulted in business practices that promise a bleak future for European digital publishers.
Therefore, we commend the Commission and Parliament’s work thus far, and wish the Council (of Ministers of the Member States) well in their deliberations.

PageFair's long letter to the Article 29 Working Party

This note discusses a letter that PageFair submitted to the Article 29 Working Party. The answers may shape the future of the adtech industry. 
Eventually the data protection authorities of Europe will gain a thorough understanding of the adtech industry, and enforce data protection upon it. This will change how the industry works. Until then, we are in a period of uncertainty. Industry cannot move forward, and business cannot flourish. Limbo does not serve the interests of publishers. Therefore we press for certainty.
This week PageFair wrote a letter to the Article 29 Working Party presenting insight on the inner workings of adtech, warts and all.
Our letter asked the working party to consider five questions. We suspect that the answers may shape the future of the adtech industry.

  1. We asked for further guidance about two issues that determine the granularity of consent required. First, we asked what the scope of a single “purpose” for processing personal data is. Since one must have a legal basis for each purpose, a clear understanding of scope of an individual purpose is important to determine the number of purposes, and thus the number of granular opt-ins required.
  2. The second question about granularity of consent asked whether multiple controllers that pursue identical purposes should be unbundled from each other. In other words, should consent be requested not only per purpose, but also per controller? This is important because it should not be assumed that a person trusts all data controllers equally. Nor is it likely that all controllers apply equal safeguards to personal data. Therefore, we asked whether it is appropriate to bundle multiple controllers together in a single consent request, without the opportunity to accept some but not others.
  3. We asked for guidance on how explicit consent operates for websites and apps, where a controller wishes to process special categories of personal data. Previously the Working Party cited the double opt-in as a method of explicit consent for e-mail marketing. We presented wireframes of how this might operate on web and mobile.
  4. We asked for clarification that all unique identifiers are personal data. This is important because the presence of a unique ID enables the combining of data about the person associated with that unique ID, even if the party that originally assigned the unique ID did so randomly, without any understanding of who the data subject is.
  5. We asked for guidance on how Article 13 of the GDPR applies to non-tracking cookies (without personal data) as opposed to personal data. This is important because some paragraphs of this article were intended to apply to personal data and are not appropriate for non-personal data.

In addition to these questions we made three statements.

  1. Websites, apps, and adtech vendors leak personal data to unknown parties in routine advertising operations (via “RTB” bid requests, cookie syncs, JavaScript ad units, mobile SDKs, and other 3rd party integrations). This is preventable.
  2. We noted our support for the Working Party’s view that the GDPR forbids the demanding of consent for 3rd party tracking that is unrelated to the provision of an online service.
  3. It is untenable for any publisher, adtech vendor, or trade body to claim that they must use personal data for online advertising. As we and others have shown, sophisticated adtech can work without personal data.

The full letter is available here.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

GDPR consent design: how granular must adtech opt-ins be?

This note examines the range of distinct adtech data processing purposes that will require opt-in under the GDPR.[1]
In late 2017 the Article 29 Working Party cautioned that “data subjects should be free to choose which purpose they accept, rather than having to consent to a bundle of processing purposes”.[2] Consent requests for multiple purposes should “allow users to give specific consent for specific purposes”.[3]  Rather than conflate several purposes for processing, Europe’s regulators caution that “the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[4] This draws upon GDPR, Recital 32.[5]
In short, consent requests must be granular, showing opt-ins for each distinct purpose.

How granular must consent opt-ins be?

In its 2013 opinion on “purpose limitation”, the Article 29 Working Party went some way toward defining the scope of a single purpose: a purpose must be “sufficiently defined to enable the implementation of any necessary data protection safeguards,” and must be “sufficiently unambiguous and clearly expressed.”[6]
The test is “If a purpose is sufficiently specific and clear, individuals will know what to expect: the way data are processed will be predictable.”[7] The objective is to prevent “unanticipated use of personal data by the controller or by third parties and in loss of data subject control [of these personal data]”.[8]
In short, a purpose must be specific, transparent and predictable.[9] It must be describable to the extent that the processing undertaken for it would not surprise the person who gave consent for it.
The process of showing an ad to a single person (in online behavioral advertising) involves the processing of personal data for several distinct purposes, by hundreds of different companies.
[Video: how personal data passes between companies in online behavioral advertising]
Therefore, a broad, all-encompassing sentence such as “to show you relevant advertising” does not make it possible for one to grasp how one’s data will be used by a large number of companies. It would not be possible to understand from this sentence, for example, that inferences about one’s characteristics would be drawn, or what types of consequences may result.
The following table shows an indicative list of ten purposes for which personal data are currently processed in the online behavioral advertising system. In practice, there may be more purposes at play. The table also generalizes the types of company involved in the selection and display of an ad.
[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/purposes.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download high resolution PDF [/x_button]
A spreadsheet version of this table is available here.
(Refer to footnote 10 for a discussion of the challenges presented by these purposes for all businesses involved.[10])

Pre-consent naming of each controller, and granular post-consent withdrawal of consent per controller

Recital 42 of the GDPR notes that “For consent to be informed, the data subject should be aware at least of the identity of the controller and the purposes of the processing”.[11] All controllers (including “joint controllers” that “jointly determine the purposes and means of processing”[12]) must be named.[13]
Each purpose must be very clear, and each opt-in requires a “clear affirmative action” that is both “specific”, and “unambiguous”.[14] There can be no pre-ticked boxes,[15] and “consent based on silence” is not permitted.[16]
Therefore, a consent request should be made with granular options for each of these purposes, and should name each controller that processes personal data for each of these purposes. For example:

Specific purpose 1 | controllers A, B, C | options: Accept / Refuse 
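
To make this concrete, the sketch below (in TypeScript) shows one way such a granular, per-purpose record could be represented. The field names are illustrative assumptions on our part, not a prescribed format.

interface PurposeConsent {
  purposeId: string;                 // e.g. "specific-purpose-1" (illustrative)
  purposeDescription: string;        // the "specified, explicit" purpose shown to the user
  controllers: string[];             // every named controller processing data for this purpose
  decision: "accepted" | "refused";  // recorded only after a clear affirmative action; no default
  decidedAt: string;                 // ISO 8601 timestamp of that action
}

// One entry per purpose; each entry can be accepted, refused, or later withdrawn independently.
type ConsentRecord = PurposeConsent[];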

There are two different scenarios for how consent for these purposes will be presented: the best case, and the more likely worst case.

The best scenario

At a minimum, then, assuming that all websites, SSPs, Ad Exchanges, DSPs, DMPs, and advertisers could align to pursue only these purposes, a consent request would include granular opt-in controls for a wide range of diverse purposes, the categories of processor involved in each, and a very long list of the controllers pursuing each.
The language and presentation of the request must be simple and clear, ideally the result of user testing.[17]
A consent request for a single purpose, on behalf of many controllers, might look like this.

Specific processing purpose consent, for multiple controllers,
with “next” button for multiple processing purpose opt-ins

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

What is presented when?

The Article 29 Working Party suggests that consent notices should have layers of information so that they do not overload viewers with information, but make necessary details easily available.[18] This is adopted in the design above using “View details”, “Learn about your data rights here”, and similar buttons and links.
When a user clicks “view details” to see the next layer of information about a controller

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

While some details, such as contact details for a company’s data protection officer, can be placed in a secondary layer, the primary layer must include “all basic details of the controller and the data processing activities envisaged”.[19]
Elements presented in this layer

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

The likely scenario

The scenario above assumes that all businesses in online behavioral advertising can agree to pursue tightly defined purposes without deviation. However, it is more likely that controllers will need granular opt-ins, because their purposes are unique.
Any individual controllers who intend to process data for their own unique purposes will need further granular opt-ins for these purposes. Since adtech companies tend to deviate from the common purposes outlined above, it is likely that most or all of them would ultimately require granular purpose consent for each controller.
However, even if all controllers pursued an identical set of purposes so that they could all receive consent via a single consent dialogue that contained a series of opt-ins, there would need to be a granular set of consent withdrawal controls that covered every single controller once consent had been given. The GDPR says that “the data subject may exercise his or her rights under this Regulation in respect of and against each of the controllers”.[20]

A higher bar: “explicit consent”

Processing of personal data in online behavioral advertising (for example, purposes 2, 3, 5, 8, and 10 in the table above) is highly likely to produce special categories of data by inference.[21] Where this occurs, these purposes require “explicit” consent.[22]
Special categories of data reveal “racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, … [and] data concerning health or data concerning a natural person’s sex life or sexual orientation”.[23] 
To make consent explicit requires more confirmation. For example, the Article 29 Working Party suggests that two-stage verification is a suitable means of obtaining explicit consent.[24] One possible approach to this is suggested in PageFair’s design below.
Suggested mechanism for “explicit consent” 

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

One can confirm one’s opt-in with a second movement of the finger, or of the cursor and a click. It is unlikely that a person could confirm using this interface unless it was their intention.
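
As a rough illustration only (the wireframes above show the actual design), such a two-step opt-in could be wired up along the following lines, with consent recorded only after a second, separate affirmative action.

// Hypothetical sketch: explicit consent is recorded only after a second,
// distinct confirmation action by the user.
function wireExplicitOptIn(
  optIn: HTMLButtonElement,
  confirm: HTMLButtonElement,
  onExplicitConsent: () => void
): void {
  confirm.disabled = true;            // the confirmation control starts disabled
  optIn.addEventListener("click", () => {
    confirm.disabled = false;         // the first action only enables the confirmation step
  });
  confirm.addEventListener("click", () => {
    onExplicitConsent();              // the second action records the explicit consent
  });
}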

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

Note that even this high bar, however, may not be permitted in some Member States. The GDPR gives European Member States the latitude to enact national legislation that prohibits consent as a legal basis for processing of special categories of data.[25] Therefore, it may not be legal to process any special categories of personal data in some EU Member States.

Conclusion 

Consent for website and app publishers is certainly an important objective, but the personal data it provides must only be processed after data leakage has been stopped. Data leakage (through RTB bid requests, cookie syncs, JavaScript ad units, and mobile SDKs) exposes publishers as the most obviously culpable parties that regulators and privacy NGOs can target. At the same time, it exposes their adtech vendors and advertisers to large fines and legal actions too.[26]
Websites, apps, and adtech vendors, should switch from using personal data to monetize direct and RTB advertising to “non-personal data”.[27] Using non-personal, rather than personal, data neutralizes the risks of the GDPR for advertisers, publishers, and adtech vendors. And it enables them to address the majority (80%-97%) of the audience that will not give consent for 3rd party tracking across the web.[28]
We recently revealed PageFair Perimeter, a regulatory firewall that blocks 3rd party data leakage, and enables publishers and adtech partners to use non-personal data for direct and RTB monetization when consent is absent (and to leverage personal data when adequate consent has been given). You can learn more about Perimeter here. Publishers using Perimeter do not need people’s personal data (nor the consent required to process it) to monetize websites and apps.

[x_button shape=”rounded” size=”regular” float=”none” href=”http://pagefair.com/perimeter/” info=”none” info_place=”top” info_trigger=”hover”]Learn about Perimeter[/x_button]

Postscript

A hiccup in the choreography of the European Commission’s legislative proposals means that non-tracking cookies will need storage consent, at least until the application of the forthcoming ePrivacy Regulation. These cookies, however, contain no personal data, and obtaining consent for their storage is significantly less burdensome than obtaining consent to process personal data for multiple purposes and multiple controllers. Update: 16 January 2018: See PageFair Insider note on storage consent for non-tracking cookies.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]


Notes:

[1] See our discussion of why consent is the appropriate legal basis for online behavioral advertising in “Why the GDPR ‘legitimate interest’ provision will not save you” , PageFair Insider, 13 March 2017 (URL: https://pagefair.com/blog/2017/gdpr-legitimate-interest/).
[2] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 11.
[3] ibid., p. 13.
[4] ibid., p. 11.
[5] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Recital 32. “…Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. …”
[6] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 12.
Curiously, the Spanish Data Protection Authority has issued guidance that contains a sentence suggesting that continuing to browse a website might constitute consent, which is at odds with the Article 29 Working Party guidance on consent and appears to be entirely at odds with the text of the Regulation. See “Guía del Reglamento General de Protección de Datos para responsables de tratamiento”, Agencia Española de Protección de Datos, November 2017, p. 6.
[7] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[8] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 12.
[9] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[10] None of these purposes would be permissible unless data leakage were first addressed. See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/). Furthermore,

  • Purpose 3 could not be permissible in any situation.
  • Purposes 2, 3, 5, 8, and 10 are highly likely to produce special categories of data by inference. See discussion of “explicit consent” in this note.
  • Regarding the purposes for which data have been sold, and to what category of customer, see “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp 39-40, and B3-B

[11] The GDPR, Recital 42.
[12] The GDPR, Article 26, paragraph 1.
[13] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[14] The GDPR, Article 4, paragraph 11.
[15] ibid., Recital 32.
[16] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 16.
[17] “Guidelines on transparency under Regulation 2016/679”, Article 29 Working Party, November 2017, pp 8, 13.
[18] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[19] ibid., p. 15.
[20] The GDPR, Article 26, paragraph 3.
[21] “Informing data subjects is particularly important in the case of inferences about sensitive preferences and characteristics. The controller should make the data subject aware that not only do they process (non-special category) personal data collected from the data subject or other sources but also that they derive from such data other (and special) categories of personal data relating to them.” See “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679”, Article 29 Working Party, 3 October 2017, p. 22.
[22] The GDPR, Article 9, paragraph 2, a.
[23] ibid., Article 9, paragraph 1.
[24] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 19.
[25] The GDPR, Article 9, paragraph 2, a.
[26] See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/).
[27] Non-personal data are any data that can not be related to an identifiable person. As Recital 26 of the GDPR observes, “the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable”. This recital reflects the finding of the European Court of Justice in 2016 that data are not personal “if the identification of the data subject was prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and manpower, so that the risk of identification appears in reality to be insignificant”. Judgment of the Court (Second Chamber) Patrick Breyer v Bundesrepublik Deutschland, Case C-582/14, 19 October 2016.
Non-tracking cookies, which contain no personal data, are useful for privacy-friendly advertising, and for other functions where an individual does not need to be identified such as A/B testing.
[28] See “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
The granularity of consent required for online behavioral advertising will make the consenting audience even smaller. Moreover, consent for adtech will not only be hard to get, it will also be easy to lose. Consent can be withdrawn with the same degree of ease as it was given, under The GDPR, Article 7, paragraph 3.
The Article 29 Working Party demonstrates what this means in practice: “When consent is obtained … through only one mouse-click, swipe, or keystroke, data subjects must … be able to withdraw that consent equally as easily”.
“Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 21.
The guidance also says that “Where consent is obtained through use of a service specific user interface (for example, via a website, an app, a log-on account, the interface of an IoT device or by e-mail), there is no doubt a data subject must be able to withdraw consent via the same electronic interface, as switching to another interface for the sole reason of withdrawing consent would require undue effort”.

The regulatory firewall for online media and adtech

This note announces Perimeter, a regulatory firewall to enable online advertising under the GDPR. It fixes data leakage from adtech and allows publishers to monetize RTB and direct ads, while respecting people’s data. 
PageFair takes a strict interpretation of the GDPR. To comply, all media owners need to protect their visitors’ personal data, or else find themselves liable for significant fines and court actions. In European Law, personal data includes not only personally identifiable information (PII), but also visitor IP addresses, unique IDs, and browsing history.[1] The problem is that today’s online ads operate by actively disseminating this kind of personal data to countless 3rd parties via header bidding, RTB bid requests, tracking pixels, cookie syncs, mobile SDKs, and javascript in ad creatives. This exposes everyone, from the publisher to the advertiser, to potential fines, litigation and brand damage.[2]
Perimeter fixes this. It enables publishers to securely protect and control the tracking activities of the entire advertising supply chain in their websites and apps, by strictly blocking all third parties unless both publisher and data subject have given their consent.

[x_button shape=”rounded” size=”large” float=”none” href=” https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter [/x_button]

Revenue with or without tracking and consent

Publishers using Perimeter do not need people’s personal data (nor the consent required to process it) to monetize websites and apps. This is critically important, because only a small minority of people online are expected to consent to third party tracking for online advertising.[3]
Even without personal data, Perimeter enables interoperation with GDPR-compliant ad tech vendors so that frequency capping, attribution, impression counting, click counting, view-through counting, conversion counting, and fraud mitigation, all work without personal data. The list of compliant adtech vendors that PageFair works with to do this is growing.
Perimeter will also re-enable audience targeting by using non-personal segments that can interoperate with consent-bearing DMPs.
When adequate consent is present, publishers, adtech vendors and advertisers can use personal data, and Perimeter will interoperate with other compliant consent management platforms. Indeed, Perimeter is a necessary partner to make consent meaningful.[4] Adtech vendors are eager for publishers to collect consent on behalf of 3rd parties – but publishers must simultaneously block all parties who do not have consent, or else remain exposed to liabilities.

Take Control of Data Leakage in all your Digital Properties

Perimeter brings privacy and data protection to the RTB/programmatic advertising ecosystem in the following ways:

  • Automatic removal of data leaking scripts from ads before they are rendered.
  • Prevention of unauthorized 3rd parties from accessing personal data.
  • Enforcement of data protection in RTB bid requests.
  • Enforcement of data protection in mobile SDKs.

[x_button shape=”rounded” size=”large” float=”none” href=” https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter [/x_button]

Four Components

Perimeter provides four components that protect website and app publishers, and all of their advertising partners.

  1. Server side ad rendering
    Controls data in bid requests and ad creatives
  2. Policy Manager
    Empowers publishers to decide what 3rd parties are permitted to run in their websites and apps.
  3. User Consent Manager
    Enables granular consent to be obtained, communicated, and withdrawn by users.
  4. Privacy-by-design adtech interoperation
    Perimeter is partnering with other adtech vendors who are innovating to be compatible with strict enforcement of privacy regulations. This re-enables all core campaign management, measurement and targeting features without depending on legally toxic tracking IDs or other personal data.

[x_alert heading=”Resource to check your adtech vendors’ compliance ” type=”success”]Here is a resource for publishers to check whether their adtech vendors are compliant.[/x_alert]

Ethical data

We built Perimeter to enable websites and apps to transition from the old adtech industry to a more ethical one. This is why we are openly sharing the measurement and capping techniques for non-personal data adtech.
Perimeter is the result of 24 months of intensive technical and policy research and development, and combines the feedback of many app developers, advertisers, adtech vendors, privacy NGOs, regulators, and lawmakers.
It enables publishers, and their advertising partners, to operate within a clean and ethical data/media industry.

[x_button shape=”rounded” size=”large” float=”none” href=” https://pagefair.com/perimeter/” title=”perimeterblogbutton” info=”none” info_place=”top” info_trigger=”hover”]Learn more about Perimeter [/x_button]

Are you an ad tech vendor?

Let us know if you would like to find out more about GDPR compliance requirements and getting whitelisted on the Perimeter partner program. You can read about non-personal data methods here.
 
[x_line]

Notes

[1] See the definition of personal data in Article 4, (1), Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
[2] ibid., Article 4, paragraph 2.
[3] See “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7 and “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017.
[4] Because without Perimeter’s mitigation of data leakage, consent has no value. See “Consent to use personal data has no value unless one prevents all data leakage”, October 2017, PageFair Insider (URL: https://pagefair.com/blog/2017/understanding-data-leakage/)

Overview of how the GDPR impacts websites and adtech (IAPP podcast)

In this podcast, the International Association of Privacy Professionals interviews PageFair’s Dr Johnny Ryan about the challenges and opportunities of new European privacy rules for website operators and brands. 
Update: 3 January 2018: This podcast was the International Association of Privacy Professionals’ most listened to podcast of 2017. 

The conversation begins at 4m 14s, and covers the following issues.

  • Risks for website operators
  • How “consent” is an opportunity for publishers to take the upper hand in online media
  • Brands’ exposure to legal risk, and the agency / brand / insurer conundrum
  • Personal data leakage in RTB / programmatic adtech
  • How the adtech industry should adapt

As we told Wired some months ago, it’s not just that websites might expose themselves to litigation, it’s that they might expose their advertisers to litigation too. But this can be fixed.
Click here to view PageFair’s repository of explainers, analysis, and official documents about the new privacy rules.
Elsewhere you can find details about PageFair’s GDPR solutions for website operators.
Note: the IAPP published this podcast this month. The interview was conducted several months ago. 
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Frequency capping and ad campaign measurement under GDPR

This note describes how ad campaigns can be measured and frequency capped without the use of personal data to comply with the GDPR. 
It is likely that most people will not give consent for their personal data to be used for ad targeting purposes by third parties (only a small minority [1] of people online are expected to consent to third party tracking for online advertising). Even so, sophisticated measurement and frequency capping are possible for this audience.
This note briefly outlines how to conduct essential measurement (frequency capping, impression counting, click counting, conversion counting, view through measurement, and viewability measurement) in compliance with the EU’s General Data Protection Regulation. This means that publishers and advertisers can continue to measure the delivery of the ads that sustain their businesses, while simultaneously respecting European citizens’ right to protection of their personal data.
Note that this discussion assumes that the final text of the EU’s ePrivacy Regulation will not incidentally outlaw non-tracking cookies (i.e., cookies that neither contain nor reveal any personal data, and therefore pose no privacy risks) [2].
Table: cleaner ad tech measurement [3]

 

Frequency capping, without personal data

Most of today’s ad servers implement frequency capping by using a server-side database to store the number of times an ad campaign has been shown to each user. Each user is tracked using a unique ID, which is stored both in the database and in a 3rd party cookie in the user’s browser [4]. Since this user ID could be used to track what websites the user is visiting, and could potentially be matched against other online trackers and offline data, this will be illegal under GDPR (unless the user has specifically consented to it).
A privacy-by-design alternative is to get rid of the user ID and move the counter directly into the cookie. The cookie it is stored in can have an expiry equal to the maximum amount of time the campaign should be capped for, and the name of the cookie can be set to the name of the ad campaign. Variations on this approach have been discussed for years – see Arvind Narayanan and Jonathan Mayer’s approach here. Since the counter does not contain any information specific to a particular user, it is not “personal data” under GDPR, and does not require consent.
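As a minimal browser-side sketch (in TypeScript, assuming campaign IDs are alphanumeric and the cap window is expressed in seconds), the counter-in-a-cookie approach might look like this:

// Read the current view count for a campaign from its cookie (0 if absent).
function viewCount(campaignId: string): number {
  const match = document.cookie.match(new RegExp("(?:^|; )" + campaignId + "=(\\d+)"));
  return match ? parseInt(match[1], 10) : 0;
}

// Record one more view. The cookie is named after the campaign, holds only a counter,
// and expires after the cap window, so it contains nothing specific to the user.
// (A production version might preserve the original expiry rather than refreshing it.)
function recordView(campaignId: string, capWindowSeconds: number): void {
  const views = viewCount(campaignId) + 1;
  document.cookie = campaignId + "=" + views + "; Max-Age=" + capWindowSeconds + "; Path=/";
}

// Has this browser reached the frequency cap for the campaign?
function isCapped(campaignId: string, maxViews: number): boolean {
  return viewCount(campaignId) >= maxViews;
}
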
Two inefficiencies of this approach, storage and bandwidth, are addressed below.
First, how much storage space will all those frequency capping cookies take up in the web browser? So long as the cookie expiry dates are reasonable, this data should be proportional to the number of ad campaigns delivered to a browser over a one- or two-week time window, and should not grow beyond that. Even if a user manages to view a hundred thousand different advertising campaigns in a two week period, that would still require no more than a few megabytes to store.
Second, how much bandwidth might be consumed by transmitting all these view counters along with every request to the ad server? This can be reduced by being efficient in the encoding of the cookie data. As shown in the inset below,  transmitting a frequency counter in the ad server cookie could take as little as 9 bytes of extra bandwidth, which means that thousands of counters could be transmitted without significantly impacting the weight of a modern web page.

Calculating potential bandwidth requirements
The name of the cookie might be used to store a campaign ID, efficiently encoded using all the characters available according to RFC 6265 (i.e., “A-Z”, “a-z”, “0-9” and “!#$%&’*+-.^_`|~”). That’s 77 characters, meaning that just four characters can be used to encode over 35 million (i.e. 77⁴) different IDs, which is easily sufficient for all ad campaigns that an ad server might be managing in a given period. Meanwhile, the value portion of the cookie needs only contain a counter, which can be 2 characters long (to store a value up to 99 in decimal, or about 6,000 in base-77 encoding). With this system in place, a browser request to an ad server might consist of the following HTTP request:

GET /ad HTTP/1.1
Host: acmeadserver.com
Cookie: 2iP&=21; Rz%x=13

The “Cookie:” field concatenates all cookies previously stored by the ad server. We can see that the bandwidth consumed by each frequency counter is only 9 bytes in total: 4 characters for the campaign ID, 2 characters for the counter value, and 3 characters for the equals sign, the semi-colon delimiter and the space. It would only take about 20 KB to transmit over two thousand frequency counters in an ad call.
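For illustration, a compact campaign-ID codec using that 77-character alphabet might look like the following sketch (the ordering of the alphabet is arbitrary; any fixed ordering works):

// The 77 cookie-name-safe characters permitted by RFC 6265.
const ALPHABET =
  "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789!#$%&'*+-.^_`|~";

// Encode a numeric campaign ID as a fixed-width base-77 string
// (width 4 covers 77^4, roughly 35 million, distinct IDs).
function encodeCampaignId(id: number, width = 4): string {
  let out = "";
  for (let i = 0; i < width; i++) {
    out = ALPHABET[id % 77] + out;
    id = Math.floor(id / 77);
  }
  return out;
}

// Decode a base-77 string back to the numeric campaign ID.
function decodeCampaignId(code: string): number {
  let id = 0;
  for (const ch of code) {
    id = id * 77 + ALPHABET.indexOf(ch);
  }
  return id;
}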

There is a further opportunity to optimize bandwidth, with the help of header bidding. Because header bidding sends multiple potential bids to the client, the client can do the work of deciding which ones have not reached a frequency cap and are therefore eligible for display. SSPs are currently moving from second-price auctions to first-price auctions to better support header bidding, and in time are likely to start returning multiple bids to header bidder wrappers, so that more bids can participate in the final auction. This will provide plenty of choice to a client-side frequency capping algorithm, and probably entirely eliminate the need for the frequency capping data to ever leave the browser.
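A sketch of that client-side step follows; the Bid shape is hypothetical, and the isCapped check is the kind of cookie-reading helper sketched earlier.

interface Bid {
  campaignId: string;  // hypothetical field names
  cpm: number;
}

// Keep only bids whose campaign has not hit its frequency cap, then pick the highest bid.
// Because the cap check reads local cookies, the capping data never leaves the browser.
function selectEligibleBid(
  bids: Bid[],
  isCapped: (campaignId: string) => boolean
): Bid | undefined {
  const eligible = bids.filter(b => !isCapped(b.campaignId));
  return eligible.length > 0
    ? eligible.reduce((best, b) => (b.cpm > best.cpm ? b : best))
    : undefined;
}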

Campaign Metrics

Campaign metrics allow advertisers and media owners to see how different advertising campaigns are performing, and to optimize campaigns if necessary. The typical metrics are impression counts, click counts, conversion counts, view-through measurement and viewability measurement.
Fortunately, none of these metrics concern individual people. Therefore, ad servers can avoid tracking information at an individual user level to ensure GDPR compliance. It is likely that many of today’s ad servers have not been so careful, and have implementations that currently depend on counting the same user ID that was used for frequency capping. These ad servers will need to consider providing alternative implementations when serving ads to EU users.
In the paragraphs below we review typical campaign metrics for likely GDPR compliance, and suggest alternative implementations where appropriate.

Impression Counting

Impression counting is normally implemented by incrementing a database counter pertaining to the ad campaign whenever a request is made to the ad server for that campaign, or when an impression pixel specific to that campaign is loaded from the ad server by the web browser. Basic impression counting should not pose a problem under GDPR, as no user-specific information is processed or stored [5].
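A minimal sketch of that campaign-level counter follows (the same pattern serves the click and conversion counts described below); an in-memory store is assumed for brevity.

// Counter keyed by campaign ID only; nothing about the user is read or stored.
const impressionCounts = new Map<string, number>();

// Called when the ad request or the campaign's impression pixel arrives.
function countImpression(campaignId: string): void {
  impressionCounts.set(campaignId, (impressionCounts.get(campaignId) ?? 0) + 1);
}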

Click Counting

Click measurement is normally performed by redirecting the browser to the ad server when the ad is clicked on, at which point it registers the click event, and then redirects the browser to the advertiser URL.
Like basic impression counting, click counting should not be problematic under GDPR, as all that is required is an overall counter of the number of times that an ad has been clicked on across all users, without the involvement of any user-specific information.
When the number of clicks is known, the click-through-rate is given by dividing the number of clicks by the number of impressions for any given campaign.

Conversion counting

Conversion counting is normally performed by obtaining a campaign-specific pixel from the ad server, and placing that pixel on a web page the user will be brought to when they complete a transaction with the advertiser (when they “convert”). When a user who has clicked on an ad “converts”, the conversion pixel will be loaded from the ad server, which will increase the conversion count for that campaign by one.
As with impression counting and click counting, there is no specific privacy concern here: only campaign-level data is used, and no user-specific information is processed.

View Through Measurement

Although click-through rates help an advertiser understand the immediate positive reaction of users who see their ad, many are also interested in the indirect response, e.g., how many people who saw the ad went on to buy the product during the subsequent weeks regardless of clicking on the ad?
It is possible that some ad servers currently perform view-through measurement using user-specific information. For example, the ad server might record that an ad was viewed by a particular user ID. When the conversion pixel loads on the advertiser’s post-conversion page, the ad server could look up the details of the last time that user ID viewed that campaign.
The above implementation would be incompatible with GDPR, as it involves tracking the behavior of unique users. Fortunately, there are alternative implementations that are equally effective.
The correct approach is to use a non-tracking cookie to store the fact that the user has viewed the ad campaign. This means that the cookie would contain the ID of the ad campaign, not an ID of the user (and consequently, every user who saw that campaign would have an identical cookie). This cookie should be set to expire automatically when a certain amount of time has passed, beyond which the advertiser is not interested in attributing the visit to the fact the ad was seen. In this system, when the user eventually converts for the advertiser, the cookie containing the campaign ID is transmitted to the advertiser’s ad server, which can then increment the relevant view-through counter.
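A sketch of the two server-side touch points, assuming the cookie lives on the advertiser-specific domain discussed below and using an illustrative “vt_” naming convention:

// 1. When the ad is served: set a cookie named after the campaign, not the user.
//    Every user who sees the campaign receives an identical cookie.
function viewThroughSetCookie(campaignId: string, attributionWindowDays: number): string {
  const maxAge = attributionWindowDays * 24 * 60 * 60;
  return `vt_${campaignId}=1; Max-Age=${maxAge}; Path=/; Secure; SameSite=None`;
}

// 2. When the conversion pixel is requested: read the campaign IDs out of the
//    Cookie header and increment the matching view-through counters.
function campaignsSeen(cookieHeader: string): string[] {
  return cookieHeader
    .split("; ")
    .filter(c => c.startsWith("vt_"))
    .map(c => c.split("=")[0].slice("vt_".length));
}
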
It is possible that this approach could lead to a lot of cookie data being transmitted and consuming bandwidth. To mitigate this, the view-through pixel should be homed on a domain unique to each advertiser, rather than every advertiser sharing the domain of their ad server.

Viewability Measurement

Viewability measurement allows advertisers to understand how many of the ads they pay to serve on a web site are likely to scroll into view and be displayed for long enough for a user to potentially notice them.
Viewability measures a characteristic of websites, not users, and can therefore be implemented in a GDPR compatible fashion. Although some viewability systems today might store per-user information, there is no fundamental need to do so. All that is required is to detect when a view event has occurred for each ad space, and to send that to the server to be counted. The server will then aggregate these events and provide an overall count of the number of times each ad space has been viewable by any user in each time period.
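For example, a browser-side sketch using the standard IntersectionObserver API, assuming an illustrative threshold of half the ad in view for one second, might look like this:

// Report a viewability event for an ad slot; only the ad space ID is sent.
function observeViewability(
  slot: HTMLElement,
  adSpaceId: string,
  report: (id: string) => void
): void {
  let timer: number | undefined;
  const observer = new IntersectionObserver(entries => {
    for (const entry of entries) {
      if (entry.intersectionRatio >= 0.5 && timer === undefined) {
        // At least half the slot is visible: start the one-second clock.
        timer = window.setTimeout(() => {
          report(adSpaceId);
          observer.disconnect();
        }, 1000);
      } else if (entry.intersectionRatio < 0.5 && timer !== undefined) {
        // Slot left view before the second elapsed: cancel.
        window.clearTimeout(timer);
        timer = undefined;
      }
    }
  }, { threshold: [0, 0.5] });
  observer.observe(slot);
}
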
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] See “Europe Online: an experience driven by advertising”, GFK, 2017, p. 7 and “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017.
[2] For an overview of the issues see a previous PageFair Insider note, “The Privacy Case for Non-Tracking Cookies: PageFair writes to the European Parliament”, PageFair Insider, 10 August 2017 (URL:  https://pagefair.com/blog/2017/non-tracking-cookies/).
[3] Clearly, a campaign that targeted a single individual without a legal basis for doing so would be illegal. It is important that campaigns target more than a small set of viewers.
[4] These data are automatically transmitted to the ad server along with every request.
[5] Ad servers may also support the counting of “unique impressions”, which means the number of unique users who saw the campaign. This mechanism generally relies on tagging each user with a unique identifier, and counting the number of unique identifiers. Therefore, while impression counting is practical, counting unique impressions may not be, because a unique identifier could be misused.

Adtech consent is meaningless unless one stops data leakage

Websites and advertisers cannot prevent personal data from leaking in programmatic advertising. If not fixed, this will render consent to use personal data meaningless.
The GDPR applies the principle of transparency:[1] People must be able to easily learn who has their personal data, and what they are doing with it.
Equally importantly, people must have surety that no other parties receive these data.
It follows that consent is meaningless without enforcement of data protection: unless a website prevents all data leakage, a visitor who gives consent cannot know where their data may end up.
But the online advertising system leaks data in two ways. This exposes brands, agencies, websites, and adtech companies to legal risk.
How data leakage happens 
If “programmatic” advertising or “real time bidding” was ever a mystery to you, take 43 seconds to watch this PageFair video. It shows the process in which an advertiser decides that a person visiting a website is the right kind of person to show an ad to (click full screen).


This system was not built for data protection. Instead, it was built to enable hundreds of businesses to trade personal data about the people visiting websites, to determine what ads to show them, and what advertisers should pay to show those ads.
The next video shows what happens to personal data in this system. It illustrates each step in the selection and delivery of a single ad. (33 seconds)


The yellow arrows in this video show with whom ad exchanges and other advertising technology services share data about the website visitor.
These data include the ad exchange’s own identifier on the user, the URL the user is visiting, the user’s IP address, and the details of the user’s browser and system.
Hundreds of parties receive these data in the milliseconds before an ad is shown.
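For illustration, a heavily simplified bid request carrying the data just listed might look like the sketch below. This is not the full OpenRTB schema, and all values are invented.

// Simplified, illustrative bid request: the exchange's own user ID, the page URL,
// the IP address, and the browser/system details are all personal data under the GDPR.
const bidRequest = {
  id: "auction-7f3a",                                   // auction ID (invented)
  user: { id: "d1b2c3e4-0000-4aaa-9bbb-123456789abc" }, // exchange's ID for this person (invented)
  site: { page: "https://example.com/some-article" },   // the URL the person is reading
  device: {
    ip: "203.0.113.7",                                  // IP address (documentation range)
    ua: "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",    // browser and system details
  },
};
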
To complicate matters, some websites work with more than one ad exchange, conducting a mega auction known as “header bidding” to solicit more bids for their ad units. The following video is 68 seconds long, and shows how this works.


Conclusion: there are two problems 
First, personal data about a website visitor are shared with hundreds of parties every time the website requests an ad through one or more ad exchanges. There is nothing to prevent these hundreds of parties from leaking these data to anyone else. This must be controlled.
Second, the advertisement that the website visitor is shown, once the bidding process concludes, often contains JavaScript. This code can then summon trackers (or worse). This, also, must be controlled.
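For illustration, this is roughly all it takes for a 3rd party creative to summon an additional tracker once rendered (the domain is invented); nothing in standard ad serving prevents it.

// Illustrative only: a script inside an ad creative appends a 1x1 tracking pixel
// pointing at an arbitrary 3rd party domain, which then receives the user's IP address,
// its own cookies, and the referring page.
const pixel = document.createElement("img");
pixel.src = "https://tracker.example/collect?cb=" + Date.now();
pixel.width = 1;
pixel.height = 1;
document.body.appendChild(pixel);
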
The forthcoming OpenRTB 3.0 specification contains measures to limit data leakage,[2] but not to stop it entirely. Unless publishers can fully enforce data protection on their sites, consent is meaningless. Moreover, so long as data can leak, all parties involved, publishers and their partners alike, are exposed to legal hazard.[3]
PageFair has been developing a solution to this problem.
Perimeter is PageFair’s regulatory firewall for online media and adtech: feature-rich adtech even without personal data, control over user data and third parties in websites and apps, and robust consent. Learn more at https://pagefair.com/perimeter.

Notes

[1] The GDPR, Article 5, paragraph 1 (a), Article 12, and Recitals 39 and 60.
[2] Websites can prevent specific companies from placing bids to buy ad space on their pages. Websites can also reject JavaScript ads from unauthorised domains. The specification also contains scope for “whitelisted/blacklisted JS trackers”. Open RTB 3.0 draft specification, IAB TechLab, September 2017, pp 14, 27, and AdCom draft specification, IAB TechLab, September 2017, p. 14.
[3] The GDPR, Article 82, paras. 1, 3 – 4, Recital 146. After judgement, processors or controllers who have paid full compensation can claim back part of it from the other processors or controllers also responsible (ibid., Article 82, para. 5).

The Privacy Case for Non-Tracking Cookies: PageFair writes to the European Parliament

In the last month, we have written to the MEPs leading the Parliament’s work on the ePrivacy Regulation (the “rapporteurs”) to propose an amendment. Here is a copy of the letter.
PageFair supports the proposed ePrivacy Regulation, in so far as it will change online behavioural advertising. This is an unusual position for an ad tech company, and we have described why we have taken it in a previous note. We agree with the restriction on the use of tracking cookies in Article 8 of the Commission’s proposal for an ePrivacy Regulation, and in the draft report of the Parliament’s rapporteur.
However, non-tracking cookies should not be treated the same way as tracking cookies. While tracking cookies pose a severe risk to data protection (Article 8 of the EU Charter of Fundamental Rights) and privacy of communications (Article 7 of the EU Charter of Fundamental Rights), non-tracking cookies do not.
The Regulation should be amended to allow for non-tracking cookies. One way to achieve this is to add a point to Article 8, paragraph 1, to permit the use of terminal equipment storage and processing if no personal data are processed.
It is important to permit non-tracking cookies that pose no risk to privacy or to the confidentiality of personal communications for two reasons.

  1. Incentivising the use of non-tracking cookies will help industry to adopt privacy by design.
    Non-tracking cookies do not contain, or directly or indirectly reveal, metadata, content of communications, or personal data. Nor do they enable the identification of an individual person. At the same time, non-tracking cookies are a useful technical means for industry to take privacy-friendly approaches, and they support innovation.
  2. The current text’s prohibition of non-tracking cookies will disadvantage European businesses and web users. 
    Non-tracking cookies are an important means of enabling both essential and nonessential functions of websites. Websites often use non-tracking cookies to provide functionality that is useful to visitors, whether or not the functionality would be deemed strictly necessary, and irrespective of whether it was explicitly requested by a user. Non-tracking cookies often offer the most secure and robust method of enabling these functions. European companies should not have to revert to outmoded techniques such as passing data via long parameters appended to every URL, which was typical of the earliest “CGI” web applications and was afflicted with reliability and security issues.

To illustrate this point, consider several examples of non-tracking cookies that would be prohibited under the current text. These examples show how important non-tracking cookies are to the functioning of websites and services, and demonstrate their compatibility with the right to respect for private life and communications and the right to the protection of personal data.

Examples of non-tracking cookies. 

Example 1: A website that changes its appearance periodically 

An artist’s website is designed so that it changes its background colour every three days for one month after a visitor first discovers it. To do this the website sets a non-tracking cookie that records nothing except the date of that first visit, and that expires one month later. The website refers to this date, which it finds in the non-tracking cookie, to determine which three-day colour rotation to show the visitor.
This is what the non-tracking cookie looks like (the cookie name “first_visit” is illustrative): Set-Cookie: first_visit=2017-05-19; path=/; expires=Mon, 19 Jun 2017 04:28:00 GMT. This non-tracking cookie has no value as a tracking tool, and makes no impact on the user’s privacy or on the confidentiality of their communications.
In this example the non-tracking cookie is providing an important function for the artist’s website – whether or not the user finds it strictly necessary. This functionality is merely an experiment on the part of the site’s owner, but it may become a useful innovation that differentiates the website, or spurs some unforeseen innovation.
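For illustration, the sketch below shows one way a server might set and read such a cookie using Python’s standard library. The cookie name first_visit, the one-month lifetime, and the function names are assumptions made for this example, not part of any real site.

    # A minimal sketch of Example 1, using only Python's standard library.
    # "first_visit" and the one-month window are illustrative assumptions.
    from datetime import date
    from http.cookies import SimpleCookie

    def colour_rotation(cookie_header: str, today: date) -> int:
        """Return which three-day colour rotation to show (0, 1, 2, ...)."""
        cookies = SimpleCookie(cookie_header)
        if "first_visit" in cookies:
            first_visit = date.fromisoformat(cookies["first_visit"].value)
        else:
            first_visit = today  # first ever visit: the rotation starts now
        return (today - first_visit).days // 3

    def first_visit_set_cookie(today: date) -> str:
        """Build the Set-Cookie header recording the date of the first visit."""
        cookie = SimpleCookie()
        cookie["first_visit"] = today.isoformat()
        cookie["first_visit"]["path"] = "/"
        cookie["first_visit"]["max-age"] = 30 * 24 * 3600  # the cookie expires after one month
        return cookie.output(header="").strip()

    print(colour_rotation("first_visit=2017-05-19", date(2017, 5, 26)))  # -> 2

Because the cookie stores only a date shared by any visitor who arrived that day, it cannot distinguish one visitor from another.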

Example 2: Currency localisation widget 

A payments company provides an online widget with which visitors to international shopping sites can see prices in their local currencies. For example, a visitor to a US site whose browser appears to be visiting from Denmark also sees the price in Danish Krone. This service is not essential, and the user has not requested it. But it is useful, and the website publishers on whose sites the widget appears hope that it will improve their sales.
The widget designers use a non-tracking cookie, which means they can avoid alternative methods that involve storing unique identifiers and personal data. This is a privacy-by-design approach. The non-tracking cookie contains only the letters “DK”: Set-Cookie: currency=DK; path=/. Note that the non-tracking cookie can be overwritten later if the user chooses. This is a third party non-tracking cookie (the payments company provides the widget to publishers, who embed it on their websites).
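A minimal sketch of how the widget’s backend might read this cookie is shown below; the hard-coded exchange rate and the function name localised_price are assumptions for illustration only.

    # A minimal sketch of Example 2: reading the "currency" cookie.
    # The exchange rate and function name are illustrative assumptions.
    from http.cookies import SimpleCookie

    USD_TO_DKK = 6.9  # illustrative, hard-coded exchange rate

    def localised_price(cookie_header: str, usd_price: float) -> str:
        cookies = SimpleCookie(cookie_header)
        if "currency" in cookies and cookies["currency"].value == "DK":
            return "{:.2f} DKK".format(usd_price * USD_TO_DKK)
        return "{:.2f} USD".format(usd_price)

    print(localised_price("currency=DK", 10.0))  # -> "69.00 DKK"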

Example 3: Adventure game 

A free-to-play game on the web is modelled after a popular “choose your own adventure” novel. In this game the user reads a passage of text, makes a choice, rolls a die, and is then taken to the next part of the story, determined by the choice they made and the result of the roll.
To do this the game must store the user’s progress: a record of the game sections already completed, the player’s health, and their current situation in the game. The designer could do this in several ways, but using a non-tracking cookie is by far the best.
In this example a non-tracking cookie is not strictly technically necessary to make the game work, but it is an important part of making the game easy to play. One alternative to a non-tracking cookie would have been to require the user to log in and set up an account, after which information about the user’s situation in the game could be recorded on the game’s server. However, the user does not want to log in each time to play the game, and the game designer does not want to have to force the user to set up an account.
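A sketch of how the game might keep its state in such a cookie follows; the cookie name game_state and the compact value layout (completed sections, health, current section) are assumptions made for illustration.

    # A minimal sketch of Example 3: game progress kept entirely in a non-tracking cookie.
    # The cookie name and value layout ("1-2-3:7:12") are illustrative assumptions.
    from http.cookies import SimpleCookie

    def save_state(completed, health, section) -> str:
        """Build a Set-Cookie header encoding the player's progress."""
        value = "{}:{}:{}".format("-".join(str(s) for s in completed), health, section)
        cookie = SimpleCookie()
        cookie["game_state"] = value
        cookie["game_state"]["path"] = "/adventure/"
        return cookie.output(header="").strip()

    def load_state(cookie_header: str):
        """Return (completed_sections, health, current_section)."""
        cookies = SimpleCookie(cookie_header)
        if "game_state" not in cookies:
            return [], 10, 1  # a brand new game
        completed, health, section = cookies["game_state"].value.split(":")
        return [int(s) for s in completed.split("-") if s], int(health), int(section)

    print(save_state([1, 2, 3], 7, 12))  # -> game_state=1-2-3:7:12; Path=/adventure/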

Example 4: A/B testing 

A newspaper wants to improve its website in order to increase its number of paying subscribers. It uses “A/B testing”, a popular design method in which visitors are assigned to one of two test groups, called “A” and “B”. Users in group A are shown the original version of the website, and users in group B are shown a version with potential improvements. The experiment gives the newspaper statistical evidence of the effects of the potential improvements on subscriptions.
To do this, the newspaper must use a non-tracking cookie to store the test group to which each visitor is randomly assigned: Set-Cookie: letter=A; path=/. The non-tracking cookie contains only the letter “A” or the letter “B”. Several thousand visitors are in group A, and several thousand are in group B.
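A sketch of the server-side assignment might look like the following; the 50/50 split and the helper name assign_group are assumptions for illustration.

    # A minimal sketch of Example 4: assigning a visitor to test group A or B.
    # The cookie name "letter" mirrors the header shown above.
    import random
    from http.cookies import SimpleCookie

    def assign_group(cookie_header: str):
        """Return (group, set_cookie_header). The header is None if no cookie needs setting."""
        cookies = SimpleCookie(cookie_header)
        if "letter" in cookies and cookies["letter"].value in ("A", "B"):
            return cookies["letter"].value, None  # returning visitor keeps their group
        group = random.choice(["A", "B"])
        cookie = SimpleCookie()
        cookie["letter"] = group
        cookie["letter"]["path"] = "/"
        return group, cookie.output(header="").strip()

    print(assign_group(""))          # e.g. ('B', 'letter=B; Path=/')
    print(assign_group("letter=A"))  # ('A', None)

Because the cookie holds only a single letter shared by thousands of visitors, it cannot single anyone out.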

Example 5: A/B test 2 

Similar A/B testing to that described in the previous example can be conducted by third parties that provide embedded functionality for newspapers: for example, a newspaper contains a daily crossword provided by an external company. The crossword is displayed on the newspaper’s website in an iframe. The crossword company is considering changing its default typeface, and wants to make sure that the choice it makes doesn’t prompt users to abandon the crossword due to decreased legibility. It would conduct A/B testing using non-tracking cookies in the same way that the newspaper did in Example 4.

Example 6: Frequency capping 

People dislike seeing the same ad repeatedly on different websites. This also wastes advertisers’ budgets. Ad tech companies typically prevent repeated advertising by using tracking cookies, and recording the number of times that a person has been shown an ad in a database next to their unique tracking ID. However, in this example an advertiser wants to prepare for the GDPR by abandoning unique tracking IDs. Instead it will use a short-lived non-tracking cookie that contains the number of times that the ad has been displayed.
The first time that the ad is shown, a non-tracking cookie is set containing the value “1”: Set-Cookie: count=1; path=/. Each subsequent time the ad is shown, the value of the non-tracking cookie is increased. When the value reaches the frequency cap (10, for example), the ad server will no longer return that ad, and will instead display an ad from a different advertiser. When the non-tracking cookie reaches its maximum age (two weeks, for example), it expires, and the ad once again becomes eligible to be displayed.
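The counting logic can be expressed in a few lines. The sketch below assumes the cap of 10 and the two-week lifetime given above; the function name choose_ad and the advertiser names are illustrative.

    # A minimal sketch of Example 6: frequency capping without a unique tracking ID.
    from http.cookies import SimpleCookie

    FREQUENCY_CAP = 10
    TWO_WEEKS = 14 * 24 * 3600  # maximum cookie age, in seconds

    def choose_ad(cookie_header: str):
        """Return (advertiser_to_serve, set_cookie_header)."""
        cookies = SimpleCookie(cookie_header)
        count = int(cookies["count"].value) if "count" in cookies else 0
        if count >= FREQUENCY_CAP:
            return "another-advertiser", None  # cap reached: rotate to a different campaign
        cookie = SimpleCookie()
        cookie["count"] = str(count + 1)
        cookie["count"]["path"] = "/"
        cookie["count"]["max-age"] = TWO_WEEKS  # the count disappears after two weeks
        return "capped-advertiser", cookie.output(header="").strip()

    print(choose_ad("count=3"))
    # -> ('capped-advertiser', 'count=4; Max-Age=1209600; Path=/')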

Example 7: Personalised stock page 

A financial website is visited many times a day by stock traders seeking the latest information on particular stocks. The operators of the website decide to automatically show each visitor, on the front page of the site, the latest prices for the five stock tickers that visitor most recently searched for. This means that frequent visitors will not need to find and reload each separate stock every time they visit the site.
To do this without storing personal data or requiring a login, the website stores a list of the five most recently searched stock tickers in a non-tracking cookie: Set-Cookie: stocks=NVS,BUD,HSBC,UN,UL; path=/.
Whenever the web page first loads, the current prices of these stocks will be automatically displayed beside the core stock ticker search functionality of the website.
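The sketch below shows one way the site might maintain that list when a visitor searches for a new ticker; the function name record_search is an assumption for illustration.

    # A minimal sketch of Example 7: keeping the five most recently searched tickers.
    def record_search(existing_value: str, ticker: str) -> str:
        """Return a Set-Cookie header with the newly searched ticker moved to the front."""
        recent = [t for t in existing_value.split(",") if t]
        recent = [ticker] + [t for t in recent if t != ticker]  # de-duplicate, most recent first
        return "Set-Cookie: stocks={}; path=/".format(",".join(recent[:5]))

    print(record_search("NVS,BUD,HSBC,UN,UL", "AZN"))
    # -> Set-Cookie: stocks=AZN,NVS,BUD,HSBC,UN; path=/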

Here is what GDPR consent dialogues could look like. Will people click yes?

THIS NOTE HAS NOW BEEN SUPERSEDED BY A MORE RECENT PAGEFAIR INSIDER NOTE ON GDPR CONSENT DIALOGUES. PLEASE REFER TO THE NEW NOTE.
This note presents sketches of GDPR consent dialogues, and invites readers to participate in research on whether people will consent. 
Note: the dialogue presented in this note is only a limited consent notice. It asks to track behaviour on one site only, and for one brand only, in addition to “analytics partners”. This notice would not satisfy regulators if it were used to cover the vast chain of controllers and processors involved in conventional behavioural targeting.

Consent requests

In less than a year the General Data Protection Regulation (GDPR) will force businesses to ask Internet users for consent before they can use their personal data. Many businesses lack a direct channel to users to do this. Therefore, it is likely that they will have to ask publishers to seek consent on their behalf.
This is a sketch of what a GDPR consent request by a publisher on behalf of a third party may look like, with references to the elements required in the GDPR.

Information that data subjects must be given in GDPR-compliant consent requests
Businesses will have to provide the following information to internet users when seeking their consent (a sketch of how this information might be structured follows the list).

  • Who is collecting the data, and how to contact them or their European representative.
  • What the personal information is being used for, and the legal basis of the data processing.
  • The “legitimate interest” pursued by the user of the data (this refers to a legal basis that may be relied upon by direct marketing companies).
  • With whom the data will be shared.
  • Whether the controller intends to transfer data to a third country and, if so, whether the European Commission has deemed that country’s protections adequate, or what alternative safeguards or rules are in place.
  • The duration of storage, or the criteria used to determine duration.
  • That the user has the right to request rectification of mistakes in this personal information.
  • That the user has the right to withdraw consent.
  • How the user can lodge a complaint with the supervisory authority.
  • What the consequences of not giving consent might be.
  • In cases of automated decision-making, including profiling, what the logic of this process is, and what the significance of the outcomes may be.

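As a sketch only, the required disclosures above could be captured in a structured record that a publisher keeps alongside each consent request. The field names and values below are illustrative assumptions, not a prescribed or standard format.

    # A hypothetical structure for recording the information disclosed when consent was sought.
    # Field names and values are illustrative assumptions only.
    consent_request_record = {
        "controller": {"name": "Example Publisher Ltd", "contact": "privacy@example.com"},
        "purposes": ["ad personalisation on this site"],      # specified, explicit purposes
        "legal_basis": "consent",
        "legitimate_interests": None,                         # only if that basis is relied upon
        "recipients": ["Example Brand", "analytics partners"],
        "third_country_transfers": {"intended": False, "safeguards": None},
        "storage_duration": "13 months",
        "rights": ["rectification", "withdraw consent", "complain to supervisory authority"],
        "consequences_of_refusal": "no personalised ads are shown",
        "automated_decision_making": {"profiling": True, "logic": "described in privacy notice"},
    }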
What percentage of people are likely to click “OK”?

Tracking preferences

In addition to the consent requirements of the GDPR, the forthcoming ePrivacy Regulation requires that users be presented with a menu of tracking preferences when they first install a browser or set up a new system that connects to the Internet. See a sketch of this menu below.

The menu above is as it might have appeared under the original proposal from the European Commission, in January 2017. However, the European Parliament is developing amendments to the Commission’s proposal. Below is a sketch of the menu as it might appear under the latest text from June 2017.

Notice that “accept only first party tracking” is pre-selected. This is because Recital 23 in the current draft stipulates that the default setting should prevent “cross-domain tracking” by third-parties. Click here to see an animated version of these menu designs.
This menu may change again as the Regulation is further developed. But assuming that some version of this tracking preferences menu becomes law across the European Union, how many people can be expected to opt back into tracking for online advertising?
We would like to find out, and reveal the answer.

PageFair Research

We are surveying a sample of industry insiders for their insights on this question; your input may help illuminate the issue. Please click the button below to take the survey, which we have designed to take 70 seconds to complete.
Update: Click here to see the results of this survey.
Thank you for your input.