Posts

Google adopts non-personal ad targeting for the GDPR

This note examines Google’s recent announcement on the GDPR. Google has sensibly adopted non-personal ad targeting. This is a very significant step forward and signals a change in the online advertising market. But Google has also taken a new and problematic approach to consent for personal data use in advertising that publishers will find hard to accept.

Google decides to use non-personal ad targeting to comply with the GDPR 

Last Thursday Google sent a policy update to business partners across the Internet announcing that it would launch an advertising service based on non-personal data in order to comply with the GDPR.[1]
PageFair has advocated a non-personal approach to advertising for some time, and commends Google for taking this position. As we noted six months ago,[2] Google AdWords, for example, can operate without consent if it discards personalized targeting features (and unique IDs). In this case, advertisers can continue to target advertisements to people based on what they search for.
This may be part of a trend for Google, which announced in mid 2017 that it would stop mining personal e-mails in Gmail to inform its ad targeting. Clearly, few users would have given consent for this.[3] Google’s latest announcement has signaled to advertisers the importance of buying targeted advertising without personalization.
Although Google’s “non-personalized ads” may seem promising to advertisers and publishers who are concerned about GDPR liability, more work must be done before they can be considered safe.
Unique tracking IDs are currently vital to Google’s ability to perform frequency capping and bot detection.[4] Meanwhile, data leakage is a problem caused by 3rd party ad creatives liberally loading numerous tracking pixels. Google has been silent on fixing these problems. Therefore, it may be that Google will merely target ads with non-personal data, but will continue to perform tracking as usual. Clarity on this point will be important for advertisers seeking safe inventory.
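Footnote 4 points to alternative approaches to measurement. Purely to illustrate one such alternative (this is a hypothetical sketch, not a description of Google’s systems, and all names in it are invented), a frequency cap can be enforced on the device itself, so that no unique identifier is ever sent to a server. Note that even this kind of local storage may itself require storage consent under the ePrivacy rules.

// Hypothetical sketch: cap how often one campaign's ads are shown in this
// browser using only a local counter. No identifier leaves the device.
interface LocalCapState {
  count: number;        // impressions shown in the current capping window
  windowStart: number;  // epoch milliseconds when the window began
}

function shouldShowAd(campaignId: string, maxImpressions: number, windowMs: number): boolean {
  const key = `freq_cap_${campaignId}`;
  const now = Date.now();
  const raw = window.localStorage.getItem(key);
  let state: LocalCapState = raw ? JSON.parse(raw) : { count: 0, windowStart: now };

  if (now - state.windowStart > windowMs) {
    state = { count: 0, windowStart: now };  // capping window elapsed: reset
  }
  if (state.count >= maxImpressions) {
    return false;  // cap reached for this campaign on this device
  }
  state.count += 1;
  window.localStorage.setItem(key, JSON.stringify(state));
  return true;
}

// e.g. show the (invented) "spring_sale" campaign at most 3 times per day
shouldShowAd("spring_sale", 3, 24 * 60 * 60 * 1000);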

Problems with Google’s approach to consent for personal data

Despite its new non-personalized ads, Google is also attempting to build a legal basis under the GDPR for its existing personal data advertising business. It has told publishers that it wants them to obtain their visitors’ consent to “the collection, sharing, and use of personal data for personalization of ads or other services”.[5]
Note that the purpose here is “personalization of ads or other services”. This appears to be a severe conflation of the many separate processing purposes involved in advertising personalization.[6] The addition of “other services” makes the conflation even more egregious. As we previously observed in our note on the approach proposed by IAB Europe, this appears to be a severe breach of Article 5, which requires that consent be requested in a granular manner for “specified, explicit” purposes.[7] As noted in a previous PageFair note, European regulators have explicitly warned against conflating purposes in this way:

“If the controller has conflated several purposes for processing and has not attempted to seek separate consent for each purpose, there is a lack of freedom. This granularity is closely related to the need of consent to be specific …. When data processing is done in pursuit of several purposes, the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[8] 

Controller-controller 

Google is asking publishers to obtain consent from their visitors for it to be an independent controller of those users’ personal data.[9] Confusingly, Google has called this a “controller-controller” policy. This evokes “joint-controllership”, a concept in the GDPR that would require both Google and the publisher to jointly determine the purposes and means of processing, and to be transparent with each other.[10] However, what Google proposes is not joint-controllership, but rather independent controllership for the publisher on the one hand, and for Google on the other. Google’s “controller-controller” terms to publishers define each party as

“an independent controller of Controller Personal Data under the Data Protection Legislation; [that] will individually determine the purposes and means of its processing of Controller Personal Data”.[11]

It is not clear why a publisher would choose to do this, since it would enable Google to leverage that publisher’s audience across the entire web (severe conflation of purposes notwithstanding). The head of Digital Content Next, a publisher trade body that represents Disney, the New York Times, CBS, and so forth, has already announced that there is “no way in hell” Google will be “co-controller” across publishers’ sites.[12]

Further problems with Google’s new approach to consent

Even if publishers did accept that Google could be a controller of their visitors’ data for its own purposes, it is unlikely that many visitors would give their consent for this.[13]
If, however, both a publisher and a visitor were to agree to Google’s controller-controller proposal, two further problems arise. First, when a publisher shares third party personal data with Google, Google’s terms require that the publisher “must use commercially reasonable efforts to ensure the operator of the third party property complies with the above duties [of obtaining adequate consent]”.[14] This phrase “commercially reasonable efforts” is not a meaningful defence in the event that personal data are unlawfully processed.
As one expert from a European data protection authority retorted when I researched this point: “Imagine this as legal defence line: ‘We did not obtain consent because it wasn’t possible with commercially reasonable efforts’?” The Regulation is clear that “each controller or processor shall be held liable for the entire damage”, where more than one controller or processor are “involved in the same processing”.[15]
Second, Google’s policy puts what appears to be an impossible burden on the publisher. It requires that the publisher accurately inform the visitor about how their data will be used if they give consent.

“You must clearly identify each party that may collect, receive, or use end users’ personal data as a consequence of your use of a Google product. You must also provide end users with prominent and easily accessible information about that party’s use of end users’ personal data”.[16]

However, the publisher does not know what personal data Google shares with its own business partners. Nor does it know for what purposes these parties process data about its visitors. So long as this continues, a publisher cannot be in a position to inform its visitors of what will be done with their data. The result is very likely to be a breach of Article 6[17] and Article 13[18] of the GDPR.
Giving Google the benefit of the doubt, this may change before 25 May. Google plans to publish some information about its own “uses of information”, and says it is “asking other ad technology providers with which Google’s products integrate to make available information about their own uses of personal data”.[19] Publishers will not be well served by any further delay in the provision of this information.

Risks for Google 

Google’s decision to rely on non-personal data for ad targeting is highly significant, and will enable the company and advertisers that work with it to operate under the GDPR. However, Google’s new consent policy is fraught with issues that make it impossible for publishers to adopt. Our GDPR risk scale, first published for Google in August 2017, remains unchanged.


Perimeter is a robust regulatory firewall. It preemptively blocks unauthorized requests from 3rd parties, and tightly controls personal data on your website and app. It protects you, your advertising business, and your users. Perimeter makes sure that consent means something.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://pagefair.com/perimeter” info=”none” info_place=”top” info_trigger=”hover”]Learn more[/x_button]

Notes

[1] “Changes to our ad policies to comply with the GDPR”, Google Inside AdWords, 22 March 2018 (URL: https://adwords.googleblog.com/2018/03/changes-to-our-ad-policies-to-comply-with-the-GDPR.html).
[2] “How the GDPR will disrupt Google and Facebook”, PageFair Insider, August 2017 (URL: https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/).
[3] ibid.
[4] For alternative methods of performance measurement and reporting see “Frequency capping and ad campaign measurement under GDPR”, PageFair Insider, November 2017 (URL: https://pagefair.com/blog/2017/gdpr-measurement1/).
[5] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[6] See discussion of data processing purposes in online behavioural advertising, and the degree of granularity required in consent, in “GDPR consent design: how granular must adtech opt-ins be?”, PageFair Insider, January 2018 (URL: https://pagefair.com/blog/2018/granular-gdpr-consent/).
[7] The GDPR, Article 5, paragraph 1, b, and note reference to the principle of “purpose limitation”. See also Recital 43. For more on the purpose limitation principle see “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013.
[8] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[9] “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[10] See The GDPR, Article 26.
[11] Clause 4.1 of “Google Ads Controller-Controller Data Protection Terms, Version 1.1”, Google, 12 October 2017 (URL: https://privacy.google.com/businesses/controllerterms/).
[12] Jason Kint, Twitter, 22 March 2018 (URL: https://twitter.com/jason_kint/status/976928024011726848)
[13] “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, 12 September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
[14] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[15] The GDPR, Article 82, paragraph 4.
[16] “EU user consent policy”, Google, to apply from 25 May 2018 (URL: https://www.google.com/about/company/consentstaging.html)
[17] The GDPR, Article 6, paragraph 1, a.
[18] The GDPR, Article 13, paragraph 2, f, and Recital 60.
[19] “Help with the EU user consent policy”, Google (URL:https://www.google.com/about/company/consenthelpstaging.html)

PageFair writes to all EU Member States about the ePrivacy Regulation

This week PageFair wrote to the permanent representatives of all Member States of the European Union in support of the proposed ePrivacy Regulation.
Our remarks were tightly bounded by our expertise in online advertising technology. We do not have an opinion on how the proposed Regulation will impact other areas.
The letter addresses four issues:

  1. PageFair supports the ePrivacy Regulation as a positive contribution to online advertising, provided a minor amendment is made to paragraph 1 of Article 8.
  2. We propose an amendment to Article 8 to allow privacy-by-design advertising. This is because the current drafting of Article 8 will prevent websites from displaying privacy-by-design advertising.
  3. We particularly support the Parliament’s 96th and 99th amendments. These are essential to enable standard Internet Protocol connections to be made in many useful contexts that do not impact privacy.
  4. We show that tracking is not necessary for the online advertising & media industry to thrive. As we note in the letter, behavioural online advertising currently accounts for only a quarter of European publishers’ gross revenue.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/03/PageFair-letter-on-ePrivacy-to-perm-reps-13-March-2018.pdf” info=”none” info_place=”top” info_trigger=”hover”]Read the letter [/x_button]

The digital economy requires a foundation of trust to enable innovation and growth. The enormous growth of adblocking (to 615 million active devices) across the globe proves the terrible cost of not regulating. We are witnessing the collapse of the mechanism by which audiences support the majority of online news reports, entertainment videos, cartoons, blogs, and cat videos that make the Web so valuable and interesting. Self-regulation, lax data protection and enforcement have resulted in business practices that promise a bleak future for European digital publishers.
Therefore, we commend the Commission and Parliament’s work thus far, and wish the Council (of Ministers of the Member States) well in their deliberations.

GDPR consent design: how granular must adtech opt-ins be?

This note examines the range of distinct adtech data processing purposes that will require opt-in under the GDPR.[1]
In late 2017 the Article 29 Working Party cautioned that “data subjects should be free to choose which purpose they accept, rather than having to consent to a bundle of processing purposes”.[2] Consent requests for multiple purposes should “allow users to give specific consent for specific purposes”.[3]  Rather than conflate several purposes for processing, Europe’s regulators caution that “the solution to comply with the conditions for valid consent lies in granularity, i.e. the separation of these purposes and obtaining consent for each purpose”.[4] This draws upon GDPR, Recital 32.[5]
In short, consent requests must be granular, showing opt-ins for each distinct purpose.

How granular must consent opt-ins be?

In its 2013 opinion on “purpose limitation”, the Article 29 Working Party went some way toward defining the scope of a single purpose: a purpose must be “sufficiently defined to enable the implementation of any necessary data protection safeguards,” and must be “sufficiently unambiguous and clearly expressed.”[6]
The test is “If a purpose is sufficiently specific and clear, individuals will know what to expect: the way data are processed will be predictable.”[7] The objective is to prevent “unanticipated use of personal data by the controller or by third parties and in loss of data subject control [of these personal data]”.[8]
In short, a purpose must be specific, transparent and predictable.[9] It must be describable to the extent that the processing undertaken for it would not surprise the person who gave consent for it.
The process of showing an ad to a single person (in online behavioral advertising) involves the processing of personal data for several distinct purposes, by hundreds of different companies.
[accordion id=”video”] [accordion_item title=”Video: how personal data passes between companies in online behavioral advertising” parent_id=”video”]


[/accordion_item][/accordion]
Therefore, a broad, all-encompassing sentence such as “to show you relevant advertising” does not make it possible for one to grasp how one’s data will be used by a large number of companies. It would not be possible to understand from this sentence, for example, that inferences about one’s characteristics would be drawn, or what types of consequences may result.
The following table shows an indicative list of ten purposes for which personal data are currently processed in the online behavioral advertising system. In practice, there may be more purposes at play. The table also generalizes the types of company involved in the selection and display of an ad.
[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/purposes.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download high resolution PDF [/x_button]
A spreadsheet version of this table is available here.
(Refer to footnote 10 for a discussion of the challenges presented by these purposes for all businesses involved.[10])

Pre-consent naming of each controller, and post-consent granular withdrawal per controller

Recital 42 of the GDPR notes that “For consent to be informed, the data subject should be aware at least of the identity of the controller and the purposes of the processing”.[11] All controllers (including “joint controllers” that “jointly determine the purposes and means of processing”[12]) must be named.[13]
Each purpose must be very clear, and each opt-in requires a “clear affirmative action” that is both “specific”, and “unambiguous”.[14] There can be no pre-ticked boxes,[15] and “consent based on silence” is not permitted.[16]
Therefore, a consent request should be made with granular options for each of these purposes, and should name each controller that processes personal data for each of these purposes. For example:

Specific purpose 1 | controllers A, B, C | options: Accept / Refuse 
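
To illustrate the degree of granularity this implies (a hypothetical data structure for illustration only, not a prescribed or standard format), a consent record would need a separate opt-in for every purpose, naming the controllers covered by each:

// Hypothetical shape of a granular consent record: one entry per specified,
// explicit purpose, naming the controllers covered and the visitor's choice.
interface PurposeConsent {
  purposeId: string;               // e.g. "purpose-1" (invented identifier)
  description: string;             // the specific purpose as shown to the user
  controllers: string[];           // every controller processing data for this purpose
  choice: "accepted" | "refused";  // no default and no pre-ticked value
  recordedAt: string;              // ISO 8601 timestamp of the choice
}

const consentRecord: PurposeConsent[] = [
  {
    purposeId: "purpose-1",
    description: "Specific purpose 1",
    controllers: ["Controller A", "Controller B", "Controller C"],
    choice: "refused",
    recordedAt: new Date().toISOString(),
  },
  // ...one further entry, with its own opt-in, for every other purpose...
];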

There are two different scenarios for how consent for these purposes will be presented: the best case, and the more likely worst case.

The best scenario

At a minimum, then, assuming that all websites, SSPs, Ad Exchanges, DSPs, DMPs, and advertisers could align to pursue only these purposes, a consent request for this would include granular opt-in controls for a wide range of diverse purposes, the categories of processor pursuing each, and a very long list of controller names pursuing each.
The language and presentation of the request must be simple and clear, ideally the result of user testing.[17]
A consent request for a single purpose, on behalf of many controllers, might look like this.

Specific processing purpose consent, for multiple controllers,
with “next” button for multiple processing purpose opt-ins

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

What is presented when?

The Article 29 Working Party suggests that consent notices should have layers of information so that they do not overload viewers with information, but make necessary details easily available.[18] This is adopted in the design above using “View details”, “Learn about your data rights here”, and similar buttons and links.
When a user clicks “view details” to see the next layer of information about a controller

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

While some details, such as contact details for a company’s data protection officer, can be placed in a secondary layer, the primary layer must include “all basic details of the controller and the data processing activities envisaged”.[19]
Elements presented in this layer

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

The likely scenario

The scenario above assumes that all businesses in online behavioral advertising can agree to pursue tightly defined purposes without deviation. However, it is more likely that controllers will need granular opt-ins, because their purposes are unique.
Any individual controllers who intend to process data for their own unique purposes will need further granular opt-ins for these purposes. Since adtech companies tend to deviate from the common purposes outlined above, it is likely that most or all of them would ultimately require granular purpose consent for each controller.
However, even if all controllers pursued an identical set of purposes so that they could all receive consent via a single consent dialogue that contained a series of opt-ins, there would need to be a granular set of consent withdrawal controls that covered every single controller once consent had been given. The GDPR says that “the data subject may exercise his or her rights under this Regulation in respect of and against each of the controllers”.[20]

A higher bar: “explicit consent”

Processing of personal data in online behavioral advertising (for example, purposes 2, 3, 5, 8, and 10 in the table above) is highly likely to produce special categories of data by inference.[21] Where this occurs, these purposes require “explicit” consent.[22]
Special categories of data reveal “racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, … [and] data concerning health or data concerning a natural person’s sex life or sexual orientation”.[23] 
Making consent explicit requires an additional act of confirmation. For example, the Article 29 Working Party suggests that two-stage verification is a suitable means of obtaining explicit consent.[24] One possible approach to this is suggested in PageFair’s design below.
Suggested mechanism for “explicit consent” 

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

One can confirm one’s opt-in in a second movement of the finger, or cursor and click. It is unlikely that a person could confirm using this interface unless it was their intention.  
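As a sketch of the logic behind such a two-stage control (hypothetical code, not a description of PageFair’s implementation), consent would only be recorded when a second, separate confirmation follows the initial opt-in gesture:

// Hypothetical two-stage opt-in: the first gesture only arms a pending state,
// and consent is recorded only if a distinct confirmation gesture follows.
type ExplicitConsentState = "none" | "pending" | "confirmed";

class ExplicitConsent {
  private state: ExplicitConsentState = "none";

  requestOptIn(): void {
    this.state = "pending";      // first action: nothing is recorded yet
  }

  confirm(): void {
    if (this.state === "pending") {
      this.state = "confirmed";  // second, separate action records the consent
    }
  }

  cancel(): void {
    this.state = "none";         // abandoning the dialogue leaves no consent
  }

  isGiven(): boolean {
    return this.state === "confirmed";
  }
}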

[x_button shape=”rounded” size=”small” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/01/consent-dialogues.pdf” info=”none” info_place=”top” info_trigger=”hover”]Download wireframes [/x_button]

Note that even this high bar, however, may not be permitted in some Member States. The GDPR gives European Member States the latitude to enact national legislation that prohibits consent as a legal basis for processing of special categories of data.[25] Therefore, it may not be legal to process any special categories of personal data in some EU Member States.

Conclusion 

Consent for website and app publishers is certainly an important objective, but the personal data it provides must only be processed after data leakage has been stopped. Data leakage (through RTB bid requests, cookie syncs, JavaScript ad units, and mobile SDKs) exposes publishers as the most obviously culpable parties that regulators and privacy NGOs can target. At the same time, it also exposes their adtech vendors and advertisers to large fines and legal actions.[26]
Websites, apps, and adtech vendors should switch from using personal data to monetize direct and RTB advertising to “non-personal data”.[27] Using non-personal, rather than personal, data neutralizes the risks of the GDPR for advertisers, publishers, and adtech vendors. And it enables them to address the majority (80%-97%) of the audience that will not give consent for 3rd party tracking across the web.[28]
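As a rough sketch of what a “non-personal” request could look like in practice (the field names loosely follow the shape of OpenRTB-style bid requests, but they are an assumption made for illustration rather than anyone’s actual implementation), identifying fields are simply never included when consent is absent:

// Hypothetical: strip user-identifying fields from an OpenRTB-style bid
// request when there is no consent, leaving only contextual information.
interface BidRequest {
  id: string;
  site?: { domain?: string; page?: string; cat?: string[] };
  device?: { ua?: string; ifa?: string; ip?: string; geo?: unknown };
  user?: { id?: string; buyeruid?: string; data?: unknown[] };
}

function toNonPersonal(req: BidRequest): BidRequest {
  return {
    id: req.id,
    site: req.site,                                          // contextual: where the ad will appear
    device: req.device ? { ua: req.device.ua } : undefined,  // coarse device info only; drop ifa, ip, geo
    // the user object (ids, segments) is omitted entirely
  };
}
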
We recently revealed PageFair Perimeter, a regulatory firewall that blocks 3rd party data leakage, and enables publishers and adtech partners to use non-personal data for direct and RTB monetization when consent is absent (and leverage personal data when adequate consent has been given). You can learn more about Perimeter here. Publishers using Perimeter do not need people’s personal data (nor the consent required to process it) to monetize websites and apps.

[x_button shape=”rounded” size=”regular” float=”none” href=”http://pagefair.com/perimeter/” info=”none” info_place=”top” info_trigger=”hover”]Learn about Perimeter[/x_button]

Postscript

A hiccup in the choreography of the European Commission’s legislative proposals means that non-tracking cookies will need storage consent, at least until the application of the forthcoming ePrivacy Regulation. These cookies, however, contain no personal data, and obtaining consent for their storage is significantly less burdensome than obtaining consent to process personal data for multiple purposes and multiple controllers. Update, 16 January 2018: See the PageFair Insider note on storage consent for non-tracking cookies.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]


Notes

[1] See our discussion of why consent is the appropriate legal basis for online behavioral advertising in “Why the GDPR ‘legitimate interest’ provision will not save you” , PageFair Insider, 13 March 2017 (URL: https://pagefair.com/blog/2017/gdpr-legitimate-interest/).
[2] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 11.
[3] ibid., p. 13.
[4] ibid., p. 11.
[5] Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Recital 32. “…Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. …”
[6] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 12.
Curiously, the Spanish Data Protection Authority has issued guidance that contains a sentence suggesting that continuing to browse a website might constitute consent, which is at odds with the Article 29 Working Party guidance on consent and appears to be entirely at odds with the text of the Regulation. See “Guía del Reglamento General de Protección de Datos para responsables de tratamiento”, Agencia Española de Protección de Datos, November 2017, p. 6.
[7] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[8] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 12.
[9] “Opinion 03/2013 on purpose limitation”, Article 29 Working Party, 2 April 2013, p. 13.
[10] None of these purposes would be permissible unless data leakage were first addressed. See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/). Furthermore,

  • Purpose 3 could not be permissible in any situation.
  • Purposes 2, 3, 5, 8, and 10 are highly likely to produce special categories of data by inference. See discussion of “explicit consent” in this note.
  • Regarding the purposes for which data have been sold, and to what category of customer, see “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp 39-40, and B3-B

[11] The GDPR, Recital 42.
[12] The GDPR, Article 26, paragraph 1.
[13] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[14] The GDPR, Article 4, paragraph 11.
[15] ibid., Recital 32.
[16] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 16.
[17] “Guidelines on transparency under Regulation 2016/679”, Article 29 Working Party, November 2017, pp 8, 13.
[18] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 14.
[19] ibid., p. 15.
[20] The GDPR, Article 26, paragraph 3.
[21] “Informing data subjects is particularly important in the case of inferences about sensitive preferences and characteristics. The controller should make the data subject aware that not only do they process (non-special category) personal data collected from the data subject or other sources but also that they derive from such data other (and special) categories of personal data relating to them.” See “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679”, Article 29 Working Party, 3 October 2017, p. 22.
[22] The GDPR, Article 9, paragraph 2, a.
[23] ibid., Article 9, paragraph 1.
[24] “Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 19.
[25] The GDPR, Article 9, paragraph 2, a.
[26] See “Consent to use personal data has no value unless one prevents all data leakage”, PageFair Insider, October 2017 (URL: https://pagefair.com/blog/2017/understanding-data-leakage/).
[27] Non-personal data are any data that can not be related to an identifiable person. As Recital 26 of the GDPR observes, “the principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable”. This recital reflects the finding of the European Court of Justice in 2016 that data are not personal “if the identification of the data subject was prohibited by law or practically impossible on account of the fact that it requires a disproportionate effort in terms of time, cost and manpower, so that the risk of identification appears in reality to be insignificant”. Judgment of the Court (Second Chamber) Patrick Breyer v Bundesrepublik Deutschland, Case C-582/14, 19 October 2016.
Non-tracking cookies, which contain no personal data, are useful for privacy-friendly advertising, and for other functions where an individual does not need to be identified such as A/B testing.
[28] See “Research result: what percentage will consent to tracking for advertising?”, PageFair Insider, September 2017 (URL: https://pagefair.com/blog/2017/new-research-how-many-consent-to-tracking/).
The granularity of consent required for online behavioral advertising will make the consenting audience even smaller. Moreover, consent for adtech will not only be hard to get, it will also be easy to lose. Consent can be withdrawn with the same degree of ease as it was given, under The GDPR, Article 7, paragraph 3.
The Article 29 Working Party demonstrates what this means in practice: “When consent is obtained … through only one mouse-click, swipe, or keystroke, data subjects must … be able to withdraw that consent equally as easily”.
“Guidelines on consent under Regulation 2016/679”, Article 29 Working Party, 28 November 2017, p. 21.
The guidance also says that “Where consent is obtained through use of a service specific user interface (for example, via a website, an app, a log-on account, the interface of an IoT device or by e-mail), there is no doubt a data subject must be able to withdraw consent via the same electronic interface, as switching to another interface for the sole reason of withdrawing consent would require undue effort”.

DSP 'contextual' targeting offers solution to strict GDPR regulations

Programmatic online advertising will not cease to exist because of the GDPR or the proposed ePrivacy Regulation. Personal data may seem essential to digital advertising, but it is not the only way to target a relevant audience. Targeting based on context was a reliable method for decades before we came to rely on collecting and cross-referencing vast amounts of intrusive data. If data leakage is solved, then contextual targeting is a quick fix for the industry as it struggles to adapt to the GDPR.
Brands, and agencies acting on behalf of brands, use Demand Side Platforms (DSPs) to place advertising. Analysis of every leading DSP’s marketing material reveals that they offer 'contextual' targeting that requires no personal data, in addition to 'behavioural' targeting that relies on personal data. In other words, DSPs are already able to target based on content and context without exposing themselves and brands to the legal hazard of using personal data. This provides a safety net for brands, who can continue to use DSPs to reach audiences once the GDPR and ePrivacy Regulation apply, provided those DSPs are using contextual targeting. (For context on where DSPs fit within the complicated structure of advertising technology see the Display Lumascape).

DSP | Offers contextual targeting
AdMantx | YES
MediaMath | YES
InviteMedia (by Google) | YES
Turn | YES
Data XU | YES
EfficientFrontier (Adobe) | YES
theTradeDesk | YES
Chango (Rubicon Project) | YES
Simpli.fi | YES
Sitescout | YES
Digilant | YES
Acuity | YES
AdBuyer.com (MBuy) | YES
CTRL/SHIFT | YES
Brandscreen | No information available*
Choozle | YES

* Brandscreen was reported as having again been placed in administration in late-2016 and we could find no clear information on its current status.
Moving on to companies found under other sections of the Lumascape, but which actually operate as DSPs: TubeMogul – listed under Ad Networks and recently acquired by Adobe – calls contextual advertising an “essential component of programmatic advertising” and uses external partners to enable dynamic analysis of URLs to determine context and content. In fact, external partners such as Grapeshot and Peer39 are behind much of the contextual targeting carried out by DSPs.
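For a toy illustration of what URL-based contextual analysis involves (the categories, keywords, and URL below are invented for the example; vendors such as Grapeshot and Peer39 use far more sophisticated methods):

// Hypothetical toy classifier: choose an ad category from the page URL alone,
// using no data about the person viewing the page.
const CATEGORY_KEYWORDS: Record<string, string[]> = {
  motoring: ["car", "cars", "motor", "driving"],
  finance: ["bank", "loan", "mortgage", "invest"],
  travel: ["flight", "hotel", "holiday", "travel"],
};

function categorize(pageUrl: string): string | null {
  const path = pageUrl.toLowerCase();
  for (const [category, keywords] of Object.entries(CATEGORY_KEYWORDS)) {
    if (keywords.some((kw) => path.includes(kw))) {
      return category;
    }
  }
  return null;  // unknown context: fall back to run-of-site advertising
}

categorize("https://example.com/reviews/best-family-cars");  // "motoring"
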
Whether DSPs implement their own solutions for contextual targeting or use external partners, forthcoming EU regulations will not mean the end of programmatic advertising.
And that may be a good thing, especially for brands.
Selecting where to place an ad solely on the basis of context enables an advertiser to avoid the minefield of the GDPR and the ePrivacy Regulation, and it is safer for brands. Analysis of content offers the potential to weed out sites that exist to farm ads or that carry unsuitable content, while avoiding any impression that the brand is involved in the kind of invisible user tracking that is becoming increasingly unpopular around the world.
If you work for one of these DSPs or in another section of the Lumascape, please tell us what you’re doing to future-proof your company against the GDPR and the ePrivacy Regulation. Leave a comment below, tweet at @PageFair, or email us at press@pagefair.com.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Why the GDPR ‘legitimate interest’ provision will not save you

The “legitimate interest” provision in the GDPR will not save behavioral advertising and data brokers from the challenge of obtaining consent for personally identifiable data.

As previous PageFair analysis illustrates, personal data will become toxic except where it has been obtained and used with consent once the General Data Protection Regulation is applied in May 2018.
[prompt type=”left” title=”Access the GDPR/ePR repository” message=”A repository of GDPR and ePrivacy Regulation explainers, official docs, and current status.” button_text=”Access Now” href=”https://pagefair.com/datapolicydocs/”]
Even so, many advertising intermediaries believe that they can continue to use personal data without consent because of an apparent carve-out related to “legitimate interest” contained in the GDPR. This is a false hope.

Legitimate interest

The GDPR does indeed provide for “legitimate interest” as a legal basis for using personal data without obtaining consent.[1] A legitimate interest provision was also included in the previous Data Protection Directive 95/46/EC.[2] However, the GDPR now includes an explicit mention of direct marketing as a legitimate interest (in Recital 47),[3] which has lured many adtech businesses into the comfortable but erroneous supposition that they will not have to ask people for permission to use their personal data.

A legitimate interest is a clearly articulated benefit to a single company, or to society as a whole,[4] that can be derived from processing personal data in a lawful way.[5] However, the Article 29 Working Party of data protection authorities of EU countries has already made it clear that merely having a legitimate interest does not entitle one to use personal data.[6]

The objective of the “legitimate interest” provision is to give controllers “necessary flexibility for data controllers for situations where there is no undue impact on data subjects”.[7] The Article 29 Working Party cautioned that it is not to be used “on the basis that it is less constraining than the other grounds”.[8] In other words, it is not a get-out-of-jail-free card.

Under the Data Protection Directive that preceded the GDPR some EU countries viewed it as “an ‘open door’ to legitimize any data processing which does not fit in one of the other legal grounds.”[9] This will end with the GDPR, which harmonizes the approach across all the countries of the European Union.

The balancing test 

Article 6, paragraph 1 (f) of the GDPR includes the following important caveat: “except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject”.[10] In other words, a business that intends to use personal data must balance its legitimate interest not only against the rights of the data subject, which is a significant test in itself,[11] but also against the data subject’s interests, irrespective of whether those interests are legitimate or not.[12] Any company that hopes to rely on legitimate interest also bears the onus of demonstrating that its interest is favored in such a balancing test.[13]

This is not a figurative exercise. The Article 29 Working Party cautions that the balancing test should be documented in such a way that data subjects, data authorities, and the courts can examine.[14] It should encompass a broad range of factors[15] including “any possible (potential or actual) consequences of data processing”.[16] This would include, for example, “broader emotional impacts” and the “chilling effect on … freedom of research or free speech, that may result from co­ntinuous monitoring/tracking”.[17] 

The test also must consider the manner in which personal data are processed. For example,

“whether large amounts of personal data are processed or combined with other data (e.g. in the case of profiling…). Seemingly innocuous data, when processed on a large scale and combined with other data may lead to inferences about more sensitive data”.[18] 

Europe’s data protection authorities take a dim view of such large scale processing: ­­­­

“Such analysis may lead to uncanny, unexpected, and sometimes also inaccurate predictions, for example, concerning the behavior or personality of the individuals concerned. Depending on the nature and impact of these predictions, this may be highly intrusive to the individual’s privacy”.[19] 

A further factor in the balancing test is mentioned in Recital 47 of the GDPR: “…taking into consideration the reasonable expectations of data subjects based on their relationship with the controller”.[20] A business involved in digital advertising must ask the following question: Is it reasonable to assume that a regular person who peruses the web expects that their behavior is being tracked and measured, consolidated across devices, and that the results of these operations are being traded between different companies that he or she has never heard of, and retained for further trading and consolidation over considerable periods of time?

Behavioral advertising and data-brokering must be based on consent 

The legitimate interest provision in the GDPR sets a high bar. Indeed, the Working Party’s concern about the negative impacts of personal data misuse is so broad as to encompass those that result from many cumulative actions, and where “it may be difficult to identify which processing activity by which controller played a key role”.[21] This is bad news for the cascade of cookie syncing and data trading typical of behavioral advertising.

The Article 29 Working Party has considered what a balancing test would yield where behavioral advertising is concerned. It concluded that “consent should be required, for example, for tracking and profiling for purposes of … behavioral advertising, data-brokering, … [and] tracking-based digital market research”.[22]

The Working Party regards the balance as follows: “the economic interest of business organizations to get to know their customers by tracking and monitoring their activities online and offline” must be balanced “against the (fundamental) rights to privacy and the protection of personal data of these individuals and their interest not to be unduly monitored”.[23]

Consent – and nothing short of it – is the necessary legal basis for processing personally identifiable data for behavioral advertising.

Two options  

Therefore, hundreds of adtech companies, who cannot legitimately obtain the personal data they depend on, are facing a huge challenge. There are two categories of options.

Option 1. Invest heavily in obtaining consent

For the majority of advertising intermediaries this will require reaching an accommodation with publishers, who have direct and trusted relationships with end-users. Whatever this accommodation is, it is likely to tip the balance of power away from adtech and back in favor of publishers. Publishers may recover some of the marketing spend that they lost to the many advertising technology companies of the Lumascape in the shift to digital. As we have suggested previously, mergers with, or acquisitions of, media properties may be one way for global advertising holding companies to buy trusted first party relationships with end-users, and to establish a means of requesting end-users’ consent.

Option 2. Avoid the GDPR’s liabilities and regulatory overhead with a no personally identifiable data approach

Programmatic and behavioral advertising are possible without personally identifiable data. A personal data firewall can free brands and intermediaries from the GDPR’s new liabilities and regulatory overhead by anonymizing data while delivering relevant advertising.
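The “personal data firewall” idea can be sketched as a simple request filter (hypothetical code for illustration; the hosts and parameter names are invented and this does not describe any particular product): requests to unauthorized third parties are blocked, and identifying parameters are stripped from the rest.

// Hypothetical sketch of a personal data firewall for outbound ad requests:
// allow only authorized third-party hosts and strip identifying parameters.
const AUTHORIZED_HOSTS = new Set(["ads.example-exchange.com"]);
const IDENTIFYING_PARAMS = ["uid", "buyeruid", "idfa", "gaid", "email"];

function filterAdRequest(rawUrl: string): string | null {
  const url = new URL(rawUrl);
  if (!AUTHORIZED_HOSTS.has(url.hostname)) {
    return null;  // unauthorized third party: block the request entirely
  }
  for (const param of IDENTIFYING_PARAMS) {
    url.searchParams.delete(param);  // remove identifying query parameters
  }
  return url.toString();
}

filterAdRequest("https://ads.example-exchange.com/bid?slot=top&uid=12345");
// -> "https://ads.example-exchange.com/bid?slot=top"
filterAdRequest("https://tracker.example.net/pixel?uid=12345");
// -> null (blocked)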

We will be writing more about this.

 

Invitation:

RightsCon, Brussels, March 29, 5.15pm – 6.15pm

I will be on the EDRi panel at RightsCon, alongside representatives of the European Data Protection Supervisor and the IAB. Please come and say hello.

[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Article 6, paragraph 1, f.

[2] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Article 7 (f).

[3] “The legitimate interests of a controller, including those of a controller to which the personal data may be disclosed, or of a third party, may provide a legal basis for processing, provided that the interests or the fundamental rights and freedoms of the data subject are not overriding, taking into consideration the reasonable expectations of data subjects based on their relationship with the controller. Such legitimate interest could exist for example where there is a relevant and appropriate relationship between the data subject and the controller in situations such as where the data subject is a client or in the service of the controller. At any rate the existence of a legitimate interest would need careful assessment including whether a data subject can reasonably expect at the time and in the context of the collection of the personal data that processing for that purpose may take place. The interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing. Given that it is for the legislator to provide by law for the legal basis for public authorities to process personal data, that legal basis should not apply to the processing by public authorities in the performance of their tasks. The processing of personal data strictly necessary for the purposes of preventing fraud also constitutes a legitimate interest of the data controller concerned. The processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest.” Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Recital 47.

[4] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, p. 10.

[5] ibid., pp. 10-11.

[6] ibid., p. 25.

[7] ibid., p. 10.

[8] ibid., p. 3.

[9] ibid., p. 5.

[10] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Article 6, para 1 (f).

[11] Data protection is a fundamental right in European law. Article 8 of the European Union Charter of Fundamental Rights enshrines the right of every citizen to “the protection of personal data concerning him or her”. The European Union Charter of Fundamental Rights, Article 8, paragraph 1. “Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law”. The European Union Charter of Fundamental Rights, Article 8, paragraph 2.

[12] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, pp. 9, 30.

[13] ibid., p. 52.

[14] ibid., pp. 43, 53-54.

[15] ibid., pp. 33, 50-51, 55-56.

[16] ibid., p. 37.

[17] ibid., p. 37.

[18] ibid., p. 39.

[19] ibid., p. 39.

[20] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Recital 47.

[21] Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC, 9 April 2014, p. 37.

[22] ibid., p. 46.

[23] ibid.

The Need to Know Adblock Slidedeck (updated!)

Dr Johnny Ryan of PageFair presented at The Advertising Research Foundation in New York this month.

This presentation includes

  • The latest adblock figures globally and for the US
  • Demographic discussion of who adblock users are
  • Options for media owners to address adblocking, from access restriction to tamper-proof ad serving
  • Unexpected benefits for marketers from the adblocking crisis


He was speaking alongside Omnicom, Annalect, and Intel.

Ten Key Things That Happened in Q4

Amid the blizzard of press releases and conference tidbits concerning media, advertising, and adblocking, only some really matter.
Here are the ten key things that happened in Q4.

OCTOBER

1. US Department of Justice examines possible agency shenanigans. 

It transpired that the US Department of Justice had launched an investigation into rigged bids that unfairly favored advertising agencies’ in-house services over others, at clients’ expense. The Association of National Advertisers’ report into agency kickbacks, released in June, exposed agency practices that shortchanged clients, and several big-brand CMOs launched audits of their agencies. But the DOJ investigation now raises the stakes for agency executives: previous investigations, in 2002, resulted in prison sentences.

2. Media consolidation

AT&T agreed a deal to purchase Time Warner for $85.4 billion. Meanwhile, Verizon’s acquisition of Yahoo! was troubled by the news that half a billion Yahoo! user accounts had been compromised by hackers.

NOVEMBER

3. Fake news, boycotts, and trustworthy media 

The US election upset prompted a focus on “fake news”, and the advertising that supports it. One episode stood out amid the wider media turmoil: Kellogg’s, a large food brand, removed its advertising from Breitbart, a website associated with the “alt-right”. In a singularly bizarre countermove, Breitbart called for a boycott of the brand while at the same time soliciting advertising from other brands. The episode confirmed yet again the utmost value of trustworthy media in a changing information economy.

Image: Breitbart advertisement.

4. Facebook revealed upside to its stance on adblocking

Facebook released earnings figures for Q3 (read earnings call transcript here) that showed that its decision to deploy ads that could not be tampered with by adblockers had increased its desktop ad revenue by 9% in the second half of Q3. Facebook is likely to make an additional three quarters of a billion dollars by showing tamper-proof advertising in 2017.

5. Google and Facebook dominated advertising revenue growth 

Jason Kint, the influential CEO of publisher trade group Digital Content Next, parsed PwC revenue figures and concluded that although spending on digital advertising had increased by 19% year-over-year in the first half of 2016, this was entirely accounted for by Google and Facebook’s growing revenues. Established media had actually declined.

6. Further consolidation 

Adtech & martech companies consolidated. Krux, a data warehouse, was acquired by Salesforce for $700 million. TubeMogul, a system that advertisers use to buy video ads, was acquired by Adobe for $540 million. Criteo also completed its acquisition of e-commerce advertising firm HookLogic for $250 million.

7. Facebook closes its ad server 

Meanwhile, Facebook announced that it would close its ad server, Atlas, which was used by advertisers to understand who was seeing their advertising campaigns. Instead, Facebook will help other ad servers apply Facebook’s 1st party understanding of its users to measure the efficacy of advertisers’ campaigns.

DECEMBER

8. Amazon strengthens its advertising business 

Amazon strengthened its advertising business by launching two new services for publishers: “Transparent Ad Marketplace” and “Shopping Insights Service”. The marketplace handles header bidding in the cloud and allows publishers to take bids from more parties for their visitors’ attention without slowing page load. The insights service lets publishers use some of Amazon’s own extensive 1st party user data to better understand who is visiting their own websites.
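
For readers unfamiliar with the mechanics, the sketch below illustrates in TypeScript how a cloud-hosted header bidding auction can keep latency off the page: the browser makes one call, the server fans out to demand partners in parallel, and only the winning bid comes back. The partner endpoints and bid shape are invented for illustration; this is not Amazon’s actual API.

```typescript
// Simplified server-side header bidding: one call from the page, parallel
// fan-out to demand partners in the cloud, highest CPM wins.
// Partner URLs and the bid shape are invented for illustration.
interface Bid {
  partner: string;
  cpm: number;
  creativeUrl: string;
}

const PARTNERS = [
  "https://bidder-a.example/bid",
  "https://bidder-b.example/bid",
];

async function requestBid(endpoint: string, slotId: string): Promise<Bid | null> {
  try {
    const res = await fetch(endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ slotId }),
      signal: AbortSignal.timeout(300), // hard cap per partner
    });
    return res.ok ? ((await res.json()) as Bid) : null;
  } catch {
    return null; // a slow or failing partner must not block the auction
  }
}

export async function runAuction(slotId: string): Promise<Bid | null> {
  const bids = await Promise.all(PARTNERS.map((p) => requestBid(p, slotId)));
  return bids
    .filter((b): b is Bid => b !== null)
    .reduce<Bid | null>((best, b) => (best && best.cpm >= b.cpm ? best : b), null);
}
```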

9. EU privacy reforms hint at global shift 

Privacy reforms in the European Union continued to create more protections for users on the one hand, and existential challenges for the Lumascape on the other. Three developments are worth highlighting.

  • First, the WFA’s CEO signaled that the big brands are likely to apply the privacy standards due to be introduced in the EU’s General Data Protection Regulation across the globe.
  • Second, the Article 29 Working Party of data protection regulators published guidance on the new data protection officers that businesses providing services in the EU are required to appoint: these officers must have tenure and report to the highest levels.
  • Third, a draft text of the new ePrivacy Directive was leaked to Politico. A finalized proposal is only due from the European Commission next week, and will then be subject to a three-way negotiation between Commission officials, European Parliament representatives, and the Council of Member State Ministers. Nonetheless, the leaked draft indicates the Commission’s strong stance on consumer privacy: among its many reforms are measures that will essentially kill 3rd party cookies.

10. WhiteOps revealed a colossal new ad fraud 

WhiteOps released details of a colossal ad fraud operation in late December: “Methbot” generates $3-$5 million every day by defrauding brands that spend on video ads. The fraud is two-sided. Counterfeit websites mimic thousands of genuine premium sites and request ads from networks. In parallel, hundreds of servers operate automated web browser sessions (complete with fake mouse movements and fake social network logins) to simulate the viewing of 200-300 million video ads every day. This release will give pause to anyone who assumes that ad fraud is not an enormous problem.
Q4 2016 in one sentence…
Agencies faced scrutiny not only from wary CMOs but also from the DOJ, ad fraud graduated to a new level, new privacy standards threatened 3rd party tracking, Facebook revealed an immediate financial upside of its stance on adblocking, and Google and Facebook dominated advertising revenue growth.

You may be interested in our roundup of the previous quarter >

Sign up to PageFair Insider to get updates

Publishers – your only weapon is trust

This post was first published on Digital Content Next.
Adblocking—and publisher responses to it—sit at the nexus of two trends: the increasing value of trust in the publisher-consumer relationship, and the emerging conditions of the new information market.
I wrote some years ago that the information market had been turned on its head. The Internet turns many types of information that were once scarce and expensive into overabundant—and therefore cheap—commodities. As a corollary, trust and attention have become increasingly valuable.
In short: As information becomes cheap, trust becomes precious.
[Image: Ryan’s loose theory of TMI]
This generation suffers from the crushing pressure of information overload. Consumers trust premium publishers to help them cut through the white noise and present information and media that are worth spending their limited attention on. As the information deluge continues to swell over time, this trust will become even more valuable.
We are witnessing the erosion of this trust and of the fair deal between users and content creators as millions of consumers install adblockers. Publishers must choose between two conflicting options at this historic moment: to reinforce trust by addressing consumer grievances; or to ignore those grievances and restrict consumer choice.
This is a human question rather than a technological one. Adblocking will soon be technologically irrelevant. PageFair has the technology to serve ads in a manner that adblockers cannot circumvent, and some of our competitors claim similar abilities. Indeed, publishers can defeat adblocking by serving ads from their own editorial content systems. But the ability to serve ads should not prevent publishers from addressing the genuine consumer grievances that caused the rise of adblocking.
These consumer grievances that caused adblocking are now widely understood and can be summarized thus: aggressive ads obscure content, infringe on user privacy, hoover up bandwidth and thereby add expense to users’ data plans, slow page load times, and expose users’ devices to easily avoidable security hazards.
Ignoring these grievances and erecting an ‘adblock wall’ that prevents visitors from entering one’s web site until they switch off their adblockers is a lost opportunity to rebuild trust with the user. In the longer term this tactic restricts user choice on the open web and harms publishers—with few exceptions—by causing large numbers of users to go to other sites.
The ultimate power to leave any website that does not live up to their expectations has always been in consumers’ hands. It is a mistake to think of an adblock wall as a mechanism to give the consumer choice rather than restrict their access.
The ‘blocked web’, the portion of the web where users block ads, is steadily growing. This is creating a new, premium space that brand marketers (and therefore publishers) can ill afford to ignore—particularly since it is uncluttered and unaffected by ad fraud. But simply reinserting ads on the blocked web without addressing the consumers’ grievances is a mistake that will undermine the trust between publisher and user.
[Image: the two webs]
Publishers must make sure that the technological solutions they employ to serve ads on the blocked web also solve the speed, privacy, and UX issues that caused adblocking in the first place.
It is worth describing the approach that we have taken. Whether or not a publisher opts for PageFair or some other solution, the following are requisite elements in a sustainable response on the blocked web, irrespective of what one is doing on the normal web.
First, we decided to take a strongly consumer-friendly approach to privacy. For example, our analytics respect the Do Not Track standard. Advertising can be relevant and at the same time respectful of users’ data.
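
As a minimal illustration of what respecting Do Not Track can look like in practice, the TypeScript sketch below checks the browser’s DNT signal before recording anything. The analytics call is a placeholder, not PageFair’s implementation.

```typescript
// Honor Do Not Track before recording any analytics event.
// recordPageview() stands in for whatever analytics call a site actually uses.
function dntEnabled(): boolean {
  // Browsers that support DNT expose the preference as the string "1".
  const nav = navigator as Navigator & { doNotTrack?: string | null };
  return nav.doNotTrack === "1";
}

function trackPageview(recordPageview: (path: string) => void): void {
  if (dntEnabled()) {
    return; // the user asked not to be tracked; record nothing
  }
  recordPageview(location.pathname);
}
```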
Second, we adopted a robust approach to security. “Malvertising” incidents are only possible because hackers can programmatically book ads to deliver their malicious JavaScript into the user’s browser. In programmatic and elsewhere we nullify this problem by executing active JavaScript in a safe sandbox on our servers, not in the user’s browser.
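
The sketch below shows the general idea in TypeScript using Node’s vm module: the creative’s script runs against a stubbed document on the server, and only the resulting static markup is sent to the browser. This is purely illustrative; Node’s vm module is not a hardened security boundary on its own, and a production system would use a properly isolated sandbox.

```typescript
import vm from "node:vm";

// Illustrative only: evaluate a creative's script against a stubbed document
// on the server, then serve just the resulting static markup to the browser.
function renderCreativeServerSide(creativeScript: string): string {
  const fakeDocument = {
    html: "",
    write(markup: string) {
      this.html += markup;
    },
  };
  vm.runInNewContext(creativeScript, { document: fakeDocument }, { timeout: 50 });
  return fakeDocument.html;
}

// Example: a creative that emits an image tag via document.write().
const markup = renderCreativeServerSide(
  `document.write("<img src='https://ads.example/banner.png'>")`
);
console.log(markup); // <img src='https://ads.example/banner.png'>
```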
Third, we learned how to limit the file size of ads to guarantee better page load times and protect the user from undue charges from their bandwidth provider. We also committed not to work with any ad formats that unexpectedly interrupt the main content of websites.
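
A byte budget can be enforced before a creative is ever served. The TypeScript sketch below checks a creative’s Content-Length against a cap; the 150 kB figure is illustrative, not a stated PageFair limit.

```typescript
// Reject creatives that exceed a byte budget before serving them.
// 150 kB is an illustrative cap, not a stated PageFair limit.
const MAX_CREATIVE_BYTES = 150 * 1024;

async function creativeWithinBudget(creativeUrl: string): Promise<boolean> {
  const res = await fetch(creativeUrl, { method: "HEAD" });
  const length = Number(res.headers.get("content-length") ?? Infinity);
  // A missing or non-numeric Content-Length is treated as over budget.
  return Number.isFinite(length) && length <= MAX_CREATIVE_BYTES;
}
```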
This approach is built on trust: users have the choice of which site to visit, and publishers have the responsibility to protect users’ data, bandwidth, security, and experience.
This is not re-insertion; it is reinvention. Showing respectful ads in a way that protects consumers from hacking, data snooping, and unwanted data plan fees will start to re-establish trust on the blocked web.

CPM, CPC, CPA: Ad pricing models explained

While documenting the journey of first-time publishers, we found that comparing pricing models can be difficult. This post gives a brief overview of each pricing model and explains how to compare their revenue potential.

CPM – payment when an ad is seen

CPM (cost per mille, or cost per thousand impressions) is the granddaddy of them all: a pricing model that existed long before the advent of the internet. Under this pricing model the publisher is paid every time a website visitor sees an ad. It is commonly used where an advertiser wants a branding campaign; the focus is on raising consumer awareness of a company or product rather than persuading consumers to buy right now.
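
The arithmetic is simple: revenue equals the CPM rate multiplied by the number of impressions divided by one thousand. A short TypeScript example with made-up figures:

```typescript
// Revenue under CPM: the publisher earns the agreed rate per 1,000 impressions.
function cpmRevenue(impressions: number, cpm: number): number {
  return (impressions / 1000) * cpm;
}

// Made-up figures: 250,000 impressions at a $2.00 CPM earn $500.
console.log(cpmRevenue(250_000, 2.0)); // 500
```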

Read more