Posts

PageFair writes to all EU Member States about the ePrivacy Regulation

This week PageFair wrote to the permanent representatives of all Member States of the European Union in support of the proposed ePrivacy Regulation.
Our remarks were tightly bounded by our expertise in online advertising technology. We do not have an opinion on how the proposed Regulation will impact other areas.
The letter addresses four issues:

  1. PageFair supports the ePrivacy Regulation as a positive contribution to online advertising, provided a minor amendment is made to paragraph 1 of Article 8.
  2. We propose an amendment to Article 8 to allow privacy-by-design advertising, because the current drafting of Article 8 would prevent websites from displaying such advertising.
  3. We particularly support the Parliament’s 96th and 99th amendments. These are essential to enable standard Internet Protocol connections to be made in many useful contexts that do not impact privacy.
  4. We show that tracking is not necessary for the online advertising & media industry to thrive. As we note in the letter, behavioural online advertising currently accounts for only a quarter of European publishers’ gross revenue.

[x_button shape=”rounded” size=”regular” float=”none” href=”https://pagefair.com/wp-content/uploads/2018/03/PageFair-letter-on-ePrivacy-to-perm-reps-13-March-2018.pdf” info=”none” info_place=”top” info_trigger=”hover”]Read the letter [/x_button]

The digital economy requires a foundation of trust to enable innovation and growth. The enormous growth of adblocking (to 615 million active devices) across the globe proves the terrible cost of not regulating. We are witnessing the collapse of the mechanism by which audiences support the majority of online news reports, entertainment videos, cartoons, blogs, and cat videos that make the Web so valuable and interesting. Self-regulation, lax data protection and enforcement have resulted in business practices that promise a bleak future for European digital publishers.
Therefore, we commend the Commission and Parliament’s work thus far, and wish the Council (of Ministers of the Member States) well in their deliberations.

Here is what GDPR consent dialogues could look like. Will people click yes?

THIS NOTE HAS NOW BEEN SUPERSEDED BY A MORE RECENT PAGEFAIR INSIDER NOTE ON GDPR CONSENT DIALOGUES. PLEASE REFER TO THE NEW NOTE. 
This note presents sketches of GDPR consent dialogues, and invites readers to participate in research on whether people will consent. 
[x_alert heading=”Note” type=”info”]It is important to note that the dialogue presented in this note is only a limited consent notice. It asks to track behaviour on one site only, and for one brand only, in addition to “analytics partners”. This notice would not satisfy regulators if it were used to cover the vast chain of controllers and processors involved in conventional behavioural targeting.[/x_alert]

Consent requests

In less than a year the General Data Protection Regulation (GDPR) will force businesses to ask Internet users for consent before they can use their personal data. Many businesses lack a direct channel to users to do this. Therefore, it is likely that they will have to ask publishers to seek consent on their behalf.
This is a sketch of what a GDPR consent request by a publisher on behalf of a third party may look like, with references to the elements required in the GDPR.

[accordion id=”my-accordion”] [accordion_item title=”Click to expand: Information that data subjects must be given in GDPR-compliant consent requests.” parent_id=”my-accordion” open=”false”]
Businesses will have to provide the following information to internet users when seeking their consent.

  • Who is collecting the data, and how to contact them or their European representative.
  • What the personal data are being used for, and the legal basis of the data processing.
  • The “legitimate interest” of the user of the data (this refers to a legal basis that may be used by direct marketing companies).
  • With whom the data will be shared.
  • Whether the controller intends to transfer data to a third country, and if so, whether the European Commission has deemed that country’s protections adequate, or what alternative safeguards or rules are in place.
  • The duration of storage, or the criteria used to determine duration.
  • That the user has the right to request rectification of mistakes in this personal information.
  • That the user has the right to withdraw consent.
  • How the user can lodge a complaint with the supervisory authority.
  • What the consequences of not giving consent might be.
  • In cases of automated decision-making, including profiling, what the logic of this process is, and what the significance of the outcomes may be.

[/accordion_item] [/accordion]
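For illustration only, the disclosure checklist above can be modelled as a simple data structure. This is a hypothetical sketch (all field names are our own invention, not an official schema) of how a publisher might check that a draft consent notice covers every required disclosure:

```python
# Hypothetical checklist of the disclosures a GDPR consent request must
# contain (per Articles 13-14). Field names are illustrative, not official.
REQUIRED_DISCLOSURES = [
    "controller_identity_and_contact",
    "purposes_and_legal_basis",
    "legitimate_interest",              # where relied upon as a legal basis
    "recipients_of_the_data",
    "third_country_transfers",          # adequacy decision or safeguards
    "storage_duration_or_criteria",
    "right_to_rectification",
    "right_to_withdraw_consent",
    "right_to_lodge_complaint",
    "consequences_of_not_consenting",
    "automated_decision_making_logic",  # incl. profiling and its significance
]

def missing_disclosures(notice: dict) -> list:
    """Return the required disclosures absent from a draft consent notice."""
    return [key for key in REQUIRED_DISCLOSURES if not notice.get(key)]

# A draft that names only the controller and the purpose is far from complete:
draft = {
    "controller_identity_and_contact": "Example Brand Ltd, dpo@example.com",
    "purposes_and_legal_basis": "behavioural advertising, based on consent",
}
print(missing_disclosures(draft))  # nine disclosures still missing
```

The point of the sketch is the sheer length of the list: each missing item is something a compliant dialogue must still find room to say.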
What percentage of people are likely to click “OK”?

Tracking preferences

In addition to the consent requirements in the GDPR, the forthcoming ePrivacy Regulation requires that users be presented with a menu of tracking preferences when they first install a browser or set up a new system that connects to the Internet. See a sketch of this menu below.

The menu above is as it might have appeared under the original proposal from the European Commission, in January 2017. However, the European Parliament is developing amendments to the Commission’s proposal. Below is a sketch of the menu as it might appear under the latest text from June 2017.

Notice that “accept only first party tracking” is pre-selected. This is because Recital 23 in the current draft stipulates that the default setting should prevent “cross-domain tracking” by third-parties. Click here to see an animated version of these menu designs.
This menu may change again as the Regulation is further developed. But assuming that some version of this tracking preferences menu becomes law across the European Union, how many people can be expected to opt back into tracking for online advertising?
We would like to find out, and reveal the answer.

PageFair Research

We are surveying industry insiders for their insights into this question. Your input may help illuminate this issue. Please click the button below to take the survey.
We have designed the survey to take 70 seconds to complete.
Update: Click here to see the results of this survey.
Thank you for your input.
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Risks to brands under new EU regulations

Brands face serious new risks under the GDPR and the ePrivacy Regulation (ePR), and agencies will not be able to shield them. This note explains why, and describes what these risks are. 
When the GDPR and the ePrivacy Regulation (ePR) apply a year from now, brands that use personal data in their marketing campaigns will become exposed to new legal risks, irrespective of their arrangements with ad agencies. Though the new rules are European, the exposure will be global.
[prompt type=”left” title=”Access the GDPR/ePR repository” message=”A repository of GDPR and ePrivacy Regulation explainers, official docs, and current status.” button_text=”Access Now” href=”https://pagefair.com/datapolicydocs/”]
Brands are directly exposed for two reasons.

Why agencies cannot shield brands

The first reason is legal. The text of the General Data Protection Regulation (GDPR) says that “each controller or processor shall be held liable for the entire damage”, where more than one controller or processor are “involved in the same processing”[1]. In other words, all parties involved in the use of personal data are fully liable. A brand is safe from this liability only if it can prove that it was “not in any way responsible for the event giving rise to the damage”.[3]
The second reason is financial. The administrative fines provided for in the GDPR (and in the proposed ePR) rise to 4% of “total worldwide annual turnover of the preceding financial year”.[4] Apple, for example, will be exposed to €7.6 billion if it or service providers acting on its behalf misuse personal data.[5] For P&G the exposure will be €2.3 billion, based on its 2016 turnover,[6] and for Unilever the exposure will be €2.1 billion.[7] It is not yet known whether data protection authorities will apply these maximum fines, but this new risk is now defined in the text of the Regulations. The potential exposure is so great that agencies cannot adequately indemnify brands against losses.
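The scale of this exposure is straightforward arithmetic: 4% of the preceding year’s total worldwide turnover. A minimal sketch, using the turnover figures cited in the notes below (in billions of euro, ignoring exact currency conversion):

```python
# Maximum GDPR administrative fine: 4% of total worldwide annual
# turnover of the preceding financial year.
MAX_FINE_RATE = 0.04

def max_exposure_bn(annual_turnover_bn: float) -> float:
    """Upper bound on the fine, in the same unit as the turnover given."""
    return annual_turnover_bn * MAX_FINE_RATE

# Turnover figures (billions of euro) as cited in the notes below:
for brand, turnover_bn in [("P&G", 58.2), ("Unilever", 52.7)]:
    print(f"{brand}: exposed to up to €{max_exposure_bn(turnover_bn):.1f}bn")
```

Running this reproduces the €2.3 billion and €2.1 billion figures quoted above.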

New pressure on brand-agency relationship

This introduces new pressures into the relationship between brands and agencies. Brand-agency contracts generally include limited indemnities. As brands become aware of the risk to which they are about to be exposed, they will expect agencies to take on far greater liability. As a result, contract negotiations between brands and agencies will become fraught. This has already happened in the cloud services industry, where both service providers and clients are acutely aware of the risks and argue fiercely over indemnity.
Not only will advertising agencies be unwilling to take on vast new liabilities to protect their clients, they may also be unable to do so. Agencies may lack the insurance cover needed to offer adequate indemnities to brands. Indeed, the exposure may be so great that the question is not whether an insurer can cover the agency, but whether a re-insurance provider can cover the insurer to do so. The data leakage inherent in the online behavioural advertising system means that insurance and reinsurance companies are likely to take a dim view of the risk involved.

Three causes of brand risk

This leads to the question of what exposes brands to risk. There are three causes to consider.
1. The brand’s own personal data 
The first type of exposure comes from how brands directly obtain and use personal data. They are exposed in several ways: if they use personal data that are not compliant,[8] if their websites leak these data, or if the personal data they hold are otherwise breached. These causes are readily identifiable, and comparatively easy to remedy.
2. Augmenting personal data held by the brand with broker data 
The second type of exposure is less obvious. Many brands purchase data to augment the profiles they maintain of their customers, or of advertising targets. This is seductive from a marketer’s perspective. For example, one data seller offers brands the ability to “tie online and offline data across multiple channels back to the consumer … and activate upon them everywhere”.[9] Brands can buy from a data broker the location, real names, contact details, interests, purchasing history, and demographic information of customers for whom they already hold some data, to “accurately identify the precise locations of your customers or prospects”.[10] 
But however attractive this may be to a marketer, these data are fraught with risk because they are derived from personal data, or are themselves personal data.[11] Combining these data of unknown provenance with a brand’s own first party data therefore exposes a brand to risk, irrespective of whether the brand’s first party data are compliant. There are four reasons for this (see box below).


The Four Dangers of Purchased Data 

  1. The purchased data are personal data, or were generated from personal data, the use of which requires informed consent from the data subject.[12] This is very unlikely to have been obtained.
  2. Personal data must be accessible, rectifiable, and portable,[13] and a person has the right to object to profiling for direct marketing.[14] These rights are very unlikely to have been adequately provided.
  3. The purchased data may include “sensitive data” as defined in the GDPR, the processing of which is prohibited.[15]
  4. Data subjects must be informed of and able to object to automated decisions that use personal data about them, such as segmentation, where this has a material impact.[16] This is very unlikely to have been provided.

3. Using personal data in online advertising 
The third type of exposure is the most challenging. Under the new rules it will be illegal for companies anywhere in the world to pass a European user’s personal information to another company, or to store these data, without agreeing a formal contract with the “data controller” (normally this is the company that requested the data from the user in the first place) that defines limits on how the data can be used.[17]
This is challenging because the online behavioural advertising system passes personal data among countless parties, including ad exchanges, retargeting systems, media owners, demand side platforms, data management platforms, and potentially among many unknown others. We produced this 30-second explanatory video to show how sharing personal data within this system exposes both brands and agencies acting on their behalf to risk.
A brand that passes personal data to partners within this system, or pays its agency to do so on its behalf, is exposed to risk because it is impossible to conclude the required contracts with all of the parties that might gain access to the data. Even if a brand could conclude contracts with all foreseeable parties, the use of JavaScript on publishers’ websites allows nefarious and unforeseen parties to gain unauthorized access to the personal data.
[x_video_embed type=”16:9″][/x_video_embed]

Snippets of the data discussion at the World Federation of Advertisers’ Global Marketing Week in Toronto 2017

Conclusion

For US-based companies the new rules may seem like an unwarranted European overreaction. But it is important to note that they contain many ideas suggested by American regulators almost a decade ago.[18]
Brands have eleven months to resolve these issues before they become exposed. The first two types of risk – what brands do with their own data, and whether they contaminate these data with purchased data of unknown provenance – are comparatively easy to resolve. The third type of risk, which is inherent to the online behavioural advertising system, is far more difficult to address. But it is addressable. PageFair is now drawing together interested parties to collaborate on a Data Protection Platform that solves this problem.
Please share our call for collaboration on this with any colleagues who might be interested. We are keen to hear from agencies, brands, and publishers.
Thanks to Philip Lee, Partner, Fieldfisher LLP; Anna Buchta, Head of litigation at European Data Protection Supervisor; Rachel Glasser, Director of Digital Privacy at Groupm; Bethan Crocket at Groupm. 
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

See also

See previous PageFair Insider notes on the ePrivacy Regulation and the GDPR:

PageFair statements at the European Parliament


Notes

[1] The word “processing” here means an “operation or set of operations” performed on personal data or sets of personal data. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Article 4, paragraph 2.
[2] ibid., Article 82, paragraph 5.
[3] ibid., Article 82, paragraph 3.
[4] ibid., Article 83, paragraph 5. (See also Recital 149, which discusses criminal penalties and recovery of profits under Member State laws).
[5] $215,639 million net sales in the twelve months up to 24 September 2016. Apple Consolidated Financial Statements, 25 October 2016.
[6] $65.3 billion net sales (€58.2 billion). “Financial highlights”, P&G annual report 2016.
[7] €52.7 billion. Unilever Annual Report and Accounts 2016, p. 23.
[8] For example, the brand would have had to inform each data subject of all purposes for which their data will be used, and all types of parties that will receive the data, when it first obtained the data. The GDPR, Article 13, paragraph 1, c, and 2, and Article 14, paragraph 1, c.
[9] “LiveRamp’s identity graph”, LiveRamp (URL: https://liveramp.com/discover-identitylink/identitylink-features/identity-graph/, last accessed 18 May 2017).
[10] Alistair Dent, “Third-Party Data Is Awesome, But Maybe Too Powerful”, Marketing Land, 18 June 2015 (URL: marketingland.com/third-party-data-awesome-maybe-powerful-131652); see also “Micromarketer Xpress”, Experian.com (URL: www.experian.co.uk/marketing-services/products/micromarketer-xpress.html); and “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, p. 22.
[11] For example, one global brand’s privacy policy says it may obtain information from commercial sources, including “including name, postal address, email address, date of birth, income level, household information, Your interests such as hobbies and pets, Consumer and market research data, Purchase behaviour, Publicly observed data or activities, such as blogs, videos, internet postings, and user generated content”. The policy says “All of the information we collect about you may be combined …”
P&G Global Consumer Privacy Policy, URL: http://www.pg.com/privacy/english/privacy_statement.shtml#tab2, last accessed 7 April 2017.
Another top global brand’s privacy policy says the personal data it has on customers “may be combined with [information] … that is publicly available, or that we may otherwise obtain … from providers of demographic and other information, social media platforms and other third parties”.
The Coca-Cola Company Privacy Policy, February 2017, (URL: www.coca-colacompany.com/our-company/privacy-policy).
Yet another top global brand’s privacy policy tells readers that “we may receive information about you from publicly and commercially available sources (as permitted by law), which we may combine with other information we receive from or about you.” Samsung Privacy Policy & Choices, 10 February 2015 (URL: www.samsung.com/us/common/privacy.html#info).
[12] The FTC reports that data brokers collect data from sources such as warranty registrations, consumer purchases, and website registrations and cookies. None of these are likely to meet the heightened standard for consent set in the GDPR, in Article 6. “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp iv, v.
Indeed, it would be impossible for them to do so in many cases. Seven of the nine data brokers in the FTC’s 2014 study provided data to each other. “It would be virtually impossible for a consumer to determine how a data broker obtained his or her data; the consumer would have to retrace the path of data through a series of data brokers”. “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, p. iv.
[13] The General Data Protection Regulation, Article 15, 16, 17, 18, 19, 20, and 21. Note that the FTC reported in 2014 that only two brokers allowed people to correct information about them, and four allowed people to ‘suppress’ rather than delete data about themselves. “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp. 42-3.
[14] ibid., Recital 70, and Article 21, paragraph 2.
[15] ibid., Recital 75, and Article 9, paragraph 1. “…data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited”.
[16] ibid., Recital 71, and Article 13, paragraph 2, f, and Article 14, paragraph 2, g, and Article 15, paragraph 1, h, Article 21, paragraph 1, and Article 22.
[17] ibid., Article 28, paras. 2, 3 and 4, and Article 29.
Here is how this will operate. Current European rules require contracts between data controller and processor that guarantee that the processor handles the personal data only in the manner dictated by the controller. (see Data Protection Directive (95/46/EC) 1995 Article 17, para. 3. (URL: http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:31995L0046)) However, this is now backed up by new sanctions, and the GDPR will require that these contracts define the nature and duration of processing (Regulation (EU) 2016/679, Article 28, para. 3). Similar agreements must also be in place when one processor engages another (ibid., Article 28, para. 4), and a processor can only do so with express permission from the controller (ibid., Article 28, para. 2).
[18] The US FTC proposed several of the provisions of the GDPR as long ago as 2009, and again in 2012 and 2014. The US Government Accountability Office made similar calls in 2013. “Data brokers: a call for transparency and accountability”, Federal Trade Commission, May 2014, pp 5-7, 49-52.

PageFair statement at European Parliament rapporteur's ePrivacy Regulation roundtable

Lightly edited transcription of PageFair remarks at rapporteur’s sessions at the European Parliament in Brussels on 29 May 2017, concerning the ePrivacy Regulation. 
Statement at roundtable on Articles 9, and 10. 
Dr Johnny Ryan: Thank you. PageFair is a European adtech company. We are very much in support of the Regulation as proposed, in so far as it relates to online behavioural advertising (OBA). Read more

PageFair statement at European Parliament ALDE shadow rapporteurs session on the proposed ePrivacy Regulation

Lightly edited transcription of PageFair remarks at European Parliament ALDE session on 4 May 2017. 
Dr Johnny Ryan: Thank you. It’s a pleasure to be with you this afternoon. I’ve been on both sides: the adtech side, and the publisher’s side, of the particular part of this story that I want to talk about. Several years ago I was at The Irish Times as Chief Innovation Officer, and my background before that was academic: I wrote a history of the Internet, which is now a standard text. Now I work at PageFair, a European adtech company, based primarily in Dublin.
I want to make clear that my remarks are limited only to the ePrivacy Regulation as it affects online advertising. There may be issues with other domains. We don’t know about them. Within that strict limit let me say that we think that the direction of travel that the ePrivacy proposal represents – there may be issues with detail – but that the direction of travel that it represents will solve a crisis on both sides of the online media system.  Read more

Supporting new European data regulation

Unusually for an ad-tech company, PageFair supports the proposed ePrivacy Regulation. Here is why.
[x_alert type=”success”]Additional note (11 May 2017): our position concerns the proposal’s impact on online behavioural advertising (OBA). Though there are kinks to work out, as we note in our recent statement to Parliament representatives, we strongly endorse the proposal’s broad approach to OBA.[/x_alert]
The European Commission has proposed new rules for ePrivacy, which will supplement the GDPR.[1] Unlike colleagues in other digital advertising companies, PageFair commends the proposed privacy protections for online advertising.
[prompt type=”left” title=”Access the GDPR/ePR repository” message=”A repository of GDPR and ePrivacy Regulation explainers, official docs, and current status.” button_text=”Access Now” href=”https://pagefair.com/datapolicydocs/”]
PageFair has taken this position for two reasons.

First, personal data are not required for online advertising. 

The online advertising system can deliver relevant ads without the need to use personal data, or third party cookies that collect personal data. Where cookies are required for advertising, non-tracking cookies are adequate.
Consider points (1) and (2) below.

  1. The online advertising industry already uses systems to target advertising online without using personal data. Advertising media have relied on ‘contextual’ targeting for over a century. Many of the digital tools that advertisers and their agencies use to buy advertising space already offer contextual targeting. This is often accompanied by ‘behavioural’ targeting that uses personal data, but these data are not strictly necessary for the placement of relevant advertising on websites.
  2. Even if point (1) were not already the case, PageFair has developed a method of serving relevant, ‘group-interest based’ ads that match readers’ interests without using personal data. We envisage sharing this freely with no commercial terms. This is an example of the kind of innovation that the proposed Regulation will stimulate.

Second, trustworthy publishers will benefit from the proposed rules.

By making Do Not Track (DNT) enforceable, the Regulation puts a new commercial value on trust. Publishers that have earned the trust of their users will benefit from the proposed Regulation. A small, niche topic website that has earned the trust of its users is more likely to gain a user’s consent to use her personal data than a large website that misleads with “click bait” headlines and so forth. Trust is the new asset that matters, and the size of the publisher is immaterial.
Publishers will gain power over advertising intermediaries (agencies, ad-tech companies), reversing the trend of two decades. Today, visiting a website exposes a citizen’s personal data to a cascade of third parties, and to third parties of third parties. Under the new Regulations, visiting a website will be like visiting one’s doctor: the new Regulations create a situation in which no one else is admitted to the consultation room unless both citizen and publisher invite them in.
Ad-tech companies that now extract more than half of the money spent by advertisers will have to cooperate with publishers to gain consent exceptions from visitors for the use of their personal data.[2] This creates an opportunity for publishers to command a greater share of the advertising budget. Indeed, trustworthy publishers are likely to become attractive targets for acquisition by advertising holding companies.
 

Conclusion

The status quo is unsustainable in two respects. On the one hand, the aggregation and processing of personal data by data brokers and others with access to the OBA system poses threats to citizens’ individual interests, and to society.[3] On the other hand, the brands that pay for online advertising are deeply dissatisfied with how it currently works. The world’s largest advertisers complain of poor or misleading data about advertising and its effectiveness.[4] They also waste billions of dollars a year due to ‘ad fraud’, in which advertisers pay for clicks and views that are actually simulated by ‘ad fraud bots’.[5]
The enormous growth of adblocking (to 615 million active devices) across the globe proves the terrible cost of not regulating. We are witnessing the collapse of the mechanism by which audiences support the majority of online news reports, entertainment videos, cartoons, blogs, and cat videos that make the Web so valuable and interesting. There is no future for a European digital media, content, or advertising industry based on the kind of data practice that self regulation and lax data protection have permitted.
The digital economy requires a foundation of trust to enable innovation and growth. The high data protection standards in the ePR as proposed are entirely compatible with the success of the publishing and media industry on the one hand, and the advertising industry on the other. Indeed, they are essential.
The ePR, together with the GDPR, puts a new premium on trust that will help publishers, and will change the web for the better.

See also 

See our analyses on the ePrivacy Regulation and the GDPR:

 
[x_callout type=”center” title=”Perimeter: the regulatory firewall for online media and adtech. ” message=”Feature-rich adtech, even without personal data. Control user data and 3rd parties in websites + apps. Get robust consent.” button_text=”Learn more” href=”https://pagefair.com/perimeter”]

Notes

[1] The European Commission’s proposed ePrivacy Regulation is currently under negotiation between the European Commission, the European Parliament, and the Council of Ministers of European Union Member States.
[2] “Tracking Preference Expression (DNT): W3C Candidate Recommendation”, W3C, 20 August 2015 (URL: https://www.w3.org/TR/tracking-dnt/). For technical details of how publisher-specific exemptions function see https://www.w3.org/TR/tracking-dnt/#exceptions.
[3] Two alarming examples of how OBA leakage in the United States has enabled the aggregation of sensitive personally identifiable information: Tanya O’Carroll and Joshua Franco, “‘Muslim registries’, Big Data and Human Rights”, Amnesty International, 27 February 2017 (URL: https://www.amnesty.org/en/latest/research/2017/02/muslim-registries-big-data-and-human-rights/); and “Cambridge Analytica Explained: Data and Elections”, Privacy International, 13 April 2017 (URL: https://medium.com/@privacyint/cambridge-analytica-explained-data-and-elections-6d4e06549491).
[4] For example, see one of several speeches by the CMO of the world’s largest advertiser: “P&G’s Pritchard Blasts Objections to His Digital Demands as ‘Head Fakes'”, Advertising Age, 2 March 2017 (URL: http://adage.com/article/cmo-strategy/p-g-s-pritchard-dismisses-objections-digital-demands-head-fakes/308144/).
[5] Mikko Kotila, Ruben Cuevas Rumin, Shailin Dhar, “Compendium of ad fraud knowledge for media investors”, World Federation of Advertisers and The Advertising Fraud Council, 2016 (URL:  https://www.wfanet.org/app/uploads/2017/04/WFA_Compendium_Of_Ad_Fraud_Knowledge.pdf), p. 3.