Facebook and adtech face a turbulent time in Europe’s courts: the Brussels case.


This note examines a Belgian court ruling against Facebook’s tracking and approach to consent. Facebook and adtech companies should expect tough sanctions when they find themselves before European courts – unless they change their current approach to data protection and the GDPR. 

Facebook is playing a dangerous game of “chicken” with the regulators. First, it has begun to confront users in the EU with a new “terms of service” dialogue, which denies access to Facebook until a user opts in to tracking for ad targeting and various other data processing purposes.[1]

This dialogue appears to breach several important principles of the GDPR, including purpose limitation,[2] freely given, non-conditional consent,[3] and transparency.[4] In other words, if Facebook attempts to collect consent in this manner, that consent will be unlawful. European regulators have been very clear on this point.[5]

Second, on 1 May 2018, a mere twenty-four days before the application date of the GDPR, Facebook’s head of privacy announced plans to build “Clear History”, a feature with which users can opt out of Facebook collecting data about their visits to other websites and apps.[6] But the GDPR demands an opt-in, not an opt-out.[7] Nor is Clear History available to non-Facebook users. And as a further sign of Facebook’s brinksmanship, it said “it will take a few months to build Clear History”,[8] which means that the feature will not be available to users until long after the GDPR begins to apply later this month.

Facebook’s approach puts it on a collision course with European courts. This note examines one recent decision in which the Brussels Court of First Instance ruled that Facebook’s tracking of people on other websites is illegal, and that its approach to consent is invalid.[9] The immediate result was a financial penalty, and an order that Facebook must submit to having an independent expert supervise its deletion of all the personal data it illegally amassed.

The implications of the ruling are far wider. It is an insight into the hazard for digital publishers and adtech vendors of failing to heed the warnings of the Article 29 Working Party.

Important lessons for RTB/programmatic

Belgium’s data protection authority, the Belgian Privacy Commission,[10] argued that Facebook’s “Like” buttons and trackers on websites all over the web enable it to look “over the shoulders of persons while they are browsing from one website to the next … without sufficiently informing the relevant parties and obtaining their valid consent”.[11]

The Court agreed, and summarized Facebook’s tracking in its ruling:

When someone visits a website with such a Facebook social plug-in, his browser will automatically establish a connection with (by sending an http request to) the Facebook server, after which the visitor’s browser directly loads the “plug-in” function from the Facebook server.[12]

The Court’s ruling outlined what data Facebook receives from its social plugins installed on other websites (a sketch following the list illustrates how each item is transmitted):

1. The IP address;
2. The URL of the page requested by the user;
3. The browser’s operating system;
4. The type of browser; and
5. The cookies (previously) placed by the third party from which the browser requests the third-party content.[13]
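
To make the mechanism concrete, here is a minimal sketch in Python of the kind of HTTP request a browser issues when a page embeds a Facebook plugin, and how each of the five items above travels with it. It is purely illustrative: a real browser makes this request itself, and the endpoint, header values and cookie contents shown are assumptions rather than details taken from the ruling.

```python
import requests  # illustrative only; in reality the visitor's browser makes this request automatically

# Hypothetical example: a user reads an article on a health-related website that
# embeds a Facebook "Like" button. Loading the button means calling Facebook's servers.
response = requests.get(
    "https://www.facebook.com/plugins/like.php",  # (1) the connection itself reveals the visitor's IP address
    params={"href": "https://example-health-site.com/articles/depression-treatments"},
    headers={
        # (2) the URL of the page being read
        "Referer": "https://example-health-site.com/articles/depression-treatments",
        # (3) the operating system and (4) the browser type
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Gecko/20100101 Firefox/60.0",
        # (5) cookies previously set for the facebook.com domain, e.g. the "datr" cookie
        "Cookie": "datr=EXAMPLE; fr=EXAMPLE",
    },
)
# Facebook can now tie the visit to this page to the identifiers in those cookies,
# whether or not the person ever clicks the button.
```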

In a previous judgement, in 2015, the Court observed that these browsing data are “frequently of a very sensitive nature, allowing, for example, health-related, sexual and political preferences to be gauged”.[14]

This should give pause to digital publishers and adtech vendors, because these data, which reveal special categories of personal data, are exactly the same data that websites routinely broadcast to tens – if not hundreds – of companies in RTB bid requests.[15] This happens every time an advertisement is served.
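
For comparison, the following sketch (again illustrative Python) shows the analogous fields in a simplified OpenRTB-style bid request. The field names follow the public OpenRTB specification, but the values, and exactly which fields any given exchange populates, are assumptions made for the purpose of illustration.

```python
# Hypothetical, heavily simplified OpenRTB-style bid request. It carries the same
# categories of data that the Court described: page URL, IP address, browser and
# operating system details, and a cookie-synced user identifier.
bid_request = {
    "id": "a1b2c3d4",
    "site": {
        # the page the visitor is reading; the URL alone may reveal sensitive interests
        "page": "https://example-health-site.com/articles/depression-treatments",
    },
    "device": {
        "ip": "203.0.113.7",                                # IP address
        "ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",  # browser type and operating system
    },
    "user": {
        "id": "exchange-user-4711",  # pseudonymous ID, typically established via cookie syncing
    },
}
# In RTB, a request like this is broadcast to dozens or hundreds of bidders
# every time an ad impression is auctioned.
```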

The Court noted that the scale of Facebook’s presence across the web makes this tracking “practically unavoidable”.[16] The February 2018 ruling reiterated the Court’s previous ruling in 2015 that “the extent of the violations in question is massive: they do not only concern the violation of the fundamental rights of a single person, but of an enormous group of persons.”[17]

This too should give the online media and advertising industry pause, because the same applies to the broadcasting of personal data in RTB bid requests by the majority of major websites across the globe, and to the creation of profiles based on these personal data by DMPs and other adtech vendors.

Facebook’s notification fig leaf ruled unlawful

Facebook provided the following notice to users about this tracking:

We use cookies to help personalise content, to target and measure advertising and to provide you with a safer experience…[18]

Unsurprisingly, the Court ruled that this is utterly inadequate:

The court has come to the decision that in all the cases described, Facebook does not obtain any legally valid consent in the sense of Article 5 (a) Privacy Act[19] and Article 129 ECA[20] for the disputed data processing.[21]

As a result, the Court ruled that Facebook does not have a legal basis for tracking Internet users as they browse the web. Nor does Facebook have a legal basis for tracking logged-in users around the web.[22]

Several of the Court’s admonitions are worth including here, because they are directly relevant to Facebook and other online media and adtech companies’ approaches to the GDPR.

First, the Court found that non-Facebook users are never told that their behavior on websites across the web is being profiled by Facebook:

When non-users visit a website of a third party that includes an (invisible) Facebook pixel that allows for tracking of browsing behavior, without indicating that they wish to make use of the Facebook service, no information mechanism (such as a banner) is displayed.[23]

This remains a legal risk for Facebook, and “Clear History” does not adequately mitigate this risk.

Second, the Court ruled that Facebook’s request for consent was not specific, and that any consent that it received was unlawful as a result:

‘Specific’ means that the expression of will must relate to a specific instance or category of data processing and can thus not be obtained on the basis of a general authorization for an open series of processing activities.[24]

This part of the ruling was based on Article 1, section 8, of the Belgian Privacy Act, which uses the same formula of words as Article 4, paragraph 11, of the GDPR (“freely given, specific, informed…”). In other words, the Court is upholding a standard that is virtually identical to the standard that will apply under the GDPR. Facebook’s new GDPR consent dialogue faces the same problem, and is unlawful for the same reason.

Third, the Court found that Facebook users are not clearly told for which “purposes” Facebook processes their personal data. Nor does Facebook clearly explain its use of sensitive data, including any personal data that could reveal religious belief, sexual orientation, etc.:

the cookie banner, makes it insufficiently clear for which exact purposes the personal data – which indeed also include “sensitive data” (e.g. regarding religious beliefs or sexual orientation) – are being collected, while the following layers (including the cookie policy, data policy) also do not explain this in an easily comprehensible and accessible manner.[25]

Facebook has recently gone some way to inform users about the use of personal data concerning their political interests, but this is only a partial solution to a far broader risk for the company. Its handling of sensitive categories of personal data will be a major challenge, which it has yet to show any ability to resolve.[26]

Fourth, and unsurprisingly in the aftermath of the Cambridge Analytica scandal, the Court found that Facebook did not properly disclose who it was sharing the data with. Nor did it provide any information about “the existence of a right to access and correction of the personal data concerning him”.[27] This is likely to remain a significant challenge.[28]

Fifth, the Court found that Facebook was not even complying with its own self-regulatory system. Whatever one’s view of the “AdChoices” self-regulatory system, it is quite remarkable that Facebook continued to track people even after they had used it to opt out.[29]

Facebook forced to delete data (and fined)

The Brussels Court ordered Facebook to pay €250,000 per day,[30] up to a maximum of €100 million, until it stopped its unlawful behavior.

This was a strong statement. To put the penalty into perspective, consider that Belgium has a population of 11.35 million people.[31] At the same value per person, the EU equivalent would be €12.5 million per day, up to a maximum of €5 billion.

In addition, Facebook was ordered to submit to an independent expert supervising its deletion of all the data it had illegally amassed about every user on Belgian soil.[33] It also had to ensure that third parties to whom it had provided such data did the same.

The Cambridge Analytica scandal shows that this last point, ensuring that third parties delete their copies of Facebook’s illegally accumulated data, will be impossible for Facebook to comply with, because of its lax data sharing standards. Recall that Mark Zuckerberg told US lawmakers:

When developers told us they weren’t going to sell data, we thought that was a good representation. But one of the big lessons we’ve learned is that clearly, we cannot just take developer’s word for it.[34]

In other words, Facebook was sharing personal data without any control whatsoever, much as websites do when they send visitors’ personal data in RTB bid requests. Even if the original collection of the data had been lawful, this uncontrolled distribution certainly was not. Again, the parallel with RTB bid requests should give publishers and adtech vendors pause.

What the Article 29 Working Party says, goes

Many of our colleagues in adtech have been unwilling to heed the counsel of the Article 29 Working Party (a roundtable of European regulators). The Brussels Court’s ruling illustrates the Working Party’s importance and authority. Although the Court is the arbiter, it relied on the Working Party’s authoritative opinions throughout its ruling. (The ruling cited the Working Party’s 2011 opinion on consent (15/2011),[35] its 2010 opinion on online behavioral advertising (2/2010),[36] its 2013 opinion on purpose limitation (3/2013),[37] and its 2010 opinion on the concepts of data controller and data processor (1/2010).[38])

Whether or not businesses take the Working Party seriously, judges do, and that is what matters when businesses find themselves facing sanctions for data misuse. This should demonstrate the value of closely abiding by the Working Party’s opinions. The requirements of European data protection law have been well illuminated by the Working Party’s public guidance for over two decades, and that guidance is an invaluable aid to businesses scrambling to comply with a body of law that has hitherto been largely neglected.

Facebook cannot reject users who refuse non-essential tracking

The Court ruled that Facebook cannot reject users who refuse to agree to tracking – unless the tracking in question is necessary for the service that a user explicitly requests from Facebook.[39] Instead, the Court ruled that users should be

given the option of refusing the placement of these cookies, in as far as this is not strictly necessary for a service explicitly requested by him, without his access to the Facebook.com domain being hereby limited or rendered more difficult.[40]
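
For publishers and adtech vendors, the practical reading of this standard is that strictly necessary cookies may be set regardless, but everything else must wait for an affirmative, purpose-specific opt-in, and refusal must not limit access. The following Python sketch is a hedged illustration of that gating logic only; the purpose names, cookie names and consent record are assumptions, not anything prescribed by the ruling.

```python
# Illustrative consent gating: strictly necessary cookies are always allowed;
# everything else requires a recorded, purpose-specific opt-in.
STRICTLY_NECESSARY = {"session", "csrf_token"}  # needed for the service the user actually requested
CONSENT_GATED = {
    "advertising": {"ad_id"},      # tracking for ad targeting
    "analytics": {"site_stats"},   # audience measurement
}

def cookies_to_set(consent: dict) -> set:
    """Return the cookies a response may set, given the user's recorded choices.

    Access to the site never depends on the answer: a user who refuses every
    optional purpose still gets the strictly necessary cookies, and nothing more.
    """
    allowed = set(STRICTLY_NECESSARY)
    for purpose, cookie_names in CONSENT_GATED.items():
        if consent.get(purpose, False):  # absence of a recorded choice means no consent
            allowed |= cookie_names
    return allowed

print(cookies_to_set({}))                     # {'session', 'csrf_token'} -- full access, no tracking
print(cookies_to_set({"advertising": True}))  # adds 'ad_id' only after an explicit opt-in
```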

In December 2015, Facebook had blocked access to all Belgian users, following a court injunction that forbade it to place its “datr” cookie without properly informing users.[41] Facebook attempted to justify this denial of service in a notice to users claiming that it could not provide the service because it was prohibited from taking measures (the unlawful tracking) to prevent unauthorized access to users’ Facebook accounts. The Court took a dim view of this:

The court concurs … that the systematic collection of the personal data of users and non-users via social plug-ins on the websites of third parties is not essential (let alone “strictly essential” in the sense of Article 129 ECA), or at least not proportional to the achievement of the safeguarding objective.[42]

The Court believed that Facebook’s purported fraud detection was insufficient in any case:

the systematic collection of safeguarding cookies is inadequate as a means of safeguarding, as it is easy to circumvent by persons with malicious intentions.[43]

Conclusion: fewer data, not more, will help Facebook in the EU

This ruling is one of several defeats Facebook has suffered in European courts in recent months. In January, the Berlin Regional Court ruled that Facebook’s approach to consent and its terms of service are unlawful.[44] In April, the Irish High Court referred important aspects of Facebook’s trans-Atlantic transfers of personal data to the European Court of Justice, once again, for scrutiny.[45] Worse is likely to come, unless Facebook significantly changes its approach to data protection within the EU.

However, the company has options. As unlikely as it may seem now, one can foresee Facebook introducing ad targeting based on non-personal data in the News Feed. This is likely to be necessary because Facebook will be unable to win lawful consent for some of its data processing purposes involving sensitive personal data (or for purposes involving regular personal data that are not “compatible” with purposes to which the user has already agreed).[46]

It seems likely that this problem encompasses all personalized advertising in the News Feed, custom audiences, and social share buttons on other websites. Therefore, Facebook must have a way of targeting ads to non-consenting users. Non-personal data would allow this.

It may also become important for Facebook to be able to participate in a clean and safe data supply chain, which major advertisers are beginning to show concern about.[47]

In addition, Facebook will have to limit the use of custom audiences to situations where it is certain that the advertiser has a valid legal basis.

There is a broader lesson. Digital publishers and adtech vendors urgently need to reassess the use of personal data in programmatic advertising, and to reflect on how adtech’s shaky consent systems will fare in Europe’s courts.

Notes

[1] The new terms mention personalization of ads. See “Terms of service”, Facebook (URL: https://www.facebook.com/legal/terms/update), accessed 2 May 2018.
The Terms also refer to the data policy, which elaborates that “we use the information we have about you – including information about your interests, actions and connections – to select and personalise ads, offers and other sponsored content that we show you.” The data policy also says “We use the information [including] the websites you visit and ads you see … to help advertisers and other partners measure the effectiveness and distribution of their ads and services, and understand the types of people who use their services and how people interact with their websites, apps and services”. “Data policy”, Facebook (URL: https://www.facebook.com/about/privacy/update), accessed 2 May 2018.
[2] The GDPR, Article 5, paragraph 1.
[3] The GDPR, Article 7, paragraph 2.
[4] The GDPR, Article 13, paragraph 1 and paragraph 2.
[5] “Guidelines on consent under Regulation 2016/679”, WP259, Article 29 Working Party, 28 November 2017, p. 11.
[6] “Getting feedback on new tools to protect people’s privacy”, Facebook, 1 May 2018 (URL: https://newsroom.fb.com/news/2018/05/clear-history-2/).
[7] See the GDPR, Article 6, Article 8, and Article 9.
[8] “Getting feedback on new tools to protect people’s privacy”, Facebook.
[9] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., Dutch-language Brussels Court of First Instance (Nederlandstalige Rechtbank van Eerste Aanleg te Brussel/Tribunal de Première Instance néerlandophone de Bruxelles – the “Court”), 16 February 2018, case no. 2016/153/A.
[10] It has since changed its name to the Belgian Data Protection Authority.
[11] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 12.
[12] ibid., p. 9. See more detail on pp. 49-51.
[13] ibid., p. 9.
[14] “Data leakage in online advertising”, PageFair.
[15] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 69.
[16] ibid., p. 69.
[17] ibid.
Note that this raises the competition (antitrust) question, as Germany’s competition regulator, Andreas Mundt, has pointed out: “If Facebook has a dominant market position, then the consent that the user gives for his data to be used is no longer voluntary” (see https://www.reuters.com/article/us-facebook-privacy-germany/facebooks-hidden-data-haul-troubles-german-cartel-regulator-idUSKBN1HU108).
[18] ibid., p. 8.
[19] Which implemented the Data Protection Directive.
[20] Electronic Communications Act of 20 June 2005, which implemented the ePrivacy Directive.
[21] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., p. 64.
[22] ibid., pp. 73-4.
[23] ibid., p. 57.
[24] ibid., p. 61.
[25] ibid., p. 58.
[26] See discussion of special categories of data in the newsfeed in “How the GDPR will disrupt Google and Facebook”, Blockthrough, 30 August, (URL: https://blockthrou1dev.wpengine.com/blog/gdpr_risk_to_the_duopoly/).
[27] ibid., p. 59.
[28] See testimony by Chris Vickery at the UK Parliament Digital, Culture, Media and Sport Committee, Wednesday 2 May 2018 (URL: https://www.parliamentlive.tv/Event/Index/0cf92dd0-f484-4699-9e01-81c86acb880c).
[29] ibid., p. 63.
[30] ibid., p. 70.
[31] World Bank, 2016.
[33] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., pp. 14, 70.
[34] Testimony of Mark Zuckerberg Chairman and Chief Executive Officer, Facebook, Hearing before the United States House of Representatives Committee on Energy and Commerce, 11 April 2018 (URL: https://www.c-span.org/video/?443490-1/facebook-ceo-mark-zuckerberg-testifies-data-protection&live&start=4929#).
[35] Final ruling, Willem Debeuckelaere v Facebook Ireland Ltd., and Facebook Inc., and Facebook Belgium Bvba., pp. 57-8, 61.
[36] ibid., p. 59.
[37] ibid., p. 60.
[38] ibid., p. 70.
[39] ibid., pp. 13, 72.
[40] ibid., p. 72. See the Privacy Commission’s argument for this on p. 13.
[41] After an order from the Privacy Commission, which was backed up by a court injunction. In 2015, the Privacy Commission ordered Facebook to, among other things, stop tracking non-users using cookies and social plug-ins without consent, and to do the same for users unless “strictly necessary for a service explicitly requested by the user” or unless it obtained “unequivocal, specific consent”. It was also ordered to use consent requests that are unequivocal and specific. When Facebook failed to comply, this was followed by a court order in November 2015. Facebook responded by blocking access to users. See ibid., pp. 4-7.
[42] ibid., pp. 65-6.
[43] ibid., p. 67.
[44] Judgment of the Berlin Regional Court dated 16 January 2018, Case no. 16 O 341/15 (URL: https://www.vzbv.de/sites/default/files/downloads/2018/02/12/facebook_lg_berlin.pdf).
[45] The High Court, Commercial, 2016, N. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximilian Schrems, Request for a preliminary ruling, Article 267 TFEU, 12 April 2018.
See also Judgement of Ms Justice Costello, The High Court, Commercial, 2016, No. 4809 P., The Data Protection Commissioner v Facebook Ireland and Maximilian Schrems, 3 October 2017.
Note, this is the second “Schrems” case. The first caused the end of the EU-US Safe Harbor agreement.
[46] See a discussion on Facebook and purpose limitation in “How the GDPR will disrupt Google and Facebook”, Blockthrough, 30 August, (URL: https://blockthrou1dev.wpengine.com/blog/gdpr_risk_to_the_duopoly/).
[47] “WFA Manifesto for Online Data Transparency”, World Federation of Advertisers, 20 April 2018 (URL: https://www.wfanet.org/news-centre/wfa-manifesto-for-online-data-transparency/). See also Stephan Loerke, WFA CEO, “GDPR data-privacy rules signal a welcome revolution”, AdAge, 25 January 2018 (URL: adage.com/article/cmo-strategy/gdpr-signals-a-revolution/312074/).
