That document states:
In March 2018, in response to a complaint, the Office of the Privacy Commissioner of Canada (“OPC”) commenced an investigation into Facebook, Inc. (“Facebook”) relating to its compliance with the Personal Information Protection and Electronic Documents Act (“PIPEDA”) in the wake of revelations about Facebook’s disclosure of the personal information of certain of its users to a third-party application (the “TYDL App”)—information that was later used by third parties for targeted political messaging. In April 2018, the OPC was joined by the Office of the Information and Privacy Commissioner for British Columbia (“OIPC BC”), and the investigation continued as a joint investigation.
Our investigation focused on three general areas of concern under PIPEDA and the Personal Information Protection Act (British Columbia) (“PIPA”): (i) consent of users, both those who installed an app and their friends, whose information was disclosed by Facebook to apps, and in particular to the TYDL App; (ii) safeguards against unauthorized access, use and disclosure by apps; and (iii) accountability for the information under Facebook’s control.
To ensure a fair investigation of the facts, we have sought information and submissions from Facebook. We are disappointed that many of our questions remain unanswered or were not answered to our satisfaction (i.e., the responses were incomplete or otherwise deficient).
Based on the evidence gathered during this investigation, our findings can be summarized as follows:
Facebook failed to obtain valid and meaningful consent of installing users. Facebook relied on apps to obtain consent from users for its disclosures to those apps, but Facebook was unable to demonstrate that: (a) the TYDL App actually obtained meaningful consent for its purposes, including, potentially, political purposes; or (b) Facebook made reasonable efforts, in particular by reviewing privacy communications, to ensure that the TYDL App, and apps in general, were obtaining meaningful consent from users.
Facebook also failed to obtain meaningful consent from friends of installing users. Facebook relied on overbroad and conflicting language in its privacy communications that was clearly insufficient to support meaningful consent. That language was presented to users, generally on registration, in relation to disclosures that could occur years later, to unknown apps for unknown purposes. Facebook further relied, unreasonably, on installing users to provide consent on behalf of each of their friends, often numbering in the hundreds, to release those friends’ information to an app, even though the friends would have had no knowledge of that disclosure.
Facebook had inadequate safeguards to protect user information. Facebook relied on contractual terms with apps to protect against unauthorized access to users’ information, but then put in place superficial, largely reactive, and thus ineffective monitoring to ensure compliance with those terms. Furthermore, Facebook was unable to provide evidence of enforcement actions taken in relation to privacy-related contraventions of those contractual requirements.
Facebook failed to be accountable for the user information under its control. Facebook did not take responsibility for giving real and meaningful effect to the privacy protection of its users. It abdicated its responsibility for the personal information under its control, effectively shifting that responsibility almost exclusively to users and apps. Facebook relied on overbroad consent language and consent mechanisms that were not supported by meaningful implementation. Its purported safeguards with respect to privacy, and its implementation of those safeguards, were superficial and did not adequately protect users’ personal information. The sum of these measures resulted in a privacy protection framework that was empty.
These failures are extremely concerning given that in a 2009 investigation of Facebook, the OPC also found contraventions with respect to seeking overbroad and uninformed consent for disclosures of personal information to third-party apps, and inadequate monitoring to protect against unauthorized access by those apps. In our view, if Facebook had implemented the OPC’s recommendations and its eventual commitments meaningfully, with a privacy protection framework that was not only mechanical but substantive and effective, the risk of unauthorized access and use of Canadians’ personal information by third-party apps would have been avoided or significantly mitigated.
Pursuant to our Findings in this report, we made several recommendations, with a view to allowing Facebook to bring itself into compliance with PIPEDA and PIPA, and to ensuring its ongoing commitment to upholding Canadian privacy law in the future. We are disappointed that Facebook either rejected our recommendations outright or refused to implement them in any manner acceptable to our Offices. This is particularly troubling given Facebook’s public commitments to work with regulators and rectify the “breach of trust” associated with these events.
In our view, therefore, the risk is high that Canadians’ personal information will be disclosed to apps and used in ways the user may not know of or expect.

The report concludes:
The complaint against Facebook on each of the aspects of accountability, consent, and safeguards is well-founded and remains unresolved. We will proceed to address the unresolved issues in accordance with our authorities under PIPEDA and PIPA.

The report notes:
PIPEDA and PIPA provide that Facebook is responsible for users’ personal information under its control and must implement policies and practices to give effect to the privacy protections afforded under Canadian and British Columbia privacy law.
Despite Facebook’s claims and representations regarding its respect for users’ privacy, Facebook’s actions in this case paint, in our view, a very different picture, one more consistent with Facebook CEO Mark Zuckerberg’s public statement, in March 2018, that Facebook committed a “major breach of trust”.
The facts in this case, as outlined above, do not, in our view, portray an organisation taking responsibility for giving real and meaningful effect to privacy protection. They demonstrate Facebook abdicating its responsibility for personal information under its control, effectively shifting that responsibility to users and apps. Its purported safeguards were, at the time the TYDL App was launched, superficial and still do not, in our view, adequately protect users’ personal information. Ultimately, the ineffectiveness of the consent and safeguard regimes resulted in the TYDL App’s unauthorized access to millions of users’ information and its use of that information for political purposes. This is particularly concerning given the vast amount of personal information under Facebook’s control, much of which can be highly sensitive (e.g. private messages). The TYDL App is but one of potentially millions of apps that could have had access to such information, potentially using it for a myriad of unknown purposes.
In respect of meaningful consent to disclose information about installing users to third-party apps, Facebook relied on third-party apps to obtain that consent, without implementing reasonable measures to ensure that such consent was actually obtained.
In respect of meaningful consent from installing users’ friends, Facebook could have implemented measures to provide the specific and timely information those users would need to grant meaningful express consent in each instance, prior to (or at the time of) Facebook’s disclosure of their information to third-party apps, but it did not do so.
To the contrary, Facebook relied on vague, overbroad, overarching language in its terms and conditions, leaving users with insufficient knowledge of all the potential apps to which Facebook might disclose their information, and all the potential purposes for which those apps might use their information. Further, Facebook relied on users’ ability to navigate through various app controls to decide how, and how much, information would be disclosed to apps installed by their friends, without sufficient context to make such decisions meaningful. And finally, it relied on users to provide consent on behalf of each of their friends, often numbering in the hundreds, to release those friends’ information to an app, without even ensuring that the friends had any knowledge of that disclosure, before or after it took place.
In respect of safeguards, Facebook again relied, for millions of apps (other than the 500 “top apps”), on others to ensure its policies were being followed by third-party apps—for example, relying on user and media reports of concerns, when in reality, users and media are not well-equipped to determine if or when Facebook has disclosed information to a third-party app, let alone whether that app is complying with the Platform Policy. While Facebook maintained a written policy regarding third-party apps’ access to and treatment of user information, in practice, Facebook has been unable to provide evidence that these policies were effectively monitored or enforced so as to prevent unauthorized access to and use of users’ personal information by the TYDL App, or by third-party apps in general.
Furthermore, Facebook only took action to investigate and disable the TYDL App in December 2015 following a media report, rather than in May 2014, when it should have been readily apparent to Facebook—through App Review—that the TYDL App may have been in violation of the Platform Policy. Facebook also chose not to alert users to the TYDL App’s breach of its contractual safeguards, and the TYDL App’s resulting unauthorized access to as many as 87,000,000 users’ information, until 2018—again, only in response to media coverage and ensuing investigations by data protection authorities, including the OPC and OIPC BC.
The evidence and the analysis throughout this Report highlight that many of the very same concerns the OPC raised in its 2009 investigation persisted into 2015. While Facebook undertook to address these issues to some degree following the OPC’s 2009 investigation, we are of the view that a truly accountable organisation would have implemented those commitments in a manner that gave real effect to its privacy protection obligations. Facebook’s establishment of a “granular data permissions” model and additional disclosure to users may have represented an improvement at the time, given the sheer absence of controls in place prior to the 2009 investigation. Those controls were, however, as identified in this Report, still ineffective, having been neither adequately implemented nor dynamically maintained. This does not, in our view, resemble a “continuing and dedicated commitment” to obtaining consent from users for disclosures to third-party apps.
As a result, Installing Users and Affected Users did not meaningfully understand what information, including sensitive information, would be disclosed to what apps for what purposes, which was particularly concerning in the case of the TYDL App, where millions of users’ information was disclosed for purposes of political targeting.
In our view, Facebook’s failure to take responsibility for its own privacy practices indicates a clear and concerning lack of accountability. Facebook did not take real responsibility for the vast amounts of user information, much of it sensitive, within its control, in that it did not implement sufficient practices and procedures to give effect to the principles set forth in PIPEDA and PIPA. In sum, we agree that Facebook’s privacy practices, including its superficial and ineffective implementation of the OPC’s 2009 recommendations, represent not only a “major breach of trust” with Facebook users, but also a serious failure with respect to Facebook’s ongoing compliance with Canadian and British Columbia privacy law.
In light of the above, we find that Facebook did not implement policies and practices to give effect to the principles, contrary to Clause 4.1.4(a) in Schedule 1 of PIPEDA, and subsection 4(2) of PIPA.

Facebook’s Response to our Recommendations
In determining an appropriate resolution to this matter, we considered a range of factors. First, we considered the serious nature of the failings described above. Second, we considered that the OPC had already raised many of these concerns, and the risks that flow from them, in its 2009 findings. The recommendations made to Facebook in that Report of Findings should have served as a warning to Facebook regarding its privacy practices. Facebook’s failure to effectively address those concerns, to meaningfully implement its 2009 commitments to the OPC, and to act on violations of its own policies in a material way is demonstrative of the lack of accountability at Facebook.
Certain of the above issues have been addressed via technical fixes—for example, the switch to Graph v2 and the implementation of App Review in 2014/2015 reduced the information an app could receive by default and placed significant limits on the types of apps that could receive information about friends of installing users, subject to potential limitations outlined in paragraph 15 of this report, an issue we are investigating.
We also recognize that Facebook will retroactively review apps that had access, under Graph v1, to a “large” amount of personal information, and will inform users of the potential disclosure of their information to apps installed by one of their “friends” where Facebook determines that those apps have misused that data. However, before the issuance of these findings, we recommended, in a preliminary report, that Facebook make certain commitments, outlined below, to be supported by a Compliance Agreement with Facebook, to: (i) bring Facebook into compliance with PIPEDA and PIPA; (ii) remediate the effects of Facebook’s past non-compliance; (iii) ensure effective implementation of its commitments; and (iv) ensure Facebook’s future compliance with Canadian privacy law.
After we provided Facebook our preliminary report, we also provided it with further specification regarding our Offices’ expectations with respect to its implementation of these recommendations. This was done during two in-person meetings and via a letter.
Ultimately, we were very disappointed with Facebook’s response to our recommendations, which it provided to our Offices on March 27, 2019. Facebook disagreed with our findings and proposed alternative commitments that reflected material amendments to our recommendations, in certain instances altering the very nature of the recommendations themselves, undermining the objectives of our proposed remedies, or rejecting the proposed remedy outright. Facebook offered very limited remedial action over and above its existing practices. In our view, such commitments would not bring Facebook into compliance with PIPEDA or PIPA.
Below, we provide: (i) each of our five recommendations, as shared with Facebook in our preliminary report; (ii) further recommendation details and clarifications subsequently provided to Facebook; and (iii) Facebook’s ultimate response to our recommendations.
Our primary recommendation was that: Facebook should implement measures, including adequate monitoring, to ensure that it obtains meaningful and valid consent from installing users and their friends. That consent must: (i) clearly inform users about the nature, purposes and consequences of the disclosures; (ii) occur in a timely manner, before or at the time when their personal information is disclosed; and (iii) be express where the personal information to be disclosed is sensitive.
We subsequently explained that we expect Facebook to implement additional measures to ensure that it is obtaining meaningful consent for its disclosure of user information to each third-party app, such as: implementation of contractual terms requiring apps to comply with consent requirements consistent with those under PIPEDA and PIPA, including, at a minimum, the “must dos” outlined in our Offices’ Guidelines for Obtaining Meaningful Consent; proactive review, through automated and/or manual means, of all apps’ privacy communications to ensure compliance with those legal/contractual requirements; reactive review of apps’ privacy communications and associated privacy practices where privacy or compliance ‘red flags’ have been identified; and a robust and demonstrably effective program of enforcement and remediation where apps’ practices are inconsistent with Facebook’s privacy-related policies or requirements.
Facebook did not agree with our findings, nor did it agree to implement the above measures. Rather, Facebook essentially proposed the status quo with respect to its consent practices. Facebook asserted that the shift to Graph v2 largely eliminated its disclosure of friends’ information to third-party apps. The extent to which Facebook continued to share friends’ information with apps outside the context of Graph v2 is the subject of an ongoing investigation by our Office (see paragraph 15). To the extent that Facebook is now allowing, or does in future allow, apps to access information of installing users’ friends, it should obtain consent for this practice consistent with the recommendation outlined above.
We made two further recommendations with a view to remediating the effects of Facebook’s privacy contraventions by empowering users with the knowledge necessary to protect their privacy rights and better control their personal information in respect of apps that may have gained unauthorized access to it: (a) Facebook should implement an easily accessible mechanism whereby users can: (i) determine, at any time, clearly what apps have access to what elements of their personal information [including by virtue of the app having been installed by one of the user’s friends]; (ii) understand the nature, purposes and consequences of that access; and (iii) change their preferences to disallow all or part of that access.
(b) Facebook’s retroactive review and resulting notifications should cover all apps. Further, the resulting notifications should include adequate detail for [each user] to understand the nature, purpose and consequences of disclosures that may have been made to apps installed by a friend. Users should also be able, from this notification, to access the controls to switch off any ongoing disclosure to individual apps, or to all apps.
With respect to (a), above, Facebook did not agree to inform users regarding friends’ apps that may have accessed their information. Facebook indicated that such a practice would confuse users by notifying them regarding apps that may or may not have actually accessed their information, since Facebook was, itself, unable to determine which apps would have had such access. Facebook also asserted that it already substantively complies with the recommendation, in respect of installing users, through its “Apps and Websites” dashboard.
With respect to (b), in response to concerns raised by Facebook relating to the scope of the recommended review, we explained that we were open to alternative proposals reflecting what is possible based on the information currently available to Facebook. Facebook did not agree to expand its retroactive review as recommended, nor did it propose a viable alternative. Facebook provided no evidence to substantiate an inability to expand its review, nor did it provide any metrics of the reviews it has conducted to substantiate the effectiveness of the current state.

To ensure Facebook’s implementation of any commitments accepted by our Offices, we recommended that: Facebook should agree to oversight by a third-party monitor, appointed by and serving to the benefit of the Commissioner[s], at the expense of Facebook, to monitor and regularly report on Facebook’s compliance with the above recommendations for a period of five years.
Facebook indicated that it was willing to agree to third-party monitoring, subject to certain proposed material conditions and restrictions. However, given that Facebook has not agreed to implement our substantive recommendations, the monitor would serve no purpose.
Finally, given our findings regarding Facebook’s serious accountability failures, noting the broader audit powers available to our counterparts in Europe (including the UK Information Commissioner’s Office), we recommended that: Facebook should, for a period of five years, permit the OPC and/or OIPC BC to conduct audits, at the OPC and/or OIPC BC’s discretion, of its privacy policies and practices to assess Facebook’s compliance with requirements under PIPEDA and PIPA respectively.
Facebook rejected this recommendation outright, claiming that it was unnecessary and unreasonable, and that it exceeds the powers currently provided under PIPEDA. Facebook then proposed a wholly revised version of the recommendation that would have limited our ability to audit to even less than what is currently provided for under PIPEDA and PIPA.
Given the serious accountability failings we have identified in this report, which are consistent with Facebook’s admission that it has breached users’ trust, we are particularly disappointed that Facebook would not agree to this recommendation. We find it difficult to reconcile the recent public statements of Facebook’s CEO regarding Facebook’s desire to work with regulators towards a more privacy-focused platform with Facebook’s refusal to submit to audits through which our Offices could confirm that Facebook is acting in an accountable way.