The settlement requires Facebook to "live up to its promises in the future", including -
• giving consumers clear and prominent notice; and
• obtaining consumers' express consent before their information is shared beyond the privacy settings they have established.
Concerns regarding consent have been recurrently highlighted in this blog, for example in the post on the Article 29 Working Party statement in Europe.
The FTC charged that Facebook made unfair and deceptive claims, and violated federal law. FTC Chair Jon Leibowitz commented that -
Facebook is obligated to keep the promises about privacy that it makes to its hundreds of millions of users. Facebook's innovation does not have to come at the expense of consumer privacy. The FTC action will ensure it will not.
Specific complaints from the FTC include -
• in December 2009, Facebook changed its website so certain information that users may have designated as private – such as their Friends List – was made public. It didn't warn users that this change was coming, or get their approval in advance.
• Facebook represented that third-party apps users installed would have access only to the user information they needed to operate. In fact, the apps could access nearly all of a user's personal data – data the apps didn't need.
• Facebook told users they could restrict sharing of data to limited audiences – for example with "Friends Only." In fact, selecting "Friends Only" did not prevent their information from being shared with third-party applications their friends used.
• Facebook had a "Verified Apps" program and claimed it certified the security of participating apps. It didn't.
• Facebook promised users that it would not share their personal information with advertisers. It did.
• Facebook claimed that when users deactivated or deleted their accounts, their photos and videos would be inaccessible. But Facebook allowed access to that content even after users had deactivated or deleted their accounts.
• Facebook claimed that it complied with the US-EU Safe Harbor Framework that governs data transfer between the US and the European Union. It didn't.
The FTC's media release states that Facebook is barred from making any further deceptive privacy claims, is required to get consumers' approval before it changes the way it shares their data, and is required to obtain periodic assessments of its privacy practices by independent, third-party auditors for the next 20 years.
Given that the devil is in the detail, Facebook is -
• barred from making misrepresentations about the privacy or security of consumers' personal information;
• required to obtain consumers' affirmative express consent before enacting changes that override their privacy preferences;
• required to prevent anyone from accessing a user's material more than 30 days after the user has deleted his or her account;
• required to establish and maintain a comprehensive privacy program designed to address privacy risks associated with the development and management of new and existing products and services, and to protect the privacy and confidentiality of consumers' information; and
• required, within 180 days, and every two years after that for the next 20 years, to obtain independent, third-party audits certifying that it has a privacy program in place that meets or exceeds the requirements of the FTC order, and to ensure that the privacy of consumers' information is protected.
The Order, subject to public comment, features record-keeping provisions to allow the FTC to monitor compliance.
On its corporate blog, the FTC highlights its action, commenting that -
Privacy changes – unfair practices. According to the FTC, by designating certain user profile info as public when it had previously been subject to more restrictive privacy settings, Facebook overrode users’ existing privacy choices. In doing that, the company materially changed the privacy of users’ information and retroactively applied these changes to information that it previously collected. The FTC said that doing that without users’ informed consent was an unfair practice, in violation of the FTC Act.
What info apps had access to. According to the complaint, for a significant period of time after Facebook started featuring apps on its site, it deceived users about how much of their information was shared with the apps they used. Facebook said that when people authorized an app, the app would only have information about the users “that it requires to work.” Not accurate, says the FTC. According to the complaint, apps could access pretty much all of the user’s information – even info unrelated to the operation of the app. For example, an app with a TV quiz could access a user’s Relationship Status, as well as the URL for every photo and video the user had uploaded – information that went well beyond what the app “requires to work.”
What info Facebook shared with advertisers. Facebook also told users it wouldn’t share their personal information with advertisers. In Facebook’s Statement of Rights and Responsibilities, the company said, “We don’t share your information with advertisers unless you tell us to (e.g., to get a sample, hear more, or enter a contest). Any assertion to the contrary is false. Period ... we never provide the advertiser any names or other information about the people who are shown, or even who click on, the ads.” In fact, says the FTC, from at least September 2008 until May 2010, Facebook ran its site so that in many instances, the User ID of a person who clicked on an ad was shared with the advertiser. So much for “never.”
Facebook’s “Verified Apps” program. The FTC also challenged the operation of Facebook’s Verified Apps program. Facebook told people that the program involved a “detailed review process” and was “designed to offer extra assurances to help users identify applications they can trust – applications that are secure, respectful and transparent, and have demonstrated commitment to compliance with Platform policies.” About 250 apps paid between $175 and $375 for the seal. But according to the FTC, Facebook took no steps to verify either the security of a Verified App’s website or the security the app provided for the information it collected, beyond the steps it took for any other app.
Photo and video deletion. In addition, the FTC charged Facebook with making deceptive claims about its photo and video deletion policy. Each of the photos and videos a user uploads onto Facebook has a Content URL – a URL for its location on Facebook’s servers. Facebook told users, “If you want to stop using your account you may deactivate it or delete it. When you deactivate an account, no user will be able to see it, but it will not be deleted ... When you delete an account, it is permanently deleted from Facebook.” But even after users followed Facebook’s procedure for deactivating or deleting an account, Facebook still served up these photos and videos to anyone who accessed them via the Content URL. That, said the FTC, rendered Facebook’s statements deceptive.