20 September 2024

Social Media Surveillance Practice

The US Federal Trade Commission's report, 'A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services', comments:

Social Media and Video Streaming Services (“SMVSSs”) have become a ubiquitous part of our daily lives and culture. Various types of SMVSSs provide places where people can connect, create, share, or stream everything from media content like videos, music, photos, and games; comment on or react to content; connect with and send messages to other users; join, participate in, or subscribe to groups, message boards, or content channels; read or watch news; and consume advertisements for consumer products. Unsurprisingly, this ease of accessing information and connecting with others has transformed our society in many ways.

These types of services let you connect with the world from the palm of your hand. At the same time, many of these services have been at the forefront of building the infrastructure for mass commercial surveillance. Some firms have unique access to information about our likes and dislikes, our relationships, our religious faiths, our medical conditions, and every other facet of our behavior, at all times and across multiple devices. This vast surveillance has come with serious costs to our privacy. It also has harmed our competitive landscape and affected the way we communicate and our well-being, especially the well-being of children and teens. Moreover, certain large SMVSSs may enjoy significant market power and therefore face fewer competitive constraints on their privacy practices and other dimensions of quality. 

In December 2020, the Federal Trade Commission (“Commission” or “FTC”) issued identical Orders to File Special Reports under Section 6(b) of the FTC Act to a cross-section of nine companies in the United States in order to gain a better understanding of how their SMVSSs affect American consumers. Appendix A to this report (hereinafter “Appendix A”) is a copy of the text of the Order that the Commission issued to these nine Companies. 

This report is a culmination of that effort. Based on the information provided in response to the Commission’s Orders, publicly available materials, and the Commission’s long experience with SMVSSs, this report highlights the practices of the Companies’ SMVSSs, which include social networking, messaging, or video streaming services, or photo, video, or other content sharing applications available as mobile applications or websites. The report contains five sections relating to the following topics: (1) data practices, such as collection, use, disclosure, minimization, retention, and deletion; (2) advertising and targeted advertising; (3) the use of automated decision-making technologies; (4) practices relating to children and teens; and (5) concerns relating to competition. 

1. Summary of Key Findings 

This report makes the following general findings, although each finding may not be applicable to every one of the Companies in every instance: 

• Many Companies collected and could indefinitely retain troves of data from and about users and non-users, and they did so in ways consumers might not expect. This included information about activities both on and off of the SMVSSs, and included things such as personal information, demographic information, interests, behaviors, and activities elsewhere on the Internet. The collection included information input by users themselves, information gathered passively or inferred, and information that some Companies purchased about users from data brokers and others, including data relating to things such as household income, location, and interests. Moreover, many Companies’ data practices posed risks to users’ and non-users’ data privacy, and their data collection, minimization, and retention practices were woefully inadequate. For instance, minimization policies were often vague or undocumented, and many Companies lacked written retention or deletion policies. Some of the Companies’ SMVSSs did not delete data in response to user requests—they merely de-identified it. Even the Companies that actually deleted data deleted only some of it, not all.

• Many Companies relied on selling advertising services to other businesses based largely on using the personal information of their users. The technology powering this ecosystem operated behind the scenes and out of view of consumers, posing significant privacy risks. For instance, some Companies made available privacy-invasive tracking technologies such as pixels, which have the ability to transmit sensitive information about users’ actions to the SMVSSs that use them. Because the advertising ecosystem is complex and occurs beneath the surface, it is challenging for users to decipher how the information collected from and about them is used for ad targeting—in fact, many users may not be aware of this at all. Some Companies’ ad targeting practices based on sensitive categories also raise serious privacy concerns.
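To illustrate the mechanism the report describes: a tracking pixel is typically a tiny invisible image whose URL encodes event data as query parameters, so that simply loading the image transmits those parameters to the tracking server. The following is a minimal sketch in Python; the endpoint and parameter names are hypothetical and do not reflect any particular company's actual API.

```python
from urllib.parse import urlencode

def build_pixel_url(endpoint: str, event: dict) -> str:
    """Build the URL a tracking pixel would request.

    When a web page embeds an <img> tag pointing at this URL, the
    browser's GET request for the image delivers the event data to
    the tracking server as a side effect of rendering the page.
    """
    return f"{endpoint}?{urlencode(event)}"

# Hypothetical example: a retailer's checkout page reporting a purchase
# event back to an SMVSS's ad platform.
url = build_pixel_url(
    "https://pixel.example-smvss.com/track",  # hypothetical endpoint
    {"event": "purchase", "value": "49.99", "page": "/checkout"},
)
print(url)
```

The key point is that no explicit "send my data" action is involved: the data ride along on an ordinary image request, which is why such collection is invisible to most users.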

• There was a widespread application of Algorithms, Data Analytics, or artificial intelligence (“AI”), to users’ and non-users’ personal information. These technologies powered the SMVSSs—everything from content recommendation to search, advertising, and inferring personal details about users. Users lacked any meaningful control over how personal information was used for AI-fueled systems. This was especially true for personal information that these systems infer, that was purchased from third parties, or that was derived from users’ and non-users’ activities off of the platform. This also held true for non-users who did not have an account and who may have never used the relevant service. Nor were users and non-users empowered to review the information used by these systems or their outcomes, to correct incorrect data or determinations, or to understand how decisions were made, raising the potential of further harms when systems may be unreliable or infer sensitive information about individuals. Overall, there was a lack of access, choice, control, transparency, explainability, and interpretability relating to the Companies’ use of automated systems. There also were differing, inconsistent, and inadequate approaches relating to monitoring and testing the use of automated systems. Other harms noted included Algorithms that may prioritize certain forms of harmful content, such as dangerous online challenges, and negative mental health consequences for children and teens. 

• The trend among the Companies was that they failed to adequately protect children and teens—this was especially true of teens, who are not covered by the Children’s Online Privacy Protection Rule (“COPPA Rule”). Many Companies said they protected children by complying with the COPPA Rule but did not go further. Moreover, in an apparent attempt to avoid liability under the COPPA Rule, most SMVSSs asserted that there are no child users on their platforms because children cannot create accounts. Yet we know that children are using SMVSSs. The SMVSSs should not ignore this reality. When it comes to teens, SMVSSs often treat them as if they were traditional adult users. Almost all of the Companies allowed teens on their SMVSSs, placed no restrictions on their accounts, and collected personal information from teens just as they do from adults.

The past can teach powerful lessons. By snapshotting the Companies’ practices at a recent moment in time (specifically, the Orders focused on the period of 2019–2020) and highlighting the implications and potential consequences that flowed from those practices, this report seeks to be a resource and key reference point for policymakers and the public. 

2. Summary of Competition Implications 

• Data abuses can fuel market dominance, and market dominance can, in turn, further enable data abuses and practices that harm consumers. In digital markets, acquiring and maintaining access to significant user data can be a path to achieving market dominance and building competitive moats that lock out rivals and create barriers to market entry. The competitive value of user data can incentivize firms to prioritize acquiring it, even at the expense of user privacy. Moreover, a company’s practices with respect to privacy, data collection, data use, and automated systems can comprise an important part of the quality of the company’s product offering. A lack of competition in the marketplace can mean that users lack real choice among services and must surrender to the data practices of a dominant company, and that companies do not have to compete over these dimensions—depriving consumers of additional choice and autonomy. In sum, limited competition can exacerbate the consumer harms described in this report.

3. Summary of Staff Recommendations 

• Companies can and should do more to protect consumers’ privacy, and Congress should enact comprehensive federal privacy legislation that limits surveillance and grants consumers data rights. Baseline protections that Companies should implement include minimizing data collection to only that data which is necessary for their services and implementing concrete data retention and data deletion policies; limiting data sharing with affiliates, other company-branded entities, and third parties; and adopting clear, transparent, and consumer-friendly privacy policies. 

• Companies should implement more safeguards when it comes to advertising, especially surrounding the receipt or use of sensitive personal information. Baseline safeguards that Companies should implement include preventing the receipt, use, and onward disclosure of sensitive data that can be made available for use by advertisers for targeted ad campaigns. 

• Companies should put users in control of—and be transparent about—the data that powers automated decision-making systems, and should implement more robust safeguards that protect users. Changing this would require addressing the lack of access, choice, control, transparency, explainability, and interpretability relating to their use of automated systems; and implementing more stringent testing and monitoring standards. 

• Companies should implement policies that would ensure greater protection of children and teens. This would include, for instance, treating the COPPA Rule as representing the minimum requirements and providing additional safety measures for children as appropriate; recognizing that teen users are not adult users and, by default, affording them more protections as they continue to navigate the digital world; and providing parents/legal guardians a uniform, easy, and straightforward way to access and delete their child’s personal information.

• Firms must compete on the merits to avoid running afoul of the antitrust laws. Given the serious consumer harms risked by lackluster competition, antitrust enforcers must carefully scrutinize potential anticompetitive acquisitions and conduct and must be vigilant to anticompetitive harms that may manifest in non-price terms like diminished privacy.

The FTC notes:

The SMVSSs in our study demonstrate the many ways in which consumers of all ages may interact with or create content online, communicate with other users, obtain news and information, and foster social relationships. For example:

• Amazon.com, Inc. is the parent company of the Twitch SMVSS, wherein users can watch streamers play video games in real time. In 2022, Twitch reported an average of 31 million daily visitors to its service, most of whom were between 18 and 34 years old.  

• ByteDance Ltd. is the ultimate parent company of TikTok LLC, the entity that operates the TikTok SMVSS. TikTok enables users to watch and create short-form videos.  TikTok reported having 150 million monthly active users in the United States in 2023, up from 100 million monthly active users in 2020. 

• Discord Inc. operates the Discord SMVSS that provides voice, video, and text communication capabilities to users, by means of community chat rooms known as “servers.”  In 2023, Discord reported having 150 million monthly active users, with 19 million active community chat rooms per week. 

• Meta Platforms, Inc., formerly known as Facebook, Inc., operates multiple SMVSSs. In 2023, Meta reported having 3 billion users across its services. WhatsApp Inc. is part of the Meta Platforms, Inc. corporate family. WhatsApp Inc. received a separate Order, and is therefore treated as a separate Company for purposes of this report. 

o The Facebook SMVSS provides users with a communal space to connect to a network of other users by sharing, among other things, text posts, photos, and videos. In March 2023, Meta reported an average of more than 2 billion daily active users and almost 3 billion monthly active users.

o The Messenger SMVSS is a messaging application that allows users to communicate via text, audio calls, and video calls. Users of Messenger must have a Facebook account to use Messenger’s services. 

o The Messenger Kids SMVSS is a children’s messaging application that allows users to communicate via text, audio calls, and video calls. Parents of Messenger Kids users create accounts for their children through a parent’s Facebook account.

o The Instagram SMVSS, acquired by Meta Platforms, Inc. in 2012, allows users to share photos and videos with their networks. News reports estimated that as of 2021 there were 1.3 billion users on Instagram.

o The WhatsApp SMVSS, acquired in 2014 by Meta Platforms, Inc., is a messaging platform. WhatsApp reportedly had more than 2 billion users in 2023.

• Reddit, Inc. operates the Reddit SMVSS, which provides communities wherein users can discuss their specific interests. News outlets reported that, as of April 2023, approximately 57 million people visit the Reddit platform every day.

• Snap Inc. operates the Snapchat SMVSS, which it describes in part as a “visual messaging application that enhances your relationships with friends, family, and the world.”  Snapchat also includes “Stories,” which provides users the ability to “express themselves in narrative form through photos and videos, shown in chronological order, to their friends.”  Snap Inc. reported having 375 million average daily active users in Q4 2022. 

• Twitter, Inc. was a publicly traded company until October 2022, at which time it became a privately held corporation called X Corp. Since that time, X Corp. has operated the X SMVSS, formerly known as Twitter, which provides users with the ability to share short posts. Twitter, Inc. reported having 217 million average daily users in Q4 2021.

• YouTube, LLC is wholly owned by Google LLC, with Alphabet Inc. as the ultimate parent. Google LLC operates YouTube’s two SMVSSs. 

o The YouTube SMVSS is a video sharing product. As of February 2021, YouTube, LLC reported that “over two billion logged in users [come] to YouTube every month . . . . ” 

o The YouTube Kids SMVSS, first introduced in 2015, is a children’s video product with family-friendly videos and parental controls. As of February 2021, YouTube, LLC reported that YouTube Kids had more than 35 million weekly viewers.

While the SMVSSs in this report are generally “zero price” (or have free versions available) for the end user – meaning they require no money from consumers to sign up, or to create an account, for the basic version of the product – firms monetize (or profit off of) these accounts through data and information collection.