'Eye-Tracking in Virtual Reality: A Visceral Notice Approach for Protecting Privacy' by Evan Selinger, Ely Altman and Shaun Foster in (2023) 2 Privacy Studies Journal 1-34
Eye-tracking is in our future. Across many fields, it is becoming widely used. This paper analyzes eye-tracking in virtual reality and characterizes the results as a case study that illuminates novel privacy risks. Our research question is: How can we support and protect users in this environment? We consider a design strategy originally proposed by Ryan Calo called “visceral notice” that provides users with an experientially resonant means of understanding privacy threats. To make our case for visceral notice, we proceed as follows. First, we provide a concise account of how eye-tracking works, emphasizing its threat to autonomy and privacy. Second, we discuss the sensitive personal information that eye-tracking reveals, complications that limit what eye-tracking studies establish, and the comparative advantage large technology companies may have when tracking our eyes. Third, we explain why eye-tracking will likely be crucial for developing virtual reality technology. Fourth, we review Calo’s conception of visceral notice and offer suggestions for applying it to virtual reality to help users better appreciate the risks of eye-tracking. Finally, we consider seven objections to our proposals and provide counterpoints to them.
Dave Eggers’s fictional satire of Silicon Valley, The Every, presents an incident where a powerful technology company uses eye-tracking technology to derail a politician’s career. A “vigorous” “global debate about the ethics of eye tracking” follows. However, as Eggers writes, “anyone hoping to hold back” the technology “was proven a fool.” Once “capitalists leaped in” and “apps and related products” started “proliferating,” it was too late to prevent technology companies from engineering uncritical societal acceptance of their vision. It will be tragic if real life ends up imitating art.
Like it or not, eye-tracking is picking up steam. For example, eye-tracking could provide manufacturing companies with insights into the “cognitive state” of their employees, their situational awareness, their attention, and more, leveraging that data to identify, predict, and eliminate inefficiencies. In light of all the workplace possibilities, eye-tracking researchers warn that the technology “should never be used for ‘big brother’ style monitoring or for evaluative assessments of workplace satisfaction and performance.”
Workplace surveillance is just the beginning. Tobii, a prominent Swedish eye-tracking company, claims to “unlock the future” by applying eye-tracking to simulations that enhance how pilots and doctors are trained in high-intensity but low-risk environments. Eye-tracking could also gauge whether drivers are attentive behind the wheel. Additionally, researchers are exploring the possibility of expanding eye-tracking to typical smartphones (no VR headset required). This shift has the potential to increase eye-tracking by “orders of magnitude.”
During this eye-tracking frenzy, it is important to look to the future and proactively address one of the most concerning applications: virtual reality (henceforth, VR). If the enthusiasm surrounding the metaverse is any indication, VR will likely be one of the largest eye-tracking domains. Currently, metaverse-oriented companies are investing heavily in VR technology and services. As recently as October 2022, Meta released its newest VR headset (which has eye-tracking capabilities), the Meta Quest Pro, for a hefty $1,499. A frequently cited estimate predicts “the collective value” of change that the metaverse will facilitate “will be in the tens of trillions of dollars.”
While it may be a decade or more before VR goes mainstream outside gaming, enterprise and health products, and specialized simulations, we ought to start thinking about privacy now. Otherwise, it may be too late to enact robust safeguards that curb function creep and limit the power that special interests will exert over infrastructure and technical standards. Without anticipatory governance, too much control will be ceded to technology companies with poor privacy track records, like Meta, that are prioritizing VR development. As heavily financed first-movers, they will have a powerful influence on consumer behaviour.
We expect users will face heightened privacy risks in VR. Some problems are familiar, such as companies aggregating sensitive information into big data profiles, weaponizing predictive analytics, and deploying dark patterns. Given how interfaces are often designed and the invisibility of back-end data collection and analysis, people cannot reasonably be expected to understand what they agree to and how vulnerable they become when consenting to typical terms of service agreements. Our paper focuses on one of the most important new threats in VR: eye-tracking. As Tom Wheeler, former chairman of the U.S. Federal Communications Commission, notes: “Meta has already patented technology to build eye tracking and facial expression tracking into the optical equipment worn to access the metaverse ... [that] ... could be more revealing than hooking up to a lie detector.”
To address the privacy risks that eye-tracking poses, this paper considers the following research question: What are the merits and limitations of a design shift, following privacy scholar Ryan Calo’s “visceral notice” strategy, toward conveying information in an experientially resonant manner? We see this as a modest proposal, not a silver bullet. After all, providing robust privacy protections in VR will require many governance mechanisms. Furthermore, since our design recommendations are conjectural, additional interdisciplinary testing is required to assess their efficacy.
To make our case for visceral notice, we proceed as follows. First, we provide a concise account of how eye-tracking works, emphasizing its threat to autonomy and privacy. Second, we discuss the sensitive personal information that eye-tracking reveals, complications that limit what eye-tracking studies establish, and the comparative advantage large technology companies may have when tracking our eyes. Third, we explain why eye-tracking will likely be crucial for developing VR technology. Fourth, we review Calo’s conception of visceral notice and offer suggestions for applying it to VR to help users better appreciate eye-tracking risks. Finally, we consider seven objections to our proposals and provide counterpoints to them.

We recognize that eye-tracking exists outside of VR, that eye-tracking data in VR can be combined with additional information, and that even without eye-tracking, companies can collect and analyze a host of sensitive information in VR, including biometric data. Indeed, researchers who constructed an escape room game in VR to experimentally determine how many privacy data attributes an attacker can obtain claim that the metaverse presents “unprecedented privacy risks.” Nevertheless, we limit the scope of our analysis to the privacy risks that eye-tracking poses in VR. Our justification is that developing the hardware and software needed to create maximum fidelity and engagement likely means relying heavily on eye-tracking. The confluence of technical requirements and privacy risks makes eye-tracking in VR a topic worthy of its own inquiry – especially because, as we will contend, designers can use the distinctive affordances of the medium (perhaps more effectively than in typical 2D applications) to highlight novel dangers.

Relatedly, we recognize that visceral notice in the context of privacy risks is neither a comprehensive nor a global solution. Nevertheless, as U.S.-based writers, we live in a country where the notice-and-consent regime is central to privacy regulation. Consequently, this is our starting point, and we believe that our analysis of visceral notice’s application to eye-tracking risks will have broader value.