10 July 2024

Emotion

'Physiognomic Artificial Intelligence' by Luke Stark and Jevan Hutson in (2022) 32 Fordham Intellectual Property, Media and Entertainment Law Journal 922 comments 

The reanimation of the pseudosciences of physiognomy and phrenology at scale through computer vision and machine learning is a matter of urgent concern. This Article—which contributes to critical data studies, consumer protection law, biometric privacy law, and antidiscrimination law—endeavors to conceptualize and problematize physiognomic artificial intelligence (“AI”) and offer policy recommendations for state and federal lawmakers to forestall its proliferation. 

Physiognomic AI, as this Article contends, is the practice of using computer software and related systems to infer or create hierarchies of an individual’s body composition, protected class status, perceived character, capabilities, and future social outcomes based on their physical or behavioral characteristics. Physiognomic and phrenological logics are intrinsic to the technical mechanism of computer vision applied to humans. This Article observes how computer vision is a central vector for physiognomic AI technologies, unpacks how computer vision reanimates physiognomy in conception, form, and practice, and examines the dangers this trend presents for civil liberties. 

This Article thus argues for legislative action to forestall and roll back the proliferation of physiognomic AI. To that end, it considers a potential menu of safeguards and limitations to significantly limit the deployment of physiognomic AI systems, which hopefully can be used to strengthen local, state, and federal legislation. This Article foregrounds its policy discussion by proposing the abolition of physiognomic AI. From there, it posits regimes of U.S. consumer protection law, biometric privacy law, and civil rights law as vehicles for rejecting physiognomy’s digital renaissance in AI. First, it contends that physiognomic AI should be categorically rejected as oppressive and unjust. Second, it argues that lawmakers should declare physiognomic AI unfair and deceptive per se. Third, it proposes that lawmakers should enact or expand biometric privacy laws to prohibit physiognomic AI. Fourth, it recommends that lawmakers should prohibit physiognomic AI in places of public accommodation. It also observes the paucity of procedural and managerial regimes of fairness, accountability, and transparency in addressing physiognomic AI and attends to potential counterarguments in support of physiognomic AI.