30 May 2020

Fake News

'Protecting Elections from Disinformation: A Multifaceted Public-Private Approach to Social Media and Democratic Speech' by Yasmin Dawood in (2020) 16 Ohio State Technology Law Journal 1 comments
This Article argues for a multifaceted public-private approach to the challenge of protecting the electoral process from the harms of disinformation. Such an approach employs a suite of complementary strategies, including disclosure rules, political ad registries, narrow content-based regulations against false election speech, self-regulation by online platforms, norm-based initiatives, civic education, and media literacy. It also deploys a mix of regulatory styles, namely legal regulation (regulation imposed by the state), self-regulation (regulation by private actors), and co-regulation (regulation through cooperation between private and public actors). The Article shows how the approach in Canada is multifaceted in both of these respects: in addition to incorporating a wide range of tactics by both public and private actors, the Canadian approach has adopted a mix of regulatory styles. The Article also canvasses the advantages and drawbacks of each individual tactic.
In addition, this Article focuses on the dilemma posed by protecting the electoral process from disinformation while also protecting the freedom of speech. It argues that a multifaceted public-private approach allows the trade-off between countering disinformation and preserving free speech to be optimized. The combined and interactive effects of a multifaceted approach provide helpful protections against some of the harms of disinformation. More importantly, the adoption of these multifaceted public-private strategies signals the importance of electoral integrity to citizens, thereby bolstering public trust in elections, a key ingredient of long-term democratic stability.
'Optimal Social Media Content Moderation and Platform Immunities' by Frank Fagan in (2020) European Journal of Law and Economics (forthcoming) comments
This Article presents a model of lawmakers' choice between implementing a new content moderation regime that imposes platform liability for user-generated content and continuing platform immunity for the same. The model demonstrates that lawmakers prefer platform immunity, even as incivility increases, whenever the costs of implementing a platform liability regime exceed the costs of enforcing status quo law. In addition, inasmuch as implementation of a platform liability regime is coupled with new speech restrictions that are unconstitutional or prohibitively costly, lawmakers prefer immunity, but platforms remain free to set strong content moderation policies consistent with existing law. Thus, the private governance function of platforms highlighted by Balkin and others is directly related to lawmakers' ability to enact and enforce alternatives, and it goes beyond mere private enforcement of existing free speech restrictions. Inasmuch as lawmakers are constrained from suppressing unwanted speech by constitutional limits as well as by lawmaking and enforcement costs, they give platforms wider discretion to make private suppression decisions. The status quo governance function of platforms therefore includes a private lawmaking function for determining which types of speech to suppress, albeit one bounded by the state's appetite for alternatives.
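The model's core condition can be sketched informally (the notation c_L and c_S is assumed here for illustration and is not taken from the article): writing c_L for the cost of enacting and enforcing a platform liability regime and c_S for the cost of enforcing status quo law under immunity, the abstract's central claim is that lawmakers prefer continued immunity whenever

\[ c_L > c_S, \]

and that this preference holds even as incivility rises, so long as the cost gap persists.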