16 March 2020

Big Data, Privacy and Antitrust

'The Impulse to Condemn the Strange: Assessing Big Data in Antitrust' by Alexander Krzepicki, Joshua D. Wright and John M. Yun in (2020) 2(2) CPI Antitrust Chronicle 16 comments
An emerging refrain in antitrust dialog is that the accumulation and use of big data is a unique and particularly troublesome entry barrier, worthy of antitrust scrutiny. Yet, it seems that both the concept of big data and entry barriers continue to be used in a highly casual and superficial manner. In this article, we argue that big data should properly be considered a two-stage process. In stage one, a firm collects the data. In stage two, a firm transforms the data into some benefit that ultimately increases profitability. We also discuss whether big data should be considered an entry barrier, which, in a broad and abstract sense, measures the relative difficulty of obtaining necessary inputs to production.
'Concealed Data Practices and Competition Law: Why Privacy Matters' (UNSW Law Research Paper No. 19-53, 2019) by Katharine Kemp comments
This paper argues that the degradation of consumer data privacy in the digital environment causes objective detriment to consumers and undermines the competitive process, and should therefore be of critical concern to competition law. Consumers are frequently unaware of the extent to which their personal data is collected, the purposes for which it is used, and the extent to which it is disclosed to others, particularly in digital markets. Researchers and regulators have observed that this is not simply a matter of consumer apathy, but that firms often understate and obscure their actual data practices, preventing consumers from making informed choices. This paper defines, and provides examples of, a set of “concealed data practices”. These concealed data practices create objective costs and detriments for consumers, making them more susceptible to criminal activity, discrimination, exclusion, manipulation and humiliation. This paper argues that these practices are not only problematic in terms of consumer protection and privacy regulation. Concealed data practices should also be of concern to competition policy due to their role in chilling competition on privacy; preserving substantial market power by means other than superior efficiency; and deepening information asymmetries and imbalances in bargaining power. The paper concludes by outlining four ways in which these factors should be taken into account by competition authorities.

Robophobia

'Robophobia' by Andrew Keane Woods comments
Robots — machines, algorithms, artificial intelligence — play an increasingly important role in society, often supplementing or even replacing human judgment. Scholars have rightly become concerned with the fairness, accuracy, and humanity of these systems. Indeed, anxiety about machine bias is at a fever pitch. While these concerns are important, they nearly all run in one direction: we worry about robot bias against humans; we rarely worry about human bias against robots.
This is a mistake. Not because robots deserve, in some deontological sense, to be treated fairly — although that may be true — but because human bias against non-human deciders is bad for humans. For example, it would be a mistake to reject self-driving cars merely because they cause a single fatal accident. Yet this is what we do. We tolerate enormous risk from humans, but almost none from robots. A substantial literature — almost entirely ignored by legal scholars concerned with algorithmic bias — suggests that we routinely prefer worse-performing humans over better-performing robots. We do this on our roads, in our courthouses, in our military, and in our hospitals. Our bias against robots is costly, and it will only get more so as robots become more capable. 
This paper catalogs the many different forms of anti-robot bias and suggests some reforms to curtail the harmful effects of that bias. The paper’s descriptive contribution is to develop a taxonomy of robophobia. Its normative contribution is to offer some reasons to be less biased against robots. The stakes could hardly be higher. We are entering an age when one of the most important policy questions will be how and where to deploy machine decision-makers. In doing so, we must be mindful of our own biases just as we must be aware of algorithmic biases.

Telecommunications Legislation Amendment (International Production Orders) Bill 2020

The Parliamentary Joint Committee on Intelligence and Security (PJCIS) has commenced a review into the effectiveness of the Telecommunications Legislation Amendment (International Production Orders) Bill 2020 (Cth), drafted to amend the Telecommunications (Interception and Access) Act 1979 (Cth).

The Bill seeks to
  •  provide a framework for Australian agencies to obtain independently authorised international production orders for interception, stored communications and telecommunications data, issued directly to designated communications providers in foreign countries with which Australia has a designated international agreement;
  •  amend the regulatory framework to allow Australian communications providers to intercept and disclose electronic information in response to an incoming order or request from a foreign country with which Australia has an agreement;
  •  make amendments contingent on the commencement of the proposed Federal Circuit and Family Court of Australia Act 2020; and
  •  remove the ability of nominated Administrative Appeals Tribunal members to issue certain warrants.
The effect is to provide the legislative framework for Australia to give effect to future bilateral and multilateral agreements for cross-border access to electronic information and communications data, such as the agreement being negotiated with the United States for the purposes of the US Clarifying Lawful Overseas Use of Data Act (CLOUD Act).