'The Big Data Regulator, Rebooted: Why and How the FDA Can and Should Disclose Confidential Data on Prescription Drugs and Vaccines' by Amy Kapczynski and Christopher J. Morten in (2021) 109(2) California Law Review 493 comments:
Medicines and vaccines are complex products, and it is often extraordinarily difficult to know whether they help or hurt. The Food and Drug Administration (FDA) holds an enormous reservoir of data that sheds light on that precise question, yet currently releases only a trickle to researchers, doctors, and patients. Recent examples show that data secrecy can be deadly, and existing laws such as the Freedom of Information Act (FOIA) cannot solve the problem. We present here a wealth of new evidence about the urgency of the problem and argue that the FDA must “reboot” its rules to proactively disclose all safety and efficacy data for drugs and vaccines with minimal redactions, deploying data use agreements to ensure the most sensitive data is handled appropriately. In line with the literature that has been critical of simplistic calls for “transparency,” we urge a more contextual form of “data publicity.” We also show that clinical trial data publicity can be achieved without legislative reform, while respecting privacy, protecting any legitimate trade secrets, and maintaining or improving incentives to innovate. The FDA must adapt to protect and expand structural accountability and to protect the public and its trust. The model we offer here could guide similar action at other regulatory agencies as well, enabling better oversight of information-intensive industries and helping safeguard the agencies themselves.
The authors argue:
Few issues are more important to the American public than the quality and safety of our medicines. About half of all Americans take one or more prescription drugs, and medicines represent a startling 2% of total U.S. gross domestic product (GDP) each year. Life as we know it relies on vaccines that prevent dangerous diseases. But there is a structural problem at the heart of our system for the development and assessment of therapeutics and vaccines: a problem of secrecy in the age of big data.
The problem of data secrecy is especially visible in the shadow of the COVID-19 pandemic. As we write in the summer of 2020, governments around the world are taking unprecedented measures to promote the development of a COVID-19 vaccine. Billions of dollars of public money are being invested, with dozens of potential vaccines in development. But researchers have raised an outcry, pointing out that they have no access to some of the most basic and important information about the design and outcomes of the most promising COVID-19 vaccine trials. Access to this information could enable scientists to understand key clinical trial decisions in time to influence them, to evaluate the quality of the evidence as it emerges, and to protect against mistakes and misconduct, such as changes in trial endpoints that produce spurious results. Researchers could also make novel uses of the data collected, advancing our understanding of COVID-19 at a critical time. Under pressure, several companies (as of this writing) have begun to release some such data voluntarily. This is a positive step and a proof of concept. But there are important gaps in what has been provided, and no systems are in place to ensure that those gaps will be remedied, despite the extraordinary stakes.
The inability to access data related to COVID-19 vaccine development sheds light on the problems caused by systemic data secrecy in clinical trials. Therapeutics and vaccines are complex products. We cannot know whether they hurt or help without rigorous clinical trials, whose conduct and interpretation are highly complicated. Today these trials, particularly at later stages, are typically conducted by companies with strong financial interests in the outcomes. This is a key justification for our drug regulatory system: independent experts are needed to protect the public by examining and validating data about the effects of medicines. But our drug regulatory bodies are under-resourced, and recent examples show that outside expert analysis can reveal concealed risks of medicines.
The rise and fall of the painkiller rofecoxib (Vioxx) offers a stark example of the harms of data secrecy. The drug was promoted as being safer than aspirin and became a blockbuster. It earned $2 billion each year for Merck before it was abruptly removed from the market because it caused heart attacks, strokes, and heart failure. The evidence became known to outside experts only through litigation. Later independent research showed that signals of these risks were present in data held by the FDA nearly 3.5 years before the drug was withdrawn from the market. That evidence did not reach doctors or patients because the data was not made available to the scientific community. An FDA official later estimated that tens of thousands of people died as a result.
Data secrecy also causes harm by undermining our health care system. Secrecy prevents us from making the best allocation of scarce resources and obscures avenues for systematic reforms at the FDA and in the pharmaceutical industry. Data secrecy may also undermine trust. The American public, for example, expressed widespread hesitancy about any COVID-19 vaccine that was to be rushed to market before the November 2020 U.S. election. Sharing safety and efficacy data on drugs and vaccines—including COVID-19 vaccines—would help to secure public trust in the FDA review process and in the products that emerge from it and would help to protect the scientific integrity of the FDA review process from political pressure.
There is, accordingly, an emerging consensus that independent researchers need better access to clinical trial data to keep both the industry and regulators honest and accountable. Yet existing tools for an independent assessment of clinical trial data are inadequate. What remains missing is an effective legal and regulatory framework for the release of this data within the United States. For several years, in close collaboration with medical researchers and a legal team, we have worked to maximize the potential of existing strategies for clinical trial data disclosure. This Article sets out a key lesson of that work: existing tools are inadequate for the task. If researchers are to have systematic access to the clinical trial data needed to help spot unsafe and ineffective medicines, the FDA will have to make clinical trial data available proactively.
We show that the agency can, consistent with existing law, make clinical trial data available proactively. We describe how the FDA can do so while navigating the two main challenges of data sharing: protecting the privacy of individuals who participate in trials and addressing claims that company data should remain confidential. Drawing on examples of successful data sharing in other countries and at other agencies, we also show that the process can be done effectively and manageably. Our central contribution is a wealth of new evidence about the significance of the problem and an updated argument for proactive disclosure that can be achieved without legislative reform. We reveal the flaws in arguments that comprehensive proactive disclosure is prohibited under U.S. federal statutes or, if permitted, will require expensive compensation to the industry for intellectual property violations.
This Article is centrally aimed at solving an important public health problem, but it also contributes to two broader literatures. The first is the literature on transparency and the implications of freedom of information laws. Transparency as an ideal has been rightly criticized recently as having taken on a formalistic, decontextualized quality. As an ideal, transparency does not appropriately recognize that “freedom” at times requires more than unfettered, standardless exchange and does not appreciate how freedom of information laws can be weaponized to undermine public interests. We show here that the implications of data sharing turn on and should be sensitive to a broader political-economic context. Data sharing can serve public interests because of a wider ecology that provides researchers with the necessary resources to analyze the data and includes publications and norms (of the “open science” tradition in academic medicine, for example) that help generate and validate important new insights and challenge false claims. Data itself does not produce these insights, and a context that enables trustworthy analysis is essential if data sharing is to work well.
To this end, we argue that data use agreements will be an important component of data disclosures in our “big data” age. They provide a means to navigate issues of privacy and commercial interest—issues that can otherwise shut down data sharing, rightly or wrongly—and a mechanism to develop and impose other publicly minded conditions. The role of these agreements here illustrates the importance of contract as a tool to facilitate information exchange and innovation. Decontextualized demands for “openness” have gained traction in recent decades and might suggest that in every instance we need unfettered data exchange that treats all parties equally, including companies. We argue instead that the FDA should prioritize health researchers over industry actors and that it should use data use agreements to ensure those researchers protect legitimate public interests. These contracts are possible only with proactive disclosure and are inconsistent with reactive FOIA requests.
We join other scholars in suggesting that the future of freedom of information, if it is to achieve its aims, lies in the development of robust proactive disclosure systems. In part to mark these distinctions, we call what we seek here not data transparency, but data “publicity.” The term as we use it, which draws upon early progressive traditions, marks the need for attention to context, power, and resources if data sharing is to serve the public. We also seek to contribute to the broader literature on the future of the regulatory state and the conditions of democracy broadly understood. Today, we live in an extraordinarily information-intensive age. Decades of dramatic advances in technologies for information processing have transformed the core of the modern economy and enabled the emergence of massively complex new industries and firms. This means that not only pharmaceuticals but also products like cars, insurance, airplanes, and phones are far more informationally intensive today than they were twenty years ago. Informationally intensive products and systems are complex, opaque, and dynamic. Systems that are improperly or fraudulently designed — think here about Volkswagen’s deceptive “defeat device” to evade emissions testing, or Boeing’s defective automated flight software for the 737 Max — generate serious social and individual harms. Regulators face growing challenges in this environment, and we need structures to allow the public to hold both regulators and the industry accountable. Yet the same barriers that appear in this context—issues of privacy, corporate claims to trade secrecy and confidentiality, and difficulties with reactive data release models (FOIA especially)—will reappear throughout the administrative state. Our Article thus can help inform a wide variety of regulators who face related issues, whether in the area of consumer products, environmental protection, or artificial intelligence. 
Data publicity will have plausible benefits elsewhere, and regulators can learn from how it can be achieved at the FDA. But they must also learn from the fertile conditions in the pharmaceutical and medical context that allow clinical trial data publicity to inform the public. It is not open data alone, but data publicity in a context where resources and expertise exist to enable intelligible uses of such data, that furthers democratic accountability.
We begin in Part I by describing the need for proactive disclosure of safety and efficacy data and why existing legal avenues, such as FOIA, fail to create adequate data publicity. In Part II, we show that, contrary to the conventional wisdom and the (usual) view of the FDA itself, federal law does not prohibit the FDA from disclosing such data, even from the moment of drug or vaccine approval. Consistent proactive disclosure, however, will require revisions to the FDA's current regulations, corrections to its interpretations of certain statutes, and, for the most sensitive data, data use agreements. We also show that this move should not harm, and may even improve, incentives to innovate, and that it should not require compensation under the Takings Clause. If the agency does not act, Congress can and should, as we describe in Part III.