Individual Submission Summary

The Promise of Standardization: Lessons from an Epistemic Crisis in Toxicology

Fri, September 6, 1:00 to 2:30pm, Sheraton New Orleans Hotel, Floor: Eight, Orpheus

Abstract

Amid a perceived increase in irreproducible research and scientific fraud, many researchers and policymakers have advocated for more rigorous standardization of protocols and reporting. Several scientists have identified biomedical standards, including "Good Laboratory Practice" (GLP) for chemical testing, as a model from which to establish "Good Institutional Practice" for research (Begley et al. 2015). This paper examines the history of GLP, which was created in response to an earlier perceived crisis in reproducibility. In the 1970s, high-profile cases of laboratory misconduct and fraud prompted concerns about the reliability of data submitted to regulatory agencies and led to an epistemic crisis in toxicology. In response, between 1976 and 1983, the US FDA and EPA adopted GLP regulations to ensure the integrity of data used to evaluate the safety of chemicals. The new regulations were essentially a quality assurance scheme focused on standardization and documentation. In the 1980s, the rules were codified internationally, and they are now used by regulators in 41 countries. The rules fostered trust in the reliability of data from laboratories around the world, facilitated commerce, and reduced duplicative testing. However, critics argue that this came at a cost: GLP froze "good practice" in time, making it difficult to incorporate evolving knowledge. I argue that GLP elevated reliability over validity as a criterion of high-quality research and prevented innovative forms of toxicological knowledge, including research on the adverse effects of endocrine-disrupting chemicals, from being considered in regulatory decisions. I illustrate how standardization efforts like GLP can lead to counterproductive epistemic consequences.