The Social Risks of Science
Jonathan Herington
Scott Tanona

Abstract
Many instances of scientific research impose risks, not just on participants and scientists but also on third parties. This class of social risks unifies a range of problems previously treated as distinct phenomena, including so-called bystander risks, biosafety concerns arising from gain-of-function research, the misuse of the results of dual-use research, and the harm caused by inductive risks. The standard approach to these problems has been to extend two familiar principles from human subjects research regulations—a favorable risk-benefit ratio and informed consent. We argue, however, that these moral principles will be difficult to satisfy in the context of widely distributed social risks about which affected parties may reasonably disagree. We propose that framing these risks as political rather than moral problems may offer another way forward. By borrowing lessons from political philosophy, we propose a framework that unifies our discussion of social risks and the possible solutions to them.
Notes
- 1 We claim neither that this list is exhaustive nor that it is the only, or even the best, way of enumerating risks.
- 2 J. Kimmelman, “Medical Research, Risk, and Bystanders,” IRB: Ethics & Human Research 27, no. 4 (2005): 1–6; J. Kimmelman, “Missing the Forest: Further Thoughts on the Ethics of Bystander Risk in Medical Research,” Cambridge Quarterly of Healthcare Ethics 16, no. 4 (2007): 483–90; N. Eyal, “How to Keep High-Risk Studies Ethical: Classifying Candidate Solutions,” Journal of Medical Ethics 43, no. 2 (2017): 74–77.
- 3 D. B. Resnik, “Ethical Issues in Field Trials of Genetically Modified Disease-Resistant Mosquitoes,” Developing World Bioethics 14, no. 1 (2014): 37–46; M. J. Selgelid, “Gain-of-Function Research: Ethical Analysis,” Science and Engineering Ethics 22, no. 4 (2016): 923–64.
- 4 M. Lipsitch and A. P. Galvani, “Ethical Alternatives to Experiments with Novel Potential Pandemic Pathogens,” PLOS Medicine 11, no. 5 (2014): e1001646.
- 5 S. Miller and M. J. Selgelid, Ethical and Philosophical Consideration of the Dual-Use Dilemma in the Biological Sciences (Springer Science & Business Media, 2008).
- 6 H. Douglas, “Inductive Risk and Values in Science,” Philosophy of Science 67, no. 4 (2000): 559–79.
- 7 Besides the inductive risk associated with the appropriate level of evidence for choosing or rejecting hypotheses, there are a variety of other epistemic risks, such as whether the choice of model or theory is appropriate for the aims of the researcher or user. See, for example, J. B. Biddle and R. Kukla, “The Geography of Epistemic Risk,” in Exploring Inductive Risk, ed. K. Elliott and T. Richards (New York: Oxford University Press, 2017), 215–35.
- 8 Depending on one's views about ethics, one might not identify these as harms from the research. For the purposes here of considering the effects of research choices on third parties, we take a broader approach and identify the potential of the lost benefit to an individual as a risk.
- 9 M. J. Selgelid, “Ethics and Infectious Disease,” Bioethics 19, no. 3 (2005): 272–89.
- 10 Consider, for example, the decades of research on the “stress hypothesis” regarding stomach ulcers. Arguably, this work, though broadly thought to be worthwhile at the time, diverted time and attention from the actual cause of ulcers—Helicobacter pylori. These opportunity costs are compounded by a clustering effect, whereby scientists adopt those hypotheses, theories, and models that others have noted as initially promising. See K. Zollman, “The Communication Structure of Epistemic Communities,” Philosophy of Science 74, no. 5 (2007): 574–87.
- 11 Kimmelman, “Medical Research, Risk, and Bystanders.”
- 12 For instance, in arguing that gain-of-function experiments on potential pandemic pathogens ought to be curtailed, Nicholas Evans, Marc Lipsitch, and Meira Levinson appeal to the Nuremberg Code on human subjects research (N. G. Evans, M. Lipsitch, and M. Levinson, “The Ethics of Biosafety Considerations in Gain-of-Function Research Resulting in the Creation of Potential Pandemic Pathogens,” Journal of Medical Ethics 41, no. 11 [2015]: 901–8).
- 13 E. J. Emanuel, D. Wendler, and C. Grady, “What Makes Clinical Research Ethical?,” Journal of the American Medical Association 283 (2000): 2701–11, at 2705.
- 14 “Trials of War Criminals before the Nuremberg Military Tribunals under Control Council Law No. 10,” vol. 2 (Washington, DC: U.S. Government Printing Office, 1949), 181–82. The Nuremberg Code is excerpted by the National Institutes of Health at https://history.nih.gov/display/history/Nuremburg+Code.
- 15 U.S. Department of Health and Human Services, Protection of Human Subjects, 45 C.F.R. §46.111. This explicitly notes, “The IRB should not consider possible long-range effects of applying knowledge gained in the research (e.g., the possible effects of the research on public policy) as among those research risks that fall within the purview of its responsibility.”
- 16 The consideration of benefits, however, typically includes only those that are directly related to the individuals' participation qua participant. Typically, inducements and monetary compensation are not thought to be legitimate benefits to be maximized. See E. A. Largent and H. Fernandez Lynch, “Paying Research Participants: Regulatory Uncertainty, Conceptual Confusion, and a Path Forward,” Yale Journal of Health Policy, Law and Ethics 17, no. 1 (2017): 61–142.
- 17 S. K. Shah et al., Ethical Considerations for Zika Virus Human Challenge Trials (Bethesda, MD: National Institute of Allergy and Infectious Diseases, 2017), https://www.niaid.nih.gov/sites/default/filesEthicsZikaHumanChallengeStudiesReport2017.pdf.
- 18 National Science Advisory Board for Biosecurity, Recommendations for the Evaluation and Oversight of Proposed Gain-of-Function Research (Bethesda, MD: National Institutes of Health, 2016), 43.
- 19 U.S. Department of Health and Human Services, Protection of Human Subjects, 45 C.F.R. §46.111.
- 20 Emanuel, Wendler, and Grady, “What Makes Clinical Research Ethical?,” 2706.
- 21 S. Rayner et al., “The Oxford Principles,” Climatic Change 121, no. 3 (2013): 499–512.
- 22 K. Lloyd and J. White, “Democratizing Clinical Research,” Nature 474 (2011): 277–78.
- 23 V. Bush, Science, the Endless Frontier: A Report to the President by Vannevar Bush, Director of the Office of Scientific Research and Development, July 1945 (Washington, DC: U.S. Government Printing Office, 1945).
- 24 E. Yudkowsky, “Cognitive Biases Potentially Affecting Judgment of Global Risks,” in Global Catastrophic Risks, ed. N. Bostrom and M. Cirkovic (New York: Oxford University Press, 2008), 91–119.
- 25 U.S. Department of Health and Human Services, Public Health Emergency, United States Government Policy for Institutional Oversight of Life Sciences Dual Use Research of Concern (Washington, DC: U.S. Government, 2014).
- 26 S. K. Shah et al., “Bystander Risk, Social Value, and Ethics of Human Research,” Science 360 (2018): 158–59; N. Eyal, “How to Keep High-Risk Studies Ethical: Classifying Candidate Solutions,” Journal of Medical Ethics 43, no. 2 (2017): 74–77.
- 27 C. R. Sunstein, “Costs and Benefits,” chap. 6 in Laws of Fear: Beyond the Precautionary Principle (Cambridge: Cambridge University Press, 2005); C. R. Sunstein, “The Real World of Cost-Benefit Analysis: Thirty-Six Questions (and Almost As Many Answers),” Columbia Law Review 114 (2014): 167–211.
- 28 The NSABB's own expert report on the ethical dimensions of gain-of-function experiments notes that “it is doubtful that any clear algorithmic approach to evaluating GOFR would be justifiable or should be considered realistic or desirable” (M. J. Selgelid, Gain-of-Function Research: Ethical Analysis [Clayton, Australia: Monash University Centre for Human Bioethics, 2016], 32).
- 29 Large-scale analyses of the risks and benefits of different public health interventions, for instance, employ extremely complex (and much criticized) surveys to establish the precise weightings people accord to different health states. See J. A. Salomon et al., “Common Values in Assessing Health Outcomes from Disease and Injury: Disability Weights Measurement Study for the Global Burden of Disease Study 2010,” Lancet 380 (2012): 2129–43.
- 30 J. Nou, “Regulating the Rulemakers: A Proposal for Deliberative Cost-Benefit Analysis,” Yale Law & Policy Review 26, no. 2 (2008): 601–44; R. B. Reich, “Public Administration and Public Deliberation: An Interpretive Essay,” Yale Law Journal 94, no. 7 (1985): 1617–41.
- 31See, for instance, the arguments discussed in Jerry L. Mashaw's Reasoned Administration and Democratic Legitimacy: How Administrative Law Supports Democratic Government (Cambridge: Cambridge University Press, 2018).
- 32 See, for instance, the approach of Evans, Lipsitch, and Levinson, “The Ethics of Biosafety Considerations in Gain-of-Function Research.”
- 33 J. S. Mill, “Representative Democracy,” in On Liberty and Other Essays, ed. J. Gray (Oxford: Oxford University Press, 1991), 205–469; T. Christiano, The Rule of the Many: Fundamental Issues in Democratic Theory, 1st ed. (Boulder, CO: Westview Press, 1996); I. M. Young, Inclusion and Democracy (Oxford: Oxford University Press, 2000).
- 34 J. Rawls, A Theory of Justice (Cambridge, MA: Belknap Press, 1999), 199.
- 35 S. Jasanoff, States of Knowledge: The Co-production of Science and Social Order (Abingdon, UK: Routledge, 2004); J. Lezaun, N. Marres, and M. Tironi, “Experiments in Participation,” in The Handbook of Science and Technology Studies, 4th ed., ed. U. Felt et al. (Cambridge, MA: MIT Press, 2017); S. Shapin, “Discipline and Bounding: The History and Sociology of Science as Seen through the Externalism-Internalism Debate,” History of Science 30 (1992): 333–69; L. Winner, “Do Artifacts Have Politics?,” Daedalus 109, no. 1 (1980): 121–36.
- 36 S. E. Cozzens and E. J. Woodhouse, “Science, Government, and the Politics of Knowledge,” in Handbook of Science and Technology Studies, ed. S. Jasanoff et al. (Thousand Oaks, CA: Sage, 1995), 533–53.
- 37 S. Epstein, Impure Science: AIDS, Activism, and the Politics of Knowledge (Berkeley: University of California Press, 1996); Lezaun, Marres, and Tironi, “Experiments in Participation.”
- 38 See, for example, S. G. Harding, Whose Science? Whose Knowledge? Thinking from Women's Lives (Ithaca, NY: Cornell University Press, 1991), and E. F. Keller, Reflections on Gender and Science (New Haven, CT: Yale University Press, 1985).
- 39 H. E. Longino, Science as Social Knowledge: Values and Objectivity in Scientific Inquiry (Princeton, NJ: Princeton University Press, 1990).
- 40 D. M. Wenner, “The Social Value Requirement in Research: From the Transactional to the Basic Structure Model of Stakeholder Obligations,” Hastings Center Report 48, no. 6 (2018): 25–32.
- 41 A. J. London, “Justice and the Human Development Approach to International Research,” Hastings Center Report 35, no. 1 (2005): 24–37.
- 42 P. Kitcher, Science in a Democratic Society (Amherst, NY: Prometheus Books, 2011); J. Dupré, “Toward a Political Philosophy of Science,” chap. 8 in The Philosophy of Philip Kitcher, ed. M. Couch and J. Pfeifer (Oxford: Oxford University Press, 2016); S. A. Schroeder, “Using Democratic Values in Science: An Objection and (Partial) Response,” Philosophy of Science 84 (2017): 1044–54.
- 43 P. Dasgupta and P. A. David, “Toward a New Economics of Science,” Research Policy, special issue in honor of Nathan Rosenberg 23, no. 5 (1994): 487–521; Zollman, “The Communication Structure of Epistemic Communities”; K. Zollman, “The Credit Economy and the Economic Rationality of Science,” Journal of Philosophy 115, no. 1 (2018): 5–33.
- 44 For a similar argument with respect to the social value requirement, see Wenner, “The Social Value Requirement in Research.”
- 45 J. Rawls, The Law of Peoples: With “The Idea of Public Reason Revisited” (Cambridge, MA: Harvard University Press, 1999); C. Bellolio Badiola, “Science as Public Reason: A Restatement,” Res Publica 24, no. 4 (2018): 415–32.
- 46 A. J. London and J. Kimmelman, “Clinical Trial Portfolios: A Critical Oversight in Human Research Ethics, Drug Regulation, and Policy,” Hastings Center Report 49, no. 4 (2019): 31–41.
- 47 M. Lipsitch, “Can Limited Scientific Value of Potential Pandemic Pathogen Experiments Justify the Risks?,” mBio 5, no. 5 (2014): e02008–14.
- 48 Gryphon Scientific, Risk and Benefit Analysis of Gain of Function Research (Bethesda, MD: National Science Advisory Board for Biosecurity, 2015), http://gryphonsci.wpengine.com/wp-content/uploads/2018/12/Risk-and-Benefit-Analysis-of-Gain-of-Function-Research-Final-Report-1.pdf.
- 49 Institute of Medicine and National Research Council, Globalization, Biosecurity, and the Future of the Life Sciences (Washington, DC: National Academies Press, 2006), http://www.nap.edu/catalog/11567/globalization-biosecurity-and-the-future-of-the-life-sciences; F. Lentzos, “Countering Misuse of Life Sciences through Regulatory Multiplicity,” Science and Public Policy 35, no. 1 (2008): 55–64.
- 50 M. Lipsitch and T. V. Inglesby, “Moratorium on Research Intended to Create Novel Potential Pandemic Pathogens,” mBio 5, no. 6 (2014): e02366–14.
- 51 E. Wagner and J. Herington, “Agent-Based Models of Dual-Use Research Restrictions,” British Journal for the Philosophy of Science (April 20, 2019): doi:10.1093/bjps/axz017; M. J. Imperiale and A. Casadevall, “Vagueness and Costs of the Pause on Gain-of-Function (GOF) Experiments on Pathogens with Pandemic Potential, Including Influenza Virus,” mBio 5, no. 6 (2014): e02292–14.
- 52 Mashaw, Reasoned Administration and Democratic Legitimacy.
- 53 Nou, “Regulating the Rulemakers”; Reich, “Public Administration and Public Deliberation.”
- 54 S. Uniacke, “The Doctrine of Double Effect and the Ethics of Dual Use,” chap. 10 in On the Dual Uses of Science and Ethics: Principles, Practices, and Prospects, ed. B. Rappert and M. J. Selgelid (Canberra, Australia: Australian National University E Press, 2013), doi.org/10.22459/DUSE.12.2013.10.
- 55 Indeed, not holding scientists responsible for these risks might increase the chances of such misuses if scientists then pay less attention to such risks because they are not morally responsible—the interpersonal approach might actually exacerbate the externality of the risks.
- 56 See the argument in T. Douglas, “The Dual-Use Problem, Scientific Isolationism and the Division of Moral Labour,” Monash Bioethics Review 32, no. 1–2 (2014): 86–105.