Viewpoint
Mar 2023
Peer-Reviewed

Science and Ethics of “Curing” Misinformation

Isabelle Freiling, PhD, Nicole M. Krause, MA, and Dietram A. Scheufele, PhD
AMA J Ethics. 2023;25(3):E228-237. doi: 10.1001/amajethics.2023.228.

Abstract

A growing chorus of academicians, public health officials, and other science communicators has warned of what they see as an ill-informed public making poor personal or electoral decisions. Misinformation is often seen as an urgent new problem, so some members of these communities have pushed for quick but untested solutions without carefully diagnosing ethical pitfalls of rushed interventions. This article argues that attempts to “cure” public opinion that are inconsistent with best available social science evidence not only leave the scientific community vulnerable to long-term reputational damage but also raise significant ethical questions. It also suggests strategies for communicating science and health information equitably, effectively, and ethically to affected audiences without undermining their agency over what to do with that information.

Current Myopia About Misinformation

“I believe that misinformation is now our leading cause of death,” US Food and Drug Administration Commissioner Robert Califf tweeted in April 2022, “and we must do something about it.”1 Diagnoses like this one are understandable. Normative democratic ideals assume that the best available scientific evidence informs both societal and individual decisions and crowds out misperceptions, disinformation, or conspiratorial thinking. Because of what it sees as a recent deviation from these norms in our current information environment, the World Health Organization has warned about an “infodemic,”2 or a deluge of information—some of it inaccurate—that makes it difficult for citizens to separate signal from noise.

The assumption that misinformation is a new problem and Califf’s diagnosis of it raise several questions. Are there, in fact, reliable bodies of evidence that demonstrate that misinformation among public audiences is (a) more widespread now than it has been in the past and (b) causally rather than correlationally linked to behavioral choices or attitudes that might be harmful to societal or individual well-being? As we have shown elsewhere, the answer to both questions, at the moment, is “no.”3 Given the problem’s uncertain diagnosis and prognosis, the answers to various questions regarding solutions quickly become complicated. What, if anything, constitutes an appropriate way to intervene on misinformation and its behavioral correlates, such as vaccine hesitancy? Does this answer change for different audiences (eg, medical professionals vs nonexperts)? When, if at all, should informational correctives or interventions targeting behavioral correlates of misinformation occur? To whom should they be delivered, in what form, and with what degree of certainty? And who should deliver them—scientists, the government, or other actors?

The complexity of the problem of misinformation and the weak evidence base for both its diagnosis and prognosis have not stopped many in the academic community from urging large-scale interventions to stop the spread and uptake of misinformation about science and health.4,5 These interventions raise a number of ethical concerns, all of which we explain in more detail below. First, some of these interventions are driven by a (sub)conscious desire among scientists to shape rather than inform public policy and to change how the public consumes public health information. This is not to say that scientists should not engage in continuous dialogue with policy makers6 about how science can inform policy options or with other “publics”7 about individual citizen choices. Some scientists’ strategy of urging policy makers or citizens to “just follow the science,” however, is not only naïve with respect to its likely success but also normatively at odds with how democratic societies make policy choices. As Stevens succinctly put it: “[T]he process of organising knowledge for policy through advisory committee is political, as well as scientific.… So when a government claims to be ‘following the science’ in response to a global pandemic, we need to treat this claim with caution.”8 We will return to this concern below.

The intention to influence policy choices or change citizens’ behavior by solely following the science, unmediated by politics—concerning in itself—is even more troubling in light of the second ethical issue: corrective interventions for misinformation can have unintended effects that undermine scientists’ credibility, raise ethical dilemmas, or create additional vulnerabilities for some populations. Finally, many of these interventions—ironically—are at odds with the best available scientific evidence about how to effectively achieve different science communication goals across diverse populations and in new media environments driven by algorithms.

Informing Policy vs Social Engineering

Scientists’ instinctual desire to intervene in what they see as a worsening misinformation problem comes with a temptation to claim authority over policy questions that do not solely have scientific answers. For example, should we mandate COVID-19 vaccines in elementary schools if those vaccines have been proven to be safe for children between 5 and 12 years of age? The second part of this question—ie, whether vaccines are “safe” according to a given technical standard—is one that science has the competence and authority to answer. However, the first part of this question—the vaccine mandate—is a question of public policy that should be determined by democratic processes involving diverse groups of stakeholders. Although science can function in policy-related deliberations as a factual evidentiary authority—for example, by providing insights as to the comparative health risks and benefits of vaccines—the weight of scientists’ authority relative to that of others in policy making processes is determined by democratic institutions.

This distinction between empirical and policy questions is captured in the term policy itself. In the polis, or the Greek ideal city-state, communities are run by citizens. Scientists and scientific institutions, as one component of our larger polis, are uniquely positioned to inform policy decisions with reliable evidence, but scientific evidence is just one of many input streams to policy choices. As such, scientific evidence competes in democratic decision-making processes with other kinds of considerations, including societal values, strategic goals, and social norms.

As we argued early in the pandemic,9 science will risk losing public support—especially among some partisan groups—if it blurs the boundaries between empirical questions that it is qualified to answer and policy questions that can only be addressed as part of broader public deliberations about facts, values, and societal priorities. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases and chief medical advisor to the US president, demonstrates how to successfully walk this very fine line. When asked whether he would recommend a vaccination requirement for domestic travel in December of 2021, for example, he responded: “The President takes all recommendations, all discussions, and, as a group, we make a decision about what’s best to do.”10

Recognizing the need for science to be seen as a good-faith arbiter of reliable evidence to inform policy, some in the academic community have shifted their attention to maximizing public uptake of reliable information and preventing the spread of misleading claims. The COVID-19 pandemic, not surprisingly, put significant strain on this approach, given that scientists were tempted to counter misinformation they knew to be wrong with rapidly evolving scientific evidence that itself often turned out to be wrong or incomplete. For example, former President Trump’s misleading claims about the benefits of hydroxychloroquine were quickly debunked by fact-checkers based on an article that had been published in the Lancet and was later retracted.3,11 The retraction led some of Trump’s defenders to label fact-checks and articles published in medical journals contradicting Trump’s claims as essentially biased political attacks.12 The damage was done, even though a more recent study published in the Lancet Regional Health Americas found that hydroxychloroquine has no benefit in reducing the risk of hospitalization for COVID-19.13 As this example shows, the possible benefits of fact-checking scientific claims in contexts of high scientific uncertainty may simply fail to outweigh the risks and unintended consequences of undermining scientific authority.

Attempts to manage public opinion by socially engineering the ways that citizens use and interpret information are riddled with ethical pitfalls. It might be tempting to believe that, by “sticking to the facts,” scientific communicators remain neutral brokers of information whose purview encompasses interventions to combat misinformation. However, it is possible for scientific actors to use the “facts” to communicate in ethically questionable ways.14 For example, there has been increasing attention to communication approaches designed to “inoculate” members of the public against developing misperceptions about science as a sort of preventive intervention strategy.15 Psychological inoculation involves warning audiences about the existence of certain misinformation claims before they are exposed to them (eg, the claim that “over 31,000 scientists have signed a petition that there is no scientific evidence for human-caused global warming”), and then following that warning with a factual refutation (eg, “97% of climate scientists have concluded that human-caused global warming is happening”).16 Rooted in propaganda research conducted during World War II,17 inoculation research is designed to influence a priori how people will process false claims when they encounter them in order to prevent the uptake of certain beliefs (typically beliefs that scientists consider false or misleading).16 Yet attempts to limit public discourse, no matter how misguided that discourse may seem, are antithetical to the very idea of science, which, over time, builds an increasingly sound epistemological account of the world through the open contestation of competing truth claims. Attempts by scientific institutions to steer or moderate public debate through inoculation therefore risk doing irreparable reputational damage to science’s claim to be a neutral arbiter of truth.

Some might argue that scientists are uniquely positioned to steer public debate, given their reputation as neutral observers whose conclusions—in an ideal world—are bound only by facts. This reputation does not always align with scientific data, however, as scientists exhibit motivated reasoning and identity-driven conclusions similar to those of nonexpert citizens. For instance, male scientists rate research as lower in quality if it challenges gender bias within the academy.18 Similarly, research conducted by the third author (D.A.S.) and colleagues shows that scientists’ policy judgments about regulation of new technologies are shaped by their personal ideology and other belief systems, even after controlling for their scientific judgments about potential risks and benefits.19

Perhaps more importantly, however, inoculation and related interventions are techniques that by design rely on audiences not consenting to the “treatment.” If scientists told audiences that they had selectively exposed them to small doses of counter-messages in news or social media in order to change the way they interpret information down the road, the corrective effects might evaporate. More perniciously, scientists would rightfully come under immediate attack from political actors for not staying in their “lane.” A constantly changing knowledge base would exacerbate this undermining of scientific authority and could create ethical dilemmas for scientists if, in rare cases, they were aware that they might be inoculating against messages that would turn out not to be completely false as more science emerged, as occurred early in the COVID-19 pandemic.13 In other words, informational inoculation without consent—which would be unthinkable for vaccines or other medical treatments—is both normatively troubling and potentially disastrous as a public relations problem for science.

Our concern about social engineering approaches, such as inoculation, however, should not be interpreted as advocacy for a naïve understanding of a marketplace of ideas in which “false” claims will eventually give way to “true” claims in public discourse. They will not. Even if true claims were easily identifiable (which they often are not),3 evidence-backed claims that fail to meaningfully connect to societal values and preferences are unlikely to win hearts and minds when they compete in modern media ecosystems against well-packaged falsehoods.14,20 It therefore makes sense to use research insights from the behavioral and social sciences, for example, to frame science-related messaging in ways that will resonate with specific audiences who might otherwise be unmotivated to recognize certain issues as relevant, important, or valuable to them.21 However, rather than packaging information in ways that increase target audiences’ likelihood of considering a given argument or of seeing certain facts in new ways, inoculation essentially enables the institutions that use it to paternalistically decide that certain claims are blights on public discourse and then to stealthily diminish the power of those claims by encouraging people to discount them, thereby dramatically reducing their spread.

Similar charges of paternalism can be leveled against “nudging” initiatives. Nudging, which is sometimes euphemistically described as “enhanced active choice,”22 is a social engineering strategy that shapes information delivery and decision-making processes in ways that push people’s behavior in a desired direction. To be clear, nudging is not designed to educate, as inoculation sometimes is. Instead, nudging is intended primarily to orchestrate outcomes of interest, typically outside conscious awareness.23 An example of nudging is when transportation authorities set organ donation as the default when people obtain their driver’s license—ie, rather than asking people to opt in to organ donation, people must instead opt out. When former President Barack Obama’s Social and Behavioral Sciences Team implemented nudging strategies—eg, by redesigning communications to encourage military service members to contribute to their retirement plans24—the policies were, unsurprisingly, criticized as affronts to individual liberty on the part of condescending government institutions. “To be clear,” noted Richard Williams in Politico, “Congress did not pass legislation authorizing such activity; this is something dreamt up by bureaucracies to force their own preferences on citizens.”25

The concerns we have expressed about nudging are not meant to detract from the importance and power of social science research. Nudges can serve as valuable tools to achieve certain goals, such as mask-wearing during a public health crisis like the COVID-19 pandemic, but the question of whether orchestrating any given outcome qualifies as using “nudging for good” is often difficult to answer and will depend on cost-benefit analyses that incorporate competing value systems and priorities. Is nudging people to wear masks for the sake of public health a justifiable use of nudging if evidence about possible detrimental effects of mask-wearing on the psychological development of young children26—or on racial profiling of Black people27—remains unclear, or at least difficult to quantify? The answer is likely still “yes, nudging to encourage mask-wearing is important,” but, as this example suggests, we need to consider the perspectives of diverse stakeholders (including scientists in specialties other than public health) in our decision-making process, as well as in the careful design and dissemination of nudging communications.

While reasonable arguments can be made both in favor of and against policy makers or government institutions using nudging strategies, the idea of scientists themselves trying to nudge publics either in their information processing or behaviors is riddled with ethical landmines. As we discussed earlier, the reputational risks that scientists and scientific institutions face when they engage in such social engineering are serious, and they will only intensify in contexts in which the science at hand is controversial or the scientific evidence underlying the social engineering strategy is rapidly shifting. Indeed, scientists who engage in inoculation or nudging will likely be perceived by some as condescending or paternalistic, as participating in an unethical overreach of their institutional authority, or even as hypocritically undermining the open contestation of knowledge that is core to scientific philosophy and epistemology.9

Communicating Science

When Congress established the US land-grant system with the passage of the Morrill Act over 160 years ago, it did so with the intent to support not only the growth of scientific knowledge but also the communication of scientific information.28 As Congressman Morrill put it: to “give intelligence to those who will esteem it.… Let us have such colleges … to announce facts and fixed laws … and broadcast that knowledge.”29 As we have argued in this essay, the communication of reliable scientific information in our current highly politicized and competitive information ecologies has many ethical pitfalls. Relying on the best available social science evidence to help guide these efforts is therefore foundational to science’s ability to fulfill what some have called its “social contract.”30 It follows that it would be unethical for scientists not to do everything they can to ensure that the benefits of their work reach all cross-sections of society.

This ethical mandate—and the utility of social science research in fulfilling it—is illustrated powerfully by concerns within the scientific community about a lack of trust among African American communities and other populations that historically have been at the receiving end of unethical treatment (or the lack of treatment) by parts of the medical community. Calls to rebuild trust are often well-intentioned but focus on symptoms rather than the underlying causes. For example, expectant African American mothers continue to face mortality rates up to 3 times higher than those of White mothers.31 A lack of trust in the medical community, in other words, might be much less a function of historical mistreatment than of current inequities in health outcomes. Any attempt to rebuild trust through outreach and communication without first addressing these kinds of inequities is disingenuous at best and unethical at worst. But we know from decades of social science research that citizens with higher income and education levels will benefit much more from health information campaigns than people with lower levels of income or education.32 These “knowledge gaps,” as sociologist Phil Tichenor and colleagues called them in the 1970s,33 will widen as more information becomes available, favoring the already information-rich and leaving already vulnerable populations less (accurately) informed.

Given some communities’ lack of trust in science and the existence of knowledge gaps, new information needs to be framed in ways that align with how different publics make sense of information. Decades of research in communication science, sociology, political science, and psychology have shown that the same information is interpreted very differently by audiences when presented in ways that either resonate or do not resonate with their respective interpretive schemas and worldviews.21 When scientists communicate without providing audience-relevant context and framing that resonate with citizens’ (rather than their own) value and belief systems, their messaging is likely to favor groups who are already most interested in science and aligned with the scientific community, leaving behind groups that are often most vulnerable and underserved by paywalled science journalism in elite media outlets. Data collected during the pandemic about a lack of public buy-in for “vaccine passports” provide powerful proof of how effective alternative framings can be. While conservative audiences objected to the term passport, which resonated with their concerns about government overreach and federal oversight, they were much more likely to support vaccine “verification,” which frames the issue of showing vaccination cards as one of individual choice and responsibility.34

Of course, not every frame is meaningful to all publics, especially in an era when hyperpartisanship is the new normal. We all engage in motivated reasoning, especially when processing information that contradicts our values.35 Similarly, many of us navigate online environments at least partly defined by filter bubbles that echo voices and sources consistent with our prior views and preferences.36,37 We need to take these realities into account rather than treating those whom we are trying to persuade as the only ones engaging in motivated reasoning and blaming them and the filter bubbles they inhabit for adverse outcomes or even for exacerbating the problem. For instance, the scientific community has a poor track record of meaningfully communicating the value of science to some sectors of society, such as religious people or conservative audiences.38 And some of these wounds might be self-inflicted: we should not be surprised when science is perceived as partisan if prominent scientist communicators regularly mock Republicans and communities of faith on social media.38,39

COVID-19 has demonstrated powerfully how unprepared science was to give answers to rapidly emerging and urgent policy problems. Not only were there high-profile retractions of published research11 and preprint-based overclaims in popular media, but science during the pandemic was also conducted much faster than normal and under immense public scrutiny.9 Even with these challenges, however, science continues to be the best way that societies have for producing and curating reliable information. Seeing members of the public as partners in solving large societal challenges, such as COVID-19, rather than as patients with attitudinal or behavioral pathologies that need to be fixed will be a prerequisite for scientists’ continued ability to inform the urgent policy choices that are coming our way.

References

  1. @DrCaliff_FDA. Additionally, I believe that misinformation is now our leading cause of death, and we must do something about it. April 29, 2022. Accessed April 30, 2022. https://twitter.com/drcaliff_fda/status/1520110323444985856

  2. World Health Organization. Novel coronavirus (2019-nCoV) situation report 13. World Health Organization; 2020. Accessed April 10, 2020. https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf

  3. Krause NM, Freiling I, Scheufele DA. The “infodemic” infodemic: toward a more nuanced understanding of truth-claims and the need for (not) combatting misinformation. Ann Am Acad Pol Soc Sci. 2022;700(1):112-123.
  4. Roozenbeek J, van der Linden S, Nygren T. Prebunking interventions based on “inoculation” theory can reduce susceptibility to misinformation across cultures. Harv Kennedy Sch Misinformation Rev. 2022;1(2):1-23.
  5. Roozenbeek J, van der Linden S, Goldberg B, Rathje S, Lewandowsky S. Psychological inoculation improves resilience against misinformation on social media. Sci Adv. 2022;8(34):eabo6254.

  6. Levine AS. Unmet desire. Issues Sci Technol. 2022;38(3):27-31.
  7. Scheufele DA, Krause NM, Freiling I, Brossard D. What we know about effective public engagement on CRISPR and beyond. Proc Natl Acad Sci U S A. 2021;118(22):e2004835117.

  8. Stevens A. Governments cannot just “follow the science” on COVID-19. Nat Hum Behav. 2020;4:560.

  9. Scheufele DA, Krause NM, Freiling I, Brossard D. How not to lose the COVID-19 communication war. Issues Sci Technol. April 17, 2020. Accessed January 19, 2022. https://issues.org/covid-19-communication-war/

  10. Dr Fauci: Omicron is not something to be taken lightly. MSNBC. December 27, 2021. Accessed April 13, 2022. https://www.msnbc.com/morning-joe/watch/dr-anthony-fauci-omicron-is-not-something-to-be-taken-lightly-129600069823

  11. Mehra MR, Ruschitzka F, Patel AN. Retraction—hydroxychloroquine or chloroquine with or without a macrolide for treatment of COVID-19: a multinational registry analysis. Lancet. 2020;395(10240):1820.

  12. Delingpole J. Lancetgate is a humiliation for Trump’s medical critics. Breitbart. June 7, 2020. Accessed August 26, 2022. https://www.breitbart.com/europe/2020/06/07/lancetgate-is-a-humiliation-for-trumps-medical-critics/

  13. Avezum Á, Oliveira GBF, Oliveira H, et al; COPE-Coalition COVID-19 Brazil V Investigators. Hydroxychloroquine versus placebo in the treatment of non-hospitalised patients with COVID-19 (COPE-Coalition V): a double-blind, multicentre, randomised, controlled trial. Lancet Reg Health Am. 2022;11:100243.

  14. Yeo SK, McKasy M. Emotion and humor as misinformation antidotes. Proc Natl Acad Sci U S A. 2021;118(15):e2002484118.

  15. Scheufele DA, Krause NM, Freiling I. Misinformed about the “infodemic”? Science’s ongoing struggle with misinformation. J Appl Res Mem Cogn. 2021;10(4):522-526.
  16. Lewandowsky S, van der Linden S. Countering misinformation and fake news through inoculation and prebunking. Eur Rev Soc Psychol. 2021;32(2):348-384.
  17. McGuire WJ. The effectiveness of supportive and refutational defenses in immunizing and restoring beliefs against persuasion. Sociometry. 1961;24(2):184-197.
  18. Handley IM, Brown ER, Moss-Racusin CA, Smith JL. Quality of evidence revealing subtle gender biases in science is in the eye of the beholder. Proc Natl Acad Sci U S A. 2015;112(43):13201-13206.
  19. Corley EA, Scheufele DA, Hu Q. Of risks and regulations: how leading US nanoscientists form policy stances about nanotechnology. J Nanoparticle Res. 2009;11(7):1573-1585.
  20. Scheufele DA, Krause NM. Science audiences, misinformation, and fake news. Proc Natl Acad Sci U S A. 2019;116(16):7662-7669.
  21. Scheufele DA. Science communication as political communication. Proc Natl Acad Sci U S A. 2014;111(suppl 4):13585-13592.
  22. Keller PA, Harlam B, Loewenstein G, Volpp KG. Enhanced active choice: a new method to motivate behavior change. J Consum Psychol. 2011;21(4):376-383.
  23. National Academy of Sciences. The Science of Science Communication II: Summary of a Colloquium. National Academies Press; 2014.

  24. Shankar M. Using behavioral science insights to make government more effective, simpler, and more user-friendly. The White House blog. February 9, 2015. Accessed August 28, 2022. https://obamawhitehouse.archives.gov/blog/2015/02/09/behavioral-science-insights-make-government-more-effective-simpler-and-more-user-fri

  25. Williams R. Obama’s budding nanny state. Politico. December 8, 2013. Accessed September 6, 2022. https://www.politico.com/magazine/story/2013/12/obamas-nanny-state-100848/

  26. Spitzer M. Masked education? The benefits and burdens of wearing face masks in schools during the current corona pandemic. Trends Neurosci Educ. 2020;20:100138.

  27. Kahn KB, Money EEL. (Un)masking threat: racial minorities experience race-based social identity threat wearing face masks during COVID-19. Group Process Intergroup Relat. 2022;25(4):871-891.
  28. Scheufele DA. Thirty years of science-society interfaces: what’s next? Public Underst Sci. 2022;31(3):297-304.

  29. Morrill JS. Speech of Hon Justin S. Morrill, of Vermont, on the bill granting lands for agricultural colleges, delivered in the House of Representatives, April 20, 1858. Congressional Globe Office; 1858. Accessed October 26, 2022. https://www.loc.gov/resource/gdcmassbookdig.speechofhonjusti01morr/?st=pdf&pdfPage=1

  30. Lubchenco J. Entering the century of the environment: a new social contract for science. Science. 1998;279(5350):491-497.
  31. Working together to reduce Black maternal mortality. Centers for Disease Control and Prevention. April 6, 2022. Accessed August 28, 2022. https://www.cdc.gov/healthequity/features/maternal-mortality/index.html#:~:text=Black%20women%20are%20three%20times,structural%20racism%2C%20and%20implicit%20bias

  32. Lind F, Boomgaarden HG. What we do and don’t know: a meta-analysis of the knowledge gap hypothesis. Ann Int Commun Assoc. 2019;43(3):210-224.
  33. Tichenor PJ, Donohue GA, Olien CN. Mass media flow and differential growth in knowledge. Public Opin Q. 1970;34(2):159-170.
  34. Inskeep S. Poll finds Republicans particularly opposed to “vaccine passport” messaging. NPR. April 12, 2021. Accessed August 28, 2022. https://www.npr.org/2021/04/12/986409983/poll-finds-republicans-particularly-opposed-to-vaccine-passport-messaging

  35. Kunda Z. The case for motivated reasoning. Psychol Bull. 1990;108(3):480-498.
  36. Brossard D, Scheufele DA. The chronic growing pains of communicating science online. Science. 2022;375(6581):613-614.
  37. Sunstein CR. Republic.com 2.0. Princeton University Press; 2007.

  38. Krause NM, Scheufele DA, Freiling I, Brossard D. The trust fallacy: scientists’ search for public pathologies is unhealthy, unhelpful, and ultimately unscientific. Am Sci. 2021;109(4):226-231.
  39. Unsworth A, Voas D. The Dawkins effect? Celebrity scientists, (non) religious publics and changed attitudes to evolution. Public Underst Sci. 2021;30(4):434-454.

Conflict of Interest Disclosure

The author(s) had no conflicts of interest to disclose.

The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.