The spread of misinformation about science, whether accidental or deliberate, is not new. Long before the advent of electronic media, false claims about science appeared in news publications. The modern era of the U.S. Food and Drug Administration began with the 1906 Federal Food and Drugs Act, enacted in response to widespread misinformation about the efficacy and safety of drugs, food additives, and biological substances. Over the past decade, however, concerns about the spread of misinformation about science and the overall role of scientific expertise in civic dialogue have grown significantly. Such concerns are motivated by the belief that misinformation about science can lead to harmful outcomes for individuals, communities, and societies, such as ill-informed personal choices about disease treatment, higher rates of death from vaccine-preventable diseases, failures to respond appropriately to public health emergencies and natural disasters, and limitations on productive debate about addressing issues like climate change. The growing concerns about the potential harmful effects of misinformation about science have also led to a rapid increase in research across multiple disciplines to better understand and address this phenomenon.
With support from the National Science Foundation, the National Academies of Sciences, Engineering, and Medicine convened a study committee charged with bringing together multiple lines of research to develop a more comprehensive understanding of the sources, spread, and impacts of misinformation about science and effective strategies for mitigation. Specifically, the committee was tasked with characterizing the nature and scope of misinformation about science and its impacts on individuals, communities,
and society; identifying effective solutions for mitigating its spread; providing actionable guidance toward reducing associated harms; and outlining priorities for future research.
In both public discourse and peer-reviewed research, misinformation has been used as an umbrella term to refer to various types of false, inaccurate, incorrect, and misleading information. The broad nature of the term has made it difficult to develop a coherent understanding of the nature, scope, and impacts of misinformation, and by extension, misinformation about science. To provide clarity and focus the committee’s analysis, the committee developed the following definition: misinformation about science is information that asserts or implies claims that are inconsistent with the weight of accepted scientific evidence at the time (reflecting both the quality and quantity of evidence). Which claims are determined to be misinformation about science can evolve over time as new evidence accumulates and scientific knowledge regarding those claims advances. Relatedly, the committee defines disinformation about science as a sub-category of misinformation circulated by agents who are aware that the science information they are sharing is false (see Chapter 2).
In the course of its work, the committee identified a number of ways to advance understanding of misinformation about science and intervene, when needed, to the greatest effect. In developing recommendations, the committee prioritized actions to address misinformation about science based on relative potential for harm and also with consideration for today’s complex information ecosystem, which requires concerted, multi-level action by a diversity of actors (see Chapter 9 for a deeper discussion of the report’s recommendations).
Through its review, the committee found that misinformation about science can originate from a diversity of sources and types of media, including but not limited to corporations, governments and politicians, alternative health and science industries, entertainment media, nongovernmental organizations, science organizations and institutions, press offices and news media organizations, individual scientists, and ordinary citizens.1 Reasons and/or motivations for disseminating misinformation about science are diverse, but misinformation about science has greater potential for influence when it
___________________
1 Citations for the information presented in this summary can be found in the main text.
Importantly, systematic campaigns intended to mislead the public about science-related issues, such as climate change, the consequences of tobacco use, and heart disease, are of particular concern given the associated negative outcomes for individuals and society.
Universities, research organizations, and funders of scientific research are key sources of science information. Occasionally, misinformation about science originates from reputable science organizations, institutions, universities, and individual scientists or healthcare professionals, whether as a byproduct of poor science communication, distortion of scientific data, dissemination of research findings before they are formally vetted and substantiated, or, in the worst cases, scientific fraud. Misrepresentation and misreporting of scientific studies, medical developments, and health issues by press offices, journalists, and medical professionals are also ways that misinformation about science may unintentionally arise from authoritative sources. Science and medicine are among the most trusted institutions in today’s society; it is therefore important that the reliability of information on critical science issues from these sources is not compromised by misinformation.
Recommendation 1: Some corporations, strategic communication companies, and non-profit organizations have at times embarked on systematic campaigns to mislead the public, with negative consequences for individuals and society. Universities, researchers, and civil society organizations should work together to proactively counter such campaigns, using evidence from science and science communication to mitigate their impact. For example, researchers, government, and advocacy organizations have come together to counter campaigns from the tobacco industry and thereby reduce the public health impact of tobacco use. Similar efforts should be made for other scientific topics of public interest.
Recommendation 2: To ensure the promotion of accurate science information and reduce the spread of misinformation or misleading information from the scientific community:
Recommendation 3: Scientists and medical professionals who are active in the public arena can play a critical role in communicating accurate and reliable science and health information to the public.
Alongside the reasons and/or motivations discussed above for the dissemination of misinformation about science from various sources, the committee also identified other factors that contribute to its spread at the individual, institutional, community, and societal levels. These factors include the following:
Though inaccuracy in scientific claims is a long-standing issue, the diffusion of such claims has become more visible within the contemporary information ecosystem, which operates across technology platforms and in-person and virtual spaces that increase the volume, production, speed, and spread of information. Science information can travel quickly through this ecosystem across different channels and media types (e.g., online platforms, broadcast media, websites) and, in some cases, becomes divorced from the original context needed to appropriately evaluate its accuracy and reliability.
Additionally, the rise of more participatory online environments (e.g., on social media platforms) has enabled greater information exchange across different social and professional networks but has also blurred the lines between reliable and unreliable science information. At times, this blurring can be exacerbated by generative artificial intelligence (AI). Such factors make it more challenging for consumers of information to navigate online environments, and specifically, to assess scientific expertise and the credibility of science information across sources.
Recommendation 4: To promote the dissemination of and broad access to evidence-based science information, funders of scientific research (e.g., federal science agencies, non-profit and philanthropic foundations) and non-partisan professional science organizations (e.g., American Association for the Advancement of Science, American Association for Cancer Research, American Psychological Association, American Society of Plant Biologists) should establish and fund an independent, non-partisan consortium that can identify and curate sources of high-quality science information (i.e., supported by the weight of evidence, in both quantity and quality) on topics of public interest. The consortium should also regularly review the science information from these sources for accuracy and relevance to public needs. It is particularly critical to ensure that access to such science information is openly and equitably available to all groups, especially underserved groups. Additional possible functions of the consortium could include the following:
Recommendation 5: Online platforms, including search engines and social media, are major disseminators of true and false science information. These platforms should prioritize and foreground evidence-based science information that is understandable to different audiences, working closely with non-profit, non-partisan professional science societies and organizations to identify such information.
Many adults in the United States get their science information from news media outlets, making the quality and quantity of science news production increasingly important. At the same time, decreases in funding within journalism have led to significant reductions in news coverage, especially at local levels. These cutbacks have also meant that journalists who lack specialized training in science are being assigned to cover science and health news, and this lack of expertise can make it challenging to correctly interpret scientific research and properly contextualize findings in their reporting. Additionally, limited capacity, expertise, and resources can create science news deserts, which enable misinformation about science to spread more easily.
Journalists, editors, writers, and media organizations covering science, medical, and health issues (regardless of assigned specialty areas) serve as critical mediators between producers of scientific knowledge and consumers of science information. Local news, in particular, has broad reach and is trusted by many Americans, making it potentially valuable for mitigating misinformation about science. However, several factors may make science reporting particularly prone to the unintentional spread of misinformation about science, including
Recommendation 6: To support and promote high-quality science, health, and medical journalism:
Recommendation 7: In training the next generation of professional communicators in journalism, public relations, and other media and communication industries, universities and other providers of communication training programs should design learning experiences that integrate disciplinary knowledge and practices from communication research and various sciences and support the development of competencies in scientific and data literacy and reasoning. These competencies should be reinforced through continuous learning opportunities offered by organizations that support mass communication and journalism professionals.
Characteristics of online platforms (i.e., search engines and social media) can also contribute to the spread of misinformation about science, including design and algorithmic choices that constrain the information an individual might see (e.g., those shaping individualized feeds based on prior platform activity), permissive and loosely enforced or hard-to-enforce terms of service, and limited content moderation. These conditions can also make it easier for dedicated purveyors (both individuals and institutions) to spread misinformation about science online; however, it may be difficult to convince companies to change these conditions voluntarily when doing so might conflict with other business priorities, such as maximizing the number of users or attracting advertisers. Some countries have developed regulatory approaches to content moderation online, but long-standing free speech protections, while desirable, may make it challenging to readily adopt such approaches in the United States.
Further adding to the complexity of the matter, people share misinformation about science through social media platforms both intentionally and unintentionally. In general, there is strong evidence that people prefer sharing true, rather than false, information and share information with good intentions, such as to help or warn loved ones. However, individuals may unintentionally share misinformation about science due to confusion about the credibility of the information and inattention to accuracy, among other reasons. On the other hand, individuals and institutions may knowingly share misinformation about science in order to profit financially, accrue social rewards (e.g., followers and likes), gain and maintain power, erode trust, or disrupt the existing social order and create chaos (e.g., trolling). Social media environments may especially incentivize these motivations.
The need for high-quality science information and the potential for the spread of misinformation about science are particularly high during times of emergencies, disasters, threats, and emerging crises. Furthermore, when uncertainty and interest are both high, journalists (national and local) become critical frontline communicators of science information. Experts on emergency preparedness, disaster response, and environmental threat mitigation (e.g., government agencies and civil society organizations) could also be important sources of credible science information for the public during such times.
Recommendation 8: Government agencies at national, state, and local levels (e.g., Federal Emergency Management Agency, Centers for Disease Control and Prevention, Food and Drug Administration, state public health departments) and civil society organizations (e.g., the Association of State and Territorial Health Officials and the National Association of County and City Health Officials) that deliver services during times of public health emergencies, natural disasters, threats, and new crises should contribute proactively to building and maintaining preparedness capacity for communicating science information at national, state, and local levels by:
Social trust is another important factor that shapes people’s relationship to information, influencing whether they are willing to rely on a particular source of information for personal use. In recent years, concerns have been raised about declining public trust in science as a possible facilitator in the spread of misinformation about science. The committee found that trust in science has recently declined similarly to or less than trust in other civic, cultural, and governmental institutions. However, trust in science has been relatively stable over the last five decades, though levels of trust have varied significantly by partisan identity as well as among different groups depending on the science topic, the scientist(s) or science organization(s) being considered, or respective histories and experiences with science-related institutions.
Importantly, some powerful purveyors of misinformation about science have leveraged the relatively high trust in science and the authoritative “voice” of science to facilitate spread of misinformation (see Chapter 5). Examples of some of the strategies used include the following:
In light of this, the committee sees a great need for more scrutiny and accountability, as well as more tools and supports for consumers of information (individuals and institutions). Specifically, there is a need for continuous monitoring of the current information ecosystem concerning the production, spread, and impacts of misinformation about science. Such a process, akin to monitoring for signals of epidemics, could better support institutions and individuals in navigating the complexities of the current information ecosystem, including proactively managing misinformation about science.
Relatedly, regulatory structures such as legislation could also improve the current information ecosystem, but the evidence on the effectiveness of such approaches in the United States context is still emerging.
Recommendation 9: Professional scientific organizations, philanthropic organizations engaged in supporting scientific research, and media organizations should collaborate to support an independent entity or entities to track and document the origins, spread, and impact of misinformation across different platforms and communication spheres. The data produced through this effort should be made publicly available and widely disseminated. Various entities, including public health emergency operations centers, can serve as potential models for such collaborative efforts.
Negative impacts of misinformation about science have been widely, but also unevenly, documented and evidenced across levels, with most research focused on individual-level impacts. The most well-documented impact of misinformation is that it can cause individuals to develop or hold misbeliefs, and these misbeliefs can potentially disrupt the ability of individuals to make informed decisions for themselves, their families, or their communities. Impacts beyond the individual level have been more challenging to measure, given that some societal harms are most consequential in how they accumulate over time. Additionally, while a direct causal link between misinformation about science and detrimental behaviors and actions has not been definitively established, the current body of evidence indicates that misinformation plays a role in shaping behaviors that, in some cases, result in negative consequences for individuals, communities, and societies. Misinformation about science has great potential to disrupt individual agency and collective decision making, to exacerbate existing harms (e.g., health disparities, discrimination), to distort public opinion in ways that limit productive debate, and to diminish trust in institutions that are important to a healthy democracy.
Misinformation about science that involves specific communities and populations can also create or reinforce stereotypes, bias, and false narratives that cause further harm to such groups (e.g., the promulgation of racialized discourses that stoke violence). Relatedly, some populations have been specifically targeted by misinformation (e.g., African Americans, immigrants, low-income communities), and some of the most troubling cases involve matters of public health concern, such as vaccines and smoking. Given that health, educational, and wealth disparities across social groups already contribute to inequitable access to resources that support well-being (including credible science information), the impacts on communities that are typically targeted by misinformation about science may be compounded.
The committee identified many moderators of the differential impacts of misinformation about science at the individual and community levels, which may inform intervention efforts. While all people have the potential to believe misinformation, individuals are more likely to engage with misinformation and ultimately believe it when it aligns with their worldview and values, originates from a source they trust, is repeated, and/or is about a topic for which they lack strong preexisting attitudes and beliefs. The committee also found that while science literacy is an important factor in how people process and interpret science information, including misinformation, the empirical evidence suggests that science literacy alone does not ensure that an individual will be less prone to believing misinformation about science.
Within the contemporary information ecosystem, people are differentially situated with respect to science information. Social factors such as race/ethnicity, culture, socioeconomic status, geography, and access to material and social resources can influence what information people are exposed to, their information-seeking and sharing behaviors, and what actions they may take in science-related contexts. For example, an individual may believe in the safety and effectiveness of vaccines and have sufficient access to accurate vaccine information, but due to logistical challenges (e.g., available appointment times are inconvenient or vaccination sites are inaccessible), they may not get vaccinated. This means that the accuracy of information is only one of a constellation of factors that shape a specific behavior.
In the past few decades, many efforts within the research, practice, and policy domains have been directed toward combating the harmful effects of misinformation about science. These efforts have generally been implemented in a topic-agnostic fashion and reflect attempts to mitigate the negative impacts of misinformation by disrupting the supply, demand, distribution, and/or uptake of misinformation. So far, research does not indicate that any particular point is the best place to intervene, and many of the most effective interventions target multiple points.
Supply-based interventions aim to reduce the volume of circulating misinformation and/or shift the balance in the quality of circulating information toward high-quality science content. Examples of effective approaches of this type include foregrounding credible information online, providing funding to under-resourced newsrooms, deplatforming purveyors of misinformation, and content moderation. Demand-based interventions are aimed at reducing the consumption of misinformation through approaches like increasing trust in sources of credible information, identifying and filling information voids, and increasing people’s ability to detect and avoid misinformation through media literacy training. Overall, this class of approaches reflects proactive ways to support individuals as they seek out information to answer pressing questions they may have. However, it is important to note that individuals and communities facing informational challenges are not inherently more susceptible to misinformation about science. Indeed, many community-based organizations, including some locally owned businesses, non-profit organizations, and faith-based organizations, have proactively worked to adapt and provide reliable information to fill science information voids. They are also particularly well positioned to do so because of their local ties, their awareness of local needs and concerns, and the trust that residents have in them. These assets notwithstanding, the committee found that such community-based organizations are not always sufficiently resourced.
Distribution-based interventions are designed to limit the spread of misinformation and include strategies such as algorithmic changes on platforms (e.g., demoting content in algorithmic recommendations), enforced legislation and policies (e.g., mandated disclosure laws about the use of bots), and encouragement of evaluative thinking in individuals based on insights from human psychology (e.g., nudges to consider the accuracy of content before choosing to share). The latter approach has been widely adopted, with demonstrated efficacy in decreasing the sharing of misinformation by individuals. Uptake-based interventions are designed to reduce the effects of misinformation about science on people’s beliefs or behaviors. Such approaches include training individuals to spot common themes, narratives, and rhetorical devices that are often associated with misinformation (prebunking2) or providing corrective information. Although the durability of these interventions remains a challenge, they are effective at preventing belief in misinformation and reducing the sharing of misinformation by individuals.
Recommendation 10: To enhance the capacity of community-based organizations (CBOs) to provide high-quality, culturally relevant, accurately translated, and timely science information to the communities they serve, funders (e.g., government agencies, public and private philanthropic foundations) should provide direct funding to CBOs:
___________________
2 The committee notes that there is some conceptual ambiguity regarding the term “prebunking.” Some scholars define prebunking as a sub-category of technique-based inoculation interventions, while others define it as the overarching category that encompasses all inoculation interventions (see Chapter 7).
Recommendation 11: Organizations at national, state, and local levels that are specifically engaged in mitigating the uptake of misinformation about science at the individual level should identify and utilize the effective approaches best suited to their goals and point of intervention (e.g., before or after exposure). For example:
Considerable progress has been made to advance understanding about the causes and consequences of misinformation about science, but there are also challenges to studying this phenomenon and mitigating its impact. While misinformation interventions have become more prevalent over time, they are largely uncoordinated across actors, sectors, disciplinary domains, and intended outcomes, in ways that do not inform each other. In some cases, these efforts may even push in different directions. Scalability and real-world efficacy have also been difficult to achieve for some interventions, and overall, comprehensive data for investigating the nature of misinformation about science across various contexts and populations remain limited.
Many approaches to address misinformation about science have demonstrated efficacy in small-scale, controlled experiments, but not consistently in real-world settings or over long periods of time. Additionally, many target the individual level, despite recognition in the field that systems-level action is needed. This inadvertently places the onus of mitigating the impacts of misinformation on individuals and creates the perception that individual action is the most effective way to address misinformation about science. Moreover, the limited emphasis on understanding misinformation about science at higher levels and larger scales impedes progress on understanding:
Some systems-level approaches (e.g., filling information voids, building and maintaining trust in sources of credible information, governance) have been implemented by various types of organizations; however, their efficacy has not been rigorously tested. Importantly, the committee found that funding structures have played a key role in driving scholarly attention in the field, including which topics and interventions are most studied. Hence, the priorities of funding organizations may be especially important to establish a systems-level understanding of misinformation about science.
Recommendation 12: To strengthen the evidence base on the impacts of misinformation about science across levels and the suite of approaches to mitigate them (e.g., community-based, platform and platform design-based, policy, and regulatory approaches), funding agencies and funding organizations should direct more investments toward systems-level
research. Such investments would increase understanding of the ways that structural and individual factors may interact to influence the spread and impacts of misinformation about science.
Gaining a comprehensive understanding of misinformation about science has also been limited by data scarcity across different populations and contexts. Notably, the impacts of misinformation about science and the effectiveness of mitigation have not been well documented for underserved groups. There are many reasons for this, and the result is the exclusion of the experiences of such populations from many studies (e.g., surveys, clinical trials, observational studies). Data on these populations are especially important for understanding how inequalities may compound the impacts of misinformation.
Some progress has been made on understanding the nature of misinformation on select social media platforms; however, a comprehensive picture across all major platforms is lacking. In particular, the ability to detect and study misinformation about science on social media platforms is currently limited by inconsistent rules for data access, privacy concerns, and prohibitively expensive data costs. Such conditions may reduce not only the amount of research conducted on social media platforms but also its quality, as researchers may turn to scraping as a common form of data collection. Greater accessibility and consistency in data from platforms may require the establishment of formal standards and policies.
Recommendation 13: To reduce current barriers to obtaining high-quality, comprehensive data about misinformation about science on social media platforms: