Review | December 1, 2023
Weathering the Storm: Climate Misinformation in the Public Sphere
Aske Berentsen

Abstract
Climate misinformation is widespread in contemporary society and slows down climate action. Often spread by those with an interest in maintaining the current fossil fuel-based society, it typically distorts factual climate science, taking it out of context to highlight information favourable to these Merchants of Doubt. Fortunately, there are ways to counter misinformation, such as inoculation and consensus communication, which provide the public with the tools needed to recognize misinformation and resist its influence. Successfully providing these tools via educational institutions and news outlets can eradicate misinformation. This success is vital for the health of democratic societies and the planet alike. Recommendations include holding news outlets accountable for the news they spread and making classes about misinformation mandatory in school and university curricula. Further research is needed to find other ways to mitigate climate misinformation.
Climate Misinformation in the Public Sphere
In the current digital age, information is more accessible than ever. Anyone with an internet connection can voice an opinion online, which leads to ever-growing interconnectedness on a global scale. However, the internet also gives those with malicious intent the opportunity to harness the power of algorithms to spread their own versions of the truth.
This study explores the phenomenon of climate misinformation. How does it occur, in what form is it present, and—perhaps most important—how can it be countered? It is vital to answer these questions as climate misinformation slows climate action and can only be countered through communication. Gaining insight into the arguments and techniques used is valuable in understanding why climate misinformation is effective and what tools can be used to counter the phenomenon.
Definitions of Terms
Actor. A person or organization that is involved in or has an interest in the topic being discussed (Berentsen, personal communication, 20 July 2023).
Agnotology. The study of how and why ignorance or misconceptions exist (Cook et al., 2014, p. 296).
Cognitive Repertoire. The ability to recognize misinformation by understanding the arguments it uses and knowing the tools to counter it (van der Linden et al., 2017, p. 6).
Consensus Gap (Knowledge Gap). Describes the lack of knowledge about the consensus that exists in the scientific community regarding climate change (Cook, 2019, p. 281).
Echo Chamber. Occurs when social media users are mostly exposed to views they already agree with (Cook, 2019, p. 289).
Inoculation. Exposing people to small amounts of misinformation to make them “immune” to seeing misinformation in a real-life situation. This way they recognize the methods used in misinformation and are thus not affected by it. It works similarly to how a vaccine exposes the immune system to small amounts of a virus so it can build up the antibodies needed to deal with the actual exposure to a virus (Cook, 2019, p. 287).
Merchant of Doubt. An actor with a vested interest in sowing doubt (van der Linden et al., 2017, p. 1).
Overkill Backfire Effect. Occurs when a refutational text is too long and complex (Cook et al., 2014, p. 297).
Post-Truth Era. An era where truth is not always the main objective. The post-truth era erodes people’s trust in facts and evidence (Farrell et al., 2019, p. 192).
Refutational Texts. Texts that challenge the reader’s misconception with the purpose of promoting conceptual change. They achieve this by explicitly acknowledging misconceptions about a topic, directly refuting them, and providing an alternative scientific conception (Cook et al., 2014, p. 296).
Literature Review
The following is a review of the literature used to answer the research questions of this study. A thematic approach will be applied to discover relevant trends in the literature.

Understanding How Misinformation Works
Currently, only 12% of Americans are aware of the 90-100% consensus that exists within the scientific community regarding human-induced climate change. Polarisation between those who advocate for and those who oppose climate action has increased over time, driven by decades of ideologically motivated misinformation. Misinformation—misleading statistics in particular—is effective in lowering acceptance of climate change. This is exacerbated by the influence of conservative politicians and businesses interested in maintaining the status quo of climate inaction. Moreover, misinformation has the potential to cancel out the positive effects gained by accurate information (Cook, 2019, pp. 281-282).
Climate change in the United States used to be a bipartisan issue, but attitudes towards it have been influenced by conservative think tanks spreading climate change misinformation. These think tanks used scientists sceptical of climate change to challenge climate science, and also to undermine information about the negative impacts of tobacco and acid rain. This type of misinformation used to be published in books and mostly focussed on three main themes: the uncertainty of climate change, the benefits of climate change, and the economic risks of climate mitigation policies. These campaigns have historically been funded by fossil fuel companies. What’s more, research has shown that countries with higher emissions per capita show lower public acceptance of climate change (Cook, 2019, pp. 282-284).
The main arguments used in climate change misinformation can be summarised in five main themes: 1) It’s not real, 2) It’s not us, 3) It’s not bad, 4) The experts are unreliable, and 5) Climate solutions won’t work. These are exact opposites of the five main arguments supporting factual information about climate change: 1) Global warming is real, 2) Human activity is the cause, 3) The impacts are bad, 4) Experts agree on these three points, and 5) There is hope that we can avoid the worst effects of climate change (Cook, 2019, p. 284).
There are several techniques used to spread climate misinformation. As mentioned above, fake experts can be used to make a claim appear reliable. These fake experts pretend to be experts without the actual skills or knowledge of an accredited climate scientist. The most successful example in lowering acceptance originated from the Global Warming Petition Project website. In 2016, the website was used to publicize a fake petition signed by “thousands of scientists” who rejected the evidence for human-caused climate change. Because such an overwhelming number of “scientists” appeared to make the case against climate change, people believed it. The petition was the most shared climate change story of 2016, illustrating the power of this type of deception. These campaigns are successful because they sow doubt about climate change, and when there is doubt, there is room for other voices with a different agenda to be heard. The research finds that people accept climate misinformation when it overlaps with their own beliefs. The author recommends publishing data from credible scientists to bridge the consensus gap (Cook, 2019, p. 285).
Cook (2019) found that climate misinformation almost always contains logical fallacies: drawing wrong conclusions from wrong information, failing to consider all the evidence, and arguing from false premises. Common logical fallacies are red herrings (distracting arguments that draw attention away from the main argument) and false dichotomies (imposing a choice between two options even though better options are available). The research found that climate misinformation also contains fatal logical flaws. These flaws can be introduced intentionally but also unintentionally, driven by the motivation of wanting the misinformation to be true (Cook, 2019, p. 285).
Another technique used in climate misinformation is asking for unrealistic or unattainable proof. These impossible expectations are aimed at the scientific research method, as science can almost never be 100% certain if done correctly (Cook, 2019, pp. 285-286).
Cherry-picking, or framing, is a technique that highlights only the data that supports the desired conclusion. This method uses arguments that are mostly true but emphasize only the information the sender of the message wants to convey. One example of this is providing evidence of a few years of cooler temperatures to disprove climate change while ignoring the bigger picture (Cook, 2019, p. 286).
The last technique identified by Cook (2019) is conspiracy theories. A prominent example of a climate conspiracy theory is “climategate” in 2009, an incident in which emails from scientists were stolen and published online. Out-of-context quotes were held up as evidence that scientists were conspiring to falsify data and deceive the public. Although official investigations proved this theory wrong and public interest waned quickly, climate change denialist blogs intensified their interest in climategate over time. Climate conspiracy theories can decrease public conviction about climate change, lower support for climate action, and evoke distrust in government. What makes conspiracy theories complicated to deal with is that they are “self-sealing”: an attempt to disprove the conspiracy theory with evidence will result in the theorist making the source of the evidence part of the conspiracy (Cook, 2019, p. 286).

Technology plays a critical role in the spread of misinformation. Social media has exacerbated the issue by creating a platform where low-quality information has the same chance of going viral as high-quality information. Additionally, algorithms have created echo chambers in which social media users are exposed to content that aligns with their interests, creating a closed loop in which perspectives other than their own rarely get through to them. In general, social media has made it easier to spread misinformation. Countering misinformation has proven difficult, as illustrated by Facebook’s experiments with labelling content as misinformation. This backfired as people did not believe the labelling and ended up sharing the flagged posts more (Cook, 2019, pp. 289-290).
To counter misinformation, Cook (2019) suggests providing data that directly replaces the misinformation provided by deniers. Another promising counter technique is inoculation, in which exposing people to a small amount of misinformation makes them more resistant to misinformation encountered later. An added benefit of inoculation is that people are more likely to discuss the issue with their peers, which helps spread awareness. Inoculation consists of warning about the threat of misinformation and using counterarguments to refute the misinformation (Cook, 2019, p. 287). Another type of inoculation is logic-based inoculation, which explains the techniques that misinformation campaigns use. This way the public has a better understanding of misinformation and can safeguard against future exposure (Cook, 2019, pp. 287-288).
Cook’s (2019) study identifies five main themes and various techniques used in misinformation, creating an in-depth understanding of the topic by providing background information on the origin of misinformation. The researcher recommends that educators and communicators adopt refutational practices informed by psychologists in order to maximize results (Cook, 2019, p. 291). This can be achieved through education, public communication campaigns, and the use of social media. These methods have proven effective in neutralizing misinformation. If implemented properly, inoculation campaigns could eradicate misinformation campaigns altogether (Cook, 2019, p. 292).
Putting Effective Misinformation Counter Methods to the Test
Similar to Cook’s (2019) study, van der Linden et al. (2017) also stated that there is still a high degree of polarisation despite the almost unanimous consensus within the scientific community regarding climate change. The authors attribute this to “Merchants of Doubt” (van der Linden et al., 2017, p. 1). These are actors with vested interests in thwarting climate change adaptation. The campaigns against climate policy that they operate and spread have proven to be influential misinformation campaigns. These have undermined public understanding of the consensus within the scientific community on climate change and have also limited public engagement with the issue (van der Linden et al., 2017, p. 1).
To discover which counter methods are effective in combating climate misinformation, van der Linden et al. (2017) designed an intervention that combined climate change misinformation, consensus messaging, and inoculation. The intervention consisted of six separate groups. Group one acted as the control group and did an unrelated task. Group two was only exposed to the consensus message stating “97% of climate scientists have concluded that human-caused climate change is happening” (van der Linden et al., 2017, p. 3). Group three was exposed to the influential misinformation campaign from “The Global Warming Petition Project”—as mentioned earlier in this report—which claims “over 31,000 American scientists have signed a petition stating that there is no scientific evidence that the human release of carbon dioxide will, in the foreseeable future, cause catastrophic heating of the Earth’s atmosphere” (van der Linden et al., 2017, p. 3).
An exact copy of this message was used, with all identifying source material redacted to prevent interference with the study. Group four was exposed to the consensus and the misinformation sequentially. Finally, groups five and six were exposed to one inoculation condition each. In the first, more general condition, participants were first shown the scientific consensus, followed by a warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists” (van der Linden et al., 2017, p. 3). This claim was then debunked by communicating that there is virtually no disagreement that humans are causing climate change. In the second, more detailed condition, participants were also first exposed to the consensus message before additional arguments were added that specifically debunked “The Global Warming Petition Project” by highlighting that some of the signatories were fraudulent and that only 1% of the signatories had a background in the relevant field. Before and after the intervention, participants were asked questions about their views on climate-related topics to measure the effect of the various countermeasures (van der Linden et al., 2017, p. 3).

As expected, the control group did not show a significant change in attitude after doing their unrelated task. The group exposed to the scientific consensus showed a large increase in the perceived scientific consensus. In contrast, the misinformation had a substantial negative effect on the attitudes toward climate change. The group exposed to the consensus and misinformation showed that the misinformation completely negated the effect of the consensus messaging. Lastly, both inoculation conditions were effective in countering the misinformation with the detailed condition proving most effective (van der Linden et al., 2017, pp. 3-4).
The study conducted by van der Linden et al. (2017) shows that public attitudes can be effectively inoculated against influential misinformation campaigns. The authors point to the importance of communicating the scientific consensus on human-caused climate change. They also point out that the effect of the consensus messaging was preserved when combined with the inoculation messages, demonstrating the value of using both methods together (2017, p. 5). Helping the public build a “cognitive repertoire” (a toolbox for recognizing misinformation) might prove useful in countering misinformation (2017, p. 6).
Maertens et al. (2020) also identify a high level (97%) of consensus within the scientific community that humans are the main cause of climate change. This overwhelming consensus can help bridge the current ideological divide regarding climate change (p. 1). The sowing of disbelief by the Merchants of Doubt has spread distrust among the public regarding climate change. This is made worse by misinformation spreading faster than factual information in the online sphere. Even when factual information debunks untrue information, the initial belief can still persist (Maertens et al., 2020, p. 2).
To counter this, the authors also identify inoculation as the most established theory for combating misinformation. To test this theory, Maertens et al. (2020) designed a replication of the study by van der Linden et al. (2017), the difference being the time between exposure to the inoculation and the misinformation. The intervention consisted of four separate groups: the control group was not exposed to anything but did an unrelated task; the consensus group received the standard descriptive message about the scientific consensus; the false-balance group was exposed to the consensus messaging and to a misinformation message one week later; and the inoculation group received an inoculation message immediately after the consensus message, followed by a misinformation message one week after exposure to the consensus and inoculation messages. The delay in exposure allowed the researchers to test the decay of the consensus and inoculation effects (Maertens et al., 2020, p. 4). Maertens et al. (2020) used exactly the same messaging for the consensus, misinformation, and inoculation as van der Linden et al. (2017) (pp. 3-4).
The results of this study confirm the effectiveness of inoculation and consensus messaging in countering climate misinformation. Moreover, Maertens et al. (2020) show that the effect lasts for at least one week, with the potential to last longer, as the study only tested participants one week after exposure (pp. 9-10). The authors also found that inoculation can prove useful even when the misinformation has already reached the public. They found inoculation to work therapeutically, as it can “boost immune response even among the already afflicted” (Maertens et al., 2020, p. 2).
Evidence-based Strategies to Combat Scientific Misinformation
Like the previous articles analysed, Farrell et al. (2019) agree that scientific misinformation undermines public understanding of science, erodes basic trust in evidence, and stalls evidence-based policymaking (p. 191). Additionally, the authors state that misinformation is often backed by PhDs, press briefings, and academic journals, making it appear to be credible (Farrell et al., 2019, p. 192).
However, in contrast to van der Linden et al. (2017), Farrell et al. (2019) claim that simply communicating the scientific consensus repeatedly is not enough because we have entered the “post-truth” era, in which the public has become increasingly mistrustful of information. The researchers agree that an effective counter to this mistrust is the use of inoculation. For example, academics and scientists could cooperate with reporters in spreading inoculating messages, teachers could teach their students about misinformation and how to recognize it, and opinion leaders could help raise awareness about misinformation. The authors claim that inoculation will only be effective if the “patient is not already sick” (Farrell et al., 2019, p. 192); in other words, if people have not already been exposed to the misinformation (pp. 192-193).
Another important counter to misinformation identified by Farrell et al. (2019) is confronting the institutions and politicians that allow misinformation to spread and be used. In this way, the mechanisms that facilitate misinformation can be eliminated, which will not only help to turn the tide on the critical issue of climate change but also prevent future large-scale manipulation from taking place (pp. 193-194).

Bringing Climate Awareness Into the Light Through Education
In an effort to identify why certain beliefs persist, specifically climate denial, Cook et al. (2014) focused their research on ignorance and applied agnotology (the study of ignorance) to climate misinformation. They found that ignorance affects climate action and science literacy in general, as it hinders people’s understanding of the issue, which in turn prevents them from taking climate action. Misinformation can be so effective that it endures even after factual information has been provided. Understanding why these misbeliefs persist is critical to countering them. One explanation for this persistence is that people build a mental model based on the misinformation or myth, which makes the initial belief difficult to break down even when the receiver knows the information to be false. During their study on ignorance, Cook et al. (2014) found that exposure to factual information can even reinforce the myth (pp. 296-297).
The research also highlights the impact of the “overkill backfire effect”, which occurs when the refutations are too long and complex (Cook et al., 2014, p. 296). This results in ineffective counter messages where the intended receiver does not register the message properly. The solution to this is to make the counterargument simple—or at least not more complex than the myth—in order to debunk it. They suggest that making messages compelling and memorable helps to “make them stick” (Cook et al., 2014, p. 296).
Lastly, the research showed that refutational texts are effective in countering climate misinformation. According to Cook et al. (2014), “Refutational texts are texts that challenge the reader’s misconception with the purpose of promoting conceptual change. They achieve this by explicitly acknowledging misconceptions about a topic, directly refuting them, and providing an alternative scientific conception” (p. 296). The authors also recommend that refutational lessons be included in educational materials. These lessons address common misconceptions and communicate factual scientific information, thereby providing students with argumentative skills and a greater awareness of the relationship between evidence and argument. According to the authors, including refutational texts in combination with refutational lessons in educational materials will equip students with the tools needed to combat misinformation (Cook et al., 2014, p. 304).
Discussion
The body of literature analysed provided valuable insights into the world of climate misinformation and how to counter it. Cook (2019), van der Linden et al. (2017), and Maertens et al. (2020) agree that a near consensus exists among climate scientists that climate change is happening and that humans are the leading cause of it. However, a consensus gap exists among the American public: only 12% of Americans are aware of the 90-100% consensus among climate scientists (Cook, 2019, pp. 281-282). This gives climate change deniers an opportunity to spread misinformation and sow doubt among the public. This doubt can lead to opposition against climate action and climate policies, which is often favourable to parties that benefit from maintaining the current fossil fuel-dominated society (Cook, 2019, pp. 282-284).
The arguments used in climate misinformation are often directly opposed to factual climate information. With the use of fake experts, logical fallacies, demands for unrealistic evidence, framing, conspiracy theories, and credible-sounding arguments, the spreaders of misinformation succeed in convincing the public that the statistics are based on false calculations or that climate change is not happening at all. Technology exacerbates the issue by providing a platform with instant reach, where algorithms create echo chambers that repeatedly confirm users’ existing beliefs (Cook, 2019, pp. 284-290).
The literature analysed provided various insights on how to combat misinformation. The most effective and proven counter method is inoculation. As with a vaccine, a person exposed to a small amount of misinformation can develop resistance to the real misinformation, as they now possess the tools necessary to recognize it and resist its influence (Cook, 2019, p. 287; van der Linden et al., 2017, p. 6; Maertens et al., 2020, p. 2). Further research is needed into the longevity of inoculation, as Maertens et al. (2020) and Farrell et al. (2019) presented conflicting results in their studies.
Another solution to mitigate misinformation that the authors suggest is consensus communication, which will show the public that there is near unanimity in the climate scientific community. This would likely lead to higher acceptance of climate action (van der Linden et al., 2017, p. 5).
The third counter method identified in this research is the use of refutational classes and texts, which acknowledge the misconception in question, refute it, and provide the scientific conception (Cook et al., 2014, p. 296).
One example of governments taking action against misinformation is in the European Union, where legislation is being drafted to make tech companies such as Meta (Facebook & Instagram) and Twitter responsible for the content published on their platforms (European Commission, n.d.). This will likely have positive effects in preventing the spread of misinformation and hate speech. Although this legislation marks great progress in protecting European citizens from, for example, climate misinformation, the question needs to be asked: when is it for the benefit of society, and when does it begin to limit free speech? What happens when people with malicious intentions gain the power to silence others for saying things that oppose their own views? Another unintended effect might be that people leave mainstream platforms altogether and create their own social media, as former American president Donald Trump did. Trump launched his own social media platform, called “TRUTH Social”, after being banned from platforms such as Facebook and Twitter in 2021 (Whitcomb, 2021). Now Trump and his followers are isolated on their own social media platform, without exposure to opposing views.
The interconnected world of today is still in its relative infancy, as the World Wide Web has only existed since 1989 (Roser, 2018). This means that society still needs to find a way to create a healthy framework for content monitoring without restricting our hard-won human rights. Empowering people with the knowledge and tools to recognize misinformation is a solution that has already proven itself and can serve as a counter to misinformation until such a framework has been established, and long after that.
Conclusion
This study explored the phenomenon of climate misinformation and identified the ways climate misinformation spreads. Climate misinformation is often disseminated by Merchants of Doubt, actors with a vested interest in maintaining the fossil fuel-dominated society. They use carefully constructed messages that appear credible, told by people who seem trustworthy.

The most effective tool to counter misinformation is inoculation, in which someone is exposed to a controlled amount of misinformation and consequently becomes “immune” to the misinformation they encounter later. Other effective methods identified in this research are consensus communication and refutational texts in combination with misinformation counter classes. These techniques can be taught in educational institutions as part of the curriculum, providing the next generation with the awareness and tools to protect themselves from misinformation.
Recommendations
Countering misinformation is a complex but vital task. Further research is needed to gain better insight into ways of mitigating misinformation. The recommendations below aim to provide some specific solutions for countering misinformation.
The first recommendation is to research legal ways to counter misinformation. Similar to the German government’s plans to make companies responsible for their entire supply chain, governments could hold news outlets responsible for the news they spread and for any ensuing consequences (Deloitte, n.d.). Enforcing such legislation would be difficult but could prove worthwhile in safeguarding current and future generations against misinformation.
Another legislative approach might be to make misinformation courses mandatory in all school and university curricula. This way, each student would have the tools needed to recognize misinformation, at least in theory. This approach could directly counter misinformation, as it loses its power when people know that it is misinformation. It could also have a long-term effect, because an increasing number of people would have the tools needed to eradicate misinformation. To reach the population not enrolled in educational institutions, scientists could cooperate with reporters to provide the public with a cognitive repertoire to mitigate misinformation via public or trusted news outlets.
Additionally, research that could help combat misinformation would investigate which arguments misinformation campaigns use and develop counterarguments to address them. For example, if the argument is that climate action is too expensive, the counterargument could illustrate the cost of not taking action. This way, the motivations that give climate misinformation traction could disappear or at least dissipate.
Finally, it is evident that climate misinformation is slowing down climate action. Although misinformation is a complicated issue, it is a critical hurdle to tackle, as successfully spread misinformation can have widespread consequences for the health of our democratic society and the planet as a whole.
About the Author
Aske Berentsen believes in bringing change through communication. When he is not out in the world experiencing all flavours possible first-hand, you can find Aske learning about the world through video or text. With an insatiable desire for self-growth, he hopes to be the change he wants to see in the world and inspire others along his path to follow suit.
Earth Common Journal
An online journal dedicated to supporting and promoting student research projects on the topics of sustainability, conservation, and climate adaptation.