Public Debate, Scientific Skepticism, and Science Denial

Harris L. Friedman, Michael Mann, Nicholas J.L. Brown, Stephan Lewandowsky

When scientists discover a distant planet that is made of diamonds (Bailes et al. 2011), public admiration is virtually assured. When the same scientific method yields findings that impinge on corporate interests or people’s lifestyles, the public response can be anything but favorable. The controversy surrounding climate change is one example of a polarized public debate that is completely detached from the uncontested scientific fact that Earth is warming from greenhouse gas emissions (e.g., Cook et al. 2013). How can scientists navigate those contested waters, and how can the public’s legitimate demand for involvement be accommodated without compromising the integrity of science?

Denial of Science

Public debate and skepticism are essential to a functioning democracy. There is evidence that skeptics can differentiate more accurately between true and false assertions (Lewandowsky et al. 2009). However, when researchers who study the health effects of tobacco are accused of being a “cartel” that “manufactures alleged evidence” (Abt 1983, 127), or when a U.S. senator labels climate change a “hoax” ostensibly perpetrated by corrupt scientists (Inhofe 2012), such assertions are more indicative of the denial of inconvenient scientific facts than of genuine skepticism (Diethelm and McKee 2009). The dividing line between denial and skepticism may not always be apparent to the public, but existing research permits its identification because denial expresses itself in similar ways regardless of which scientific fact is being targeted (Diethelm and McKee 2009). For example, denial commonly invokes notions of conspiracies (Lewandowsky et al. 2015; 2013; Mann 2012). Conspiratorial content is widespread in anti-vaccination material on the Internet (Briones et al. 2012) as well as on blogs that deny the reality of climate change (Lewandowsky et al. 2015).

A second common feature of denial, which differentiates it further from legitimate debate, involves personal and professional attacks on scientists both in public and behind the scenes. To illustrate, two of us (Lewandowsky and Mann) have been variously accused of “mass murder and treason” or have received email from people who wanted to see them “six feet under.” Such correspondence is not entirely random: abusive mail tends to peak after scientists’ email addresses are posted on websites run by political operatives.

Those public attacks are paralleled by prolific complaints to scientists’ host institutions alleging research misconduct. The format of such complaints ranges from brief enraged emails to detailed multipage dossiers, typically suffused with web links and richly adorned with formatting. In the tobacco arena, there is evidence that such complaints are highly organized (Landman and Glantz 2009). Triaging vexatious complaints from legitimate grievances consumes considerable public funds: university staff are tied up in phone calls and email exchanges, and must respond to persistent approaches while also trying to examine the merits of each complaint.

A further target for contrarian activity involves preliminary results or unpublished data. This modus operandi was also pioneered by the tobacco industry, which campaigned hard to gain unhindered access to epidemiological data (Baba et al. 2005). At first glance, it might appear paradoxical that an industry would sponsor laws ostensibly designed to ensure transparency of research. However, access to raw data is necessary for the re-“analyses” of data by entities sympathetic to corporate interests. In the case of tobacco, those analyses have repeatedly downplayed the link between smoking and lung cancer (see Proctor 2011).

A curious feature of all these lines of attack is that they tend to be accompanied by calls for “debate.” Often the same individuals who launch complaints with institutions to silence a scientist also proclaim that they want to enter into a “debate” about the science that they so strenuously oppose.

Public Skepticism and the Scientific Process

Given that scientific issues can have far-reaching political, technological, or environmental consequences, greater involvement of the public in policy decisions can only be welcome and may lead to better outcomes. To illustrate, the town of Pickering in Yorkshire, England, recently revised its flood management plan as a result of a year-long collaboration between the local public and scientists (Whatmore and Landström 2011). The plan that was ultimately accepted differed considerably from the initial draft produced by scientists without local public input. Notably, Pickering escaped the flooding that gripped other parts of Yorkshire during the winter of 2015–2016 (Lean 2016).

Notwithstanding the public’s entitlement to be involved, scientific debates must still be conducted according to the rules of science. Arguments must be evidence-based, and they are subject to peer review before they become provisionally accepted. Arguments or ideas that turn out to be false are eventually discarded—a process that sometimes seems to take too long but that arguably has served science and society well (Alberts et al. 2015).

Although these strictures are rigorous and may appear daunting to the layperson, they do not exclude the public from scientific debate. It is important to show that the public can participate in scientific debate, because otherwise denialist activities might acquire a sheen of legitimacy as the only avenues open to the public to question scientific findings.

Recently, two of us (Friedman and Brown) coauthored an article (Brown et al. 2013) that received much coverage for its criticism of a long-standing, much-cited finding in the field of positive psychology. Positive psychology studies the strengths that enable individuals to thrive and aims to aid in the achievement of a satisfying and fulfilling life. When the project that led to our article began, Brown (the first author of that paper) was essentially a stranger to academia, having attended only three weeks of a weekend master’s program in psychology at the age of fifty-one while working full time as a civil servant.

When he doubted the validity of some of positive psychology’s findings that were presented as fact in his classroom, he pursued the issue by contacting a researcher (Friedman) by email based only on the hope that Friedman might be sympathetic to his puzzlement. Once a dialog with the expert had been established—and once Brown had convinced his interlocutor of his sincerity—a fruitful scientific collaboration ensued that has thus far led to the publication of six articles. Notably, this collaboration differs from conventional student-professor interactions in that the parties initially were not known to each other and had no professional relationship prior to an unsolicited approach by email.

To be sure, getting the first rebuttal article published was not easy, given the stature (e.g., more than 350 citations) of the article reporting the original, erroneous finding (Fredrickson and Losada 2005). Brown and Friedman encountered a certain amount of resistance, mostly bureaucratic rather than sinister despite some apparent conflicts of interest: first to the acceptance of their initial rebuttal article (on the basis of some rather rigid interpretations of customary publishing practices), and then to their attempts to write a subsequent comment on the original author’s reply (on the grounds that the standard sequence of replies to a target article was now complete).

Ultimately, the system worked as it should: everyone remained calm and polite, and the various publishing and appeals processes were tested and observed to work. In the end, all articles appeared in print in the same journal, the scientific record was corrected, the field of positive psychology took stock, and nobody felt the need to publish home addresses or other personal details on the Internet (a harassing practice known as “doxxing” that is popular not only with political operatives who oppose climate science but also with anti-vaccination activists and others). The contrast is striking between the approach followed by Brown and the refusal to engage in the scientific process that, as we described earlier in this article, is characteristic of denial.

The Need for Vigorous Debate

We underscore that there is plenty of room for honest and vigorous debate in science, even among collaborators: One of us (Brown) is an enthusiastic proponent of the widespread adoption of genetically modified organisms (GMOs) as a way to alleviate global food shortages, whereas two of us (Mann and Lewandowsky), while provisionally accepting the safety of GMOs, are concerned about their indirect consequences, such as the emergence of herbicide-resistant weeds that has been associated with GMO-related overuse of herbicides (Gilbert 2013). One of us (Friedman) is concerned both about those indirect consequences and about their safety for individuals.

Two of us (Friedman and Brown) are not convinced beyond doubt that highly complex climate models are as yet sufficiently validated to be used as the basis of major public policy decisions that might have effects for many decades; the other two authors (Lewandowsky and Mann) acknowledge the uncertainty inherent in climate projections but note that, contrary to popular intuition, any uncertainty provides even greater impetus for climate mitigation (Lewandowsky et al. 2014). Notwithstanding those disagreements, the present authors found common ground for this article.

Although we believe that scientific evidence should inform political debate, we acknowledge that evidence is no substitute for such debate. To illustrate, the scientific evidence shows that the fallout from the Fukushima nuclear accident poses no discernible risk to people in North America (e.g., Fisher et al. 2013), but that finding should only guide, and not preclude, political debate about the safety of nuclear power. Whatever the science may say about the safety of nuclear power—for example, that it causes 100 times fewer fatalities than renewable biomass (Markandya and Wilkinson 2007)—those data might be legitimately overridden by the “dread” that nuclear power evokes in people. However, even dread does not justify harassment or threats of violence against scientists who measure nuclear fallout (Hume 2015).

Enhancing the Resilience of the Scientific Enterprise

Opinion surveys regularly and consistently show that public trust in scientists is very high (Pew Research Center 2015). However, the position of the scientist as a neutral, disinterested proponent of “the truth” should not be taken for granted. For example, when Brown and Friedman’s first article on positive psychology (Brown et al. 2013) was published, it was cited on several forums and blogs dedicated to creationist ideas or to climate change denial. The argument typically ran thus: If psychologists can be as badly wrong as Brown et al. showed, and if psychologists are scientists, then how much confidence can we have in the pronouncements of other scientists? While such flawed logic is easily refuted in reasoned debate, it might be preferable if scientists refrained from giving provocateurs the opportunity to raise this kind of question in the first place. We suggest that the scientific community should respond to both legitimate skepticism and politically motivated denial with a three-pronged approach.

First, legitimate public concern about a lack of transparency and questionable research practices must be met by ensuring that research lives up to rigorous standards. We endorse most current efforts in this regard, and one of us (Lewandowsky) is a member of a relevant initiative involving the use of peer review to facilitate openness.

Second, we believe that daylight is the best protection against politically motivated maneuverings to undermine science. The first part of this article is one effort toward such transparency.

Finally, skeptical members of the public must be given the opportunity to engage in scientific debate. We have shown how two of the present authors—an academic and a member of the public who was only a few weeks into a part-time master’s program when his skepticism was aroused—teamed up to critique a widely cited finding and showed it to be unsupportable. None of their activities fell within the strategies and techniques of denial that we reviewed at the outset, clarifying that denial is not an “avenue of last resort” for members of the public who are desperate to contribute to science or even correct it but rather a politically motivated effort to undermine science.



An extended version of this article, which contains recommendations for the way in which scientists and members of the public might engage with each other on contested issues, can be found at Lewandowsky, S., M.E. Mann, N.J.L. Brown, et al. 2016. Science and the public: Debate, denial, and skepticism. Journal of Social and Political Psychology 4: 537–553. DOI:10.5964/jspp.v4i2.604.


  • Abt, C.C. 1983. The Anti-Smoking Industry (Philip Morris internal report). September. Available online; accessed May 6, 2012.
  • Alberts, B., R.J. Cicerone, S.E. Fienberg, et al. 2015. Self-correction in science at work. Science 348: 1420–1422. doi: 10.1126/science.aab3847.
  • Baba, A., D.M. Cook, T.O. McGarity, et al. 2005. Legislating “sound science”: The role of the tobacco industry. American Journal of Public Health 95: S20–S27. doi: 10.2105/AJPH.2004.050963.
  • Bailes, M., S. Bates, V. Bhalerao, et al. 2011. Transformation of a star into a planet in a millisecond pulsar binary. Science 333: 1717–1720.
  • Briones, R., X. Nan, K. Madden, et al. 2012. When vaccines go viral: An analysis of HPV vaccine coverage on YouTube. Health Communication 27: 478–485. doi: 10.1080/10410236.2011.610258.
  • Brown, N.J.L., A.D. Sokal, and H.L. Friedman. 2013. The complex dynamics of wishful thinking: The critical positivity ratio. American Psychologist 68: 801–813. doi: 10.1037/a0032850.
  • Cook, J., D. Nuccitelli, S.A. Green, et al. 2013. Quantifying the consensus on anthropogenic global warming in the scientific literature. Environmental Research Letters 8: 024024. doi: 10.1088/1748-9326/8/2/024024.
  • Diethelm, P., and M. McKee. 2009. Denialism: What is it and how should scientists respond? European Journal of Public Health 19: 2–4. doi: 10.1093/eurpub/ckn139.
  • Fisher, N.S., K. Beaugelin-Seiller, T.G. Hinton, et al. 2013. Evaluation of radiation doses and associated risk from the Fukushima nuclear accident to marine biota and human consumers of seafood. Proceedings of the National Academy of Sciences 110: 10670–10675. doi: 10.1073/pnas.1221834110.
  • Fredrickson, B.L., and M.F. Losada. 2005. Positive affect and the complex dynamics of human flourishing. American Psychologist 60: 678–686. doi: 10.1037/0003-066X.60.7.678.
  • Gilbert, N. 2013. Case studies: A hard look at GM crops. Nature 497: 24–26. doi: 10.1038/497024a.
  • Hume, M. 2015. Canadian researcher targeted by hate campaign over Fukushima findings. The Globe and Mail. Available online.
  • Inhofe, J. 2012. The Greatest Hoax: How the Global Warming Conspiracy Threatens Your Future. Washington, DC: WND Books.
  • Landman, A., and S.A. Glantz. 2009. Tobacco industry efforts to undermine policy-relevant research. American Journal of Public Health 99: 45–58.
  • Lean, G. 2016. UK flooding: How a Yorkshire town worked with nature to stay dry. The Independent. Available online.
  • Lewandowsky, S., J. Cook, K. Oberauer, et al. 2015. Recurrent fury: Conspiratorial discourse in the blogosphere triggered by research on the role of conspiracist ideation in climate denial. Journal of Social and Political Psychology 3: 142–178. doi: 10.5964/jspp.v3i1.443.
  • Lewandowsky, S., G.E. Gignac, and K. Oberauer. 2013. The role of conspiracist ideation and worldviews in predicting rejection of science. PLoS ONE 8: e75637. doi: 10.1371/journal.pone.0075637.
  • Lewandowsky, S., J.S. Risbey, M. Smithson, et al. 2014. Scientific uncertainty and climate change: Part I. Uncertainty and unabated emissions. Climatic Change 124: 21–37. doi: 10.1007/s10584-014-1082-7.
  • Lewandowsky, S., W.G.K. Stritzke, K. Oberauer, et al. 2009. Misinformation and the war on terror: When memory turns fiction into fact. In W.G.K. Stritzke, S. Lewandowsky, D. Denemark, et al. (Eds.), Terrorism and Torture: An Interdisciplinary Perspective (pp. 179–203). Cambridge, UK: Cambridge University Press.
  • Mann, M.E. 2012. The Hockey Stick and the Climate Wars: Dispatches from the Front Lines. New York: Columbia University Press.
  • Markandya, A., and P. Wilkinson. 2007. Energy and health 2: Electricity generation and health. The Lancet 370: 979–990. doi: 10.1016/S0140-6736(07)61253-7.
  • Pew Research Center. 2015. Public Esteem for U.S. Military Highest, Scientific Achievements Second in Global Comparison. Available online.
  • Proctor, R.N. 2011. Golden Holocaust: Origins of the Cigarette Catastrophe and the Case for Abolition. Berkeley, CA: University of California Press.
  • Whatmore, S.J., and C. Landström. 2011. Flood apprentices: An exercise in making things public. Economy and Society 40: 582–610. doi: 10.1080/03085147.2011.602540.