Skepticism and the Persuasive Power of Conversion Stories

Scott O. Lilienfeld

Those of us in the skeptical community have our work cut out for us. In the process of disseminating scientific thinking, we often challenge unsubstantiated beliefs that are held with considerable conviction. Every one of us who has tried to persuade committed believers in astrology or homeopathy that they are mistaken knows just how challenging—and in some cases, how futile—this endeavor can be. We skeptics rarely win popularity contests.

So how can we effectively persuade believers in dubious claims to change their minds, or at least to give our contrary ideas a fair hearing? Traditionally, much of the science communication literature has operated implicitly from the “information deficit model.” From this perspective, which is premised on Sir Francis Bacon’s principle that knowledge is power, the primary driver of pseudoscience is inadequate scientific literacy. If we could only find a means of better educating the general public about science, this model assumes, most unsupported ideas would lose their stranglehold over the populace.

Recent data suggest, however, that the information deficit model, although probably containing a kernel of truth, does not tell the full story. For example, research by Dan Kahan of Yale University and his colleagues reveals that among political conservatives (but not liberals), higher levels of scientific literacy are associated with greater skepticism of global warming and its damaging impacts (Kahan et al. 2012). Although such findings are open to multiple interpretations, they raise the possibility that imparting scientific knowledge might in some cases backfire, perhaps by affording individuals who hold unwarranted beliefs the intellectual ammunition to rebut scientific arguments. If corrective science education has its limits, are there better ways to debunk erroneous beliefs?

A recent study conducted by Benjamin Lyons and colleagues, published in the peer-reviewed journal Public Understanding of Science (Lyons et al. 2019), offers a potential answer. The authors aimed to determine whether presenting participants with “conversion stories”—descriptions of individuals who once held an unsubstantiated belief but who later changed their minds—might be an effective, albeit underutilized, persuasion technique.

To test this hypothesis, they showed participants video clips from a 2013 conference talk by Mark Lynas, a British author and environmental activist who was initially a staunch opponent of genetically modified organisms (GMOs) but later became persuaded that GMOs are safe. Participants were 727 American adults recruited for an online survey; each was randomly assigned to view one of three clips of Lynas: (1) one in which he simply advocated for GMOs, (2) one in which he advocated for GMOs and noted that he had initially been opposed to them, and (3) one in which he advocated for GMOs and explained what had prompted him to change his mind. Both conversion conditions (2 and 3) led to more positive attitudes toward GMOs than did condition 1, with no significant differences between conditions 2 and 3.

Exactly how conversion stories exert their persuasive effects is not fully understood. Lyons and colleagues found that their two conversion conditions appeared to operate by boosting the perceived strength of the speaker’s arguments rather than by enhancing his personal credibility. Other data suggest that an effective means of refuting scientific misconceptions is to supplant false beliefs with an equally or more compelling explanatory narrative (Lewandowsky et al. 2012). For example, if we want to rebut the erroneous belief that vaccines cause autism (now called autism spectrum disorder), presenting individuals with compelling data that autism risk is substantially inherited may help, as may presenting them with narratives describing how and why even intelligent people can fall prey to the erroneous belief.

The use of conversion stories for persuasive purposes is hardly new. Social psychologists who have studied persuasion have long underscored the utility of inoculating people against false claims by first presenting them with information that seemingly supports these claims and then refuting it (Pratkanis 2007). Moreover, legal scholar Cass Sunstein and his colleagues have written of the persuasive power of “surprising validators”: people we would not typically expect to support a position but who end up doing so (Glaeser and Sunstein 2014). A fervent believer in astrology who later becomes convinced that astrology is unscientific can function as a surprising validator, communicating to others her change of heart and the reasons for it.

If the results of the study by Lyons and collaborators are replicable and generalizable beyond GMOs, the skeptical community may want to consider harnessing conversion experiences as a persuasive strategy. Consider Janyce Boynton, once an ardent believer in, and trained practitioner of, the now convincingly debunked technique of facilitated communication (FC) for autism and other developmental disabilities. For readers who have seen the classic 1993 Frontline documentary “Prisoners of Silence,” Boynton was the facilitator in the Betsy Wheaton case, in which a sixteen-year-old girl with autism accused her father and brother of sexual abuse through FC. Following a series of controlled tests by Dr. Howard Shane of Boston Children’s Hospital, which showed conclusively that she, not Wheaton, was authoring the messages, Boynton reluctantly abandoned FC and persuaded her school to stop using it. Two decades later, in a courageous 2012 article, Boynton wrote of her emotionally painful conversion experience, observing how difficult it was for her to surrender her initial beliefs in the wake of Shane’s negative controlled tests:

I felt such devastation, panic, pain, loneliness—a myriad of emotions difficult to put into words. The whole FC thing unraveled for me that day, and I did not have an explanation for any of it. Almost immediately, I started rationalizing away the truth. … I understand how difficult it may be for some facilitators to change their belief system. There is a lot at stake: people’s careers, reputations, connections with their family member or client. Nonetheless, I urge practicing facilitators to take a long, hard look at their own behavior. (Boynton 2012, 11–12)

(Psychologist Stuart Vyse devoted his March/April 2019 Skeptical Inquirer “Behavior & Belief” column to Boynton and her case [Vyse 2019].)

Despite overwhelming scientific evidence that FC does not work, the method remains alive and well, and even seems to be mounting a comeback in some quarters under the guise of the “rapid prompting method” and allied techniques that are minor variants of FC (Lilienfeld et al. 2014). Concerted efforts by skeptics to stem the tide of FC’s popularity appear to have failed. It would be interesting to see whether presenting committed FC believers with Boynton’s compelling narrative would shake their certainty. More broadly, it would be useful for the skeptical community to gather similar conversion stories and determine whether they can be harnessed in the service of beneficial attitude change.


References

  • Boynton, J. 2012. Facilitated communication—what harm it can do: Confessions of a former facilitator. Evidence-Based Communication Assessment and Intervention 6: 3–13.
  • Glaeser, E., and C.R. Sunstein. 2014. Does more speech correct falsehoods? The Journal of Legal Studies 43: 65–93.
  • Kahan, D.M., E. Peters, M. Wittlin, et al. 2012. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change 2: 732–735.
  • Lewandowsky, S., U.K. Ecker, C.M. Seifert, et al. 2012. Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest 13: 106–131.
  • Lilienfeld, S.O., J. Marshall, J.T. Todd, et al. 2014. The persistence of fad interventions in the face of negative scientific evidence: Facilitated communication for autism as a case example. Evidence-Based Communication Assessment and Intervention 8: 62–101.
  • Lyons, B.A., A. Hasell, M. Tallapragada, et al. 2019. Conversion messages and attitude change: Strong arguments, not costly signals. Public Understanding of Science. Published online ahead of print, doi:10.1177/0963662518821017.
  • Pratkanis, A.R. 2007. Social influence analysis: An index of tactics. In A.R. Pratkanis (ed.), The Science of Social Influence: Advances and Future Progress. Philadelphia, PA: Psychology Press, 17–82.
  • Vyse, S. 2019. An artist with a science-based mission. Skeptical Inquirer 43(2) (March/April).

Scott O. Lilienfeld

Scott O. Lilienfeld is associate professor of psychology at Emory University and editor of the Scientific Review of Mental Health Practice.