Social scientists have some intriguing explanations for why people persist in misjudgements despite strong contrary evidence. In fact, studies conducted over the past 30 years show that attempts to refute false information often backfire and lead people to hold on to their misperceptions even more strongly.
A 2015 behavioural science article examined the puzzle of why nearly one-third of U.S. parents believe that childhood vaccines cause autism, despite overwhelming medical evidence that there’s no such link. In such cases, the study noted, “arguing the facts doesn’t help — in fact, it makes the situation worse.” The reason is that people tend to accept arguments that confirm their views and discount facts that challenge what they believe.
This “confirmation bias” was documented in a 1979 psychological study, which found that test subjects, when asked questions about capital punishment, gave answers shaped by their prior beliefs.
“Instead of changing their minds, most will dig in their heels and cling even more firmly to their originally held views.” (Graves, 2015)
Trying to correct misperceptions can actually reinforce them. A 2006 study documented a “backfire effect” by showing the persistence of the belief that Iraq had weapons of mass destruction in 2005 and 2006, after the United States had publicly admitted that they didn’t exist.
“The results show that direct factual contradictions can actually strengthen ideologically grounded factual belief.” (Nyhan & Reifler, 2006)
Researchers have also examined how attempts to debunk myths can reinforce them simply by repeating the untruth. A 2005 consumer study, “How warnings about false claims can become recommendations,” concluded that people remember the assertion but forget whether it’s a lie.
“The more often older adults were told that a given claim was false, the more likely they were to accept it as true after several days had passed.” (Skurnik, Yoon, Park & Schwarz, 2005)
When critics challenge false assertions — say, Donald Trump’s claim that thousands of Muslims cheered in New Jersey when the twin towers fell on Sept. 11, 2001 — their refutations can threaten people, rather than convince them. If people feel attacked, they resist the facts all the more.
Two practical lessons emerge from this body of research: people are more likely to accept information if it’s presented unemotionally, in graphs; and they’re even more receptive if the factual presentation is accompanied by “self-affirmation” that asks respondents to recall an experience that made them feel good about themselves.
– Courtesy of washingtonpost.com
Graves, C. (2015). “Why Debunking Myths About Vaccines Hasn’t Convinced Dubious Parents.” Harvard Business Review.
Lord, C., Ross, L., & Lepper, M. R. (1979). “Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence.” Stanford University:
People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner. They are apt to accept “confirming” evidence at face value while subjecting “disconfirming” evidence to critical evaluation, and, as a result, draw undue support for their initial positions from mixed or random empirical findings. Thus, the result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be not a narrowing of disagreement but rather an increase in polarization. To test these assumptions, 48 undergraduates supporting and opposing capital punishment were exposed to 2 purported studies, one seemingly confirming and one seemingly disconfirming their existing beliefs about the deterrent efficacy of the death penalty. As predicted, both proponents and opponents of capital punishment rated those results and procedures that confirmed their own beliefs to be the more convincing and probative ones, and they reported corresponding shifts in their beliefs as the various results and procedures were presented. The net effect of such evaluations and opinion shifts was the postulated increase in attitude polarization.
Nyhan, B., & Reifler, J. (2006). “The roles of information deficits and identity threat in the prevalence of misperceptions.” Dartmouth College & University of Exeter:
Why do so many Americans hold misperceptions? We examine two factors that contribute to the prevalence of these beliefs. First, presenting correct information in a clear format should reduce misperceptions. However, people may instead reject accurate information because it threatens their worldview or self-concept — a mechanism that can be revealed by affirming individuals’ self-worth, which could make them more willing to acknowledge uncomfortable facts. In three experiments, we find that providing information in graphical form reduces misperceptions. We also find that self-affirmation may help diminish misperceptions when no other information is provided. These results suggest that misperceptions are caused by lack of information as well as the psychological threat of offering correct answers.
Skurnik, I., Yoon, C., Park, D. C., & Schwarz, N. (2005). “How warnings about false claims can become recommendations.” Journal of Consumer Research:
Telling people that a consumer claim is false can make them misremember it as true. In two experiments older adults were especially susceptible to this “illusion of truth” effect. Repeatedly identifying a claim as false helped older adults remember it as false in the short term, but paradoxically made them more likely to remember it as true after a three-day delay. This unintended effect of repetition comes from increased familiarity with the claim itself, but decreased recollection of the claim’s original context. Findings provide insight into susceptibility over time to memory distortions and exploitation via repetition of claims in media and advertising.
 ABC’s “This Week,” November 22, 2015 as cited by the Washington Post.