
Exploring Why Facts Don’t Change Our Minds: A Q&A with Dr. Lee McIntyre


Lee McIntyre, Ph.D., a research fellow at the Center for Philosophy and History of Science at Boston University and an ethics lecturer at Harvard Extension School, was recently a guest on the Sunday, May 28, 2017, episode of “In Your Right Mind.” The show is a weekly talk radio broadcast covering an array of behavioral health topics, co-hosted by Sovereign Health CEO Tonmoy Sharma and Kristina Kuestner.

In the episode, “Why Facts Don’t Change Our Minds,” Dr. McIntyre and Steve Sloman, Ph.D., discussed the cognitive biases and errors that tend to occur in human reasoning. These tendencies lead us to reject logical, scientific evidence and instead reach flawed conclusions. We sat down with Dr. McIntyre to learn more about different types of cognitive biases and other reasons that facts do not change our minds.

Question: Why is it that people may continue to believe certain ideas to be true even when scientific evidence contradicts them? In other words, why don’t facts change our minds?

A: The phenomenon you’re referring to is called “denialism,” and it has deep roots in cognitive bias. Denialism occurs when someone has an ideological belief that conflicts with the evidence of science, so they indulge in what they want to believe rather than in rational thought. It’s not that denialists always ignore evidence. Some are very skilled at ferreting out any small nuggets of information that may support their views. Instead, these individuals are misusing the scientific standards of evidence. What they’re setting out to do is put their beliefs first, then find the facts to support them, which is called “confirmation bias.” When we are studying empirical matters, this is a dangerous way to proceed. Our beliefs should follow the facts, not the other way around. Denialists are almost always proven wrong in the long run.

Q: Humans are prone to making cognitive errors in reasoning, a phenomenon you discussed in your book, “Respecting Truth: Willful Ignorance in the Internet Age.” How can people overcome this willful ignorance?

A: I think that willful ignorance is a stop on the road to denialism. Willful ignorance is when we believe something even though we suspect there may be evidence against it, because we find that evidence too painful to acknowledge. At this point, someone is still capable of being reached. Several psychological studies have shown that if you hit people with the facts “right between the eyes” over and over again, eventually they will change their minds; however, it is difficult because beliefs are sometimes based not just on facts, but on how it makes us feel to believe something. If believing a falsehood makes us feel better, sometimes we will choose to believe the falsehood. So the job, then, is not just to present someone with facts, but to make them see that they are being duped; that they are cooperating in their own deception. If we can reach them at that point, it is possible that they won’t become a denialist.

Q: How does “willful ignorance” differ from “simple ignorance”?

A: Simple ignorance is when we don’t know something, but we can still be taught. If I believe that the Sun goes around the Earth, that is a mistake, and I can still change my mind. As long as I am open to new evidence, ignorance is not really that dangerous. Willful ignorance, on the other hand, can be very dangerous, for if it goes unchecked it can lead to denialism, which is when we stop taking in any new information and reject facts even when they are right in front of our faces. If no one challenges us, simple ignorance can become willful ignorance, which can then become denialism, and that is a very bad progression.

Q: Why is it that people tend to reject information that contradicts their beliefs and opinions, and how can this be harmful to society as a whole?

A: The roots can be found in social psychology. One of the most powerful forces in the human psyche is the drive to defend the idea that we are smart, worthwhile, competent people. When this is threatened by outside evidence, we sometimes choose to ignore the evidence. There is a lot of psychological research showing that we all have multiple cognitive biases wired into our brains, and relying on them can feel a lot like thinking. If we are motivated to believe something, it can be difficult to realize that this is the result of bias. This is especially true when those around us believe the falsehood too. Solomon Asch presented experimental evidence 60 years ago showing that when our peer group believes something that is obviously false, about a third of us will embrace the falsehood! That is to say, belief can be tribal. And this is dangerous because “groupthink” can lead us off a cliff. What we need are more people questioning and pushing back on the prevailing wisdom. Without that, it is easy to mistake popular agreement for truth.

Q: Numerous scientific studies have found evidence for the dangerous effects of binge drinking, smoking cigarettes, drug use and other similar behaviours. Why do you think that people continue to dismiss well-documented evidence for the harms of such behaviours?

A: I think a distinction has to be made between thinking that those behaviours are dangerous and thinking that, even if they are dangerous, the effects won’t happen to us; this is another cognitive bias, where we seem to imagine that we are going to be the exception rather than the rule. I think another thing going on here is that some people are excited by the idea of taking risks. Unfortunately, when we are engaging in addictive behaviour – or simply behaviour so dangerous that it can kill us on the first trial – it is hard to outrun the harms that accrue from these behaviours. Again, wishful thinking is a powerful force in human psychology. Few people these days probably think that smoking, drinking, and drugs are not dangerous … they just think it won’t affect them.

Q: At the same time, why do you think people engage in behaviours that we know are harmful for us?

A: I wish I knew the answer to this. Some of it is probably “weakness of the will,” where we know that eating fatty food, for instance, can have negative health effects, yet we still crave the taste. An interesting perspective here is that of the ancient Greeks, who believed that no one could ever knowingly do a bad thing, because it would be irrational to want to harm oneself. They thought that self-destructive behaviour was always the result of being misinformed. I doubt that this is true, but it is an interesting idea, because it does at some level seem irrational to want to harm oneself. That said, self-discipline is probably just as rare as perfect reason.

Q: It seems that people are particularly blinded when it comes to politics. Why doesn’t fact-checking help to dispel this ignorance of the truth?

A: I think that politics is an excellent example of tribalism. In politics, we “pick a team,” and what matters most is perhaps not having beliefs that are true, but having beliefs that fit with those of the other members of our tribe. We see this on Facebook every day. If I am a progressive, then I believe in universal health care, gay marriage, immigrant rights, etc. However, if I’m a progressive who also supports the Second Amendment, I will probably pay a price. I will literally “lose friends” for failing to have beliefs about gun control that conform to those of my group. This might subtly nudge me toward modifying my beliefs, or it might make me keep quiet. This happens even in the most extreme cases. Who cannot tell, by looking at the Trump inauguration photos side by side with Obama’s, that Trump had fewer people? If our “tribe” says that Trump had the largest inauguration in history, we might reject all sorts of facts and indulge in the wildest conspiracy theories about the media, just to preserve conformity with our group.

Q: How does our selective exposure to certain types of information influence what we believe to be the truth?

A: News silos are one of the most dangerous things for forming well-warranted beliefs. When we are motivated to believe that something is true, even if we don’t have any evidence for it, that motivation is a powerful force. When we are provided with the slightest scintilla of evidence that we are right, even if it is completely fake or comes from an unreliable source, this can multiply our resolve — this is why the phenomenon of fake news was so powerful in the 2016 presidential election. If you were already disposed to believe that Hillary Clinton was a bad person, and a piece of fake news came out suggesting that she and her husband were running a child sex ring out of a pizza restaurant in Washington, D.C., that can be devastating. One man actually showed up at the pizza shop in question and fired a round from his rifle as he sought to “investigate” the allegations. The story was ludicrously false, but those living in a news silo perhaps didn’t trust any other source to check its accuracy.

Q: Cognitive bias is the tendency for people to create their own “subjective social reality” from their experience of the world around them and social interactions. Why do people have cognitive bias, and how does it play a role in why facts don’t change our minds?

A: As I said, cognitive bias is wired into all of us, no matter how smart or rational we may be. The question of why these biases are there is a tricky one. For cognitive bias to have survived the process of evolution, it must have had some benefit. Yet what could be the benefit of holding false beliefs? One idea is that in the primordial environment, we didn’t need to engage in long chains of reasoning about things like climate change. Instead, we needed our brains to work fast and give us an answer (that might be a tiger, run!), whether those beliefs were justified by the evidence or not. So there might have been some original purpose to these biases. The question now is why they have stuck around. Perhaps evolution is just slow. Or perhaps, as some scholars now argue, the point of reason was never to arrive at true beliefs in the first place, but to persuade others with our argumentative skills, even if we were not right. There are a lot of theories, but no definitive answer yet. In the meantime, we’re stuck with these paleo-brains that are called upon to reach solid analytical conclusions about space travel, thermonuclear war, and global warming.

Q: In light of all of these barriers, how can we come to find truth?

A: I am a firm believer that forming true beliefs takes practice. We have to embrace the ideas of skepticism and critical thinking. We have to care about scientific evidence, which means not just finding facts that fit our beliefs, but also facts that challenge them, so we can be sure our beliefs are well-tested. As I argued in my book, it is important to have respect for those methods of reasoning that have customarily led to true beliefs. We shouldn’t just wait for reality to hit us between the eyes, although that does eventually work. One of the most enlightening stories I’ve read recently is that of Republicans in Coral Gables, Florida, who are far to the left of their party on the issue of climate change. Why? Because the water is flooding their homes and businesses. It’s not so easy to toe the partisan line when the water is rising. Whether the rest of us have to wait that long to become rational is strictly up to us. Just because we have cognitive bias doesn’t mean we must be imprisoned by it. Reason matters. Truth matters. I, for one, would rather recognize this sooner rather than later.

About Dr. McIntyre

Lee McIntyre, Ph.D., is a research fellow at the Center for Philosophy and History of Science at Boston University and an ethics lecturer at Harvard Extension School. He has previously taught philosophy at Colgate University, Boston University, Simmons College and Tufts Experimental College. Formerly Executive Director of the Institute for Quantitative Social Science at Harvard University, he has also served as policy advisor to the executive dean of the Faculty of Arts and Sciences at Harvard and as associate editor in the research department of the Federal Reserve Bank of Boston. He is the author of Respecting Truth: Willful Ignorance in the Internet Age (Routledge, 2015), Dark Ages: The Case for a Science of Human Behavior (MIT Press, 2006) and numerous other books, scholarly articles and edited volumes. His forthcoming book, Post-Truth, will be published by MIT Press as part of their Essential Knowledge series in the fall of 2017.