Why Some People Believe Fake News

Fake news is a legitimate threat to democracy, says Michigan State University psychology professor Zach Hambrick. And as you’ll hear him explain, there are reasons why some people believe fake news more than others. It’s an eye-opening conversation that reveals how all of us can fall victim to confirmation bias, the importance of fact-checking, and what happens to people’s beliefs when politics are removed from the equation.

Professor Hambrick’s Article in Scientific American: Cognitive Ability and Vulnerability to Fake News

Dr. Hallowell’s new book, ADHD 2.0, comes out January 12th. Pre-order Now!  Click here to pre-order your copy of ADHD 2.0.

Check out Dr. H on TikTok! @drhallowell

Thanks to our sponsor, OmegaBrite Wellness! Dr. H takes OmegaBrite supplements every day and that’s why he invited them to sponsor his podcast. SAVE 20% on your first order at OmegaBriteWellness.com with the promo code: Podcast2020.

Click HERE to learn more about our sponsor, Landmark College, in Putney, Vermont. It’s the college of choice for students who learn differently. Dr. H has an honorary degree from Landmark!

What do you think? Send an email with your thoughts to [email protected].

Distraction is created by Sounds Great Media. Our producer is Sarah Guertin and our recording engineer/editor is Scott Persson.

A transcript of this episode is below.


Dr. Ned Hallowell:
This episode is made possible by our sponsor, OmegaBrite Wellness. I’ve taken their omega-3 supplements for many years, and so has my wife, and that’s why I invited them to sponsor my podcast. I’m proud to have them. You can find all of their products online at OmegaBriteWellness.com. And bright is intentionally misspelled, B-R-I-T-E. OmegaBriteWellness.com. This episode is also sponsored by Landmark College, another institution that I have a warm personal relationship with, in Putney, Vermont. It’s the college of choice for students who learn differently. Learn more at LCDistraction.org.

Zach Hambrick:
What’s critical here is that people be critical of their own beliefs.

Dr. Ned Hallowell:
Yeah.

Zach Hambrick:
In this information age, where we’re constantly bombarded with information, some of which will be true and some of which will be false, we just have to be cognizant, A, that there’s a lot of mis- and disinformation out there and, B, that you have to take responsibility for your own beliefs and interrogate them to see whether or not they’re true.

Dr. Ned Hallowell:
Hello, and welcome to Distraction. I’m your host, Dr. Ned Hallowell. We have a fascinating topic and a fascinating guest to talk about that topic, particularly appropriate for the election era we’re in right now. I guess it’s not an era, it’s a time, but in any case, our guest is a psychology professor at Michigan State University, Zach Hambrick, co-author of the article Cognitive Ability and Vulnerability to Fake News.

Dr. Ned Hallowell:
Cognitive Ability and Vulnerability to Fake News, that appeared in Scientific American in February of 2018. And the article suggests that real fake news, this is an oxymoron, but real fake news is a serious problem. Analysis by BuzzFeed revealed that during the final three months of the 2016 U.S. presidential campaign, the 20 most popular false election stories generated around 1.3 million more Facebook engagements, shares, reactions, and comments than did the 20 most popular legitimate stories. That’s just amazing.

Dr. Ned Hallowell:
And the most popular fake story was Pope Francis shocks the world, endorses Donald Trump for President. So let’s get right into it. Why do some people have a hard time rejecting misinformation?

Zach Hambrick:
Well, one reason is that misinformation is often repeated. And the more a piece of misinformation is repeated, the more likely people will come to think it’s true. We should distinguish between misinformation and disinformation. Misinformation is some false piece of information where the person disseminating the information didn’t necessarily intend to mislead. Disinformation is information that’s intended, false information that’s intended to mislead. It’s a subtle distinction, but definitely relevant.

Dr. Ned Hallowell:
So, disinformation is a lie.

Zach Hambrick:
Right.

Dr. Ned Hallowell:
And misinformation is an accidental misrepresentation.

Zach Hambrick:
Right, right, right. [crosstalk 00:04:02]

Dr. Ned Hallowell:
Okay. Okay. So I could say today is Thursday, and that would be misinformation.

Zach Hambrick:
Right.

Dr. Ned Hallowell:
But if I said my name is Bill, knowing full well that my name is Ned, that would be disinformation.

Zach Hambrick:
Right. That’s the basic idea. Yeah.

Dr. Ned Hallowell:
So, why do we believe the lie, the more it’s told?

Zach Hambrick:
Well, one reason, based on research from cognitive psychology, is that as a false claim is repeated over and over, it becomes more familiar and we process it more fluently. For example, if you’re reading it, you’ll read it more quickly. And we use that fluency as a judgment for the truthfulness of something, rather than its actual truthfulness.

Zach Hambrick:
This is one of the mechanisms that seems to account for what’s called, in cognitive psychology, an illusion of truth. You’re rating the truthfulness of a claim or a piece of information based on its familiarity, based on how fluently you read it or how easily you process it, rather than on having some knowledge in your long-term memory that it’s the truth.

Dr. Ned Hallowell:
Well, how can the innocent citizen like me detect misinformation and disinformation? [crosstalk 00:05:49]

Zach Hambrick:
Yeah. Well, I think what this… As I write about in the Scientific American article, one of the things that you can do is begin to serve as your own fact checker. Okay? And if you hear a claim and you’re convinced that it’s true, then ask yourself why you think that’s true. Is it because you have some credible evidence that the claim is true? Or is it just because you’ve encountered it over and over? And on a related note, you should ask yourself if you know of any evidence that refutes the claim. And I think that not infrequently, if you query yourself in this way, you’ll be surprised to find that you actually do have some evidence that refutes the claim.

Dr. Ned Hallowell:
But most people… I’ll speak personally. I’m not going to take the time to do that. I’m not going to… So if I read, “Pope Francis shocks world, endorses Donald Trump for President,” I’ll say, “Wow, that’s amazing!” And I’ll just swallow it whole. So, am I not typical? Do most people naturally get skeptical?

Zach Hambrick:
Well, I think that’s another problem here, is that you come across something and you read it quickly on the subway or something. And you may not even remember where you got that piece of information.

Dr. Ned Hallowell:
Right, right.

Zach Hambrick:
And this is a problem with what we call source memory. You might forget that you saw this as a headline on the National Enquirer in checkout at the grocery store and not in the Guardian or the New York Times or The Economist. And I think that, again, what’s critical here is that people be critical of their own beliefs.

Dr. Ned Hallowell:
Yeah.

Zach Hambrick:
In this information age, where we’re constantly bombarded with information, some of which will be true and some of which will be false, we just have to be cognizant, A, that there’s a lot of mis- and disinformation out there and, B, that you have to take responsibility for your own beliefs and interrogate them to see whether or not they’re true or credible.

Dr. Ned Hallowell:
Well, I mean, how much does a, what’s the term, observant? What’s the term where you agree with what you want to…

Zach Hambrick:
Illusion of-

Dr. Ned Hallowell:
What is it called?

Zach Hambrick:
Illusion of truth?

Dr. Ned Hallowell:
Yeah. Okay. How much of that is, do we believe what we want to see and not believe what we don’t want to see?

Zach Hambrick:
Well, that’s another dimension to this. It may well be the fact that people are more likely to believe misinformation that comports with their preexisting beliefs.

Dr. Ned Hallowell:
So, is that what’s meant, is confirmation bias?

Zach Hambrick:
That would be an instance of confirmation bias, yes. That, along with seeking out information that confirms your preexisting beliefs: only looking for, focusing on, and processing the information that agrees with what you already believe.

Dr. Ned Hallowell:
So, how do you take someone who doesn’t want to… Let’s take climate change as an example. I happen personally to believe that’s one of the most pressing problems that the world faces today, but there are intelligent, responsible people who think I’m completely wrong and that it’s all a hoax, that it’s just some kind of made up scare tactic that liberal politicians have invented to, I don’t know… I don’t know why they do it, but in any case, responsible people can disagree about the validity of global warming, which I personally think is the most pressing emergency that we’re facing in the world today.

Zach Hambrick:
Right.

Dr. Ned Hallowell:
So, what do I do, and what do the people who think I’m full of it do?

Zach Hambrick:
Right. Well, there’s this interesting research that seeks to use online forums where people discuss and try to come to an understanding of issues in groups. And what this has found… There’s another… I actually wrote a Scientific American article with my colleague and friend Jonathan Jennings on this. Jonathan is a director of an environmental organization called Health and Harmony.

Zach Hambrick:
And so what they did in this was quite interesting. They gave people a graph showing, I believe it was the amount of ice in the Arctic Sea. Yes, it was the amount in the Arctic Sea. And their task, working in groups, was to make a forecast for the future. Okay?

Dr. Ned Hallowell:
Mm-hmm (affirmative).

Zach Hambrick:
And the overall trend is in fact downward, indicating further loss of ice in the Arctic Ocean. And what they did was quite clever in this experiment. In one condition, they didn’t make any political orientation information salient among the group members who were chatting in this online group, and in another condition they did.

Zach Hambrick:
And basically, the finding here was that the groups were more accurate when political information was not made salient. And these groups included both conservatives and liberals, people who identified as such. And so the implication here is that when political information is made salient, whether you’re a Republican or a Democrat, a liberal or a conservative, people have a hard time thinking rationally.

Zach Hambrick:
And when we set aside politics, we can actually… There’s a real value to having conversations with people with whom we disagree. In fact, we’re more likely to come to the right answer, in this case with respect to climate change. And so I think that’s one way. If there were ways in which people could harness the power of what we might call collective intelligence and work together to solve these difficult problems, while setting aside politics, then I think we would all be better off.

Dr. Ned Hallowell:
Well, why can’t computers do that for us? I mean, we can make computers talk to each other.

Zach Hambrick:
Right. Well, to some degree they can, like make forecasts. I mean, definitely computer models make forecasts that are relevant, for example, to climate change. But then in the end, people have to interpret what the computers say. And in the end, it’s humans making decisions based on value judgements that are informed by, but not completely dictated by, evidence.

Zach Hambrick:
So in the end, when we decide about what we want the world to be like, whether it’s with respect to environmental legislation or any kind of legislation for that matter, then I think that humans are ultimately making the decisions. And computers, they can inform those decisions, but there’s still a human interpreting what the computer model, for example, says.

Dr. Ned Hallowell:
Over the past few months, I’ve spoken to my friend, the founder and creator of OmegaBrite Wellness, Dr. Carol Locke, about the benefits of taking OmegaBrite’s Omega-3s, CBD, and other supplements. Here’s a clip from one of those conversations.

Dr. Ned Hallowell:
Well, I’ve certainly found them to be mood stabilizing. My mood is all over the place. I don’t know what diagnosis I have other than ADHD, but my mood is very labile, up, down, in-between, and quick to change. And I’ve found that the omega-3s really helped me with that, not only with my musculoskeletal issues but the mood issues as well. It is a wonder drug. I mean, what can I say?

Dr. Carol Locke:
Thank you. We hear that a lot from people. Particularly in the pandemic, we’re hearing from customers that they’re finding it essential with their mood. They’re also finding the OmegaBrite omega-3 essential in their relationships. Keeping their mood stable, positive, and feeling less anxiety helps them with their family relationships.

Dr. Carol Locke:
And I think anything we can do to help kids, parents, and teachers right now, because of this added stress of do they go back to the classroom, a changed classroom with partitions and masks and social distancing, or are they at home with their parents who are stressed is such a powerful situation. I think we want to help give people tools to put in their toolbox to succeed and to feel like they’re thriving and able to learn during this stress.

Dr. Ned Hallowell:
Distraction listeners, you can save 20% on your first order at OmegaBriteWellness.com by using the promo code PODCAST2020. All right. Let’s get back to today’s topic.

Dr. Ned Hallowell:
Well, as a professor of cognitive psychology, to what extent do you think emotion, confirmation bias, and lack of information contribute to our so-called opinions?

Zach Hambrick:
I think they contribute greatly. And I think they especially do when we’re talking about highly politicized issues, whether it’s abortion, or the death penalty, or meddling in elections, and so on. We know, in fact, from a lot of research on the general topic of rationality, that our preexisting beliefs, our politics, influence the way we think and behave. Even highly intelligent people are prone to irrationality. This was really the fundamental insight of a program of research by Daniel Kahneman and Amos Tversky-

Dr. Ned Hallowell:
Kahneman, yeah.

Zach Hambrick:
… beginning in the 1970s. And Daniel Kahneman won the Nobel Prize for this. And so, yes. I think it’s a big problem. And-

Dr. Ned Hallowell:
And Kahneman’s basic thesis was what, is that we are far more irrational than we’re aware of?

Zach Hambrick:
Yeah. I mean, his basic demonstration and basic argument is that people make decisions based on intuition rather than reason. And those intuitions might be right, but they quite often are wrong. And this leads people to make irrational decisions.

Dr. Ned Hallowell:
I always like to remind myself there was a time in history when the absolute smartest, most intelligent people in the world knew, quote, unquote, the world was flat.

Zach Hambrick:
Yeah. Yeah. There you go. Exactly.

Dr. Ned Hallowell:
It brings you up short. So we are, by nature, pretty easy to manipulate, if somebody knows just what buttons to push.

Zach Hambrick:
We can be. We certainly can be.

Dr. Ned Hallowell:
So what’s our best safeguard against that, Zach? What’s the-

Zach Hambrick:
Well, one is the kind of engineered environment that I was talking about before, where for example, people are interpreting evidence concerning climate change and they’re doing so in an anonymous virtual setting.

Dr. Ned Hallowell:
Right.

Zach Hambrick:
Another is basically having the, we call it metacognition. It’s thinking about your own thinking, having the metacognitive skills to know when you’re prone to errors in making judgements and decisions. For example-

Dr. Ned Hallowell:
So it’s like going shopping when you’re hungry.

Zach Hambrick:
Right. Exactly. That’s a good analogy, or knowing that there is such a thing called the confirmation bias. How do you make yourself less susceptible to confirmation bias? Well, the first way is knowing that such a thing exists.

Dr. Ned Hallowell:
Right.

Zach Hambrick:
And there’s some evidence to suggest that training about such things de-biases people, at least to some degree. I think those are examples of ways in which we might, if not make ourselves completely immune to these sorts of errors and biases in judgment and decision-making, at least make ourselves less susceptible.

Dr. Ned Hallowell:
I want to tell you about Landmark College in beautiful Putney, Vermont. It is the best college in the world for students who learn differently with ADHD, for other learning differences, or autism spectrum disorder. It’s fully accredited, not-for-profit, offering bachelors and associate degrees, bridge programs, online dual enrollment courses for high school students, and summer programs.

Dr. Ned Hallowell:
They use a strength-based model at Landmark which, as you know, is the model that I certainly have developed and subscribed to, to give students the skills and strategies they need to achieve their goals in life and really expand upon what they believe they’re capable of doing. It is just a wonderful, wonderful place and I can’t say enough good about it.

Dr. Ned Hallowell:
I myself have an honorary degree from Landmark College, of which I am very proud. Landmark College in Putney, Vermont is the college of choice for students who learn differently. To learn more, go to LCDistraction.org. That’s LCDistraction.org.

Dr. Ned Hallowell:
Okay. Let’s get back to today’s topic. It’s humbling, it really is, to know how, at least in my own case, how easy it is to manipulate me, whether it’s with food or money or whatever temptation I might [crosstalk 00:22:03].

Zach Hambrick:
Right. Well, that’s right. And I think that your awareness of it is absolutely critical, because now you want to understand. Okay, you recognize that, you’ve had the humbling insight, that you’re prone to making irrational decisions and that your buttons can be pushed. And so now, beginning with that insight, you seek out ways in which you can kind of protect yourself from that.

Zach Hambrick:
There’s this amazing research by a psychologist named Philip Tetlock, where they basically tried to identify people who were good at predicting world events, political events like, “Will Iran and Iraq go to war,” and stuff of this sort. This is a program of research funded by the military. And they identified people who they called superforecasters.

Zach Hambrick:
And the superforecasters were able to forecast these seemingly unpredictable events, or at least very difficult to predict events, better than anyone else. And they did so by not falling prey to biases like the confirmation bias. They were bright, but they weren’t geniuses in the traditional sense of the term intelligence. What they did know about were these kinds of biases in judgment that we all seem to be prone to, and they were able to avoid making errors based on those biases.

Dr. Ned Hallowell:
And was there any variable that separated the superforecasters from the rest? Or is there any cognitive trait?

Zach Hambrick:
That was the one. That was the one. It wasn’t intelligence.

Dr. Ned Hallowell:
Right.

Zach Hambrick:
It was knowledge of biases in judgment and decision-making, like the confirmation bias, the my-side bias, and so on.

Dr. Ned Hallowell:
So like in football betting, people are more likely to bet on the home team.

Zach Hambrick:
Yeah. There you go. Yeah. A perfect example, yes.

Dr. Ned Hallowell:
Yeah. So if you-

Zach Hambrick:
I guess, let me just say a little bit more. They were also people who didn’t think they had one big idea that could dictate all of their predictions. Instead, they were information seekers who were willing to change their minds if that’s what the evidence dictated. They weren’t ideologues, maybe, to put it another way.

Dr. Ned Hallowell:
Yeah.

Zach Hambrick:
They were people who sought out evidence and revised their beliefs as that evidence dictated. They weren’t dogmatic.

Dr. Ned Hallowell:
Yeah. Well, and again, that brings me back to wondering why couldn’t a computer do it better, because there’s no emotion involved? But then, the computer is only as good as the information you give it.

Zach Hambrick:
Well, yeah. That’s exactly right, the computer’s only as good as the information that you give it. Somebody has to write the programs. And in fact, computers are better than humans at certain things, like predicting the stock market.

Dr. Ned Hallowell:
Are they really?

Zach Hambrick:
That’s not… Go ahead.

Dr. Ned Hallowell:
I said, are they really? That’s interesting.

Zach Hambrick:
Yeah. In general, there was a long series of studies beginning in probably the 1960s, 1970s showing that statistical models do better than humans in predicting certain things, the weather-

Dr. Ned Hallowell:
For all of the reasons we’ve been discussing.

Zach Hambrick:
That’s right, yeah.

Dr. Ned Hallowell:
Well, this is so spot on for living in today’s world. So what did your article conclude? I haven’t read it. I will, but-

Zach Hambrick:
The take-home message of the article… This is the Scientific American article on the illusion of truth. The take-home message is, as I said before, that we have to be our own fact checkers in this information age, in this misinformation age. And this type of research that I write about… Incidentally, this is not my own research, it’s other cognitive psychologists’ research… shows that fake news poses a real threat to democratic society.

Zach Hambrick:
This research really underscores this threat that fake news poses to democratic society. And of course, the aim of fake news is to make people think and behave in ways that they wouldn’t otherwise-

Dr. Ned Hallowell:
Right.

Zach Hambrick:
… including to hold views that are contradicted by scientific consensus and scientific evidence. This is very relevant during this pandemic. Recently, Trump tweeted that he was immune from COVID-19. I don’t think any doctor who knows anything about this would agree with that. He may be, he may not. We don’t know.

Zach Hambrick:
And so when this nefarious aim of fake news is achieved, we as citizens no longer have the ability to act in our self-interest. We’re misled. We’re deluded. And this, of course, isn’t just bad for an individual, it’s bad for society as a whole, as starkly illustrated by the pandemic.

Dr. Ned Hallowell:
Yeah. It just brings you up so short. I mean, everyone would love for our president to be immune. And so when they hear him say he is immune, the confirmation bias says, “Okay, great. You’re immune.”

Zach Hambrick:
Well, that’s right. So, what comes out of his mouth, some people think of that as news.

Dr. Ned Hallowell:
Yeah.

Zach Hambrick:
And in this case, that is fake news.

Dr. Ned Hallowell:
Right, right, right.

Zach Hambrick:
And so this research, again, just highlights the pernicious effects of misinformation and disinformation in a democratic society.

Dr. Ned Hallowell:
Boy, it’s really cautionary. You have to not let yourself be seduced into believing what you want to believe. It’s hard not to, because it’s very tempting to just think [crosstalk 00:29:28]

Zach Hambrick:
Right. And we have to constantly ask ourselves, “Why do I believe this is true?”

Dr. Ned Hallowell:
Right. Right. And then, as you say, talk to someone who disagrees with you, wrestle with it instead of just talking to people who agree with you.

Zach Hambrick:
Sure, that’s right.

Dr. Ned Hallowell:
If you’re liberal, you should watch Fox News. And if you’re a conservative, you should watch MSNBC. You should test it out a little bit anyway.

Zach Hambrick:
Dig a little deeper than the news story.

Dr. Ned Hallowell:
Yeah. Don’t just eat what you’ve been fed. No, exactly, exactly. Then we wonder about the motive of the person who’s saying it, and what are they leaving out, and what are they lying about, frankly? Well, you’re a very smart man. And thank you so much. Zach Hambrick and his article, Cognitive Ability and Vulnerability to Fake News. He’s a psychology professor at Michigan State University, a great university. And it’s a pleasure talking to you. Thank you so much for coming on the podcast.

Zach Hambrick:
Thank you. Okay. Thanks. Goodbye.

Dr. Ned Hallowell:
Take care. Okay. Well, that’s our show for today. Please continue to reach out to us with your questions, comments, and show ideas. Email us at [email protected]. That’s [email protected]. Distraction is created by Sounds Great Media. Our producer is the wonderful, ebullient Sarah Guertin, and our recording engineer and editor is the equally ebullient and wonderful Scott Persson, and that’s with two S’s in Persson. I am Dr. Ned Hallowell with four L’s. Thank you so much for joining me and we’ll see you next time.

Dr. Ned Hallowell:
The episode you just heard was made possible by my good friends at OmegaBrite Wellness. I take their supplements every day and that’s why I invited them to sponsor my podcast. Shop online at OmegaBrite, and that’s B-R-I-T-E, Wellness.com.
