Know Our Lazy Brains Need to Wake Up
The term “post-truth” entered mainstream use in 2016. It refers to “the disappearance of shared objective standards for truth,” which blurs the lines among facts, alternative facts, knowledge, opinion, belief, and truth.
Unfortunately, our brains are complicit in this.
We Don’t Care If It’s True
Yes: it’s come to that.
Research by two professors at the Rady School of Management at UC San Diego indicates what we suspected all along: we’re attracted to things that interest us and don’t care whether they’re true.
They reached this conclusion by studying papers published in leading psychology, economics, and general-science journals. A substantial share of the experiments in their sample (61% in psychology, 39% in economics, and 38% in Nature/Science) could not be replicated by other researchers.
Yet these “non-replicable” studies were cited 153 times more often than research whose results could be repeated. And, it turned out, only 12% of the non-replicable papers ever had a correction published later. Even then, the original research continued to be cited as though its findings were factual. So, over time, unreliable research has a bigger influence than the truth!
Why? The professors point to “interest.” Journals appear to have a lower threshold for accepting interesting papers because they believe readers find them more attractive (and journals need readers).
Interesting results also grab the attention of social media outlets, which then spread the misinformation wider.
We’re Bad at Checking Sources—and Getting Worse
Today’s students have a hard time distinguishing actual from fake news online.
Stanford’s Graduate School of Education studied students in 12 states, asking them to evaluate information in tweets, comments, and articles. The researchers called the results “shocking”:
- Most middle school students couldn’t tell native ads (sponsored content) from news articles
- Most high school students accepted photographs at face value, without verifying them
- Many high school students couldn’t distinguish a real news source from a fake one on Facebook
- Most college students didn’t suspect potential bias in a tweet from an activist group
- Most Stanford students couldn’t tell a mainstream source from a fringe one
Why Our Brain Isn’t Helping
Psychologist and Nobel laureate Daniel Kahneman, in Thinking, Fast and Slow, says our brain defaults to “fast” thinking because it’s easier.
In Seven and a Half Lessons About the Brain, Lisa Feldman Barrett, distinguished professor of psychology at Northeastern University, offers a reason for this. She says the brain’s main job is to allocate the body’s energy to important tasks while making sure you don’t deplete your resources in the process.
And don’t get me started on confirmation bias. Our brains look for information that supports our beliefs. When we encounter data that disagrees with our views, we actively work to disprove and discredit it.
Leader Beware
There are two major lessons for us here.
Find interesting ways to present ideas to your people. They won’t engage with something just because it’s true, and burying them in facts and figures won’t convince them.
The two most compelling ways to present anything are 1) telling stories and 2) showing people what’s in it for them. Kahneman and Barrett would tell you, “Don’t expect people’s brains to do additional work on why they should care.” That’s your job as a leader.
Check your—and their—sources. It took the British medical journal The Lancet 12 years to retract a non-replicable study linking the MMR vaccine to autism. And plenty of well-meaning people concerned about vaccination still cite it as true.
Know where you’re getting your information, and encourage your people to be more thorough, too. Psychologists call it the illusory truth effect: the more often something is repeated, the more likely people are to believe it’s true—even when it’s not.
Use this understanding of how everyone’s brains work to better connect with your people. And make it easier for them to connect with you and their work.