The Problem of the Lazy Brain
Do you remember The Dress? It was back in February of 2015. Tens of millions of people saw an online photo of a striped dress and had dramatically different opinions about what color it was. Some thought the dress was white and gold, while others, looking at precisely the same photo, thought it was blue and black. So what was the factual color of that dress? What was the truth?
The Dress — an example of how people can see the same raw information in such different ways — comes to mind amid discussion of our “post-factual” or “post-truth” era, in which supporters of President-elect Donald Trump seem to accept his penchant for demonstrably erroneous assertions and, it must be said, his outright lies. The public’s lack of critical thinking is also being scrutinized — and lamented — in a world of so-called “fake news,” a modern media era of unprecedented information freedom in which anyone can say anything and spread it instantly around the world, and where anyone can find and rely on only the “news” that comfortably confirms their beliefs.
These circumstances are deeply worrisome, and represent a true crisis in critical thinking. But to combat the problem, we must first understand it, and that includes recognizing that what we’re witnessing now is, in many important ways, a reflection of the inherent nature of human cognition. A host of recent research has explored the inner workings of that cognitive system, and it suggests that, at one time or another, we are all susceptible to being duped — by others and by our own minds. The science also helps to explain why the late American novelist and Nobel laureate William Faulkner was spot-on when he observed that “Facts and truth really don’t have much to do with each other.”
Perhaps the most important of these discoveries is that the brain is lazy. It instinctively works no harder than necessary, relying on all sorts of mental shortcuts — “heuristics and biases,” in the academic jargon — to make sense of things without thinking too long or too hard, or even consciously thinking at all. Cognitive psychologists suggest a few reasons why relying on these tools, rather than on slower, more careful thinking and reasoning, may have helped us survive.
For starters, the brain is a glutton for glucose. It takes more calories to do the heavier mental lifting of paying attention and thinking things through than it takes to just jump to quick, easy conclusions. As the modern brain was evolving, our ancestors weren’t sure when or whether the next meal might come, so saving calories mattered for survival.
We also had to make sense of information quickly, in case that information indicated imminent danger. Slower, deliberative decision making was literally so dangerous that our ancestors who tried it were weeded out of the gene pool. Faster response was simply safer. This almost certainly contributed to what the polymath Herbert Simon labeled the “bounded rationality” of the human mind. Perfect rationality requires more time, information, education and experience than most of us will ever have. So we developed cognitive tools that help us deal with life’s immediate demands and make judgments and decisions on the fly, before we have all the evidence or all the smarts necessary to fully understand a situation.
The lazy brain helps explain why fake news works — and it does: A recent poll conducted for BuzzFeed News “found that 75 percent of American adults who were familiar with a fake news headline viewed the story as accurate.” It works because, speaking generally, it’s easier to jump to conclusions than to question, doubt, dig deeper, and think more analytically. That’s not to say we aren’t capable of such careful conscious reflection, but the brain prefers to avoid the heavier mental lifting that comes with critical thinking.
Of course, the lazy brain doesn’t explain everything, particularly in a socially complex culture like ours. We are, after all, susceptible to fake news, distortions of the truth, and even outright lies, for other reasons. Psychological research, for example, has found that most of what we call “reasoning” – the cognitive process of mulling things over to come up with our perceptions and views — is highly motivated. We reason about things for specific reasons, and objectively analyzing the facts is rarely the main reason. Rather, we tend to consider the facts in ways that serve basic animal needs, particularly the need for survival, which for social animals like us depends to a great degree on being part of a community.
This explains what Dan Kahan, a professor of law and psychology at Yale University, has found in his Cultural Cognition research. We tend to shape our beliefs and perceptions so that they match those of the group — or groups — with which we most closely identify. Demonstrating loyalty to our group earns us the support and protection of everyone in the group. And if everybody in our group shares the same views, that consensus and cohesion gives our group more power in the competition with other groups over how society should operate. Helping the group do better, then, helps each member individually.
In practice, this means that when news and information appear consistent with our group’s general attitudes, we tend to believe them more readily. That helps explain how people in groups that dislike Hillary Clinton might believe that the former Secretary of State and U.S. Senator ran a child sex ring from a Washington, D.C. pizza restaurant — a fiction spread on the internet by, among others, Michael G. Flynn, the son of Donald Trump’s pick for national security adviser.
Conversely, when information conflicts with our group’s view, we often find ways to reject it. Witness, for example, how conservatives reject scientific evidence that the climate is warming and that humans are causing it, or how liberals reject decades of research that has found no human health risk from genetically modified foods.
The idea of “Cultural Cognition” helps to explain why we seek information from sources that frame the facts in ways that align with our group’s views and values. Not only does that spare our lazy brains the mental effort required to keep an open mind and continually think things through, but confirmation of our beliefs literally feels good. We have emotionally positive reactions to information that confirms our group’s view, and negative emotional reactions to information that challenges our group’s view.
Try watching Fox News or reading Breitbart if you’re a liberal, or watching MSNBC or reading Daily Kos if you are a conservative, and measure your pulse. Odds are the stress of exposing yourself to information that conflicts with your group identity will raise your heart rate.
It’s worth noting, too, that Cultural Cognition is not confined to people with less education or intelligence. Kahan and his colleagues have found that people who rank higher in those attributes are actually better at perceiving the facts the way they want to. A more capable brain is just better at doing what it has evolved to do: see things in ways that enhance survival.
That brings us to Scottie Nell Hughes, a conservative pundit who served as a surrogate for Trump during the campaign. Hughes was roundly mocked — mostly by people in groups opposed either to Donald Trump (Democrats, liberals) or to fake news (journalists, academics) — when she declared on The Diane Rehm Show late last month that “There’s no such thing, unfortunately, anymore, as facts.”
Those mocking her might do well to remember that the 19th-century German philosopher and cultural critic Friedrich Nietzsche once observed precisely the same thing: “There are no facts, only interpretations.”
Consider, too, the words of Pope Francis just last week: “Using striking terminology, Francis said journalists and the media must avoid falling into ‘coprophilia’ – an abnormal interest in excrement,” The Guardian newspaper reported. “Those reading or watching such stories risked behaving like coprophagics, people who eat feces, he added.”
In short, the pontiff said, apologizing for his bluntness: We tend too readily to eat shit — and in a nutshell, he’s right. We have a natural tendency to readily swallow bullshit. We always have and always will. It’s an inherent facet of human cognition.
So what can we do to avoid the dangers caused by our inherent gullibility and motivated reasoning? We can, and should, fight back by demanding better self-policing by the Googles and Twitters and Facebooks of the world to control the spread of misinformation in the first place. We should also encourage robust fact-checking in both traditional and social media, and call out liars who make a living, or promote their ideologies, by feeding our coprophagia.
But this can take us only so far, and objective reason alone will not solve the problem. Throwing evidence at people that challenges the beliefs they hold as part of their group affiliation threatens their tribal identity, which threatens their sense of safety. That triggers the motivated reasoning that leads some people to stubbornly refuse to accept what the evidence clearly says.
Rather than struggle against the inherently subjective nature of how we perceive things — from the color of a dress to whether we think a candidate lied — we may be able to encourage people to think about things a bit more objectively by taking advantage of another of our basic cognitive reflexes: the instinct to survive.
Risk perception, after all, has a powerful influence on how we see the world. We are exquisitely sensitive to signs of danger. And a post-factual world is a dangerous place. No matter what individual groups we’re in, we all face profound threats from a world where our lazy brains and motivated thinking run amok, whether it’s from armed vigilantes “investigating” sex rings in pizza shops, trumped-up nationalism raising the likelihood of war, or climate denial imperiling billions of people.
So along with fact-checking, calling out the lies and the liars, and putting reasonable controls on the tools being used to take advantage of our inherent gullibility, perhaps an even more effective way to fight back against a post-factual world is to make it clear that living in such a world is itself a cause for concern. Living in a more evidence-based world isn’t mostly about being smart, or right, after all. It’s mostly about staying safe.
David Ropeik is an author and risk-perception consultant assisting businesses, governments, non-profits, and other organizations in their understanding of human perceptions of risk, and in navigating the challenges of effective risk communication. A full list of his current and former clients is available here.
Archived comments are below.
This essay is a bit simplistic on a few things. One is that Fox News and MSNBC would both raise my pulse, and for the same reason: they both resort to exaggerated, dramatic news stories that often leave out significant pieces of information. At best a stress response would only partially be due to the views opposing my own. A second issue is that, unless you’re a scientist talking to another scientist, facts are a matter of interpretation and culture. A wavelength of 650nm is a wavelength of 650nm no matter what you call it, but depending on the person you might get various names for red or even a color other than red. A final fact is that science is never “done”. Everything is really a theory. This doesn’t mean everything is relative. Some theories have so little or no evidence against them that we accept them as fact, things like gravity. Other things, for example in the medical field, are often accepted as fact and yet are highly speculative, since the issues are complex, our knowledge is incomplete and humans are all different. I would enjoy an essay by the author that explores the complex issue of issues that are complex. Things like global warming have a large weight of evidence but also suffer from gaps in knowledge and imperfect predictions, so there is room for people to try to deny it altogether. This in part stems from people wanting to reinforce their beliefs, but also partly from poor science education and critical thinking skills in general.
I tend to agree with James that this article is a nice easy read for a lazy brain but lacks nuance. I am not a conservative, but I find all sorts of problems with the “science” of global warming. I am generally centrist in my views because I was trained as an intelligence analyst, so I look for biases and inconsistencies when presented with information. I also look for motivation. Here in Australia our news media has become ideologically driven to the point that every news story begins with an ideological proposition followed by a set of cherry-picked statements that support that proposition. It is therefore very difficult to derive any real pleasure from simple reporting, as I am always doubting the motivation of the reporter.
The fight in Syria, for example, is one of the most complex conflicts in recent memory, but it is still treated as a baddies-versus-goodies cowboy Western.
I am unable to form a morally sound view on this conflict let alone derive a militarily sound solution.