
Know This First: Risk Perception Is Always Irrational.

In 2011, the city leaders of Calgary, Alberta, bowed to public pressure and ended fluoridation of the local drinking water, despite clear evidence that the benefits of fluoridation vastly outweigh its risks. A recent study found that second graders in Calgary now have 3.8 more cavities, on average, than a similar group had back in 2004-05, when the water was still being treated.


In West Virginia, legislators in favor of shrinking government recently passed a law allowing the sale of unpasteurized milk, despite convincing evidence that raw milk is a vector for pathogens like Salmonella, E. coli, and Listeria. To celebrate, the bill’s sponsor shared some raw milk with his colleagues, several of whom got sick. The legislator says it was just a coincidence.

Since the 2011 nuclear accident in Fukushima, Japan, fear of radiation has prompted thyroid cancer screening for all children in the prefecture. The levels of radiation to which kids had been exposed were too low to pose significant danger, and the sensitive ultrasound screening technique is well known to find abnormal cells in most people’s thyroids, though in nearly all cases those cells will never cause cancer. As a result of this unprecedented scrutiny for an infinitesimal risk, hundreds of kids have had their thyroids removed unnecessarily, with far-reaching health implications for the rest of their lives.

A Canadian couple is mourning the death of their 19-month-old son from meningitis. They hadn’t vaccinated him, and treated him with natural remedies like horseradish root and olive leaf extract, refusing medical attention until the boy was unconscious and near death. They are facing criminal charges.

For anyone outside the emotions that produced these choices, it’s hard not to feel frustration at hearing about them. It’s hard not to call them ignorant, selfish, and irrational, or to label such behavior, as some do — often with more than a hint of derision — “science denialism.” It’s hard, but resisting that impulse is necessary, because treating such decision-making as merely flawed thinking that can be rectified with cold, hard reason flies in the face of compelling evidence to the contrary.


In fact, the evidence is clear that we sometimes can’t help making such mistakes. Our perceptions, of risk or anything else, are products of cognitive processes that operate outside our conscious control — running facts through the filters of our feelings and producing subjective judgments that disregard the evidence. The behavioral scientists Melissa Finucane and Paul Slovic call this the Affect Heuristic; it gives rise to what I call the risk perception gap, the dangers produced when we worry more than the evidence says we need to, or less than the evidence says we should. This is literally built into the wiring and chemistry of the brain. Our apparent irrationality is as innate as the functioning of our DNA or our cells.

And some of the specific emotional factors that cause us to worry too much, or too little, seem to be built in too. Psychological research has identified a number of emotional “fear factors” that make some potential threats feel scarier than others, no matter what the evidence might say:

• Human-made risks scare us more than natural ones, so we are more likely to fear vaccines, radiation from nuclear plants, and industrial chemicals like fluoride than we are to fear “natural” hazards like unpasteurized milk, natural medicines, or carcinogenic radiation from the sun.

• Imposed risks scare us more than those we take voluntarily. Fluoridated drinking water, childhood vaccination, and pasteurization are all mandated by government. And radiation from nuclear accidents is imposed on us, compared with radiation from the sun, to which we expose ourselves voluntarily.

• Risks to kids scare us more, fueling fear of fluoride and childhood vaccines.

• We subconsciously weigh possible harms against potential benefits. When diseases like measles disappear, the benefits of the vaccines that vanquished them are no longer obvious. Fear of fluoride is greater now than it was when most Americans lost many teeth to decay.

Research has found that these emotional guideposts for quickly assessing danger are shared by people around the world. The evidence suggests they are somehow a built-in part of being human.

Making matters worse for fans of cold, hard, objective rationality, neuroscience has found that we apply these instinctive emotional filters quickly, before the cortical areas that do our thinking have a chance to chime in. And even after reason finally has a voice, the brain gives more weight to our feelings than to facts alone as it produces our judgments and behaviors.

We feel first and think second, and most of the time we feel more and think less. It’s just how the brain works.

The evidence from decades of research in a range of fields is convincing. If the definition of “rational” is “thinking based on facts or reason and not on emotions or feelings” (Merriam-Webster), then humans are decidedly not rational. The truth is closer to what Charles Darwin observed as he stood at a puff adder exhibit in the London Zoo, nose pressed against the glass, knowing he was safe but unable to keep from flinching whenever the snake struck:  “My will and reason were powerless against the imagination of a danger which had never been experienced.”

Or as Ambrose Bierce wrote in The Devil’s Dictionary, “Brain, n. An apparatus with which we think we think.”

Somehow, though, we continue to deny the evidence and cling to our anthropocentric faith in human intellectual power and the myth of our ability to use dispassionate, objective analysis to know “the truth.”  In that belief, we sniff with superiority at those whose perceptions of risk don’t match the facts, as though we’re smarter because we can see what those science-denying dummies can’t.

This arrogance is particularly common in the scientific, engineering, and academic communities — cultures that rest on the bedrock Enlightenment belief in reason and logic. This is curious — and an interesting example of science denialism in itself — because the scientific evidence, supported by countless examples from the real world, makes it inescapably clear that we are not the perfectly rational creatures we like to think we are. All perception is subjective. All reasoning is “motivated.” The risk perception gap may produce its fair share of Darwin Award winners, but it’s an intrinsic part of human behavior.

 

Doesn’t reason itself argue for a more humble attitude? Doesn’t the evidence about the inherently affective nature of human cognition suggest that we accept that we sometimes make mistakes, and use that knowledge to learn how to avoid making them again and again? We can apply such insights to how we communicate about risk, going beyond the “deficit model” belief that people make these mistakes because they don’t have all the facts, and that they’ll get things right after communication fills in what they don’t know.

Research finds that risk communication is more effective when it also demonstrates an understanding of, and respect for, how people feel. That research suggests avoiding messages like “You shouldn’t worry about vaccines” (or fluoridation or nuclear power), which imply that the communicator is right and the information recipient’s feelings are wrong. That’s a surefire way to trigger hostility toward the communicator and resistance to the message. Science and health journalism, which does a reasonable job of alerting the public to dangerous instances of the risk perception gap, should also explain the underlying psychology that produces each gap. It’s a fascinating and central part of these stories.

But communication can never reshape everyone’s deep passions about what feels safe. Risk management policies must account for the mistakes we sometimes make, and here we can take a lesson from the way governments around the world are using behavioral economics to nudge people toward choices that maximize public and environmental health.

For example, some states have improved childhood vaccination rates just by requiring more of parents who want to opt out than a simple signature on a one-page form. The government of Finland encouraged several communities to consider hosting a high-level nuclear waste repository by offering economic incentives — rebalancing the equation of risks and benefits — and by giving the communities the choice to say yes or no. Two communities actually competed to host the repository, which is nearing completion in the municipality of Eurajoki. A similar consent-based process is being designed by the Department of Energy for nuclear waste disposal in the United States.

We can’t make risk perception perfectly rational. We can’t intellectually overpower the emotions that our subconscious cognitive system relies on. Human-made risks will always worry us more than natural ones. Risks that are imposed on us will always concern us more than risks we choose to take. Risks to kids will always be scarier.

We can, however, start to close the risk perception gap and minimize its dangers by applying what we’ve learned about why it occurs. But that can happen only if we achieve a more modern, post-Enlightenment definition of “rationality,” and accept that intelligence and reason can take us only so far.

Sophocles’ statement “Reason is God’s crowning gift to man” may appeal to our egos. But it would be more realistic — and safer — to adopt the sense of rationality captured by the Italian philosopher Nicola Abbagnano: “Reason itself is fallible, and this fallibility must find a place in our logic.”

 

David Ropeik is an author and risk-perception consultant assisting businesses, governments, non-profits, and other organizations in their understanding of human perceptions of risk, and in navigating the challenges of effective risk communication. A full list of his current and former clients is available here. The views expressed in this op-ed are his own.