Opinion: To Sway Science Skeptics, First Listen to Them

Scientists need to acknowledge that emotions and values play a legitimate role in shaping public opinions about risk.

In the debate over nuclear power, experts accused the public of being “irrational,” “mistaken,” and “insane.” Visual: David Falconer / U.S. National Archives

Just over 30 years ago, psychology professor Paul Slovic delivered an uncomfortable message to technology experts in the U.S. At the time, technologies like nuclear power and pesticides were still unfamiliar and only just becoming part of everyday life. People were nervous about them, and the experts sought to assuage their fears.

But it wasn’t working. Large segments of the public voiced opposition. Left frustrated, policymakers and technical experts felt they needed a better way to convey what their statistical models told them: that the risks associated with the technologies were low and acceptable. They wanted to persuade the masses that they would be safe.

Implicit in the experts’ stance was the belief that, when it came to judging risk, they knew better than the “laypeople.” But Slovic’s seminal review made the case that people had a rich conceptualization of risk. The public’s decision-making factored in social concerns, attitudes, and emotions — not just statistics. Their judgments, he wrote, carried a complexity that wasn’t being captured by the experts’ verdicts. 

Not only did the experts fail to acknowledge that complexity, they reacted harshly to the public’s concerns, Slovic wrote. In the debate over nuclear power, a prominent issue of disagreement at the time, experts used words like “irrational,” “mistaken,” and “insane” to describe the public’s stance.

Fast-forward to today. Does any of this sound familiar?

Although today’s scientists rarely use such strong wording, they still operate largely on the conviction that people will get on board with scientific expertise if only the evidence is communicated better — an idea known as the deficit model.

But there are strong signs that, on contentious matters at least, the deficit model isn’t working. A recent Pew Research Center study found a wide gulf between the opinions of scientists and the American public on issues like GMOs and climate change. Scientific expertise has been very publicly challenged by politicians on both sides of the Atlantic in recent years. The Trump administration’s rollback of climate change policy and environmental safeguards is a sign of a public that — through its vote — has voiced a degree of doubt, if not mistrust, of expertise.

As I’ve previously written, I believe we are facing this impasse because science is becoming increasingly entangled with the messy reality of social problems. That was especially evident this summer, when researchers reported that troll social media accounts traced to Russia had been spreading misinformation about vaccines. Such misinformation campaigns have helped feed the long-term trend of skepticism that’s led to a drop in vaccination rates and a measles comeback in the U.S. and Europe.

In other words, there are forces at play beyond science. If we are to have constructive public conversations about the science and technology that affect people’s lives, we must acknowledge that fact, look beyond the politics, and figure out a way to work with what might lie beneath the surface of public skepticism.

Slovic’s work is a good place to start. Three decades ago, the psychologist reviewed studies that assessed perceptions of risk, in one case comparing experts and members of the general public by using their risk ratings to rank the perceived dangers of 30 activities and technologies, including nuclear power, surgery, pesticides, and swimming. The work revealed that public perceptions were shaped in part by psychological and cognitive factors. The risk posed by nuclear power, for instance, evoked feelings of dread and was perceived to be unknown, uncontrollable, inequitable, potentially catastrophic, and likely to affect future generations. No amount of education would easily change anyone’s mind about it.

More recent work backs up Slovic’s findings: Studies have shown that attitudes on vaccination are linked to psychological traits and tend to align with political ideology. The same is true of climate change.

Slovic’s approach to probing how people think and respond to risk focused on the mental strategies, or heuristics, people use to make sense of uncertainty. Sometimes these strategies are valid; sometimes they lead to error. But heuristics aren’t deployed exclusively by the masses, Slovic wrote. Experts use them too, and are prone to similar errors. “Perhaps the most important message from this research is that there is wisdom as well as error in public attitudes and perceptions,” he concluded.

And this is a crucial point: To respect the public’s understanding of a technological risk is not to say that the public is right, or more correct than the experts. Rather, it is to acknowledge that the public has legitimate concerns that expert assessments might miss.

Slovic’s ideas have proliferated in social science circles and influenced the communication of risk over the years. To date, his review has been cited nearly 9,000 times. The notion of a broader conception of risk has since evolved into a concept known as “risk as feelings.” His work, along with similar research by Baruch Fischhoff and Sarah Lichtenstein, informed the 1996 National Research Council report “Understanding Risk,” and was later expanded into a conceptual framework known as “the social amplification of risk.” Regulators eventually came to accept that the new paradigm should be factored into risk-management decisions.

But Slovic’s underlying message seems daring, even today. Are we ready to accept that attitudes, values, and emotions like fear and desire have a legitimate role in establishing “truth”? How can we negotiate compatible roles for scientific and non-scientific considerations in decision-making?

These are tricky propositions. Acknowledging non-scientific aspects of decision-making can appear to give legitimacy to the hidden agendas of interest groups that use uncertainty to sow doubt. Although Slovic’s work doesn’t imply that the public knows better than the experts, it could be spun to that effect. A complex science narrative is often no match for a simple, emotional message that seeks to undermine it. This is a serious pitfall.

Another pitfall is that, in the absence of the authority conferred by the established process of scientific consensus, social decision-making might devolve into protracted discussions that lead nowhere. Five years ago, Mike Hulme, then a professor of climate change at the University of East Anglia, suggested that we may not actually need consensus — that we might be better off acknowledging minority views in order to make disagreements explicit, a kind of “decision by dissensus.” It’s an intriguing idea, but may be ahead of its time.

Any moves to re-examine how experts communicate with the public about socially relevant science need to be carefully considered. But we can, at least, start a conversation. In the past year alone, I’ve seen many commentaries and discussions take place under the rubric of “post-truth.” This is encouraging.

Again, Slovic’s work can provide useful guidance for these conversations. Back in 1987, he advocated a two-way communication process between experts and the public, saying that each side has valid insights and intelligence to contribute. That view of science communication is now familiar, often taking the form of light-handed efforts to “engage” the public. But it can, and should, go beyond that. Mike Stephenson, head of science and technology at the British Geological Survey, has written about the organization’s efforts to break down barriers between government scientists and local residents at sites where decarbonization technology is being tested. Its scientists turned up in church halls and community centers, ready to listen. They abandoned familiar academic-style lecterns, projectors, and rows of chairs in favor of drop-in sessions, dialogue, and props like pictures and maps.

Such efforts are demanding. They bring scientists and the public together in an uncomfortable space, where they are forced to try to understand each other. But if we are to find socially and scientifically acceptable solutions to contentious problems, perhaps that’s what it will take.


Anita Makri is a freelance writer, editor and producer based in London. Her work has been published in the Guardian, Nature, The Lancet, and SciDev.Net, among others.