Podcast: Risky Science and Public Consent

Welcome to Entanglements. In this episode, hosts Brooke Borel and Anna Rothschild explore what might seem a simple question: Should citizens decide on risky science? It’s an idea that animated a special series of stories that Undark is publishing over the next six weeks, called Unleashed, and this podcast marks the series’ debut.

To explore this question of public consent and risky science, our hosts invited two experts with differing opinions to share their points of view, in an effort to find some common ground. The point isn’t to both-sides an issue or to try to force agreement. Instead, the show aims to explore the nuance and subtleties that are often overlooked in heated online forums or in debate-style media. 

Their guests this week are Zeynep Pamuk, an associate professor in the Department of Politics & International Relations at the University of Oxford, and Alta Charo, an emeritus professor of law and bioethics from the University of Wisconsin-Madison who is now a private consultant.

Below is the full transcript of the podcast, lightly edited for clarity. New episodes drop every Monday through the end of the year. You can also subscribe to Entanglements at Apple Podcasts and Spotify.


Brooke Borel: Welcome to Entanglements, the show where we talk through some of the thorniest debates in science. I’m Brooke Borel.

Anna Rothschild: And I’m Anna Rothschild.

Brooke Borel: On Entanglements, we bring together two experts who have different takes on a contested area of science or research and we try to see if there’s any common ground between them. Today we are asking: Should we the people get to decide whether risky science takes place?

Anna Rothschild: Yeah, why don’t you explain what kind of risky science we’re talking about.

Brooke Borel: Yeah, so gain-of-function research where scientists are trying to understand deadly pathogens by actually altering a highly deadly or transmissible pathogen in a lab.

Anna Rothschild: Right, so the benefit is stop the next pandemic. The risk is start the next pandemic.

Brooke Borel: Exactly. Geoengineering experiments where the ultimate goal is to change the Earth’s atmosphere so less sun hits the Earth.

Anna Rothschild: Right. So, benefit: Slow climate change. That sounds good. Risk: Disrupt ecosystems in dangerous ways.

Brooke Borel: Yeah, no more crops! Oh no. Ok, another example is work towards artificial general intelligence, like we talked about in our first episode.

Anna Rothschild: Right. So maybe we create AI that helps us do all sorts of things and we all get to live carefree lives not doing any of the grunt work that drives us all crazy. Or the robots take over and kill us all.

Brooke Borel: Yeah, extreme examples, but yeah. So yes, these are giant topics and the future of many of them is actually uncertain as we head into the next Trump administration.

Anna Rothschild: Right, a lot of decisions about things like biosafety actually happen in Washington. So who’s in power has major consequences.

Brooke Borel: Also, for transparency, I recorded these interviews before the election took place.

Anna Rothschild: Good to know.

Brooke Borel: Yeah, and today’s episode is actually part of a yearslong investigation we’ve been working on at Undark, the rest of which we’ll be publishing through the end of the year. It’ll be a mix of reported stories and essays and more. They’re great. I hope you check it out. And it’s going to give you a snapshot of how these decisions are happening now. Is there a democratic process at all? And as we are asking today in this episode, should there be? Anna, ready to hear what our guests had to say?

Anna Rothschild: Of course I am. I can’t wait.

[Music]

Brooke Borel: The question of the day is, should we, the people, be able to decide whether risky science should take place?

Zeynep Pamuk: Yes absolutely.

Brooke Borel: This is Zeynep Pamuk. She’s an associate professor in the Department of Politics & International Relations at the University of Oxford and a professorial fellow at Nuffield College. And she studies democratic theory and the role of expertise in politics.

Zeynep Pamuk: The decision about whether this kind of research should go forward should be made in a much more inclusive, open participatory way. At the moment, this is decided mostly by scientists themselves. It shouldn’t be just left to the responsibility, the good sense, the moral feeling of scientists, but it should include everyone who is affected. So that’s, that’s you and me and everyone else.

Brooke Borel: But, how does Zeynep want that to work? Should we use direct democracy where individuals vote on scientific issues? Maybe we use ballot measures, for example? Or indirect democracy, where we elect officials to make decisions for us? Or something else entirely?

Zeynep Pamuk: Any of those three could work. I don’t particularly like ballot measures. People are given various options when they arrive at the ballot box, and they often do not really know what the issue is.

So I’m not a big fan of those. In terms of indirect democracy or direct democracy, I think both are fine. Both should go forward. I’m a bit partial to more direct methods, so participatory mechanisms where citizens get together, they listen to the experts, and then they deliberate, and then they vote.

Anna Rothschild: Brooke, you know, I like this idea, but I just don’t understand how you do that for the giant issues that affect everyone that we’re talking about here?

Brooke Borel: Right, I could intuitively see how this might work at a local level, but not at a broad scale. And I asked her about that.

Zeynep Pamuk: You’re right to say that direct democracy may work better at the local level. But even for experiments that will have planetary consequences, ultimately there is a line of research going forward that’s being either funded or at least permitted at some level. And you could hold this kind of direct, participatory mechanism at the national level, inviting citizens randomly selected from different places. And it could be a much bigger thing. It could be like a citizens’ convention.

It could have, you know, like, I don’t know, hundreds of citizens and then you could open up the issue for debate and then people would make their cases and then a decision would be made.

So I think it would make the organization more difficult, more challenging, but I don’t think it’s infeasible.

Brooke Borel: So even in that case, it sounds like it wouldn’t be every single citizen. You said a random selection. It wouldn’t necessarily be everyone comes out. And how does that part work? I’m thinking about in the U.S., you can’t even get people to all vote in a presidential election.

You know, you can’t get them to vote for a lot of things. How do you get them to get engaged and vote on something like this?

Zeynep Pamuk: It would work like jury selection, basically. It’s a system that is maybe in decline, maybe not working so great anymore, but it’s still, it exists, and it works. And we have the mechanism for selecting people. You can pay people for their day off work. You can make it mandatory.

Many of these experiments have been successful. There are examples from all around the world where people like to feel included, especially if it’s an area where they feel that they’re affected.

Brooke Borel: Zeynep says this method is mostly used in smaller impact situations, like deciding at what age people should have cancer screenings, or whether we should re-introduce wolves into a national park. But she argues it could also be used for riskier science.

Zeynep Pamuk: I’ve made the argument that we should use democratic mechanisms to stop risky science, to ban it sometimes if necessary. And in other parts of my work, I’ve proposed an organization called the science court to evaluate scientific policies, especially where there’s disagreement among the experts. I actually haven’t put the two together, but I really don’t see why not. So I would certainly easily put together the two ideas.

Brooke Borel: Tell me more about a science court and how that would work in practice. Would that be one piece of direct democracy in a way, or is this, do you see that as tangential, but related to this idea?

Zeynep Pamuk: It’s one proposal for an organization in that landscape. The most distinct aspect of a science court, as I propose it, is that it’s not a deliberative institution. It’s not meant to be citizens or scientists deliberating together, but it’s adversarial.

So it’s like a court in that sense. The experts would debate one another. So we want experts with different views, you know, someone who thinks this is really dangerous and someone who, a scientist, and someone who thinks, “Oh, it actually is so beneficial for humanity or the risks are exaggerated.” And let’s hear them argue it out in front of a citizen jury. And then people can decide who they’re convinced by or how they would weigh the risks and the benefits given their own values and commitments.

Brooke Borel: And you mentioned that in part of your work, you’ve argued in some cases to outright ban certain risky science. Talk more about that.

Zeynep Pamuk: My point is that banning should be on the table, that there is no unlimited right to freedom of inquiry just because science on the whole has given us huge benefits and improvements in, in our lives.

[Music]

Brooke Borel: Ok Anna. How are you feeling now?

Anna Rothschild: I mean, everything Zeynep says makes a lot of sense to me. I’m a fan of democracy.

Brooke Borel: Same, same.

Anna Rothschild: I think people should be able to participate in decisions that will have a big impact on their lives. I mean, call me crazy. So I get the case that she’s making.

Brooke Borel: Okay. Great. Let’s see what you think after you hear a different point of view.

Anna Rothschild: Yeah, I’m really interested to see if this will change my mind.

[Music]

Brooke Borel: Should we the people be able to decide whether risky science should take place?

Alta Charo: Well, being a former law professor, I resist the binary nature of the question, as you might imagine.

Brooke Borel: I know. I know. But if you had to.

Alta Charo: If I had to, I would say no. I don’t believe that we the people, as a group, at every possible educational level and every possible level of familiarity with science, are in a position to actually look at a particular area of science or a particular experiment and judge it on its merits, with any kind of accurate appreciation of its precise risks, benefits, possibilities, et cetera.

Brooke Borel: This is Alta Charo. She’s an emeritus professor of law and bioethics from the University of Wisconsin-Madison. These days, she works as a consultant in biotech, regulatory policy, and ethics for private companies and the U.S. government.

And to be clear, she does think the rest of us should get some say in whether risky science takes place. We just shouldn’t be the main deciders.

Brooke Borel: How do you think it should happen?

Alta Charo: I think it’s a mix. I do think that the general public has a role: It’s how much it values a particular benefit. I think there’s a role in people expressing their views about the value that a benefit offers, because that’s crucial in understanding whether the risks should be tolerated. The second is, I think there’s a role in kind of having a sense of absolute risk.

That’s a public question and it’ll be different for different people and we need to average their tolerance levels.

Anna Rothschild: How does or should the public express their views on these things?

Brooke Borel: So, for some technologies that already exist in the world, she said people express their opinions of the tech’s value by actually using it. They are opting in. And in other cases, maybe for risky science, she mentioned conducting surveys perhaps, to get a sense of people’s opinions on risks and benefits.

Alta Charo: Once those big picture questions have been answered, though, I think you then need to move to people who are more technically sophisticated, whether it’s scientists working within regulatory agencies in the government, or it’s advisory bodies made up of technical experts who can then try to feed information about the technology itself into this equation.

Brooke Borel: So if we want people who have more expertise to kind of lead the way, I’m curious what that looks like.

Alta Charo: I mean, I think primarily we’re talking about regulators at various governmental levels who have technical training and you may need people with multiple disciplines of technical training.

It’s not just the biologists or the chemists or the physicists, because you often need to have people who understand human factors as well. People who understand how humans interact with things, whether it’s a machine or how humans will react if, if let’s say, let’s say something biological did get released, right?

You need human factors experts who understand how likely it is that human beings will follow the advice that’s given to protect themselves. If they’re not likely to do that, then that release is even more dangerous than it might have been. But you need to know that, right? Because in the end, what you’re worried about is the actual number of people who might be injured.

Brooke Borel: Alta went on to include all sorts of other groups. It’s psychologists. It’s science communicators. It’s professional societies. And then she said something I’ve never even considered.

Alta Charo: It’s the insurance companies, which play a hugely important role in determining whether or not the risks are tolerable because they’re the ones who are looking at the actual number of affected people or affected areas of the environment and what kind of costs that might entail and who would be responsible for the costs. And they then decide whether or not to insure against it. And if they won’t insure against it, it isn’t going to happen.

Anna Rothschild: Wait, what? So, like, a university has to make sure they’re covered in case they do experiments that cause some sort of prolonged nuclear winter or something?

Brooke Borel: I know. What a world we live in today, right? Can you imagine, I don’t know, Robert Oppenheimer on the phone with his insurance rep trying to figure out if his umbrella policy covers nuclear fallout?

Anna Rothschild: Of course not. That just sounds so silly. Though, I guess it would be pretty interesting if it actually happened. How does this even work?

Brooke Borel: Here’s what Alta said.

Alta Charo: It turns out that is a fascinating question and one that doesn’t have an easy answer. I have a feeling that it hasn’t been sorted out yet and that there are people that are scrambling to do it now.

Brooke Borel: And maybe once they try and figure it out, maybe that will have some interesting effects on what research is getting greenlit.

Alta Charo: It absolutely will have an effect on how the research is done as much as whether it’s done.

[Music]

Brooke Borel: Okay. Anna, how are you feeling now?

Anna Rothschild: I mean, I totally see where Alta’s coming from. I certainly do want to be able to trust regulators and people much smarter than me to make decisions about things in my life that I am just not an expert in.

Brooke Borel: Yeah, I know right?

Anna Rothschild: Right, it totally makes sense. I guess today, I just feel like I have almost no say at all. I mean, what are the mechanisms for public participation in these issues today?

Brooke Borel: Yeah, so the main ones that came up in my conversation with Alta and Zeynep were reaching out to NGOs or lobbying groups and indirect democracy, where we elect officials. And then those officials — sometimes because of public pressure — they hold hearings that might include scientists. We’ve seen that in the past few years, for example, with hearings on pandemic prevention.

Anna Rothschild: Yeah. I don’t know that I would necessarily call those great successes policy-wise, exactly. Maybe more of a show of partisan politics, but that’s debatable. But regardless though, both of the methods you just discussed put the onus on the public to reach out, either to an NGO or to their representatives. But as we’ve talked about, some of these issues are happening without the public being aware at all.

So maybe there’s some middle ground to find between Zeynep’s public approach and Alta’s sort of technocratic approach.

Brooke Borel: Yeah, absolutely. And we talked about that in our conversation when I brought these two together.

[Music]

Brooke Borel: Early in our chat, Alta expressed some skepticism about what we’re doing here.

Alta Charo: Yeah. I, I think you’re going to find, unfortunately, that there’s a lot of agreement between the two people speaking here, even if you thought you were going to have a debate.

Brooke Borel: Well, we’re not, we’re not trying to have a debate. We’re not trying to have a debate.

Alta Charo: Okay.

Brooke Borel: We’re having an interesting conversation.

Brooke Borel: And it’s true, they did agree on a lot, like the definition of risky science and that there needs to be a big benefit for the risk to be worth it. They also agreed that the public should have at least some role in the process.

But then Alta jumped into a very specific example: Gene drives and mosquitoes. Just a quick definition for you: These mosquitoes are genetically altered to spread a specific trait rapidly through a wild population. A trait that, for example, might make them unable to carry malaria or that would kill the entire population of mosquitoes. So big benefit, no malaria. But some scientists warn it could have catastrophic and unintended ecological effects.

Alta Charo: There’s so much fundamental misunderstanding of what it means to genetically engineer the mosquito and what the significance of that is for the environment or for humans, that I fear that direct democracy and voting on such a thing is going to be laden with misunderstanding. I mean, I don’t know about you, Zeynep, but I’ve given talks where people have come up afterwards and expressed their concern that if a genetically engineered mosquito were to bite them, that it would genetically engineer them and make them into a mosquito.

I mean, the level of fundamental misunderstanding is profound and widespread, and I don’t think it’s wrong to say that after having expressed value judgments about the big picture issues, and value judgments about where to direct our resources, like what areas to fund or not fund with public monies, that there is an appropriate moment for a handoff. For a delegated kind of democracy in which you delegate to technical experts who have been appointed through democratic processes like elections and statutory measures.

Zeynep Pamuk: Aha, we have a disagreement.

I agree with Alta that public misunderstanding or misinformation or inadequate perception of risk is a significant problem. And if that is driving decisions, then we would have reason to be concerned. However, it’s not a reason to take away the decision from these people, because ultimately if you impose on a public who doesn’t understand that a mosquito whose genes have been edited is now going to be released into the wild, then people will both be fearful for their lives, and they will distrust science and scientists even more.

So the solution to that is to have better understanding of science. So I think going ahead with these decisions over a public that’s not understanding it and fearful of it is a recipe for disaster. And we have seen this kind of fear increase public mistrust of science over decades.

Alta Charo: I have to agree that there is a large middle ground of value and technical mix. And I could not disagree with that at all. I think part of the challenge here in this conversation is understanding what you mean by public involvement. Because with the idea of direct democracy and referenda on technical questions, it sounds like we might agree that there are situations where that just doesn’t make sense, just as much as we don’t want to have direct democracy on a variety of topics. I don’t want direct democracy setting the next, I don’t know, the next cost-of-living index rate that I trust experts to figure out for me.

Brooke Borel: So, how do we make sure … For the general public to understand these technologies, how do we actually do that to make sure that they’re informed enough to be able to participate in a way that makes sense?

Zeynep Pamuk: I think there’s a lot to be said about more face-to-face interaction between experts, scientists, and ordinary people.

Brooke Borel: Face-to-face interaction isn’t always possible, so Zeynep also mentioned using the internet to connect with people. And she also talked about leaning on research on messaging: That is, what’s the best way to frame risk and express uncertainty? But Alta pushed back.

Alta Charo: You know, I think that of course it would be ideal if we had better communication and I think you’ve listed, Zeynep, quite a number of good techniques to try to improve that. But there is a point at which I think it’s not about educating people to the point where they can evaluate these questions themselves.

Anna Rothschild: Can I jump in here?

Brooke Borel: Yeah, of course.

Anna Rothschild: So what about the fact that understanding science doesn’t even guarantee that you’ll make a rational decision? You know, very few of our decisions are completely rational. We’re shaped by all sorts of things, like our society and culture or our political beliefs and families and all sorts of stuff.

Brooke Borel: Yeah, really good point.

Anna Rothschild: Thank you, Brooke, I appreciate that.

Brooke Borel: You’re welcome. And Alta even mentioned another thing. She says there are also big zeitgeisty trends that shape our collective fears.

Alta Charo: To be very simplistic, I think if you look in the U.S. in the 19th century, I think we had an era of celebrating of the power of science in the world of engineering. It was big bridges and roads and railroads and electricity, and it was the inventive power of science. And there was tremendous enthusiasm for it that overwhelmed the concerns that we should have been having to a greater degree about the number of injuries that were being caused and the number of workers that were being harmed.

I think in the 20th century, we saw the destructive power of science begin to take hold of the public’s imagination, whether it was the chemical warfare and gas warfare of World War I, or it was the atomic bomb in World War II. But suddenly, by the middle of the century, people were worried about the destructive power of science.

As we move into the end of the 20th century, I think that the fears continued, but they began to shift a little bit, especially in the biology area, because the fear was not so much the destructive power of science, but the creative power of science.

I don’t have any idea about how one manipulates that public kind of meta impression of what’s going on. But I do think you have to take that into account when asking what is the appropriate role of the public in determining at what point certain science simply shouldn’t be permitted, because I think that decision is definitely being colored by these more instinctual attitudes that people bring to the whole question.

Zeynep Pamuk: I think you described it beautifully. I mean, I couldn’t agree more, and I really enjoyed your description of the different stages and how people’s attitudes are colored by these, that there are their values and emotions involved. But I don’t see this as a problem. I mean, of course people’s judgment will be colored by these. And these fundamental values, I think these matter. These should be very much involved.

Now, we’re not talking about whether this particular drug should go on the market. I mean, at that level, we have certain overarching approaches that are agreed on, and it would be both technically unsound and, you know, probably a waste of time to put that to a vote or whatever, the participatory experiment. But at the level of, you know, do we want heritable gene editing, or do we want geoengineering? I think these values are very much a part of, should be a part of, the conversation. And if they color people’s understanding, I don’t see that as a problem.

I don’t see this as a kind of distortion or something to be manipulated or not. I think this is very much how we appreciate our world, make big decisions about the kind of society we want to live in, and the kind of future we don’t want.

Anna Rothschild: Ok, we’ve talked a lot about what should happen. But how should it happen? Like what about Zeynep’s science court idea?

Brooke Borel: Alta was a little skeptical. Remember she is a former law professor.

Anna Rothschild: Yeah, that tracks.

Alta Charo: I honestly think that this is the kind of thing where the devil is absolutely in the details, and probably that’s an entire show specifically on …

Brooke Borel: Sure.

Alta Charo: … how you would set something like this up. I just get very nervous about adding another kind of, this is another kind of governmental mechanism. Because you were creating a new governmental structure out of public participation, without more detail, right, in order to really see how it compares to the existing structures we have now.

Brooke Borel: So, another thing that came up in Alta’s conversation that I found fascinating that I’d like for you to talk about a bit more and get Zeynep’s reaction to is the role of insurance companies in all of this.

Alta Charo: They are in the business of assessing risk. They are in the business of trying to calculate what is the, what are the odds that a particular harm might occur? What is the financial significance of that harm? What’s the likelihood that that harm would be compensable in court and how likely is it that our insured entity would lose in court and therefore how much exposure do we have?

Zeynep Pamuk: It seems something that is a really important part of the puzzle. I mean, I would look more into it. So are we talking about the insurers of universities? So who’s liable for what? Let’s say a university undertakes gain-of-function research. Is the university liable for a global pandemic? I mean, I, how, how does this work?

Alta Charo: So these are, these are some of the interesting questions that occur because first of all, when you talk about universities, I’m going to separate out the public universities from the private because public universities often have special immunities because of their public status.

I’m going to talk about a private university that permits some kind of research to go on on campus. Let’s assume that it does not take all the steps it’s supposed to take in order to monitor the research on campus and to ensure that the research goes through the appropriate regulatory review steps, because surprisingly it can be very difficult on campus sometimes to know what everybody is actually doing.

If there is some kind of environmental release that causes a problem that then spreads, it is possible that there can be some amount of financial responsibility for these extended consequences of a single act that is deemed to have been negligent, particularly if you fail to abide by the particular safety measures that have already been required of you, whether it’s the review processes or the specific mitigation and lab safety measures you’re supposed to put into place before you do the work.

Zeynep Pamuk: Would insurance companies have a role in saying, “Oh, this should not be permitted?” Is that part of the equation?

Alta Charo: It is, but in an interesting way, not directly. So insurance companies can say, I will not insure that because it just seems too dangerous.

Zeynep Pamuk: That’s super interesting.

Brooke Borel: I’m curious, have you, have the two of you agreed or disagreed more than you were expecting to in this conversation? What do you think?

Zeynep Pamuk: I imagined it being more adversarial, but I think we agree on a lot of things. But, you know, I think there was also a healthy ground of disagreement to make it interesting.

Alta Charo: We might disagree on the details of what a citizen science interaction might best be designed to be, whether it’s the science court or some other form, but I don’t think you’re going to see a lot of disagreement over the fact that there’s going to be a spectrum that represents the ideal way to handle these things over time and across technologies.

Brooke Borel: And what will each of you leave this conversation thinking the most about?

Zeynep Pamuk: Insurance.

Brooke Borel: Me too, actually.

Anna Rothschild: Okay, Brooke, you’ve asked me a few times now what I think of this conversation. But what are your thoughts, now that you’ve heard both sides?

Brooke Borel: Yeah, I mean, as a member of the public, I would like to have some say in whether this research is taking place, especially if it has a potential for big risk for me or for people I care about or for people in general, even if that risk is small, right? To quote Alta in all of this, she said “the devil is in the details,” and I think that that’s exactly right.

I think that the actual sort of process in getting people involved though, and convincing them that they should be involved and they should participate, and then having these structures in place to actually make that happen in a way that makes sense on a large scale, that’s really hard to do. It’s very messy. And it’s unclear to me how exactly that’s all working out now and how it could work out and what we can do better there.

Anna Rothschild: Yeah I agree, and I think that it’s really vital to figure out. You know, trust in science is really low. And I think it will only get lower if we don’t let people have some sort of say in the decision-making process. Or at least be informed that these experiments are going forward. Because…

Brooke Borel: Yeah.

Anna Rothschild: … that will only make it decline more.

Brooke Borel: Absolutely.

Anna Rothschild: So I think that’s a great challenge, but it’s a really worthy one for maybe minds greater than mine to figure out.

And oh no! See what I just did there!

Brooke Borel: You have a great mind, Anna! Hey, hey, hey, you have a great mind.

Anna Rothschild: Yes, but look what I’m doing! I’m just like, taking a technocratic approach to figuring this out and just being like “eh, someone else will figure it out! And I trust them to do it.” This is exactly what we’re talking about.

Brooke Borel: I know, it’s so true. So in theory, in an ideal world, I do want to know about things that might harm me and I want to have a say in them. But also at the end of the day, I’m tired. I have two kids. I have a job. I’m busy. And there are people who study these things who know a lot more about them than I do. So I don’t know, like yes, I want, in an ideal world, I want to be fully informed and make decisions, or help make decisions about these really big, important things. But I don’t know how realistic it is.

Anna Rothschild: Yeah, I mean, I totally agree. I will say, I would be very curious to hear what our listeners think of this. If you guys have some thoughts about who should decide risky science, please drop us an email. You can reach us at entanglements@undark.org.

Brooke Borel: I think that’s a good place to leave it for this week.

Anna Rothschild: Yes, me too.

Brooke Borel: That’s it for this episode of Entanglements brought to you by Undark Magazine, which is published by the Knight Science Journalism Program at MIT. The show is fact checked by Undark deputy editor Jane Reza. Our production editor is Amanda Grennell, and Adriana Lacy is our audience engagement editor. Special thanks to our editor in chief, Tom Zeller Jr. I’m Brooke Borel.

Anna Rothschild: And I’m Anna Rothschild. Thanks for listening. We’ll see you soon.
