World map showing connections, illustration.

On the Spread of Behavior: Five Questions for Damon Centola

Suppose you have an idea that you hope will influence a large group of people. It might be something to improve their health or encourage them to vote for a political party. How do you spread that idea widely and quickly?

“Weak ties are good for spreading information,” the author says, “but for a complex behavioral contagion to spread, we need more than awareness.”

You might think social media would be far more effective than word of mouth. But it’s generally not that straightforward, according to Damon Centola, the author of “How Behavior Spreads: The Science of Complex Contagions.” He argues that so-called strong ties, which tend to be with family members, close friends, and coworkers, are generally more powerful in changing behavior than weak ties, the long-distance connections formed on social media or with acquaintances or friends of friends. Weak ties, he says, work well enough for spreading simple ideas and gossip.

Centola, who is an associate professor in the Annenberg School for Communication and the School of Engineering and Applied Science at the University of Pennsylvania, also distinguishes between the simple contagion of ideas (one person influencing another or a small group, and expanding out from there, or “going viral”) and complex contagions, which involve multiple sources of exposure and account for the diffusion of most widespread and influential ideas.

Centola explores both types of contagion and concludes that, while there is no single strategy for convincing someone to stick to a diet or choose a candidate, ideas are most often transmitted successfully by someone we perceive to be similar to us, a phenomenon sociologists call homophily.

For this installment of the Undark Five, I asked Centola about the power of human connections for the propagation of new ideas, and how, in the wrong hands, the power to influence can lead to unpleasant consequences — unintended and otherwise. Our email exchanges have been edited and condensed for space and clarity.


Undark: A lot of your research concerns public health. In what situations do strong ties or weak ties prove more effective for improving personal and civic behavior?

Damon Centola: There are lots of simple contagions that spread through weak ties. However, as soon as the behavior is a bit risky, or costly, or requires some social validation, then it becomes more complex, and requires strong ties.

Let’s think about every interesting behavior that social scientists spend their time trying to explain — for instance, joining a revolution, standing up for human rights, changing your profile picture on Facebook, closing your Facebook account to join a new social media platform, or deciding to vaccinate (or not vaccinate) your children. The big difference between these kinds of behaviors — which are “complex” contagions — and a virus — which is a “simple” contagion — is that for these decisions, people look around to see what other people are doing. Contact with a single adopter is usually not enough to convince us of any of these behaviors.

Drilling down on that, a “weak” tie contact whom we bump into in the café will probably not get us to change our mind about vaccinating our child, or closing down our Facebook account. We need to believe that the decision is legitimate and safe.
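(To make the distinction concrete, here is a minimal, purely illustrative threshold-model sketch in Python. It is not taken from Centola’s book; the network, seeds, and adoption thresholds are assumptions chosen for illustration. A “simple” contagion spreads after a single exposure, while a “complex” contagion requires reinforcement from at least two adopting neighbors.)

# Illustrative sketch, not from the book: a toy threshold model contrasting
# a "simple" contagion (one exposure suffices) with a "complex" contagion
# (adoption requires reinforcement from at least two neighbors).
# The network, seeds, and thresholds are assumptions made for this example.

def ring_lattice(n, k):
    # Each of n nodes is tied to its k nearest neighbors on each side,
    # so neighbors tend to share neighbors (a clustered, strong-tie-like structure).
    return {i: {(i + d) % n for d in range(-k, k + 1) if d != 0} for i in range(n)}

def spread(network, seeds, threshold):
    # A node adopts once at least `threshold` of its neighbors have adopted.
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neighbors in network.items():
            if node not in adopted and len(neighbors & adopted) >= threshold:
                adopted.add(node)
                changed = True
    return adopted

net = ring_lattice(n=100, k=2)      # clustered ring: overlapping neighborhoods
seeds = {0, 1}                      # two adjacent early adopters

simple_reach = spread(net, seeds, threshold=1)   # spreads like a virus
complex_reach = spread(net, seeds, threshold=2)  # needs social reinforcement

print(len(simple_reach), len(complex_reach))     # both reach all 100 nodes here

(On this clustered lattice both contagions eventually reach everyone; if the ties were instead randomly rewired into long-range “weak” links, the threshold-2 contagion would typically stall while the threshold-1 contagion would spread even faster, which is the intuition behind the strong-tie argument above.)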

UD: What are some of the main costs and benefits of using online communities to spread ideas?

DC: I would say that there’s an ethical drive at the outset that motivates the construction of these kinds of online communities. The upshot is that they offer a way to provide people with new sources of social capital to improve their lives.

Particular populations, such as young minority women, are often poorly connected to institutional sources of advice and counseling (for both financial planning and for health care). For this target population, this kind of social capital — the kind offered online — is often far more effective for triggering positive behavior changes than are mainstream institutions or medical counselors.

Once we appreciate the impact that these kinds of technologies can offer, two ethical concerns quickly arise: first, ensuring that people are not exploited by malicious actors; and second, making sure that our well-intentioned designs do not backfire. The science of networks helps to show how fraudulent or malicious uses of social communities can be detected and prohibited. At this very moment, both Facebook and Twitter are giving serious attention to this exact issue.

UD: Hannah Arendt’s study of totalitarian regimes noted how they tightly controlled the flow of information by encouraging weak ties among their citizens and discouraging — or prohibiting — strong ties. What are your thoughts on the way current global governments manipulate the structure of society?

DC: Totalitarian regimes use weak network ties to ensure that propaganda is widely distributed. In other words, weak ties are a key principle of surveillance through informational access.

Activism of every kind, from the Freedom Summer of 1964, to the Berlin Wall protests of 1989, to the Arab Spring of 2011, has depended upon reinforcement from [strong ties]. I’ve been impressed by how consistently people manage to find ways of re-forging strong interlinking network ties, with wide bridges between groups or chapters. This has been witnessed most recently with Black Lives Matter.

UD: Can your research on simple and complex contagions provide any insight into how political messages can be manipulated?

DC: Yes, this is a really important issue, and will continue to be for the next several years. In the 2016 presidential election, 139 million ballots were cast, but Donald Trump’s margin of victory [in the decisive swing states] was only about 78,000 votes. Any time the margin is this narrow, it means that strategically targeted messages can indeed be used to influence undecided voters.

Three factors from my work come to the foreground here: reinforcement, coordination, and anonymity. Anonymous actors (often engineered programs or “bots”) coordinated their efforts to provide reinforcing negative ads attacking the Democratic candidate. The repetition and multiplicity of these signals generated genuine confusion about which accusations were real and which were not.

Once the seeds of doubt were successfully sown, a complex-contagion effect was able to unfold: every targeted person who discussed their uncertainty about a candidate with other susceptible people unintentionally reinforced the negative signals, supporting the malicious actors’ (or trolls’) messaging strategy.

UD: You write that if our social media trends continue, people “may no longer come in contact with cooperative practices, shared cultural norms, or complex ideas that require social reinforcement in order to stay in circulation.” Could you elaborate?

DC: In my research on social diffusion, I noticed that people often think about how to make their idea spread faster, regardless of whether it has any long-term benefit for the people who adopt it. In other words, the metric of success is the speed of diffusion, not the ultimate impact of the diffusion process on people’s lives. Simple contagions, like viral media, often spread quickly because they are easy and familiar; by contrast, complex contagions, such as cooperation and sustainable practices, require a little more social reinforcement to get us to adopt them because they are challenging — even though they are ultimately more rewarding.

If the social channels that spread memes, ideas, contagions, and influences make it much easier to spread simple bits of information, then the [sheer amount] of activity over these channels will make them increasingly suitable for spreading simple contagions, and less suitable for spreading complex contagions. Because complex contagions require a little more attention and reinforcement in order to spread, cooperative practices that are common now may be easily outcompeted by the increasing pace and volume of media content in the next generation. The upshot is that in a world where political, commercial, and even scientific information is struggling to outcompete its rivals, increasing connectedness may inevitably lead to the natural selection of messages that are simpler and more digestible but less nourishing.


Louise Fabiani has written about science and culture previously for Undark, as well as for Pacific Standard, Science, New Scientist, Earth Island Journal, The American Scholar, the TLS, The Rumpus, and many other periodicals. She lives in Montreal.