My daughter lay prostrate and feverish on the floor of the hut, deep in Iban territory in Borneo and far from any modern medical facilities. Our interpreter informed us that one of the guides was a shaman. If we were agreeable, he would be honored to treat Leah. But first we should give him money. It was not payment for services — it was to make the medicine more powerful.
We might smile at such a primitive notion, but faith in the power of money pervades our modern medical system. All good capitalists believe devoutly in an Invisible Hand which wisely guides our choices in drug development. Nowhere is this faith more misplaced than for the development of medicines that treat and prevent infectious diseases.
Even today, as infectious diseases kill more than 9 million people worldwide each year, antibiotic development is moribund, abandoned by Big Pharma as a money-losing enterprise. New antivirals are being developed, but with ungodly price tags. Vaccine development has been revived, but only through an unstable combination of nonprofit efforts, government incentives and unprecedented prices. Pharmaceutical companies aren’t evil (usually). They just choose to make the most profitable drugs, not the drugs of greatest value to society. That’s why it’s time to begin socializing drug development — and we only need to look back at the trajectory of disease discovery and medicine development to see why.
Once the Malthusian Trap was sprung and the Industrial Revolution launched, control of contagion was required to sustain a culture of growth. The critical mass of modern cities was first purchased with the lives of dispossessed peasants: deaths from urban infectious diseases outpaced births, and cities’ populations were sustained only by continual immigration. Fortunately, we discovered germs and learned to tame them before we ran out of peasants. Clean water, vaccines and, eventually, antibiotics cut deaths from infection by a factor of 20 and enabled the demographic transition.
Free markets and profits were directly responsible for exactly none of these world-changing advances.
Academics discovered the anti-sera that cut deaths from diphtheria and tetanus in half. Produced by urban health departments, which maintained stables of retired police horses for this purpose, they were distributed to citizens without regard for ability to pay. Likewise, the city fathers of Frankfurt funded development of the first antibiotic, the anti-syphilitic Salvarsan, in 1909. The rich of that era were no nobler than those of ours, but they knew from personal experience that infectious diseases were no respecters of class.
Vaccines, too, have a history of successful socialized development. In 1813, the U.S. Congress decided that all citizens were entitled to protection from smallpox, and could have the vaccine mailed to them free of charge. Nearly all the vaccines that make cities safe and livable — cholera, typhoid, pertussis, diphtheria, polio — were developed by nonprofits and were commonly made available to all. Indeed, Jonas Salk scoffed at the notion that his polio vaccine should even be patented: [It belongs to] “the people … ” he once said. “Could you patent the sun?”
Today, pharmaceutical companies have largely abandoned new antibiotic development on the eminently sensible principle that they are money-losers. Promising narrow-spectrum antibiotics — agents that precisely target pathogens and spare “good” bacteria — languish in development limbo because there is no hope that they might turn a profit. Old but effective antibiotics oscillate through wild price swings as manufacturers drop out and create monopoly pricing opportunities. Antibiotic development suffers from too much market freedom, not from too little.
Antibiotic and vaccine discovery and development are mature sciences; progress is more a matter of hard work than of breakthroughs, yet the market offers few incentives for that work. Innovation is still important, but not as important as the alignment of development efforts with societal needs. This means creating and providing vaccines to all, so that epidemics do not fulminate among the disadvantaged. It means creating narrow-spectrum antibiotics with small markets but large societal benefits. It means restricting the indiscriminate use of antibiotics so that they retain their potency. Markets fail to address all these needs.
In the U.S., the NIH already sets the agenda of new drug innovation by funding basic research — the phase that is highest-risk and most likely to fail. The subsequent phases of development — preclinical pharmacology and toxicology; chemistry, manufacturing, and controls; and clinical testing — are well-defined disciplines. Drug companies perform no magic here but routinely contract out these functions to independent organizations. The NIH is as capable of writing these checks as any for-profit entity. But it, unlike pharma, has a mandate to write checks that improve public health, not private wealth. This approach won’t preclude for-profit drug development by private entities; it will just minimize the market-created gaps in our defenses against contagion.
The coming disruption in pharma will force drug makers to focus even more on short-term profits and their own survival. We cannot sacrifice our freedom from plague to the gods of the market. We need a reliable supply of the medicines that make modern society possible.
And my daughter? She was cured by neither the shaman’s ancient smoke nor by our modern ciprofloxacin. A trip down the river to a government clinic for a dose of milk of magnesia — a nearly 200-year-old concoction never patented outside the U.K. by its inventor — restored her health.
Drew Smith is a molecular biologist and long-distance hiker who has held positions at several biotech and medical technology startups, most recently MicroPhage, Inc., which declared bankruptcy in 2012. He now writes about science and hiking at drewsmithblog.com.