Reefer Madness 2.0: What Marijuana Science Says, and Doesn’t Say

Scientific research on the effects of marijuana is rife with holes, thanks in large part to the drug's continued categorization at the federal level alongside drugs such as heroin and LSD. Unfortunately, when research is scarce, it becomes easier to mislead people through cherry-picked data, sneaky word choice, and misinterpreted conclusions.

The Science Press Under the Microscope

Which brings us to Alex Berenson and Malcolm Gladwell, and what happens when tidy narratives outrun the science.

Two weeks ago, Berenson, a former New York Times reporter turned spy novelist, published a book with the ominous title "Tell Your Children: The Truth about Marijuana, Mental Illness, and Violence." Gladwell, meanwhile, published a feature in the New Yorker, where he is a staff writer, drawing largely on Berenson's book and questioning the supposed consensus that weed is among the safest drugs. Combined, these two works offer a master class in statistical malfeasance and a smorgasbord of logical fallacies and data-free fear-mongering that serve only to muddle an issue that, as experts point out, needs far more good-faith research.

Berenson's main argument is relatively simple. In his book, he claims, essentially, that the existing evidence really does contain solid answers, painting a truly alarming picture of marijuana: that it can and does cause psychosis and schizophrenia. He then makes the leap that since psychosis and schizophrenia can lead to violence, marijuana itself is causing violence to increase in the United States and elsewhere.

Arriving at that conclusion requires a host of scientific and statistical sins. But first, it’s worth stating that there really is some research linking cannabis and schizophrenia. A 2017 National Academies of Sciences, Engineering, and Medicine (NASEM) report, which synthesized all the available research on marijuana, concluded in part: “There is substantial evidence of a statistical association between cannabis use and the development of schizophrenia or other psychoses, with the highest risk among the most frequent users.”

Berenson takes this conclusion and runs with it, far beyond what the researchers seem willing to say. The authors of a 15-year study in Sweden, for example, called cannabis "an additional clue" to how schizophrenia develops, and pointed to genetics and other factors that complicate the issue. And last May, Aaron Carroll, a pediatrician who frequently writes about health care policy and research, noted in The New York Times that the "arrow of causality" between marijuana and psychosis is difficult to pin down.

One of the authors of the NASEM report, Ziva Cooper of the Cannabis Research Initiative at the University of California, Los Angeles, recently told Rolling Stone: "To say that we concluded cannabis causes schizophrenia, it's just wrong, and it's meant to precipitate fear." Speaking to Carroll in The New York Times, Cooper added that there isn't yet evidence to clarify the direction of the association. Put simply, existing data can't answer fundamental questions like the following: Does smoking marijuana spark the onset of schizophrenia, or are people on the cusp of developing the disease simply more likely to smoke marijuana, either to self-medicate or for other reasons?

And that’s the topic that Berenson probably got least wrong. On virtually every issue in his 272-page book, he commits one of the most common logical errors: He mixes up correlation and causation. The classic example: Crime tends to spike in the summer; so does ice cream consumption. Did all that ice cream cause the crime? Of course not.

An important piece of Berenson's argument is that rates of marijuana use have risen at around the same time as an increase in diagnoses of schizophrenia and other forms of psychosis. For example, separate studies from Finland and Denmark show an increase in such diagnoses in recent years. The authors of both studies wrote that the increases could be explained by changes to diagnostic criteria, as well as improved access to early interventions. In both cases, the authors do not rule out an actual change in incidence. But Berenson presents that possibility as a firm reality, with the rise in marijuana use as the culprit. There is no real evidence that he's right.

Berenson's connection of marijuana to violence seems even more tenuous. He writes that violence has increased dramatically in four states that legalized marijuana in recent years: Alaska, Colorado, Oregon, and Washington. He notes that the number of violent crimes in those states increased faster than in the rest of the country between 2013 and 2017. On its face, he's not wrong, but this is a great example of the liberties one can take with numbers.

For starters, Berenson bases his calculations on the total number of these crimes in his chosen states, rather than the rates per 100,000 people, which would account for population differences. According to the FBI, Colorado’s murder rate actually increased more slowly than the national rate from 2013 to 2017.

Even in states where Berenson's assertions hold true, his numbers don't tell the whole story. Washington state, for example, had a murder rate of 2.3 per 100,000 people in 2013 and 3.1 in 2017, an increase of 35 percent, compared with a national increase of around 18 percent. But this isn't some clear, straight-line trend: The murder rate in Washington actually fell by around 7 percent from 2015 to 2016. How does that fit into the narrative? You can also goose the numbers by changing your starting point: The increase in the murder rate from a particularly violent 2012 through 2017 was only around 3 percent, much less than the national increase for that period. Possession of up to one ounce of weed for personal use became legal in Washington in late 2012, and the first recreational stores opened in 2014. So which murder rate do you use?
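The arithmetic behind these comparisons is trivial, which is exactly why the choices that feed it matter so much. A minimal sketch of the two pitfalls, using the Washington per-capita rates cited above plus purely hypothetical counts and populations (the raw figures below are illustrative, not real state data):

```python
def pct_change(start, end):
    """Percent change from a starting value to an ending value."""
    return (end - start) / start * 100

# Washington murder rates per 100,000, as cited in the text:
# the 2013 baseline yields the scary-sounding ~35 percent jump.
print(round(pct_change(2.3, 3.1)))  # 35

# Pitfall: comparing raw counts instead of per-capita rates.
# Hypothetical state where murders rise but population rises faster,
# so the rate per 100,000 actually falls. (Illustrative numbers only.)
murders_2013, pop_2013 = 150, 5_000_000
murders_2017, pop_2017 = 160, 5_600_000
rate_2013 = murders_2013 / pop_2013 * 100_000  # 3.0 per 100,000
rate_2017 = murders_2017 / pop_2017 * 100_000  # ~2.86 per 100,000
print(round(pct_change(murders_2013, murders_2017), 1))  # counts: +6.7
print(round(pct_change(rate_2013, rate_2017), 1))        # rate: -4.8
```

Same underlying data, opposite headlines, depending entirely on whether you normalize by population and which year you anoint as the baseline.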

And all that ignores the basic fact that even if those states are seeing increases in violence that outpace some others, that says nothing about why. Could it be partially due to the legalization of marijuana? Sure. Could it be completely random statistical noise? Also sure.

There are dozens more examples, often involving an oversimplification or outright misrepresentation of a study’s conclusions. Berenson suggests marijuana raises the risk of heart attacks (experts: evidence is insufficient), and that legalization spawns more fatal car crashes (experts: some studies have shown no difference from states without legalization, and testing positive doesn’t necessarily mean a driver was impaired at the time in any case). There are also questionable claims about rates of cannabis use in Colorado, the possible existence of fatal marijuana overdoses, and maybe most strikingly about potential connections — positive or negative — to the U.S. opioid crisis.

Some of these might be honest errors, but twisting the science so badly on so many studies suggests Berenson was trying to fit evidence to a predetermined thesis, rather than letting the data guide him. The errors are so numerous that this book, to be quite honest, deserves to be ignored. Unfortunately, after coverage in places like the New Yorker, it’s too late for that.

While “Tell Your Children” has problems, Malcolm Gladwell’s feature compounds them. It is little more than a glorified book review, relying heavily on a single source and failing to vet it in any meaningful fashion. Gladwell repeats shoddy statistics, jumps from one straw man argument to the next, and generally fails at the basic functions of a journalist. He does all this in the guise of the “just asking questions” truth-teller.

Gladwell wants you to think that he is simply pointing out what's missing in marijuana research, writing, for example: "Low-frequency risks also take longer and are far harder to quantify, and the lesson of 'Tell Your Children' and the National Academy report is that we aren't yet in a position to do so." But Berenson is doing the opposite: He claims, explicitly and throughout the book, that the answers we're looking for are already in hand, and he misrepresents what those answers actually say.

Nonetheless, Gladwell offers up Berenson’s same misleading statistics on violence without interrogating their accuracy, mentions two studies suggesting a gateway effect while ignoring plenty of contradictory research, and highlights some data generated by a New York University professor and friend of Berenson’s, which, while they may be valid, have not been peer reviewed or otherwise assessed by experts. Gladwell even misrepresents the professor, calling him a statistician, when his expertise is actually in politics and law.

In some spots, Gladwell is just regurgitating what he read with no context or input from other sources or experts. On the increased potency of modern-day weed compared to the 1960s and 1970s, here’s Berenson: “Imagine drinking martinis instead of near-beer to get a sense of the difference in power.” Now here’s Gladwell, describing that same increase: “… from a swig of near-beer to a tequila shot.” He changed the liquor, you see, so it’s fine.

Both writers make it seem as though they’re shining a light on a dark, unexplored corner of science and public policy. But they even take liberties with the claim that no one is paying attention. “Almost no one noticed the National Academy Report,” Berenson writes. Gladwell repeats the claim, saying it “went largely unnoticed.” An extremely quick Google news search reveals pieces on the report from Business Insider, Vox, NPR, Forbes, Quartz, and The Washington Post, just to name a few that came out at the time of its release. The New York Times, Berenson’s former employer, printed an editorial about the report, calling for rescheduling the drug in order to make research easier. What would people “noticing” actually look like?

There is one point Gladwell gets right. In his piece, he writes that Berenson has "a novelist's imagination." Berenson repeatedly mentions how his book will likely spawn references to the infamous anti-weed propaganda film "Reefer Madness," as if preemptively pointing out what critics will say somehow proves them wrong. But the book does come across as theater, from the stark title and the scary, smoke-tinged, red-on-black cover, to the author's reliance on scary anecdotes.

The book opens with a horrific story of an octuple murder committed in 2014 in Cairns, Australia. Berenson spends entire chapters on gruesome tales of murder and violent psychosis, of course all linked back to smoking pot. At one point, he even acknowledges the scientific maxim that “the plural of anecdote is not data,” but then proceeds with an argument of majestically twisted logic. If the data really do exist, he says, then the anecdotes are sure to follow. How the anecdotes he supplies somehow prove the existence of data is unclear.

The debacle provides a cautionary tale for other science writers: Avoid these sins of omission, misrepresentation, and cherry-picked data. Do the work to find multiple credible sources for each story rather than relying on a single flawed book. Scientists, unlike many in the general public, are perfectly comfortable with uncertainty, and with addressing gaps in knowledge. Part of the challenge for any good journalist is to reflect these uncertainties and nuances clearly and honestly — not to squeeze a compelling, utterly false narrative from uncooperative data.

Clearly, neither Berenson nor Gladwell was up to the task.

Dave Levitan is a freelance journalist based in Philadelphia who writes about energy, the environment, and health. He is the author of “Not A Scientist: How Politicians Mistake, Misrepresent, and Utterly Mangle Science.”