Book Review: Unlocking the Mysteries of the Human Brain

We know little about the life of Alcmaeon of Croton: we can be pretty sure he hailed from the coastal city of Croton (present-day Crotone), in the far south of Italy, and had a father named Peirithous; he may have been a student of Pythagoras. We don’t know when he was born or when he died, only that he was active in the 5th century B.C. But we know he was fascinated by the human body, and was willing to challenge some of the established dogmas of his day. In particular, he may have been the first to assert that the brain — not the heart — is the seat of the mind. At the very least, he seems to have recognized the importance of this astonishing organ.

BOOK REVIEW: “The Idea of the Brain: The Past and Future of Neuroscience,” by Matthew Cobb (Basic Books, 496 pages).

In “The Idea of the Brain: The Past and Future of Neuroscience,” Matthew Cobb, a biologist at the University of Manchester, shows just how long and arduous the road to understanding the brain has been — and, as he makes clear, the journey has only just begun. The brain, we are often told, is the most complex entity in the known universe. And though we surmise that the brain is what allows us to think and feel and perceive and know, it is far from clear how it accomplishes this.

A good metaphor might help, if only we could latch onto the right one: Is the brain like a complicated electrical circuit, or a telephone switchboard, or a digital computer, or a neural network? Or none of the above? As Cobb lays out the history, from the ancient Greeks to the present day, he offers a candid analysis of where these analogies have been helpful, and where they’ve fallen short.

In his book “Elements of Electro-Biology,” published in 1849, an English polymath and inventor named Alfred Smee argued that the brain was made up of hundreds of thousands of tiny batteries, each of them somehow connected to a different part of the body. Emotions such as desire, and even consciousness itself, were supposedly the result of these little batteries working in various combinations. He even drew up plans for what he believed would be a thinking machine, composed of metal plates and hinges and other paraphernalia.

As Cobb points out, as fuzzy as some of Smee’s ideas were, he was also occasionally prescient, as with his proposal for sending what we would now call video signals over long distances. There is no reason, Smee wrote, “why a view of St. Paul’s in London should not be carried to Edinburgh through tubes like the nerves which carry the impression to the brain.”

Is the mind merely the result of marvelously complex machinery? A famous criticism of the brain-as-machine metaphor comes from the German mathematician and philosopher Gottfried Leibniz. Suppose, Leibniz argued in 1714, you had a thinking machine that duplicated the function of the human brain, scaled up “so that one could enter it as one does a mill.” Walking about the interior of this device, you would observe “parts which push upon each other” and yet find nothing “which would explain a perception.”

Echoes of Leibniz’s argument can still be heard today, especially by those who find it hard to imagine that the mind has a material explanation. (I was a little disappointed that Cobb doesn’t explore the numerous counterarguments that have been leveled against Leibniz over the years. The philosopher Daniel Dennett, for example, asks: “Might it be that somehow the organization of all the parts which work one upon another yields consciousness as an emergent product? And if so, why couldn’t we hope to understand it, once we had developed the right concepts?”)


It wasn’t until the middle of the 19th century that scientists came to understand that different brain regions accomplished different tasks. Often this knowledge came through observing victims of brain damage. A famous example involved an American railway worker named Phineas Gage. In 1848, Gage suffered a traumatic head injury when an explosion sent a 1.1-meter-long iron rod through his skull. Incredibly, Gage survived, living another 20 years — but his personality was radically changed. A model employee before the accident, Gage became “fitful, irreverent, indulging at times in the grossest profanity,” according to a doctor who knew him. It was the front of Gage’s brain that received the most damage; as Cobb notes, the case suggested “that various aspects of mental life linked to attention and behavior were somehow localized in the frontal parts of the brain.”

Enter Darwin, whose theory of natural selection forced us to think of the brain as the product of countless eons of evolution. If humans and apes share a common ancestor, then our brains, and perhaps our minds, couldn’t be all that different. Inspired by Darwin, Thomas Henry Huxley — “Darwin’s bulldog” — compared human and ape brains, concluding that there is “no dispute as to the resemblance in fundamental characters, between the ape’s brain and man’s,” noting the “wonderfully close similarity” between our brains and those of chimpanzees and orangutans.

Neurons — the brain’s cells — were discovered in the 1830s, though the method by which they send signals to one another was only discerned in the 20th century, as the science behind synaptic transmission took shape, and the role of neurotransmitters became clearer. As telephones and telegraphs flourished in the 1800s, so did analogies between the brain and a telephone network, but, writes Cobb, discoveries in biology soon showed how limited this analogy was. Even if we talk of “switches” in the brain, these switches “did not work in the same way as those in a piece of electrical equipment. Biological discovery was outstripping the dominant technological metaphor and revealing” that “the brain is not a telephone exchange.”

The 20th century brought the digital computer, and the brain-as-computer metaphor persists to this day. Of the many fascinating historical details in the book, one of the most remarkable is the mid-20th-century collaboration between neurophysiologist Warren McCulloch and a self-taught teenage prodigy named Walter Pitts. Working together, they developed the first “computational” theory of the mind: a fascinating mix of mathematical logic, computer science, and biology. Yet as promising as their work was, the sheer complexity of the human brain threatened to derail the whole effort. An early critique came from computer science pioneer John von Neumann. As Cobb puts it, von Neumann realized “that real nervous systems were far more complicated than those described by McCulloch and Pitts” and that neurons — unfortunately — did not always function in the all-or-nothing fashion of the electronic switches in a digital computer.
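To make the “all-or-nothing” point concrete, here is a minimal sketch, in Python, of a threshold unit in the general spirit of the McCulloch-Pitts model; the function name and the particular weights and thresholds are illustrative assumptions, not the authors’ original formulation or anything from Cobb’s book. The unit either fires (outputs 1) or stays silent (outputs 0), exactly the switch-like behavior that von Neumann noted real neurons do not always exhibit.

```python
# Illustrative sketch of a McCulloch-Pitts-style threshold unit (an assumed,
# simplified form of the 1943 model): the unit "fires" (returns 1) only if the
# weighted sum of its binary inputs reaches a threshold -- all-or-nothing,
# like an electronic switch.

def threshold_unit(inputs, weights, threshold):
    """Return 1 if the weighted sum of binary inputs meets the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With suitable weights and thresholds, such units compute simple logic gates.
# An AND gate: fires only when both inputs are on.
print(threshold_unit([1, 1], [1, 1], threshold=2))  # prints 1
print(threshold_unit([1, 0], [1, 1], threshold=2))  # prints 0
# An OR gate: fires when either input is on.
print(threshold_unit([1, 0], [1, 1], threshold=1))  # prints 1
print(threshold_unit([0, 0], [1, 1], threshold=1))  # prints 0
```

Networks of such units can, in principle, compute any logical function, which is what made the model so alluring as a theory of mind; the trouble, as the critique above suggests, is that biological neurons are far messier than these tidy switches.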

By the final decades of the 20th century, the discoveries were coming fast and furious. Functional magnetic resonance imaging (fMRI) offered tantalizing hints of which areas of the brain are most active when we do or think about certain things. And the new science of neural networks, which continues to revolutionize computer science, offered new metaphors for the brain’s function.

And yet, the end goal — a true understanding of the brain’s connections, and how they produce their mental counterparts — still seems like a distant mirage. While acknowledging “astonishing progress,” Cobb writes that, in his view, “it will probably take 50 years before we understand the maggot brain.” Later, after a discussion of mental illness and the search for effective treatments, Cobb writes that “unravelling the genetic architecture of the human brain and how it interacts with the environment will be the work of centuries.”

And there remains what Cobb refers to as “this question of questions” — the puzzle of consciousness.

“The Idea of the Brain” offers a compelling account of the twists and turns that our quest to understand this remarkable organ has taken over the centuries. How wonderful it would be to know how the story ends, but that will have to wait. The brain is so complex that it has not yet come to know itself.

Dan Falk (@danfalk) is a science journalist based in Toronto and a senior contributor to Undark. His books include “The Science of Shakespeare” and “In Search of Time.”