Picture a sunlit Grecian sea or the deep hues of Santorini’s rooftops. Both are called “blue” in English. But to Greek speakers, the lighter hue is “ghalazio” and the darker one “ble.” When researchers showed native speakers of both languages squares of light and dark blue, the Greek speakers viewed the two colors as more distinct from each other than the English speakers did. Did a different language influence how people perceived the two colors?
The idea that language can shape perception and thought — a hypothesis formally known as “linguistic relativity” — dates back to the 1930s. The hypothesis asserts that language doesn’t just express ideas; it actively shapes them, determining how we understand the world around us. Initially met with great interest, the idea fell out of favor by the 1960s for lack of scientific evidence.
Still, the notion refuses to die. Many researchers now believe that language does play a role in some aspects of cognitive activity, but the nature and extent of that role remain frustratingly murky. “We’d like to be able to say: look, we tested it, and here’s the answer,” says Terry Regier, a linguistics researcher at the University of California, Berkeley. But evidence to support some version of linguistic relativity “doesn’t replicate reliably.”
Now, cognitive scientists are applying new technologies to resolve the issue. Their methods can tap directly into the brain to track blood flow and electrical activity in response to sensory input, allowing researchers to investigate the neural mechanisms by which language influences cognitive function. Research groups have also begun to study young infants; these pudgy-cheeked humans reveal the brain’s capacity to process sensory information before any words are learned. Taken together, the experiments point in a surprising direction: Language does, indeed, influence our ability to perceive the world around us.
In 2016, researchers in Japan investigated whether babies can categorize colors, using a method that measured the babies’ neural responses to various colors and shades. A color switch between blue and green triggered an increase in blood flow to certain parts of the brain, indicating that the brain perceived and processed the two colors differently — well before the babies had learned the words “green” and “blue.” Yet when babies were shown two shades of green, there was no corresponding change in blood flow.
This led Alice Skelton, a Ph.D. student in psychologist Anna Franklin’s lab at the University of Sussex, to wonder how the infant brain draws the line between different colors. She and her colleagues strapped babies into car seats and showed them a sequence of differently colored squares. A baby was presented with one color, and the researchers watched how long it took for its eyes to drift away in boredom. When that happened, they placed another colored square next to the first. Babies prefer looking at novel things. So if they perceived the second hue as new or different, it grabbed their attention — and held their gaze. But if the infants saw the new color as basically similar to the previous one, they remained uninterested.
The team found that babies appear to lump colors together into five groups: red, yellow, green, blue, and purple. Those categories appear to be governed by the biology of color vision itself. Other studies from Franklin’s lab suggest that toddlers who are learning language still see colors much like babies do — based on their biology rather than their lexicon. But several studies have found that language does appear to influence adults’ perceptions of color. As we grow and learn new color words, Skelton explains, languages teach us to distinguish more hues.
“We’re not saying all color categories are just a result of biology,” says Skelton. “It’s more that biology provides the fault lines that culture then builds upon.”
Figuring out how culture — and more specifically, language — influences adult perception has historically been difficult. Consider the study of how Greek and English speakers perceive color. One might argue that the results merely reflect linguistic differences: Because Greek speakers have two labels, they can slap “ghalazio” and “ble” on the two hues. English speakers call both hues “blue,” but that doesn’t mean they can’t see the difference — they just don’t give it two different names.
“It’s very difficult to negate that argument because it actually makes good sense,” says Guillaume Thierry, a cognitive neuroscientist at Bangor University. Still, he wanted to understand what was at work in the brains of those Greek and English speakers as they looked at different shades of blue. Thierry returned to his study. This time, instead of asking participants how they’d name the colors, his team placed electrodes on the participants’ scalps to track tiny changes in electrical signals in the brain’s visual system. Now, participants no longer needed to name the colors. Researchers could directly observe participants’ neural activity.
Native speakers of Greek and English were shown a series of circles and squares, and instructed to press a button when they saw a square instead of a circle. The shapes randomly switched colors between a light and dark shade of either blue or green, but participants weren’t instructed to focus on the colors. During the test, electrical activity revealed that in Greek speakers, the visual system responded differently to “ghalazio” and “ble” shapes. The change wasn’t seen in English speakers. Perhaps most surprisingly, the fluctuation in electrical activity occurred within 200 milliseconds of seeing the color — faster than the average person’s brain can retrieve a word.
When the researchers asked participants what they’d seen, most didn’t even recall seeing differently colored circles. There were squares that changed color, they said, and some circles. Because participants had been asked to consciously focus on squares, they all recalled both shades of blue on the squares. But only the Greek speakers unconsciously registered the difference between the blue circles. “That was a turning point for me,” Thierry recalls. “It showed that the visual system functions differently in speakers of one language than another, and that people can’t account for [that difference] in other ways.”
The team repeated the experiment with everyday objects instead of colors. The Spanish word “taza” encompasses both cups and mugs, whereas English distinguishes between the two. When Spanish and English speakers were shown pictures of a cup, a mug, or a bowl, the difference between the cup and the mug elicited greater electrical activity in the brains of English speakers than in those of Spanish speakers.
These speedy spikes in brain activity occur even when people are unaware of them. But does the altered perception have an effect on subsequent actions? That’s hard to say, according to Thierry. “Our results are very much about unconscious processing by the human brain,” he says. “The very nature of this kind of research entails that any links to overt behavior and attitude can only be tentative.”
But few researchers today seek such links or support the extreme notion that users of one language think entirely differently from those who speak another tongue. Linguistic relativity can take many forms — some seemingly more mundane than others. Perhaps knowing an extra word for blue simply influences what we see on an Aegean holiday.
And yet, surely what we see, smell, and otherwise sense fuels at least some of our thinking. That’s why researchers continue to probe the interplay between language and cognitive activity. Understanding the effects of color categories in different languages is only a first step. “In the bigger picture, it’s about principles that are broadly generalizable,” Regier says. “It’s really about the effect of a communicative system on thought.”
Jyoti Madhusoodanan is a science journalist based in the San Francisco Bay Area. She covers life sciences, health, and STEM careers for Nature, Science, Scientific American, Discover, Chemical & Engineering News, and other outlets.