Lost languages leave traces on the brain

Babies' brains adjust to listening to a language, even if they never learn it.

Our brains start soaking up details from the languages around us from the moment we can hear them. Among the first things infants learn of their native languages is the system of consonants and vowels, along with other characteristics of speech sounds, like pitch. In the first year of life, a baby’s ear tunes in to the particular set of sounds spoken in its environment, and the brain starts developing the ability to tell subtle differences among them, a foundation that will later carry differences in meaning, allowing the child to learn words and grammar.

But what happens if that child is moved into a different culture after laying the foundations of its first native language? Does it forget everything about that first language, or do some remnants remain buried in the brain?

According to a recent PNAS paper, the effects of very early language learning are permanently etched into the brain, even if input from that language stops and another language replaces it. To identify this lasting influence, the researchers used functional magnetic resonance imaging (fMRI) to scan children who had been adopted, looking for neural patterns that could still be identified years after adoption.

Because not all linguistic features have easily identifiable effects on the brain, the researchers decided to focus on lexical tone. This is a feature, found in some languages, that allows a single sequence of consonants and vowels to carry different meanings distinguished only by a change in pitch. For example, in Mandarin Chinese, the syllable “ma” means “hemp” when spoken with a rising tone and “scold” when spoken with a falling tone.

People who speak tone languages show distinctive activity in a particular region of the brain’s left hemisphere. This region activates in response to pitch differences that convey a difference in linguistic meaning; non-linguistic pitch is processed in the right hemisphere. Tone information is learned very early in life: infants learning Chinese languages (including Mandarin and Cantonese) show signs of recognizing tonal contrasts as early as four months of age.

The researchers focused on 21 Chinese children who had been adopted early in life. The average age of the children at adoption was 12.8 months, which meant that they were likely to have learned to recognize tone before being adopted. Since adoption, the children had been exposed exclusively to French, had grown up as French monolingual speakers, and had no remaining conscious knowledge of Chinese.

As controls, the researchers used 11 children who spoke only French, as well as a third group of 12 children who spoke both Chinese and French. The children, all between 9 and 17 years old, completed a tone-discrimination task while in the fMRI scanner. They heard pairs of phrases made up of nonsense words built from Chinese speech sounds (like “brillig” or “strint” would be in English), or hummed phrases carrying nothing but tone information. The two phrases in a pair were either identical or differed in tone on the last syllable, and the children pressed a button to indicate whether the final syllables were the same or different.

All of the children were able to answer with very high accuracy, and there were no differences between the groups on either accuracy or reaction times. However, their fMRI scans showed a difference in how they processed the information.

The Chinese-French bilingual children used the specialized left-hemisphere region found in speakers of tone languages, while the French monolinguals used only their right hemispheres, as they would for processing any complex sound. The adopted children, who no longer spoke Chinese, showed the same pattern as the bilinguals: their brains activated the specialized tone region in the left hemisphere.

Activation was also stronger among children who had been older when they were adopted, which the researchers suggest indicates that the brain’s representation of lexical tone is strengthened by additional exposure. The length of time since adoption, however, made no difference to the amount of activation, possibly indicating that once the representation of tone has been established, time doesn’t weaken or erase it.

What makes this study particularly useful, says Dr. Cristina Dye, a researcher who studies childhood language acquisition, is that lexical tone is very well suited to probing this question. Previous studies tackling the same question used tasks requiring more complex linguistic knowledge, which children are less likely to have acquired at a very young age. Lexical tone also has the benefit of being very difficult for adults to learn, meaning that any traces of it most likely date from early childhood.

As with many fMRI studies, the sample sizes are small. This is due to the expense of the technology, as well as the stringent criteria for participants. Nevertheless, the results corroborate behavioral studies that have shown similar traces of lost languages, says Dye.

The next thing to determine, the researchers write, is whether the neural traces of a forgotten first language affect how subsequent languages are learned or processed by the brain. There may also be implications for relearning lost languages: people with forgotten early exposure to a language may be able to learn it faster, or more completely, than people with no exposure at all.

PNAS, 2014. DOI: 10.1073/pnas.1409411111
