How does the brain encode speech?

Ventral sensory-motor cortex as the key center of speech

During a conversation we use about a hundred muscles. By continuously moving our lips, jaw, tongue, and larynx, we transform our breath into fluent sequences of sounds that form words and sentences.

Scientists from San Francisco believe that the brain's speech centers are organized around the physical demands of the vocal tract. Linguists divide speech into phonemes, the shortest sound units that distinguish one word from another, but phonemes are in some sense a "deception of hearing." For example, in English the sound [k] in the words "keep" and "coop" is perceived as the same, yet the mouth actually forms it differently depending on the vowel that follows.

The ventral sensorimotor cortex is the key center of speech. In their study, Chartier J and Anumanchipalli GK asked five volunteers to read aloud a collection of 460 compactly constructed sentences. Electrocorticographic electrodes had been placed directly on the participants' cerebral cortex beforehand. The sentences covered the full range of co-articulation and contained a variety of phonemes; that is, they were as close as possible to natural speech.

Notably, the experiment was far more complex than previous studies in which words were pronounced syllable by syllable. The new approach allowed the scientists to identify individual groups of neurons responsible for the specific patterns of vocal-tract movement needed to produce smooth speech sounds.

The experiments showed that the neurons surrounding individual electrodes encode a considerable variety of movements. The researchers concluded that four groups of neurons coordinate the movements of the muscles of the lips, tongue, and larynx, corresponding to the four main configurations of the vocal tract.
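The grouping described above can be illustrated with a toy analysis. The sketch below is not the study's actual method: it clusters synthetic "electrode activity profiles" into four groups with a minimal k-means, just to show how distinct movement patterns could emerge from electrode data. All numbers and names here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "activity profiles": 40 electrodes x 8 movement features,
# generated around 4 hidden prototypes (e.g. lips, tongue tip,
# tongue body, larynx).
prototypes = rng.normal(size=(4, 8))
labels_true = np.repeat(np.arange(4), 10)
activity = prototypes[labels_true] + 0.1 * rng.normal(size=(40, 8))

def kmeans(X, k, iters=50):
    """Minimal k-means: assign points to the nearest centroid, update centroids."""
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        assign = d.argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                centroids[j] = X[assign == j].mean(axis=0)
    return assign

clusters = kmeans(activity, k=4)
# How many electrodes fall into each of the four groups:
print(np.bincount(clusters, minlength=4))
```

In the real study the "features" were movement trajectories inferred from acoustics, not random numbers, but the principle of letting electrode responses sort themselves into a small number of movement groups is the same.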

It was also noted that the brain's speech centers coordinate different patterns of muscle movement based on the context of what is said and the order in which sounds are pronounced. For example, the jaw opens wider when pronouncing the word "tap" than the word "has". Although the two words share the same vowel sound ([ae]), in "has" the mouth closes faster to produce the [z] sound.

Now we know that the sensorimotor cortex encodes the movements of the vocal tract. Building on this, we can decipher signals from the cerebral cortex and translate them through a speech prosthesis. That would give a voice to people who cannot speak but whose neural connections are intact.

– Chartier J

To get closer to cracking how words are encoded in the head, scientists from Northwestern University (USA) conducted a series of studies and found that the brain controls speech production much as it controls hand movements. The researchers recorded signals from two different areas of the brain and found that the brain handles two tasks: what to say (sounds such as "p" and "b") and how to say it (movements of the lips, palate, tongue, and larynx).

How is thought transformed into speech? As mentioned earlier, speech consists of particular sounds, called phonemes, which are formed by coordinated movements of the lips, tongue, palate, and larynx.

For the study, a group of patients undergoing surgery for the removal of a brain tumor was selected. During the operation, the patients were asked to read words from a screen. Using special electrodes placed on the surface of the cerebral cortex, the researchers recorded signals that identified the brain regions responsible for pronouncing particular sounds.
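The mapping from recorded signals to pronounced sounds can be illustrated with a toy classifier. The sketch below is an assumption-laden stand-in for the study's analysis: each sound gets a "template" averaged from synthetic training recordings, and a new signal is identified by its nearest template. The sounds, feature counts, and noise levels are all invented.

```python
import numpy as np

rng = np.random.default_rng(2)

sounds = ["p", "b", "t", "d"]
# Synthetic training data: 20 recordings per sound, 12 signal features each.
centers = rng.normal(size=(4, 12))
train = {s: centers[i] + 0.1 * rng.normal(size=(20, 12))
         for i, s in enumerate(sounds)}

# "Train" by averaging each sound's recordings into a template.
templates = {s: x.mean(axis=0) for s, x in train.items()}

def identify(signal):
    """Return the sound whose template is closest to the signal."""
    return min(templates, key=lambda s: np.linalg.norm(signal - templates[s]))

# A new noisy recording generated from the "t" center is identified as "t".
probe = centers[2] + 0.1 * rng.normal(size=12)
print(identify(probe))  # prints "t"
```

Nearest-template classification is about the simplest possible decoder; it serves here only to make the idea of "signals that determine which sound was pronounced" concrete.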

We hypothesized that the brain region responsible for the motor activity of speech may be organized similarly to the areas responsible for hand movements. We studied two parts of the brain that help shape speech. The precentral gyrus is largely responsible for the movements of the lips, tongue, palate, and larynx. The inferior frontal gyrus, a higher-level speech region, was responsible for phonemes and articulatory gestures.

– Dr. Marc Slutzky, lead author of the study

The results of the research matter above all for paralyzed people, who can think but are unable to speak their thoughts aloud. For communication they use special devices that react to movements of their eyes or cheeks and form words one letter at a time.

However, this process is slow and unnatural and does not allow for a normal conversation. In light of the new discoveries, scientists want to help completely paralyzed people communicate intuitively through a brain-computer interface that deciphers the commands the brain sends to the tongue, palate, lips, and larynx. In the near future, a person would only have to want to say the words, and a machine would immediately transform them into speech.