First-of-its-kind AI system restores bilingual speech for paralyzed man

LONDON, UNITED KINGDOM — For the first time, a brain implant has enabled a bilingual person who cannot articulate words to communicate in both Spanish and English.
This groundbreaking artificial intelligence (AI) system, coupled to the implant, decodes in real time what the man is trying to say in either language.
The findings offer new insights into how our brains process language and could pave the way for long-lasting devices capable of restoring multilingual speech to those who cannot communicate verbally.
“This new study is an important contribution to the emerging field of speech-restoration neuroprostheses,” says Sergey Stavisky, a neuroscientist at the University of California, Davis, who was not involved in the study.
Although the study included only one participant, “there’s every reason to think that this strategy will work with higher accuracy in the future when combined with other recent advances,” Stavisky adds.
Brain implant decodes Spanish and English words in real time
The study’s participant, known as Pancho, suffered a stroke at age 20 that left much of his body paralyzed. Since then, he has been able to produce only moans and grunts, not intelligible speech.
In his thirties, Pancho partnered with Edward Chang, a neurosurgeon at the University of California, San Francisco, to explore the stroke’s lasting effects on his brain.
In a 2021 study, Chang’s team surgically implanted electrodes on Pancho’s cortex to record neural activity, which was then translated into words on a screen.
Pancho’s first sentence, “My family is outside,” was decoded in English. However, as a native Spanish speaker, Pancho feels more connected to Spanish.
“What languages someone speaks are actually very linked to their identity,” Chang says. “Our long-term goal has never been just about replacing words, but about restoring connection for people.”
Training the AI on neural patterns for each language
To achieve this, the team developed an AI system to decipher Pancho’s bilingual speech. Led by Chang’s PhD student Alexander Silva, the researchers trained the system as Pancho attempted to say nearly 200 words; each attempt produced a distinct pattern of neural activity that the electrodes recorded.
The AI system, which includes both Spanish and English modules, decodes phrases as Pancho tries to say them aloud. Each module selects the most likely word based on neural patterns and builds a phrase, displaying the version with the highest probability score on Pancho’s screen.
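To make that selection logic concrete, here is a minimal Python sketch of the two-module scheme. Everything in it is an illustrative assumption rather than the study’s implementation: the tiny vocabularies, the feature dimensionality, and the random linear classifiers stand in for whatever models the team actually trained. Only the overall structure, two language modules each scoring words from neural features with the higher-probability phrase winning, follows the description above.

```python
import numpy as np

# Hypothetical sketch of the two-module decoding scheme described in
# the article. Vocabularies, feature sizes, and classifier weights are
# illustrative stand-ins, not the study's actual models or data.

RNG = np.random.default_rng(seed=0)
N_FEATURES = 128  # one window of neural features from the electrodes

VOCAB = {
    "english": ["my", "family", "is", "outside"],
    "spanish": ["mi", "familia", "esta", "afuera"],
}

# Stand-in classifier weights: one matrix per language, mapping a
# feature window to a score for every word in that vocabulary.
WEIGHTS = {lang: RNG.normal(size=(len(words), N_FEATURES))
           for lang, words in VOCAB.items()}

def word_probabilities(features, lang):
    """Softmax over per-word scores for one language module."""
    scores = WEIGHTS[lang] @ features
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

def decode(feature_windows):
    """Each language module builds its own phrase word by word,
    accumulating a log-probability; the higher-scoring phrase is
    the one displayed on screen."""
    candidates = {}
    for lang, words in VOCAB.items():
        phrase, log_prob = [], 0.0
        for features in feature_windows:
            probs = word_probabilities(features, lang)
            best = int(probs.argmax())
            phrase.append(words[best])
            log_prob += float(np.log(probs[best]))
        candidates[lang] = (log_prob, " ".join(phrase))
    return max(candidates.items(), key=lambda item: item[1][0])

# Simulated neural activity for a four-word attempt.
windows = [RNG.normal(size=N_FEATURES) for _ in range(4)]
lang, (score, phrase) = decode(windows)
print(f"[{lang}] {phrase}  (log-probability {score:.2f})")
```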
The modules distinguished between English and Spanish with 88% accuracy and decoded sentences with 75% accuracy. Pancho eventually engaged in candid, unscripted conversations with the research team.
“After the first time we did one of these sentences, there were a few minutes where we were just smiling,” Silva recalls.
Bilingual language processing in the brain
The study revealed unexpected aspects of language processing in the brain. Contrary to previous experiments suggesting different languages activate distinct brain areas, the signals recorded directly in the cortex showed that “a lot of the activity for both Spanish and English was actually from the same area,” Silva says.
Although Pancho did not learn English until his thirties, his neural responses were similar to those of children who grew up bilingual. These findings suggest that different languages share some neurological features, which may extend to languages beyond Spanish and English.
Kenji Kansaku, a neurophysiologist at Dokkyo Medical University in Japan, who was not involved in the study, notes that future research should include languages with different articulatory properties, such as Mandarin or Japanese.
Silva is already exploring this, along with ‘code switching’—shifting from one language to another in a single sentence. “Ideally, we’d like to give people the ability to communicate as naturally as possible,” Silva says.