Pancho, as his friends call him, was just 20 years old when a stroke left much of his body paralyzed. At 30, he met Edward Chang, a neurosurgeon at the University of California, San Francisco, who was investigating the stroke's lasting effects on his brain. In 2021 he became the protagonist of an innovative study: Chang's team surgically implanted electrodes on his cerebral cortex to record neural activity, which was translated into words on a screen. His story is back in the spotlight because, for the first time, a brain implant has helped a bilingual person unable to articulate words communicate in both languages, thanks to an artificial intelligence system, coupled with the implant, that decodes in real time what he is trying to say in Spanish or English. It is also a step forward in our understanding of the human mind.
The results, published in 'Nature Biomedical Engineering', shed light on how the brain processes language and could one day lead to durable devices that restore multilingual speech to people who cannot communicate verbally. "This new study represents an important contribution to the emerging field of neuroprosthetics for speech restoration," says Sergey Stavisky, a neuroscientist at the University of California, Davis, who was not involved in the study. Although the research included only one participant and much work remains to be done, "there is every reason to think that this strategy will work more accurately in the future when combined with other recent advances," Stavisky believes.
Pancho is a native Spanish speaker and learned English only after the stroke; Spanish still evokes feelings of familiarity and belonging in him. Chang's team developed an AI system to decipher his bilingual speech. The effort, led by Alexander Silva, involved training the system as Pancho attempted to say nearly 200 words; his attempt to form each word produced a distinct neural pattern that was recorded by the electrodes. The authors then applied their AI system, which has a Spanish module and an English module, to full sentences. The modules distinguished English from Spanish based on the first word with 88% accuracy and decoded the correct sentence with 75% accuracy.
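The two-module scheme described above can be illustrated with a toy sketch: each language module scores a neural feature vector against per-word templates, the first word's scores select the language, and that module then decodes the remaining words. Everything here (the feature vectors, the tiny vocabulary, the distance-based scoring) is an illustrative assumption, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "neural templates": one prototype feature vector per word, per language.
# Real decoders learn these mappings from recorded cortical activity.
VOCAB = {
    "english": {"hello": rng.normal(size=8), "water": rng.normal(size=8)},
    "spanish": {"hola": rng.normal(size=8), "agua": rng.normal(size=8)},
}

def module_score(features, language):
    """One language module: return (best_word, confidence) for a feature vector.
    Confidence is the negative distance to the closest word template."""
    best_word, best_score = None, -np.inf
    for word, template in VOCAB[language].items():
        score = -np.linalg.norm(features - template)
        if score > best_score:
            best_word, best_score = word, score
    return best_word, best_score

def decode_sentence(feature_seq):
    """Pick the language from the first word's module confidences,
    then decode every word with the winning language's module."""
    scores = {lang: module_score(feature_seq[0], lang)[1] for lang in VOCAB}
    language = max(scores, key=scores.get)
    words = [module_score(f, language)[0] for f in feature_seq]
    return language, words

# Simulate an attempt at the Spanish word "agua" with small neural noise.
attempt = [VOCAB["spanish"]["agua"] + rng.normal(scale=0.1, size=8)]
print(decode_sentence(attempt))
```

In this sketch the language decision is made once, from the first word, which mirrors the article's report that the system identified the language from the first word with 88% accuracy before decoding the rest of the sentence.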
The research also revealed unexpected aspects of language processing in the brain. Some previous experiments using non-invasive tools had suggested that different languages activate distinct parts of the brain. But the authors' examination of signals recorded directly from Pancho's cortex showed that "much of the activity, for both Spanish and English, was actually coming from the same area," Silva says. Furthermore, contrary to earlier findings, Pancho's neurological responses did not appear to differ much from those of children raised bilingual, even though he was in his 30s when he learned English. Together, these data suggest that different languages share at least some neurological characteristics, and that this may generalize to other people.