"Paralysis had robbed the two women of their ability to speak. For one, the
cause was amyotrophic lateral sclerosis, or ALS, a disease that affects the
motor neurons. The other had suffered a stroke in her brain stem. Though they
can no longer enunciate clearly, they remember how to formulate words.
Now, after volunteering to receive brain implants, both are able to communicate
through a computer at a speed approaching the tempo of normal conversation. By
parsing the neural activity associated with the facial movements involved in
talking, the devices decode their intended speech at rates of 62 and 78 words
per minute, respectively, several times faster than the previous record. Their
cases are detailed in two papers published Wednesday by separate teams.
“It is now possible to imagine a future where we can restore fluid conversation
to someone with paralysis, enabling them to freely say whatever they want to
say with an accuracy high enough to be understood reliably,” said Frank
Willett, a research scientist at Stanford University’s Neural Prosthetics
Translational Laboratory, during a media briefing on Tuesday. Willett is an
author on a paper produced by Stanford researchers; the other was published by
a team at UC San Francisco.
While those rates are slower than the roughly 160 words per minute of natural
conversation among English speakers, scientists say it's an exciting step toward restoring
real-time speech using a brain-computer interface, or BCI. “It is getting close
to being used in everyday life,” says Marc Slutzky, a neurologist at
Northwestern University who wasn’t involved in the new studies.
A BCI collects and analyzes brain signals, then translates them into commands
to be carried out by an external device. Such systems have allowed paralyzed
people to control robotic arms, play video games, and send emails with their
minds. Previous research by the two groups showed it was possible to translate
a paralyzed person’s intended speech into text on a screen, but with limited
speed, accuracy, and vocabulary.
In the Stanford study, researchers developed a BCI that uses the Utah array, a
tiny square sensor that looks like a hairbrush with 64 needle-like bristles.
Each is tipped with an electrode, and together they collect the activity of
individual neurons. Researchers then trained an artificial neural network to
decode brain activity and translate it into words displayed on a screen."
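[The decoding step described above — mapping multi-electrode activity to an intended word — can be illustrated with a deliberately toy sketch. This is NOT the Stanford team's actual model (theirs is a recurrent neural network trained on real recordings of attempted speech); here, simulated 64-channel firing-rate vectors are classified with a simple nearest-centroid decoder, and all vocabulary, data, and parameters are invented for illustration.]

```python
import numpy as np

rng = np.random.default_rng(0)

N_ELECTRODES = 64                        # the Utah array described above has 64 electrodes
WORDS = ["hello", "water", "yes", "no"]  # toy vocabulary (invented for this sketch)

# Simulate the premise of the study: each intended word evokes a
# characteristic firing pattern across the electrodes, plus trial noise.
true_patterns = rng.normal(0.0, 1.0, size=(len(WORDS), N_ELECTRODES))

def simulate_trial(word_idx, noise=0.3):
    """One simulated trial: the word's pattern corrupted by recording noise."""
    return true_patterns[word_idx] + rng.normal(0.0, noise, N_ELECTRODES)

# "Train" a nearest-centroid decoder by averaging 20 trials per word.
centroids = {w: np.mean([simulate_trial(i) for _ in range(20)], axis=0)
             for i, w in enumerate(WORDS)}

def decode(activity):
    """Return the word whose mean firing pattern is closest to this activity."""
    return min(centroids, key=lambda w: np.linalg.norm(activity - centroids[w]))

# Decode a fresh, unseen trial of "water" (index 1).
print(decode(simulate_trial(1)))  # → water
```

[A real speech BCI replaces the nearest-centroid step with a neural network and decodes phonemes in sequence with a language model, but the core idea is the same: distinct intended movements leave distinct, separable patterns in the recorded activity.]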
*** Xanni ***
Chief Scientist, Xanadu
Partner, Glass Wings
Manager, Serious Cybernetics