
I didn't even think about that!
Words you'll never speak still cause activity in the brain's speech centers.
Most experimental brain-computer interfaces (BCIs) that have been used for synthesizing human speech have been implanted in the areas of the brain that translate the intention to speak into the muscle actions that produce it. A patient has to physically attempt to speak to make these implants work, which is tiresome for severely paralyzed people.
To get around this, researchers at Stanford University built a BCI that can decode inner speech—the kind we engage in during silent reading and use for all our internal monologues. The problem is that those inner monologues often involve stuff we don’t want others to hear. To keep their BCI from spilling the patients’ most private thoughts, the researchers designed a first-of-its-kind “mental privacy” safeguard.
...
One solution for attempted-speech BCIs worked automatically and relied on catching subtle differences between the brain signals for attempted and inner speech. “If you included inner speech signals and labeled them as silent, you could train AI decoder neural networks to ignore them—and they were pretty good at that,” Krasa says.
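The labeling trick described here is easy to sketch in code. The toy example below is entirely my own invention (simulated feature vectors, made-up class centers, a plain logistic-regression classifier standing in for the decoder network), but it shows the idea: if inner-speech recordings are labeled as "silent" at training time, the decoder learns to emit nothing for them.

```python
# Hypothetical sketch; the data and model are invented, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_channels = 16  # pretend: one feature per recording channel

def simulate(level, n):
    """Draw n noisy feature vectors around a class-specific activity level."""
    return np.full(n_channels, level) + rng.normal(scale=0.5, size=(n, n_channels))

attempted = simulate(3.0, 200)  # attempted speech: strong signal
inner     = simulate(1.2, 200)  # inner speech: weaker but nonzero signal
rest      = simulate(0.0, 200)  # true silence

# The trick from the quote: inner speech gets the "silent" label,
# so the decoder is trained to ignore it.
X = np.vstack([attempted, inner, rest])
y = np.array(["speak"] * 200 + ["silent"] * 400)

clf = LogisticRegression(max_iter=1000).fit(X, y)

# New inner-speech trials should now be decoded as silence.
print(clf.predict(simulate(1.2, 5)))
```

A real decoder is a neural network over high-dimensional neural time series, but the training-label choice works the same way.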
Their alternate safeguard was a bit less seamless. Krasa’s team simply trained their decoder to recognize a password patients had to imagine speaking in their heads to activate the prosthesis. The password? “Chitty chitty bang bang,” which worked like the mental equivalent of saying “Hey Siri.” The prosthesis recognized this password with 98 percent accuracy.
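The password mechanism is essentially wake-word gating, like "Hey Siri". As a rough illustration (the gating logic below is my own sketch; only the password phrase comes from the article), the prosthesis can suppress all decoded words until the imagined password sequence appears in the stream:

```python
# Hypothetical sketch of password gating; only the phrase is from the article.
PASSWORD = ("chitty", "chitty", "bang", "bang")

class PasswordGate:
    """Suppress decoded output until the imagined password is detected."""

    def __init__(self, password=PASSWORD):
        self.password = password
        self.buffer = []      # last few decoded words, for matching
        self.active = False   # gate opens once the password is seen

    def feed(self, word):
        """Feed one decoded word; return it only if the gate is open."""
        if self.active:
            return word
        self.buffer.append(word)
        self.buffer = self.buffer[-len(self.password):]
        if tuple(self.buffer) == self.password:
            self.active = True  # start emitting from the next word on
        return None  # everything before and including the password stays private

gate = PasswordGate()
stream = ["secret", "thought", "chitty", "chitty", "bang", "bang", "hi", "mom"]
out = [w for w in (gate.feed(w) for w in stream) if w is not None]
print(out)  # only the words after the password get through
```

In the actual system the "match" is of course a probabilistic classification of neural activity (hence the 98 percent figure), not exact string comparison.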
They go on to say it's still very much a work in progress, as it doesn't work in many cases. Pretty cool nonetheless, and not something I had ever thought about...