
One solution for attempted speech BCIs worked automatically and relied on catching subtle differences between the brain signals for attempted and inner speech. “If you included inner speech signals and labeled them as silent, you could train AI decoder neural networks to ignore them—and they were pretty good at that,” Krasa says.
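To make the labeling idea concrete, here's a minimal sketch of what that training setup might look like, not the team's actual pipeline: windows of neural features from inner speech get the same "silent" label as rest, so the decoder learns to output nothing for them. The feature size, synthetic data, and classifier choice are all my own illustrative assumptions.

```python
# Toy sketch: label inner-speech windows as "silent" so the decoder ignores them.
# All names, shapes, and the classifier are illustrative assumptions, not the paper's method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_features = 64  # hypothetical size of one neural feature window

def fake_windows(n, shift):
    # Stand-in for recorded neural features; real data would come from electrodes.
    return rng.normal(loc=shift, scale=1.0, size=(n, n_features))

# Attempted speech gets a word label; inner speech and rest both get "silent".
X = np.vstack([
    fake_windows(200, shift=1.0),   # attempted speech
    fake_windows(200, shift=0.3),   # inner speech -> deliberately labeled silent
    fake_windows(200, shift=0.0),   # rest -> silent
])
y = np.array(["word"] * 200 + ["silent"] * 400)

decoder = LogisticRegression(max_iter=1000).fit(X, y)
print(decoder.score(X, y))  # training accuracy of the toy decoder
```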
Their alternate safeguard was a bit less seamless. Krasa’s team simply trained their decoder to recognize a password patients had to imagine speaking in their heads to activate the prosthesis. The password? “Chitty chitty bang bang,” which worked like the mental equivalent of saying “Hey Siri.” The prosthesis recognized this password with 98 percent accuracy.
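And here's a rough sketch of the wake-phrase idea, assuming some classifier that scores each window for the imagined password; the threshold, window format, and scoring function are all made-up placeholders, just to show the "Hey Siri"-style gating.

```python
# Toy sketch of password gating: decoding only starts after the imagined
# phrase is detected. The threshold and scoring function are assumptions.
def gate_decoding(windows, password_score, threshold=0.9):
    """Yield windows only after the imagined password is detected."""
    unlocked = False
    for w in windows:
        if not unlocked and password_score(w) >= threshold:
            unlocked = True   # mental equivalent of hearing "Hey Siri"
            continue          # the password window itself is not decoded
        if unlocked:
            yield w           # downstream decoder only sees post-password windows

# Toy usage: a score above 0.9 unlocks decoding for everything that follows.
scores = iter([0.1, 0.95])
print(list(gate_decoding(["w1", "w2", "w3", "w4"], lambda w: next(scores))))
# -> ['w3', 'w4']
```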
They go on to say it's still very much a work in progress, as it doesn't work in many cases. Pretty cool nonetheless, and not something I had ever thought about...