A computer program that can read silently spoken words by analysing nerve signals in our mouths and throats has been developed by NASA.
Preliminary results show that using button-sized sensors, which attach under the chin and on the side of the Adam's apple, it is possible to pick up and recognise nerve signals and patterns from the tongue and vocal cords that correspond to specific words.
"Biological signals arise when reading or speaking to oneself with or without actual lip or facial movement," says Chuck Jorgensen, a neuroengineer at NASA's Ames Research Center in Moffett Field, California, who is in charge of the research. Just the slightest movements of the voice box and tongue are all it needs to work, he says. [continued]