A cursor on a computer screen can be controlled using thoughts about a range of vowel sounds, research has found.
Brain signals have been translated into motion or even pictures before, but the current research showcases a nascent technique called electrocorticography.
The approach uses sensors placed directly on the surface of the brain.
The authors of the Journal of Neural Engineering paper say the technique will lead to better “brain-computer interfaces” for the disabled.
A great many studies and demonstrations have in recent years made use of the electroencephalograph, or EEG, typically worn as a “cap” studded with electrodes that pick up the electric fields produced by firing neurons.
The technique has been used to guide electric wheelchairs and even toys, based only on the wearer’s intention.
However, EEGs lose a great deal of the precious information available closer to the brain itself, what the study’s lead author, Eric Leuthardt of Washington University in St Louis, in the US, calls the “gold standard” brain signal.
“You cannot get the spatial or the signal resolution,” he told BBC News.
“One of the key features in signal resolution is seeing the higher frequencies of brain activity – those higher frequencies have a substantial capability of giving us better insights into cognitive intentions, and part of the reason EEG suffers for this is it acts as a filter of all of these high frequency signals.”
That is, the EEG picks up signals from outside the skull, which absorbs and muddles them.
Electrocorticography, by contrast, is so named because it taps directly into the brain’s cortex – the outermost layer of the brain.
In a surgical procedure, a plastic pad containing a number of electrodes is implanted under the skull.
Its power has already been shown off in allowing video game play by thought alone – but in the new study, the researchers have tapped into the speech network of the brain.