New technology for reading brainwaves while monitoring eye movement could lead to improved systems for stopping drivers falling asleep at the wheel.
Researchers from Leicester University have developed signal processing algorithms to compensate for the electrical activity created by the eye muscles as they move, which can interfere with the brain signals captured through electroencephalograph (EEG) systems.
This means the subject doesn’t need to keep their eyes still for the EEG to work, so brain signals and eye movement can be tracked at the same time. That opens up the possibility of adding an extra element to eye-tracking systems to make them more reliable, or of improving future brain-computer interfaces.
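The article doesn’t describe the Leicester algorithms in detail. Purely as an illustration, the sketch below shows one classic way ocular artifacts are compensated for in EEG: the eye-channel (EOG) activity is regressed out of each EEG channel by least squares. The channel counts, sampling rate and data are hypothetical, and this is not presented as the researchers’ own method.

```python
import numpy as np

def remove_eog_artifacts(eeg, eog):
    """Regress EOG (eye-movement) activity out of EEG channels.

    eeg : array, shape (n_eeg_channels, n_samples)
    eog : array, shape (n_eog_channels, n_samples)
    Returns the EEG with the least-squares EOG contribution subtracted.
    """
    # Least-squares propagation factors: how strongly each EOG channel
    # leaks into each EEG channel.
    coeffs, *_ = np.linalg.lstsq(eog.T, eeg.T, rcond=None)  # (n_eog, n_eeg)
    # Subtract the estimated ocular contribution from every EEG channel.
    return eeg - coeffs.T @ eog

# Hypothetical example: 8 EEG channels, 2 EOG channels, 10 s at 250 Hz.
rng = np.random.default_rng(0)
n_samples = 2500
eog = rng.standard_normal((2, n_samples))      # simulated eye activity
brain = rng.standard_normal((8, n_samples))    # simulated brain activity
mixing = 0.5 * rng.standard_normal((8, 2))     # how EOG leaks into the EEG
eeg = brain + mixing @ eog                     # contaminated recording

cleaned = remove_eog_artifacts(eeg, eog)
# Residual correlation with the eye channel should be near zero.
print(np.corrcoef(cleaned[0], eog[0])[0, 1])
```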
‘Eye monitoring on its own doesn’t work for everyone,’ project leader Dr Matias Ison told The Engineer. ‘But most importantly, if you’re driving along a long, boring road you can be staring at one point but your mind can be wandering and about to fall asleep. This could be picked up by an EEG system and not an eye tracker.’
Current EEG systems that require electrodes to be placed on the scalp would be impractical for use while driving. But Ison said the new system would enable the next stage of research into brain activity that occurs when the eyes are moving.
In particular, researchers will need to conduct experiments to collect data from the brain about what happens when a subject makes different saccades (fast eye movements), in order to improve the system further.
‘The issue is that we need to have a certain number of saccades of different types and sizes so we can teach the algorithm what particular types of [movement] look like,’ said Ison.
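The article doesn’t say what ‘teaching the algorithm’ involves in practice. One standard supervised-learning setup, sketched below with entirely hypothetical features and labels, is to describe each recorded saccade by a few measurements (amplitude, duration, direction and so on) and train a simple classifier to recognise the saccade type; this is illustrative only, not the team’s actual approach.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical labelled dataset: one row of features per recorded saccade
# (e.g. amplitude, duration, peak velocity, direction), plus a label for
# the saccade type it was recorded as. Here the data are simulated.
rng = np.random.default_rng(1)
n_saccades = 300
y = rng.integers(0, 3, size=n_saccades)                  # 3 hypothetical saccade types
X = rng.standard_normal((n_saccades, 4)) + y[:, None]    # features shift with type

# Teach a simple classifier what each saccade type 'looks like', then
# estimate how well it generalises with cross-validation.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")
```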
‘We move our eyes about three to four times a second and in such a small period of time we are acquiring information and deciding where to move our eyes next. Given the very limited speed at which neurons can transmit information this is a very challenging task.’
If the technology were to improve enough, it could be applied not only to driver-monitoring systems but also to computer interfaces, allowing gamers to move characters around much more naturally or enabling completely paralysed people to control a wheelchair just by moving their eyes.
A nearer-term application could be diagnosing dyslexia and other reading disorders by monitoring subjects in a more realistic way than the current method, which checks their brain activity as they read a rapid succession of single words.
The project was carried out in collaboration with the University of Buenos Aires in Argentina and funded by the EPSRC. The researchers are currently awaiting publication of their work in a peer-reviewed journal.