Imagine if playing music were as simple as looking at your laptop screen. Now it is, thanks to Kenneth Camilleri and his team of researchers from the Department of Systems and Control Engineering and the Centre for Biomedical Cybernetics at the University of Malta, who have developed a music player that can be controlled by the human brain.

Camilleri and his team have been studying brain responses for ten years. Now they have found one that is optimal for controlling a music player using eye movements. The system was originally developed to improve the quality of life of individuals with severely impaired motor abilities such as those with motor neuron disease or cerebral palsy.

The technology works by reading two key features of the user’s nervous system: the nerves that trigger muscular movement in the eyes and the way the brain processes vision. The user controls the music player simply by looking at a series of flickering boxes on a computer screen. Each box flickers at a distinct frequency, and as the user looks at one, their brain activity synchronizes to that frequency. This brain-pattern reading system, developed by Rosanne Zerafa, relies on steady-state visually evoked potentials (SSVEPs).

Electrical signals sent by the brain are then picked up by a series of electrodes placed at specific locations on the user’s scalp. This process, known as electroencephalography (EEG), records the brain responses and converts the brain activity into a series of computer commands.

As the user looks at the boxes on the screen, the computer program identifies the intended command, allowing the music player to be controlled without any physical movement. To adjust the volume or change the song, the user simply looks at the corresponding box. The command takes effect within seconds.
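In outline, an SSVEP-based selection like the one described above can be sketched in a few lines of code. This is a minimal illustration, not the Malta team’s actual implementation: the flicker frequencies, sampling rate, and command mapping below are invented for the example, and the EEG epoch is synthetic. The idea is simply that the recorded signal carries the most power at the flicker frequency of whichever box the user is watching.

```python
import math
import random

# Hypothetical flicker frequencies (Hz) for each on-screen box; the real
# system's frequencies and commands are not published in this article.
COMMANDS = {7.0: "volume_up", 9.0: "volume_down",
            11.0: "next_song", 13.0: "previous_song"}

FS = 250  # assumed EEG sampling rate in Hz


def band_power(signal, freq, fs):
    """Power of `signal` at `freq`, via correlation with a sine/cosine pair
    (effectively a single DFT bin)."""
    n = len(signal)
    s = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    c = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return (s * s + c * c) / n


def classify(signal, fs=FS):
    """Return the command whose flicker frequency dominates the EEG epoch."""
    best = max(COMMANDS, key=lambda f: band_power(signal, f, fs))
    return COMMANDS[best]


# Synthetic one-second EEG epoch: an 11 Hz SSVEP response plus noise,
# standing in for a user staring at the "next song" box.
random.seed(0)
epoch = [math.sin(2 * math.pi * 11.0 * i / FS) + random.gauss(0, 0.5)
         for i in range(FS)]
print(classify(epoch))  # the 11 Hz component dominates, so this prints "next_song"
```

Real SSVEP classifiers are more elaborate (multiple electrodes, harmonics, canonical correlation analysis), but the frequency-matching principle is the same.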

For people who have become paralyzed due to a spinal injury, the normal flow of brain signals through the spinal cord to the muscles is disrupted. However, the cranial nerves are separate and link directly from the brain to certain areas of the body, bypassing the spine altogether. This particular brain-computer interface exploits one of these: the oculomotor nerve, which controls the eye’s movements. This means that even an individual with complete body paralysis can still move their eyes over images on a screen.

This cutting-edge brain-computer interface system could lead the way for similar user interfaces on tablets and smartphones. The concept could also be adapted to other applications, such as assistive living and climate control.

The BCI system was presented at the 6th International IEEE/EMBS Neural Engineering Conference in San Diego, California by team member Dr. Owen Falzon.

Commercial BCI Headsets

A number of companies have designed BCI systems for the everyday computer user. One such company is InteraXon, the creators of the Muse headband. Muse uses EEG sensors to monitor four different bands of brainwaves as the user focuses on different objects or activities. The readings are then sent to a tablet, laptop or smartphone. You can use Muse to monitor your brain activity while you study or play, and use that information to help improve your memory and reduce stress. Future applications include controlling music and playing games.

Emotiv is another personal interface for human-computer interaction, also based on EEG technology. It has 14 sensors and a wireless connection, giving the user a wide range of movement. As well as neuro-biofeedback applications, Emotiv can enhance gaming by measuring a player’s facial expressions and emotional state in real time. Other software applications let users manipulate virtual objects or create music and art using only their minds; for people with disabilities, Emotiv can support hands-free gaming, wheelchair control and mind-controlled keyboards.

Watch this space, because BCI systems are rapidly becoming more sophisticated. They have a broad spectrum of potential in the medical field, where they may be used to improve quality of life, allow communication for locked-in patients, and restore mobility to victims of trauma and stroke. Everyday uses may soon extend to hands-free control of smartphones, tablets, climate systems, TVs and even cars.
