According to a report from Cullen College of Engineering, Jose L. Contreras-Vidal, professor of electrical and computer engineering at the college, is pioneering a unique study on the brain’s reaction to art, which he hopes could help streamline brain-machine interface systems.
For the latest study, he has teamed up with conceptual artist Dario Robleto, who has an exhibition opening at The Menil Gallery in Texas next month. The exhibition, titled “The Boundary of Life Is Quietly Crossed,” gives visitors the option of wearing a non-invasive EEG headset while they view the art collection. As they leave, the headsets will be returned to researchers, who will examine the recorded brain signals. The study is designed to help researchers understand what happens in the brain when we look at art.
Once data has been collected from several hundred visitors, Contreras-Vidal will look for common patterns of brain activity. He and his team hope this will deepen their understanding of how thoughts function and help them map human neural networks. That, in turn, could help them develop a brain-machine interface (BMI) so streamlined that users will be able to move robotic prostheses without having to consciously think about it.
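The article does not describe the team's analysis methods, but the general idea of finding shared patterns across many visitors' EEG recordings can be sketched with standard signal processing. The snippet below is purely illustrative: the synthetic data, sampling rate, and frequency bands are assumptions, not details from the study. It fabricates recordings for a few hypothetical visitors, each containing a shared 10 Hz (alpha-band) rhythm buried in noise, and checks that the shared band's power stands out for every visitor.

```python
import numpy as np
from scipy.signal import welch

# Illustrative only: synthetic EEG-like recordings for hypothetical
# visitors. Parameters below are assumptions, not from the study.
rng = np.random.default_rng(0)
fs = 256                 # sampling rate (Hz), typical for consumer EEG headsets
n_visitors = 5
n_samples = fs * 10      # 10 seconds of signal per visitor

def band_power(signal, fs, low, high):
    """Mean power of `signal` in the [low, high] Hz band, via Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Fake each visitor's recording as noise plus a shared 10 Hz component,
# standing in for a "common pattern" across art viewers.
t = np.arange(n_samples) / fs
recordings = [np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, n_samples)
              for _ in range(n_visitors)]

# The shared alpha band should carry more power than a neighboring band
# in every visitor's recording.
alpha = [band_power(r, fs, 8, 12) for r in recordings]
beta = [band_power(r, fs, 20, 30) for r in recordings]
print(all(a > b for a, b in zip(alpha, beta)))  # prints True
```

Real group-level EEG analysis is far more involved (artifact rejection, many channels, statistics across subjects), but the core step of comparing spectral features across people is the same in spirit.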
Contreras-Vidal is widely known throughout the medical community for his work on BMI, specifically thought-controlled exoskeletons. His research focuses on four main areas: reverse engineering the brain, developing and enhancing rehabilitation robotics, using neural interfaces as tools to reverse-translate brain plasticity, and developing wearable exoskeletons to help people with severe motor disabilities.
Over the past few years, BMI research has led to a number of advances in assistive medical technology. In addition to mind-controlled prosthetic limbs and exoskeletons, we have seen systems that allow users to control assistive robots, communicate over the Internet, steer wheelchairs, and manipulate their external environment, such as lighting and temperature.
Image credit: Neurogadget