One of the amazing things about brain-computer interfaces (BCIs) is that they can pick up minute electrical signals from the brain and translate them into the movement of a prosthetic limb or a wheelchair, or even into speech. Researchers at BrainGate, an alliance of scientists, physicians, mathematicians and engineers, are studying the brain and developing cutting-edge neurotechnologies to aid people with neurologic diseases, trauma or limb loss.

Beata Jarosiewicz is an assistant professor of research in the Department of Neuroscience at Brown University. She joined BrainGate in January 2010. Jarosiewicz is currently investigating how neural plasticity can be manipulated to improve control of prosthetic devices. She knows better than most just how intricate the brain’s processes are, and how they can be exploited to create BCIs.

Like every cell in our bodies, neurons maintain an electrical potential across their membrane. Unlike other cells, neurons can rapidly flip that potential on the scale of a millisecond, and that spike of electrical activity – known as an action potential – can be recorded by placing a wire near the neuron. But how can action potentials be translated into digital information that can drive movement, say of a prosthetic limb?
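In practice, spikes are commonly pulled out of the raw electrode signal by detecting when the voltage crosses a threshold set relative to the background noise. The sketch below is a minimal, hypothetical illustration of that idea in Python – the function name and parameter values are illustrative, not BrainGate’s actual signal-processing pipeline:

```python
import numpy as np

def detect_spikes(voltage, fs, threshold_sd=-4.5, refractory_ms=1.0):
    """Flag samples where the signal crosses a negative threshold set at a
    multiple of the estimated noise standard deviation."""
    noise_sd = np.median(np.abs(voltage)) / 0.6745  # robust noise estimate
    threshold = threshold_sd * noise_sd
    # indices where the signal first drops below the threshold
    crossings = np.flatnonzero(
        (voltage[1:] < threshold) & (voltage[:-1] >= threshold)
    ) + 1
    # enforce a refractory period so one spike is not counted twice
    refractory = int(refractory_ms * fs / 1000)
    spikes, last = [], -refractory
    for idx in crossings:
        if idx - last >= refractory:
            spikes.append(idx)
            last = idx
    return np.array(spikes, dtype=int)
```

Counting the detected spikes in short time windows then gives the firing rates that the decoding step works with.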

“We have an array of electrodes placed in the motor cortex, which is the part of your brain that controls voluntary movement, and we find that neurons there will increase or decrease their firing rate with respect to certain plans or intended movements,” says Jarosiewicz.
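A classic way to picture this rate modulation is the cosine-tuning model of motor-cortex neurons (due to Georgopoulos and colleagues): a neuron fires above its baseline rate for intended movements toward its preferred direction, and below baseline for movements the opposite way. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

def firing_rate(theta, preferred, baseline=20.0, depth=15.0):
    """Cosine-tuning model: firing rate (spikes/s) of a motor-cortex neuron
    for an intended movement direction theta (radians), given the neuron's
    preferred direction. Baseline and modulation depth are illustrative."""
    return baseline + depth * np.cos(theta - preferred)
```

A neuron with preferred direction 0 would fire at 35 spikes/s for a rightward intended movement and only 5 spikes/s for a leftward one – exactly the kind of increase and decrease Jarosiewicz describes.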

What this means is that if a person merely imagines opening or closing their hand, a subset of neurons will increase their firing rate for each of those motions. Because scientists can monitor this activity, they can work out which imagined motions drive which subsets of neurons to fire. The system is calibrated by asking the person to imagine making certain movements while recording which neurons increase or decrease their activity. From this information, researchers build a mapping – in effect, a decoding algorithm – that can later translate neural activity into intended movement in real time.
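In its simplest form, that mapping can be thought of as a linear decoder fit by least squares: firing rates in, intended velocity out. The following is an illustrative sketch on simulated calibration data – not BrainGate’s actual algorithm, which uses more sophisticated decoders (such as Kalman filters), though the principle of fitting a map during calibration and applying it in real time is the same:

```python
import numpy as np

# Hypothetical calibration session: simulated firing rates of 20 neurons
# across 200 time bins, generated from known cued cursor velocities.
rng = np.random.default_rng(42)
n_neurons, n_bins = 20, 200
intended_velocity = rng.normal(size=(n_bins, 2))  # cued (vx, vy) per bin
tuning = rng.normal(size=(2, n_neurons))          # each neuron's directional tuning
rates = intended_velocity @ tuning + 0.1 * rng.normal(size=(n_bins, n_neurons))

# Calibration: least-squares fit of a map from firing rates back to velocity.
decoder, *_ = np.linalg.lstsq(rates, intended_velocity, rcond=None)

# "Real-time" decoding: any new pattern of rates yields an intended velocity.
decoded = rates @ decoder
```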

Given the complexity of the brain, the neural activity evoked by such thoughts varies from individual to individual, and can even vary from day to day in the same individual, partly because the brain shifts slightly relative to the implanted electrodes. So the calibration has to be repeated at least daily.

“It’s really not that big of a deal. We just have the person imagine they’re making movements for a couple of minutes and this is sufficient to get a good initial estimate of that mapping from neural patterns. Then we can keep refining that as they actually control their computer cursor or robotic arm,” explains Jarosiewicz.
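The ongoing refinement Jarosiewicz mentions can be pictured as a small corrective update whenever the decoder’s output disagrees with the movement the user is presumed to intend (for example, straight toward the current target). A hypothetical gradient-step sketch, not the lab’s actual method:

```python
import numpy as np

def refine(decoder, rates, intended, lr=0.02):
    """One online refinement step: nudge the decoder weights so that this
    time bin's firing rates decode closer to the presumed intended movement.

    decoder:  (n_neurons, 2) weights mapping rates to 2-D cursor velocity
    rates:    (n_neurons,) firing rates in this bin
    intended: (2,) presumed intended velocity (e.g., toward the target)
    """
    error = rates @ decoder - intended            # decoding error this bin
    return decoder - lr * np.outer(rates, error)  # gradient step on squared error
```

Repeated over many bins of use, such updates would keep the decoder aligned with the neural activity even as it drifts from day to day.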

It is relatively easy to see how imagining moving a body part can produce a pattern of neural activity that can be translated into real-time movement of a prosthetic arm. But when it comes to controlling a computer cursor, the correspondence is less obvious. So where does that data come from?

“When an able-bodied person is controlling a cursor on a computer screen, externally they are doing that by moving a mouse around and tapping their finger on the clicker. So we can ask a person to imagine doing that, or we can have them imagine they’re moving a finger on a track pad, or anything that feels intuitive to them,” says Jarosiewicz.

This form of BCI technology has far-reaching potential. As research continues and neural maps become more detailed, it could become possible to produce much more complex movements with prosthetics, or even with other devices, using thought alone. Imagine being immersed in a virtual-reality game where you can make your avatar jump, run, shoot and point merely by thinking, or being able to drive a car or operate a machine simply by imagining that you are doing so. According to Jarosiewicz, studies have shown that once a person develops a strong relationship with their prosthetic device or the virtual object they are controlling, they no longer have to imagine any body movements but are able to control it directly. This relationship, known as “embodiment,” could, in the future, even allow people to control several additional arms just by thinking about them.
