Looking Beyond The Horizon

Innovative Technologies & Services

Matching brain signals with prosthetics – algorithmically

Posted by evolvingwheel on October 4, 2007

I remember the robot from Will Smith's I, Robot (2004), an android that moves its arm with remarkable degrees of freedom. The machine translates its intentions by communicating effectively with its mechanical appendages: brain (central intelligence system) signals are decoded and converted into mechanical actions. A researcher at MIT has embarked on a project to create such movements in artificial prosthetics by decoding neural commands from the brain.

Laxminarayan Srinivasan has developed an algorithm that enables a prosthetic device to move according to neural signals [read article here]. People who lose arms or limbs to accidents or paralysis are still able to form and manifest intentions in the brain. The challenge is to interpret those intentions, which originate as neural signals, match them with the intended mode of action, and then make the prosthetic device operate accordingly. Srinivasan and his team have developed an algorithm that matches such recorded signals against different archived mechanical actions and then instructs the machine to act. Presumably, much work remains in understanding the nature of the neural transmission associated with the movements of our arms and limbs. The algorithm processes the signal modalities, with all their subtle variations in stimulation, and then connects the command with the appropriate action code. This requires a highly robust library of actions and a very sensitive, precise recorder of signals.
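To make the idea concrete, here is a minimal sketch of that kind of matching, not Srinivasan's actual algorithm: a recorded neural feature vector is compared against an archived library of action templates, and the prosthetic executes the closest match. The action names and feature vectors below are invented for illustration.

```python
import numpy as np

# Hypothetical archived library: action name -> representative
# neural feature vector (values invented for this sketch).
ACTION_LIBRARY = {
    "reach_forward": np.array([0.9, 0.1, 0.2]),
    "grasp":         np.array([0.2, 0.8, 0.1]),
    "rotate_wrist":  np.array([0.1, 0.3, 0.9]),
}

def decode_action(neural_signal):
    """Return the archived action whose template is nearest
    (by Euclidean distance) to the recorded neural signal."""
    return min(
        ACTION_LIBRARY,
        key=lambda action: np.linalg.norm(ACTION_LIBRARY[action] - neural_signal),
    )

# A simulated recorded signal, close to the "reach_forward" template:
signal = np.array([0.85, 0.15, 0.25])
print(decode_action(signal))  # → reach_forward
```

A real decoder would of course work with noisy, high-dimensional recordings and probabilistic models rather than a three-element lookup, but the core loop, record, compare against the library, dispatch the matched action, is the same.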

With a very difficult task at hand, Srinivasan aspires to build a unifying model of decoding in the coming years.

This kind of work will one day lead to artificial movements very close to natural ones. The difference between science fiction and reality is TIME. As we develop smart interfaces and recording devices for neural signals, and learn to interpret their messages, we will come closer to understanding the motor behaviors behind them. My forecast is that over the next decade industry will focus on developing such interfaces and creating small prosthetics that use AI to learn and refine actions from recorded signals. A burgeoning area of bio-engineering.
