by Caroline Morley
Jianhui the monkey manipulates objects with his hands and gets a drink as a reward. Unknown to him, not far away a robot hand mirrors his fingers’ movements as it receives instructions from the chips implanted in his brain.
Zheng Xiaoxiang of the Brain-Computer Interface Research Team at Zhejiang University in Zijingang, China, and colleagues announced earlier this week that they had succeeded in capturing and deciphering the signals from the monkey’s brain and translating them into real-time robotic finger movements.
The two sensors implanted in Jianhui’s brain monitor just 200 neurons in his motor cortex, Zheng says. Even so, this was enough to accurately interpret the monkey’s movements and control the robotic hand.
Humans have already used brain-implanted electrodes to control prosthetic arms, but Zheng says this research captures the finer movements of individual fingers.
“Hand moves are associated with at least several hundreds of thousands of neurons,” she said. “We now decipher the moves based on the signals of about 200 neurons. Of course, the orders we produced are still distant from the truly flexible finger moves in complexity and fineness.”