Paralysed man feeds himself for first time in 30 years with robot arms plugged into brain

Man unable to use his fingers for three decades eats dessert in 90 seconds

Anthony Cuthbertson
Thursday 30 June 2022 09:21 BST
Researchers at Johns Hopkins connected a robotic arm to a paralysed person’s brain, allowing the patient to feed themselves by thinking of commands (Johns Hopkins Applied Physics Laboratory)

A partially paralysed man has been able to feed himself through a brain-computer interface connected to a robotic arm.

Researchers at Johns Hopkins Applied Physics Laboratory in the US built a two-arm system that allowed the man to manipulate a knife and fork to cut food and bring it to his mouth.

The man, who has not been able to use his fingers in about 30 years, was able to eat dessert using his mind in less than 90 seconds.

“Although our results are preliminary, we are excited about giving users with limited capability a true sense of control over increasingly intelligent assistive machines,” said Dr Francesco Tenore, a senior project manager in APL’s Research and Exploratory Development department.

Advances in brain-computer interfaces, also referred to as brain-machine interfaces, have occurred rapidly in recent years. The technology holds near-term promise for transforming the lives of paralysed people, as well as those impacted by neurological disorders.

They come in a variety of forms – from brain implants to external sensors – but essentially work by decoding neural signals and translating them into external functions, from moving a computer cursor to controlling a robot.
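
As a rough illustration only – this is not the Johns Hopkins team’s actual software – the decoder at the heart of such a system can be thought of as a function that maps a snapshot of recorded neural activity to a command for the machine. The short Python sketch below assumes a hypothetical, already-trained linear classifier over 96 recording channels and a made-up set of command labels.

import numpy as np

# Hypothetical command set and pre-trained decoder weights (illustrative only)
COMMANDS = ["rest", "reach", "grasp", "cut", "bring_to_mouth"]
rng = np.random.default_rng(0)
W = rng.normal(size=(len(COMMANDS), 96))  # one weight vector per command, 96 recorded channels
b = np.zeros(len(COMMANDS))

def decode(neural_sample: np.ndarray) -> str:
    """Map one 96-channel sample of neural activity to the most likely command."""
    scores = W @ neural_sample + b
    return COMMANDS[int(np.argmax(scores))]

# Example: feed a simulated sample of neural activity through the decoder
sample = rng.normal(size=96)
print(decode(sample))  # prints one of the command labels, e.g. "grasp"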

The research team at Johns Hopkins used two arrays of 96 channels and two arrays of 32 channels to control the robotic arms, a relatively small number compared to brain-computer interfaces being developed elsewhere.

Devices built by Elon Musk’s Neuralink startup utilise thousands of channels, with the tech billionaire hoping to one day allow humans to compete with advanced forms of artificial intelligence (AI).

The team at Johns Hopkins are already working on the next iteration of the system, which could allow amputees to transform feelings of a phantom limb into real-world movements of a robotic prosthetic.

“This research is a great example of this philosophy where we knew we had all the tools to demonstrate this complex bimanual activity of daily living that non-disabled people take for granted,” Dr Tenore said.

“Many challenges still lie ahead, including improved task execution, in terms of both accuracy and timing, and closed-loop control without the constant need for visual feedback.”

A paper detailing the research at Johns Hopkins was published in the journal Frontiers in Neurorobotics.
