Scientists create robotic arm that can be moved using your imagination



Researchers at the University of California, San Francisco, have enabled a paralysed man to reliably control a robotic arm using signals from his brain, relayed through a computer.

He was able to grasp, move, and release objects simply by imagining himself performing the actions. The device, known as a brain-computer interface (BCI), functioned successfully for a record seven months without requiring any adjustments.

Until now, such devices had only worked for a day or two.
This BCI relies on an artificial intelligence (AI) model that adapts to small changes in brain activity as a person repeatedly imagines a movement, gradually improving its accuracy.

“This blending of learning between humans and AI is the next phase for these brain-computer interfaces,” said Professor Karunesh Ganguly, a neurologist at the UCSF Weill Institute for Neurosciences. “It’s what we need to achieve sophisticated, lifelike function.”

The study, funded by the US National Institutes of Health, was published on 6 March in the journal Cell.

One of the study participants, who lost the ability to move and speak following a stroke years ago, can now control the robotic arm by imagining specific movements.

The key breakthrough involved understanding how brain activity shifts from day to day when the participant repeatedly imagines making these movements.

Once the AI system was trained to account for these changes, it maintained performance for months at a time.
Professor Ganguly previously studied brain activity patterns in animals and observed that these patterns evolved as the animals learned new movements.

He suspected the same process was occurring in humans, which explained why earlier BCIs quickly lost their ability to interpret brain signals.

Ganguly and Dr. Nikhilesh Natraj, a neurology researcher, worked with a participant who had been paralysed by a stroke and could neither move nor speak.
The participant had tiny sensors implanted on the surface of his brain to detect neural activity when he imagined moving.

To investigate whether these brain patterns changed over time, the participant was asked to imagine moving different body parts, such as his hands, feet, and head.

While he could not physically move, his brain continued to generate signals corresponding to these imagined movements.

The BCI recorded these signals and found that while the general patterns remained the same, their precise locations in the brain shifted slightly each day.
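The idea of a stable pattern whose location drifts slightly each day can be illustrated with a toy sketch. The code below is not the study's actual model; it is a hypothetical numpy example in which a fixed "activity bump" shifts position between simulated recording days, and a simple cross-correlation against a day-one template recovers the drift so a decoder could compensate.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_recording(shift, n=200):
    """Simulate one day's recording: a Gaussian bump of activity
    centred at position 100 + shift, plus a little noise."""
    x = np.arange(n)
    bump = np.exp(-0.5 * ((x - (100 + shift)) / 5.0) ** 2)
    return bump + 0.01 * rng.standard_normal(n)

# The "template" is the pattern learned on day one.
template = make_recording(shift=0)

def estimate_shift(recording, template):
    """Find the offset that best aligns today's recording with the
    stored template (argmax of the full cross-correlation)."""
    corr = np.correlate(recording, template, mode="full")
    return int(np.argmax(corr)) - (len(template) - 1)

# On later simulated "days" the pattern drifts by a few positions;
# the estimator recovers each drift from the data alone.
estimates = {s: estimate_shift(make_recording(s), template)
             for s in (2, -3, 5)}
print(estimates)
```

The general shape of the pattern stays the same from day to day; only its position moves, which is why a single learned template plus a small daily correction is enough to keep the toy decoder aligned.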

The researchers then asked the participant to imagine simple finger, hand, and thumb movements over two weeks while the AI system learned to interpret his brain activity. Initially, the robotic arm’s movements were imprecise.

To improve accuracy, the participant practised using a virtual robotic arm that provided feedback on how closely his imagined movements matched the intended actions.
Eventually, he was able to get the virtual arm to perform the desired tasks. When he moved on to the real robotic arm, only a few sessions were needed to transfer his skills to the real world, and he was soon using the arm to pick up blocks, turn them, and move them to new locations.

He was even able to open a cabinet, retrieve a cup, and hold it under a water dispenser. Months later, he could still control the robotic arm after a brief 15-minute “tune-up” to adjust for changes in his brain activity over time.

Ganguly and his team are now working to refine the AI model to make the robotic arm move faster and more smoothly. They also plan to test the system in a home environment. For people with paralysis, the ability to perform simple tasks like feeding themselves or getting a drink of water could be life-changing.

“I am very confident that we have learned how to build the system now, and that we can make this work,” Ganguly said.
