Cornell’s Maryam Shanechi, Harvard’s Ziv Williams and colleagues developed a cortical-spinal prosthesis that directs “targeted movement” in paralyzed limbs. The prosthesis connects two subjects: it records neural activity from one subject and uses it to control limb movements in a different subject that is temporarily sedated.
The brain-machine interface (BMI) is based on a set of real-time decoding algorithms that predict the intended movement target from neural signals. In the experiment, one animal acted as the controller of the movement, or the “master”: it “decided” which target location to move to and generated the neural activity that was decoded into that intended movement. The decoded movement then directly controlled the limb of the other animal through electrical stimulation of its spinal cord.
The researchers focused on decoding the target endpoint of the movement, as opposed to its detailed kinematics. This allowed them to match the decoded target with a set of spinal stimulation parameters that generated limb movement toward that target. They demonstrated that the alert animal could produce two-dimensional movement in the sedated animal’s limb.
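The decode-then-stimulate pipeline described above can be pictured as a two-stage lookup: classify the firing-rate pattern into one of a small set of target endpoints, then retrieve the pre-calibrated stimulation parameters for that target. The sketch below is purely illustrative; the target names, centroid values, and parameter fields are invented for the example and are not from the study, which used its own real-time decoding algorithms.

```python
import numpy as np

# Illustrative centroids: mean firing rates (spikes/s) per target endpoint.
# These values and target names are hypothetical.
TARGET_CENTROIDS = {
    "upper_left":  np.array([42.0, 10.0, 25.0]),
    "upper_right": np.array([12.0, 40.0, 22.0]),
    "lower_left":  np.array([30.0, 15.0, 48.0]),
    "lower_right": np.array([8.0, 33.0, 50.0]),
}

# Hypothetical pre-calibrated spinal stimulation settings per target.
STIM_PARAMS = {
    "upper_left":  {"electrode": 1, "amplitude_ua": 120, "freq_hz": 40},
    "upper_right": {"electrode": 2, "amplitude_ua": 110, "freq_hz": 40},
    "lower_left":  {"electrode": 3, "amplitude_ua": 130, "freq_hz": 50},
    "lower_right": {"electrode": 4, "amplitude_ua": 125, "freq_hz": 50},
}

def decode_target(rates: np.ndarray) -> str:
    """Nearest-centroid classification of the intended target endpoint."""
    return min(TARGET_CENTROIDS,
               key=lambda t: np.linalg.norm(rates - TARGET_CENTROIDS[t]))

def stimulation_for(rates: np.ndarray) -> dict:
    """Map the decoded target directly to stimulation parameters."""
    return STIM_PARAMS[decode_target(rates)]

# A firing-rate vector close to the "upper_left" centroid.
print(stimulation_for(np.array([40.0, 12.0, 27.0])))
```

The point of decoding only the endpoint, as the researchers did, is visible in the structure: the decoder’s output space is a handful of discrete targets, so each one can be paired in advance with stimulation settings known to drive the limb toward it.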
The experiment used two different animals, rather than a single animal with a temporarily paralyzed limb. The scientists contend that this provided a true model of paralysis, since the master animal’s brain and the sedated animal’s limb had no physiological connection, as is the case for a paralyzed patient.
In the ongoing effort to improve the dexterity of prosthetics, a recent trial showed an improved sense of touch and control over a prosthetic hand. EPFL professor Silvestro Micera and colleagues surgically attached electrodes from a robotic hand to a volunteer’s median and ulnar nerves. Those nerves carry sensations that correspond with the volunteer’s index finger and thumb, and with his pinky finger and the edge of his hand, respectively. The volunteer controlled the prosthetic with small muscle movements detected by surface electromyography (sEMG), a non-invasive method that measures electrical signals through the skin.
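A common way to turn sEMG activity into a discrete command, of the kind the grip tests below require, is to rectify the raw trace, smooth it into an envelope, and threshold the peak activation into levels. The sketch below is a generic illustration of that idea, not the study’s method; the thresholds, window length, and simulated signal are all invented for the example.

```python
import numpy as np

def emg_envelope(raw: np.ndarray, window: int = 50) -> np.ndarray:
    """Full-wave rectify the raw sEMG trace, then smooth with a moving average."""
    rectified = np.abs(raw)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def grip_level(envelope_peak: float) -> str:
    """Map peak muscle activation to a discrete grip command.
    Threshold values are illustrative, not from the study."""
    if envelope_peak < 0.2:
        return "light"
    if envelope_peak < 0.6:
        return "medium"
    return "hard"

# Simulated noise standing in for a muscle burst; a real pipeline would
# use amplified, band-pass-filtered skin-surface recordings.
rng = np.random.default_rng(0)
signal = 0.5 * rng.standard_normal(1000)
print(grip_level(emg_envelope(signal).max()))
```

The three output labels mirror the light, medium, and hard grips tested over the seven-day trial described next.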
Over seven days, the volunteer was asked to grasp objects with a light grip, a medium grip, and a hard grip, and to evaluate the shape and stiffness of three kinds of objects. During 710 tests, he wore a blindfold and earphones so that he could not use sight or sound to guide the prosthetic. The researchers also sometimes turned off the sensory feedback to test whether he was relying on timing, rather than touch, to modulate his grip.
The subject was able to complete the requested tasks with his prosthetic thumb and index finger 67 percent of the time on the first day and 93 percent of the time by the seventh day of the experiment. His pinky finger was harder to control: he accomplished the requested grip only 83 percent of the time. In both the grip-strength tests and the stiffness tests, the volunteer made mistakes with the medium setting and the medium object, but he never confused the softest and hardest objects. The ability to modulate his grip strength is this study’s main advance over previous work by the same group.