Category Archives: Prosthetics

Deep learning driven prosthetic hand + camera recognize, automate required grips


Ghazal Ghazaei and Newcastle University colleagues have developed a deep learning driven prosthetic hand + camera system that allows wearers to reach for objects automatically. Current prosthetic hands are controlled via a user’s myoelectric signals, requiring learning, practice, concentration and time.

A convolutional neural network was trained with images of 500 graspable objects and taught to recognize the grip needed for each. Objects were grouped by size, shape, and orientation, and the hand was programmed to perform four different grasps to accommodate them: palm wrist neutral (to pick up a cup); palm wrist pronated (to pick up the TV remote); tripod (thumb and two fingers); and pinch (thumb and first finger).
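The grouping described here amounts to a four-class image classification problem. Below is a minimal sketch of that idea; the network architecture, image size, and training data are illustrative assumptions, not details of the Newcastle system.

```python
# Hypothetical sketch: a small CNN that maps object images to one of the
# four grasp classes described above. Architecture and hyperparameters are
# illustrative assumptions, not the Newcastle group's actual model.
import torch
import torch.nn as nn

GRASPS = ["palm_wrist_neutral", "palm_wrist_pronated", "tripod", "pinch"]

class GraspNet(nn.Module):
    def __init__(self, n_classes=len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                      # x: (batch, 3, H, W) object images
        return self.classifier(self.features(x).flatten(1))

model = GraspNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on random stand-in data
# (real training would use labeled images of the ~500 graspable objects).
images = torch.randn(8, 3, 128, 128)
labels = torch.randint(0, len(GRASPS), (8,))
loss = loss_fn(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```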

The hand’s camera takes a picture of the object in front of it, assesses its shape and size, picks the most appropriate grasp, and triggers a series of hand movements, all within milliseconds.
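At run time the classifier sits in a simple sense, decide, act loop. The sketch below (reusing the GraspNet model and GRASPS list from the sketch above) shows that flow; read_camera_frame and execute_grasp are hypothetical placeholders, not the device’s actual API.

```python
# Hypothetical run-time loop: capture a frame, classify the required grasp,
# and trigger the corresponding hand motion. The I/O functions are stand-ins.
import torch

def read_camera_frame():
    """Placeholder for the hand-mounted camera; returns a (1, 3, H, W) tensor."""
    return torch.randn(1, 3, 128, 128)

def execute_grasp(grasp_name):
    """Placeholder for the hand controller that performs the chosen grasp."""
    print(f"executing grasp: {grasp_name}")

model.eval()
with torch.no_grad():
    frame = read_camera_frame()                  # picture of the object in front of the hand
    scores = model(frame)                        # assess shape/size via the trained CNN
    grasp = GRASPS[scores.argmax(dim=1).item()]  # pick the most appropriate grasp
execute_grasp(grasp)                             # trigger the hand movement sequence
```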

In a small study of the technology, subjects picked up and moved objects with an 88 percent success rate.

The work is part of an effort to develop a bionic hand that senses pressure and temperature, and transmits the information to the brain.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda

Solar powered, highly sensitive, graphene “skin” for robots, prosthetics


Professor Ravinder Dahiya, at the University of Glasgow, has created a robotic hand with solar-powered graphene “skin” that he claims is more sensitive than a human hand.  The flexible, tactile, energy autonomous “skin” could be used in health monitoring wearables and in prosthetics, reducing the need for external chargers. (Dahiya is now developing a low-cost 3-D printed prosthetic hand incorporating the skin.)

Click to view University of Glasgow video


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Roz Picard, Tom Insel, John Rogers, Jamshid Ghajar and  Nathan Intrator – September 19, 2017 at the MIT Media Lab

 

Thought controlled prosthetic arm has human-like movement, strength


This week at the Pentagon, Johnny Matheny unveiled his DARPA-developed prosthetic arm. The mind-controlled prosthesis has the same size, weight, shape and grip strength as a human arm, and, according to Matheny, can do anything a natural arm can do.

It is, by all accounts, the most advanced prosthetic limb created to date.

The 100-sensor arm was developed as part of the Revolutionizing Prosthetics program of DARPA’s Biological Technologies Office, led by Dr. Justin Sanchez.

An implanted neural interface allows the wearer to control the arm with his thoughts. Sensors are also implanted in the fingertips, sending signals back to the brain, allowing users to feel sensations.
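The description above amounts to a bidirectional loop: decode intent from the neural interface, drive the arm, then return fingertip pressure as stimulation. The sketch below is only a schematic of that data flow; every function in it is a hypothetical stand-in, not the actual DARPA/APL software.

```python
# Schematic of the bidirectional loop described above. All interfaces here
# (decode_intent, drive_arm, read_fingertip_sensors, stimulate_nerves) are
# hypothetical placeholders, not the real system's APIs.
import random
import time

def decode_intent(neural_samples):
    """Map recorded neural activity to a desired arm/hand command (stub)."""
    return {"joint_target_deg": sum(neural_samples) % 90, "grip_force": 0.5}

def drive_arm(command):
    """Send the decoded command to the prosthetic arm's motors (stub)."""
    print("arm command:", command)

def read_fingertip_sensors():
    """Read pressure values from fingertip sensors, simplified here to 5 values."""
    return [random.random() for _ in range(5)]

def stimulate_nerves(pressures):
    """Encode fingertip pressure as feedback stimulation to the user (stub)."""
    print("feedback stimulation levels:", [round(p, 2) for p in pressures])

for _ in range(3):                                  # a few iterations of the control loop
    neural_samples = [random.gauss(0, 1) for _ in range(64)]
    drive_arm(decode_intent(neural_samples))        # thought -> movement
    stimulate_nerves(read_fingertip_sensors())      # touch -> sensation
    time.sleep(0.01)
```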

Click to view Johns Hopkins video.

Dr. Sanchez will be a keynote speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech NYC conference on June  7-8, 2016 at the New York Academy of Sciences.

Bionic finger, implanted electrodes, enable amputee to “feel” texture


Swiss Federal Institute of Technology and Scuola Superiore Sant’Anna researchers have developed  a bionic fingertip that allows amputees to feel textures and  differentiate between rough and smooth surfaces.

Electrodes were surgically implanted into the upper arm of a man whose arm had been amputated below the elbow. A machine moved an artificial finger, wired with electrodes, across smooth and rough lines on a plastic strip. The fingertip movement generated an electrical signal, which was translated into a series of electrical spikes sent to the brain. The spikes mimicked the language of the nervous system and created the sensation of touch.
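The key step, turning a continuous sensor signal into spikes that resemble the nervous system’s own pulses, can be illustrated with simple rate coding: a stronger signal produces more frequent spikes. The sketch below is a generic illustration of that idea, not the actual EPFL/Sant’Anna encoding algorithm.

```python
# Generic rate-coding illustration: a stronger sensor signal is translated
# into more frequent spikes, loosely mimicking how the nervous system
# signals texture. This is not the actual study's encoding algorithm.
import math

def sensor_trace(duration_s=1.0, dt=0.001):
    """Simulated fingertip signal: amplitude varies as ridges pass under the sensor."""
    return [abs(math.sin(2 * math.pi * 20 * i * dt)) for i in range(int(duration_s / dt))]

def encode_spikes(trace, dt=0.001, max_rate_hz=300.0):
    """Convert signal amplitude (0..1) to spike times via an amplitude-dependent rate."""
    spikes, accumulator = [], 0.0
    for i, amplitude in enumerate(trace):
        accumulator += amplitude * max_rate_hz * dt   # expected spikes this time step
        if accumulator >= 1.0:                        # emit a spike, carry the remainder
            spikes.append(i * dt)
            accumulator -= 1.0
    return spikes

spike_times = encode_spikes(sensor_trace())
print(f"{len(spike_times)} spikes in 1 s; first few: {[round(t, 3) for t in spike_times[:5]]}")
```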

The subject, Aabo Sørensen, said: “When the scientists stimulate my nerves I could feel the vibration and sense of touch in my phantom index finger. [It] is quite close to when you feel it with your normal finger you can feel the coarseness of the plates and different gaps and ribs.”

Click to view EPFL video.


 

Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center
NeuroTech San Francisco  – April 6, 2016 @ the Mission Bay Conference Center
Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences
NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Mind controlled prosthetic fingers


Johns Hopkins researchers have developed a proof-of-concept for a prosthetic arm with fingers that, for the first time, can be controlled with a wearer’s thoughts.

The technology was tested on an epileptic patient who was not missing any limbs. The researchers used brain mapping technology to bypass control of his own arms and hands and drive the prosthesis directly from brain signals. (The patient was already scheduled for a brain mapping procedure.) Brain electrical activity was measured for each finger.

This was an invasive procedure, which required implanting an array of 128 electrode sensors, on a sheet of film, in the part of the brain that controls hand and arm movement. Each sensor measured a circle of brain tissue 1 millimeter in diameter.

After compiling the motor and sensory data, the arm was programmed to allow the patient to move individual fingers based on which part of his brain was active.

The team said that the prosthetic was initially 76 percent accurate, and when they combined the signals for the ring and pinkie fingers, accuracy increased to 88 percent.
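The accuracy gain from combining ring and pinkie is what one would expect when two classes have overlapping neural signatures: merging their labels removes the confusion between them. The toy decoder below illustrates this with synthetic electrode features and a nearest-centroid classifier; it is not the Johns Hopkins pipeline, and its numbers are not the study’s.

```python
# Toy illustration of per-finger decoding and of merging easily-confused
# classes (ring + pinkie). Synthetic electrode features; not the actual
# Johns Hopkins ECoG pipeline.
import numpy as np

rng = np.random.default_rng(0)
FINGERS = ["thumb", "index", "middle", "ring", "pinkie"]

# Each finger gets a mean "electrode" activation pattern; ring and pinkie
# are deliberately given nearly identical patterns so they get confused.
means = rng.normal(size=(5, 128))
means[4] = means[3] + rng.normal(scale=0.1, size=128)

def sample(label, n=200, noise=1.0):
    """Draw n noisy trials of the given finger's activation pattern."""
    return means[label] + rng.normal(scale=noise, size=(n, 128)), np.full(n, label)

data = [sample(i) for i in range(len(FINGERS))]
X = np.vstack([d[0] for d in data])
y = np.concatenate([d[1] for d in data])

# Train/test split and nearest-centroid classification over the 5 finger classes.
idx = rng.permutation(len(X))
train, test = idx[:800], idx[800:]

def fit_centroids(labels, classes):
    return np.stack([X[train][labels[train] == c].mean(axis=0) for c in classes])

def predict(samples, centroids):
    d = ((samples[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

acc5 = (predict(X[test], fit_centroids(y, range(5))) == y[test]).mean()

# Merge ring and pinkie into a single class (label 3) and re-score.
y_merged = np.where(y == 4, 3, y)
acc4 = (predict(X[test], fit_centroids(y_merged, range(4))) == y_merged[test]).mean()
print(f"5-class accuracy: {acc5:.2f}; after merging ring+pinkie: {acc4:.2f}")
```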

Click to view Johns Hopkins video.


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

Self-healing sensor improves “electronic skin”


Hossam Haick and Technion colleagues are developing materials, to be integrated into flexible electronics, that mimic the healing properties of human skin. The goal is to quickly repair incidental scratches or damaging cuts that might compromise device functionality. The synthetic polymer can “heal” electronic skin in one day, which could improve the materials used to provide a sense of touch in prosthetics.

The new sensor consists of a self-healing substrate, high-conductivity electrodes, and molecularly modified gold nanoparticles. The researchers noted that “the healing efficiency of this chemiresistor is so high that the sensor survived several cuttings at random positions.”

WEARABLE TECH + DIGITAL HEALTH SAN FRANCISCO – APRIL 5, 2016 @ THE MISSION BAY CONFERENCE CENTER

NEUROTECH SAN FRANCISCO – APRIL 6, 2016 @ THE MISSION BAY CONFERENCE CENTER

Sensors allow more natural sense of touch in prosthetics


Stanford’s Zhenan Bao is developing technology that could restore a more natural sense of touch in prosthetics.  Her flexible, thin plastic sensors send signals to the brain that more closely resemble nerve messages of human skin touch sensors.

The disruptive technology has not yet been tested on humans, and researchers still need to find a safe way to pass electrical signals from prostheses to the brain for long periods.

Many teams are working toward this (see ApplySci coverage from 2013-2015). Previous tactile sensors, however, have been analogue devices, in which more pressure produces a stronger electrical signal rather than a more frequent stream of pulses. Those signals must then be sent to a separate processing chip that converts their strength into a digital stream of pulses, which is only then passed on to peripheral nerves or brain tissue. Bao’s sensors send digital signals directly.
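The contrast between the two encodings, an amplitude that grows with pressure versus pulses that become more frequent with pressure, fits in a few lines. The sketch below is a generic illustration; the gains and rates are made-up parameters, not Bao’s sensor electronics.

```python
# Generic illustration of the two encodings contrasted above: an analog
# sensor reports a stronger signal under more pressure, while a digital
# pulse encoding reports more frequent pulses. Not Bao's actual circuit.

def analog_output(pressure_kpa, gain_mv_per_kpa=2.0):
    """Analog-style reading: output amplitude (mV) grows with pressure."""
    return gain_mv_per_kpa * pressure_kpa

def pulse_frequency(pressure_kpa, base_hz=5.0, hz_per_kpa=1.5):
    """Digital-style reading: pulse rate (Hz) grows with pressure."""
    return base_hz + hz_per_kpa * pressure_kpa

for pressure in (0.0, 10.0, 50.0, 100.0):       # light touch to firm press
    print(f"{pressure:6.1f} kPa -> {analog_output(pressure):7.1f} mV amplitude "
          f"| {pulse_frequency(pressure):6.1f} Hz pulse rate")
```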

Click to view Stanford University video.

WEARABLE TECH + DIGITAL HEALTH SAN FRANCISCO – APRIL 5, 2016 @ THE MISSION BAY CONFERENCE CENTER

NEUROTECH SAN FRANCISCO – APRIL 6, 2016 @ THE MISSION BAY CONFERENCE CENTER

Biocompatible neural prosthetics


Spinal injury patients, and those who have lost limbs, sometimes have neural prosthetic devices implanted in an attempt to regain independence. Such devices are used for deep brain stimulation and for brain-controlled external prosthetics. However, neural prosthetics are often rejected by the immune system, and can fail because of the mismatch between soft brain tissue and rigid devices.

University of Pennsylvania‘s Mark Allen and colleagues have created an implantable neural prosthetic device that is biocompatible and replaces the silicon and noble metals of conventional implants. The goal is to avoid immune-system rejection, failures due to tissue strain, neurodegeneration, and decreased fidelity of recorded neural signals.

“Lifelike” bionic hand for women and teenagers


bebionic by steeper is a small,  “lifelike” bionic hand created for women and teenagers.  It is designed around an accurate skeletal structure with 337 mechanical parts.  Its 14  grip patterns and hand positions mimic real hand functions.

Its first user, Nicky Ashwell, was born without a right hand. After being fitted with the prosthetic, she was able to ride a bicycle and lift weights for the first time.

The prosthetic’s sensors are triggered by the user’s muscle movements and relay signals to microprocessors that drive individual motors in each finger. It weighs 1 pound and is 6.4 inches long.
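That control chain, myoelectric sensors feeding a microprocessor that drives a motor in each finger, can be sketched as a simple threshold-based grip selector. Everything below (channel names, thresholds, the grip table) is an illustrative assumption, not bebionic firmware.

```python
# Illustrative myoelectric control chain: two EMG channels select a grip
# pattern, and the chosen pattern sets a target for each finger motor.
# Only a few of the hand's 14 patterns are shown; all values are assumptions.

GRIP_PATTERNS = {
    # grip name -> target closure (0.0 open .. 1.0 closed) for each of 5 finger motors
    "power":  [1.0, 1.0, 1.0, 1.0, 1.0],
    "tripod": [0.8, 0.8, 0.8, 0.0, 0.0],
    "pinch":  [0.9, 0.9, 0.0, 0.0, 0.0],
    "open":   [0.0, 0.0, 0.0, 0.0, 0.0],
}

def select_grip(emg_flexor, emg_extensor, threshold=0.3):
    """Pick a grip from two rectified EMG channel amplitudes (0..1)."""
    if emg_extensor > threshold:        # extensor signal opens the hand
        return "open"
    if emg_flexor > 2 * threshold:      # strong flexion -> full power grip
        return "power"
    if emg_flexor > threshold:          # moderate flexion -> tripod
        return "tripod"
    if emg_flexor > threshold / 2:      # light flexion -> pinch
        return "pinch"
    return "open"                       # no signal -> stay open

def drive_finger_motors(targets):
    """Stand-in for commanding the individual finger motors."""
    print("finger closure targets:", targets)

grip = select_grip(emg_flexor=0.45, emg_extensor=0.05)
drive_finger_motors(GRIP_PATTERNS[grip])
```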

WEARABLE TECH + DIGITAL HEALTH NYC 2015 – JUNE 30 @ NEW YORK ACADEMY OF SCIENCES.  REGISTER HERE.

Intent controlled robotic arm with neuroprosthetic implant


Caltech and Keck researchers implanted neuroprosthetics in a part of the brain that controls the intent to move, with the goal of producing more natural and fluid motions.   The study, published in Science, was led by Richard Andersen.  A quadriplegic implanted with the device was able to perform a fluid handshaking gesture and  play “rock, paper, scissors” using a separate robotic arm.

Andersen and colleagues improved the versatility of movement that a neuroprosthetic can offer by recording signals from the posterior parietal cortex (PPC). He said: “The PPC is earlier in the pathway (than the motor cortex, a target of earlier neuroprosthetics), so signals there are more related to movement planning—what you actually intend to do—rather than the details of the movement execution. We hoped that the signals from the PPC would be easier for the patients to use, ultimately making the movement process more intuitive. Our future studies will investigate ways to combine the detailed motor cortex signals with more cognitive PPC signals to take advantage of each area’s specializations.”
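The distinction Andersen draws, decoding the goal of a movement rather than its execution, can be captured schematically: an intent decoder outputs only a target gesture, and a separate trajectory generator fills in the details. The interfaces below are hypothetical, not the Caltech/Keck system.

```python
# Schematic contrast between intent-level decoding (as described for PPC)
# and detailed execution: the decoder returns only a goal, and a trajectory
# generator produces the low-level motion. All interfaces are hypothetical.
import random

GESTURES = ["handshake", "rock", "paper", "scissors"]

def decode_intent(ppc_activity):
    """Map recorded activity to a high-level goal (stub: pick the strongest channel)."""
    return GESTURES[ppc_activity.index(max(ppc_activity)) % len(GESTURES)]

def plan_trajectory(gesture, steps=5):
    """Fill in the execution details the intent signal leaves unspecified."""
    return [f"{gesture}: waypoint {i + 1}/{steps}" for i in range(steps)]

ppc_activity = [random.random() for _ in range(32)]     # stand-in neural features
goal = decode_intent(ppc_activity)                      # "what you actually intend to do"
for waypoint in plan_trajectory(goal):                  # "details of the movement execution"
    print(waypoint)
```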

WEARABLE TECH + DIGITAL HEALTH NYC 2015 – JUNE 30 @ NEW YORK ACADEMY OF SCIENCES.  REGISTER HERE.