Category Archives: Prosthetics

Prosthetic “skin” senses force, vibration


Jonathan Posner, with University of Washington and UCLA colleagues, has developed a flexible sensor “skin” that can be stretched over prostheses to determine force and vibration.

The skin mimics the way a human finger responds to tension and compression as it slides along a surface or distinguishes among different textures. This could allow users to sense when something is slipping out of their grasp.

Tiny channels of electrically conductive liquid metal are placed on both sides of a prosthetic finger. As the finger slides across a surface, the channels on one side compress while those on the other side stretch, similar to a natural limb. As the channel geometry changes, so does the channels’ electrical resistance, and those resistance differences correlate with force and vibration.
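
To make the idea concrete, here is a minimal Python sketch of how differential readings from the two channel sides might be converted into force and vibration estimates. The calibration constants, the differential/common-mode split, and the function itself are illustrative assumptions, not the researchers’ actual signal processing.

```python
import numpy as np

def estimate_force_and_vibration(r_top, r_bottom, r_rest=10.0, gain=0.5, fs=1000.0):
    """Illustrative mapping from channel resistances to force and vibration.

    r_top, r_bottom : resistance samples (ohms) from the channels on the
                      stretching and compressing sides of the finger.
    r_rest          : assumed resting resistance of an unstrained channel.
    gain            : hypothetical calibration constant (newtons per ohm).
    fs              : sampling rate in Hz.
    """
    r_top = np.asarray(r_top, dtype=float)
    r_bottom = np.asarray(r_bottom, dtype=float)

    # One side stretches (resistance rises) while the other compresses
    # (resistance falls), so the difference tracks applied force.
    differential = (r_top - r_rest) - (r_bottom - r_rest)
    force = gain * differential

    # Vibration shows up as rapid fluctuations common to both sides;
    # approximate its amplitude from sample-to-sample changes.
    common = 0.5 * ((r_top - r_rest) + (r_bottom - r_rest))
    vibration = np.std(np.diff(common)) * fs

    return force, vibration
```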

The researchers believe the sensor skin will make it easier for users to open a door, use a phone, shake hands, or lift packages.


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University, featuring: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld

Sensor-embedded prosthetic monitors gait, detects infection

The Monitoring OsseoIntegrated Prostheses system uses a titanium fixture surgically implanted into the femur. Bone grows at the implant’s connection point, leaving a small metallic connector protruding from the residual leg. An artificial limb can then be attached to or detached from this connector. The same procedure can be performed for upper limbs.

Advantages include less pain, a fluid walking motion, and a more stable, better-fitting limb. However, the protruding metal increases infection risk. This is meant to be addressed by electrochemical and skin sensors, including a biocompatible array embedded within the residual limb. The array tracks changes in body temperature and pH, which can indicate infection. It also monitors the fit between the bone and the prosthetic limb, and the healing process, which could help doctors speed recuperation.
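
A minimal sketch of how such threshold-based monitoring might work is below. The temperature and pH thresholds and the consecutive-reading rule are hypothetical values for illustration, not figures from the study.

```python
from dataclasses import dataclass

# Hypothetical alert thresholds, for illustration only; clinically useful
# values would come from the researchers' validation data.
TEMP_ALERT_C = 38.0          # sustained local temperature above this is flagged
PH_LOW, PH_HIGH = 6.8, 7.6   # drift outside this band is flagged

@dataclass
class SensorReading:
    temperature_c: float
    ph: float

def infection_risk(readings: list[SensorReading], min_consecutive: int = 3) -> bool:
    """Flag possible infection when several consecutive readings are abnormal."""
    streak = 0
    for r in readings:
        abnormal = r.temperature_c > TEMP_ALERT_C or not (PH_LOW <= r.ph <= PH_HIGH)
        streak = streak + 1 if abnormal else 0
        if streak >= min_consecutive:
            return True
    return False
```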


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University, featuring: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall

Deep learning driven prosthetic hand + camera recognize, automate required grips


Ghazal Ghazai and Newcastle University colleagues have developed a deep learning-driven prosthetic hand + camera system that allows wearers to reach for objects automatically. Current prosthetic hands are controlled via a user’s myoelectric signals, requiring learning, practice, concentration, and time.

A convolutional neural network was trained with images of 500 graspable objects and taught to recognize the grip needed for each. Objects were grouped by size, shape, and orientation, and the hand was programmed to perform four different grasps to accommodate them: palm wrist neutral (to pick up a cup), palm wrist pronated (to pick up a TV remote), tripod (thumb and two fingers), and pinch (thumb and first finger).

The hand’s camera takes a picture of the object in front of it, assesses its shape and size, picks the most appropriate grasp, and triggers a series of hand movements within milliseconds.
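
A minimal sketch of this classify-then-grasp step, using an assumed small PyTorch network and the four grasp labels above (not the Newcastle group’s actual model or training setup):

```python
import torch
import torch.nn as nn

GRASPS = ["palm_wrist_neutral", "palm_wrist_pronated", "tripod", "pinch"]

class GraspNet(nn.Module):
    """Small CNN mapping a camera image to one of four grasp classes."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

def choose_grasp(model: GraspNet, image: torch.Tensor) -> str:
    """image: a (3, H, W) tensor from the hand's camera; returns a grasp label."""
    model.eval()
    with torch.no_grad():
        logits = model(image.unsqueeze(0))
    return GRASPS[int(logits.argmax(dim=1))]
```

In the real system, the chosen label would then trigger the corresponding pre-programmed sequence of hand movements.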

In a small study of the technology, subjects picked up and moved objects with an 88 percent success rate.

The work is part of an effort to develop a bionic hand that senses pressure and temperature, and transmits the information to the brain.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda

Solar powered, highly sensitive, graphene “skin” for robots, prosthetics


Professor Ravinder Dahiya, at the University of Glasgow, has created a robotic hand with solar-powered graphene “skin” that he claims is more sensitive than a human hand.  The flexible, tactile, energy autonomous “skin” could be used in health monitoring wearables and in prosthetics, reducing the need for external chargers. (Dahiya is now developing a low-cost 3-D printed prosthetic hand incorporating the skin.)

Click to view University of Glasgow video


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Roz Picard, Tom Insel, John Rogers, Jamshid Ghajar and  Nathan Intrator – September 19, 2017 at the MIT Media Lab

 

Thought controlled prosthetic arm has human-like movement, strength


This week at the Pentagon, Johnny Matheny unveiled his DARPA-developed prosthetic arm. The mind-controlled prosthesis has the same size, weight, shape, and grip strength as a human arm and, according to Matheny, can do anything a natural arm can do.

It is, by all accounts, the most advanced prosthetic limb created to date.

The 100-sensor arm was developed as part of the Revolutionizing Prosthetics program at DARPA’s Biological Technologies Office, led by Dr. Justin Sanchez.

An implanted neural interface allows the wearer to control the arm with his thoughts. Sensors are also implanted in the fingertips, sending signals back to the brain, allowing users to feel sensations.

Click to view Johns Hopkins video.

Dr. Sanchez will be a keynote speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech NYC conference on June  7-8, 2016 at the New York Academy of Sciences.

Bionic finger, implanted electrodes, enable amputee to “feel” texture


Swiss Federal Institute of Technology and Scuola Superiore Sant’Anna researchers have developed  a bionic fingertip that allows amputees to feel textures and  differentiate between rough and smooth surfaces.

Electrodes were surgically implanted into the upper arm of a man whose arm had been amputated below the elbow. A machine moved an artificial finger, wired with electrodes, across smooth and rough lines on a plastic strip. The fingertip movement generated an electrical signal, which was translated into a series of electrical spikes sent to the brain. The spikes mimicked the language of the nervous system and created the sensation of feeling.
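
The general idea — a stronger sensor signal produces more frequent spikes — can be illustrated with a simple rate-coding sketch. The scheme and parameters below are generic illustrations, not the encoding actually used in the study.

```python
import numpy as np

def signal_to_spike_train(signal, fs=1000.0, max_rate=200.0, rng=None):
    """Toy rate coding: stronger sensor signal -> higher spike probability.

    signal   : 1-D array of sensor samples, normalised to [0, 1].
    fs       : sampling rate of the sensor signal in Hz.
    max_rate : spike rate (Hz) assigned to a full-scale signal.
    Returns a boolean array, True where a spike occurs.
    """
    rng = np.random.default_rng() if rng is None else rng
    signal = np.clip(np.asarray(signal, dtype=float), 0.0, 1.0)
    spike_prob = signal * (max_rate / fs)   # per-sample spike probability
    return rng.random(signal.shape) < spike_prob
```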

The subject, Aabo Sørensen, said: “When the scientists stimulate my nerves, I could feel the vibration and sense of touch in my phantom index finger. [It] is quite close to when you feel it with your normal finger: you can feel the coarseness of the plates and different gaps and ribs.”

Click to view EPFL video.


 

Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center
NeuroTech San Francisco  – April 6, 2016 @ the Mission Bay Conference Center
Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences
NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Mind controlled prosthetic fingers


Johns Hopkins researchers have developed a proof-of-concept for a prosthetic arm with fingers that, for the first time, can be controlled with a wearer’s thoughts.

The technology was tested on an epileptic patient who was not missing any limbs.  The researchers used brain mapping technology to bypass control of his arms and hands.  (The patient was already scheduled for a brain mapping procedure.) Brain electrical activity was measured for each finger.

This was an invasive procedure, which required implanting an array of 128 electrode sensors, on a sheet of film, in the part of the brain that controls hand and arm movement. Each sensor measured a circle of brain tissue 1 millimeter in diameter.

After compiling the motor and sensory data, the arm was programmed to allow the patient to move individual fingers based on which part of his brain was active.

The team said that the prosthetic was initially 76 percent accurate, and that when the signals for the ring and pinkie fingers were combined, accuracy increased to 88 percent.
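
In spirit, the decoding step can be sketched as a per-trial classifier over the 128 electrode features, with the ring and pinkie labels optionally merged as the team described. The data layout and decoder choice below are assumptions for illustration, not the Johns Hopkins pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Assumed layout: X has one row per movement trial with 128 per-electrode
# activity features; y holds the finger moved on that trial.
def finger_decoder_accuracy(X: np.ndarray, y: np.ndarray,
                            merge_ring_pinkie: bool = False) -> float:
    """Cross-validated accuracy of a linear decoder over electrode features."""
    labels = np.array(y, dtype=object)
    if merge_ring_pinkie:
        # Combining the two hardest-to-separate fingers into one class
        # mirrors the reported jump in accuracy.
        labels[(labels == "ring") | (labels == "pinkie")] = "ring_or_pinkie"
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X, labels, cv=5).mean()
```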

Click to view Johns Hopkins video.


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

Self-healing sensor improves “electronic skin”


Hossam Haick and Technion colleagues are developing materials to be integrated into flexible electronics that mimic the healing properties of human skin. The goal is to quickly repair incidental scratches or damaging cuts that might compromise device functionality. The synthetic polymer can “heal” electronic skin within a day, improving the materials used to give prosthetics a sense of touch.

The new sensor comprises a self-healing substrate, high-conductivity electrodes, and molecularly modified gold nanoparticles. The researchers noted that “the healing efficiency of this chemiresistor is so high that the sensor survived several cuttings at random positions.”

Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Sensors allow more natural sense of touch in prosthetics


Stanford’s Zhenan Bao is developing technology that could restore a more natural sense of touch in prosthetics. Her flexible, thin plastic sensors send signals to the brain that more closely resemble the nerve messages produced by touch sensors in human skin.

The disruptive technology has not yet been tested on humans, and researchers still need to find a safe way to pass electrical signals from prostheses to the brain for long periods.

Many teams are working toward this (see ApplySci coverage from 2013-2015). Previous tactile sensors, however, have been analogue devices, in which more pressure produces a stronger electrical signal rather than a more frequent stream of pulses. Those signals must be sent to a separate processing chip that converts signal strength into a digital stream of pulses, which only then is sent on to peripheral nerves or brain tissue. Bao’s sensors send digital signals directly.
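
A toy software model of that pressure-to-pulse-frequency idea is sketched below, with assumed pulse rates; in Bao’s sensors the conversion happens in the device itself rather than in code.

```python
import numpy as np

def pressure_to_pulses(pressure, duration_s=1.0, max_rate=200.0):
    """Toy transducer model: output pulse frequency grows with pressure.

    pressure : scalar in [0, 1], where 1 is full-scale pressure.
    Returns the times (seconds) of evenly spaced pulses within duration_s.
    """
    pressure = float(np.clip(pressure, 0.0, 1.0))
    rate = pressure * max_rate          # pulses per second
    if rate <= 0:
        return np.array([])
    return np.arange(0.0, duration_s, 1.0 / rate)

# A light touch yields sparse pulses; a firm press yields dense pulses.
print(len(pressure_to_pulses(0.1)), len(pressure_to_pulses(0.9)))
```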

Click to view Stanford University video.

Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Biocompatible neural prosthetics


Spinal injury patients, and those with lost limbs, sometimes have neural prosthetic devices implanted in an attempt to regain independence. These devices are used for deep brain stimulation and brain-controlled external prosthetics. However, neural prosthetics are often rejected by the immune system, and can fail because of a mismatch between soft brain tissue and rigid devices.

University of Pennsylvania’s Mark Allen and colleagues have created an implantable neural prosthetic device that is biocompatible, replacing the silicon and noble metals of conventional implants. The goal is to avoid immune-system rejection, failures due to tissue strain, neurodegeneration, and decreased fidelity of recorded neural signals.