Washington University’s Daniel Moran has received a DARPA grant to test a device that would stimulate nerves in the upper arm and forearm of prosthetic users. The goal is for the wearer to be able to feel heat, cold, and touch. In a related development last year, MC10’s Roozbeh Ghaffari developed artificial skin for prosthetics that mimics the sensitivity of real skin. Its silicon and gold sensors detect pressure, moisture, heat, and cold (see ApplySci, 12/30/14).
Moran’s electrode is designed to stimulate sensory nerve cells in the ulnar and median nerves of the arm. The ulnar nerve, the largest in the body unprotected by muscle or bone, connects to the ring and little fingers. The median nerve, in the upper arm and shoulder, connects to the other fingers. Together, the two nerves control movement and sensation, including touch, pressure, vibration, heat, cold, and pain, in all of the fingers.
This novel macro-sieve peripheral nerve interface is designed to stimulate regeneration of the ulnar and median nerves to transmit information back into the central nervous system.
The device is at an early stage and will be implanted only in non-human primates at this time.
WEARABLE TECH + DIGITAL HEALTH NYC 2015 – JUNE 30 @ NEW YORK ACADEMY OF SCIENCES.
EARLY REGISTRATION RATE ENDS TODAY, 5/15/15.
Tel Aviv-based Voyant Health’s TraumaCad Mobile app helps orthopedic surgeons plan operations and create simulations of results. The system offers modules for hip, knee, deformity, pediatric, upper limb, spine, foot and ankle, and trauma surgery. The iPad version of this decade-old system was recently approved by the FDA.
Surgeons can securely import medical images from the cloud or hospital imaging systems to perform measurements, fit prostheses, simulate osteotomies, and visualize fracture reductions. The app overlays prosthesis templates on radiological images and includes tools for performing measurements on the image and positioning the template. In total hip replacement surgery, it automatically aligns implants and assembles components to calculate leg length discrepancy and offset.
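As a rough illustration of the kind of measurement involved, the sketch below computes a leg length discrepancy as the difference in each lesser trochanter’s perpendicular distance to a pelvic reference line drawn between two landmarks. The landmark choice and line construction are hypothetical, not TraumaCad’s actual method.

```python
# Illustrative sketch of a leg length discrepancy (LLD) measurement
# from radiograph landmark coordinates (in mm). The landmark choice and
# reference-line construction are hypothetical, not Voyant Health's
# actual algorithm.

def leg_length_discrepancy(left_pelvis, right_pelvis,
                           left_trochanter, right_trochanter):
    """Difference between each lesser trochanter's perpendicular
    distance to the pelvic reference line; positive means the left
    trochanter sits farther from the line."""
    (x1, y1), (x2, y2) = left_pelvis, right_pelvis

    def dist_to_line(px, py):
        # Perpendicular distance from (px, py) to the line through
        # the two pelvic landmarks.
        num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
        den = ((y2 - y1) ** 2 + (x2 - x1) ** 2) ** 0.5
        return num / den

    return dist_to_line(*left_trochanter) - dist_to_line(*right_trochanter)

print(leg_length_discrepancy((-50, 0), (50, 0), (-30, 40), (30, 35)))  # 5.0
```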
Jennifer Collinger and University of Pittsburgh colleagues have enabled a prosthetic arm wearer to reach, grasp, and place a variety of objects with 10-D control for the first time.
The trial participant had electrode grids with 96 contact points surgically implanted in her brain in 2012. This allowed 3-D control of her arm. Each electrode point picked up signals from an individual neuron, which were relayed to a computer to identify the firing patterns associated with observed or imagined movements, such as raising or lowering the arm, or turning the wrist. This was used to direct the movements of a prosthetic arm developed by Johns Hopkins Applied Physics Laboratory. Three months later, she also could flex the wrist back and forth, move it from side to side and rotate it clockwise and counter-clockwise, as well as grip objects, adding up to 7-D control.
The new study, published yesterday, allowed the participant 10-D control — the ability to move the robot hand into different positions while also controlling the arm and wrist.
To bring the total of arm and hand movements to 10, the pincer grip was replaced by four hand shapes: finger abduction, in which the fingers are spread out; scoop, in which the last fingers curl in; thumb opposition, in which the thumb moves outward from the palm; and a pinch of the thumb, index, and middle fingers. As before, the participant watched animations and imagined the movements while the team recorded her brain signals, which were then used to calibrate the system so that she could move the hand into the various positions.
Ifor Samuel and Ashu Bansal at the University of St Andrews have developed a wearable optical sensor that can be used to control the movement of artificial limbs.
Plastic semiconductor-based sensors detect muscle contraction. Light is shone into fibrous muscle and its scattering is observed. When the muscle contracts, the light scatters less because the muscle fibers are farther apart. The sensors detect the changed scattering signal and relay the information, as photocurrents, to a prosthetic limb, triggering movement. A robotic arm was controlled using this method in a recent study.
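A minimal sketch of that sensing logic, assuming a simple threshold on the photocurrent (the baseline, threshold fraction, and command names are invented, not details of the St Andrews device):

```python
# Toy version of the sensing logic described above: a drop in the
# scattered-light photocurrent (muscle contracted, less scattering
# reaching the detector) triggers the limb. Threshold and units are
# assumptions, not details of the actual sensor.

def contraction_detector(baseline, threshold_fraction=0.8):
    """Return a detector that flags contraction when the photocurrent
    falls below a fraction of its relaxed baseline."""
    threshold = baseline * threshold_fraction

    def detect(photocurrent):
        return photocurrent < threshold

    return detect

detect = contraction_detector(baseline=1.0)
signals = [1.02, 0.98, 0.75, 0.70, 0.95]   # simulated photocurrents
commands = ["close_grip" if detect(s) else "idle" for s in signals]
print(commands)  # ['idle', 'idle', 'close_grip', 'close_grip', 'idle']
```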
Disposable wearable optical sensors could eliminate patient risks associated with electrical sensors, including electromagnetic interference, pain caused by sensing needles, and immune responses.
Prostheses are lighter and more flexible than in the past, but still fail to mimic human muscle power. Powered prosthesis motors generate force, but cannot respond stably to disturbances or changing terrain.
Robert Gregg and colleagues at the University of Texas have applied robot control theory to allow powered prosthetics to dynamically respond to a wearer’s environment. This has enabled wearers of a robotic leg to walk on a moving treadmill at similar speeds to able-bodied people.
Gregg said that “the gait cycle is a complicated phenomenon with lots of joints and muscles working together. We used advanced mathematical theorems to simplify the entire gait cycle down to one variable. If you measure that variable, you know exactly where you are in the gait cycle and exactly what you should be doing.”
In a recent study, sensors measured the center of pressure on a powered prosthesis, which the control algorithms used as input. Given a user’s height, weight, and residual-thigh dimensions, the prosthesis was configured for each subject in 15 minutes. Subjects walked over ground and on a treadmill at increasing speeds. Their walking speeds exceeded 1 meter per second throughout the study; typical able-bodied walking speed is 1.3 meters per second. Participants also reported less energy exertion than with traditional prostheses.
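Gregg’s one-variable idea can be sketched as follows: map a sensor reading to a phase value, then look up the joint target for that phase. The center-of-pressure mapping and the knee trajectory below are illustrative assumptions, not the actual controller.

```python
import math

# Sketch of a phase-variable gait controller: reduce the gait cycle to
# one variable, then read the desired joint angle off that variable.
# The COP range and the cosine knee trajectory are invented stand-ins.

def gait_phase(cop, cop_min=-0.05, cop_max=0.20):
    """Map heel-to-toe center-of-pressure position (meters) to a
    phase value clamped to [0, 1]."""
    return min(max((cop - cop_min) / (cop_max - cop_min), 0.0), 1.0)

def knee_target(phase):
    """Desired knee angle (degrees) as a simple periodic function of
    phase, standing in for a learned joint trajectory."""
    return 30.0 * (1 - math.cos(2 * math.pi * phase)) / 2

for cop in (-0.05, 0.075, 0.20):
    p = gait_phase(cop)
    print(f"phase={p:.2f} knee={knee_target(p):.1f} deg")
```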
The National Institute of Biomedical Imaging and Bioengineering recently launched the “NIBIB Bionic Man,” an interactive Web tool detailing 14 sensor-based technologies it is supporting. They include:
1. A robotic leg prosthesis that senses a person’s next move and provides powered assistance to achieve a more natural gait.
2. A light sensitive biogel and biological adhesive to help new cartilage grow and become functional.
3. A blood clot emulator used to optimize ventricular assist devices to reduce the risk of blood clots.
4. An artificial kidney that could be used in place of kidney dialysis for treatment of end-stage kidney disease.
5. A micro needle patch that delivers vaccines painlessly and doesn’t require refrigeration.
6. An interstitial pressure sensor to help doctors determine optimal times for delivering chemotherapy/radiation to cancer patients.
7. Glucose-sensing contact lenses to provide a non-invasive solution for continuous blood sugar monitoring.
8. A tongue drive system to help individuals with severe paralysis navigate their environment using only tongue movements.
9. A wireless brain-computer interface that records and transmits brain activity wirelessly and could allow people with paralysis to use their thoughts to control robotic arms or other devices.
10. Implantable myoelectric sensors that detect nerve signals above a missing limb and use these signals to move a prosthesis in a more natural way.
11. A synthetic glue modeled after an adhesive found in nature that could be used to repair tissues in the body.
12. Focused ultrasound used to temporarily open the blood brain barrier to let gene therapy treatments reach the brain.
13. Flexible electrode arrays that record brain activity from the surface of the brain and could be used to control robotic arms or provide real-time information about brain states.
14. Electrical stimulation of the spinal cord used in individuals with paralysis to help restore voluntary movement and other functions.
CAREN, developed at the University of South Florida, helps people with limb loss who use prosthetics improve basic function, symmetry, and walking efficiency. It is also a tool for researchers to study ways to improve mobility and balance.
Wearing a safety harness and walking on a treadmill in the room-sized system, participants of a recent study engaged in audio-visual balance games, explored virtual environments, and used an avatar on a surround screen to simulate activities.
CAREN’s interactive games allow for physical rehabilitation, combined with cognitive tasks, such as requiring someone to dig for objects in a virtual world while still walking on a treadmill. Distraction gait training could help balance, mobility and coordination in PTSD, traumatic brain injury or stroke patients.
Boat driving, walking in a combat environment or mountain hiking can be simulated. Visual tracking technology allows researchers to evaluate a patient’s gait or performance in real time, and immediately adjust the system to customize the rehab/training process.
DEKA is a robotic prosthetic arm that allows amputees to perform complex movements and tasks. It has just received FDA approval.
Electrodes attached to the arm detect muscle contractions close to the prosthesis, and a computer translates them into movement. Six “grip patterns” allow wearers to drink a cup of water, hold a cordless drill or pick up a credit card or a grape, among other functions.
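A toy version of that translation step might look like the sketch below, which matches a muscle-activation pattern against stored grip templates. The channel count, template values, grip names, and nearest-template rule are all assumptions, not DEKA’s actual pipeline.

```python
import math

# Toy EMG-to-grip selection: compare a frame of per-channel muscle
# activation against stored templates and pick the closest grip.
# Templates, channel count, and grip names are hypothetical.

TEMPLATES = {                       # 4 hypothetical EMG channels each
    "power_grip": [0.9, 0.8, 0.2, 0.1],
    "fine_pinch": [0.2, 0.1, 0.9, 0.7],
    "open_hand":  [0.1, 0.1, 0.1, 0.1],
}

def classify_grip(emg):
    """Nearest-template (Euclidean distance) classification of one
    EMG activation frame."""
    return min(TEMPLATES, key=lambda name: math.dist(emg, TEMPLATES[name]))

print(classify_grip([0.85, 0.75, 0.25, 0.15]))  # power_grip
```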
DARPA’s Justin Sanchez believes that DEKA “provides almost natural control of upper extremities for people who have required amputations.” He claims that “this arm system has the same size, weight, shape and grip strength as an adult’s arm would be able to produce.”
Neurogrid is a “human brain based” microchip that is 9,000 times faster than a typical PC while requiring 1/40,000th the power. It is being developed by Professor Kwabena Boahen at Stanford University.
The circuit board consists of 16 custom-designed “Neurocore” chips, which together can simulate 1 million neurons and billions of synaptic connections. Certain synapses are enabled to share hardware circuits, saving power.
Its speed and low power consumption could impact the development of prosthetic limbs controlled by a similar chip and not tethered to a power source. Such a limb could have “the speed and complexity of our own actions,” according to Professor Boahen.
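Neurogrid emulates spiking neurons directly in analog hardware. A minimal software counterpart to what one such neuron does is the leaky integrate-and-fire model sketched below, with generic textbook parameters rather than Neurocore’s.

```python
# Minimal leaky integrate-and-fire neuron: the membrane voltage leaks
# toward the input drive and fires a spike when it crosses threshold.
# Parameters are generic textbook values, not Neurocore's.

def lif_spikes(current, steps=200, dt=1e-3,
               tau=0.02, r=10.0, v_thresh=1.0, v_reset=0.0):
    """Simulate one neuron driven by a constant input current and
    return the number of spikes over the simulated window."""
    v, spikes = v_reset, 0
    for _ in range(steps):
        # Euler step: voltage decays toward r * current with time
        # constant tau.
        v += dt / tau * (r * current - v)
        if v >= v_thresh:
            spikes += 1
            v = v_reset
    return spikes

print(lif_spikes(0.05), lif_spikes(0.2))  # stronger input -> more spikes
```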