“Artificial nerve” system for sensory prosthetics, robots

Stanford’s Zhenan Bao has developed an artificial sensory nerve system that can activate the twitch reflex in a cockroach and identify letters in the Braille alphabet. Bao describes it as “a step toward making skin-like sensory neural networks for all sorts of applications,” including artificial skin that creates a sense of touch in prosthetics.

The artificial nerve circuit integrates three components, simulated in the sketch following this list:

  • A touch sensor that can detect minuscule forces.
  • A flexible electronic neuron that receives signals from the touch sensor.
  • An artificial synaptic transistor, modeled on human synapses, that is stimulated by these sensory signals.
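The published system is analog hardware, but the three-stage signal flow can be illustrated in software. Below is a minimal, hypothetical Python sketch: pressure becomes a sensor voltage, the voltage sets the pulse rate of an oscillator standing in for the electronic neuron, and the pulse train drives a leaky integrator standing in for the synaptic transistor. All constants are illustrative, not values from the paper.

```python
import numpy as np

def touch_sensor(pressure_kpa):
    """Stage 1: map applied pressure to a sensor voltage (illustrative linear gain)."""
    return 0.05 * pressure_kpa  # volts; made-up conversion factor

def oscillator_neuron(voltage, duration_s=1.0, dt=1e-3):
    """Stage 2: oscillator 'neuron' whose pulse rate grows with input voltage."""
    rate_hz = 20.0 * voltage  # made-up voltage-to-rate conversion
    t = np.arange(0.0, duration_s, dt)
    # Poisson-like pulse train at the target rate
    return (np.random.rand(t.size) < rate_hz * dt).astype(float)

def synaptic_transistor(pulses, dt=1e-3, tau_s=0.05):
    """Stage 3: leaky integrator standing in for the synaptic transistor's
    output, which builds with incoming pulses and decays between them."""
    out = np.zeros_like(pulses)
    for i in range(1, pulses.size):
        out[i] = out[i - 1] * (1.0 - dt / tau_s) + pulses[i]
    return out

pressure = 40.0  # kPa, an illustrative touch
response = synaptic_transistor(oscillator_neuron(touch_sensor(pressure)))
print(f"peak synaptic response: {response.max():.2f} (arbitrary units)")
```

Qualitatively, a firmer touch yields a faster pulse train and a stronger integrated “synaptic” response, which is the behavior the hardware circuit exploits.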

The system was successfully tested to generate both reflexes and a sense of touch. The team also hopes to create low-power artificial sensor nets to cover robots, providing feedback that makes them more agile.

Click to view Science video


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab.  Speakers include:  Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda

Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22, 2019 at Stanford University.  Zhenan Bao will be the keynote speaker.

DARPA’s Justin Sanchez on driving and reshaping biotechnology | ApplySci @ Stanford

DARPA Biological Technologies Office Director Dr. Justin Sanchez on driving and reshaping biotechnology.  Recorded at ApplySci’s Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 26-27, 2018 at Stanford University.


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 25, 2018 at the MIT Media Lab

Closed loop EEG/BCI/VR/physical therapy system to control gait, prosthetics

Earlier this year, University of Houston’s Jose Luis Contreras-Vidal developed a closed-loop BCI/EEG/VR/physical therapy system to control gait as part of a stroke/spinal cord injury rehab program. The goal was to promote and enhance cortical involvement during walking.

In a study, eight subjects walked on a treadmill while watching an avatar and wearing a 64-channel EEG headset and motion sensors at the hip, knee, and ankle.

The avatar was first activated by the motion sensors, allowing its movement to precisely mimic that of the test subject. It was then controlled by the brain-computer interface, although this was less precise than the motion-sensor-driven movement. Contreras-Vidal believes that as subjects learn to use the interface, the result will approach that of the sensors. The researchers reported increased activity in the posterior parietal cortex and the inferior parietal lobe, along with increased involvement of the anterior cingulate cortex.
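The decoding method behind the avatar control is not detailed here. As a rough, hypothetical illustration of the idea, the Python sketch below fits a ridge-regularized linear decoder from synthetic EEG band-power features to a joint angle, the kind of mapping that could drive the avatar in place of the motion-sensor reading. The feature layout and all numbers are assumptions, not the study’s parameters.

```python
import numpy as np

# Hypothetical training data: band-power features from a 64-channel EEG headset
# (two frequency bands per channel, assumed) paired with a knee angle recorded
# by the motion sensors during treadmill walking.
rng = np.random.default_rng(0)
n_samples, n_features = 500, 64 * 2
X = rng.normal(size=(n_samples, n_features))        # EEG features (synthetic)
true_w = rng.normal(size=n_features)
y = X @ true_w + 0.1 * rng.normal(size=n_samples)   # knee angle, degrees (synthetic)

# Ridge-regularized least squares: w = (X^T X + lam*I)^-1 X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Closed loop: each new EEG feature vector yields a predicted joint angle
# that would drive the avatar instead of the motion-sensor measurement.
x_new = rng.normal(size=n_features)
print(f"predicted knee angle: {x_new @ w:.1f} deg")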

The team built on this research to demonstrate how brain activity can identify different terrains, a step toward prosthetics that automatically adjust to changing ground conditions in real time.

Click to view University of Houston video


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University. Speakers include:  Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld – Bob Knight – Phillip Alvelda – Paul Nuyujukian – Peter Fischer – Tony Chahine – Shahin Farshchi – Ambar Bhattacharyya – Adam D’Augelli – Juan-Pablo Mas – Michael Eggleston – Walter Greenleaf – Jacobo Penide – David Sarno

Registration rates increase on January 26th

Prosthetic “skin” senses force, vibration

Jonathan Posner, with University of Washington and UCLA colleagues, has developed a flexible sensor “skin” that can be stretched over prostheses to determine force and vibration.

The skin mimics the way a human finger responds to tension and compression, as it slides along a surface or distinguishes among different textures. This could allow users to sense when something is slipping out of their grasp.

Tiny electrically conductive liquid metal channels are placed on both sides of a prosthetic finger. As the finger slides across a surface, the channels on one side compress while those on the other side stretch, much as in a natural limb. As the channel geometry changes, so does the electrical resistance, and these resistance changes correlate with force and vibration.
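The underlying physics is straightforward: a channel’s resistance is R = ρL/A, so a stretched channel (longer, thinner) gains resistance while a compressed one loses it. The Python sketch below uses illustrative numbers, not the paper’s, to show how a differential reading between the two sides isolates sliding-induced strain from common-mode effects such as temperature drift.

```python
def channel_resistance(rho, length_m, area_m2):
    """Resistance of a conductive liquid-metal channel: R = rho * L / A."""
    return rho * length_m / area_m2

# Illustrative values (not from the paper): liquid-metal-like resistivity,
# a millimeter-scale channel with a small cross-section.
rho = 2.9e-7         # ohm*m
L0, A0 = 5e-3, 1e-8  # resting length (m) and cross-section (m^2)

def strained_resistance(strain):
    """Incompressible channel under strain: length scales by (1+e),
    cross-section by 1/(1+e), so R scales by (1+e)^2."""
    return channel_resistance(rho, L0 * (1 + strain), A0 / (1 + strain))

r_rest = strained_resistance(0.0)
r_stretch = strained_resistance(+0.05)   # side stretched 5% while sliding
r_compress = strained_resistance(-0.05)  # opposite side compressed 5%

# Differential signal: large for sliding (opposite-sign strains on the two
# sides), near zero for uniform effects like temperature drift.
print(f"rest: {r_rest:.3f} ohm, differential: {r_stretch - r_compress:.4f} ohm")
```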

The researchers believe the sensor skin will better enable users to open a door, use a phone, shake hands, or lift packages.


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University, featuring:  Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld

Sensor-embedded prosthetic monitors gait, detects infection

The Monitoring OsseoIntegrated Prostheses system uses a limb that includes a titanium fixture surgically implanted into the femur. Bone grows at the implant’s connection point, leaving a small metallic connector protruding from the residual leg. An accompanying artificial limb can then be attached or detached. The same procedure can be performed for upper limbs.

Advantages include less pain, a more fluid walking motion, and a more stable, better-fitting limb. However, the protruding metal increases infection risk. This is meant to be addressed by electrochemical and skin sensors, including a biocompatible array embedded within the residual limb. The array tracks changes in body temperature and pH, which indicate infection. It also monitors the fit of the bone and prosthetic limb, and the healing process, which could help doctors speed recuperation.
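The detection logic is not specified in the article; a plausible minimal version is a threshold monitor over the implanted array’s readings, sketched below in Python. The thresholds are illustrative placeholders, not clinical values.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    temperature_c: float  # local tissue temperature from the implanted array
    ph: float             # local pH; infection tends to shift it

def infection_flags(reading, temp_limit_c=38.5, ph_low=6.8, ph_high=7.6):
    """Flag readings consistent with infection. Limits are illustrative
    placeholders, not clinically validated thresholds."""
    reasons = []
    if reading.temperature_c > temp_limit_c:
        reasons.append(f"temperature {reading.temperature_c:.1f} C above limit")
    if not (ph_low <= reading.ph <= ph_high):
        reasons.append(f"pH {reading.ph:.2f} outside expected range")
    return reasons  # an empty list means no alert

print(infection_flags(SensorReading(temperature_c=39.1, ph=6.5)))
```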


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University, featuring:  Vinod KhoslaJustin SanchezBrian OtisBryan JohnsonZhenan BaoNathan IntratorCarla PughJamshid Ghajar – Mark Kendall

Deep learning driven prosthetic hand + camera recognize, automate required grips

Ghazal Ghazaei and Newcastle University colleagues have developed a deep learning driven prosthetic hand + camera system that allows wearers to reach for objects automatically.  Current prosthetic hands are controlled via a user’s myoelectric signals, which requires learning, practice, concentration, and time.

A convolutional neural network was trained with images of 500 graspable objects and taught to recognize the grip needed for each. Objects were grouped by size, shape, and orientation, and the hand was programmed to perform four different grasps to accommodate them: palm wrist neutral (to pick up a cup); palm wrist pronated (to pick up a TV remote); tripod (thumb and two fingers); and pinch (thumb and first finger).

The hand’s camera takes a picture of the object in front of it, assesses its shape and size, picks the most appropriate grasp, and triggers a series of hand movements, within milliseconds.
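The published network details aren’t reproduced here; as an illustration of the pipeline’s general shape, the PyTorch sketch below maps a camera frame to one of the four grasp classes. The architecture, input size, and layer widths are assumptions for demonstration only.

```python
import torch
import torch.nn as nn

GRASPS = ["palm_wrist_neutral", "palm_wrist_pronated", "tripod", "pinch"]

class GraspNet(nn.Module):
    """Toy CNN mapping a camera image to one of four grasp classes.
    Layer sizes are illustrative, not the published architecture."""
    def __init__(self, n_classes=len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)  # for 64x64 input

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# One inference step: camera frame in, grasp command out.
model = GraspNet().eval()
frame = torch.rand(1, 3, 64, 64)  # stand-in for the hand camera's image
with torch.no_grad():
    grasp = GRASPS[model(frame).argmax(dim=1).item()]
print(f"selected grasp: {grasp}")
```

In a real deployment the chosen class would trigger the hand’s pre-programmed movement sequence for that grasp.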

In a small study of the technology, subjects picked up and moved objects with an 88 per cent success rate.

The work is part of an effort to develop a bionic hand that senses pressure and temperature, and transmits the information to the brain.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator – Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda

Solar powered, highly sensitive, graphene “skin” for robots, prosthetics

Professor Ravinder Dahiya, at the University of Glasgow, has created a robotic hand with solar-powered graphene “skin” that he claims is more sensitive than a human hand. The flexible, tactile, energy-autonomous “skin” could be used in health-monitoring wearables and in prosthetics, reducing the need for external chargers. (Dahiya is now developing a low-cost 3D-printed prosthetic hand incorporating the skin.)

Click to view University of Glasgow video


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Roz Picard, Tom Insel, John Rogers, Jamshid Ghajar and Nathan Intrator – September 19, 2017 at the MIT Media Lab


Thought controlled prosthetic arm has human-like movement, strength

This week at the Pentagon, Johnny Matheny unveiled his DARPA-developed prosthetic arm. The mind-controlled prosthesis has the same size, weight, shape, and grip strength as a human arm and, according to Matheny, can do anything a natural arm can do.

It is, by all accounts, the most advanced prosthetic limb created to date.

The 100-sensor arm was developed as part of the “Revolutionizing Prosthetics” program of DARPA’s Biological Technologies Office, led by Dr. Justin Sanchez.

An implanted neural interface allows the wearer to control the arm with his thoughts. Sensors are also implanted in the fingertips, sending signals back to the brain and allowing the wearer to feel sensations.

Click to view Johns Hopkins video.

Dr. Sanchez will be a keynote speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech NYC conference on June 7-8, 2016 at the New York Academy of Sciences.

Bionic finger, implanted electrodes, enable amputee to “feel” texture

Swiss Federal Institute of Technology and Scuola Superiore Sant’Anna researchers have developed a bionic fingertip that allows amputees to feel textures and differentiate between rough and smooth surfaces.

Electrodes were surgically implanted into the upper arm of a man whose arm had been amputated below the elbow. A machine moved an artificial finger, wired with electrodes, across smooth and rough lines on a plastic strip. The fingertip movement generated an electrical signal, which was translated into a series of electrical spikes sent to the brain. The spikes mimicked the language of the nervous system, creating the sensation of feeling.
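The “language of the nervous system” here is a spike train. One generic way to turn a sensor trace into spikes is delta (threshold-crossing) encoding, sketched below in Python; this is an illustration of the principle, not the researchers’ actual encoder. Rough surfaces produce rapid signal changes and therefore denser spike trains than smooth ones.

```python
import numpy as np

def delta_spike_encode(signal, threshold=0.1):
    """Emit a spike each time the signal moves by `threshold` from the last
    spike's level: a generic delta encoder, not the published method."""
    spikes, level = [], signal[0]
    for i, v in enumerate(signal):
        if abs(v - level) >= threshold:
            spikes.append(i)  # spike time index
            level = v
    return spikes

# Synthetic fingertip traces: fast ridges (rough) vs. slow undulation (smooth).
t = np.linspace(0, 1, 1000)
rough = 0.5 * np.sin(2 * np.pi * 40 * t)   # 40 ridges per sweep
smooth = 0.5 * np.sin(2 * np.pi * 4 * t)   # 4 gentle bumps

# Rough surfaces yield many more spikes: the cue the brain can decode.
print(f"rough: {len(delta_spike_encode(rough))} spikes, "
      f"smooth: {len(delta_spike_encode(smooth))} spikes")
```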

The subject, Dennis Aabo Sørensen, said: “When the scientists stimulate my nerves I could feel the vibration and sense of touch in my phantom index finger. [It] is quite close to when you feel it with your normal finger; you can feel the coarseness of the plates and different gaps and ribs.”

Click to view EPFL video.



Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center
NeuroTech San Francisco  – April 6, 2016 @ the Mission Bay Conference Center
Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences
NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences