Category Archives: Robotics

Solar-powered, highly sensitive graphene “skin” for robots, prosthetics

Professor Ravinder Dahiya of the University of Glasgow has created a robotic hand with solar-powered graphene “skin” that he claims is more sensitive than a human hand. The flexible, tactile, energy-autonomous “skin” could be used in health-monitoring wearables and in prosthetics, reducing the need for external chargers. (Dahiya is now developing a low-cost 3D-printed prosthetic hand incorporating the skin.)

Click to view University of Glasgow video


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Roz Picard, Tom Insel, John Rogers, Jamshid Ghajar and Nathan Intrator – September 19, 2017 at the MIT Media Lab

Robots support neural and physical rehab in stroke, cerebral palsy

Georgia Tech’s Ayanna Howard has developed Darwin, a socially interactive robot that encourages children to play an active role in physical therapy.

Specifically targeting children with cerebral palsy (who are involved in current studies), autism, or traumatic brain injury, the robot is designed to function in the home, supplementing services provided by clinicians. It engages users as their human therapist would, monitoring performance and providing motivation and feedback.

In a recent experiment, motion trackers monitored user movements as Darwin offered encouragement, and demonstrated movements when they were not performed correctly. Researchers claimed that, with the exception of one case, the robot’s impact was “significantly positive.”
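
Darwin’s control logic has not been published; as a toy illustration of the monitor-and-encourage loop just described, the Python sketch below compares tracked joint angles against a therapist-defined reference movement and picks a response. The function name, tolerance, and data are all hypothetical.

```python
import numpy as np

def feedback(tracked_angles, reference_angles, tolerance_deg=15.0):
    """Compare a tracked movement to a reference exercise and choose
    a (hypothetical) robot response.

    Both arguments are 1-D arrays of joint angles in degrees, sampled
    at the same rate over one repetition of the exercise.
    """
    error = np.abs(np.asarray(tracked_angles) - np.asarray(reference_angles))
    if error.mean() <= tolerance_deg:
        return "encourage"      # e.g. "Great job, keep going!"
    return "demonstrate"        # robot re-demonstrates the movement

# Toy usage: a repetition that falls short triggers a demonstration.
reference = np.linspace(0, 90, 50)    # raise the arm from 0 to 90 degrees
tracked = reference * 0.6             # the child only reaches 54 degrees
print(feedback(tracked, reference))   # -> "demonstrate"
```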

Darwin is still evolving (pun intended) and has not yet been commercialized.

At MIT, Newman Lab researcher Hermano Igo Krebs has been using robots for gait and balance neurorehabilitation in stroke and cerebral palsy patients since 1989. Krebs’s technology continues to be incorporated into treatment plans at Burke Rehabilitation Hospital.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Roz Picard, Tom Insel, John Rogers and Nathan Intrator – September 19, 2017 at the MIT Media Lab

Sensors + robotics + AI for safer aging in place

IBM and Rice University are developing MERA, a Watson-enabled robot meant to help seniors age in place.

The system comprises a Pepper robot interface, Watson, and Rice’s CameraVitals project, which calculates vital signs by recording video of a person’s face. Vitals are measured multiple times each day, and caregivers are informed if the camera and/or accelerometer detect a fall.
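
Rice has not published CameraVitals’ exact pipeline, but the general idea behind estimating pulse from face video (remote photoplethysmography) is well established: blood volume changes faintly modulate skin color, so the dominant frequency of the averaged green channel falls at the heart rate. A minimal sketch, with frame rate and band limits assumed:

```python
import numpy as np

def pulse_from_video(face_frames, fps=30.0):
    """Estimate heart rate (beats per minute) from face-region video.

    face_frames: array of shape (n_frames, height, width, 3), RGB.
    The green channel carries the strongest pulse signal.
    """
    green = face_frames[..., 1].mean(axis=(1, 2))   # mean green per frame
    green = green - green.mean()                    # remove the DC offset
    spectrum = np.abs(np.fft.rfft(green))
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)          # ~42-240 bpm
    return freqs[band][np.argmax(spectrum[band])] * 60.0

# Toy usage: a synthetic 1.2 Hz (72 bpm) flicker is recovered.
t = np.arange(300) / 30.0
frames = np.full((300, 8, 8, 3), 128.0)
frames[..., 1] += np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(round(pulse_from_video(frames)))   # -> 72
```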

Speech to Text, Text to Speech, and Natural Language Classifier APIs are being studied to enable answers to health-related questions, such as “What are the symptoms of anxiety?” or “What is my heart rate?”
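
IBM has not detailed how MERA wires these services together. As a purely hypothetical sketch of the routing step, the function below answers personal questions (“What is my heart rate?”) from live sensor data and defers general ones to a knowledge base; the keyword rules stand in for a trained intent classifier, and the lookup is a stub.

```python
def route_question(text, vitals):
    """Toy intent router: personal questions are answered from sensor
    readings, everything else goes to a (stubbed) knowledge base."""
    lowered = text.lower()
    if "my" in lowered.split() and "heart rate" in lowered:
        return f"Your heart rate is {vitals['heart_rate']} bpm."
    return lookup_knowledge_base(lowered)   # hypothetical knowledge-base stub

def lookup_knowledge_base(text):
    return "Common symptoms of anxiety include restlessness and a rapid heartbeat."

print(route_question("What is my heart rate?", {"heart_rate": 68}))
```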

IBM believes that sensors plus cognitive computing can give clinicians and caregivers insights that enable better care decisions. The company will soon test the technology in partnership with Sole Cooperativa to monitor the daily activities of seniors in Italy.

Click to view IBM video


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth – Shahin Farshchi – Casper de Clercq – Mary Lou Jepsen – Vivek Wadhwa – Dirk Schapeler – Miguel Nicolelis

Robotic hand exoskeleton for stroke patients

ETH professor Roger Gassert has developed a robotic exoskeleton that allows stroke patients to perform daily activities by supporting motor and somatosensory functions.

His vision is that “instead of performing exercises in an abstract situation at the clinic, patients will be able to integrate them into their daily life at home, supported by a robot.” He observes that existing exoskeletons are heavy, rendering patients unable to lift their hands, and that wearers have difficulty feeling objects and exerting the right amount of force. To address this, the new device leaves the palm of the hand free.

Gassert’s Kyushu University colleague Jumpei Arata developed a mechanism for the finger featuring three overlapping leaf springs. A motor moves the middle spring, which transmits the force to the different segments of the finger through the other two springs. The fingers thus automatically adapt to the shape of the object the patient wants to grasp.
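
Arata’s published leaf-spring design is more subtle than a simple spring chain, but a toy model conveys why the segments conform: drive travel keeps flowing to whichever segments have not yet touched the object, so an irregular shape ends up gripped at every segment. All numbers below are invented.

```python
def grasp(motor_travel, contact_at, step=0.01):
    """Toy self-adapting finger: three segments driven in series.

    contact_at: flexion (arbitrary units) at which each segment
    touches the object. Returns the final flexion of each segment.
    """
    flex = [0.0, 0.0, 0.0]
    while motor_travel > 0:
        free = [i for i in range(3) if flex[i] < contact_at[i]]
        if not free:
            break                        # every segment in contact: done
        for i in free:                   # travel splits across free segments
            flex[i] += step / len(free)
        motor_travel -= step
    return flex

# An irregular object: segment 0 touches early, segment 2 late.
print(grasp(3.0, contact_at=[0.4, 0.8, 1.5]))   # ~[0.4, 0.8, 1.5]
```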

To reduce the weight of the exoskeleton, the motors are placed on the patient’s back, and force is transmitted through a bicycle brake cable. ApplySci hopes that the size and weight of the motors can be reduced in the next phase, allowing them to be integrated into the exoskeleton itself.

Gassert also wants to make the exoskeleton thought-controlled, and is using MRI and EEG to detect a patient’s intention to move his or her hand and communicate it to the device.
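
The post does not specify Gassert’s decoding pipeline. One standard way to detect movement intent from EEG is event-related desynchronization: mu-band (8-12 Hz) power over motor cortex drops when a person imagines moving. A minimal sketch, with sampling rate, window, and threshold assumed:

```python
import numpy as np
from scipy.signal import welch

def intends_to_move(eeg_window, baseline_power, fs=250.0, drop=0.5):
    """Flag movement intent when mu-band (8-12 Hz) power falls well
    below a resting baseline (event-related desynchronization).

    eeg_window: 1-D EEG samples from a motor-cortex channel.
    baseline_power: mu-band power measured while the patient rests.
    """
    freqs, psd = welch(eeg_window, fs=fs, nperseg=min(len(eeg_window), 256))
    mu_power = psd[(freqs >= 8) & (freqs <= 12)].mean()
    return mu_power < drop * baseline_power

# Toy usage: a weakened 10 Hz rhythm is read as intent to move.
rng = np.random.default_rng(0)
rest = np.sin(2 * np.pi * 10 * np.arange(500) / 250) + 0.1 * rng.standard_normal(500)
f, p = welch(rest, fs=250.0, nperseg=256)
baseline = p[(f >= 8) & (f <= 12)].mean()
print(intends_to_move(0.2 * rest, baseline))   # -> True
```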


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari – Unity Stoakes – Mounir Zok – Krishna Shenoy

AI robot learns ward procedures, advises nurses

Julie Shah and MIT CSAIL colleagues have developed a robot to assist labor nurses. The AI-driven assistant learns how the unit works from its people, and is then able to make care recommendations, including scheduling and patient movement.

Labor nurses attempt to predict the arrival and length of labor, and which patients will require a C-section. A smart robot that has observed thousands of these decisions could be a useful resource, though whether a machine can replace human judgment in stressful situations remains to be proven.
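
CSAIL’s system learns from demonstrations of real decisions; as a heavily simplified sketch of that idea, the snippet below fits a decision tree to invented (situation, nurse decision) pairs and then suggests an action for a new situation. The features, labels, and data are all made up.

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical observed decisions:
# features = [hours in labor, delivery rooms free, high-risk flag]
X = [[2, 3, 0], [9, 1, 0], [6, 0, 1], [1, 4, 0], [8, 2, 1], [5, 1, 0]]
y = ["wait", "move to delivery", "call OB", "wait", "call OB", "wait"]

# Learn the unit's policy from the demonstrations, then consult it.
policy = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(policy.predict([[7, 0, 1]]))   # suggested action for a new patient
```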

Click to view CSAIL video

Tiny, ingestible robot can deliver medicine, patch wounds, remove objects

Daniela Rus and colleagues at MIT, the University of Sheffield, and the Tokyo Institute of Technology have developed an ingestible origami robot designed to patch wounds, deliver medicine, or remove foreign objects from a person’s stomach.

The tiny robot, made of pig intestine, can unfold itself from a swallowed capsule. Steered by a doctor using external magnetic fields, the “microsurgeon” crawls across the stomach wall, propelling itself with a “stick-slip” motion.
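
“Stick-slip” locomotion relies on friction asymmetry: a slow power stroke grips the stomach wall while the body shifts forward, and a fast return stroke slides back only part of the way. A toy one-dimensional model of the net travel, with every parameter invented:

```python
def stick_slip_distance(cycles, stroke_mm=2.0, slip_fraction=0.3):
    """Net forward travel of a crawler whose slow stroke sticks to the
    surface and whose fast return stroke slips back a fraction of the
    stroke length."""
    gain_per_cycle = stroke_mm * (1.0 - slip_fraction)
    return cycles * gain_per_cycle

print(stick_slip_distance(10))   # -> 14.0 mm after ten cycles
```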

Click to view MIT video


Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Machine learning model enables robotic hand to learn autonomously

Vikash Kumar and University of Washington colleagues have developed a simulation model that allows robotic hands to learn from their own experience while performing dexterous manipulation, with no human direction required.

A recent study incorporated the model while a robotic hand attempted several tasks, including rotating an elongated object. With each try, the hand became better able to spin the tube. Machine learning algorithms helped it model the basic physics involved and plan the actions needed to best complete the task.
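
The UW method is considerably more sophisticated, but its core loop (try an action, fit a model of the physics from the data, plan against the model) can be sketched in a few lines. Here is a one-dimensional stand-in for the tube-spinning task, where the “physics” is an unknown torque-to-rotation gain; everything is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
true_gain = 0.8                 # unknown physics: rotation = 0.8 * torque
target, angle = np.pi / 2, 0.0
trials = []                     # (torque, observed rotation) pairs

for _ in range(10):
    if trials:
        # Fit a linear dynamics model to past experience, then plan:
        torques, deltas = map(np.array, zip(*trials))
        gain = np.linalg.lstsq(torques[:, None], deltas, rcond=None)[0][0]
        torque = (target - angle) / gain
    else:
        torque = rng.uniform(-1, 1)       # no data yet: explore randomly
    delta = true_gain * torque + 0.01 * rng.standard_normal()
    trials.append((torque, delta))
    angle += delta

print(f"final angle {angle:.3f} vs target {target:.3f}")
```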

Click to view University of Washington video


Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Voice, image, and language identifying robot responds to human dialogue

Hitachi’s EMIEW3 robot, designed to provide customer service in commercial environments, could be an ideal companion for the elderly or disabled. Its “remote brain” allows it to identify voices, images, and language in its surroundings, even against background street noise. AI enables it to respond to human dialogue and avoid collisions. It is light enough to lift, can move at 6 km per hour, and can stand up on its own if knocked over. A cosmetic LED-light “beating heart” makes the robot seem more human.


Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

“Socially assistive” robot helps children learn

Tega is a “socially assistive” robot that senses the emotional state of a learner and, based on those cues, creates a personalized motivational strategy. It was developed by Cynthia Breazeal at MIT to enable long-term educational interactions with children. It uses an Android device running Affdex emotion and facial-expression recognition software from Rosalind Picard’s Affectiva to process movement, perception, and thinking, and can respond to individual children’s behaviors.

In a learning trial, the system mirrored the emotional response of students — getting excited when they were excited, and distracted when they lost focus — and tracked the impact of each of these cues on the student. Over time, it learned how the cues influenced a student’s engagement, happiness, and learning successes. As the sessions continued, it personalized its responses to optimize each student’s experience.
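
The learner behind this behavior is not detailed in the post, but tracking each cue’s effect on engagement and then favoring what works is the classic multi-armed bandit setup. A minimal epsilon-greedy sketch, with the cues and rewards invented:

```python
import random

cues = ["excited mirroring", "calm encouragement", "silly joke"]
value = {c: 0.0 for c in cues}   # running estimate of each cue's payoff
count = {c: 0 for c in cues}

def pick_cue(epsilon=0.1):
    """Mostly deliver the best-known cue, occasionally explore."""
    if random.random() < epsilon:
        return random.choice(cues)
    return max(cues, key=value.get)

def update(cue, engagement):
    """engagement: observed reward in [0, 1] after delivering the cue."""
    count[cue] += 1
    value[cue] += (engagement - value[cue]) / count[cue]   # running mean

# One interaction step: deliver a cue, observe the child, learn.
cue = pick_cue()
update(cue, engagement=0.7)
```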

Click to view MIT video.

Rosalind Picard will be a keynote speaker at NeuroTech NYC on June 8th.


Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Stretchable robot “skin” can display health data

Robert Shepherd and Cornell colleagues have developed an electroluminescent “skin” that stretches to more than six times its original size while emitting light. It could be used for soft robots that move more naturally and dynamically display information, including health data. The Cornell press release invites us to imagine “a health care robot that displays a patient’s temperature and pulse, and even reacts to a patient’s mood.”

The material uses a “hyper-elastic light-emitting capacitor” made of layers of transparent hydrogel electrodes surrounding an insulating elastomer sheet. The elastomer changes luminance and capacitance when stretched or rolled, allowing soft robots to sense their actuated state and environment and to communicate optically. It also enables small soft robots to crawl.
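
The stretch sensing follows from the parallel-plate formula C = εA/d: equibiaxially stretching an incompressible dielectric by a factor λ multiplies its area by λ² and thins it by λ², so capacitance grows roughly as λ⁴ (ignoring fringe fields). A quick illustration:

```python
def relative_capacitance(stretch):
    """Capacitance of an incompressible parallel-plate stretch sensor,
    relative to rest, under equibiaxial stretch: area scales with
    stretch**2 and thickness with 1/stretch**2, so C/C0 = stretch**4."""
    return stretch ** 4

for s in (1.0, 2.0, 3.0):
    print(f"stretch {s:.0f}x -> capacitance {relative_capacitance(s):.0f}x")
```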

Click to view Cornell University video.


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences