Category Archives: Assistive Technologies

Algorithm-adjusted exoskeleton enables movement optimization, personalization


Juanjuan Zhang, Steven Collins, and CMU colleagues have developed an algorithm, called “human-in-the-loop optimization,” that enables ankle exoskeletons to adapt to the wearer’s gait.

In a recent study, the system used indirect calorimetry to measure metabolic rate and adjusted ankle torque in real time while users walked, ran, and carried a load.

At this stage, a treadmill and extensive monitoring equipment are required. The goal is for users to be fitted in a clinic and walk away with a programmed profile.
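To make the tuning loop concrete, here is a minimal Python sketch of a human-in-the-loop optimizer. Everything in it is an illustrative assumption: the two torque-profile parameters, the simulated cost surface, and the shrinking random search are stand-ins for the study’s actual respirometry measurements and optimizer.

```python
import random

random.seed(0)  # reproducibility for this demo only

def metabolic_cost(params):
    """Stand-in for indirect calorimetry. In the real system each
    candidate torque profile costs minutes of breath measurement;
    here we fake a cost surface with its minimum at (0.5, 0.3)."""
    peak_torque, peak_time = params
    return ((peak_torque - 0.5) ** 2 + (peak_time - 0.3) ** 2
            + random.gauss(0, 0.001))  # measurement noise

def optimize(generations=20, pop_size=8):
    """Toy search loop: sample torque profiles around the current best,
    keep the lowest-cost candidate, and shrink the search radius."""
    best = (0.8, 0.6)  # initial hand-tuned profile
    best_cost = metabolic_cost(best)
    radius = 0.3
    for _ in range(generations):
        for _ in range(pop_size):
            cand = (best[0] + random.uniform(-radius, radius),
                    best[1] + random.uniform(-radius, radius))
            cost = metabolic_cost(cand)
            if cost < best_cost:
                best, best_cost = cand, cost
        radius *= 0.9  # narrow in as the profile personalizes
    return best

personalized_profile = optimize()
```

The clinic-fitting goal described above corresponds to running this loop once per patient and saving the resulting parameters as their programmed profile.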


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator – Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen – Daniela Rus

Registration rates increase Friday, July 7th

 

“Cybathlon” features robotic exoskeleton, BCI competitions


ETH professor Robert Riener‘s first Cybathlon will feature competitions using robotic prostheses and brain-computer interfaces. Disabled participants will compete in brain-controlled races and exoskeleton-assisted tasks. Many tasks will include common obstacles, such as doors, ramps, and stairs, as the goal is to develop technology that increases independence and makes the activities of daily living less difficult.

Click to view the Cybathlon trailer


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth

 

Wheelchair allows quadriplegics to stand, navigate various surfaces


UPnRIDE is a wheelchair that brings quadriplegics from a sitting to an upright position, allowing them to stand and be mobile. A user’s center of gravity stays in the middle of the auto-balancing device, keeping it stable once the user is standing. The multi-function wheelchair can negotiate various surfaces, including sidewalks and slopes.

It was developed by Amit Goffer, creator of the ReWalk robotic exoskeleton. ReWalk, which allows users to walk and climb stairs, requires use of the upper body to hold crutches; UPnRIDE can be used by those paralyzed from the neck down.



BCI-controlled wheelchair


Miguel Nicolelis has developed a brain-computer interface that allows monkeys to steer a robotic wheelchair with their thoughts. The study is meant to demonstrate that humans could eventually do the same.

Signals from hundreds of neurons simultaneously recorded in two brain regions were translated into the real-time operation of a wheelchair.
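The translation step can be pictured as a linear decoder that maps each time-bin of firing rates to a motion command. The sketch below is a heavily simplified illustration: the neuron count, weights, and speed limits are invented, and the study’s actual decoder was fit to hundreds of simultaneously recorded neurons.

```python
# Hypothetical linear decoder: one weight vector per output channel,
# normally fit by regressing neural activity against observed motion
# during a calibration phase.
N_NEURONS = 6  # toy size; the real study recorded hundreds

W_FORWARD = [0.4, 0.1, -0.2, 0.3, 0.0, 0.1]   # invented weights
W_TURN    = [-0.1, 0.3, 0.2, -0.3, 0.2, 0.0]  # invented weights

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def decode(firing_rates):
    """Translate one time-bin of firing rates (Hz) into wheelchair
    commands: (forward m/s, turn rad/s), clamped to a safe envelope."""
    forward = sum(w * r for w, r in zip(W_FORWARD, firing_rates))
    turn = sum(w * r for w, r in zip(W_TURN, firing_rates))
    return clamp(forward, 0.0, 1.0), clamp(turn, -0.5, 0.5)

command = decode([2.0, 1.0, 0.5, 1.5, 0.0, 1.0])
```

Running this per time-bin, the chair continuously tracks the decoded intent; the real-time character comes from simply re-evaluating the same linear map on every new bin of spikes.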

Nicolelis said: “In some severely disabled people, even blinking is not possible. For them, using a wheelchair or device controlled by noninvasive measures like an EEG may not be sufficient. We show clearly that if you have intracranial implants, you get better control of a wheelchair than with noninvasive devices.”

ApplySci looks forward to the day when non-invasive methods will allow similar brain-driven functioning for the disabled.


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

Modular exoskeleton adjusts to user’s height, ability


SuitX is a modular, adjustable exoskeleton that adapts to a user’s height and disability. For example, it can assist only one leg if the other does not require help. It was developed by Berkeley professor Homayoon Kazerooni. As with other exoskeletons, it allows users to move their hips and knees with small motors attached to orthotics. Wearers can walk at up to 1.1 miles per hour by controlling buttons incorporated into the crutches.
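Button-driven walking of this kind amounts to stepping a controller through a gait cycle. A minimal sketch, assuming a hypothetical four-phase cycle — the phase names and transitions below are illustrative, not SuitX’s actual controller:

```python
# Hypothetical gait-phase state machine: each crutch-button press
# advances the exoskeleton one phase; the motors execute the phase.
TRANSITIONS = {
    "stand": "left-swing",
    "left-swing": "double-support",
    "double-support": "right-swing",
    "right-swing": "left-swing",  # cycle continues between swings
}

def next_state(state, button_pressed):
    """Advance one gait phase per button press; hold otherwise."""
    return TRANSITIONS[state] if button_pressed else state
```

Keeping the wearer in the loop this way — no phase change without a press — is a common safety choice for powered gait devices.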

The suit weighs 27 pounds and costs $40,000. A backpack battery provides power for 8 hours, and walking data can be monitored.

Kazerooni said that his goal is to build a version for children, as those with neurological disorders can require intensive walking training to remain mobile. He is currently researching the suit’s benefits for adult stroke patients.



 

“Augmented attention” wearable assists the visually impaired


OrCam is a disruptive artificial vision company that creates assistive devices for the visually impaired. It is led by Hebrew University professor Amnon Shashua.

MyMe, its latest product, uses artificial intelligence to respond to audio and visual information in real time. A clip-on camera and Bluetooth earpiece create what the company calls an “augmented attention” experience, meant to enrich interactions.

The device is aware of daily actions, including people we meet, conversation topics, visual surroundings, food we eat, and activities we participate in. Visual and audio processing functions serve as an extension of the wearer’s awareness. A built-in fitness tracker will also be included.

More details will be available after MyMe is unveiled at CES next week.



 

AR + Kinect games assist the hearing, visually impaired


Reflex Arc‘s augmented reality games work with Microsoft Kinect to help children learn sign language and assist the visually impaired with exercise. Boris teaches sign language through gestures, and The Nepalese Necklace provides mobility training for those with limited or no sight.

The games encourage exercise and are designed to help blind children learn about spatial awareness, balance, coordination, and orientation.


Drone guides visually impaired runners


University of Nevada‘s Eelke Folmer has built a drone system that guides visually impaired runners around a track, allowing them to run without a sighted guide. A downward-facing camera follows the lines on the track, and a second camera tracks a marker on the runner’s shirt. The quadcopter flies at eye level, 10 feet ahead of the runner, guiding with sound and adjusting its speed to match the runner’s pace.
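The pacing behavior can be sketched as a simple proportional controller. The gain, deadband, and cue names below are hypothetical; only the 10-foot lead and sound-based guidance come from the description above.

```python
LEAD_FT = 10.0  # target lead distance ahead of the runner

def drone_speed(runner_speed, gap_ft, k=0.5):
    """Match the runner's pace while correcting toward the 10 ft lead.
    gap_ft is the current drone-to-runner distance from the rear camera;
    the gain k is an illustrative choice."""
    return max(0.0, runner_speed + k * (LEAD_FT - gap_ft))

def audio_cue(lateral_offset_ft, deadband=0.5):
    """Map the runner's offset from the track line to a steering sound
    (cue scheme invented for illustration)."""
    if lateral_offset_ft > deadband:
        return "beep-left"   # drifted right, cue a move left
    if lateral_offset_ft < -deadband:
        return "beep-right"
    return "steady"
```

With this structure, a drone that has fallen too close (gap below 10 ft) speeds up, one that is too far ahead slows down, and the runner only hears corrective beeps outside a small deadband.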

Professor Folmer’s lab is developing multiple projects aimed at improving the lives of those with limited sight.


Virtual coaching for TBI patients


The Office of Naval Research is developing MOVER (Mobile, Virtual Enhancements for Rehabilitation) to help TBI patients maintain therapy regimens. Confusion, forgetfulness, or depression can prevent injured veterans from completing the exercises necessary for rehabilitation. Featured movements include lunges, knee raises, and squats, which are standard in TBI therapy.

When users turn on a computer and camera, they stand still while MOVER maps a virtual “skeleton” of brightly colored lines and shapes. The skeleton mirrors their movements through each exercise. To increase visibility, users can connect MOVER to a television using Microsoft Kinect.

The system coaches by displaying pop-up text boxes or color shading in areas of the virtual skeleton, highlighting where and how to correct one’s form.
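Form coaching of this kind typically reduces to joint-angle math on the tracked skeleton. The sketch below assumes 2-D joint coordinates and invented angle thresholds; MOVER’s actual criteria are not described in the source.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c, e.g. the knee
    angle from hip, knee, and ankle positions in the virtual skeleton."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def squat_feedback(knee_angle, target=90.0, tolerance=15.0):
    """Return the kind of corrective cue the system might display;
    target and tolerance are illustrative values."""
    if abs(knee_angle - target) <= tolerance:
        return "good depth"
    return "bend knees more" if knee_angle > target else "too deep"
```

The pop-up text and color shading described above are then just this feedback string rendered onto the skeleton region that triggered it.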

A six-month pilot study of the software, with 40 TBI patients and therapists at Spaulding Rehabilitation Hospital, will soon begin.

Connected wheelchair improves safety, comfort


AT&T and Permobil have developed a connected wheelchair concept meant to increase user comfort and independence.

The system monitors seating position, cushion pressure (to prevent pressure ulcers), whether a chair has turned on its side, battery level, and location, and it also predicts the need for maintenance.
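Monitoring of this kind reduces to threshold checks on a telemetry stream. A minimal sketch, with invented field names and thresholds (not Permobil’s actual schema):

```python
def check_telemetry(sample):
    """Scan one telemetry sample and return caregiver alerts.
    All field names and thresholds here are illustrative."""
    alerts = []
    if sample.get("tilted"):
        alerts.append("chair has tipped over")
    if sample["cushion_minutes_since_shift"] > 60:
        alerts.append("reposition to relieve cushion pressure")
    if sample["battery_pct"] < 20:
        alerts.append("battery low")
    return alerts

sample = {"tilted": False, "cushion_minutes_since_shift": 75, "battery_pct": 15}
alerts = check_telemetry(sample)
```

Maintenance prediction would layer on top of this, trending the same fields over time rather than reacting to single readings.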

The data, including location, can be remotely accessed from the cloud and shared with caregivers.