Category Archives: Virtual Reality

VR training to reduce falls in Parkinson’s, dementia


Tel Aviv University’s Jeff Hausdorff has created a virtual reality treadmill system in an attempt to prevent falls in Parkinson’s and dementia patients.

Current interventions focus on improving muscle strength, balance, and gait. By using VR to integrate motor planning, attention, executive control, and judgement training, therapies can also address the cognitive issues associated with falls.

In a recent study of 282 participants, 146 did treadmill + VR training and 136 did treadmill training alone. In the VR group, patients’ foot movements were filmed and shown on a screen so that they could “see” their feet walking in real time. The game-like simulation included avoiding and stepping over puddles or hurdles, and navigating pathways. It also provided motivational feedback.

Fall rates were similar in both groups before the training. Six months after, those who participated in the VR intervention fell 50% less, while fall rates in the group that did not train with VR were unchanged. The biggest improvement was seen in Parkinson’s patients.
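The headline figure is a relative comparison of fall rates between the two arms. As a minimal illustration — using hypothetical per-group fall counts, not the study’s actual data — the relative reduction works out like this:

```python
# Hypothetical fall counts over a 6-month period (NOT the study's actual data)
def falls_per_participant(total_falls, n_participants):
    """Average number of falls per participant in a group."""
    return total_falls / n_participants

vr_rate = falls_per_participant(total_falls=150, n_participants=146)       # treadmill + VR
control_rate = falls_per_participant(total_falls=280, n_participants=136)  # treadmill only

# Relative reduction: how much lower the VR group's rate is vs. control
relative_reduction = 1 - vr_rate / control_rate
print(f"Relative fall-rate reduction: {relative_reduction:.0%}")
```

With these illustrative counts the VR group’s rate is about half the control group’s, mirroring the reported 50% result.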

Patients can receive the combined therapy at the Hausdorff-led Center for the Study of Movement Cognition and Mobility at Tel Aviv’s Ichilov Hospital.

Click to view the Tel Aviv Sourasky Medical Center video.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Roz Picard, Tom Insel, John Rogers and Nathan Intrator – September 19, 2017 at the MIT Media Lab

Eye tracking + VR to improve brain injury diagnosis, track recovery


Eye tracking technology, combined with VR, is proliferating, with myriad medical, gaming, and education applications.

SyncThink uses eye tracking, built into an Oculus Rift, to detect whether a person has the ability to keep the eyes synced with moving objects, in order to determine brain injury and track recovery.

The company has been granted 10 patents for eye-tracking hardware and analytical techniques for stimulating, measuring, and training brain attention networks. Its technology has been used to detect concussions on the field and to evaluate soldier readiness and brain impairment after injury. The company describes additional applications, including characterizing and monitoring fatigue, performance, and developmental or neurodegenerative conditions.
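SyncThink’s actual algorithms are proprietary; as a simplified sketch of the underlying idea, eye–target synchronization could be scored as the mean and variability of the distance between gaze and a moving target (the circular target and gaze samples below are hypothetical):

```python
import math

def sync_error(gaze_points, target_points):
    """Mean and variance of the gaze-to-target distance.

    Lower values mean the eyes track the moving target closely;
    a large or highly variable error could flag impaired attention."""
    dists = [math.dist(g, t) for g, t in zip(gaze_points, target_points)]
    mean = sum(dists) / len(dists)
    var = sum((d - mean) ** 2 for d in dists) / len(dists)
    return mean, var

# Target moving on a unit circle; gaze lags slightly behind (hypothetical samples)
target = [(math.cos(0.1 * i), math.sin(0.1 * i)) for i in range(100)]
gaze = [(math.cos(0.1 * i - 0.05), math.sin(0.1 * i - 0.05)) for i in range(100)]

mean_err, err_var = sync_error(gaze, target)
```

A steady small lag yields a small, nearly constant error; erratic tracking would show up as a large variance.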

Eyefluence, which was today acquired by Google, creates head-mounted display AR, VR, and mixed reality interfaces. According to the company, its AR application allows critical care professionals to access patient data with their eyes while their hands treat the injured. Its VR integrations humanize experiences, reduce nausea, optimize image resolution, and increase speed.

ApplySci believes that the next step in AR/VR enhancement is integrating mobile EEG into headsets, combining eye tracking, GSR, and brainwave data into various applications.


ApplySci’s 6th   Wearable Tech + Digital Health + NeuroTech Silicon Valley  –  February 7-8 2017 @ Stanford   |   Featuring:   Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari –Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth

VR for early neurodegenerative disease detection, personalized rehabilitation


Tomsk Polytechnic and Siberian State University scientists David Khachaturyan and Ivan Tolmachov have developed a VR-based neurodegenerative disorder diagnosis system. The goal is the early detection and treatment of diseases, including MS and Parkinson’s. The next step is the use of systems like Glass and Kinect for personalized rehabilitation.

Fifty subjects, both healthy and already diagnosed, used VR headsets, a non-contact sensor controller, and a mobile platform during a variety of activities. Changes in posture and balance were detected and compared to a human skeleton model of 20 points on the body. Deviations from the model indicated disease. Differences in the reactions of those with different diseases were also noted: Parkinson’s patients experienced hand tremors, and others experienced compromised coordination.
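The researchers’ exact deviation measure isn’t described; the core idea — comparing tracked joints against a 20-point reference skeleton and flagging large departures — might be sketched like this (all coordinates, joint names, and the threshold are hypothetical):

```python
import math

def joint_deviations(tracked, reference):
    """Per-joint Euclidean distance between a tracked skeleton and the
    reference model (both dicts of joint name -> (x, y, z), in metres)."""
    return {name: math.dist(tracked[name], reference[name]) for name in reference}

def flag_abnormal(deviations, threshold=0.05):
    """Joints deviating more than `threshold` metres from the model."""
    return [name for name, d in deviations.items() if d > threshold]

# Hypothetical data: a right hand trembling away from the reference pose
reference = {"head": (0, 1.7, 0), "hand_right": (0.4, 1.0, 0.1)}
tracked = {"head": (0, 1.69, 0), "hand_right": (0.48, 1.0, 0.1)}

devs = joint_deviations(tracked, reference)
abnormal = flag_abnormal(devs)   # ["hand_right"]
```

In a full 20-joint model, which joints deviate — and how they deviate over time — is what would distinguish, say, a hand tremor from a coordination deficit.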

A clinical trial  will be completed in 2017.



Anxiety-reducing VR game


Deep VR teaches breathing techniques meant to reduce the anxiety of users during a game. Its developers believe that the skills learned can also help manage stress during daily life.

It is the basis of a Radboud University study, in the lab of Isabela Granic, that aims to alleviate anxiety in children. 100 children have already been studied; the findings will guide the game’s future design and the development of its sensor.

Exposure therapy will soon be added, to shift the experience from sedative to mildly frightening, in an attempt to systematically desensitize those with anxiety.
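Deep is reported to sense breathing with a sensor worn around the diaphragm; hardware details aside, the kind of mapping involved — a de-noised breathing signal driving a calming game mechanic — might look like this (function names, constants, and the control scheme are illustrative assumptions):

```python
def smooth(signal, alpha=0.2):
    """Exponential moving average to de-noise a raw breathing signal."""
    out, prev = [], signal[0]
    for x in signal:
        prev = alpha * x + (1 - alpha) * prev
        out.append(prev)
    return out

def breath_to_speed(expansion, rest=0.0, full=1.0, max_speed=2.0):
    """Map normalised diaphragm expansion to forward drift speed,
    so slow, deep breaths produce smooth, strong in-game movement."""
    norm = max(0.0, min(1.0, (expansion - rest) / (full - rest)))
    return norm * max_speed

# Hypothetical raw sensor samples -> per-frame movement speeds
speeds = [breath_to_speed(x) for x in smooth([0.0, 0.5, 1.0])]
```

Rewarding the player only for slow diaphragmatic breathing is what turns the game itself into the relaxation exercise.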

Click to view Deep VR video


Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

“Mixed Reality” headset could support surgery, rehab, learning


Magic Leap has unveiled its “mixed reality” headset, in which virtual objects are integrated into the real world. In addition to obvious gaming and entertainment applications, the system could be used in healthcare (including surgery, surgery preparation, and orthopedic rehabilitation) and education.

The company remains vague in its description of its technology, but head and hand tracking functionality appear to have been added. According to founder Rony Abovitz, “Magic Leap doesn’t trick the brain. Rather it shoots photons into the eye that stimulate the cones and rods as if the hologram were real, or neurologically true.”

Click to view Magic Leap video.



VR + sound to control pain


In a recent study, York St. John University researchers demonstrated the use of virtual reality headsets to control pain. Discomfort was further reduced when sound was incorporated into the process.

In the experiment, a small group of adults submerged one hand in ice water while playing an Oculus VR-based game, with and without sound. While playing and hearing accompanying sounds, subjects could tolerate the discomfort for 79 seconds on average. Without sound, tolerance dropped to 56 seconds. Without any VR support, they could tolerate the cold water for 30 seconds.
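Using the reported averages, the relative gains over the no-VR baseline are straightforward to compute:

```python
baseline = 30   # seconds tolerated with no VR (reported average)
vr_only = 56    # seconds with VR, no sound
vr_sound = 79   # seconds with VR and sound

def improvement(seconds, baseline=baseline):
    """Relative increase in tolerance time over the no-VR baseline."""
    return (seconds - baseline) / baseline

print(f"VR alone:      {improvement(vr_only):.0%} longer")   # ~87% longer
print(f"VR with sound: {improvement(vr_sound):.0%} longer")  # ~163% longer
```

The added audio channel accounts for a substantial share of the overall effect, which is why the sound condition is highlighted.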

If verified with a much larger group of subjects and a broader spectrum of pain and discomfort, this discovery could bring a non-drug pain-reduction method to individuals at home.



Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center


VR + sensors improve accuracy, speed of PTSD diagnosis


PTSD is often misdiagnosed, as its symptoms can be confused with those of depression. Many clinicians lack the expertise needed to distinguish the condition, and therefore might not provide appropriate treatment.

To address this widespread dilemma, Draper has developed a diagnostic system that combines virtual reality data with psychophysiological sensors. The sensors monitor heart rate, sweat, and pupil diameter, while subjects experience different types of audio and visual stimuli.

Stimuli customized to a patient’s personal traumatic experience can generate robust psychophysiological responses; however, the time needed to tailor stimuli is often not available in a point-of-care setting. Draper’s solution uses generalized stimuli that result in quicker, more accurate assessments.
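Draper has not published its scoring method; one simplified way multi-sensor reactivity to a stimulus might be summarised — z-scoring each channel’s change against its resting baseline and averaging — is sketched below (channel names and all readings are hypothetical):

```python
from statistics import mean, stdev

def reactivity_score(baseline, stimulus):
    """Average z-scored change across physiological channels.

    baseline/stimulus: dicts of channel -> list of samples.
    Higher scores indicate a stronger response to the stimulus."""
    zs = []
    for channel, base in baseline.items():
        mu, sigma = mean(base), stdev(base)
        zs.append((mean(stimulus[channel]) - mu) / sigma)
    return mean(zs)

# Hypothetical resting vs. stimulus-period readings
baseline = {
    "heart_rate": [62, 64, 63, 61, 65],
    "skin_conductance": [2.0, 2.1, 1.9, 2.0, 2.2],
    "pupil_mm": [3.0, 3.1, 2.9, 3.0, 3.1],
}
stimulus = {
    "heart_rate": [78, 82, 80, 79, 81],
    "skin_conductance": [3.5, 3.6, 3.4, 3.5, 3.7],
    "pupil_mm": [4.0, 4.1, 3.9, 4.0, 4.2],
}

score = reactivity_score(baseline, stimulus)   # strongly positive
```

Combining channels this way is what lets a generalized stimulus still separate strong responders from weak ones without patient-specific tailoring.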

Additional research will address larger samples over a wide geographic area, as well as patients suffering from multiple mental health issues and chronic diseases.

According to Dr. Philip Parks, who oversees Draper’s neurotechnology portfolio: “Once diagnosed with a particular disorder, such as depression, most mental health patients get relatively the same treatment even though their symptoms and response to treatment choices may be quite different. We hope that one day these technologies will help clinicians ensure that patients get the best possible medication and other treatments at the right time.”

Dr. Parks will be a featured speaker at NeuroTech NYC on June 8th at the New York Academy of Sciences.



Virtual clinic uses apps, VR, data, wearables in remote care


USC’s Center for Body Computing, led by Professor Leslie Saxon, has created the Virtual Care Clinic, featuring vetted, best-of-class partners providing integrated remote healthcare solutions. The eight initial partners include Doctor Evidence, IMS Health, Karten Design, Medable, Planet Grande, Proteus Digital Health, and VSP Global.

Mobile apps, virtual doctors, data collection and analysis systems, and diagnostic and monitoring wearables will provide on-demand access to care.



AR + Kinect games assist the hearing, visually impaired


Reflex Arc‘s augmented reality games work with Microsoft Kinect to help children learn sign language and assist the visually impaired with exercise. Boris teaches sign language through gesture, and The Nepalese Necklace helps those with limited or no sight with mobility training.

The games encourage exercise and are designed to help blind children learn about spatial awareness, balance, coordination, and orientation.


Personalized medicine via “medical avatars”


The European Commission’s DISCIPULUS project, led by UCL researcher Vanessa Diaz, aims to build a roadmap towards the “digital patient”. This dynamic, virtual version of an individual, which Diaz describes as a “medical avatar”, could run simulations of treatments to find the best course of action.

If a symptomatic patient arrives at the hospital, a virtual “twin” is created based on scans. Multiple tests are run on the avatar, and new scans are continuously uploaded, to determine the outcomes of various treatments.

The best and most tailored plan is then carried out. The patient is discharged with wearables, or other sensor-based devices, to monitor key metrics at home and continue updating the virtual twin during recovery.
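A hedged sketch of that final step — wearable readings folding into the twin’s record and flagging out-of-range values — might look like this (the structure, field names, and thresholds are illustrative, not DISCIPULUS’s actual design):

```python
from dataclasses import dataclass, field

@dataclass
class VirtualTwin:
    """Minimal stand-in for a patient's digital-twin record."""
    patient_id: str
    vitals: dict = field(default_factory=dict)   # metric -> latest reading
    history: list = field(default_factory=list)  # (timestamp, metric, value)

    def update(self, timestamp, metric, value):
        """Fold a new wearable reading into the twin."""
        self.vitals[metric] = value
        self.history.append((timestamp, metric, value))

    def flag(self, metric, low, high):
        """True if the latest reading falls outside the expected range."""
        v = self.vitals.get(metric)
        return v is not None and not (low <= v <= high)

twin = VirtualTwin("patient-001")
twin.update("2016-04-05T09:00", "heart_rate", 72)
twin.update("2016-04-05T10:00", "heart_rate", 118)
alert = twin.flag("heart_rate", low=50, high=100)   # True
```

Keeping the full history, not just the latest value, is what would let the avatar re-run treatment simulations against the patient’s actual recovery trajectory.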