Category Archives: Apps

Phone camera + machine learning detect concussion


Shwetak Patel and UW colleagues have developed PupilScreen, an app that uses a phone’s camera to detect concussion by examining the pupil.

The phone’s video camera and flash assess the eye’s pupillary light reflex, measuring the pupil-size changes associated with concussion.  Machine learning algorithms then confirm the diagnosis.
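PupilScreen’s actual pipeline is not reproduced here; as an illustration of the kind of features a pupillary-light-reflex analysis extracts, the sketch below tracks a pupil-radius series over video frames and computes fractional constriction and peak constriction velocity (all names, numbers, and thresholds are hypothetical, not PupilScreen’s):

```python
import numpy as np

def plr_features(radii, fps=30.0):
    """Extract simple pupillary-light-reflex features from a per-frame
    pupil-radius series (in pixels), assuming the flash fires at t = 0.5 s."""
    radii = np.asarray(radii, dtype=float)
    baseline = radii[: int(0.5 * fps)].mean()           # mean pre-flash radius
    constriction = (baseline - radii.min()) / baseline  # fractional constriction
    velocity = np.max(-np.diff(radii)) * fps            # peak constriction speed (px/s)
    return {"constriction": constriction, "velocity": velocity}

# Simulated healthy response: brisk constriction after the flash at t = 0.5 s
t = np.arange(0.0, 3.0, 1.0 / 30.0)
healthy = 40.0 - 12.0 * (1.0 - np.exp(-4.0 * np.clip(t - 0.5, 0.0, None)))
features = plr_features(healthy)
```

Features like these could then feed a classifier; a sluggish or shallow response (low velocity, small constriction) is the kind of signal associated with concussion.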

Hospitals typically check for concussion with a penlight, which is much less accurate than a pupillometer.

PupilScreen was tested on 48 healthy individuals and patients with traumatic brain injury. The team reported that it “diagnosed brain injuries with almost perfect accuracy using the app’s output alone.”

Click to view University of Washington video


Stanford professor Jamshid Ghajar will discuss the rapid concussion detection and treatment platform SyncThink at ApplySci’s Wearable Tech + Digital Health + Neurotech Boston conference, on September 19th at the MIT Media Lab.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator – Tom Insel – John Rogers – Jamshid Ghajar – Riccardo Sabatini – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen


ANNOUNCING WEARABLE TECH + DIGITAL HEALTH + NEUROTECH SILICON VALLEY – FEBRUARY 26-27, 2018 @ STANFORD UNIVERSITY – FEATURING: ZHENAN BAO – JUSTIN SANCHEZ – BRYAN JOHNSON – NATHAN INTRATOR – VINOD KHOSLA

Voice analysis as a diagnostic tool


Beyond Verbal recently used its emotion-detecting voice analysis app in an attempt to predict coronary artery disease in 150 study participants, 120 of whom had presented for angiography. The company claims to have identified 13 voice features associated with CAD – and one associated with a 19-fold increase in its likelihood.

The researchers said that the voice indicator was strongest when a subject described a negative experience. As this suggests an emotional connection to heart disease, could a neurotech wearable that provides brain-activity analysis validate the process? The company is already studying voice changes in autism and Parkinson’s disease, but this was the first time it used its voice diagnostic tool in a study of a disease outside the brain.


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth

Registration rates increase Friday, November 18th

Curated, enterprise-wide, health app prescription system


Mount Sinai Health System (with 7,100 physicians) has launched RxUniverse, an enterprise-wide, curated app prescription system. Included apps have been evaluated against published evidence, to help physicians adopt digital health solutions with an increased level of safety.

A pilot platform was launched across five clinical areas at Mount Sinai earlier this year. Physicians have since prescribed 2,000 apps. Mount Sinai’s startup company, Responsive Health, will license RxUniverse to other health systems.

RxUniverse was developed in the Sinai App Lab, under the direction of Ashish Atreja. Integrating the latest consumer medical and lifestyle apps throughout such an enormous system is no easy task — and ApplySci applauds Mount Sinai for its vision of delivering healthcare in an integrated, efficient, vetted, and patient-friendly manner.



Mental illness symptoms self-reported, empowering patients and alerting caregivers


Monsenso is an app created to help those who suffer from mental illness gauge their own symptoms. ApplySci applauds this and other attempts to empower the patient (as long as privacy is protected), which in itself could produce positive outcomes. Data is continuously sent to clinicians, and emergency interventions are facilitated.

Monsenso users complete daily self-assessments on their phones, describing medication taken, stress, and sleep quality. Physical activity, social interaction, and mobility data are also gathered by the phone.

The app identifies triggers and early warning signs, and notifies a doctor and/or emergency contact. Instant messages requesting emergency consultations can be sent to users, which could speed appointments and also let patients know that they are not alone.
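Monsenso’s actual trigger logic is proprietary; as a minimal sketch of what an early-warning rule over daily self-assessments might look like (field names, scales, and the three-day window are hypothetical, not Monsenso’s schema):

```python
def early_warning(entries, days=3):
    """Flag a possible relapse when stress rises and sleep quality falls
    across `days` consecutive daily self-assessments. Each entry is a dict
    with 'stress' and 'sleep' scores (0-10); these field names and the
    rule itself are illustrative only."""
    recent = entries[-days:]
    if len(recent) < days:
        return False  # not enough history to judge a trend
    stress_up = all(b["stress"] > a["stress"] for a, b in zip(recent, recent[1:]))
    sleep_down = all(b["sleep"] < a["sleep"] for a, b in zip(recent, recent[1:]))
    return stress_up and sleep_down
```

A real system would weight many more signals (activity, mobility, medication adherence) and tune thresholds per patient, but the shape — trend detection over self-reported scores, followed by a notification — is the same.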

The use of apps in mental health is growing rapidly, and is studied by Harvard’s John Torous, who frequently writes about research, ethical, and patient perspectives of digital psychiatry.



Phone sensors detect anemia, irregular breathing, jaundice


University of Washington’s Shwetak Patel and his UbiComp Lab colleagues develop non-invasive, smartphone-based tests meant to bring diagnostics to the masses.

HemaApp, a smartphone/light-source detection method for anemia, could be especially useful in areas lacking access to care. Anemia is extremely common in poor countries.

In a recent study, a phone camera was used to measure hemoglobin by capturing light passing through one’s finger. Video was recorded as light from the flash, plus additional LED lights and an incandescent bulb, shined through a fingertip. Changes in color indicated a red blood cell deficiency.
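HemaApp’s published method combines multiple wavelengths with a regression model; as a toy illustration of the underlying idea — hemoglobin absorbs green light strongly, so fingertip video color shifts with hemoglobin level — the sketch below compares color-channel intensities across frames (the frame data and values are synthetic):

```python
import numpy as np

def channel_ratio(frames):
    """Mean red/green intensity ratio across video frames of a
    transilluminated fingertip (each frame: H x W x 3 array).
    Less hemoglobin lets relatively more green light through,
    lowering this ratio."""
    reds = [frame[..., 0].mean() for frame in frames]
    greens = [frame[..., 1].mean() for frame in frames]
    return float(np.mean(reds) / np.mean(greens))

# Synthetic clips: the "anemic" frames transmit more green light
rng = np.random.default_rng(0)
normal = [rng.integers([180, 40, 30], [220, 60, 50], (64, 64, 3)) for _ in range(5)]
anemic = [rng.integers([180, 80, 60], [220, 100, 80], (64, 64, 3)) for _ in range(5)]
```

The real system maps such optical measurements to a hemoglobin estimate in g/dL via a calibrated model; this sketch only shows the direction of the signal.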

31 patients were tested using this method at Seattle Children’s Hospital. The system worked as well as a commercial LED anemia test, but not as well as a standard blood test.

Patel is also investigating the use of smartphones to make healthcare more accessible in other ways, including an app that uses a microphone to track breathing during sleep, and a camera that uses color-coded references to detect jaundice in babies.



“Data, not drugs” for elite sport performance


With equal parts modesty, enthusiasm, and wearable tech expertise, Olympic cyclist Sky Christopherson came to ApplySci’s recent Wearable Tech + Digital Health + NeuroTech NYC conference to “thank this community for helping the US Olympic team before the last Olympics with a lot of the same technology to help athletes prepare, using data and not drugs.”

Sky launched OAthlete at the 2012 London Olympics, to develop technology that will allow elite athletes — and the rest of us — to “fine tune” training to “understand how each athlete is responding as a unique individual.”

His mission is to promote the use of data, not drugs, to enhance performance healthfully and sustainably. Click to view his interview with StartUp Health’s Unity Stoakes at the conference.



App + wearable to quantify mindfulness


Apple’s Breathe meditation app follows the recent trend of using wearables to quantify mindfulness and improve mental health.

Every four hours (while the Apple Watch is worn), Breathe reminds the wearer to inhale and exhale for one to five minutes. Concentric circles can be watched as they shift on the screen, or the wearer can respond to haptic touches. Heart rate changes are shown and tracked.

ApplySci’s recent NeuroTech San Francisco conference ended with a session called “Transformative Tech – Mindfulness, stress, anxiety.” With more accurate mobile EEG technology, such tools will soon allow users to quantify their stress levels more reliably.

Sleep app uses wearable sensors, cloud analytics


The American Sleep Apnea Association, Apple, and IBM have begun a study of the impact of sleep quality on daily activity level, alertness, productivity, health, and medical conditions. iPhone and Apple Watch sensors and the ResearchKit framework collect data from healthy and unhealthy sleepers, which is sent to the Watson Health Cloud.

The SleepHealth app uses the watch’s heart rate monitor to detect sleep, and gathers movement data with its accelerometer and gyroscope. The app includes a “personal sleep concierge” and nap tracker, meant to help users develop better sleeping habits.
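SleepHealth’s actual detection model is not described in the source; a minimal heuristic combining the two signals the app collects — heart rate and movement — might look like this (the thresholds and one-minute epoch framing are illustrative assumptions, not SleepHealth’s algorithm):

```python
def sleep_epochs(heart_rate, movement, resting_hr=60.0, move_thresh=0.05):
    """Classify one-minute epochs as asleep (True) when heart rate dips
    below ~110% of the resting rate and accelerometer activity is low.
    heart_rate: bpm per epoch; movement: mean acceleration magnitude (g)
    per epoch. All thresholds here are illustrative only."""
    return [hr < 1.1 * resting_hr and m < move_thresh
            for hr, m in zip(heart_rate, movement)]
```

Combining cardiac and motion channels is the standard actigraphy-style approach: either signal alone misclassifies quiet wakefulness or restless sleep more often than the two together.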

Data is stored and analyzed on the Watson Health Cloud, allowing researchers to see common patterns. The long-term goal is to develop effective interventions.


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

DeepMind Health identifies complication risks

FacebooktwitterlinkedinFacebooktwitterlinkedin

Google has announced DeepMind Health, which creates apps (not yet AI-based) to identify patients’ complication risk; AI is expected to be integrated in the future. The group’s initial focus is acute kidney injury, with testing by the UK National Health Service and the Royal Free Hospital London.

The initial app, Streams, quickly alerts hospital staff to critical patient information. One of Streams’ designers, Chris Laing, said that “using Streams meant I was able to review blood tests for patients at risk of AKI within seconds of them becoming available. I intervened earlier and was able to improve the care of over half the patients Streams identified in our pilot studies.”
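Streams builds on the NHS’s AKI detection algorithm; as a simplified sketch in the spirit of KDIGO-style creatinine criteria (a toy rule for illustration, not Streams’ implementation — real detection also accounts for timing windows and baseline estimation):

```python
def aki_flag(creatinine_mg_dl, baseline_mg_dl):
    """Simplified KDIGO-style check: flag AKI risk if the latest serum
    creatinine rose by >= 0.3 mg/dL over the previous result, or reached
    >= 1.5x the patient's baseline. creatinine_mg_dl is the patient's
    result history, oldest first. A sketch only, not Streams' logic."""
    latest = creatinine_mg_dl[-1]
    acute_rise = len(creatinine_mg_dl) > 1 and latest - creatinine_mg_dl[-2] >= 0.3
    baseline_ratio = latest >= 1.5 * baseline_mg_dl
    return acute_rise or baseline_ratio
```

The point of an app like Streams is less the rule itself than the latency: pushing the flag to a clinician’s phone within seconds of the lab result, as Laing describes above.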

The company plans to integrate patient treatment prioritization features based on the Hark clinical management system.



Concussion app gauges recovery

FacebooktwitterlinkedinFacebooktwitterlinkedin

As the incidence of and focus on concussion grows, the neurotech community continues to develop diagnostic tools and treatment protocols.

One such tool is the NYU-developed Concussion Tracker app and corresponding study, designed to track self-reported physical and cognitive function after concussion. The app does not, however, diagnose concussion.

The goal is to monitor recovery with greater detail and regularity, to better evaluate treatment and identify persistent issues. The project is led by NYU’s Laura Balcer, Dennis Cardone, and Paul Testa.

Testa said the collected data will be used to develop a new recovery evaluation app that could, in the future, shape recovery protocols.

