Thin, flexible, adhesive, continuous, cuffless blood pressure sensor


Zhao Ni and Yuan-Ting Zhang of the Chinese University of Hong Kong have developed an ultra-thin, waterproof, cuffless blood pressure sensor that can be worn on the wrist, woven into clothes or bed sheets, or integrated into an earpiece. The monitor detects blood flow and tracks health data through the color reflected by the skin and image depth, providing continuous, wireless monitoring and abnormality alerts.
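For a sense of how such an optical signal is typically processed, here is a minimal Python sketch that estimates pulse rate from a reflected-light trace. It is a generic illustration only; the CUHK team's actual blood pressure algorithm has not been published, and the sampling rate is an assumption.

```python
import numpy as np

def pulse_rate_bpm(reflected_intensity, fs=100.0):
    """Estimate pulse rate from a reflected-light intensity trace.

    Generic optical-sensor processing for illustration; not the CUHK
    sensor's blood pressure algorithm.
    """
    signal = np.asarray(reflected_intensity, dtype=float)
    signal -= signal.mean()
    threshold = 0.5 * signal.std()
    # Each heartbeat appears as a local maximum in the pulsatile component
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > threshold
             and signal[i] > signal[i - 1]
             and signal[i] >= signal[i + 1]]
    duration_min = len(signal) / fs / 60.0
    return len(peaks) / duration_min
```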

Professor Zhao believes that in the future the sensor could use AI to improve itself. She intends to broaden its applications to include monitoring breathing rate and blood oxygen level, so that it could also replace a finger-worn pulse oximeter.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen – Daniela Rus

Preferred registration rates available through Friday, June 23rd

 

Sensor-embedded respiration, feeding tubes in the ICU


Art Medical’s sensor-embedded feeding and respiratory tubes provide continuous measurement of gastric reflux and saliva. These secretions, currently measured manually in the ICU, can cause aspiration pneumonia and ventilator-associated pneumonia, which can prolong hospital stays or cause death. Urine output is also measured, in an effort to help prevent kidney failure.

The company’s digestive, respiratory, and urinary sensors gather data in real time; the data is processed in the cloud and sent automatically to the medical team for immediate treatment.
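As an illustration of the kind of automated alerting such a pipeline enables, here is a minimal Python sketch; the metric names and thresholds are assumptions for illustration, not Art Medical's.

```python
# Illustrative thresholds only; Art Medical's actual limits are not public.
ALERT_RULES = {
    "gastric_reflux_ml_per_hr": ("above", 50.0),
    "saliva_ml_per_hr":         ("above", 30.0),
    "urine_output_ml_per_hr":   ("below", 20.0),   # low output flags kidney risk
}

def check_reading(metric, value):
    """Return an alert string if a real-time reading crosses its threshold."""
    direction, limit = ALERT_RULES[metric]
    crossed = value > limit if direction == "above" else value < limit
    return f"ALERT: {metric} {direction} {limit}" if crossed else None

print(check_reading("urine_output_ml_per_hr", 12.0))
```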


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen

Preferred registration rates available through Friday, June 23rd.

 

 

Adhesive patch + nose wearable detect sleep apnea


Somnarus has developed a disposable, adhesive patch that detects obstructive sleep apnea at home.

The SomnaPatch is worn on the forehead, with an additional piece on the nose. It records nasal pressure, blood oxygen saturation, pulse rate, respiratory effort, body position, and how long a patient is asleep.
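For context, apnea events are conventionally scored as a large drop in airflow lasting at least ten seconds. The Python sketch below applies that standard rule to a nasal pressure trace; the sampling rate and baseline window are assumptions, and this is not Somnarus's scoring algorithm.

```python
import numpy as np

def count_apnea_events(nasal_pressure, fs=25.0, drop_frac=0.9, min_sec=10.0):
    """Count apnea-like events in a nasal pressure trace (simplified).

    Uses the common sleep-medicine rule of a >=90% airflow drop lasting
    >=10 s relative to a moving baseline; parameters are illustrative.
    """
    amplitude = np.abs(np.asarray(nasal_pressure, dtype=float)
                       - np.mean(nasal_pressure))
    window = int(60 * fs)                       # 60 s moving baseline
    baseline = np.convolve(amplitude, np.ones(window) / window, mode="same")
    low = amplitude < (1.0 - drop_frac) * baseline

    events, run = 0, 0
    for flag in low:
        run = run + 1 if flag else 0
        if run == int(min_sec * fs):            # count once when the run hits 10 s
            events += 1
    return events
```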

A 174-patient study showed that results from the SomnaPatch matched standard in-lab polysomnography 87% of the time.

If the device is proven effective in larger studies, it could be a cheaper, more comfortable alternative to lab-based sleep studies.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen

Preferred registration rates available through Friday, June 9th.

BCI-controlled exoskeleton helps motor recovery in stroke


Ipsihand, developed by Eric Leuthardt and Washington University colleagues, is a brain-controlled glove that helps reroute hand control to an undamaged part of the brain. The system uses a glove or brace on the hand, an EEG cap, and an amplifier.

Each hand is controlled by the opposite side of the brain, so when a stroke damages one hemisphere, the hand on the other side of the body becomes difficult to control.

According to Leuthardt, the idea of Ipsihand is that if one can “couple those motor signals that are associated with moving the same-sided limb with the actual movements of the hand, new connections will be made in your brain that allow the uninjured areas of your brain to take over control of the paralyzed hand.”

Ipsihand’s cap detects intention signals to open or close the hand, which the system then amplifies. The brace opens or closes in a pincer-like grip around the hand inside it, bending the fingers and thumb to meet.
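A common way to turn motor-intention EEG into a control signal is to watch for suppression of mu-band (8-13 Hz) power over motor cortex. The Python sketch below illustrates that general approach; the band, threshold, and brace interface are assumptions, not Ipsihand's published decoder.

```python
import numpy as np

def intends_to_move(eeg_window, fs=256.0, band=(8.0, 13.0), threshold=0.6):
    """Detect movement intention as mu-band power suppression (illustrative)."""
    eeg = np.asarray(eeg_window, dtype=float)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    mu = power[(freqs >= band[0]) & (freqs <= band[1])].mean()
    broadband = power[(freqs >= 1.0) & (freqs <= 40.0)].mean()
    return (mu / broadband) < threshold      # suppressed mu -> intends to move

def drive_brace(brace, eeg_window):
    """Close the pincer grip on detected intention, open it otherwise."""
    if intends_to_move(eeg_window):
        brace.close()      # 'brace' is a hypothetical actuator interface
    else:
        brace.open()
```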


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary

EEG detects infant pain


Caroline Hartley and Oxford colleagues studied 72 infants during painful medical procedures. Using EEG, they found a signature change in brain activity about a half-second after a painful stimulus. They seek to understand its use in monitoring and managing infant pain, as well as the use of EEG in adult pain treatment.

EEG is more precise than current pain assessments based on heart rate, oxygen saturation, and facial expression, which are also affected by stressful but non-painful events.

In one experiment, 11 out of 12 infants showed a decreased pain-related EEG signal after doctors applied a topical anesthetic to their feet. A new study uses EEG to test the efficacy of morphine in infants, whose skin and intestines absorb drugs differently than adults do.
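A simplified version of such an analysis extracts an EEG epoch around each stimulus and measures the baseline-corrected amplitude roughly half a second later. The Python sketch below illustrates the idea; the window bounds and sampling rate are assumptions, not the Oxford team's template.

```python
import numpy as np

def pain_response_amplitude(eeg, stim_sample, fs=250.0,
                            window=(0.4, 0.7), baseline=(-0.5, 0.0)):
    """Baseline-corrected EEG amplitude ~0.5 s after a stimulus (illustrative)."""
    eeg = np.asarray(eeg, dtype=float)

    def segment(t0, t1):
        return eeg[stim_sample + int(t0 * fs): stim_sample + int(t1 * fs)]

    post = segment(*window)          # activity ~0.4-0.7 s after the stimulus
    pre = segment(*baseline)         # pre-stimulus baseline
    return post.mean() - pre.mean()
```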

EEG is being miniaturized by companies such as Neurosteer, making it an increasingly viable option for continuous pain, attention, and consciousness monitoring and treatment optimization.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary

 

Graphene sensor detects asthma attacks early


Azam Gholizadeh, Clifford Weisel, and Rutgers colleagues have created a graphene sensor for early molecular diagnosis of asthma attacks. The goal is to develop wearables that alert users to take medicine and help determine appropriate dosages.

Current non-invasive detection methods, such as spirometry, are limited in characterizing the nature and degree of airway inflammation, and require expensive, bulky equipment.

The miniaturized electrochemical sensor uses reduced graphene oxide to measure nitrite in exhaled breath condensate. Its rapid measurements can help asthma sufferers determine whether air pollutants are affecting them, better manage their medication and physical activity, and, ideally, prevent complications, hospitalizations, and even deaths.
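Electrochemical sensors of this kind are typically calibrated so that current rises linearly with analyte concentration. The Python sketch below shows that generic conversion step; the sensitivity, baseline, and alert threshold are placeholders, not the Rutgers sensor's figures.

```python
def nitrite_concentration_uM(current_uA, sensitivity_uA_per_uM=0.2, baseline_uA=0.05):
    """Convert sensor current to nitrite concentration via a linear calibration.

    Sensitivity and baseline values are illustrative placeholders.
    """
    return max(current_uA - baseline_uA, 0.0) / sensitivity_uA_per_uM

# Flag elevated airway inflammation above an assumed threshold (in micromolar)
if nitrite_concentration_uM(0.85) > 2.0:
    print("Elevated breath nitrite: possible airway inflammation")
```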


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary

REGISTER BY JUNE 9 AND SAVE $400

 

Earbud sensor reportedly measures blood pressure, dehydration

As health sensors become more discreet and fused with commonly worn devices, Kyocera has integrated a tiny optical sensor into its earbud. The hybrid music/phone/health wearable measures blood flow in hypodermal tissue using laser Doppler velocimetry. It can monitor blood pressure, dehydration, and possible signs of heat stroke. Sleep monitoring can be done more accurately than with current devices, and the effect of music on brain states can also be studied.
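Laser Doppler velocimetry infers flow velocity from the frequency shift of light scattered by moving red blood cells. The small Python sketch below shows the underlying relation; the wavelength and beam angle are assumptions, not Kyocera's specifications.

```python
import math

def flow_velocity(doppler_shift_hz, wavelength_m=850e-9, angle_deg=30.0):
    """Back out flow velocity from a measured Doppler shift (simplified).

    Uses f_D = 2 * v * cos(theta) / wavelength; wavelength and beam angle
    are illustrative assumptions.
    """
    return doppler_shift_hz * wavelength_m / (2.0 * math.cos(math.radians(angle_deg)))

print(f"{flow_velocity(5_000):.4f} m/s")   # e.g. a 5 kHz shift -> a few mm/s
```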


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda

REGISTER BY MAY 19TH AND SAVE $500

VR studied for PTSD, phobia treatment


Emory’s Jessica Maples-Keller has published a study demonstrating the effectiveness of VR in treating PTSD, phobias, and other mental illnesses.  She describes the treatment as allowing “providers to create computer-generated environments in a controlled setting, which can be used to create a sense of presence and immersion in the feared environment for individuals suffering from anxiety disorders.”

Small studies on the use of VR in panic disorder, schizophrenia, acute and chronic pain, addiction, and eating disorders have been done, but with limited numbers of participants and a lack of comparison groups. Maples-Keller noted that extensive training is needed before integrating VR approaches into clinical practice.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda

REGISTER BY MAY 19TH AND SAVE $500

Deep learning driven prosthetic hand + camera recognize, automate required grips


Ghazal Ghazaei and Newcastle University colleagues have developed a deep learning-driven prosthetic hand + camera system that allows wearers to reach for objects automatically. Current prosthetic hands are controlled via a user’s myoelectric signals, requiring learning, practice, concentration, and time.

A convolutional neural network was trained with images of 500 graspable objects and taught to recognize the grip needed for each. Objects were grouped by size, shape, and orientation, and the hand was programmed to perform four different grasps to accommodate them: palm wrist neutral (to pick up a cup); palm wrist pronated (to pick up a TV remote); tripod (thumb and two fingers); and pinch (thumb and first finger).

The hand’s camera takes a picture of the object in front of it, assesses its shape and size, picks the most appropriate grasp, and triggers a series of hand movements, within milliseconds.
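A minimal version of such a grasp classifier can be sketched as a small convolutional network with four output classes, one per grasp. The PyTorch code below is illustrative only, assuming 64x64 camera frames; it is not the Newcastle team's architecture.

```python
import torch
import torch.nn as nn

GRASPS = ["palm_wrist_neutral", "palm_wrist_pronated", "tripod", "pinch"]

class GraspNet(nn.Module):
    """Toy CNN mapping a 64x64 camera frame to one of four grasp types."""
    def __init__(self, num_grasps=len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_grasps)

    def forward(self, frame):                     # frame: (N, 3, 64, 64)
        x = self.features(frame).flatten(1)
        return self.classifier(x)                 # logits over the four grasps

# Inference: pick a grasp for the object seen by the hand's camera
model = GraspNet().eval()
frame = torch.rand(1, 3, 64, 64)                  # stand-in for a real camera image
grasp = GRASPS[model(frame).argmax(dim=1).item()]
print("Selected grasp:", grasp)
```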

In a small study of the technology, subjects picked up and moved objects with an 88 per cent success rate.

The work is part of an effort to develop a bionic hand that senses pressure and temperature, and transmits the information to the brain.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda

Less invasive, soft tissue artificial retina


Oxford University student Vanessa Restrepo-Schild is developing a synthetic, soft tissue retina that more closely resembles human tissue than current artificial retina technology.

Unlike existing implants, the cell cultures are created from natural, biodegradable materials and contain no foreign bodies or living entities. This makes the implant less invasive than a mechanical device and less likely to provoke an adverse reaction in the body.

The technology has only been tested in the lab, so its usefulness with living tissue is unknown. If successful, it could be a breakthrough for the visually impaired.

The retina replica consists of soft water droplets and biological cell membrane proteins. Designed like a camera, the cells act as pixels, detecting and reacting to light to create a greyscale image. The synthetic material can generate electrical signals, which stimulate the neurons at the back of the eye, much as a natural retina does.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Nathan Intrator