Earbud sensor reportedly measures blood pressure, dehydration

As health sensors become more discreet and are fused with commonly worn devices, Kyocera has integrated a tiny optical sensor into an earbud. The hybrid music/phone/health wearable measures blood flow in subcutaneous tissue using laser Doppler velocimetry. It can monitor blood pressure, hydration levels, and possible signs of heat stroke. The company claims that sleep can be monitored more accurately than with current devices, and that the effect of music on brain states can also be studied.
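Laser Doppler velocimetry infers flow from the frequency shift of light scattered by moving red blood cells. As a rough illustration of the underlying relationship (the wavelength and beam angle below are assumptions, not Kyocera's specifications):

```python
# Minimal laser Doppler velocimetry sketch (illustrative only; not
# Kyocera's implementation). Velocity is inferred from the Doppler
# shift of laser light scattered by moving red blood cells:
#   f_d = 2 * v * cos(theta) / wavelength
import math

WAVELENGTH_M = 850e-9         # assumed near-infrared laser wavelength
THETA_RAD = math.radians(30)  # assumed angle between beam and flow

def flow_velocity(doppler_shift_hz: float) -> float:
    """Convert a measured Doppler frequency shift (Hz) to flow velocity (m/s)."""
    return doppler_shift_hz * WAVELENGTH_M / (2 * math.cos(THETA_RAD))

# Example: a 10 kHz shift corresponds to roughly 4.9 mm/s of flow.
print(f"{flow_velocity(10_000) * 1000:.2f} mm/s")
```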


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator – Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda

REGISTER BY MAY 19TH AND SAVE $500

VR studied for PTSD, phobia treatment


Emory’s Jessica Maples-Keller has published a study demonstrating the effectiveness of VR in treating PTSD, phobias, and other mental illnesses.  She describes the treatment as allowing “providers to create computer-generated environments in a controlled setting, which can be used to create a sense of presence and immersion in the feared environment for individuals suffering from anxiety disorders.”

Small studies on the use of VR in panic disorder, schizophrenia, acute and chronic pain, addiction, and eating disorders have been done, but with limited numbers of participants and a lack of comparison groups. Maples-Keller noted that extensive training is needed before VR approaches can be integrated into clinical practice.



Deep learning driven prosthetic hand + camera recognize, automate required grips


Ghazal Ghazaei and Newcastle University colleagues have developed a deep learning driven prosthetic hand + camera system that allows wearers to reach for objects automatically. Current prosthetic hands are controlled via a user's myoelectric signals, which requires learning, practice, concentration, and time.

A convolutional neural network was trained with images of 500 graspable objects and taught to recognize the grip needed for each. Objects were grouped by size, shape, and orientation, and the hand was programmed to perform four different grasps to accommodate them: palm wrist neutral (to pick up a cup); palm wrist pronated (to pick up a TV remote); tripod (thumb and two fingers); and pinch (thumb and first finger).

The hand's camera takes a picture of the object in front of it, assesses its shape and size, picks the most appropriate grasp, and triggers a series of hand movements within milliseconds.
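As a hedged sketch of this classify-then-grasp pipeline (a toy network under assumed image and layer sizes, not the Newcastle group's published architecture; the GraspNet name is illustrative):

```python
# Minimal sketch of a grasp-type classifier (illustrative; not the
# Newcastle group's actual architecture). A small CNN maps a camera
# image of an object to one of the four programmed grasps.
import torch
import torch.nn as nn

GRASPS = ["palm_wrist_neutral", "palm_wrist_pronated", "tripod", "pinch"]

class GraspNet(nn.Module):
    def __init__(self, n_classes: int = len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, n_classes),  # assumes 64x64 input images
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# Inference step: camera frame in, grasp command out.
model = GraspNet().eval()
frame = torch.rand(1, 3, 64, 64)          # stand-in for a camera image
grasp = GRASPS[model(frame).argmax(dim=1).item()]
print(f"trigger grasp: {grasp}")
```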

In a small study of the technology, subjects picked up and moved objects with an 88 per cent success rate.

The work is part of an effort to develop a bionic hand that senses pressure and temperature, and transmits the information to the brain.



Less invasive, soft tissue artificial retina


Oxford University student Vanessa Restrepo-Schild is developing a synthetic, soft tissue retina that more closely resembles human tissue than current artificial retina technology.

Unlike existing implants, the cell cultures are created from natural, biodegradable materials and contain no foreign bodies or living entities. This makes the implant less invasive than a mechanical device, and less likely to provoke an adverse reaction in the body.

The technology has so far been tested only in the lab, so its behavior with living tissue is unknown. If successful, it could be a breakthrough for the visually impaired.

The retina replica consists of soft water droplets and biological cell membrane proteins. Designed like a camera, the cells act as pixels, detecting and reacting to light to create a greyscale image. The synthetic material can generate electrical signals, which stimulate the neurons at the back of the eye, much as a natural retina does.
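A toy software model of the pixel idea (purely illustrative; the actual device is a biological hydrogel, not code, and the peak output value below is an assumption):

```python
# Toy model of the droplet-pixel idea: each "pixel" converts local
# light intensity into an electrical output that would stimulate the
# neurons behind it.
import numpy as np

rng = np.random.default_rng(0)
light = rng.random((8, 8))            # incident light intensity, 0..1

MAX_CURRENT_NA = 50.0                 # assumed peak pixel output, in nA

def pixel_response(intensity: np.ndarray) -> np.ndarray:
    """Map light intensity to a greyscale stimulation current per pixel."""
    grey = np.clip(intensity, 0.0, 1.0)
    return grey * MAX_CURRENT_NA

stim = pixel_response(light)
print(stim.round(1))                  # 8x8 grid of stimulation currents
```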



Transparent, stretchable lens sensor for diabetes, glaucoma detection

UNIST professors Jang-Ung Park, Chang Young Lee, and Franklin Bien, and KNU professors Hong Kyun Kim and Kwi-Hyun Bae, have developed a contact lens sensor to monitor biomarkers for intraocular pressure, diabetes mellitus, and other health conditions.

Several attempts have been made to monitor diabetes via glucose in tears. The challenge has been poor wearability: the electrodes used in existing smart contact lenses are opaque, obscuring the wearer's view, and many wearers complained of significant discomfort from the firm, lens-shaped plastic. The research team addressed this by building the sensor from transparent, stretchable, flexible materials: graphene sheets and metal nanowires. This allowed the creation of lenses comfortable and accurate enough for eventual self-monitoring of glucose levels and eye pressure.

Patients can transmit their health information through a wireless antenna embedded in the lens, allowing real-time monitoring. The same antenna is used to read sensor information, eliminating the need for a separate power source.
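One common way such battery-free readout works is that the sensor changes the antenna's resonant behavior, and an external reader detects the shift. A hedged sketch under that assumption (the frequency, sensitivity, and linear calibration are hypothetical, not the published device's scheme):

```python
# Hedged sketch of passive wireless sensor readout. A reader sweeps
# frequencies, finds the lens antenna's resonant dip, and a hypothetical
# linear calibration maps the frequency shift to glucose concentration.
F0_HZ = 2.4e9                 # assumed unloaded resonant frequency
SENSITIVITY_HZ_PER_MM = 5e6   # hypothetical Hz shift per mmol/L glucose

def glucose_from_resonance(measured_f_hz: float) -> float:
    """Estimate tear glucose (mmol/L) from a measured resonant frequency."""
    shift = F0_HZ - measured_f_hz
    return max(shift / SENSITIVITY_HZ_PER_MM, 0.0)

# A 15 MHz downward shift would read as 3 mmol/L under this calibration.
print(glucose_from_resonance(2.385e9))
```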


Wall sensor monitors walking speed, stride to track health


MIT's Dina Katabi, Chen-Yu Hsu, and colleagues have developed WiGait, a wall sensor that detects walking speed and stride to monitor health. This builds on previous MIT research showing that radio signals can track breathing and heart rate without wearables.

The system works by transmitting low-power radio signals and analyzing how they reflect off bodies within a radius of 9 to 12 meters. Machine learning algorithms separate walking periods from other activities and find the stable phase within each walking period, from which walking speed and stride are estimated. Combined with wearable devices, the sensor could also track Parkinson's and MS symptoms, help predict health events related to heart failure, lung disease, kidney failure, and stroke, and assess the risk of falls. Caregivers could also be notified in emergencies.
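As a minimal illustration of the final step (not WiGait's actual pipeline; the sampling rate and walking-speed band are assumptions):

```python
# Illustrative gait-speed extraction. Given a person's radial distance
# over time (as recovered from radio reflections), keep the segments
# where speed falls in a plausible walking band and average them.
import numpy as np

FS_HZ = 10.0                                  # assumed sampling rate

def gait_speed(distance_m: np.ndarray) -> float:
    """Estimate walking speed (m/s) from a distance-over-time trace."""
    speed = np.abs(np.diff(distance_m)) * FS_HZ      # instantaneous speed
    walking = speed[(speed > 0.2) & (speed < 2.5)]   # assumed walking band
    return float(walking.mean()) if walking.size else 0.0

# Example: 5 seconds of steady walking toward the sensor at ~1.2 m/s.
t = np.arange(0, 5, 1 / FS_HZ)
print(f"{gait_speed(10.0 - 1.2 * t):.2f} m/s")
```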



Tetraplegic patient moves arm with thoughts via BCI/FES system


Bolu Ajiboye and Case Western colleagues used an implanted BrainGate2 brain-computer interface to allow a patient with tetraplegia to control arm movements via an implanted FES muscle stimulation system. A robotic arm, which was needed in previous BrainGate experiments, was no longer required.

Neural activity was recorded from two 96-channel microelectrode arrays implanted in the motor cortex. The implanted brain-computer interface translated the recorded brain signals into command signals that determined the amount of stimulation applied to each functional electrical stimulation (FES) electrode in the hand, wrist, arm, elbow, and shoulder, and to a mobile arm support.
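A hedged sketch of the decode-to-stimulate idea (illustrative; this is not the BrainGate2 decoder, and the linear weights here are random stand-ins for a trained mapping):

```python
# Firing rates from the motor cortex arrays are linearly mapped to
# per-electrode FES stimulation intensities.
import numpy as np

N_CHANNELS = 192        # two 96-channel microelectrode arrays
N_ELECTRODES = 36       # FES electrodes in the implanted system

rng = np.random.default_rng(1)
W = rng.normal(scale=0.05, size=(N_ELECTRODES, N_CHANNELS))  # hypothetical trained weights

def stim_commands(firing_rates_hz: np.ndarray) -> np.ndarray:
    """Map a vector of firing rates to stimulation intensities in [0, 1]."""
    raw = W @ firing_rates_hz
    return np.clip(raw / raw.max() if raw.max() > 0 else raw, 0.0, 1.0)

rates = rng.poisson(lam=20, size=N_CHANNELS).astype(float)  # one time bin
print(stim_commands(rates)[:5])   # intensities for the first 5 electrodes
```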

The researchers first exercised the patient's arm and hand with cyclical electrical stimulation patterns. Over 45 weeks, his strength, range of motion, and endurance improved. He then learned to use his own brain signals to move a virtual-reality arm on a computer screen. After the 36-electrode FES system was implanted, he was able to move each joint in his right arm individually, or, by thinking about a task such as feeding himself or getting a drink, to activate the muscles in a coordinated fashion.



Sweat sensor for cystic fibrosis detection, drug optimization, glucose monitoring


Carlos Milla, Ronald Davis, and Stanford and Berkeley colleagues have developed a wearable sweat sensor for detecting cystic fibrosis, diabetes and other diseases.  It can also aid drug development and personalization, and continuously monitor patients.

The flexible sensor/microprocessor system adheres to the skin, stimulates sweat glands, and detects the presence of molecules and ions based on electrical signals.  Increased chloride generates increased electrical voltage at the sensor’s surface.  High levels of chloride ions indicate cystic fibrosis.
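As a worked illustration of the voltage-to-diagnosis step (the calibration slope below is hypothetical; the 60 mmol/L cutoff is the conventional sweat-test diagnostic threshold, not a figure from this study):

```python
# Illustrative chloride readout (not the Stanford device's calibration).
# The sensor's surface voltage rises with chloride concentration; a
# hypothetical linear calibration converts voltage to mmol/L, which is
# compared against the conventional diagnostic cutoff for CF.
CAL_SLOPE_MMOL_PER_MV = 2.0    # hypothetical calibration slope
CF_THRESHOLD_MMOL = 60.0       # standard sweat-test diagnostic cutoff

def chloride_mmol(voltage_mv: float) -> float:
    return voltage_mv * CAL_SLOPE_MMOL_PER_MV

reading = chloride_mmol(38.0)  # e.g. a 38 mV surface potential
print(f"{reading:.0f} mmol/L ->",
      "suggests CF" if reading >= CF_THRESHOLD_MMOL else "below CF cutoff")
```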

Conventional methods for diagnosing cystic fibrosis require a visit to a specialized center, where a patient sits still for 30 minutes while electrodes stimulate their sweat glands. A lab then measures chloride ions in the sweat to diagnose the disease. This method hasn't changed in 70 years.

The wearable sweat sensor stimulates the skin to produce minute amounts of sweat, quickly evaluates the contents, and sends the data via phone to the cloud for immediate analysis. The system is portable and self-contained, making it well suited for children and for underserved communities.

As CF is caused by hundreds of different mutations in the CFTR gene, it's possible to use the sensor to determine which drugs work best for which mutations.

The device was also used to compare glucose levels in sweat with those in blood.



Robotic leg brace helps stroke patients walk


Toyota's Welwalk WW-1000 exoskeleton is designed to help those with paralysis on one side of their body walk again. The frame is worn on the affected leg, with a motor at the knee joint that provides calibrated assistance based on the user's ability. Wearers are trained to recover their walking ability over time.

The robotic device is paired with a treadmill and harness controlled by medical staff. The system will be rented to hospitals in Japan for $9,000, plus $3,200 per month.

The hope is that it will dramatically speed recovery time for stroke patients. The brace integrates sensors that determine exactly how much support to provide at any given point, ensuring that patients neither become over-reliant on assistance nor are rushed before they're ready to progress.
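Toyota has not published its control law; one common approach to this kind of graded support is proportional "assist-as-needed" control, sketched here with hypothetical gains and limits:

```python
# Assist-as-needed control sketch (illustrative only). The knee motor's
# torque scales with the gap between the target joint trajectory and
# what the patient achieves unaided, so support fades as ability improves.
ASSIST_GAIN = 0.8          # hypothetical proportional gain, Nm per degree
MAX_TORQUE_NM = 15.0       # assumed actuator limit

def knee_assist_torque(target_angle_deg: float, actual_angle_deg: float) -> float:
    """Return motor torque (Nm), bounded, proportional to tracking error."""
    torque = ASSIST_GAIN * (target_angle_deg - actual_angle_deg)
    return max(-MAX_TORQUE_NM, min(MAX_TORQUE_NM, torque))

# A patient lagging 10 degrees behind the target gets 8 Nm of help.
print(knee_assist_torque(40.0, 30.0))
```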



Verily’s health sensing research watch


The Verily Study Watch passively captures health data for continuous care platforms and clinical research. Key features described by the company include:

  • Multiple physiological and environmental sensors designed to measure signals relevant to studies spanning cardiovascular disease, movement disorders, and other areas. Examples include electrocardiogram (ECG), heart rate, electrodermal activity, and inertial movements.
  • A long battery life of up to one week, to drive better user compliance during longitudinal studies.
  • Large internal storage and data compression, allowing the device to store weeks' worth of raw data and reducing the need for frequent syncing (see the sketch after this list).
  • A powerful processor that supports real-time algorithms on the device.
  • Firmware designed to be robust for future extensions, such as over-the-air updates, new algorithms, and user interface upgrades.
  • An always-on display, so the time is always shown; it is low power and high resolution, for an appealing look and a robust user interface. Currently only the time and certain instructions are displayed; no other information is provided back to the user.
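To make the storage claim concrete, a back-of-envelope estimate (all sample rates, sample sizes, and the compression ratio below are assumptions, not Verily's published specifications):

```python
# Rough estimate of raw sensor data accumulated over multiple weeks,
# before and after compression. Rates are assumed, not Verily's specs.
SENSORS_SPS = {"ecg": 250, "ppg_heart_rate": 100, "eda": 32, "imu": 3 * 100}
BYTES_PER_SAMPLE = 2          # assumed 16-bit samples
WEEKS = 3
COMPRESSION_RATIO = 4         # assumed lossless compression factor

seconds = WEEKS * 7 * 24 * 3600
raw_bytes = sum(SENSORS_SPS.values()) * BYTES_PER_SAMPLE * seconds
print(f"raw: {raw_bytes / 1e9:.1f} GB, "
      f"compressed: {raw_bytes / COMPRESSION_RATIO / 1e9:.1f} GB")
```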

The watch will be used in Verily's current and forthcoming studies, such as the Personalized Parkinson's Project, meant to track disease progression, and the Baseline study, meant to understand transitions between health and disease.

