Category Archives: Eyes

Diabetic retinopathy-detecting algorithm for remote diagnosis


Google has developed an algorithm that it claims can detect diabetic retinopathy in photographs. The goal is to improve the quality and availability of screening for, and early detection of, this common and debilitating condition.

Typically, highly trained specialists must examine the photographs to detect the lesions that indicate bleeding and fluid leakage in the eye, which makes screening difficult in under-resourced and remote locations.

Google developed a dataset of 128,000 images, each evaluated by 3 to 7 specially trained doctors, and used it to train a neural network to detect referable diabetic retinopathy. Performance was tested on two clinical validation sets of approximately 12,000 images, with the majority decision of a panel of 7 or 8 ophthalmologists serving as the reference standard. The results showed that the accuracy of the Google algorithm was on par with that of the physicians.
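As a rough illustration (not Google's actual pipeline), a referable/non-referable classifier of this kind could be fine-tuned from a standard image network. The folder layout, model choice, and hyperparameters below are assumptions for the sketch.

```python
# Hypothetical sketch: fine-tune a CNN to flag referable diabetic retinopathy.
# Folder layout, model choice, and hyperparameters are illustrative only.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((299, 299)),   # fundus photos rescaled to a fixed size
    transforms.ToTensor(),
])

# Each image carries the majority grade of several ophthalmologists as its label.
train_set = datasets.ImageFolder("fundus/train", transform=transform)  # classes: non_referable / referable
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet50(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, 2)   # two classes: referable vs. not

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```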


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth – Shahin Farshchi – Casper de Clercq – Mary Lou Jepsen – Vivek Wadhwa – Dirk Schapeler – Miguel Nicolelis

Fully transparent, glucose monitoring contact lens


Oregon State’s Greg Herman has developed a transparent sensor, built into a contact lens, that monitors glucose in tears. The device could also be used to control insulin infusions by transmitting real-time data to a pump.

Similar technology has been developed by Google, whose lens is not yet fully transparent, and by Noviosense, whose approach requires the user to insert a device under the lower eyelid.

Herman believes that the lens sensor could also be used to monitor stress hormones, uric acid, and  ocular pressure in glaucoma.


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth

Eye tracking + VR to improve brain injury diagnosis, track recovery


Eye tracking technology, combined with VR, is proliferating, with myriad medical, gaming, and education applications.

SyncThink uses eye tracking, built into an Oculus Rift, to determine whether a person can keep their eyes synced with moving objects, in order to assess brain injury and track recovery.
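As a rough illustration of the kind of measurement involved (not SyncThink's actual analysis), a pursuit-tracking score can be computed from how far gaze strays from a moving target; the metric below is an assumption.

```python
# Hypothetical metric: how closely gaze tracks a moving target (smooth pursuit).
# The scoring is illustrative, not SyncThink's actual analysis.
import numpy as np

def pursuit_error(gaze_xy: np.ndarray, target_xy: np.ndarray) -> dict:
    """gaze_xy, target_xy: (n_samples, 2) arrays of gaze/target positions in degrees."""
    error = np.linalg.norm(gaze_xy - target_xy, axis=1)   # per-sample angular distance
    return {
        "mean_error_deg": float(error.mean()),            # average tracking error
        "error_variability": float(error.std()),          # variability of the error
    }

# Example with synthetic data: gaze follows a circular target with small noisy offsets.
t = np.linspace(0, 2 * np.pi, 500)
target = np.column_stack([np.cos(t), np.sin(t)]) * 10
gaze = target + np.random.normal(scale=0.5, size=target.shape)
print(pursuit_error(gaze, target))
```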

The company has been granted 10 patents for eye-tracking hardware and for analytical techniques that stimulate, measure, and train brain attention networks. The technology has been used to detect concussions on the field and to evaluate soldier readiness and brain impairment after injury. The company describes additional applications, including characterizing and monitoring fatigue, performance, and developmental or neurodegenerative conditions.

Eyefluence, which was acquired by Google today, creates eye-interaction interfaces for AR, VR, and mixed-reality head-mounted displays. According to the company, its AR application allows critical care professionals to access patient data with their eyes while their hands treat the injured, and its VR integrations humanize experiences, reduce nausea, optimize image resolution, and increase speed.

ApplySci believes that the next step in AR/VR enhancement is integrating mobile EEG into headsets, combining eye-tracking, GSR, and brainwave data for a range of applications.


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth

Contact lens/eyeglass system monitors blood sugar, dispenses drugs


Pohang University’s Sae Kwang Han and Do Hee Keum have developed a contact lens/eyeglass combination to monitor diabetes and dispense drugs as needed. The glasses wirelessly power and communicate with the drug-releasing lens, which monitors glucose concentration in tears. An LED alarm lights up when sugar levels are very high. The lens can be worn for one month.

With voice commands, a user can tell the eyeglasses to send a drug-releasing signal to the chip. A control circuit is being created to automate the process, deciding independently when medicine is needed. To release drugs, the chip draws on one of ten drug reservoir chambers that are carved into the hydrogel and covered with a thin gold electrode membrane. An applied voltage dissolves the membrane and releases the drug.
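A minimal sketch of how such an automated control circuit might behave, assuming a tear-glucose threshold and the ten reservoirs described above; the threshold, naming, and logic are illustrative, not the Pohang team's design.

```python
# Hypothetical control loop for the lens described above: when tear glucose is high,
# trigger the electrode over the next unused reservoir, dissolving its gold membrane
# and releasing the drug. Threshold and behavior are illustrative only.

HIGH_GLUCOSE_MG_DL = 180        # assumed alarm threshold
NUM_RESERVOIRS = 10             # ten reservoirs carved into the hydrogel

class LensController:
    def __init__(self):
        self.next_reservoir = 0

    def step(self, glucose_mg_dl: float) -> str:
        if glucose_mg_dl <= HIGH_GLUCOSE_MG_DL:
            return "ok"
        if self.next_reservoir >= NUM_RESERVOIRS:
            return "alarm: all reservoirs used"
        # In hardware this would drive the gold-membrane electrode; here we just log it.
        used = self.next_reservoir
        self.next_reservoir += 1
        return f"LED alarm on; released reservoir {used}"

controller = LensController()
for reading in (120, 150, 210, 230, 140):   # simulated tear-glucose readings
    print(reading, "->", controller.step(reading))
```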


Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

Vision enhancing, injectable smart lens


Google has filed a patent for a vision-enhancing, injectable smart lens. The “intraocular device” is meant to replace one’s natural lens and focus light onto the eye’s retina. It is injected into the eye in a solution that congeals and attaches to the lens capsule. It will contain its own storage and tiny radio components that will communicate with an external processor. An energy-harvesting antenna will power the device wirelessly.


Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Voice-controlled wearable supports sight-impaired mobility


Toyota’s Project BLAID is a camera-based assistive device concept, meant to help the visually impaired identify bathrooms, escalators, stairs, elevators, doors, signs, and logos.

The wearable, which is in an early stage of development, is worn on the shoulders, wrapped around the neck like an electronic scarf. It will be controlled by voice commands and will relay information via audio and haptic cues, and it can also be paired with a phone via Bluetooth. The company plans to include mapping, object identification, and facial recognition technologies at a later stage.
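A minimal sketch of the concept, assuming a detector that returns landmark labels and bearings; the labels come from the list above, but the cue logic is hypothetical, not Toyota's implementation.

```python
# Hypothetical sketch of the BLAID concept: map camera-detected landmarks to spoken
# or haptic cues. Detection itself is stubbed out; cue phrasing is illustrative only.
from dataclasses import dataclass

LANDMARKS = {"bathroom", "escalator", "stairs", "elevator", "door", "sign", "logo"}

@dataclass
class Detection:
    label: str
    bearing_deg: float   # angle relative to the wearer's heading

def cue_for(d: Detection) -> str:
    if d.label not in LANDMARKS:
        return ""
    if abs(d.bearing_deg) < 15:
        side = "ahead"
    else:
        side = "to your right" if d.bearing_deg > 0 else "to your left"
    return f"{d.label} {side}"

# Example: detections from the shoulder-mounted cameras (stubbed values).
for det in (Detection("stairs", -40.0), Detection("elevator", 5.0)):
    print(cue_for(det))   # would be spoken aloud or sent as a haptic pattern
```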


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

Self-adjusting lenses adapt to user needs


DeepOptics is developing vision-enhancing wearable lenses, with sensors that gauge viewing distance and precisely adjust the lenses to bring an object into focus.

Voltage is applied to three-layered liquid crystal lenses, changing their refractive index to provide the specific optical compensation needed to correct vision in each situation.
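As a rough sketch of the adjustment logic (not DeepOptics’ actual calibration), the required lens power follows from the thin-lens relation P = 1/d, and a calibration table can map that power to a drive voltage; all values below are assumptions.

```python
# Hypothetical sketch: convert a sensed viewing distance into the lens power needed
# to focus there (P = 1/d diopters), then pick a drive voltage from a calibration
# table. Values are illustrative, not DeepOptics' calibration.
import bisect

# Assumed calibration: (optical power in diopters, drive voltage in volts)
CALIBRATION = [(0.0, 0.0), (1.0, 1.2), (2.0, 2.1), (3.0, 2.9)]

def required_power(distance_m: float) -> float:
    """Accommodation needed to focus at distance_m (thin-lens approximation)."""
    return 1.0 / distance_m

def drive_voltage(power_d: float) -> float:
    """Linearly interpolate the calibration table for the requested power."""
    powers = [p for p, _ in CALIBRATION]
    i = min(bisect.bisect_left(powers, power_d), len(powers) - 1)
    if i == 0:
        return CALIBRATION[0][1]
    (p0, v0), (p1, v1) = CALIBRATION[i - 1], CALIBRATION[i]
    return v0 + (v1 - v0) * (power_d - p0) / (p1 - p0)

# 40 cm reading distance needs 2.5 D of power; prints the corresponding drive voltage.
print(drive_voltage(required_power(0.4)))
```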

The company also believes that its technology can help VR and AR devices deliver better visual experiences.



Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

First human optogenetics vision trial


Retina Foundation of the Southwest scientists, in a study sponsored by Retrosense Therapeutics, will for the first time use optogenetics, a combination of gene therapy and light used to control nerve cells, in an attempt to restore human sight. Previously, optogenetic therapies had been tested only in mice and monkeys.

Viruses carrying DNA from light-sensitive algae will be injected into the eye’s ganglion cells, which transmit signals from the retina to the brain, in an attempt to make those cells directly responsive to light. Fifteen legally blind patients will participate in the study, which was first reported by MIT Technology Review.


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

Smart contact lens could detect glaucoma progression


Columbia University’s C. Gustavo De Moraes has developed a contact lens sensor that can detect glaucoma progression by constantly monitoring intraocular pressure. Doctors check eye pressure, but the measurement is neither continuous nor performed at night, when eye pressure typically rises.

As eye pressure fluctuates, lens curvature changes. The sensor sends a signal to a wireless device that records it and shows pressure changes over time.

In a recent study, 40 open-angle glaucoma patients wore the smart lens for 24 hours, while awake and asleep. Patients with steeper overnight spikes and a greater number of peaks in their signal profile usually had faster glaucoma progression.
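As a rough illustration of how such a 24-hour signal could be summarized (not the study's actual analysis), one can count prominent peaks and compare sleep and wake averages; the peak definition and synthetic data below are assumptions.

```python
# Hypothetical summary of a 24-hour lens signal: count prominent peaks and measure
# the overnight rise. The peak definition and data are illustrative only.
import numpy as np
from scipy.signal import find_peaks

def summarize_iop(signal: np.ndarray, asleep: np.ndarray) -> dict:
    """signal: 24 h of lens output (arbitrary units); asleep: boolean mask of sleep samples."""
    peaks, _ = find_peaks(signal, prominence=signal.std())   # count prominent peaks
    return {
        "num_peaks": int(len(peaks)),
        "overnight_rise": float(signal[asleep].mean() - signal[~asleep].mean()),
    }

# Example with synthetic data: one sample every 5 minutes, signal rising during sleep.
n = 24 * 12
asleep = np.zeros(n, dtype=bool)
asleep[14 * 12 : 22 * 12] = True            # assume sleep during hours 14-22 of the recording
trace = np.random.normal(scale=0.3, size=n) + 2.0 * asleep
print(summarize_iop(trace, asleep))
```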


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences

 

Coating enhances smart contact lens capabilities


Google and others are developing smart contact lenses meant to be the next wave of wearables. To broaden and enhance their capabilities, Drew Evans of the University of South Australia has created a biocompatible, conducting, nanoscale polymer lens coating. Potential applications include visual assistance through electronic displays and noninvasive glucose measurement through sensors.


Wearable Tech + Digital Health San Francisco – April 5, 2016 @ the Mission Bay Conference Center

NeuroTech San Francisco – April 6, 2016 @ the Mission Bay Conference Center

Wearable Tech + Digital Health NYC – June 7, 2016 @ the New York Academy of Sciences

NeuroTech NYC – June 8, 2016 @ the New York Academy of Sciences