DARPA-developed closed-loop therapies for neuropsychiatric illness

Led by Justin Sanchez, DARPA’s SUBNETS program develops responsive, adaptable, personalized closed-loop therapies for neuropsychiatric illness that combine recording and analysis of brain activity with near-real-time neural stimulation to correct or mitigate brain dysfunction.

The technology detects ongoing dynamic changes in brain activity associated with fluctuations in mood, and uses the data to deliver precisely timed therapeutic stimulation.

The premise is that brain function and dysfunction, rather than being confined to distinct anatomical regions of the brain, play out across distributed neural systems. By understanding what healthy brain activity looks like across these sub-networks, compared to unhealthy brain activity, and by identifying predictive biomarkers that indicate a changing state, DARPA is developing interventions that maintain a healthy brain state within a normal range of emotions.

Three recent papers show that decoding technology can predict changes in mood from recorded neural signals; that a brain sub-network appears to contribute to depression, especially in those with anxiety; and that moderate to severe depression symptoms can be alleviated using open-loop neural stimulation delivered to the orbitofrontal cortex to modulate a sub-network that contributes to depression.

This work is inspired by Sanchez’s commitment to finding better treatments for the millions of veterans who suffer from neuropsychiatric illness; treatment options have been limited by the lack of a mechanistic understanding of how these illnesses manifest in the brain.

These findings encompass key discoveries and technologies to enable the SUBNETS goal of a closed-loop system that can detect ongoing dynamic changes in brain activity associated with fluctuations in mood, and that can use this information to deliver precisely timed therapeutic stimulation to improve brain function in individuals living with neuropsychiatric illnesses.
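
The papers supply the underlying science; the control flow of such a system can be sketched at a high level. Below is a minimal, illustrative Python sketch of a sense-decode-stimulate loop. The recording, biomarker-decoding, and stimulation functions are hypothetical placeholders, not the SUBNETS implementation.

    # Illustrative closed-loop sketch (not the SUBNETS implementation).
    # read_neural_features(), decode_mood_state(), and deliver_stimulation()
    # are hypothetical placeholders for the recording, biomarker-decoding,
    # and stimulation stages described above.

    import time
    import random

    def read_neural_features():
        # Placeholder: in a real system this would be features derived from
        # recordings across distributed brain sub-networks.
        return [random.gauss(0.0, 1.0) for _ in range(8)]

    def decode_mood_state(features):
        # Placeholder biomarker: a simple average standing in for a trained
        # decoder that predicts mood state from neural activity.
        return sum(features) / len(features)

    def deliver_stimulation(amplitude_ma):
        print(f"stimulate: {amplitude_ma:.2f} mA")

    THRESHOLD = 1.0   # hypothetical biomarker threshold

    def closed_loop_step():
        features = read_neural_features()
        biomarker = decode_mood_state(features)
        if biomarker > THRESHOLD:
            # Precisely timed stimulation only when the decoded state
            # drifts outside the target range.
            deliver_stimulation(amplitude_ma=2.0)

    if __name__ == "__main__":
        for _ in range(5):
            closed_loop_step()
            time.sleep(0.1)   # near-real-time loop interval (illustrative)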


Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22 at Stanford University — Featuring: Zhenan Bao – Christof Koch – Vinod Khosla – Walter Greenleaf – Nathan Intrator – John Mattison – David Eagleman – Unity Stoakes – Shahin Farshchi – Emmanuel Mignot – Michael Snyder – Joe Wang – Josh Duyan – Aviad Hai – Anne Andrews – Tan Le – Anima Anandkumar – Hugo Mercier

EEG identifies cognitive motor dissociation

Nicholas Schiff and Weill Cornell colleagues have developed an EEG-based method for measuring the delay in brain processing of continuous natural speech in patients with severe brain injury. Study results correlated with fMRI-based evidence, which is commonly used to identify the capacity to perform cognitively demanding tasks. EEG can be used for long periods, and is cheaper and more accessible than fMRI.

This type of monitoring can identify patients with severe brain injury who have preserved high-level cognition despite showing limited or no consciousness.

According to Schiff: “This approach may be a more effective and efficient method for initially identifying patients with severe brain injuries who are very aware but are otherwise unable to respond, a condition called cognitive motor dissociation.”
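
The report above does not detail the analysis pipeline. One common way to estimate a speech-tracking delay is to cross-correlate the speech envelope with the EEG and take the lag of peak correlation; the sketch below illustrates that idea on synthetic data, with an assumed 100 Hz sampling rate, and is not the Weill Cornell method.

    # Illustrative sketch: estimating a neural processing delay by
    # cross-correlating a speech envelope with an EEG channel.
    # Synthetic data and a 100 Hz sampling rate are assumptions;
    # this is not the pipeline from the Weill Cornell study.

    import numpy as np

    fs = 100                      # sampling rate in Hz (assumed)
    t = np.arange(0, 60, 1 / fs)  # one minute of data

    rng = np.random.default_rng(0)
    envelope = rng.standard_normal(t.size)      # stand-in speech envelope

    true_delay_s = 0.15                          # simulated 150 ms lag
    shift = int(true_delay_s * fs)
    eeg = np.roll(envelope, shift) + 0.5 * rng.standard_normal(t.size)

    # Cross-correlate and find the lag with the strongest correlation.
    lags = np.arange(-fs, fs + 1)                # search within +/- 1 s
    corrs = [np.corrcoef(envelope, np.roll(eeg, -lag))[0, 1] for lag in lags]
    best_lag = lags[int(np.argmax(corrs))]

    print(f"estimated delay: {best_lag / fs * 1000:.0f} ms")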



Thought-controlled tablets

The BrainGate/Brown/Stanford/MGH/VA consortium has published a study describing three tetraplegic patients who were able to control an off-the-shelf tablet with their thoughts. They surfed the web, checked the weather and shopped online. A musician played part of Beethoven’s “Ode to Joy” on a digital piano interface.

The BrainGate BCI included a small implant that detected and recorded signals associated with intended movements produced in the motor cortex. Neural signals were routed to a Bluetooth interface that worked like a wireless mouse, which was paired to an unmodified tablet.
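
As a rough illustration of the “wireless mouse” idea, the sketch below maps a vector of neural firing rates to a two-dimensional cursor velocity with a placeholder linear decoder and emits relative mouse moves; the decoder weights and the send_mouse_move function are hypothetical, not the BrainGate implementation.

    # Illustrative sketch: decode an intended 2-D cursor velocity from
    # neural features and emit relative mouse moves. The linear decoder
    # weights and send_mouse_move() are hypothetical placeholders.

    import numpy as np

    rng = np.random.default_rng(1)

    N_CHANNELS = 96                                    # e.g., a microelectrode array
    W = rng.standard_normal((2, N_CHANNELS)) * 0.01    # placeholder decoder weights

    def decode_velocity(firing_rates):
        """Map a vector of firing rates to an (x, y) cursor velocity."""
        return W @ firing_rates

    def send_mouse_move(dx, dy):
        # Placeholder for the Bluetooth report that a paired tablet would
        # interpret as ordinary mouse movement.
        print(f"HID move: dx={dx:+.1f}, dy={dy:+.1f}")

    for _ in range(5):                                 # one short stretch of the loop
        rates = rng.poisson(10, size=N_CHANNELS).astype(float)
        vx, vy = decode_velocity(rates)
        send_mouse_move(vx * 10, vy * 10)              # scale velocity to pixels per tick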

Participants made up to 22 point-and-click selections per minute while using several apps. They typed up to 30 characters per minute with standard email and text interfaces.

The researchers believe that the technology can open new lines of communication between brain disorder patients and their caregivers.

Click to view BrainGate video



Thought-controlled television

Samsung and EPFL researchers, including Ricardo Chavarriaga, are developing Project Pontis, a BCI system meant to allow the disabled to control a TV with their thoughts.

The prototype uses a 64-sensor headset plus eye tracking to determine when a user has selected a particular movie. Machine learning is used to build a profile of the videos a user is interested in, allowing future content suggestions. The user ultimately makes a selection using eye tracking. The team is now working on a system that relies on brain signals alone, for users who aren’t able to control their eyes or other muscles reliably.
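
As an illustration of how EEG-derived interest and eye tracking might be combined, the sketch below ranks titles by a hypothetical EEG interest score and confirms the final choice with a gaze-dwell check; the thresholds and scoring model are assumptions, not Project Pontis parameters.

    # Illustrative sketch of combining an EEG-derived interest score with an
    # eye-tracking dwell check to pick a movie. The scoring model, the 0.6
    # threshold, and the 2-second dwell time are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        title: str
        eeg_interest: float    # assumed output of a trained EEG interest model, 0..1
        gaze_dwell_s: float    # seconds the user's gaze rested on the thumbnail

    INTEREST_THRESHOLD = 0.6   # hypothetical
    DWELL_THRESHOLD_S = 2.0    # hypothetical

    def suggest(candidates):
        """Rank titles the EEG model flags as interesting."""
        return sorted(
            (c for c in candidates if c.eeg_interest >= INTEREST_THRESHOLD),
            key=lambda c: c.eeg_interest,
            reverse=True,
        )

    def confirm_selection(candidates):
        """Final selection: the gazed-at title among the suggestions."""
        for c in suggest(candidates):
            if c.gaze_dwell_s >= DWELL_THRESHOLD_S:
                return c
        return None

    demo = [
        Candidate("Documentary A", 0.82, 2.4),
        Candidate("Thriller B", 0.71, 0.3),
        Candidate("Comedy C", 0.45, 3.0),
    ]
    print(confirm_selection(demo))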

Click to view Samsung video



Sensor patch monitors blood oxygen levels anywhere in the body

Ana Claudia Arias and Berkeley colleagues have developed a flexible, adhesive sensor that maps blood-oxygen levels over large areas of skin, tissue and organs, making it possible to monitor wound healing in real time, or oxygen levels in transplanted organs. It can also be used to continuously monitor blood-oxygen levels in diabetes, respiratory disease and sleep apnea.

The device is made of an array of alternating red and near-infrared organic LEDs and organic photodiodes, printed on bendable plastic that molds to the body. Unlike fingertip oximeters, which measure oxygen levels at a single point, it can detect blood-oxygen levels at nine points in a grid and can be placed anywhere on the skin.
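
Pulse oximetry of this kind typically estimates oxygen saturation from the ratio of the pulsatile to baseline signal at the red and near-infrared wavelengths. The sketch below applies the standard ratio-of-ratios calculation to a synthetic 3x3 grid; the empirical calibration used is a common textbook approximation, not the Berkeley device’s calibration.

    # Illustrative ratio-of-ratios pulse-oximetry calculation for a 3x3 grid
    # of red/near-infrared measurement points. The empirical calibration
    # (SpO2 ~ 110 - 25*R) is a commonly cited textbook approximation.

    import numpy as np

    def spo2_from_ratio(r):
        """Empirical linear approximation mapping the ratio-of-ratios to SpO2 (%)."""
        return 110.0 - 25.0 * r

    def ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc):
        return (red_ac / red_dc) / (ir_ac / ir_dc)

    # Synthetic pulsatile (AC) and baseline (DC) amplitudes for 9 grid points.
    rng = np.random.default_rng(2)
    red_ac = rng.uniform(0.008, 0.012, size=(3, 3))
    red_dc = rng.uniform(0.9, 1.1, size=(3, 3))
    ir_ac = rng.uniform(0.015, 0.020, size=(3, 3))
    ir_dc = rng.uniform(0.9, 1.1, size=(3, 3))

    spo2_map = spo2_from_ratio(ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc))
    print(np.round(spo2_map, 1))   # a 3x3 map of oxygenation estimates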



Focused ultrasound thalamotomy in Parkinson’s Disease

UVA’s Scott Sperling and Jeff Elias, who previously used focused ultrasound to treat essential tremor, have published the results of a small study showing the efficacy of the technology in Parkinson’s Disease.

The sound waves were shown to interrupt brain circuits responsible for the uncontrollable shaking associated with the disease. The researchers claim that their study also offers “comprehensive evidence of safety” in its effect on mood, behavior and cognitive ability, which has not previously been studied.

According to Sperling, “In this study, we extended these initial results and showed that focused ultrasound thalamotomy is not only safe from a cognitive and mood perspective, but that patients who underwent surgery realized significant and sustained benefits in terms of functional disability and overall quality of life.”

Twenty-seven adults with severe Parkinson’s tremor that had not responded to previous treatment were divided into two groups. Twenty received the procedure, and a control group of seven (who were later offered the procedure) did not. At both three and twelve months, participants reported improved quality of life, including their ability to perform simple daily tasks, emotional wellbeing, and a lessened sense of stigma due to their tremor.

The team found that mood, cognition, and the ability to go about daily life ultimately had more effect on participants’ assessment of their overall quality of life than did tremor severity or the amount of tremor improvement.

Some participants showed cognitive decline after the procedure, including a reduced ability to name colors and to think of and speak words. The cause is unclear and must be investigated; the researchers suggested it could be a result of the natural progression of Parkinson’s.



Wearable sensor monitors shunt function in hydrocephalus

Northwestern’s John Rogers has created another minimal, flexible, wireless, adhesive wearable — this time to help hydrocephalus patients manage their condition.

The band-aid-like sensor determines whether a shunt is working properly.

Shunts often fail. When this happens, a patient can experience headaches, nausea and low energy, and must go to a hospital immediately. However, a patient can have similar symptoms with a properly working shunt. The wearable determines, within five minutes, whether the shunt is functioning; if it is, the patient can avoid a hospital visit, CT and MRI scans, and potential surgery to determine the shunt’s functionality.

Click to view Northwestern University video



Minimally invasive sensor detects electrical activity, optical signals in brain for MRI

MIT’s Aviad Hai has developed a minimally invasive sensor to detect electrical activity or optical signals in the brain for MRI. No power source is needed; the sensor is powered by the radio signals emitted by an external MRI scanner. It is implanted, but does not require a wired connection to the brain. The researchers believe that it could also be adapted to measure glucose or other chemicals.

The team previously developed MRI sensors to detect calcium, serotonin and dopamine. The new sensor is meant to replace current electrical activity monitoring, which is extremely invasive, and can cause tissue damage.

Hai and colleagues shrank a radio antenna down to a few millimeters, so that it could be implanted directly into the brain to receive radio waves generated by water in the tissue.

The sensor is first tuned to the same frequency as the radio waves emitted by the hydrogen atoms. When an electromagnetic signal is detected, its tuning changes and it no longer matches the hydrogen atom frequency. A weaker image then arises when the sensor is scanned by an external MRI machine.
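
A rough way to picture the detuning mechanism is to model the antenna as a resonator with a Lorentzian response centered on the scanner’s proton (Larmor) frequency: when a detected signal shifts the resonance, pickup at that frequency drops and the local image darkens. The numbers in the sketch below are assumptions for illustration, not parameters of the MIT device.

    # Illustrative detuning model: treat the implanted antenna as a resonator
    # with a Lorentzian response centered on the scanner's hydrogen (Larmor)
    # frequency. The numbers (3 T scanner, 1 MHz linewidth, 2 MHz shift) are
    # assumptions for illustration.

    LARMOR_HZ = 127.7e6     # ~3 T proton frequency
    LINEWIDTH_HZ = 1.0e6    # assumed resonator linewidth
    SHIFT_HZ = 2.0e6        # assumed detuning caused by a detected signal

    def pickup(resonance_hz, probe_hz=LARMOR_HZ, linewidth_hz=LINEWIDTH_HZ):
        """Relative sensitivity of the resonator at the probe frequency (0..1)."""
        detune = (probe_hz - resonance_hz) / (linewidth_hz / 2.0)
        return 1.0 / (1.0 + detune ** 2)          # Lorentzian line shape

    baseline = pickup(LARMOR_HZ)                   # tuned: maximum pickup
    detuned = pickup(LARMOR_HZ + SHIFT_HZ)         # after detecting a signal

    print(f"tuned pickup:   {baseline:.2f}")
    print(f"detuned pickup: {detuned:.2f}  -> locally weaker MRI signal")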

In a study, the sensors were able to pick up electrical signals similar to those produced by action potentials or local field potentials.

Hai plans to further miniaturize the sensor so that multiple sensors can be injected, enabling imaging of light or electrical fields over a larger area of the brain.

Dr. Hai will discuss this work at ApplySci’s Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22 at Stanford University



Brain-to-brain communication interface

Rajesh Rao and University of Washington colleagues have developed BrainNet, a non-invasive direct brain-to-brain interface for multiple people. The goal is a social network of human brains for problem solving. The interface combines EEG to record brain signals and TMS to deliver information to the brain, enabling three people to collaborate via direct brain-to-brain communication.

In a recent study, two of the three subjects were “Senders.” Their brain signals were decoded with real-time EEG analysis to extract decisions about whether to rotate a block in a Tetris-like game before it was dropped to fill a line. The Senders’ decisions were sent via the Internet to the brain of a third subject, the “Receiver,” and delivered via magnetic stimulation of the occipital cortex. The Receiver integrated the information and, using an EEG interface, decided either to turn the block or to keep it in the same position. A second round of the game gave the Senders a chance to evaluate the Receiver’s action and provide corrective feedback.
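
The sketch below illustrates the message flow in simplified form: each Sender contributes a binary rotate/keep decision, the Receiver is cued with one stimulation pulse per “rotate” vote, and a final decision is issued. The majority-vote integration shown is a simplification for illustration, not the study’s protocol.

    # Illustrative sketch of a BrainNet-style message flow. In the real
    # system the Senders' decisions come from EEG decoding and the cues are
    # TMS-evoked phosphenes; here both are stand-in placeholders.

    from dataclasses import dataclass

    @dataclass
    class SenderDecision:
        sender_id: int
        rotate: bool          # decoded from the Sender's EEG in the real system

    def deliver_tms_phosphene(receiver_id, sender_id):
        # Placeholder for the occipital TMS pulse that signals "rotate".
        print(f"receiver {receiver_id}: phosphene cue from sender {sender_id}")

    def receiver_decision(decisions, receiver_id=3):
        votes = 0
        for d in decisions:
            if d.rotate:
                deliver_tms_phosphene(receiver_id, d.sender_id)
                votes += 1
        # The Receiver's own choice would come from an EEG interface; here we
        # stand in for it with a simple majority of the Senders' votes.
        return votes > len(decisions) / 2

    round1 = [SenderDecision(1, True), SenderDecision(2, False)]
    print("rotate block:", receiver_decision(round1))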



Wearable system detects postpartum depression via baby/mother interaction

Texas professor Kaya de Barbaro is creating a mother-child wearable system to detect, and attempt to prevent, postpartum depression. The mother’s stress levels are measured via heart rhythm, and encouraging messages are sent. The mother wears the sensor on her wrist, and the baby wears it on her/his ankle. The child’s sensor collects heart rate and movement data, which is correlated with the mother’s reaction. Audio is recorded to track crying. When stress is sensed via a faster heartbeat, mothers receive messages, including “great job” and “take a breather,” in an attempt to limit feelings of isolation.
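
As an illustration of the stress-triggered messaging idea, the sketch below compares the current heart rate to a rolling baseline and returns a supportive message when it runs high; the 15% threshold, window length, and message text are assumptions, not parameters of the actual system.

    # Illustrative sketch of stress-triggered messaging: compare the mother's
    # current heart rate with a rolling baseline and send a short supportive
    # message when it runs high. Threshold, window, and text are assumptions.

    from collections import deque
    from statistics import mean

    BASELINE_WINDOW = 30        # most recent samples used as the baseline
    ELEVATION_FACTOR = 1.15     # "stress" when HR exceeds baseline by 15%

    baseline = deque(maxlen=BASELINE_WINDOW)

    def on_heart_rate_sample(bpm):
        """Return a message to send, or None if no intervention is needed."""
        if len(baseline) == BASELINE_WINDOW and bpm > ELEVATION_FACTOR * mean(baseline):
            message = "Take a breather - you're doing a great job."
        else:
            message = None
        baseline.append(bpm)
        return message

    # Simulated stream: resting heart rate followed by a stressful spike.
    for bpm in [72] * 30 + [90, 95]:
        msg = on_heart_rate_sample(bpm)
        if msg:
            print(f"HR {bpm} bpm -> send: {msg}")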

