Portable, on-demand biopharmaceutical manufacturing


Tim Lu and MIT colleagues are developing a microfluidic, programmable-yeast system to produce small batches of pharmaceuticals on demand. The DARPA-funded project could deliver biologic medications, including vaccines, to remote locations for both military and civilian use.

The team cultivated Pichia pastoris, a programmable yeast that grows densely and can express large amounts of protein. It was genetically modified to allow for the production of multiple medications.

The handheld bioreactor contains a microfluidic chip. Liquid laced with chemical triggers is fed into the device and mixed with the yeast. Temperature, oxygen, and pH levels are monitored to maintain an ideal environment for protein production.
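The environmental monitoring described above amounts to keeping each parameter near a setpoint. A minimal sketch follows; the setpoints, tolerances, and variable names are illustrative assumptions, not values from the published work.

```python
# Hypothetical sketch of a bioreactor setpoint check. The targets and
# tolerances below are illustrative, not from the Nature paper.

SETPOINTS = {
    "temperature_c": (30.0, 0.5),    # P. pastoris grows well near 30 C
    "dissolved_o2_pct": (30.0, 5.0), # percent of air saturation
    "ph": (6.0, 0.2),
}

def out_of_range(readings):
    """Return the parameters whose readings drift beyond tolerance."""
    alarms = []
    for name, (target, tol) in SETPOINTS.items():
        if abs(readings[name] - target) > tol:
            alarms.append(name)
    return alarms

readings = {"temperature_c": 30.2, "dissolved_o2_pct": 22.0, "ph": 6.05}
print(out_of_range(readings))  # only dissolved oxygen has drifted
```

In a real device this check would run continuously, with each alarm driving an actuator (heater, aerator, acid/base pump) back toward the setpoint.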

In one day, the device produced nearly a single dose of a biopharmaceutical. Once the previous fluid is flushed out, the process can start over with a different liquid trigger, using the same yeast.

The team believes that this work could one day lead to a “small-scale, portable, and fully integrated personal biomanufacturing platform that could advance disease treatment at point-of-care.”

An earlier portable pharmaceutical system was introduced by Allan Myerson last year.

Click to view Nature paper.

Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford University

NFC powered ultra-thin health monitoring patch


Powering wearables efficiently for long-term, continuous use remains a challenge.

Illinois professor John Rogers has, again, disrupted himself. His new stretchable, ultra-thin health monitoring patches are wirelessly powered via smartphone near-field communication (NFC). This allows the devices to be 5-10 times thinner than before, increasing comfort and therefore people’s willingness to wear them.

LEDs illuminate the skin. Some light is absorbed, and the reflected light is picked up by light sensors. Data are transmitted to an external device. Ultraviolet exposure can be measured (see the MC10/L’Oreal UV monitor), as well as skin temperature.

In a recent study, the NFC-powered patch was used to monitor heart rate, blood oxygen level, skin temperature, ultraviolet radiation exposure, and changes in skin color.
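Blood oxygen estimates from red and infrared reflectance conventionally use the “ratio of ratios” method. The sketch below shows that standard calculation; it is not the paper’s algorithm, and the calibration constants (110, 25) are generic textbook values, not device-specific ones.

```python
# Standard "ratio of ratios" SpO2 estimate used in pulse oximetry.
# Calibration constants are generic illustrative values.

def spo2_estimate(red_ac, red_dc, ir_ac, ir_dc):
    """Estimate blood oxygen saturation (%) from red/IR signal ratios.

    AC terms are the pulsatile part of each signal; DC terms are the
    steady baseline.
    """
    r = (red_ac / red_dc) / (ir_ac / ir_dc)
    return 110.0 - 25.0 * r  # generic linear calibration curve

print(round(spo2_estimate(0.02, 1.0, 0.04, 1.0), 1))  # R = 0.5 -> 97.5
```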

Click to view ApplySci’s coverage of John Rogers’ work, which can also be seen in the presentations of many of our conference speakers.


GSK/Verily “bioelectronic medicine” partnership for disease management


Galvani Bioelectronics is a Verily/GSK company created to accelerate the research, development, and commercialization of bioelectronic medicines. The goal is to find solutions for managing chronic diseases, such as arthritis, diabetes, and asthma, using miniaturized electronics. Implanted devices would modify electrical signals that pass along nerves, including the irregular impulses that occur in illness.

Initial work will focus on developing precision devices for inflammatory, metabolic and endocrine disorders, including type 2 diabetes, where substantial evidence already exists in animal models.

Every major pharmaceutical company (globally) attended ApplySci’s recent Wearable Tech + Digital Health + NeuroTech conferences in San Francisco and New York. We believe that partnerships similar to the Verily/GSK venture will proliferate, and that they will improve the lives of those with chronic diseases.


“Placenta on a chip” to study pre-term birth


Penn researchers have developed a “placenta-on-a-chip” to model the transport of nutrients across the placental barrier. It will be used in studies to identify causes of, and prevention methods for, dangerous preterm birth. (Lungs, intestines, and eyes “on chips” are similarly being used for research.)

Studies of the underlying mechanisms of preterm birth currently rely on experiments with intact, living human placentae. This approach is limited by the tissue’s complexity, the scarcity of samples, and the short time the tissue remains viable.

The device contains two layers of human cells that model the interface between mother and fetus. Microfluidic channels on both sides allow the study of how molecules are transported through, or are blocked by, that interface.
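Steady-state transport across a thin barrier like this is often summarized by Fick’s first law: flux is proportional to the concentration difference and inversely proportional to barrier thickness. A minimal sketch, with purely illustrative numbers (none are from the Penn device):

```python
# Fick's-law sketch of solute flux across a membrane barrier.
# All values below are illustrative, not from the Penn study.

def flux(diffusivity, thickness, c_maternal, c_fetal):
    """Steady-state flux J = D * (C1 - C2) / L, in mol per m^2 per s."""
    return diffusivity * (c_maternal - c_fetal) / thickness

# e.g. a glucose-like solute: D ~ 6e-10 m^2/s, a 10-micron barrier,
# 5 mol/m^3 on the maternal side vs 4 mol/m^3 on the fetal side
j = flux(6e-10, 10e-6, 5.0, 4.0)
print(j)  # roughly 6e-05 mol per m^2 per s
```

Measuring how actual fluxes in the chip deviate from this idealized passive-diffusion baseline is one way such devices reveal active or facilitated transport.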



Robot assesses, assists dementia patients


Ludwig is a University of Toronto-built robot meant to assist seniors with cognitive issues.

“He” stands in front of a person, displays a picture on a screen, and asks the viewer to describe what he or she sees. Ludwig then interprets a user’s condition, including engagement, happiness or anxiety, and behavior changes over time.

With in-ear microphones, in-eye cameras, and sensors embedded in his feet, he tracks a user’s gaze, body movement, intonation, and choice of words. Based on these factors, speech recognition technology provides an analysis of cognitive health.

Ludwig is the second generation of a robot originally built to guide seniors around their own homes; the earlier version was not interactive.

The new system will be piloted in a senior living facility in Canada next month. Ludwig will be placed in a common room so that residents can approach him casually. If successful, a robot like this could help seniors age in place, or assist those with brain injuries and diseases in managing the activities of daily living.


Keith Black on tumor treatment innovation, early Alzheimer’s detection, predictive medicine


Keith Black, MD, Chairman and Professor, Department of Neurosurgery at Cedars-Sinai, was a keynote speaker at ApplySci’s recent NeuroTech San Francisco conference.

Click to view his interview with StartUp Health’s Unity Stoakes at the event, where he discussed brain tumor treatment innovation, early Alzheimer’s diagnosis, wearables, and predictive medicine.  Dr. Black’s brilliance is equaled only by his compassion for patients — and ApplySci was truly honored to include him in the conference.


MRI, resting-state fMRI, task-based fMRI, and diffusion imaging combined for highly precise brain map


A brain map that more than doubles the number of known distinct areas in the human cortex, from 83 to 180, has been published. It combines data from four imaging technologies to bring high definition to brain scanning. Washington University’s Matthew Glasser and David Van Essen led the global research team.

The brains of 1,200 young adults were scanned using MRI, which reveals the structure of the brain; resting-state fMRI, which registers brain activity while the subject rests; task-based fMRI, which registers activity during mental exercises; and diffusion imaging, which reveals the paths of nerve fibers.

By aligning the brain areas using the combined scanning protocol, an extraordinary degree of precision was achieved.

Potential applications include identifying biological markers for neurological diseases and mental illnesses, and enabling neurosurgeons to better delineate tissue.

Click to view Nature video


Implanted thread provides real-time diagnostic data


Tufts University researchers have created a thread-based diagnostic platform to provide real-time health data for implanted devices and wearables.

Thread-integrated nano-scale sensors, electronics, and microfluidics can be sutured through multiple layers of tissue. Measures of tissue health (pressure, stress, strain, and temperature), as well as pH and glucose levels, are collected. Results are transmitted wirelessly.

The system can be used to determine how a wound is healing, whether infection is emerging, or whether the body’s chemistry is out of balance.

The three-dimensional platform can conform to organs, wounds, or orthopedic implants. Previous substrate structures for implantable devices have been two-dimensional, limiting their use to flat tissue, such as skin.


Algorithm detects depression in speech


USC researchers are using machine learning to detect depression based on speech patterns. During interviews, their SimSensei system detected reductions in vowel expression that might be missed by human interviewers.

These depression-associated speech variations have been documented in past studies. Depressed patients’ speech can be flat and monotonous, with reduced variability. Reduced speech output, a slower articulation rate, longer pauses, and varied switching-pause durations have also been observed.

The method was used to test 253 adults for “vowel space,” which was significantly reduced in subjects who reported symptoms of depression and PTSD. The researchers believe that vowel space reduction could also be linked to schizophrenia and Parkinson’s disease, which they are now investigating.
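Vowel space is commonly quantified as the area of the polygon spanned by the corner vowels in the F1/F2 formant plane, so a smaller area means more centralized, less distinct vowels. A sketch of that standard measure, using typical textbook formant values rather than data from the USC study:

```python
# Vowel space area via the shoelace formula in the F1/F2 plane.
# Formant values are typical textbook figures for /i/, /a/, /u/,
# not data from the USC/SimSensei study.

def polygon_area(points):
    """Shoelace formula: area of a polygon given (x, y) vertices in order."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# (F1, F2) in Hz for the corner vowels /i/, /a/, /u/;
# centralized vowels shrink the triangle and hence the area
typical = [(270, 2290), (730, 1090), (300, 870)]
centralized = [(350, 2000), (650, 1200), (380, 1000)]

print(polygon_area(typical) > polygon_area(centralized))  # True
```

A classifier could then use this area, alongside pause and articulation-rate features, as one input for flagging possible depression.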


Andreas Weigend on Data for the People


Of all the data we create and share, perhaps none is more important — or more sensitive —  than data about our health.  The wearable tech revolution has given us, as patients and individuals, control – but we also must think about what we, and others, do with the data that we collect.

Andreas Weigend, author of Data for the People, Professor at Stanford and Berkeley, Director of the Social Data Lab, and former Chief Scientist at Amazon, keynoted the recent NeuroTech San Francisco conference. Following is a link to his interview with Unity Stoakes of StartUp Health, in which he discusses the reality of social data, including his five rules:

  • The right to access (one’s data)
  • The right to amend
  • The right to blur
  • The right to play
  • The right to port

Click to view Dr. Andreas Weigend’s interview on StartUp Health NOW, recorded on April 6th, 2016 at Wearable Tech + Digital Health + NeuroTech San Francisco
