AI detects bowel cancer in less than 1 second in small study


Yuichi Mori and Showa University colleagues have used AI to identify bowel cancer by analyzing polyps during colonoscopy in less than a second.

The system compares a magnified view of a colorectal polyp with 30,000 endocytoscopic images. The researchers reported 86% accuracy, based on a study of 300 polyps.
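The comparison step can be illustrated with a nearest-neighbor sketch. This is purely hypothetical: the study does not describe its features or matching method, and the feature vectors and labels below are stand-ins.

```python
import math

def classify_polyp(query, references, labels, k=5):
    """Toy nearest-neighbor sketch: label a polyp view by majority vote
    among the k reference images whose feature vectors are most similar.

    query:      feature vector for the magnified polyp view
    references: list of feature vectors for the reference image library
    labels:     matching list of 0 (non-neoplastic) / 1 (neoplastic) labels
    """
    def cosine(a, b):
        # Cosine similarity between two feature vectors
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.hypot(*a) * math.hypot(*b))

    # Rank the reference library by similarity and vote among the top k
    ranked = sorted(zip(references, labels),
                    key=lambda rl: cosine(query, rl[0]), reverse=True)
    votes = [label for _, label in ranked[:k]]
    return 1 if sum(votes) > k / 2 else 0
```

In practice a real system would extract learned image features and calibrate a decision threshold, but the library-lookup structure is the same.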

Mori said that, while the technology undergoes further testing, the team will focus on creating a system that can automatically detect polyps.

Click to view Endoscopy Thieme video


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University. Speakers include:  Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld – Bob Knight – Phillip Alvelda

3D neuron reconstruction reveals electrical behavior


Christof Koch and Allen Institute colleagues have created 3D computer reconstructions of living human brain cells using discarded surgical tissue. As the tissue is still alive when it reaches the lab, the virtual cells are able to capture electrical signals, in addition to cell shape and anatomy.

This is the first time that scientists have been able to study the electrical behavior of living brain cells in humans.

Koch believes that this will enhance our understanding of how brain diseases, including Alzheimer’s and schizophrenia, impact the behavior of brain cells.

The institute has captured electrical data from 300 living neurons, taken from 36 patient brains, and 100 cells have been reconstructed in 3D. Genetic information about some of the cells will eventually be added to the database.

Click to view the Allen Institute video.



Machine learning improves breast cancer detection


MIT’s Regina Barzilay has used AI to improve breast cancer detection and diagnosis. Machine learning tools predict if a high-risk lesion identified on needle biopsy after a mammogram will upgrade to cancer at surgery, potentially eliminating unnecessary procedures.

In current practice, when a mammogram detects a suspicious lesion, a needle biopsy is performed to determine if it is cancer. Approximately 70 percent of the lesions are benign, 20 percent are malignant, and 10 percent are high-risk.

Using a method known as a “random-forest classifier,” the AI model resulted in 30 percent fewer surgeries compared with the strategy of always operating, while diagnosing more cancerous lesions (97 percent vs. 79 percent) than the strategy of operating only on traditional “high-risk lesions.”

Trained on information about 600 high-risk lesions, the technology looks for data patterns that include demographics, family history, past biopsies, and pathology reports.
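A random-forest classifier of this kind can be sketched with scikit-learn. The data below is synthetic and the feature names are illustrative assumptions; the study's actual features and dataset are not reproduced here.

```python
# Hypothetical sketch of a random-forest upgrade-risk model;
# the features and labels are synthetic stand-ins, not study data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# ~600 high-risk lesions, each with illustrative features
# (e.g. age, family-history flag, prior-biopsy count, and a score
# distilled from pathology-report text).
X = rng.random((600, 4))
# 1 = lesion upgraded to cancer at surgery, 0 = remained benign
y = (X[:, 0] + 2 * X[:, 3] + rng.normal(0, 0.2, 600) > 1.5).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Ranking lesions by predicted upgrade risk lets clinicians operate
# only above a chosen threshold, instead of excising every lesion.
risk = model.predict_proba(X)[:, 1]
```

The threshold chosen on the risk scores is what trades fewer surgeries against missed cancers, which is the comparison the reported 97 percent vs. 79 percent figures describe.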

MGH radiologists will begin incorporating the method into their clinical practice over the next year.



Teleneurology for remote, underserved populations


Neurodegenerative disease cases have, unfortunately, far outpaced the number of neurologists able to diagnose and treat patients, particularly in rural areas. A recent study highlighted 20 states that were or would become “dementia neurology deserts.”

Teleneurology is being introduced to fill the gap.

The American Academy of Neurology has announced a new curriculum to train students and providers to use video conferencing, sensors, and text and image communication tools to connect with patients. Five training areas, developed by the University of Missouri, are the focus: technology; legal and ethical issues; “webside” manner; privacy; and, of course, neurology expertise.

The University of Texas, Vanderbilt University, and Tufts Medical Center are already using teleneurology.



Robot “patients” for medical research, training


In addition to robots with increasingly human-like faces being used as companions, “patient” robots are being developed to test medical equipment and procedures intended for babies and adults.

Yoshio Matsumoto and AIST colleagues created a robotic skeletal structure of the lower half of the body, with 22 movable joints. Its skeleton is made of metal, and its skin, fat, and muscles of silicone. Embedded sensors measure pressure on various parts of the lower body. It is being used to develop hospital beds with a reduced risk of pressure wounds.

Waseda University’s Hiroyuki Ishii has developed a robot baby for practicing breathing protocols after birth. If non-breathing babies do not respond to sensory stimulation, a tube is inserted into the trachea. This happens in 1-2% of newborns, and poses obvious risks. The robot is designed to train medical staff to properly insert the tube into the baby’s windpipe.

The use of robots as personal assistants, and to test procedures, will increase rapidly as advanced sensors are built into the devices.



Prosthetic “skin” senses force, vibration


Jonathan Posner, with University of Washington and UCLA colleagues, has developed a flexible sensor “skin” that can be stretched over prostheses to determine force and vibration.

The skin mimics the way a human finger responds to tension and compression, as it slides along a surface or distinguishes among different textures. This could allow users to sense when something is slipping out of their grasp.

Tiny electrically conductive liquid-metal channels are placed on both sides of a prosthetic finger. As the finger slides across a surface, the channels on one side compress while those on the other side stretch, similar to a natural limb. As the channel geometry changes, so does the electrical resistance; differences in resistance correlate with force and vibration.
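The geometry-to-resistance relationship follows textbook strain-gauge behavior: an incompressible channel that stretches gets longer and thinner, so its resistance rises, while a compressed channel does the opposite. A minimal sketch of that scaling (this is the generic physics, not the paper's calibration):

```python
def channel_resistance(r0, strain):
    """Resistance of an incompressible liquid-metal channel under axial strain.

    Stretching by a factor (1 + strain) lengthens the channel and, at
    constant volume, shrinks its cross-section, so R = rho * L / A
    scales as R0 * (1 + strain)**2.
    """
    return r0 * (1.0 + strain) ** 2

def differential_signal(r0, strain):
    """Signed difference between the two sides of the finger.

    As the finger slides, one side stretches (+strain) while the other
    compresses (-strain); the difference tracks the direction and
    magnitude of the applied force.
    """
    return channel_resistance(r0, strain) - channel_resistance(r0, -strain)
```

Reading the two sides differentially also cancels effects that hit both channels equally, such as temperature drift.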

The researchers believe that the sensor skin will enable users to better open a door, use a phone, shake hands, or lift packages.



VR + neurofeedback for movement training after stroke



Pressure sensors generated from pattern-forming bacteria


Lingchong You, Stefan Zauscher, and Duke colleagues have programmed bacteria with a synthetic gene circuit to turn them into working devices.

As a bacterial colony grows into the shape of a hemisphere, the gene circuit triggers the production of a protein that distributes within the colony and recruits inorganic materials. Gold nanoparticles enable the system to form a shell around the bacterial colony, resulting in a pressure sensor.

This is the first time that a composite structure has been produced by programming the cells themselves and controlling their access to nutrients, while still leaving the bacteria free to grow in three dimensions.

Click to view Duke University video



Video: Boston VCs on funding digital health innovation


Video: Flare Capital’s Bill Geary, Bessemer’s Steve Kraus, Oak HC/FT’s Nancy Brown, and Optum Ventures’ Michael Weintraub on funding and commercializing innovation.

Recorded at ApplySci’s Digital Health + Neurotech conference at the MIT Media Lab, September 19, 2017



Ingestible, flexible sensor to diagnose gastrointestinal disorders


Canan Dagdeviren, Giovanni Traverso, Bob Langer, and MIT and Brigham and Women’s colleagues have built a swallowable, flexible sensor that adheres to the stomach wall or intestinal lining to measure digestive tract contractions. It could be used to help diagnose gastrointestinal disorders or to monitor food intake.

The piezoelectric device generates a current and voltage when mechanically deformed. Elastic polymers allow it to conform to and stretch with the stomach lining.
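The sensing principle can be sketched in a few lines: a piezoelectric element produces a charge proportional to the applied force, which appears as a voltage across its own capacitance. The coefficient and capacitance below are generic illustrative values, not figures from the MIT device.

```python
def piezo_voltage(force_newtons, d33=2.3e-11, capacitance_farads=1e-9):
    """Open-circuit voltage of a piezoelectric element under a normal force.

    Q = d33 * F gives the generated charge (d33 is the piezoelectric
    charge coefficient, in coulombs per newton); V = Q / C converts it
    to a voltage across the element's capacitance. Both values here are
    typical textbook magnitudes, chosen only for illustration.
    """
    charge = d33 * force_newtons
    return charge / capacitance_farads
```

Because the element generates charge rather than merely modulating a supply, each contraction is itself a small energy source, which is what makes the energy-harvesting plans below plausible.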

The sensor has so far been tested only in pigs, where it remained active for two days. If found safe for use in humans, its flexibility could help avoid the side effects associated with current, rigid ingestible devices. Future versions will harvest some of the piezoelectrically generated energy to power additional sensors and wireless transmitters; eliminating the battery would further improve safety.

