Sound waves detect disease-related changes in red blood cell shape


Ryerson University investigators used photoacoustics to create detailed images that reveal the changing shapes of red blood cells associated with diseases including malaria, sickle cell anemia and certain types of cancer.

A drop of blood is placed under a microscope equipped to pick up sounds produced by the cells. Researchers then focus a laser beam on the sample. As the blood cells absorb energy from the laser pulse, they release some of it in the form of sound waves, enabling scientists to infer details about each cell's shape.

Algorithm determines embryo quality in IVF procedures


Researchers from Tel Aviv’s Afeka College of Engineering and Sourasky Medical Center have developed software to find the best embryos for IVF procedures.  Their algorithm uses MATLAB to process microscope images and select the highest-quality fertilized egg.

Today embryologists use a microscope to find the best embryos, which five days later are inserted into the uterus. “The examination done now depends on the embryologist and isn’t precise. The new test increases the potential for choosing as healthy an embryo as possible,” said lead researcher Danai Menuhin.
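
The article does not describe the algorithm's internals, so as a rough illustration of the general idea (scoring embryo images by a morphological feature), here is a minimal sketch in Python rather than MATLAB. The use of circularity on a pre-segmented binary mask is an assumption for illustration, not the researchers' actual quality metric:

```python
import numpy as np

def circularity_score(mask: np.ndarray) -> float:
    """Score a binary embryo mask by circularity (higher = rounder).

    Circularity = 4*pi*area / perimeter**2. Boundary pixels are those
    set in the mask that have at least one unset 4-neighbour.
    """
    area = mask.sum()
    if area == 0:
        return 0.0
    padded = np.pad(mask, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = (mask & ~interior).sum()
    if perimeter == 0:
        return 0.0
    return float(4 * np.pi * area / perimeter ** 2)

def rank_embryos(masks):
    """Return indices of embryo masks ordered from highest to lowest score."""
    scores = [circularity_score(m) for m in masks]
    return sorted(range(len(masks)), key=lambda i: -scores[i])
```

A round, compact mask will outrank a thin, elongated one under this metric; a real system would combine many such features (symmetry, fragmentation, cell count) before ranking.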

LCD embedded in contact lenses — Google Glass functionality without headgear?


Professor Jelle De Smet of Ghent University has developed a spherical, curved LCD display which can be embedded in contact lenses and handle projected images using wireless technology.  This is the first step towards “fully pixelated contact lens displays” with the same detail as a television screen.  The technology could lead to a superimposed image projected onto the user’s normal view, similar to Google Glass but without the headgear.  The lenses could also be used for medical purposes, including controlling light transmission toward the eye’s retina when an iris is damaged, or to display directions or texts from a smartphone.

High-resolution mapping uncovers brain circuit architecture


Salk and Gladstone Institute scientists have found a way to untangle neural networks by enhancing a brain mapping technique that they first developed in 2007.

“These initial results should be treated as a resource not only for decoding how this network guides the vast array of very distinct brain functions, but also how dysfunctions in different parts of this network can lead to different neurological conditions,” said Salk professor Edward Callaway.

The researchers combined mouse models with a tracing technique known as the monosynaptic rabies virus system to assemble maps of neurons that connect with the basal ganglia, a region of the brain that is involved in movement and decision making.

“The monosynaptic rabies virus approach is ingenious in the exquisite precision that it offers compared with previous methods, which were messier with a much lower resolution,” explained Gladstone investigator Professor Anatol Kreitzer.  “In this paper, we took the approach one step further by activating the tracer genetically, which ensures that it is only turned on in specific neurons in the basal ganglia. This is a huge leap forward technologically, as we can be sure that we’re following only the networks that connect to particular kinds of cells in the basal ganglia from other parts of the brain.”

Smartphone diagnostic and cloud platform make eye care accessible


Vinod Khosla and others have invested in MIT Media Lab’s EyeNetra, a smartphone attachment that claims to diagnose nearsightedness, farsightedness and astigmatism.  The device is positioned as a less bulky alternative to the Shack-Hartmann wavefront sensor.  A $2 eyepiece is clipped onto a phone.  The user then clicks to align the displayed patterns.  The number of clicks required to bring the patterns into alignment indicates the refractive error.  Patients connect to corrective lens providers through a cloud-based system.  The technology provides access to eye exams in the developing world.
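
As a toy illustration of the click-to-refraction idea only: the sketch below assumes each click corresponds to a fixed power step (a made-up 0.25 diopters per click) and that repeating the alignment along several meridians separates sphere from cylinder. EyeNetra's actual optics and calibration are not described in this article:

```python
def refraction_from_clicks(clicks_by_meridian, diopters_per_click=0.25):
    """Toy estimate of sphere/cylinder from per-meridian click counts.

    clicks_by_meridian: {angle_degrees: signed_click_count}. Assumes the
    shift needed for alignment along a meridian is proportional to the
    refractive power in that meridian; the calibration constant is
    illustrative. Using a plus-cylinder convention: sphere is the weakest
    meridian's power, cylinder is the spread between the strongest and
    weakest meridians, and the axis is the weakest meridian.
    """
    powers = {a: n * diopters_per_click for a, n in clicks_by_meridian.items()}
    axis, sphere = min(powers.items(), key=lambda kv: kv[1])
    cylinder = max(powers.values()) - sphere
    return sphere, cylinder, axis
```

For a purely spherical error, every meridian needs the same number of clicks and the cylinder term comes out zero.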

Eye tracking data helps diagnose autism, ADHD, Parkinson’s


USC’s Laurent Itti and researchers from Queen’s University in Ontario have created a data-heavy, low-cost method of identifying brain disorders through eye tracking.  Subjects watch a video for 15 minutes while their eye movements are recorded. An enormous amount of data is generated as the average person makes three to five saccadic eye movements per second.  Itti’s team uses advanced machine learning algorithms to enable a computer to recognize patterns without explicit human instruction.
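
Itti's actual features and learning method are more sophisticated than this, but the general approach (summary statistics of eye movements fed to a classifier) can be sketched as follows; the saccade threshold and the nearest-centroid classifier are stand-in assumptions:

```python
import numpy as np

def saccade_features(gaze, threshold=1.5):
    """Summarize a gaze trace (N x 2 positions per frame) by movement statistics.

    A frame-to-frame displacement above `threshold` is treated as saccadic;
    the threshold is illustrative. Returns [saccadic fraction, mean step,
    step standard deviation].
    """
    steps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
    return np.array([(steps > threshold).mean(), steps.mean(), steps.std()])

def nearest_centroid_predict(train_X, train_y, x):
    """Classify feature vector x by its nearest class centroid."""
    labels = sorted(set(train_y))
    centroids = {c: np.mean([f for f, y in zip(train_X, train_y) if y == c], axis=0)
                 for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(x - centroids[c]))
```

With 15 minutes of video at typical frame rates, each subject contributes tens of thousands of gaze samples, which is why even simple statistics of this kind become discriminative.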

The proof of concept study found that the algorithm could classify mental disorders through eye movement patterns.  Parkinson’s patients were identified with nearly 90 percent accuracy.  Children with ADHD or fetal alcohol spectrum disorder were identified with 77 percent accuracy.  “This is very different from what people have done before. We’re trying to have completely automated interpretation of the eye movement data,” said Itti.

Non-invasive nanotube device detects disease with one drop of blood


Professors Reginald Farrow and Alokik Kanwal of the New Jersey Institute of Technology have created a carbon nanotube-based device that quickly and non-invasively detects single mobile cells, with the potential to maintain a high degree of spatial resolution.  They are now overseeing the manufacture of a prototype lab-on-a-chip that would enable a physician to detect a disease or virus from just one drop of liquid, including blood.

The military initially funded the research in an effort to identify biological warfare agents.  Farrow believes the technology can be applied much more widely to detect viruses, bacteria, or cancer.  The next phase of research will include a study of the device’s ability to assess the health of brain neurons.

Algorithm analyzes head movements to measure heart rate


MIT researchers have developed an algorithm that gauges heart rate by measuring tiny head movements in video data.  A subject’s heart rate was consistently measured to within a few beats per minute of results from electrocardiograms. The algorithm also provided estimates of the time intervals between beats, which can be used to identify patients who are at risk for cardiac events.

The algorithm uses face recognition to distinguish the person’s head from the rest of the image.  It then randomly selects 500 to 1,000 distinct points, clustered around the person’s mouth and nose, and follows their movements from frame to frame.  The trajectories are filtered to keep only temporal frequencies within the range of a normal heartbeat (about 0.5 to 5 hertz, or 30 to 300 cycles per minute). This eliminates movements that occur at lower frequencies, such as those caused by regular breathing and slow changes in posture.  Principal component analysis then decomposes the resulting signal into uncorrelated constituent signals.  Of those, the algorithm chooses the one that appears most regular and falls within the typical frequency band of the human pulse.
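
The pipeline described above (band-pass to the pulse band, PCA, pick the most periodic component) can be sketched as follows. This is a simplified reconstruction, not MIT's code: the FFT-mask filter and the peak-energy "periodicity" measure are stand-ins for whatever the authors actually used:

```python
import numpy as np

def estimate_heart_rate(tracks, fs, band=(0.5, 5.0)):
    """Estimate heart rate (bpm) from tracked head-point trajectories.

    tracks: array (T, P) of vertical positions of P tracked points over
    T frames sampled at fs Hz. Each track is band-passed to the pulse
    band with an FFT mask, the tracks are decomposed with PCA (via SVD),
    and the component whose spectrum is most concentrated at a single
    frequency is taken as the pulse signal.
    """
    T = tracks.shape[0]
    freqs = np.fft.rfftfreq(T, d=1 / fs)
    keep = (freqs >= band[0]) & (freqs <= band[1])

    # Band-pass: zero every frequency bin outside the pulse band.
    spectra = np.fft.rfft(tracks - tracks.mean(axis=0), axis=0)
    spectra[~keep] = 0
    filtered = np.fft.irfft(spectra, n=T, axis=0)

    # PCA via SVD: columns of U are uncorrelated component signals.
    U, s, Vt = np.linalg.svd(filtered, full_matrices=False)

    best_bpm, best_periodicity = 0.0, -1.0
    for k in range(U.shape[1]):
        power = np.abs(np.fft.rfft(U[:, k])) ** 2
        if power.sum() == 0:
            continue
        peak = power.argmax()
        periodicity = power[peak] / power.sum()  # energy fraction at the peak
        if periodicity > best_periodicity:
            best_periodicity = periodicity
            best_bpm = freqs[peak] * 60.0
    return best_bpm
```

On synthetic tracks combining a shared 1.2 Hz "pulse" with slower breathing motion and noise, the breathing falls below the 0.5 Hz cutoff and is discarded, and the estimator recovers roughly 72 beats per minute.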

fMRI and machine learning identify emotions


Carnegie Mellon researchers have developed “a new method with the potential to identify emotions without relying on people’s ability to self-report” using a combination of fMRI and machine learning.

They recruited 10 actors from the university’s drama school to act out emotions including anger, happiness, pride and shame multiple times, in random order, while inside an fMRI scanner.  To ensure that researchers were measuring actual emotions and not just the acting out of emotions, participants also viewed emotion-eliciting images while undergoing fMRI scans.

The findings illustrate how the brain categorizes feelings, giving researchers the first reliable process for analyzing emotions.  Emotion research has been difficult because of a lack of reliable evaluation methods, caused both by subjects’ reluctance to honestly report their feelings and by the possibility that emotional responses are not consciously experienced. Researchers plan to apply the new identification method to a number of problems, including identifying emotions that individuals are attempting to suppress and multiple emotions experienced simultaneously.
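
As a minimal illustration of pattern-based decoding of this kind, the sketch below uses correlation template matching, a common baseline in multi-voxel pattern analysis; it is not necessarily the Carnegie Mellon team's method, and the tiny "voxel" vectors are invented:

```python
import numpy as np

def correlation_classify(templates, pattern):
    """Match an activation pattern to the emotion template it best correlates with.

    templates: {emotion: mean voxel activation vector from training scans}.
    pattern: a new voxel activation vector. Pearson correlation compares
    the shape of the activation pattern, ignoring overall signal level.
    """
    def corr(a, b):
        a = a - a.mean()
        b = b - b.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return max(templates, key=lambda e: corr(templates[e], pattern))
```

In a real study the templates would be built from held-out scans and the classifier evaluated across subjects, which is what makes cross-person emotion identification a meaningful claim.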

Insulin pump detects overnight hypoglycemia, shuts off automatically


Medtronic has designed an insulin pump that temporarily shuts off when blood sugar levels fall too low—a key advance in the effort to fully automate the delivery of insulin in diabetes patients.  Current technology allows people who use insulin pumps to wear a sensor that measures the amount of blood sugar in the body, which helps them program pumps to deliver the appropriate amount of insulin. Researchers have been trying to link the two technologies in order to automate insulin delivery, to create what has been referred to as an artificial pancreas.
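
The sensor-driven shutoff can be illustrated with a toy control loop; the thresholds and hysteresis below are illustrative values, not Medtronic's actual parameters:

```python
def low_glucose_suspend(readings, suspend_below=70.0, resume_above=90.0):
    """Toy low-glucose suspend logic for a sensor-augmented insulin pump.

    readings: successive sensor glucose values in mg/dL. Delivery is
    suspended when glucose falls below `suspend_below` and resumed once
    it recovers above `resume_above`; the gap between the two thresholds
    (hysteresis) prevents rapid on/off cycling near the limit.
    Returns a boolean per reading: whether insulin was being delivered.
    """
    delivering = True
    states = []
    for glucose in readings:
        if delivering and glucose < suspend_below:
            delivering = False
        elif not delivering and glucose > resume_above:
            delivering = True
        states.append(delivering)
    return states
```

A full artificial pancreas would close the loop in the other direction as well, increasing delivery when glucose runs high, which is the harder control problem the article alludes to.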