Combined BCI + FES system could improve stroke recovery

José Millán and EPFL colleagues have combined a brain-computer interface (BCI) with functional electrical stimulation (FES) in a system that, in a recent study, enhanced the restoration of limb use after a stroke.

According to Millan: “The key is to stimulate the nerves of the paralyzed arm precisely when the stroke-affected part of the brain activates to move the limb, even if the patient can’t actually carry out the movement. That helps re-establish the link between the two nerve pathways where the signal comes in and goes out.”

Twenty-seven patients, each with a similar lesion that had caused moderate to severe arm paralysis following a stroke, participated in the trial. Half were treated with the dual-therapy approach and showed clinically significant improvements. A BCI enabled the researchers to pinpoint the electrical activity that occurred in the brain when these patients tried to extend their hands. Each time that activity was detected, the system stimulated the muscles controlling the corresponding wrist and finger movements.

The control group received FES only, with their arm muscles stimulated at random. This allowed the scientists to isolate how much of the motor-function improvement could be attributed to the BCI.
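The closed-loop logic described above, stimulate only when motor-intent activity is decoded, can be sketched as follows. The decoder, threshold, and muscle label here are illustrative assumptions, not the study's actual implementation:

```python
import numpy as np

MOTOR_INTENT_THRESHOLD = 0.6  # hypothetical decoder-confidence cutoff


def detect_motor_intent(eeg_window: np.ndarray) -> float:
    """Toy stand-in for a motor-imagery decoder: maps the mean band power
    of an EEG window to a score in (0, 1) via a sigmoid (illustrative only)."""
    band_power = float(np.mean(eeg_window ** 2))
    return 1.0 / (1.0 + np.exp(-(band_power - 1.0)))


def closed_loop_step(eeg_window: np.ndarray, stimulate) -> bool:
    """One cycle of the loop: if decoded intent exceeds the threshold,
    trigger FES on the wrist/finger extensors; otherwise do nothing."""
    score = detect_motor_intent(eeg_window)
    if score > MOTOR_INTENT_THRESHOLD:
        stimulate("wrist_finger_extensors")
        return True
    return False
```

In the control condition, `stimulate` would instead be called at random times, decoupled from the decoded score.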


Join ApplySci at the 9th Wearable Tech + Digital Health + Neurotech Boston conference on September 24, 2018 at the MIT Media Lab.  Speakers include: Rudy Tanzi – Mary Lou Jepsen – George Church – Roz Picard – Nathan Intrator – Keith Johnson – Juan Enriquez – John Mattison – Roozbeh Ghaffari – Poppy Crum – Phillip Alvelda – Marom Bikson

REGISTRATION RATES INCREASE JUNE 29TH

Bob Knight on decoding language from direct brain recordings | ApplySci @ Stanford


Berkeley’s Bob Knight discussed (and demonstrated) decoding language from direct brain recordings at ApplySci’s recent Wearable Tech + Digital Health + Neurotech Silicon Valley at Stanford:


Join ApplySci at the 10th Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 21-22, 2019 at Stanford University

Nathan Intrator on epilepsy, AI, and digital signal processing | ApplySci @ Stanford

Nathan Intrator discussed epilepsy, AI and digital signal processing at ApplySci’s Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 26-27, 2018 at Stanford University:


DARPA’s Justin Sanchez on driving and reshaping biotechnology | ApplySci @ Stanford

DARPA Biological Technologies Office Director Dr. Justin Sanchez on driving and reshaping biotechnology.  Recorded at ApplySci’s Wearable Tech + Digital Health + Neurotech Silicon Valley conference on February 26-27, 2018 at Stanford University.


TMS + VR for sensory, motor skill recovery after stroke

EPFL’s Michela Bassolino has used transcranial magnetic stimulation to create hand sensations when combined with VR.

Stimulating the motor cortex activated the subjects’ hand muscles and induced brief, involuntary movements.

In a recent study, when subjects observed a virtual hand moving at the same time and in a similar manner to their own hand during TMS, they felt that the virtual hand was a controllable part of their own body.

25 of 32 participants experienced the effect within two minutes of stimulation. Bassolino believes that the effect may also be achieved through less immersive video.

The technology could help patients recover sensory and motor skills after a stroke, and could also be used as a gaming enhancement.


Bone-conduction headset for voice-free communication

MIT’s Arnav Kapur has created a device, AlterEgo, that senses and interprets the neuromuscular signals generated when we subvocalize. It rests on the ear and extends across the jaw; one pad sits beneath the lower lip, and another below the chin. The signals it picks up, triggered in the jaw and facial tissue by internal verbalizations, are undetectable to an observer.

Two bone-conduction headphones transmit vibrations through the facial bones to the inner ear, and four electrodes detect the neuromuscular signals. Algorithms determine what the wearer is subvocalizing and can respond silently through the headphones. This enables communication without speaking.

In studies, researchers interacted with a computer to solve problems; one participant asked a computer the time and received an accurate response; another silently played a game of chess with a colleague.
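The decoding step, mapping electrode signals to words, can be sketched as a toy nearest-centroid matcher over a small closed vocabulary. This is an illustrative assumption: the class name, the RMS feature choice, and the four-channel layout below are inventions for the sketch, not AlterEgo's actual pipeline, which reportedly uses a neural network.

```python
import numpy as np


def rms_features(emg: np.ndarray) -> np.ndarray:
    """Per-electrode RMS amplitude of a (channels, samples) signal window,
    yielding one feature per channel (a deliberately simple feature set)."""
    return np.sqrt(np.mean(emg ** 2, axis=1))


class SubvocalClassifier:
    """Nearest-centroid matcher over a small closed vocabulary of words."""

    def __init__(self):
        self.centroids = {}  # word -> mean feature vector

    def train(self, word: str, examples: list) -> None:
        """Average the feature vectors of the training windows for a word."""
        self.centroids[word] = np.mean([rms_features(e) for e in examples], axis=0)

    def predict(self, emg: np.ndarray) -> str:
        """Return the vocabulary word whose centroid is closest in feature space."""
        feats = rms_features(emg)
        return min(self.centroids,
                   key=lambda w: np.linalg.norm(self.centroids[w] - feats))
```

A real system would replace both the features and the matcher with learned models, but the closed-vocabulary structure (train per word, pick nearest match) is the same shape of problem.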


Software records, organizes, analyzes 1 million neurons in real-time

Martin Garwicz and Lund University colleagues have developed a novel method for recording, organizing, and analyzing enormous amounts of neurophysiological data from implanted brain-computer interfaces.

The technology simultaneously acquires data from one million neurons in real time. It converts spike data and sends it for processing and storage on conventional systems. Feedback is returned to the subject within 25 milliseconds, stimulating up to 100,000 neurons.

This has implications for basic research, clinical diagnosis, and the treatment of brain disease. The method is built for implantable, bidirectional brain-computer interfaces, which communicate complex data between neurons and computers: monitoring the brains of paralyzed patients, detecting epileptic seizures early, and providing real-time feedback to control robotic prostheses.
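One reason million-neuron streaming is feasible is that raw voltage traces can be reduced to sparse spike events before transmission. A minimal sketch of threshold-crossing spike detection follows; the function name and threshold are illustrative assumptions, not Lund's actual pipeline:

```python
import numpy as np


def detect_spikes(trace: np.ndarray, threshold: float) -> np.ndarray:
    """Reduce a raw voltage trace to spike timestamps: the sample indices
    where the trace crosses the threshold upward. Transmitting only these
    indices, instead of the full waveform, is the kind of data reduction
    that makes streaming many channels to conventional hardware practical."""
    above = trace > threshold
    # An upward crossing is a sample that is above threshold while its
    # predecessor was not; +1 shifts back to absolute sample indices.
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1


# Example: a short trace with two super-threshold excursions yields
# exactly two spike timestamps.
trace = np.array([0.0, 0.1, 2.0, 0.1, 0.0, 3.0, 0.2])
spikes = detect_spikes(trace, 1.0)
```

Real spike sorting is far more involved (filtering, waveform clustering, per-channel thresholds), but the bandwidth argument rests on this event-based reduction.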

