Category Archives: BCI

Lightweight, highly portable, brain-controlled exoskeleton


EPFL’s José Millán has developed a brain-controlled, highly portable exoskeleton that can be quickly secured around joints with velcro. Metal cables act as soft tendons on the back of each finger, leaving the palm free to feel hand sensations. Motors worn on the chest push and pull the cables: fingers flex when the cables are pushed and extend when they are pulled.

The control interface can be eye-movement monitoring, phone-based voice control, residual muscular activity, or EEG-driven brainwave analysis. Hand motions induced by the device elicited brain patterns typical of healthy hand motion, and combining exoskeleton-induced motion with the brain interface led to distinctive neural patterns that could facilitate control of the device. Contralateral brain activity was observed in people who passively received hand motion from the exoskeleton, while subjects asked to control the exoskeleton with their thoughts showed consistent same-side patterns.
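As a rough illustration of the EEG-driven control option, here is a toy sketch (not EPFL’s actual pipeline; the threshold and windowing are invented for illustration) in which band power in an EEG window is thresholded into a flex or extend command for the cable motors:

```python
# Hypothetical sketch of an EEG-driven grasp controller: the power of an
# EEG window is compared against a threshold to decide whether the chest-worn
# motors should push the cables (flex) or pull them (extend).

def band_power(samples):
    """Mean squared amplitude of an EEG window (a crude power estimate)."""
    return sum(s * s for s in samples) / len(samples)

def grasp_command(eeg_window, threshold=1.0):
    """Return 'flex' when imagined-movement power exceeds the threshold."""
    return "flex" if band_power(eeg_window) > threshold else "extend"

# A strong oscillation triggers flexion; a quiet window relaxes the hand.
print(grasp_command([1.5, -1.4, 1.6, -1.5]))  # flex
print(grasp_command([0.1, -0.1, 0.2, -0.1]))  # extend
```

A real decoder would band-pass filter the signal and train the threshold per user, but the push/pull mapping stays this simple.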

Click to view EPFL video


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University. Speakers include: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg – Darin Okuda – Jason Heikenfeld – Bob Knight – Phillip Alvelda – Paul Nuyujukian – Peter Fischer – Tony Chahine – Shahin Farshchi – Ambar Bhattacharyya – Adam D’Augelli – Juan-Pablo Mas – Michael Eggleston – Walter Greenleaf – Jacobo Penide – David Sarno

Registration rates increase on January 26th


Closed loop EEG/BCI/VR/physical therapy system to control gait, prosthetics


Earlier this year, University of Houston’s Jose Luis Contreras-Vidal developed a closed-loop BCI/EEG/VR/physical therapy system to control gait as part of a rehab program for stroke and spinal cord injury patients. The goal was to promote and enhance cortical involvement during walking.

In a study, eight subjects walked on a treadmill while watching an avatar, wearing a 64-channel EEG headset and motion sensors at the hip, knee, and ankle.

The avatar was first driven by the motion sensors, allowing its movement to precisely mimic that of the test subject. It was then controlled by the brain-computer interface, although less precisely than with the motion sensors. Contreras-Vidal believes that as subjects learn to use the interface, the result will approach that of the sensors. The researchers reported increased activity in the posterior parietal cortex and the inferior parietal lobe, along with increased involvement of the anterior cingulate cortex.

The team built on this research to demonstrate how brain activity can be used to identify different terrains, with the goal of developing prosthetics that automatically adjust to changing ground conditions in real time.
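The terrain-to-prosthetic loop can be sketched in a few lines. This is a hedged toy example (the feature vectors, centroids, and stiffness values are all invented, not the published decoder): classify the upcoming terrain from a neural feature vector, then map the label to an ankle-stiffness setting.

```python
# Toy real-time terrain adaptation: nearest-centroid classification of a
# 2-D neural feature vector, then a lookup of the prosthetic stiffness.

TERRAIN_STIFFNESS = {"level": 0.5, "grass": 0.7, "stairs": 0.9}

def classify_terrain(features):
    """Nearest-centroid terrain classifier over toy 2-D feature vectors."""
    centroids = {"level": (0.0, 0.0), "grass": (1.0, 0.0), "stairs": (0.0, 1.0)}
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda t: dist2(features, centroids[t]))

def adjust_prosthetic(features):
    """Return the detected terrain and the stiffness to apply this step."""
    terrain = classify_terrain(features)
    return terrain, TERRAIN_STIFFNESS[terrain]

print(adjust_prosthetic((0.1, 0.9)))  # classified as stairs -> stiffness 0.9
```

Running this per gait cycle is what makes the adjustment "real time": each new feature vector immediately updates the stiffness command.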

Click to view University of Houston video



VR + neurofeedback for movement training after stroke


Join ApplySci at Wearable Tech + Digital Health + Neurotech Silicon Valley on February 26-27, 2018 at Stanford University, featuring: Vinod Khosla – Justin Sanchez – Brian Otis – Bryan Johnson – Zhenan Bao – Nathan Intrator – Carla Pugh – Jamshid Ghajar – Mark Kendall – Robert Greenberg

Direct brain path for sight, sound via implanted microscope


Rice University’s Jacob Robinson, with Yale and Columbia colleagues, is developing FlatScope — a flat, brain-implanted microscope that monitors and triggers neurons modified to fluoresce when active.

While capturing greater detail than current brain probes, the microscope also reaches deeper layers involved in sensory input processing, which the team hopes eventually to control.

Aiming to produce a super-high-resolution neural interface, FlatScope is part of DARPA’s NESD program, founded by Phillip Alvelda and now led by Brad Ringeisen.


Phillip Alvelda will be a featured speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech Boston conference on September 19, 2017 at the MIT Media Lab.  Other speakers include:  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar  – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen – Daniela Rus

Registration rates increase Friday, July 21st

BCI-controlled exoskeleton helps motor recovery in stroke


Ipsihand, developed by Eric Leuthardt and Washington University colleagues, is a brain-controlled glove that helps reroute hand control to an undamaged part of the brain. The system uses a glove or brace on the hand, an EEG cap, and an amplifier.

Each hand is controlled by the opposite side of the brain, so when one hemisphere is damaged by stroke, it becomes difficult to control the affected hand.

According to Leuthardt, the idea of Ipsihand is that if one can “couple those motor signals that are associated with moving the same-sided limb with the actual movements of the hand, new connections will be made in your brain that allow the uninjured areas of your brain to take over control of the paralyzed hand.”

Ipsihand’s cap detects the intention to open or close the hand, and the computer amplifies those signals. The brace then opens or closes the hand inside it in a pincer-like grip, bending the fingers and thumb to meet.
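The detect-amplify-actuate loop can be sketched as follows. This is purely illustrative (the gain, threshold, and signal values are assumptions, not the Ipsihand firmware): a weak intention signal is amplified, and the brace closes when the amplified signal crosses a threshold.

```python
# Illustrative Ipsihand-style control loop: amplify the EEG intention
# signal, then map a threshold crossing to a brace command.

def amplify(signal, gain=1000.0):
    """Scale a microvolt-level EEG window up to a workable range."""
    return [s * gain for s in signal]

def brace_state(amplified_window, threshold=50.0):
    """'close' when mean amplified intention exceeds the threshold, else 'open'."""
    mean = sum(amplified_window) / len(amplified_window)
    return "close" if mean > threshold else "open"

raw = [0.08, 0.09, 0.07]          # microvolt-scale intention signal
print(brace_state(amplify(raw)))  # close
```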



Tetraplegic patient moves arm with thoughts via BCI/FES system


Bolu Ajiboye and Case Western colleagues used an implanted BrainGate2 brain-computer interface to allow a patient with tetraplegia to control arm movements via an implanted FES muscle-stimulation system. A robotic arm, needed in previous BrainGate experiments, was no longer required.

Neural activity was recorded from two 96-channel microelectrode arrays implanted in the motor cortex. The implanted brain-computer interface translated the recorded brain signals into command signals determining the amount of stimulation applied to each functional electrical stimulation electrode in the hand, wrist, arm, elbow, and shoulder, and to a mobile arm support.
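The translation step can be sketched minimally, assuming a linear decoder (the weights, electrode names, and clipping range here are invented for illustration, not the published method): recorded firing rates are mapped to a per-electrode stimulation level, clipped to a safe range.

```python
# Minimal linear-decoder sketch: firing rates -> per-electrode FES levels.

def decode_stimulation(firing_rates, weights, max_level=1.0):
    """Weighted sum of firing rates, clipped to [0, max_level] per electrode."""
    levels = {}
    for electrode, ws in weights.items():
        raw = sum(w * r for w, r in zip(ws, firing_rates))
        levels[electrode] = max(0.0, min(max_level, raw))
    return levels

rates = [0.2, 0.8, 0.1]  # normalized firing rates from the cortical arrays
weights = {"elbow_flexor": [0.5, 0.5, 0.0], "wrist_extensor": [0.0, 0.2, 2.0]}
print(decode_stimulation(rates, weights))
```

The same pattern extends to coordinated tasks: a task-level intention simply activates a learned combination of electrode weights rather than one joint at a time.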

The researchers first exercised the patient’s arm and hand with cyclical electrical stimulation patterns. Over 45 weeks, his strength, range of motion, and endurance improved. He then learned to use his own brain signals to move a virtual-reality arm on a computer screen. After the 36-electrode FES system was implanted, he could move each joint in his right arm individually, or, by thinking about a task such as feeding himself or getting a drink, activate the muscles in a coordinated fashion.



Magnetic coils might improve neural prostheses


Neocortex stimulation is used to treat neurological disorders, including Parkinson’s disease and paralysis, but current electrode-based implants have limited efficacy: it is difficult to create precise patterns of neural activity or to achieve consistent responses over time. Magnetic stimulation could address this, but until now, coils small enough to be implanted into the cortex were not thought strong enough to activate neurons. Shelley Fried at Harvard has created a microcoil design that has proven effective at activating cortical neurons and driving behavioral responses.



Toward a speech-driven auditory Brain Computer Interface


University of Oldenburg student Carlos Filipe da Silva Souto is in the early stages of developing a brain-computer interface that can tell a user whom he or she is listening to in a noisy room. Wearers could focus on specific conversations and tune out background noise.

Most BCI studies have focused on visual stimuli, which typically outperform auditory stimuli, possibly because of the larger cortical surface of the visual system. As researchers further optimize classification methods for auditory systems, performance should improve.

The goal is for visually impaired or paralyzed people to be able to use natural speech features to control hearing devices.
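One common way to frame auditory attention decoding (an assumed approach for illustration, not necessarily da Silva Souto’s method) is to correlate the listener’s EEG envelope with each candidate speaker’s speech envelope and pick the best match:

```python
# Toy auditory-attention decoder: the attended speaker is the one whose
# speech envelope correlates most strongly with the EEG envelope.

def correlation(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def attended_speaker(eeg_envelope, speaker_envelopes):
    """Return the speaker whose envelope best matches the EEG envelope."""
    return max(speaker_envelopes,
               key=lambda s: correlation(eeg_envelope, speaker_envelopes[s]))

eeg = [0.1, 0.9, 0.2, 0.8, 0.1]
speakers = {"A": [0.0, 1.0, 0.1, 0.9, 0.0], "B": [0.9, 0.1, 0.8, 0.2, 0.9]}
print(attended_speaker(eeg, speakers))  # A
```

A hearing device could then boost the selected speaker’s audio stream and attenuate the rest.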

ApplySci’s 6th  Digital Health + NeuroTech Silicon Valley  –  February 7-8 2017 @ Stanford   |   Featuring:   Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Roozbeh Ghaffari –Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Sky Christopherson – Marcus Weldon – Krishna Shenoy – Karl Deisseroth – Shahin Farshchi – Casper de Clercq – Mary Lou Jepsen – Vivek Wadhwa – Dirk Schapeler – Miguel Nicolelis


BCI-controlled robotic arm; no implant required


University of Minnesota professor Bin He has created a brain-computer interface to control a robotic arm without an implant.

In a recent study, EEG alone allowed eight people to move objects in a complex 3D environment. Advanced signal processing and machine learning turned their thoughts into actions.

64 electrodes were required, making a clinical setting necessary. However, if the system can be adapted to the smaller, mobile EEG bands now being developed, a mobile or home-based noninvasive interface might be possible in the future.


 

Implanted device enables slow, but more reliable, brain-driven typing


UMC Utrecht professor Nick Ramsey has developed an implantable brain-computer interface that allowed a locked-in ALS patient to (very slowly) translate her thoughts into text.

Until recently, the patient used eye tracking to spell words on a computer screen, as many ALS patients do. However, because about one third of patients lose the ability to control their eyes, scientists have been looking for thought-controlled alternatives, both invasive (implant-driven) and noninvasive (EEG-driven).

While the implant allows the user to “type” only two letters per minute, its advantage is that it does not require the constant calibration of earlier devices, which confined them to the lab or made them unreliable.

Ramsey simplified the process by detecting only the brain signals that fire when the brain counts backwards and commands the body to click a mouse.
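The detection idea reduces to spotting sustained activations in a single signal. Here is a hypothetical sketch (threshold, window length, and trace values are all invented): emit one "click" each time the counting-backwards signal stays above threshold for a few consecutive samples.

```python
# Toy click detector for a binary BCI: one click per sustained
# above-threshold run in the decoded "counting backwards" signal.

def detect_clicks(signal, threshold=0.5, min_len=3):
    """Emit a click each time the signal stays above threshold for min_len samples."""
    clicks, run = 0, 0
    for s in signal:
        run = run + 1 if s > threshold else 0
        if run == min_len:   # fires once per sustained activation
            clicks += 1
    return clicks

trace = [0.1, 0.6, 0.7, 0.8, 0.2, 0.9, 0.9, 0.9, 0.9, 0.1]
print(detect_clicks(trace))  # 2 clicks
```

Requiring a sustained run rather than a single sample crossing is what keeps such a detector stable without constant recalibration.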

Click to view UMC Utrecht video

