Category Archives: BCI

Direct brain path for sight, sound via implanted microscope


Rice University’s Jacob Robinson, with Yale and Columbia colleagues, is developing FlatScope, a flat, brain-implanted microscope that monitors and triggers neurons modified to fluoresce when active.

While capturing greater detail than current brain probes, the microscope can also image the deeper levels of the brain where sensory input is processed, which the team hopes eventually to control.

Aiming to produce a super-high-resolution neural interface, FlatScope is part of DARPA’s NESD program, founded by Phillip Alvelda and now led by Brad Ringeisen.


Phillip Alvelda will be a featured speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech Boston conference on September 19, 2017 at the MIT Media Lab.  Other speakers include:  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar  – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary – Mary Lou Jepsen – Daniela Rus

Registration rates increase Friday, July 21st

BCI-controlled exoskeleton helps motor recovery in stroke


Ipsihand, developed by Eric Leuthardt and Washington University colleagues, is a brain-controlled glove that helps reroute hand control to an undamaged part of the brain. The system uses a glove or brace on the hand, an EEG cap, and an amplifier.

Each hand is controlled by the opposite hemisphere of the brain; if one hemisphere is damaged, control of the opposite hand is impaired.

According to Leuthardt, the idea behind Ipsihand is that if one can “couple those motor signals that are associated with moving the same-sided limb with the actual movements of the hand, new connections will be made in your brain that allow the uninjured areas of your brain to take over control of the paralyzed hand.”

Ipsihand’s EEG cap detects the wearer’s intention to open or close the hand, and the amplifier boosts those signals for the computer to interpret. The brace then opens or closes in a pincer-like grip with the hand inside, bending the fingers and thumb to meet.
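For readers curious how the pieces fit together, here is a minimal Python sketch of an Ipsihand-style control loop. It is an illustration only: the sampling rate, channel count, and mu-band threshold logic are assumptions, not the Washington University implementation.

```python
# Hypothetical Ipsihand-style loop: detect movement intent in EEG,
# then issue an open/close command to the brace. All parameters here
# are illustrative assumptions, not the actual system's values.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256                 # assumed EEG sampling rate (Hz)
MU_BAND = (8.0, 12.0)    # mu rhythm, which weakens with movement intent

def mu_band_power(window: np.ndarray) -> float:
    """Mean mu-band power across channels for one EEG window (channels x samples)."""
    b, a = butter(4, [f / (FS / 2) for f in MU_BAND], btype="band")
    filtered = filtfilt(b, a, window, axis=1)
    return float(np.mean(filtered ** 2))

def classify_intent(window: np.ndarray, threshold: float) -> str:
    # Mu power drops (event-related desynchronization) when the user
    # intends to move, so low power maps to the "close" command here.
    return "close" if mu_band_power(window) < threshold else "open"

# One step of the loop: read a 1-second window, decide, drive the brace.
window = np.random.randn(8, FS)       # stand-in for amplified EEG
command = classify_intent(window, threshold=0.9)
print("brace command:", command)      # a real system would actuate the brace
```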


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab – featuring  Joi Ito – Ed Boyden – Roz Picard – George Church – Nathan Intrator –  Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Michael Weintraub – Nancy Brown – Steve Kraus – Bill Geary

Tetraplegic patient moves arm with thoughts via BCI/FES system


Bolu Ajiboye and Case Western colleagues used an implanted BrainGate2 brain-computer interface to allow a patient with tetraplegia to control arm movements via an implanted FES muscle stimulation system. A robotic arm, which was needed in previous BrainGate experiments, was no longer required.

Neural activity was recorded from two 96-channel microelectrode arrays implanted in the motor cortex. The implanted brain-computer interface translated recorded brain signals into specific command signals that determined the amount of stimulation to be applied to each functional electrical stimulation electrode in the hand, wrist, arm, elbow and shoulder, and to a mobile arm support.
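The decode-to-stimulation step can be pictured with a short, hedged sketch: decoded movement intents are mapped to per-electrode current amplitudes and clipped to a safety ceiling. The electrode names, gains, and linear mapping below are illustrative assumptions, not the actual BrainGate2/FES parameters.

```python
# Hypothetical mapping from decoded intents to FES stimulation amplitudes.
import numpy as np

# Illustrative decoded per-joint intents in [-1, 1], one per electrode group.
decoded_intent = np.array([0.8, -0.2, 0.0, 0.5, 0.1])
MAX_AMPLITUDE_MA = 20.0   # assumed per-electrode safety ceiling (milliamps)

def intents_to_stimulation(intent: np.ndarray) -> np.ndarray:
    """Map decoded intents to clipped stimulation amplitudes per electrode."""
    amplitudes = np.abs(intent) * MAX_AMPLITUDE_MA  # stronger intent -> more current
    return np.clip(amplitudes, 0.0, MAX_AMPLITUDE_MA)

for name, amp in zip(["hand", "wrist", "elbow", "shoulder", "arm support"],
                     intents_to_stimulation(decoded_intent)):
    print(f"{name}: {amp:.1f} mA")
```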

The researchers first exercised the patient’s arm and hand with cyclical electrical stimulation patterns. Over 45 weeks, his strength, range of motion, and endurance improved. He then learned how to use his own brain signals to move a virtual-reality arm on a computer screen. After the 36-electrode FES system was implanted, he was able to make each joint in his right arm move individually, or, by thinking about a task such as feeding himself or getting a drink, to activate the muscles in a coordinated fashion.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston on September 19, 2017 at the MIT Media Lab. Featuring Joi Ito – Ed Boyden – Roz Picard – George Church – Tom Insel – John Rogers – Jamshid Ghajar – Phillip Alvelda – Nathan Intrator

Magnetic coils might improve neural prostheses


Neocortex stimulation is used to treat neurological disorders, including Parkinson’s disease and paralysis. Current electrode-based implants have limited efficacy: it is difficult to create precise patterns of neural activity or to achieve consistent responses over time. Magnetic stimulation could address this, but until now, coils small enough to be implanted into the cortex were not thought strong enough to activate neurons. Shelley Fried at Harvard has created a microcoil design that has been effective at activating cortical neurons and driving behavioral responses.


Join ApplySci at Wearable Tech + Digital Health + NeuroTech Boston – Featuring Roz Picard, Tom Insel, John Rogers and Nathan Intrator – September 19, 2017 at the MIT Media Lab

Toward a speech-driven auditory Brain Computer Interface


University of Oldenburg student Carlos Filipe da Silva Souto is in the early stages of developing a brain-computer interface that can tell a user whom he or she is listening to in a noisy room. Wearers could focus on specific conversations and tune out background noise.

Most BCI studies have focused on visual stimuli, which typically outperform auditory stimuli, possibly because of the larger cortical surface of the visual system. As researchers further optimize classification methods for auditory systems, performance should improve.
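One widely used idea in auditory attention decoding is envelope correlation: choose the speaker whose speech envelope best matches the listener’s EEG. The sketch below illustrates that idea with synthetic data; the Oldenburg group’s actual classifier may well differ.

```python
# Envelope-correlation sketch for auditory attention decoding (illustrative).
import numpy as np

def attended_speaker(eeg: np.ndarray, envelopes: list) -> int:
    """Return the index of the speech envelope most correlated with the EEG.

    eeg: averaged, low-pass-filtered EEG time course (samples,)
    envelopes: candidate speakers' amplitude envelopes, same length as eeg
    """
    scores = [np.corrcoef(eeg, env)[0, 1] for env in envelopes]
    return int(np.argmax(scores))

# Toy example with synthetic data: the EEG "tracks" speaker 1.
rng = np.random.default_rng(0)
env_a, env_b = rng.standard_normal(1000), rng.standard_normal(1000)
eeg = 0.7 * env_b + 0.3 * rng.standard_normal(1000)
print("attended speaker:", attended_speaker(eeg, [env_a, env_b]))  # -> 1
```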

The goal is for visually impaired or paralyzed people to be able to use natural speech features to control hearing devices.

ApplySci’s 6th Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Sky Christopherson – Marcus Weldon – Krishna Shenoy – Karl Deisseroth – Shahin Farshchi – Casper de Clercq – Mary Lou Jepsen – Vivek Wadhwa – Dirk Schapeler – Miguel Nicolelis

BCI-controlled robotic arm; no implant required


University of Minnesota professor Bin He has created a brain-computer interface to control a robotic arm without an implant.

In a recent study, EEG alone allowed eight people to move objects in a complex 3D environment. Combined with advanced signal processing and machine learning, their thoughts were turned into actions.
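As a rough illustration of that kind of pipeline, the sketch below pairs band-power features with a linear classifier on synthetic EEG epochs. The sampling rate, feature band, and classifier are assumptions chosen for clarity, not the Minnesota group’s actual code.

```python
# Hedged sketch of an EEG decoding pipeline: band-power features + LDA.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # assumed sampling rate (Hz)

def band_power_features(epochs: np.ndarray, band=(8.0, 30.0)) -> np.ndarray:
    """Per-channel mu/beta band power for each epoch (epochs x channels x samples)."""
    freqs, psd = welch(epochs, fs=FS, nperseg=FS, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)   # shape: (epochs, channels)

# Toy training data: 100 one-second epochs of 64-channel EEG, with binary
# labels standing in for imagined left- vs right-hand movement.
rng = np.random.default_rng(1)
X_raw = rng.standard_normal((100, 64, FS))
y = rng.integers(0, 2, 100)

clf = LinearDiscriminantAnalysis().fit(band_power_features(X_raw), y)
print("decoded command:", clf.predict(band_power_features(X_raw[:1]))[0])
```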

Sixty-four electrodes were required, making a clinical setting necessary. However, if this system can be adapted to the smaller, mobile EEG headbands now being developed, a mobile or home-based noninvasive interface might be possible in the future.

ApplySci’s 6th Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Sky Christopherson – Marcus Weldon – Krishna Shenoy – Karl Deisseroth – Shahin Farshchi – Casper de Clercq – Mary Lou Jepsen – Vivek Wadhwa – Dirk Schapeler – Miguel Nicolelis

Implanted device enables slow, but more reliable, brain-driven typing


UMC Utrecht professor Nick Ramsey has developed an implantable brain computer interface that allowed a locked-in ALS patient to (very slowly) translate her thoughts into text.

Until recently, the patient used eye tracking to spell words on a computer screen, as many ALS patients do. However, as one-third of patients lose the ability to control their eyes, scientists have been looking for thought-controlled alternatives, both invasive (implant-driven) and noninvasive (EEG-driven).

While the implant only allows the user to “type” two letters per minute, its success is that it does not require the constant calibration of earlier devices, which confined them to the lab or made them unreliable.

Ramsey simplified the process by focusing on detecting only the brain signals that fire when the patient counts backwards or commands the body to click a mouse.
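A one-signal “brain click” like this can be made more reliable by requiring the control signal to stay elevated across several consecutive windows before a letter is selected. The threshold-and-hold logic below is an assumption for illustration, not UMC Utrecht’s actual algorithm.

```python
# Hypothetical threshold-and-hold detector for a single "brain click" signal.
import numpy as np

def detect_click(power: np.ndarray, threshold: float, hold: int) -> bool:
    """Register a click only when band power stays above threshold for
    `hold` consecutive windows, trading typing speed for reliability."""
    run = 0
    for value in power:
        run = run + 1 if value > threshold else 0
        if run >= hold:
            return True
    return False

# Simulated power trace: sustained activity (e.g., attempted movement or
# backwards counting) crosses threshold long enough to select a letter.
trace = np.array([0.2, 0.3, 1.4, 1.5, 1.6, 1.7, 0.4])
print("click:", detect_click(trace, threshold=1.0, hold=3))  # -> True
```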

Click to view UMC Utrecht video


ApplySci’s 6th   Wearable Tech + Digital Health + NeuroTech Silicon Valley  –  February 7-8 2017 @ Stanford   |   Featuring:   Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari –Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth

BCI for astronaut equipment control, ground communication


The China Astronaut Research and Training Center and Tianjin University are developing a BCI system to allow astronauts to control spacecraft equipment with their thoughts. Brain impulses will be translated into words, to operate instruments and communicate with ground control.

The system will be tested by astronauts in space, and information about it is currently on display at the International Simulation Expo in Beijing.


ApplySci’s 6th   Wearable Tech + Digital Health + NeuroTech Silicon Valley  –  February 7-8 2017 @ Stanford   |   Featuring:   Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari –Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth

“Cybathlon” features robotic exoskeleton, BCI competitions


ETH professor Robert Riener’s first Cybathlon will feature competitions using robotic prostheses and brain-computer interfaces. Disabled participants will compete in brain-controlled races and exoskeleton-controlled tasks. Many tasks will include common obstacles, like doors, ramps, and stairs, as the goal is to develop technology that increases independence and makes the activities of daily living less difficult.

Click to view the Cybathlon trailer


ApplySci’s 6th Wearable Tech + Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Mary Lou Jepsen – Vivek Wadhwa – Miguel Nicolelis – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Krishna Shenoy – Karl Deisseroth

Monkeys type Shakespeare with thoughts; human trials underway


Stanford professor Krishna Shenoy has developed technology that reads brain signals from electrodes implanted in monkeys to control a cursor moving over a keyboard. A clinical trial will begin soon, with the goal of creating brain-computer interfaces to help paralyzed people communicate. This could overcome the limitations of eye-controlled keyboards, which do not work when one loses control of the muscles around his or her eyes.

Two monkeys were taught to point to patterns of yellow and green dots on a screen, which spelled letters. Electrodes were then implanted in their brains, and the monkeys were shown flashing dot patterns that spelled texts from Hamlet and the New York Times. The arrays measured brain activity as a monkey thought about where to point its arm, which it had learned to direct toward the next letter to be spelled.

Using only their thoughts, the monkeys typed the texts at a speed of 12 words per minute.

Click to view Stanford University video.


Professor Shenoy will discuss this experiment, and current human trials, at ApplySci’s upcoming Wearable Tech + Digital Health + NeuroTech Silicon Valley conference, on February 7-8 at Stanford University.