Category Archives: BCI

Toward a speech-driven auditory Brain Computer Interface

University of Oldenburg student Carlos Filipe da Silva Souto is in the early stages of developing a brain-computer interface that can identify whom a user is listening to in a noisy room. Wearers could focus on specific conversations and tune out background noise.

Most BCI studies have focused on visual stimuli, which typically outperform auditory stimuli, possibly because of the larger cortical surface of the visual system. As researchers further optimize classification methods for auditory systems, performance should improve.
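
A common technique behind such systems is stimulus reconstruction: a linear model maps EEG back to a speech envelope, and the speaker whose envelope best matches the reconstruction is taken as the attended one. The sketch below is illustrative only, not the Oldenburg group's published pipeline; the function names, ridge regularization, and 32-lag window are assumptions.

```python
# Minimal stimulus-reconstruction sketch (illustrative; not the Oldenburg
# group's published pipeline): a ridge-regression "backward model" maps
# time-lagged EEG to the attended speech envelope, and the speaker whose
# envelope correlates best with the reconstruction wins.
import numpy as np
from sklearn.linear_model import Ridge

def lagged(eeg, n_lags=32):
    """Stack time-lagged copies of each EEG channel as regression features."""
    t, ch = eeg.shape
    feats = np.zeros((t, ch * n_lags))
    for lag in range(n_lags):
        feats[lag:, lag * ch:(lag + 1) * ch] = eeg[:t - lag]
    return feats

def train_decoder(eeg, attended_env):
    """Fit the backward model on a calibration block with known attention."""
    return Ridge(alpha=1.0).fit(lagged(eeg), attended_env)

def attended_speaker(model, eeg, envelopes):
    """Index of the candidate speech envelope best matching the EEG."""
    recon = model.predict(lagged(eeg))
    corrs = [np.corrcoef(recon, env)[0, 1] for env in envelopes]
    return int(np.argmax(corrs))
```

In practice the decoder would be trained on calibration blocks where the attended speaker is known, then applied to new EEG in real time.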

The goal is for visually impaired or paralyzed people to be able to use natural speech features to control hearing devices.

ApplySci’s 6th Digital Health + NeuroTech Silicon Valley – February 7-8, 2017 @ Stanford | Featuring: Vinod Khosla – Tom Insel – Zhenan Bao – Phillip Alvelda – Nathan Intrator – John Rogers – Roozbeh Ghaffari – Tarun Wadhwa – Eythor Bender – Unity Stoakes – Mounir Zok – Sky Christopherson – Marcus Weldon – Krishna Shenoy – Karl Deisseroth – Shahin Farshchi – Casper de Clercq – Mary Lou Jepsen – Vivek Wadhwa – Dirk Schapeler – Miguel Nicolelis

BCI controlled robotic arm; no implant required

University of Minnesota professor Bin He has created a brain-computer interface that controls a robotic arm without an implant.

In a recent study, EEG alone, combined with advanced signal processing and machine learning, allowed eight people to turn their thoughts into actions and move objects in a complex 3D environment.

Sixty-four electrodes were required, making a clinical setting necessary. However, if this system can be adapted to the smaller, mobile EEG bands now being developed, a mobile or home-based noninvasive interface might be possible in the future.
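
For context, noninvasive decoders like this typically build on the standard EEG motor-imagery pipeline: band-pass filter the sensorimotor rhythms, extract spatial features, and classify. The sketch below illustrates that generic pipeline, Common Spatial Patterns plus linear discriminant analysis, with simulated data standing in for recordings; it is not Bin He's actual decoder, and all shapes and parameters are assumptions.

```python
# Illustrative sketch of the generic EEG motor-imagery pipeline such systems
# build on (not Bin He's actual decoder; all shapes and parameters assumed):
# band-pass the sensorimotor rhythms, extract Common Spatial Pattern (CSP)
# features, and classify with linear discriminant analysis (LDA).
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def bandpass(trials, fs=250.0, lo=8.0, hi=30.0):
    """Band-pass filter trials of shape (n_trials, n_channels, n_samples)."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, trials, axis=-1)

def csp_filters(class_a, class_b, n_pairs=3):
    """Spatial filters maximizing variance for one class over the other."""
    cov = lambda t: np.mean([x @ x.T / np.trace(x @ x.T) for x in t], axis=0)
    vals, vecs = eigh(cov(class_a), cov(class_a) + cov(class_b))
    keep = np.r_[np.arange(n_pairs), np.arange(-n_pairs, 0)]  # extreme eigenvectors
    return vecs[:, keep].T

def features(trials, filters):
    """Log band-power of each spatially filtered signal."""
    return np.log((filters @ trials).var(axis=-1))

# Simulated data standing in for real 64-channel recordings (40 trials/class).
left, right = np.random.randn(40, 64, 500), np.random.randn(40, 64, 500)
la, rb = bandpass(left), bandpass(right)
W = csp_filters(la, rb)
X = np.vstack([features(la, W), features(rb, W)])
y = np.r_[np.zeros(40), np.ones(40)]
clf = LinearDiscriminantAnalysis().fit(X, y)  # predicts intended movement class
```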

Implanted device enables slow, but more reliable, brain-driven typing

UMC Utrecht professor Nick Ramsey has developed an implantable brain-computer interface that allowed a locked-in ALS patient to (very slowly) translate her thoughts into text.

Until recently, the patient used eye tracking to spell words on a computer screen, as many ALS patients do. However, because a third of patients lose the ability to control their eyes, scientists have been looking for thought-controlled alternatives, both invasive (implant-driven) and noninvasive (EEG-driven).

The success of this implant, while it allows the user to “type” only two letters per minute, is that it does not require the constant calibration of earlier devices, which confined them to the lab or made them unreliable.

Ramsey simplified the process by focusing on detecting only the brain signals that fire when the brain counts backwards and commands the body to click a mouse.
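
Functionally, the speller needs only a one-bit output: a reliable “brain click.” Below is a minimal sketch of such a detector, assuming a precomputed band-power signal from the implanted electrodes; the threshold, dwell, and refractory parameters are illustrative, not UMC Utrecht's values.

```python
# Minimal "brain click" detector (illustrative; not UMC Utrecht's
# implementation or parameters): emit a click when the band power computed
# from the implanted electrodes stays above threshold for a dwell period.
def detect_clicks(band_power, threshold, dwell_samples, refractory):
    """band_power: 1-D sequence over time; returns sample indices of clicks."""
    clicks, run, last = [], 0, -refractory
    for i, p in enumerate(band_power):
        run = run + 1 if p > threshold else 0   # consecutive samples above threshold
        if run >= dwell_samples and i - last >= refractory:
            clicks.append(i)                    # register one click...
            last, run = i, 0                    # ...then enforce a refractory gap
    return clicks
```

The dwell requirement trades speed for reliability, which matches the device's slow but dependable two-letters-per-minute performance.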

Click to view UMC Utrecht video


BCI for astronaut equipment control, ground communication

The China Astronaut Research and Training Center and Tianjin University are developing a BCI system to allow astronauts to control spacecraft equipment with their thoughts. Brain impulses will be translated into words, to operate instruments and  communicate with ground control.

The system will be tested by astronauts in space; information about it is currently on display at the International Simulation Expo in Beijing.


“Cybathlon” features robotic exoskeleton, BCI competitions

ETH professor Robert Riener's first Cybathlon will feature competitions using robotic prostheses and brain-computer interfaces. Disabled participants will compete in brain-controlled races and exoskeleton-assisted tasks. Many tasks will include common obstacles, like doors, ramps, and stairs, as the goal is to develop technology that increases independence and makes the activities of daily living less difficult.

Click to view the Cybathlon trailer


Monkeys type Shakespeare with thoughts; human trials underway

Stanford professor Krishna Shenoy has developed technology that reads brain signals from electrodes implanted in monkeys to control a cursor moving over a keyboard. A clinical trial will begin soon, with the goal of creating brain-computer interfaces to help paralyzed people communicate. This could overcome the limitations of eye-controlled keyboards, which fail when one loses control of the muscles around the eyes.

Two monkeys were taught to point to patterns of on-screen yellow and green dots, which spelled letters. Electrodes were then implanted in their brains, and the monkeys were shown flashing dot patterns that spelled passages from Hamlet and the New York Times. The arrays measured brain activity as a monkey thought about where to point its arm, which it had learned to direct to the next letter to be spelled.

Using only their thoughts, the monkeys typed the texts at a speed of 12 words per minute.
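
Decoders of this kind typically map each time bin's spike counts to a cursor velocity and integrate to get position. The Shenoy lab's published systems use a Kalman-filter variant; the simpler linear readout below is only an illustration, with random numbers standing in for recordings and fitted weights.

```python
# Sketch of the linear readout idea behind such cursor decoders (the lab's
# published systems use a Kalman-filter variant; weights and data here are
# random stand-ins, purely for illustration).
import numpy as np

rng = np.random.default_rng(0)
n_units, n_bins, dt = 96, 200, 0.1       # ~100 channels, 100 ms bins (assumed)
L = rng.normal(size=(2, n_units)) * 0.01 # decoder weights; in practice fit by
                                         # regressing cursor velocity on rates
spikes = rng.poisson(5.0, size=(n_bins, n_units))  # binned spike counts

pos, trajectory = np.zeros(2), []
for counts in spikes:
    vel = L @ counts                     # firing rates -> 2-D cursor velocity
    pos = pos + vel * dt                 # integrate velocity into position
    trajectory.append(pos)
# A letter selection ("click") would be decoded separately, completing the
# point-and-click speller.
```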

Click to view Stanford University video.


Professor Shenoy will discuss this experiment, and the current human trials, at ApplySci’s upcoming Wearable Tech + Digital Health + NeuroTech Silicon Valley conference, on February 7-8 at Stanford University.

Robotic hand exoskeleton for stroke patients

ETH professor Roger Gassert has developed a robotic exoskeleton that allows stroke patients  to perform daily activities by supporting motor and somatosensory functions.

His vision is that “instead of performing exercises in an abstract situation at the clinic, patients will be able to integrate them into their daily life at home, supported by a robot.” He observes that existing exoskeletons are heavy, leaving patients unable to lift their hands, and that wearers have difficulty feeling objects and exerting the right amount of force. To address this, the palm of the hand is left free in the new device.

Gassert’s Kyushu University colleague Jumpei Arata developed a mechanism for the finger featuring three overlapping leaf springs. A motor moves the middle spring, which transmits the force to the different segments of the finger through the other two springs. The fingers thus automatically adapt to the shape of the object the patient wants to grasp.
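
As a rough illustration of the mechanics, each leaf can be modeled as a cantilever, and springs passing force along in series combine into a softer, more compliant fingertip. The calculation below is a toy example with assumed dimensions and material, not the published design.

```python
# Toy stiffness calculation under assumed dimensions (not the published
# design): model each leaf as a cantilever, k = 3*E*I/L**3 with I = b*h**3/12,
# and combine springs that pass force along in series: 1/k_eq = sum(1/k_i).
E = 193e9                                  # Pa, spring steel (assumption)

def leaf_stiffness(length, width, thickness):
    I = width * thickness**3 / 12.0        # second moment of area, m^4
    return 3.0 * E * I / length**3         # cantilever tip stiffness, N/m

# Hypothetical 50 mm x 8 mm leaves, 0.4-0.5 mm thick.
springs = [leaf_stiffness(0.05, 0.008, t) for t in (0.4e-3, 0.5e-3, 0.4e-3)]
k_eq = 1.0 / sum(1.0 / k for k in springs)
print(f"equivalent fingertip stiffness: {k_eq:.0f} N/m")
```

This compliance is what lets the fingers conform passively to an object's shape instead of requiring precise force control from the motor.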

To reduce the weight of the exoskeleton, motors are placed on the patient’s back and  force is transmitted using a bicycle brake cable. ApplySci hopes that the size and weight of the motor can be reduced, allowing it to be integrated into the exoskeleton in its next phase.

Gassert wants to make the exoskeleton thought-controlled, and is using MRI and EEG to detect, in the brain, a patient’s intention to move his or her hand, and to communicate this to the device.
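
One plausible EEG marker for that intention, shown in the sketch below as an assumption rather than Gassert's published method, is event-related desynchronization: a drop in mu-band (8-12 Hz) power over motor cortex as a movement is planned, which could gate the exoskeleton.

```python
# Sketch of an event-related desynchronization (ERD) trigger: movement
# intention is assumed to show up as a drop in mu-band (8-12 Hz) power over
# motor cortex relative to rest. An illustration, not Gassert's method.
import numpy as np
from scipy.signal import welch

def mu_power(eeg_window, fs=250.0):
    """Mean 8-12 Hz power of a single-channel EEG window."""
    f, pxx = welch(eeg_window, fs=fs, nperseg=int(fs))
    return pxx[(f >= 8) & (f <= 12)].mean()

def intends_to_move(eeg_window, baseline_power, drop_ratio=0.6):
    """Trigger the exoskeleton when mu power falls well below baseline."""
    return mu_power(eeg_window) < drop_ratio * baseline_power
```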


Mobile brain health management

After scanning the brains of ALS, epilepsy, minimally conscious, schizophrenia, memory-impaired, and healthy subjects to monitor brain health and treatment effectiveness, Brown and Tel Aviv University professor Nathan Intrator has commercialized his algorithms and launched Neurosteer.

At ApplySci’s recent NeuroTech NYC conference, Professor Intrator discussed dramatic advances in BCI, monitoring, and neurofeedback in his keynote, and in a panel with DARPA Biological Technologies Office director Justin Sanchez. He also described Neurosteer’s progress, and the rapidly developing world of technology for cognitive and emotional wellness, in an interview with StartUp Health’s Unity Stoakes. Click to view the June 2016 StartUp Health NOW interview, or Steve Krein and Intrator’s June 2015 interview.


BCI + VR helps paraplegics regain sensation, muscle control

Duke professor Miguel Nicolelis has used BCI and VR to help eight paralyzed people regain partial sensation and muscle control in their lower limbs.

Five study participants had been paralyzed  for more than five years. Two had been paralyzed for more than ten.

The patients used their own brain activity to simulate full control of their legs. Over time, improvements in motor behavior, tactile sensations, and visceral functions below the level of the spinal cord injury were evident. Nicolelis believes that with weekly training, spinal cord nerves were re-engaged.

Most study participants had sensation and muscle control improvements after seven months. After a year, doctors upgraded four patients’ diagnoses from complete to partial paralysis. Most also saw improvements in bladder control and bowel function, reducing the need for laxatives and catheters, which can cause infections.

Almost all have continued the training for two years.  Nicolelis will publish data on their progress, and intends to create a new trial for recent spinal cord injury patients.  The hope is that faster treatment can lead to better results.

Click to view Duke University videos of the study


Thought controlled prosthetic arm has human-like movement, strength

This week at the Pentagon, Johnny Matheny unveiled his DARPA-developed prosthetic arm. The mind-controlled prosthesis has the same size, weight, shape, and grip strength as a human arm, and, according to Matheny, can do anything a natural arm can do.

It is, by all accounts, the most advanced prosthetic limb created to date.

The 100-sensor arm was developed as part of the Revolutionizing Prosthetics program of DARPA’s Biological Technologies Office, led by Dr. Justin Sanchez.

An implanted neural interface allows the wearer to control the arm with his thoughts. Sensors are also implanted in the fingertips, sending signals back to the brain and allowing the wearer to feel sensations.

Click to view Johns Hopkins video.

Dr. Sanchez will be a keynote speaker at ApplySci’s Wearable Tech + Digital Health + NeuroTech NYC conference on June  7-8, 2016 at the New York Academy of Sciences.