Showing posts with label Brain–computer interface. Show all posts

Tuesday, August 18, 2015

Researchers Use Brain-Machine Interface to Control Exoskeleton


Brain-Machine Interfaces


Researchers have developed a brain-computer control interface for a lower limb exoskeleton by decoding specific signals from within the user's brain.


Scientists working at Korea University in Korea and TU Berlin in Germany have developed a brain-computer control interface for a lower limb exoskeleton by decoding specific signals from within the user's brain.

The study has been published in the Journal of Neural Engineering.

Using an electroencephalogram (EEG) cap, the system allows users to move forwards, turn left and right, sit and stand simply by staring at one of five flickering light-emitting diodes (LEDs).

Each of the five LEDs flickers at a different frequency, and when the user focuses their attention on a specific LED this frequency is reflected within the EEG readout. This signal is identified and used to control the exoskeleton.
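This frequency-tagging scheme is the standard steady-state visual evoked potential (SSVEP) approach: the EEG spectrum shows a peak at whichever LED's flicker rate the user is attending to. The paper's actual decoder is more sophisticated, but a minimal sketch of the idea, with made-up sampling rate and flicker frequencies, might look like this:

```python
import numpy as np

def classify_ssvep(eeg, fs, led_freqs):
    """Return the index of the candidate flicker frequency with the
    most spectral power in a 1-D EEG segment."""
    n = len(eeg)
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Sum spectral magnitude in a narrow band around each candidate frequency
    powers = [spectrum[(freqs > f - 0.5) & (freqs < f + 0.5)].sum()
              for f in led_freqs]
    return int(np.argmax(powers))

# Synthetic demo: a 13 Hz "LED response" buried in noise
fs = 250                                   # assumed sampling rate (Hz)
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 13 * t) + 0.5 * rng.standard_normal(len(t))
print(classify_ssvep(eeg, fs, [9, 11, 13, 15, 17]))  # -> 2 (the 13 Hz LED)
```

Real SSVEP decoders typically use canonical correlation analysis and the flicker harmonics rather than a single FFT band, but the principle is the same: pick the frequency whose power stands out.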

A key problem has been separating these precise brain signals from those associated with other brain activity, and the highly artificial signals generated by the exoskeleton.


"Exoskeletons create lots of electrical 'noise'" explains Klaus Muller, an author on the paper. "The EEG signal gets buried under all this noise -- but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal."

"People with high spinal cord injuries face difficulties communicating or using their limbs. Decoding what they intend from their brain signals could offer means to communicate and walk again."


Although the paper reports tests on healthy individuals, the system has the potential to aid sick or disabled people.


"People with amyotrophic lateral sclerosis (ALS) [motor neuron disease], or high spinal cord injuries face difficulties communicating or using their limbs" continues Muller. "Decoding what they intend from their brain signals could offer means to communicate and walk again."

The control system could serve as a technically simple and feasible add-on to other devices, with EEG caps and hardware now emerging on the consumer market.

It took volunteers only a few minutes to be trained to operate the system. Because of the flickering LEDs, they were carefully screened for epilepsy before taking part in the research. The researchers are now working to reduce the 'visual fatigue' associated with longer-term use of such systems.

Future work will focus on improving the system and investigating possible uses in the context of medical rehabilitation.

"We were driven to assist disabled people, and our study shows that this brain control interface can easily and intuitively control an exoskeleton system -- despite the highly challenging artefacts from the exoskeleton itself" concludes Muller.

SOURCE  IOP


By 33rd Square



Wednesday, August 28, 2013

First Human Brain-To-Brain Interface Demonstrated

 Brain-To-Brain Interfaces
Researchers have performed what they believe is the first noninvasive human-to-human brain interface, with one researcher able to send a brain signal via the Internet to control the hand motions of a colleague.




University of Washington researchers have performed what they believe is the first noninvasive human-to-human brain interface, with one researcher able to send a brain signal via the Internet to control the hand motions of a fellow researcher.

Using electrical brain recordings and a form of magnetic stimulation, Rajesh Rao sent a brain signal to Andrea Stocco on the other side of the UW campus, causing Stocco's finger to move on a keyboard.

While researchers at Duke University have demonstrated brain-to-brain communication between two rats, and Harvard researchers have demonstrated it between a human and a rat, Rao and Stocco believe this is the first demonstration of human-to-human brain interfacing.

"The Internet was a way to connect computers, and now it can be a way to connect brains," Stocco said. "We want to take the knowledge of a brain and transmit it directly from brain to brain."

The researchers captured the full demonstration on video recorded in both labs; the video is available at the end of this story.

Rao, a UW professor of computer science and engineering, has been working on brain-computer interfacing (BCI) in his lab for more than 10 years and just published a textbook, Brain-Computer Interfacing, on the subject.

In 2011, spurred by the rapid advances in BCI technology, he believed he could demonstrate the concept of human brain-to-brain interfacing. So he partnered with Stocco, a UW research assistant professor in psychology at the UW's Institute for Learning & Brain Sciences.

The cycle of the experiment. Brain signals from the “Sender” are recorded. When the computer detects imagined hand movements, a “fire” command is transmitted over the Internet to the TMS machine, which causes an upward movement of the right hand of the “Receiver.” This usually results in the “fire” key being hit.
Image Source: University of Washington
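The cycle described in the caption is a one-way pipeline: detect the imagined movement, send a "fire" command over the network, trigger the TMS pulse. Stubbing out the EEG classifier and the TMS hardware as plain Python, the control flow can be sketched roughly like this (all names here are illustrative, not the team's code):

```python
import queue
import threading

# Commands travel from the "Sender" side to the "Receiver" side;
# a queue stands in for the Internet link used in the experiment.
commands = queue.Queue()

def sender(detected_imagery):
    """EEG side: emit a 'fire' command whenever imagined hand
    movement is detected, then signal the end of the session."""
    for imagined in detected_imagery:
        if imagined:
            commands.put("fire")
    commands.put(None)

def receiver(log):
    """TMS side: each 'fire' command triggers a stimulation pulse,
    which in the experiment pressed the space bar."""
    while True:
        cmd = commands.get()
        if cmd is None:
            break
        log.append("TMS pulse -> right hand hits 'fire' key")

log = []
worker = threading.Thread(target=receiver, args=(log,))
worker.start()
sender([False, True, False, True])  # two detections in this session
worker.join()
print(len(log))  # -> 2
```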

On Aug. 12, Rao sat in his lab wearing a cap with electrodes hooked up to an electroencephalography machine, which reads electrical activity in the brain. Stocco was in his lab across campus wearing a purple swim cap marked with the stimulation site for the transcranial magnetic stimulation coil that was placed directly over his left motor cortex, which controls hand movement.

The team had a Skype connection set up so the two labs could coordinate, though neither Rao nor Stocco could see the Skype screens.

Rao looked at a computer screen and played a simple video game with his mind. When he was supposed to fire a cannon at a target, he imagined moving his right hand (being careful not to actually move his hand), causing a cursor to hit the "fire" button. Almost instantaneously, Stocco, who wore noise-canceling earbuds and wasn't looking at a computer screen, involuntarily moved his right index finger to push the space bar on the keyboard in front of him, as if firing the cannon. Stocco compared the feeling of his hand moving involuntarily to that of a nervous tic.

"It was both exciting and eerie to watch an imagined action from my brain get translated into actual action by another brain," Rao said. "This was basically a one-way flow of information from my brain to his. The next step is having a more equitable two-way conversation directly between the two brains."

The technologies used by the researchers for recording and stimulating the brain are both well-known. Electroencephalography, or EEG, is routinely used by clinicians and researchers to record brain activity noninvasively from the scalp. Transcranial magnetic stimulation, or TMS, is a noninvasive way of delivering stimulation to the brain to elicit a response. Its effect depends on where the coil is placed; in this case, it was placed directly over the brain region that controls a person's right hand. By activating these neurons, the stimulation convinced the brain that it needed to move the right hand.
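On the recording side, imagined hand movement is commonly detected as event-related desynchronization: a drop in 8-12 Hz mu-band power over the motor cortex. A toy sketch of that detection, with an illustrative threshold and synthetic data rather than anything from this study, might be:

```python
import numpy as np

def mu_power(segment, fs):
    """Simple FFT estimate of power in the 8-12 Hz mu band."""
    spec = np.abs(np.fft.rfft(segment)) ** 2
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    return spec[(freqs >= 8) & (freqs <= 12)].sum()

def imagery_detected(segment, baseline_power, fs, drop=0.5):
    """Flag imagined movement when mu power falls below a fraction of
    the resting baseline (event-related desynchronization)."""
    return bool(mu_power(segment, fs) < drop * baseline_power)

fs = 250
t = np.arange(0, 1, 1.0 / fs)
rest = np.sin(2 * np.pi * 10 * t)            # strong mu rhythm at rest
imagery = 0.2 * np.sin(2 * np.pi * 10 * t)   # suppressed during imagery
baseline = mu_power(rest, fs)
print(imagery_detected(imagery, baseline, fs))  # -> True
```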

Computer science and engineering undergraduates Matthew Bryan, Bryan Djunaedi, Joseph Wu and Alex Dadgar, along with bioengineering graduate student Dev Sarma, wrote the computer code for the project, translating Rao's brain signals into a command for Stocco's brain.

"Brain-computer interface is something people have been talking about for a long, long time," said Chantel Prat, assistant professor in psychology at the UW's Institute for Learning & Brain Sciences, and Stocco's wife and research partner who helped conduct the experiment. "We plugged a brain into the most complex computer anyone has ever studied, and that is another brain."

At first blush, this breakthrough brings to mind all kinds of science fiction scenarios. Stocco jokingly referred to it as a "Vulcan mind meld." But Rao cautioned this technology only reads certain kinds of simple brain signals, not a person's thoughts. And it doesn't give anyone the ability to control your actions against your will.

Both researchers were in the lab wearing highly specialized equipment and under ideal conditions. They also had to obtain approval and follow a stringent set of international human-subject testing rules to conduct the demonstration.

"I think some people will be unnerved by this because they will overestimate the technology," Prat said. "There's no possible way the technology that we have could be used on a person unknowingly or without their willing participation."

Rao and Stocco next plan to conduct an experiment that would transmit more complex information from one brain to the other. If that works, they then will conduct the experiment on a larger pool of subjects.




SOURCE  University of Washington


By 33rd Square

Thursday, June 13, 2013

BCI

 Brain-Computer Interfaces
University of Washington researchers have demonstrated that when humans use this technology – called a brain-computer interface – the brain behaves much like it does when completing simple motor skills such as kicking a ball, typing or waving a hand. Learning to control a robotic arm or a prosthetic limb could become second nature for people who are paralyzed.






Small electrodes placed on or inside the brain allow patients to interact with computers or control robotic limbs simply by thinking about how to execute those actions. This technology could improve communication and daily life for a person who is paralyzed or has lost the ability to speak from a stroke or neurodegenerative disease.

Now, University of Washington researchers have demonstrated that when humans use this technology – called a brain-computer interface – the brain behaves much like it does when completing simple motor skills such as kicking a ball, typing or waving a hand. Learning to control a robotic arm or a prosthetic limb could become second nature for people who are paralyzed.

“What we’re seeing is that practice makes perfect with these tasks,” said Rajesh Rao, a UW professor of computer science and engineering and a senior researcher involved in the study. “There’s a lot of engagement of the brain’s cognitive resources at the very beginning, but as you get better at the task, those resources aren’t needed anymore and the brain is freed up.”

Rao and UW collaborators Jeffrey Ojemann, a professor of neurological surgery, and Jeremiah Wander, a doctoral student in bioengineering, published their results online June 10 in the Proceedings of the National Academy of Sciences.

In this study, seven people with severe epilepsy were hospitalized for a monitoring procedure that tries to identify where in the brain seizures originate. Physicians cut through the scalp, drilled into the skull and placed a thin sheet of electrodes directly on top of the brain. While the physicians watched for seizure signals, the researchers also conducted this study.

The patients were asked to move a mouse cursor on a computer screen by using only their thoughts to control the cursor’s movement. Electrodes on their brains picked up the signals directing the cursor to move, sending them to an amplifier and then a laptop to be analyzed. Within 40 milliseconds, the computer calculated the intentions transmitted through the signal and updated the movement of the cursor on the screen.
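That 40-millisecond cycle amounts to a windowed decode loop: grab the latest window of samples, turn it into a cursor command, repeat. A schematic sketch follows; the sample rate and the power-to-velocity mapping are assumptions for illustration, not the study's actual decoder:

```python
import numpy as np

fs = 1000                  # assumed electrode sampling rate (Hz)
window = int(0.040 * fs)   # 40 ms of samples per update

def decode_step(segment):
    """Map the activity in one 40 ms window to a cursor step.
    The RMS-amplitude mapping here is purely illustrative."""
    return float(np.sqrt(np.mean(segment ** 2)))

rng = np.random.default_rng(1)
signal = rng.standard_normal(fs)   # one second of stand-in brain signal
cursor = 0.0
for start in range(0, len(signal), window):
    cursor += decode_step(signal[start:start + window])  # one update per 40 ms
print(cursor > 0)  # -> True: the cursor has moved
```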

Researchers found that when patients started the task, a lot of brain activity was centered in the prefrontal cortex, an area associated with learning a new skill. But often after as little as 10 minutes, frontal brain activity lessened, and the brain signals transitioned to patterns similar to those seen during more automatic actions.

“Now we have a brain marker that shows a patient has actually learned a task,” Ojemann said. “Once the signal has turned off, you can assume the person has learned it.”

While researchers have demonstrated success in using brain-computer interfaces in monkeys and humans, this is the first study that clearly maps the neurological signals throughout the brain. The researchers were surprised at how many parts of the brain were involved.

“We now have a larger-scale view of what’s happening in the brain of a subject as he or she is learning a task,” Rao said. “The surprising result is that even though only a very localized population of cells is used in the brain-computer interface, the brain recruits many other areas that aren’t directly involved to get the job done.”

Several types of brain-computer interfaces are being developed and tested. The least invasive is a device placed on a person’s head that can detect weak electrical signatures of brain activity. Basic commercial gaming products are on the market, but this technology isn’t very reliable yet because signals from eye blinking and other muscle movements interfere too much.

A more invasive alternative is to surgically place electrodes inside the brain tissue itself to record the activity of individual neurons. Researchers at Brown University and the University of Pittsburgh have demonstrated this in humans as patients, unable to move their arms or legs, have learned to control robotic arms using the signal directly from their brain.

The UW team tested electrodes on the surface of the brain, underneath the skull. This allows researchers to record brain signals at higher frequencies and with less interference than measurements from the scalp. A future wireless device could be built to remain inside a person’s head for a longer time to be able to control computer cursors or robotic limbs at home.

“This is one push as to how we can improve the devices and make them more useful to people,” Wander said. “If we have an understanding of how someone learns to use these devices, we can build them to respond accordingly.”


SOURCE  University of Washington

By 33rd Square