by Staff Writers
San Diego CA (SPX) Jan 14, 2016
Bioengineers and cognitive scientists have developed the first portable, 64-channel wearable brain activity monitoring system that's comparable to state-of-the-art equipment found in research laboratories.
The system is a better fit for real-world applications because it is equipped with dry EEG sensors that are easier to apply than wet sensors, while still providing high-density brain activity data. The system comprises a 64-channel dry-electrode wearable EEG headset and a sophisticated software suite for data interpretation and analysis. It has a wide range of applications, from research to neurofeedback to clinical diagnostics.
The researchers' goal is to get EEG out of the laboratory setting, where it is currently confined by wet EEG methods. In the future, scientists envision a world where neuroimaging systems work with mobile sensors and smart phones to track brain states throughout the day and augment the brain's capabilities.
"This is going to take neuroimaging to the next level by deploying on a much larger scale," said Mike Yu Chi, a Jacobs School alumnus and CTO of Cognionics who led the team that developed the headset used in the study. "You will be able to work in subjects' homes. You can put this on someone driving."
The researchers from the Jacobs School of Engineering and Institute for Neural Computation at UC San Diego detailed their findings in an article of the Special Issue on Wearable Technologies published recently in IEEE Transactions on Biomedical Engineering.
They also envision a future when neuroimaging can be used to bring about new therapies for neurological disorders. "We will be able to prompt the brain to fix its own problems," said Gert Cauwenberghs, a bioengineering professor at the Jacobs School and a principal investigator of the research supported in part by a five-year Emerging Frontiers of Research Innovation grant from the National Science Foundation.
"We are trying to get away from invasive technologies, such as deep brain stimulation and prescription medications, and instead start up a repair process by using the brain's synaptic plasticity."
In 10 years, using a brain-machine interface might become as natural as using your smartphone is today, said Tim Mullen, a UC San Diego alumnus, now CEO of Qusp and lead author on the study. Mullen, a former researcher at the Swartz Center for Computational Neuroscience at UC San Diego, led the team that developed the software used in the study with partial funding from the Army Research Lab.
For this vision of the future to become a reality, sensors will need to become not only wearable but also comfortable, and algorithms for data analysis will need to be able to cut through noise to extract meaningful data. The paper, titled "Real-time Neuroimaging and Cognitive Monitoring Using Wearable Dry EEG," outlines some significant first steps in that direction.
Researchers spent four years perfecting the recipe for the sensors' materials. Sensors designed to work on a subject's hair are made of a mix of silver and carbon deposited on a flexible substrate. This combination allows the sensors to remain flexible and durable while still conducting high-quality signals; a silver/silver-chloride coating is key here.
Sensors designed to work on bare skin are made from a hydrogel encased inside a conductive membrane. These sensors are mounted inside a pod equipped with an amplifier, which helps boost signal quality while shielding the sensors from interference caused by electrical equipment and other electronics.
Next steps include improving the headset's performance while subjects are moving. The device can reliably capture signals while subjects walk but less so during more strenuous activities such as running. Electronics also need improvement to function for longer time periods - days and even weeks instead of hours.
Software and data analysis
The researchers designed an algorithm that separates the EEG data in real time into components that are statistically independent of one another. It then compares these components with clean data obtained, for instance, while a subject is at rest. Components with abnormal statistics are labeled as noise and discarded.
"The algorithm attempts to remove as much of the noise as possible while preserving as much of the brain signal as possible," said Mullen.
But the analysis didn't stop there. Researchers used information about the brain's known anatomy and the data they collected to determine where inside the brain the signals originate. They were also able to track, in real time, how signals from different areas of the brain interact with one another, building an ever-changing network map of brain activity. They then used machine learning to connect specific network patterns in brain activity to cognition and behavior.
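The final step, connecting network patterns to cognition and behavior, is at heart a supervised-learning problem. As a toy illustration, the sketch below trains a classifier to distinguish two cognitive states from connectivity-like features; the synthetic features and the choice of logistic regression are assumptions for the example, not taken from the paper.

```python
# Illustrative sketch: associate brain-network "connectivity" features
# with a behavioural state. Features and labels are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_per_class, n_edges = 100, 6   # 6 pairwise connections among 4 regions

# State A: strong coupling between regions; state B: weaker coupling.
state_a = rng.normal(loc=0.7, scale=0.1, size=(n_per_class, n_edges))
state_b = rng.normal(loc=0.3, scale=0.1, size=(n_per_class, n_edges))

X = np.vstack([state_a, state_b])
y = np.array([1] * n_per_class + [0] * n_per_class)

clf = LogisticRegression().fit(X, y)
accuracy = clf.score(X, y)   # training accuracy on clearly separated states
```

Because the two synthetic states are well separated, the classifier learns the distinction easily; real EEG network features are far noisier, which is why the robust real-time decomposition described above matters so much.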
"A Holy Grail in our field is to track meaningful changes in distributed brain networks at the 'speed of thought'," Mullen said. "We're closer to that goal, but we're not quite there yet."
Mullen's start-up, Qusp, has developed NeuroScale, a cloud-based software platform that provides continuous real-time interpretation of brain and body signals through an Internet application program interface. The goal is to enable brain-computer interface and advanced signal processing methods to be easily integrated with various everyday applications and wearable devices.
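To make the idea of such an API concrete, the sketch below assembles a request payload for a hypothetical cloud streaming endpoint. The endpoint URL and every field name here are invented for illustration; they are not NeuroScale's actual interface.

```python
import json

# Hypothetical request to a cloud neuro-analytics API. The URL and all
# field names below are illustrative assumptions, not the real NeuroScale
# interface.
ENDPOINT = "https://api.example.com/v1/streams/analyze"  # placeholder URL

payload = {
    "device": "dry-eeg-64ch",         # which wearable produced the data
    "sampling_rate_hz": 500,          # samples per second per channel
    "channels": 64,
    "samples": [[0.0] * 64] * 4,      # one short chunk of raw EEG frames
    "pipeline": "artifact-reject+source-localize",
}
body = json.dumps(payload)            # what a client would POST to ENDPOINT
```

A wearable device or phone app would send such chunks continuously and receive interpreted brain-state estimates in return, which is the integration model the paragraph above describes.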
Under joint DARPA funding, Cognionics is creating an improved EEG system, while Qusp is developing an easy-to-use graphical software environment for rapid design and application of brain signal analysis pipelines.
"These entrepreneurial efforts are integral to the success of the Jacobs School and the Institute for Neural Computation to help take neurotechnology from the lab to practical uses in cognitive and clinical applications," said Cauwenberghs, who is co-founder of Cognionics and serves on its Scientific Advisory Board.
Mullen is also affiliated with the Swartz Center for Computational Neuroscience at the Institute for Neural Computation at UC San Diego, as are co-authors Kothe, Alejandro Ojeda, Director Scott Makeig, and Co-director Tzyy-Ping Jung. Co-author Trevor Kerth is now pursuing industrial design at Kingston University, London.
University of California - San Diego