When people suffer debilitating injuries or illnesses of the nervous system, they can lose the ability to perform everyday tasks such as walking, playing music, or driving a car. They can imagine doing something, but the injury prevents that action from taking place.
Brain-computer interface systems exist that can translate brain signals into a desired action to restore some function, but they can be a burden to use: they do not always work smoothly, and repeated recalibration is needed to complete even simple tasks.
Researchers at the University of Pittsburgh and Carnegie Mellon University are using brain-computer interface technology to understand how the brain functions during learning. In a pair of papers, the second of which was published today in Nature Biomedical Engineering, the team is moving the needle on brain-computer interface technology intended to improve the lives of amputee patients who use neural prosthetics.
“During your work day, you plan your evening trip to the grocery store,” said Aaron Batista, associate professor of bioengineering at Pitt’s Swanson School of Engineering.
“This plan remains somewhere in your brain throughout the day, but probably doesn’t reach your motor cortex until you actually arrive at the store. We’re developing brain-computer interface technologies that will one day work at the level of our everyday intentions.”
Batista, Pitt postdoctoral research associate Emily Oby, and their Carnegie Mellon collaborators have been developing direct pathways from the brain to external devices. They use electrodes smaller than a hair that record neural activity and feed it to control algorithms.
In the team’s first study, published last June in the Proceedings of the National Academy of Sciences, the group investigated how the brain changes as it learns new brain-computer interface skills.
“When a subject forms a motor intention, it produces a pattern of activity across those electrodes, and we render that pattern as movement on a computer screen. The subjects then adjust their neural activity patterns in a way that produces the movements they want.”
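The readout described in that quote can be sketched as a minimal linear decoder that maps per-electrode firing rates to a 2-D cursor velocity. Everything below (channel counts, the random decoding matrix, the Poisson firing rates) is a hypothetical illustration, not the team's actual decoder:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 96 recording electrodes, 2-D cursor velocity.
n_channels, n_dims = 96, 2

# A fixed linear "decoder" mapping firing rates to cursor velocity.
# Real BCIs fit this mapping from calibration data; here it is
# random purely for illustration.
W = rng.normal(size=(n_dims, n_channels)) / np.sqrt(n_channels)

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Map a vector of per-channel firing rates to a 2-D cursor velocity."""
    return W @ firing_rates

# Simulate one second of closed-loop control at 50 Hz: integrate the
# decoded velocities to get the cursor's trajectory on screen.
dt = 0.02
position = np.zeros(n_dims)
for _ in range(50):
    rates = rng.poisson(lam=20, size=n_channels).astype(float)
    position += decode_velocity(rates) * dt
```

Because the mapping from neural activity to cursor motion is fixed, the subject (not the decoder) is the one who adapts: to move the cursor differently, they must produce different patterns of activity, which is exactly what the learning studies exploit.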
In the new study, the team devised a technique by which the brain-computer interface’s readout is recalibrated continually in the background, ensuring that the system is always calibrated and ready for use.
“We change how neural activity affects the cursor’s movement, and that motivates learning,” said Oby, the study’s lead author.
“If we change that relationship in a certain way, our animal subjects have to produce new patterns of neural activity in order to learn to control the cursor’s movement again. Doing so took them weeks of practice, and we could watch how the brain changed as they learned.”
In a sense, the algorithm “learns” to adjust to the noise and instability that are inherent in neural recording interfaces. The findings suggest that the process by which humans master a new skill involves the generation of new patterns of neural activity. The team eventually intends to apply this technique in clinical settings for stroke rehabilitation.
Such a self-recalibration procedure has been a long-sought goal in the field of neural prosthetics, and the method presented in the team’s study can recover from instabilities automatically, without requiring the user to pause and recalibrate the system themselves.
“Imagine that the instability is so large that the subject is no longer able to control the brain-computer interface,” Yu said. “Existing self-recalibration procedures would likely struggle in that scenario, while we have demonstrated that our method can overcome even dramatic instabilities in many cases.”
Both research projects were conducted as part of the Center for the Neural Basis of Cognition. This cross-institutional research and education program combines Pitt’s strengths in basic and clinical neuroscience and bioengineering with Carnegie Mellon’s strengths in cognitive and computational neuroscience.
Other Carnegie Mellon collaborators on the projects include co-director Byron Yu, professor of electrical and computer engineering and biomedical engineering, and postdoctoral researchers Alan Degenhart and William Bishop, who led the research.
The instability of neural recordings can render a clinical brain-computer interface (BCI) uncontrollable. Here, we show that the alignment of low-dimensional neural manifolds (low-dimensional spaces that describe the correlation patterns among neurons) can be used to stabilize neural activity, so that BCI performance is maintained in the presence of recording instabilities.
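As a rough illustration of the manifold-alignment idea (not the paper's actual stabilization algorithm), one can rotate a later day's low-dimensional latents back onto a reference day's latents with an orthogonal Procrustes fit, so that a decoder trained on the reference day keeps working. All dimensions and data below are simulated assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dimensions: 96 channels, 10-D neural manifold.
n_channels, n_latent = 96, 10

def pca_basis(X: np.ndarray, k: int) -> np.ndarray:
    """Top-k principal directions (as columns) of data X (samples x channels)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T  # shape: (channels, k)

# Day 0: recordings define the reference manifold (and, implicitly, a decoder).
latents = rng.normal(size=(500, n_latent))
mixing = rng.normal(size=(n_latent, n_channels))
day0 = latents @ mixing
B0 = pca_basis(day0, n_latent)

# A later day: same underlying latents, but the channel space has
# drifted (simulated here as a random rotation of the channels).
drift = np.linalg.qr(rng.normal(size=(n_channels, n_channels)))[0]
day1 = (latents @ mixing) @ drift
B1 = pca_basis(day1, n_latent)

# Orthogonal Procrustes: find the rotation R that best maps the
# day-1 latents onto the day-0 latents.
L0 = day0 @ B0
L1 = day1 @ B1
U, _, Vt = np.linalg.svd(L1.T @ L0)
R = U @ Vt
L1_aligned = L1 @ R

err = np.linalg.norm(L1_aligned - L0) / np.linalg.norm(L0)
```

In this toy case the drift is a pure rotation of the channel space, so the alignment error is essentially zero; the point is that the low-dimensional structure survives the instability even though the raw channel readings have changed completely.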