Towards neural co-processors for the brain: combining decoding and encoding in brain–computer interfaces
Introduction
A brain–computer interface (BCI) [1, 2, 3, 4] is a device that can (a) allow signals from the brain to be used to control devices such as prosthetics, cursors or robots, and (b) allow external signals to be delivered to the brain through neural stimulation. The field of BCIs has made enormous strides in the past two decades. The genesis of the field can be traced to early efforts in the 1960s by neuroscientists such as Fetz [5•], who studied operant conditioning in monkeys by training them to move the needle of an analog meter through modulation of the firing rate of a neuron in their motor cortex. Others such as Delgado and Vidal explored techniques for neural decoding and stimulation in early versions of neural interfaces [6•,7]. After a promising start, there was a surprising lull in the field until the 1990s when, spurred by the advent of multi-electrode recordings as well as fast and cheap computers, the field saw a resurgence under the banner of brain–computer interfaces (also known as brain–machine interfaces and neural interfaces) [1,2].
A major factor in the rise of BCIs has been the application of increasingly sophisticated machine learning techniques for decoding neural activity for controlling prosthetic arms [8,9,10•], cursors [11,12,13•,14,15,16••], spellers [17,18] and robots [19, 20, 21, 22]. Simultaneously, researchers have explored how information can be biomimetically or artificially encoded and delivered via stimulation to neuronal networks in the brain and other regions of the nervous system for auditory [23], visual [24], proprioceptive [25], and tactile [26,27,28•,29,30] perception.
Building on these advances in neural decoding and encoding, researchers have begun to explore bi-directional BCIs (BBCIs) which integrate decoding and encoding in a single system. In this article, we review how BBCIs can be used for closed-loop control of prosthetic devices, reanimation of paralyzed limbs, restoration of sensorimotor and cognitive function, neuro-rehabilitation, enhancement of memory, and brain augmentation.
Motivated by this recent progress, we propose a new unifying framework for combining decoding and encoding based on ‘neural co-processors’, which rely on artificial neural networks and deep learning. We show that neural co-processors can be used to jointly optimize cost functions with the nervous system to achieve goals such as targeted rehabilitation and augmentation of brain function, while also providing a new tool for testing computational models and understanding brain function [31].
Closed-loop prosthetic control
Consider the problem of controlling a prosthetic hand using brain signals. This involves (1) using recorded neural responses to control the hand, (2) stimulating somatosensory neurons to provide tactile and proprioceptive feedback, and (3) ensuring that stimulation artifacts do not corrupt the recorded signals being used to control the hand. Several artifact reduction methods have been proposed for (3) – we refer the reader to Refs. [32, 33, 34]. We focus here on combining (1) decoding with (2)
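The interplay of these three requirements can be illustrated with a toy simulation. The sketch below is purely illustrative: the linear decoder `W`, the Poisson firing rates, the touch threshold, and the `blank_artifact` helper are hypothetical stand-ins, not components of any cited system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear decoder mapping 8 firing rates to a 2D hand velocity.
W = rng.normal(size=(2, 8)) * 0.1

def blank_artifact(samples, stim_times, t, width=2):
    """Crude artifact rejection (requirement 3): discard samples recorded
    within `width` time steps of a stimulation pulse."""
    if any(0 <= t - ts < width for ts in stim_times):
        return np.zeros_like(samples)
    return samples

stim_times = []            # time steps at which feedback stimulation was delivered
hand_pos = np.zeros(2)
for t in range(50):
    rates = rng.poisson(5.0, size=8).astype(float)  # (1) recorded neural activity
    rates = blank_artifact(rates, stim_times, t)    # (3) reject stimulation artifacts
    hand_pos += W @ rates                           # (1) decode to move the hand
    if np.linalg.norm(hand_pos) > 1.0:              # hypothetical tactile contact event
        stim_times.append(t)                        # (2) encode feedback via stimulation
```

In a real BBCI the stimulation command would drive somatosensory cortex rather than append to a list, and artifact handling would operate on raw voltage traces; see Refs. [32, 33, 34].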
Towards a unifying framework: neural co-processors based on deep learning
A major limitation of current BBCIs is that they treat decoding and encoding as separate processes, and they do not co-adapt and jointly optimize a cost function with the nervous system. We propose that these limitations may be addressed using a ‘neural co-processor’ as shown in Figure 1. A neural co-processor uses two artificial neural networks, a co-processor network (CPN) and an emulator network (EN), combined with a new type of deep learning that approximates backpropagation through both
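In miniature, the CPN/EN arrangement can be sketched with linear networks and simulated data. Here the ‘brain’ is a fixed linear map `M` from stimulation to behavior; the EN is fit to emulate it, and the task error is then backpropagated through the frozen EN to train the CPN. All dimensions, learning rates, and the linearity assumption are illustrative choices, not details from the proposal.

```python
import numpy as np

rng = np.random.default_rng(1)
n_rec, n_stim, n_beh = 6, 4, 2       # hypothetical channel counts

# Unknown "biological" map from stimulation to measured behavior.
M = rng.normal(size=(n_beh, n_stim))

# 1. Fit the emulator network (EN) to observed (stimulation, behavior) pairs.
E = np.zeros((n_beh, n_stim))
for _ in range(1000):
    s = rng.normal(size=(n_stim, 1))
    err = E @ s - M @ s              # EN prediction error vs. observed behavior
    E -= 0.05 * err @ s.T            # stochastic gradient step on squared error

# 2. Train the co-processor network (CPN): map recorded activity x to a
#    stimulation pattern, backpropagating task error through the frozen EN.
def intended_behavior(x):            # hypothetical task target
    return x[:n_beh]

C = np.zeros((n_stim, n_rec))
for _ in range(5000):
    x = rng.normal(size=(n_rec, 1))  # recorded neural activity
    err = E @ (C @ x) - intended_behavior(x)
    C -= 0.01 * (E.T @ err) @ x.T    # chain rule through the frozen EN into the CPN

# When deployed, the CPN's stimulation acts on the real map M, not the EN.
x = rng.normal(size=(n_rec, 1))
deployed_err = np.linalg.norm(M @ (C @ x) - intended_behavior(x))
```

The key design choice this sketch captures is that the CPN never needs the gradient of the real nervous system: the EN stands in as a differentiable surrogate, exactly as a learned forward model does in model-based control.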
Challenges
A first challenge in realizing the above vision for neural co-processors is obtaining an error signal for training the two networks. In the simplest case, the error may simply be a neural error signal: the goal is to drive neural activity in areas B1, B2, and so on toward known target neural activity patterns, and we can therefore train the CPN directly to approximate these activity patterns without using an EN. However, we expect such scenarios to be rare. In the more realistic case of
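In that simplest case the CPN reduces to a regression onto the target activity: if the response of areas B1, B2, and so on to stimulation were a known linear map `A` (a hypothetical stand-in), the stimulation pattern could be fit directly to the neural error, with no EN in the loop. The matrix, dimensions, and learning rate below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n_stim, n_area = 3, 5

# Hypothetical linear response of downstream areas (B1, B2, ...) to stimulation.
A = rng.normal(size=(n_area, n_stim))
y_star = rng.normal(size=(n_area, 1))  # known target activity pattern

# Fit the stimulation pattern directly on the neural error (activity - target).
s = np.zeros((n_stim, 1))
for _ in range(10000):
    neural_err = A @ s - y_star        # measured activity minus target activity
    s -= 0.05 * A.T @ neural_err       # gradient step on ||A s - y_star||^2

residual = np.linalg.norm(A @ s - y_star)
```

Because the target pattern need not be reachable by stimulation, `residual` converges to the least-squares error rather than to zero; with a nonlinear or unknown neural response, an emulator network would take the place of `A`.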
Conclusions
Traditionally, much of BCI research has focused on the problem of decoding: how can movement intention be extracted from noisy brain signals to control prosthetic devices? More recently, there has been growing interest in ‘closing the loop’ using bidirectional BCIs (BBCIs) that incorporate sensory feedback, for example from artificial tactile sensors, via stimulation. The ability to simultaneously decode neural activity from one region and encode information to deliver via
Conflict of interest statement
Nothing declared.
References and recommended reading
Papers of particular interest, published within the period of review, have been highlighted as:
• of special interest
•• of outstanding interest
Acknowledgements
This work was supported by the National Science Foundation (EEC-1028725 and 1630178), the National Institute of Mental Health (CRCNS/NIMH 1R01MH112166-01), and a grant from the W.M. Keck Foundation. The author would like to thank Eb Fetz, Chet Moritz, Andrea Stocco, Jeff Ojemann, Steve Perlmutter, Dimi Gklezakos, Jon Mishler, Richy Yun, David Caldwell, Jeneva Cronin, Nile Wilson and James Wu for discussions related to topics covered in this article.
References (60)
- et al. Unscented Kalman filter for brain–machine interfaces. PLoS One (2009)
- et al. Automatic extraction of command hierarchies for adaptive brain–robot interfacing. Proceedings of ICRA 2012 (2012)
- et al. Toward a proprioceptive neural interface that mimics natural cortical activity. Adv Exp Med Biol (2016)
- et al. A cognitive neuroprosthetic that uses cortical stimulation for somatosensory feedback. J Neural Eng (2014)
- et al. Restoration of grasp following paralysis through brain-controlled stimulation of muscles. Nature (2012)
- et al. Playing 20 questions with the mind: collaborative problem solving by humans using a brain-to-brain interface. PLoS One (2015)
- et al. Restoration of function after brain damage using a neural prosthesis. Proc Natl Acad Sci U S A (2013)
- et al. High-speed spelling with a noninvasive brain–computer interface. Proc Natl Acad Sci U S A (2015)
- Brain–Computer Interfacing: An Introduction (2013)
- New perspectives on neuroengineering and neurotechnologies: NSF-DFG workshop report. IEEE Trans Biomed Eng
- Brain–machine interfaces: from basic science to neuroprostheses and neurorehabilitation. Physiol Rev
- Operant conditioning of cortical unit activity. Science
- Physical Control of the Mind: Toward a Psychocivilized Society
- Toward direct brain–computer communication. Annu Rev Biophys Bioeng
- Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex. Nat Neurosci
- Cortical control of a prosthetic arm for self-feeding. Nature
- Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature
- An EEG-based brain–computer interface for cursor control. Electroencephalogr Clin Neurophysiol
- Instant neural control of a movement signal. Nature
- Control of a two-dimensional movement signal by a noninvasive brain–computer interface in humans. Proc Natl Acad Sci U S A
- Clinical translation of a high-performance neural prosthesis. Nat Med
- High performance communication by people with paralysis using an intracortical brain–computer interface. eLife
- Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalogr Clin Neurophysiol
- Brain–computer interface research at the University of South Florida Cognitive Psychophysiology Laboratory: the P300 speller. IEEE Trans Neural Syst Rehabil Eng
- Control of a humanoid robot by a noninvasive brain–computer interface in humans. J Neural Eng
- A brain-actuated wheelchair: asynchronous and non-invasive brain–computer interfaces for continuous control of robots. Clin Neurophysiol
- Asynchronous non-invasive brain-actuated control of an intelligent wheelchair. Conference of the IEEE Engineering in Medicine and Biology Society
- Retinal prosthesis. Annu Rev Biomed Eng