-
Sophisticated Inference Neural Comput. (IF 2.505) Pub Date : 2021-02-24 Karl Friston; Lancelot Da Costa; Danijar Hafner; Casper Hesp; Thomas Parr
Active inference offers a first principle account of sentient behavior, from which special and important cases—for example, reinforcement learning, active learning, Bayes optimal inference, Bayes optimal design—can be derived. Active inference finesses the exploitation-exploration dilemma in relation to prior preferences by placing information gain on the same footing as reward or value. In brief,
-
Detecting Scene-Plausible Perceptible Backdoors in Trained DNNs without Access to the Training Set Neural Comput. (IF 2.505) Pub Date : 2021-02-23 Zhen Xiang; David J. Miller; Hang Wang; George Kesidis
Backdoor data poisoning attacks add mislabeled examples to the training set, with an embedded backdoor pattern, so that the classifier learns to classify to a target class whenever the backdoor pattern is present in a test sample. Here, we address posttraining detection of scene-plausible perceptible backdoors, a type of backdoor attack that can be relatively easily fashioned, particularly against
-
Parameter Estimation in Multiple Dynamic Synaptic Coupling Model Using Bayesian Point Process State-Space Modeling Framework Neural Comput. (IF 2.505) Pub Date : 2021-02-23 Yalda Amidi; Behzad Nazari; Saeid Sadri; Ali Yousefi
It is of great interest to characterize the spiking activity of individual neurons in a cell ensemble. Many different mechanisms, such as synaptic coupling and the cell's own spiking history and that of its neighbors, drive a cell's firing properties. Though this is a widely studied modeling problem, there is still room to improve on the simplifications embedded in previous models. The first
-
Contrastive Similarity Matching for Supervised Learning Neural Comput. (IF 2.505) Pub Date : 2021-02-23 Shanshan Qin; Nayantara Mudur; Cengiz Pehlevan
We propose a novel biologically plausible solution to the credit assignment problem motivated by observations in the ventral visual pathway and trained deep neural networks. In both, representations of objects in the same category become progressively more similar, while objects belonging to different categories become less similar. We use this observation to motivate a layer-specific learning goal
-
Classification from Pairwise Similarities/Dissimilarities and Unlabeled Data via Empirical Risk Minimization Neural Comput. (IF 2.505) Pub Date : 2021-02-23 Takuya Shimada; Han Bao; Issei Sato; Masashi Sugiyama
Pairwise similarities and dissimilarities between data points are often obtained more easily than full labels of data in real-world classification problems. To make use of such pairwise information, an empirical risk minimization approach has been proposed, where an unbiased estimator of the classification risk is computed from only pairwise similarities and unlabeled data. However, this approach has
-
Toward a Kernel-Based Uncertainty Decomposition Framework for Data and Models Neural Comput. (IF 2.505) Pub Date : 2021-02-23 Rishabh Singh; Jose C. Principe
This letter introduces a new framework for quantifying predictive uncertainty for both data and models that relies on projecting the data into a Gaussian reproducing kernel Hilbert space (RKHS) and transforming the data probability density function (PDF) in a way that quantifies the flow of its gradient as a topological potential field quantified at all points in the sample space. This enables the decomposition
-
The Refractory Period Matters: Unifying Mechanisms of Macroscopic Brain Waves Neural Comput. (IF 2.505) Pub Date : 2021-02-23 Corey Weistuch; Lilianne R. Mujica-Parodi; Ken Dill
The relationship between complex brain oscillations and the dynamics of individual neurons is poorly understood. Here we utilize maximum caliber, a dynamical inference principle, to build a minimal yet general model of the collective (mean field) dynamics of large populations of neurons. In agreement with previous experimental observations, we describe a simple, testable mechanism, involving only a
-
Flexible Frequency Switching in Adult Mouse Visual Cortex Is Mediated by Competition between Parvalbumin and Somatostatin Expressing Interneurons Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Justin W. M. Domhof; Paul H. E. Tiesinga
Neuronal networks in rodent primary visual cortex (V1) can generate oscillations in different frequency bands depending on the network state and the level of visual stimulation. High-frequency gamma rhythms, for example, dominate the network's spontaneous activity in adult mice but are attenuated upon visual stimulation, during which the network switches to the beta band instead. The spontaneous local
-
Joint Structure and Parameter Optimization of Multiobjective Sparse Neural Network Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Junhao Huang; Weize Sun; Lei Huang
This work addresses the problem of network pruning and proposes a novel joint training method based on a multiobjective optimization model. Most of the state-of-the-art pruning methods rely on user experience for selecting the sparsity ratio of the weight matrices or tensors, and thus suffer from severe performance reduction with inappropriate user-defined parameters. Moreover, networks might be inferior
-
The Remarkable Robustness of Surrogate Gradient Learning for Instilling Complex Function in Spiking Neural Networks Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Friedemann Zenke; Tim P. Vogels
Brains process information in spiking neural networks. Their intricate connections shape the diverse functions these networks perform. Yet how network connectivity relates to function is poorly understood, and the functional capabilities of models of spiking networks are still rudimentary. The lack of both theoretical insight and practical algorithms to find the necessary connectivity poses a major
-
Low-Dimensional Manifolds Support Multiplexed Integrations in Recurrent Neural Networks Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Arnaud Fanthomme; Rémi Monasson
We study the learning dynamics and the representations emerging in recurrent neural networks (RNNs) trained to integrate one or multiple temporal signals. Combining analytical and numerical investigations, we characterize the conditions under which an RNN with n neurons learns to integrate D(≪n) scalar signals of arbitrary duration. We show, for linear, ReLU, and sigmoidal neurons, that the internal
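As an illustration of the integration this abstract describes, here is a minimal numpy sketch (not the authors' construction): a linear RNN whose rank-1 recurrent matrix has a unit eigenvalue along one direction integrates a scalar input perfectly on that one-dimensional manifold.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50                                        # number of neurons
u = rng.standard_normal(n)
u /= np.linalg.norm(u)                        # integration direction (unit norm)

# Rank-1 recurrent matrix with unit eigenvalue along u: a line attractor
W = np.outer(u, u)
w_in = u                                      # input projects onto the integration mode
w_out = u                                     # readout reads the same mode

x = rng.standard_normal(200)                  # arbitrary input signal
h = np.zeros(n)
y = []
for x_t in x:
    h = W @ h + w_in * x_t                    # state stays on the span of u
    y.append(w_out @ h)

# The readout tracks the running sum of the input
print(np.allclose(y, np.cumsum(x)))           # → True
```

The same idea extends to D independent integration directions by summing D orthogonal rank-1 terms.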
-
A Generalization of Spatial Monte Carlo Integration Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Muneki Yasuda; Kei Uchizawa
Spatial Monte Carlo integration (SMCI) is an extension of standard Monte Carlo integration and can approximate expectations on Markov random fields with high accuracy. SMCI was applied to pairwise Boltzmann machine (PBM) learning, achieving superior results over those of some existing methods. The approximation level of SMCI can be altered, and it was proved that a higher-order approximation of SMCI
-
Deep Network with Approximation Error Being Reciprocal of Width to Power of Square Root of Depth Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Zuowei Shen; Haizhao Yang; Shijun Zhang
A new network with super-approximation power is introduced. This network is built with Floor (⌊x⌋) or ReLU (max{0,x}) activation function in each neuron; hence, we call such networks Floor-ReLU networks. For any hyperparameters N∈N+ and L∈N+, we show that Floor-ReLU networks with width max{d,5N+13} and depth 64dL+3 can uniformly approximate a Hölder function f on [0,1]d with an approximation error
-
Real-Time Decoding of Attentional States Using Closed-Loop EEG Neurofeedback Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Greta Tuckute; Sofie Therese Hansen; Troels Wesenberg Kjaer; Lars Kai Hansen
Sustained attention is the cognitive ability to maintain task focus over extended periods of time (Mackworth, 1948; Chun, Golomb, & Turk-Browne, 2011). In this study, scalp electroencephalography (EEG) signals were processed in real time using a 32-channel dry-electrode system during a sustained visual attention task. An attention training paradigm was implemented, as designed in DeBettencourt, Cohen, Lee, Norman
-
The Effect of Class Imbalance on Precision-Recall Curves Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Christopher K. I. Williams
In this note, I study how the precision of a binary classifier depends on the ratio r of positive to negative cases in the test set, as well as the classifier's true and false-positive rates. This relationship allows prediction of how the precision-recall curve will change with r, which seems not to be well known. It also allows prediction of how Fβ and the precision gain and recall gain measures of
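The dependence the note analyzes follows from writing precision in terms of r and the classifier's rates; a small sketch (the formula is standard, the numbers are illustrative):

```python
# Precision as a function of r = (#positives)/(#negatives), TPR, and FPR:
#   precision = TP/(TP+FP) = P*TPR / (P*TPR + N*FPR) = r*TPR / (r*TPR + FPR)
def precision(r, tpr, fpr):
    return r * tpr / (r * tpr + fpr)

# A fixed classifier (TPR=0.9, FPR=0.1) loses precision as positives get rarer
for r in (1.0, 0.1, 0.01):
    print(f"r={r}: precision={precision(r, 0.9, 0.1):.3f}")
# r=1.0:  0.900
# r=0.1:  0.474
# r=0.01: 0.083
```

Note that TPR and FPR are properties of the classifier alone, so the whole precision-recall curve can be re-predicted for a new r without retraining.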
-
Mapping Low-Dimensional Dynamics to High-Dimensional Neural Activity: A Derivation of the Ring Model from the Neural Engineering Framework Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Omri Barak; Sandro Romani
Empirical estimates of the dimensionality of neural population activity are often much lower than the population size. Similar phenomena are also observed in trained and designed neural network models. These experimental and computational results suggest that mapping low-dimensional dynamics to high-dimensional neural space is a common feature of cortical computation. Despite the ubiquity of this observation
-
Implicit Regularization and Momentum Algorithms in Nonlinearly Parameterized Adaptive Control and Prediction Neural Comput. (IF 2.505) Pub Date : 2021-01-29 Nicholas M. Boffi; Jean-Jacques E. Slotine
Stable concurrent learning and control of dynamical systems is the subject of adaptive control. Despite being an established field with many practical applications and a rich theory, much of the development in adaptive control for nonlinear systems revolves around a few key algorithms. By exploiting strong connections between classical adaptive nonlinear control techniques and recent progress in optimization
-
Unsupervised Discovery, Control, and Disentanglement of Semantic Attributes with Applications to Anomaly Detection Neural Comput. (IF 2.505) Pub Date : 2021-01-29 William Paul; I-Jeng Wang; Fady Alajaji; Philippe Burlina
Our work focuses on unsupervised and generative methods that address the following goals: (1) learning unsupervised generative representations that discover latent factors controlling image semantic attributes, (2) studying how this ability to control attributes formally relates to the issue of latent factor disentanglement, clarifying related but dissimilar concepts that had been confounded in the
-
Active Inference: Demystified and Compared Neural Comput. (IF 2.505) Pub Date : 2021-01-05 Noor Sajid; Philip J. Ball; Thomas Parr; Karl J. Friston
Active inference is a first principle account of how autonomous agents operate in dynamic, nonstationary environments. This problem is also considered in reinforcement learning, but limited work exists on comparing the two approaches on the same discrete-state environments. In this letter, we provide (1) an accessible overview of the discrete-state formulation of active inference, highlighting natural
-
How Convolutional Neural Network Architecture Biases Learned Opponency and Color Tuning Neural Comput. (IF 2.505) Pub Date : 2021-01-05 Ethan Harris; Daniela Mihai; Jonathon Hare
Recent work suggests that changing convolutional neural network (CNN) architecture by introducing a bottleneck in the second layer can yield changes in learned function. To understand this relationship fully requires a way of quantitatively comparing trained networks. The fields of electrophysiology and psychophysics have developed a wealth of methods for characterizing visual systems that permit such
-
Statistical Analysis of Decoding Performances of Diverse Populations of Neurons Neural Comput. (IF 2.505) Pub Date : 2021-01-05 Kyle P. Wendling; Cheng Ly
A central theme in computational neuroscience is determining the neural correlates of efficient and accurate coding of sensory signals. Diversity, or heterogeneity, of intrinsic neural attributes is known to exist in many brain areas and is thought to significantly affect neural coding. Recent theoretical and experimental work has argued that in uncoupled networks, coding is most accurate at intermediate
-
Whence the Expected Free Energy? Neural Comput. (IF 2.505) Pub Date : 2021-01-05 Beren Millidge; Alexander Tschantz; Christopher L. Buckley
The expected free energy (EFE) is a central quantity in the theory of active inference. It is the quantity that all active inference agents are mandated to minimize through action, and its decomposition into extrinsic and intrinsic value terms is key to the balance of exploration and exploitation that active inference agents evince. Despite its importance, the mathematical origins of this quantity
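For context, the decomposition referred to here is usually written as follows in the active inference literature (standard notation, not quoted from this letter): minimizing the EFE of a policy π trades an intrinsic (epistemic, information-gain) term against an extrinsic (preference-satisfying) term,

```latex
G(\pi) = \mathbb{E}_{Q(o,s\mid\pi)}\bigl[\ln Q(s\mid\pi) - \ln P(o,s)\bigr]
\approx
\underbrace{-\,\mathbb{E}_{Q(o\mid\pi)}\Bigl[D_{\mathrm{KL}}\bigl[Q(s\mid o,\pi)\,\|\,Q(s\mid\pi)\bigr]\Bigr]}_{\text{intrinsic (epistemic) value}}
\;-\;
\underbrace{\mathbb{E}_{Q(o\mid\pi)}\bigl[\ln P(o)\bigr]}_{\text{extrinsic value}}
```

so an agent that minimizes G both seeks observations that reduce uncertainty about hidden states s and seeks observations o that match its prior preferences P(o).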
-
From Biophysical to Integrate-and-Fire Modeling Neural Comput. (IF 2.505) Pub Date : 2021-01-05 Tomas Van Pottelbergh; Guillaume Drion; Rodolphe Sepulchre
This article proposes a methodology to extract a low-dimensional integrate-and-fire model from an arbitrarily detailed single-compartment biophysical model. The method aims at relating the modulation of maximal conductance parameters in the biophysical model to the modulation of parameters in the proposed integrate-and-fire model. The approach is illustrated on two well-documented examples of cellular
-
Learning in Volatile Environments with the Bayes Factor Surprise Neural Comput. (IF 2.505) Pub Date : 2021-01-05 Vasiliki Liakoni; Alireza Modirshanechi; Wulfram Gerstner; Johanni Brea
Surprise-based learning allows agents to rapidly adapt to nonstationary stochastic environments characterized by sudden changes. We show that exact Bayesian inference in a hierarchical model gives rise to a surprise-modulated trade-off between forgetting old observations and integrating them with the new ones. The modulation depends on a probability ratio, which we call the Bayes factor surprise, that
-
Stability Conditions of Bicomplex-Valued Hopfield Neural Networks Neural Comput. (IF 2.505) Pub Date : 2021-01-05 Masaki Kobayashi
Hopfield neural networks have been extended using hypercomplex numbers. The algebra of bicomplex numbers, also referred to as commutative quaternions, is a number system of dimension 4. Since multiplication is commutative, many notions and theories of linear algebra, such as the determinant, are available, unlike with quaternions. A bicomplex-valued Hopfield neural network (BHNN) has been proposed as a
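The commutativity that distinguishes bicomplex numbers from quaternions is easy to check numerically. A small sketch representing a bicomplex number z = z1 + j·z2 as a pair of complex components (an illustrative encoding, not the article's notation), where j² = −1 and j commutes with the complex unit i:

```python
# Bicomplex product: (z1 + j z2)(w1 + j w2) = (z1 w1 - z2 w2) + j (z1 w2 + z2 w1)
def bc_mul(z, w):
    z1, z2 = z
    w1, w2 = w
    return (z1 * w1 - z2 * w2, z1 * w2 + z2 * w1)

z = (1 + 2j, 3 - 1j)
w = (-2 + 1j, 0.5 + 4j)
print(bc_mul(z, w) == bc_mul(w, z))  # commutative → True
```

Commutativity is inherited directly from complex multiplication, which is what makes determinant-based tools available; the quaternion product, by contrast, does not commute.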
-
Predicting the Ease of Human Category Learning Using Radial Basis Function Networks Neural Comput. (IF 2.505) Pub Date : 2021-01-05 Brett D. Roads; Michael C. Mozer
Our goal is to understand and optimize human concept learning by predicting the ease of learning of a particular exemplar or category. We propose a method for estimating ease values, quantitative measures of ease of learning, as an alternative to conducting costly empirical training studies. Our method combines a psychological embedding of domain exemplars with a pragmatic categorization model. The
-
Enhanced Signal Detection by Adaptive Decorrelation of Interspike Intervals Neural Comput. (IF 2.505) Pub Date : 2020-11-30 William H. Nesse; Leonard Maler; André Longtin
Spike trains with negative interspike interval (ISI) correlations, in which long/short ISIs are more likely to be followed by short/long ISIs, are common in many neurons. They can be described by stochastic models with a spike-triggered adaptation variable. We analyze a phenomenon in these models in which such statistically dependent ISI sequences arise in tandem with quasi-statistically independent and identically
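The mechanism described here can be illustrated with a toy model (a generic sketch, not the authors' model): a leaky integrate-and-fire neuron whose threshold jumps at each spike and then decays. A short ISI leaves the threshold elevated, favoring a long ISI next, and vice versa, which produces a negative lag-1 ISI correlation.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 0.1, 200_000
noise = rng.standard_normal(n_steps)

v, theta = 0.0, 1.0
spike_times = []
for t in range(n_steps):
    v += dt * (1.5 - v) + 0.3 * np.sqrt(dt) * noise[t]  # noisy leaky integration
    theta += dt * (1.0 - theta) / 5.0                   # threshold relaxes toward 1
    if v >= theta:
        spike_times.append(t * dt)
        v = 0.0                                         # voltage reset
        theta += 0.5                                    # spike-triggered adaptation

isi = np.diff(spike_times)
rho1 = np.corrcoef(isi[:-1], isi[1:])[0, 1]
print(rho1 < 0)  # lag-1 serial ISI correlation is negative
```

The sign of the correlation is robust as long as the adaptation timescale is comparable to or longer than the mean ISI.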
-
Enhanced Equivalence Projective Simulation: A Framework for Modeling Formation of Stimulus Equivalence Classes Neural Comput. (IF 2.505) Pub Date : 2020-11-30 Asieh Abolpour Mofrad; Anis Yazidi; Samaneh Abolpour Mofrad; Hugo L. Hammer; Erik Arntzen
Formation of stimulus equivalence classes has been recently modeled through equivalence projective simulation (EPS), a modified version of a projective simulation (PS) learning agent. PS is endowed with an episodic memory that resembles the internal representation in the brain and the concept of cognitive maps. PS flexibility and interpretability enable the EPS model and, consequently, the model we
-
A Novel Neural Model with Lateral Interaction for Learning Tasks Neural Comput. (IF 2.505) Pub Date : 2020-11-30 Dequan Jin; Ziyan Qin; Murong Yang; Penghe Chen
We propose a novel neural model with lateral interaction for learning tasks. The model consists of two functional fields: an elementary field to extract features and a high-level field to store and recognize patterns. Each field is composed of some neurons with lateral interaction, and the neurons in different fields are connected by the rules of synaptic plasticity. The model is established on the
-
Robust Stability Analysis of Delayed Stochastic Neural Networks via Wirtinger-Based Integral Inequality Neural Comput. (IF 2.505) Pub Date : 2020-11-30 R. Suresh; A. Manivannan
In this letter, we discuss stability analysis for uncertain stochastic neural networks (SNNs) with time delay. By constructing a suitable Lyapunov-Krasovskii functional (LKF) and utilizing Wirtinger-based inequalities to estimate the integral terms, delay-dependent stochastic stability conditions are derived in terms of linear matrix inequalities (LMIs). We discuss the parameter uncertainties
-
NMDA Receptor Alterations after Mild Traumatic Brain Injury Induce Deficits in Memory Acquisition and Recall Neural Comput. (IF 2.505) Pub Date : 2020-11-30 David Gabrieli; Samantha N. Schumm; Nicholas F. Vigilante; David F. Meaney
Mild traumatic brain injury (mTBI) presents a significant health concern with potential persisting deficits that can last decades. Although a growing body of literature improves our understanding of the brain network response and corresponding underlying cellular alterations after injury, the effects of cellular disruptions on local circuitry after mTBI are poorly understood. Our group recently reported
-
Conductance-Based Adaptive Exponential Integrate-and-Fire Model Neural Comput. (IF 2.505) Pub Date : 2020-11-30 Tomasz Górski; Damien Depannemaecker; Alain Destexhe
The intrinsic electrophysiological properties of single neurons can be described by a broad spectrum of models, from the most realistic Hodgkin-Huxley-type models with numerous detailed mechanisms to phenomenological models. The adaptive exponential integrate-and-fire (AdEx) model has emerged as a convenient middle-ground model. With a low computational cost but keeping biophysical interpretation
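For reference, the standard AdEx model that this article extends (Brette & Gerstner, 2005) can be simulated in a few lines; the conductance-based variant proposed here is not reproduced. Parameters are common reference values, and forward-Euler integration is used for simplicity:

```python
import numpy as np

# AdEx: C dV/dt = -gL(V-EL) + gL*DT*exp((V-VT)/DT) - w + I
#       tau_w dw/dt = a(V-EL) - w;  at spike: V -> Vr, w -> w + b
C, gL, EL = 281.0, 30.0, -70.6               # pF, nS, mV
VT, DT = -50.4, 2.0                          # mV
tau_w, a, b, Vr = 144.0, 4.0, 80.5, -70.6    # ms, nS, pA, mV
Vpeak, I, dt = -40.0, 800.0, 0.01            # mV, pA, ms

V, w = EL, 0.0
spikes = []
for step in range(int(1000 / dt)):           # 1 s of simulated time
    dV = (-gL * (V - EL) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
    dw = (a * (V - EL) - w) / tau_w
    V += dt * dV
    w += dt * dw
    if V >= Vpeak:                           # spike: reset + adaptation jump
        spikes.append(step * dt)
        V = Vr
        w += b

isi = np.diff(spikes)
print(len(spikes) > 2 and isi[-1] > isi[0])  # spike-frequency adaptation
```

The slow adaptation current w lengthens successive ISIs, reproducing spike-frequency adaptation at negligible computational cost compared with full biophysical models.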
-
Deeply Felt Affect: The Emergence of Valence in Deep Active Inference Neural Comput. (IF 2.505) Pub Date : 2020-11-30 Casper Hesp; Ryan Smith; Thomas Parr; Micah Allen; Karl J. Friston; Maxwell J. D. Ramstead
The positive-negative axis of emotional valence has long been recognized as fundamental to adaptive behavior, but its origin and underlying function have largely eluded formal theorizing and computational modeling. Using deep active inference, a hierarchical inference scheme that rests on inverting a model of how sensory data are generated, we develop a principled Bayesian model of emotional valence
-
Synchrony and Complexity in State-Related EEG Networks: An Application of Spectral Graph Theory. Neural Comput. (IF 2.505) Pub Date : 2020-09-18 Amir Hossein Ghaderi,Bianca R Baltaretu,Masood Nemati Andevari,Vishal Bharmauria,Fuat Balci
Neural Computation, Ahead of Print.
-
Toward a Unified Framework for Cognitive Maps. Neural Comput. (IF 2.505) Pub Date : 2020-09-18 Woori Kim,Yongseok Yoo
Neural Computation, Ahead of Print.
-
Differential Covariance: A New Method to Estimate Functional Connectivity in fMRI. Neural Comput. (IF 2.505) Pub Date : 2020-09-18 Tiger W Lin,Yusi Chen,Qasim Bukhari,Giri P Krishnan,Maxim Bazhenov,Terrence J Sejnowski
Neural Computation, Ahead of Print.
-
Analyzing and Accelerating the Bottlenecks of Training Deep SNNs with Backpropagation. Neural Comput. (IF 2.505) Pub Date : 2020-09-18 Ruizhi Chen,Ling Li
Neural Computation, Ahead of Print.
-
Flexible Working Memory through Selective Gating and Attentional Tagging Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Wouter Kruijne; Sander M. Bohte; Pieter R. Roelfsema; Christian N. L. Olivers
Neural Computation, Ahead of Print.
-
Passive Nonlinear Dendritic Interactions as a Computational Resource in Spiking Neural Networks Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Andreas Stöckel; Chris Eliasmith
Neural Computation, Ahead of Print.
-
Information-Theoretic Representation Learning for Positive-Unlabeled Classification Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Tomoya Sakai; Gang Niu; Masashi Sugiyama
Neural Computation, Ahead of Print.
-
An EM Algorithm for Capsule Regression Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Lawrence K. Saul
Neural Computation, Ahead of Print.
-
Associated Learning: Decomposing End-to-End Backpropagation Based on Autoencoders and Target Propagation Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Yu-Wei Kao; Hung-Hsuan Chen
Neural Computation, Ahead of Print.
-
New Insights into Learning with Correntropy-Based Regression Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Yunlong Feng
Neural Computation, Ahead of Print.
-
Efficient Actor-Critic Reinforcement Learning with Embodiment of Muscle Tone for Posture Stabilization of the Human Arm Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Masami Iwamoto; Daichi Kato
Neural Computation, Ahead of Print.
-
Active Learning for Level Set Estimation under Input Uncertainty and Its Extensions Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Yu Inatsu; Masayuki Karasuyama; Keiichi Inoue; Ichiro Takeuchi
Neural Computation, Ahead of Print.
-
Resonator Networks, 1: An Efficient Solution for Factoring High-Dimensional, Distributed Representations of Data Structures Neural Comput. (IF 2.505) Pub Date : 2020-10-20 E. Paxon Frady; Spencer J. Kent; Bruno A. Olshausen; Friedrich T. Sommer
Neural Computation, Ahead of Print.
-
Redundancy-Aware Pruning of Convolutional Neural Networks Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Guotian Xie
Neural Computation, Ahead of Print.
-
Resonator Networks, 2: Factorization Performance and Capacity Compared to Optimization-Based Methods Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Spencer J. Kent; E. Paxon Frady; Friedrich T. Sommer; Bruno A. Olshausen
Neural Computation, Ahead of Print.
-
Effect of Top-Down Connections in Hierarchical Sparse Coding. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Victor Boutin,Angelo Franciosini,Franck Ruffier,Laurent Perrinet
Neural Computation, Volume 32, Issue 11, Page 2279-2309, November 2020.
-
ReLU Networks Are Universal Approximators via Piecewise Linear or Constant Functions. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Changcun Huang
Neural Computation, Volume 32, Issue 11, Page 2249-2278, November 2020.
-
Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Masaki Kobayashi
Neural Computation, Volume 32, Issue 11, Page 2237-2248, November 2020.
-
Repetitive Control for Multi-Joint Arm Movements Based on Virtual Trajectories. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Yoji Uno,Takehiro Suzuki,Takahiro Kagawa
Neural Computation, Volume 32, Issue 11, Page 2212-2236, November 2020.
-
Inferring Neuronal Couplings from Spiking Data Using a Systematic Procedure with a Statistical Criterion. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Yu Terada,Tomoyuki Obuchi,Takuya Isomura,Yoshiyuki Kabashima
Neural Computation, Volume 32, Issue 11, Page 2187-2211, November 2020.
-
Assessing Goodness-of-Fit in Marked Point Process Models of Neural Population Coding via Time and Rate Rescaling. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Ali Yousefi,Yalda Amidi,Behzad Nazari,Uri T Eden
Neural Computation, Volume 32, Issue 11, Page 2145-2186, November 2020.
-
Closed-Loop Deep Learning: Generating Forward Models with Backpropagation. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Sama Daryanavard,Bernd Porr
Neural Computation, Volume 32, Issue 11, Page 2122-2144, November 2020.
-
Reverse-Engineering Neural Networks to Characterize Their Cost Functions. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Takuya Isomura,Karl Friston
Neural Computation, Volume 32, Issue 11, Page 2085-2121, November 2020.
-
A Cerebellar Computational Mechanism for Delay Conditioning at Precise Time Intervals. Neural Comput. (IF 2.505) Pub Date : 2020-10-20 Terence D Sanger,Mitsuo Kawato
Neural Computation, Volume 32, Issue 11, Page 2069-2084, November 2020.
-
Analysis of Regression Algorithms with Unbounded Sampling. Neural Comput. (IF 2.505) Pub Date : 2020-08-14 Hongzhi Tong,Jiajing Gao
In this letter, we study a class of regularized regression algorithms for which the sampling process is unbounded. By choosing different loss functions, the learning algorithms can include a wide range of commonly used algorithms for regression. Unlike prior work on the theoretical analysis of unbounded sampling, no constraint on the output variables is specified in our setting. By an elegant error
-
Fast and Accurate Langevin Simulations of Stochastic Hodgkin-Huxley Dynamics. Neural Comput. (IF 2.505) Pub Date : 2020-08-14 Shusen Pu,Peter J Thomas
Fox and Lu introduced a Langevin framework for discrete-time stochastic models of randomly gated ion channels such as the Hodgkin-Huxley (HH) system. They derived a Fokker-Planck equation with state-dependent diffusion tensor D and suggested a Langevin formulation with noise coefficient matrix S such that SSᵀ = D. Subsequently, several authors introduced a variety of Langevin equations for the HH system
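For the factorization step, one standard way to obtain a noise matrix S with SSᵀ = D from a symmetric positive-definite diffusion matrix is the Cholesky decomposition (the matrix square root D^{1/2} is another valid choice). A toy sketch with an illustrative D, not the actual HH diffusion tensor:

```python
import numpy as np

# Illustrative symmetric positive-definite diffusion matrix
D = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.2],
              [0.0, 0.2, 0.8]])

S = np.linalg.cholesky(D)       # lower-triangular factor with S @ S.T == D
print(np.allclose(S @ S.T, D))  # → True
```

Different choices of S with the same SSᵀ yield Langevin equations with the same Fokker-Planck equation, which is one reason several variants coexist in the literature.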
-
A Predictive-Coding Network That Is Both Discriminative and Generative. Neural Comput. (IF 2.505) Pub Date : 2020-08-14 Wei Sun,Jeff Orchard
Predictive coding (PC) networks are a biologically interesting class of neural networks. Their layered hierarchy mimics the reciprocal connectivity pattern observed in the mammalian cortex, and they can be trained using local learning rules that approximate backpropagation (Bogacz, 2017). However, despite having feedback connections that enable information to flow down the network hierarchy, discriminative
Contents have been reproduced by permission of the publishers.