Might a Single Neuron Solve Interesting Machine Learning Problems Through Successive Computations on Its Dendritic Tree?
Neural Computation (IF 2.9) Pub Date: 2021-05-13, DOI: 10.1162/neco_a_01390
Ilenna Simone Jones, Konrad Paul Kording

Physiological experiments have highlighted how the dendrites of biological neurons can nonlinearly process distributed synaptic inputs. However, it is unclear how aspects of a dendritic tree, such as its branched morphology or its repetition of presynaptic inputs, determine neural computation beyond this apparent nonlinearity. Here we use a simple model where the dendrite is implemented as a sequence of thresholded linear units. We manipulate the architecture of this model to investigate the impacts of binary branching constraints and repetition of synaptic inputs on neural computation. We find that models with such manipulations can perform well on machine learning tasks, such as Fashion MNIST or Extended MNIST. We find that model performance on these tasks is limited by binary tree branching and dendritic asymmetry and is improved by the repetition of synaptic inputs to different dendritic branches. These computational experiments further neuroscience theory on how different dendritic properties might determine neural computation of clearly defined tasks.
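The model described in the abstract translates naturally into code: a binary tree whose leaves receive (possibly repeated) synaptic inputs and whose internal units each apply a thresholded linear combination of their two child branches. Below is a minimal Python/NumPy sketch of such a model, not the authors' published implementation; the class name `BinaryDendriteTree`, the `repeats` parameter, the leaky-ReLU-style threshold, and the random assignment of repeated inputs to leaves are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): a dendrite as a binary tree of
# thresholded linear units, with each synaptic input optionally repeated
# across different branches.
import numpy as np

def threshold(x, leak=0.01):
    """Thresholded linear unit (leaky-ReLU style): pass positive drive, attenuate the rest."""
    return np.where(x > 0, x, leak * x)

class BinaryDendriteTree:
    def __init__(self, n_inputs, repeats=1, seed=0):
        rng = np.random.default_rng(seed)
        # Repeat each synaptic input `repeats` times to form the leaf pool.
        pool = np.tile(np.arange(n_inputs), repeats)
        depth = int(np.ceil(np.log2(len(pool))))
        n_leaves = 2 ** depth
        # Pad with randomly drawn inputs so the leaf count is a power of two,
        # then shuffle so repeated copies land on different branches.
        pad = rng.choice(n_inputs, size=n_leaves - len(pool))
        self.leaf_inputs = rng.permutation(np.concatenate([pool, pad]))
        # Leaf synapses: one weight and bias per leaf.
        self.leaf_w = 0.1 * rng.standard_normal(n_leaves)
        self.leaf_b = 0.1 * rng.standard_normal(n_leaves)
        # Internal units: one weight per child branch, one bias per unit.
        self.weights, self.biases = [], []
        width = n_leaves
        while width > 1:
            self.weights.append(0.1 * rng.standard_normal(width))
            self.biases.append(0.1 * rng.standard_normal(width // 2))
            width //= 2

    def forward(self, x):
        # Leaves: thresholded, weighted copies of their assigned inputs.
        a = threshold(self.leaf_w * x[self.leaf_inputs] + self.leaf_b)
        # Each internal unit linearly combines its two child branches, then
        # thresholds; successive levels repeat this computation up to the soma.
        for w, b in zip(self.weights, self.biases):
            a = threshold((w * a).reshape(-1, 2).sum(axis=1) + b)
        return a[0]  # somatic output

# Example: a flattened 28x28 image (784 pixels) through a tree in which each
# input is repeated twice across branches.
tree = BinaryDendriteTree(n_inputs=784, repeats=2)
print(tree.forward(np.random.default_rng(1).random(784)))
```

In this sketch a flattened Fashion MNIST image could be passed in as `x`, with the scalar somatic output thresholded for a binary decision; training the weights (for example by gradient descent through the tree) is left out.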

Updated: 2021-05-13