Bidirectional Backpropagation
IEEE Transactions on Systems, Man, and Cybernetics: Systems (IF 8.6) Pub Date: 2020-05-01, DOI: 10.1109/tsmc.2019.2916096
Olaoluwa Adigun, Bart Kosko

We extend backpropagation (BP) learning from ordinary unidirectional training to bidirectional training of deep multilayer neural networks. This gives a form of backward chaining or inverse inference from an observed network output to a candidate input that produced the output. The trained network learns a bidirectional mapping and can be applied to some inverse problems. A bidirectional multilayer neural network can exactly represent some invertible functions. We prove that a fixed three-layer network can always exactly represent any finite permutation function and its inverse. The forward pass computes the permutation function value. The backward pass computes the inverse permutation with the same weights and hidden neurons. A joint forward-backward error function allows BP learning in both directions without overwriting learning in either direction. The learning applies to classification and regression. The algorithms do not require that the underlying sampled function have an inverse. A trained regression network tends to map an output back to the centroid of its preimage set.
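The abstract's key mechanism is that both directions share the same weights and hidden neurons, and training minimizes a joint error E = E_f + E_b (forward error plus backward error) so that learning in one direction does not overwrite learning in the other. Below is a minimal NumPy sketch of that idea, assuming a single hidden tanh layer, squared error in both directions, and a backward pass that runs through the transposed weight matrices with its own bias vectors. The variable names and the toy target y = 2x are illustrative assumptions, not the paper's exact architecture or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy invertible target: y = 2x on [0, 1], so the backward pass should learn x = y / 2.
X = rng.uniform(0.0, 1.0, size=(256, 1))
Y = 2.0 * X

n_in, n_hid, n_out = 1, 16, 1
W1 = rng.normal(0.0, 0.5, (n_in, n_hid))   # shared input->hidden weights
W2 = rng.normal(0.0, 0.5, (n_hid, n_out))  # shared hidden->output weights
b1, b2 = np.zeros(n_hid), np.zeros(n_out)  # forward-pass biases
c1, c0 = np.zeros(n_hid), np.zeros(n_in)   # backward-pass biases

lr, n = 0.05, X.shape[0]
for epoch in range(3000):
    # Forward pass x -> y_hat through (W1, W2).
    h_f = np.tanh(X @ W1 + b1)
    y_hat = h_f @ W2 + b2
    e_f = y_hat - Y                      # forward residual

    # Backward pass y -> x_hat through the SAME weights, transposed.
    h_b = np.tanh(Y @ W2.T + c1)
    x_hat = h_b @ W1.T + c0
    e_b = x_hat - X                      # backward residual

    # Gradients of the forward squared error w.r.t. the shared weights.
    gW2_f = h_f.T @ e_f
    gh_f = (e_f @ W2.T) * (1.0 - h_f**2)
    gW1_f = X.T @ gh_f

    # Gradients of the backward squared error w.r.t. the same weights.
    gW1_b = (h_b.T @ e_b).T              # x_hat = h_b @ W1.T
    gh_b = (e_b @ W1) * (1.0 - h_b**2)
    gW2_b = (Y.T @ gh_b).T               # backward pre-activation is Y @ W2.T

    # Joint update: sum the two directions' gradients so that neither
    # direction's learning overwrites the other's.
    W1 -= lr * (gW1_f + gW1_b) / n
    W2 -= lr * (gW2_f + gW2_b) / n
    b1 -= lr * gh_f.mean(axis=0)
    b2 -= lr * e_f.mean(axis=0)
    c1 -= lr * gh_b.mean(axis=0)
    c0 -= lr * e_b.mean(axis=0)

print("forward MSE:", float((e_f**2).mean()), "backward MSE:", float((e_b**2).mean()))
```

With a nonlinear hidden layer, the transposed network cannot in general invert the forward map exactly, so minimizing the joint error settles on a compromise between the two directions; this is consistent with the abstract's remark that a trained regression network tends to map an output back to the centroid of its preimage set.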

Updated: 2020-05-01