A note on computing with Kolmogorov Superpositions without iterations
Neural Networks (IF 6.0). Pub Date: 2021-08-08. DOI: 10.1016/j.neunet.2021.07.006
Robert Demb, David Sprecher

We extend Kolmogorov’s Superpositions to approximating arbitrary continuous functions with a noniterative approach that can be used by any neural network built on these superpositions. Our approximation algorithm uses a modified dimension-reducing function that allows the number of summands to be increased so as to achieve an error bound commensurate with that of r iterations, for any r. This new variant of Kolmogorov’s Superpositions improves on the parallelism already inherent in them by performing highly distributed parallel computations without synchronization. This approach makes implementation much easier and more efficient on networks of modern parallel hardware, and thus makes the superpositions a more practical tool.
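To illustrate the structure being discussed, the following is a minimal sketch of the classical two-layer Kolmogorov superposition form, f(x) = Σ_{q=0}^{2n} Φ_q(Σ_{p=1}^{n} λ_p ψ(x_p + qa). The specific inner function ψ, outer functions Φ_q, weights λ_p, and shift a used below are placeholders for illustration only; the paper's modified dimension-reducing function and its enlarged set of summands are not reproduced here. Note that each of the 2n+1 outer summands depends only on the shared input x, so they can be evaluated independently, which is the parallelism the abstract refers to.

```python
def kolmogorov_superposition(x, psi, outer, lam, a):
    """Evaluate the two-layer Kolmogorov superposition form
        f(x) = sum_{q=0}^{2n} outer[q]( sum_{p=0}^{n-1} lam[p] * psi(x[p] + q*a) )
    at a point x in [0, 1]^n.

    psi   -- inner (dimension-reducing) function, shared across summands
    outer -- list of 2n+1 outer functions, one per summand q
    lam   -- list of n inner weights
    a     -- shift constant applied per summand index q
    """
    n = len(x)
    summands = []
    # Each iteration of this loop is independent of the others, so the
    # 2n+1 summands could be computed on separate processors with no
    # synchronization until the final sum.
    for q in range(2 * n + 1):
        inner = sum(lam[p] * psi(x[p] + q * a) for p in range(n))
        summands.append(outer[q](inner))
    return sum(summands)


# Toy usage with trivial placeholder functions (identity psi and outer,
# unit weights) purely to show the call shape, not an approximation.
value = kolmogorov_superposition(
    x=[0.2, 0.3],
    psi=lambda t: t,
    outer=[lambda t: t] * 5,   # 2n+1 = 5 outer functions for n = 2
    lam=[1.0, 1.0],
    a=0.1,
)
```

With these placeholder choices the inner sums are 0.5 + 0.2q for q = 0,…,4, so the result is 4.5; a real construction would use the monotone inner function and error-controlled outer functions of the paper.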




Updated: 2021-09-23