Encoding and Decoding of Recursive Structures in Neural-Symbolic Systems
Optical Memory and Neural Networks ( IF 1.0 ) Pub Date : 2021-04-19 , DOI: 10.3103/s1060992x21010033
A. Demidovskij

Abstract

One way to combine the connectionist approach with the symbolic paradigm is Tensor Product Variable Binding. It was originally devoted to building distributed representations of recursive structures so that neural networks could use them as input. Structures are an essential part of both formal and natural languages, appearing in syntactic trees, grammars, and semantic interpretation. The human mind handles such problems smoothly at the neural level, and it is naturally scalable and robust. The question arises of whether traditional symbolic algorithms can be translated to the sub-symbolic level so that the performance and computational gains of neural networks can be reused for general tasks. However, several aspects of Tensor Product Variable Binding have received little attention in published research, especially the construction of a neural architecture that performs computations according to the mathematical model without preliminary training. In this paper, those implementation aspects are addressed. A novel design is proposed for a decoding network that translates a tensor into the corresponding recursive structure with an arbitrary level of nesting. Several subtle issues in encoding such structures as a distributed representation, or tensor, are also addressed. Both the encoding and decoding neural networks are built with the Keras framework and are analyzed from the perspective of applied value. The proposed design continues a series of papers dedicated to building a robust bridge between two computational paradigms: connectionist and symbolic.
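The encoding/decoding scheme the abstract describes can be illustrated with a minimal sketch of Tensor Product Variable Binding (not the paper's actual Keras networks): a structure is encoded as the sum of outer products of filler vectors (symbols) and role vectors (positions), and with orthonormal roles the fillers are recovered exactly by multiplying the tensor with the corresponding role vector. The vectors and role names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Fillers (symbols) and roles (positions) as vectors; chosen orthonormal
# here so that decoding is exact. These particular vectors are assumptions
# for the sketch, not values from the paper.
fillers = {"A": np.array([1.0, 0.0]), "B": np.array([0.0, 1.0])}
roles = {"left": np.array([1.0, 0.0]), "right": np.array([0.0, 1.0])}

# Encode the pair (A, B): the tensor is the sum of filler_i (outer) role_i.
T = (np.outer(fillers["A"], roles["left"])
     + np.outer(fillers["B"], roles["right"]))

# Decode (unbind): with orthonormal roles, T @ role recovers the filler
# bound to that role.
left_filler = T @ roles["left"]
right_filler = T @ roles["right"]
assert np.allclose(left_filler, fillers["A"])
assert np.allclose(right_filler, fillers["B"])
```

Recursive structures with deeper nesting are handled by binding a sub-tensor (rather than an atomic filler) into a role, which raises the rank of the representation with each level of nesting; handling that arbitrary nesting is what the proposed decoding network addresses.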




Updated: 2021-04-19