Strong Generalization and Efficiency in Neural Programs
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-07-07, DOI: arxiv-2007.03629
Yujia Li, Felix Gimeno, Pushmeet Kohli, Oriol Vinyals

We study the problem of learning efficient algorithms that strongly generalize in the framework of neural program induction. By carefully designing the input/output interfaces of the neural model and through imitation, we are able to learn models that produce correct results for arbitrary input sizes, achieving strong generalization. Moreover, by using reinforcement learning, we optimize for program efficiency metrics, and discover new algorithms that surpass the teacher used in imitation. With this, our approach can learn to outperform custom-written solutions for a variety of problems, as we tested it on sorting, searching in ordered lists and the NP-complete 0/1 knapsack problem, which sets a notable milestone in the field of Neural Program Induction. As highlights, our learned model can perform sorting perfectly on any input data size we tested on, with $O(n \log n)$ complexity, whilst outperforming hand-coded algorithms, including quick sort, in number of operations even for list sizes far beyond those seen during training.
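To make the abstract's setup concrete, here is a minimal sketch (not the authors' code; all names are hypothetical) of the kind of input/output interface it describes for sorting: the model never observes raw values, only the outcomes of comparisons, and acts through index-based instructions. The instruction count provides the efficiency metric that reinforcement learning would minimize, while a hand-coded teacher's instruction trace could supervise imitation.

```python
class SortingEnv:
    """List-sorting environment with a compare/swap instruction set.

    The agent interacts only via compare() and swap(); it never sees
    the list's raw values, which is what enables size-independent
    (strongly generalizing) policies.
    """

    def __init__(self, data):
        self._data = list(data)
        self.ops = 0  # efficiency metric: total instructions executed

    def compare(self, i, j):
        """Return True iff data[i] > data[j]; the agent sees only this bit."""
        self.ops += 1
        return self._data[i] > self._data[j]

    def swap(self, i, j):
        """Exchange the elements at positions i and j."""
        self.ops += 1
        self._data[i], self._data[j] = self._data[j], self._data[i]

    def result(self):
        return list(self._data)


def teacher_sort(env, n):
    """Hand-coded teacher (here: insertion sort) whose compare/swap
    trace could serve as imitation-learning supervision."""
    trace = []
    for i in range(1, n):
        j = i
        while j > 0 and env.compare(j - 1, j):
            trace.append(("swap", j - 1, j))
            env.swap(j - 1, j)
            j -= 1
    return trace


if __name__ == "__main__":
    env = SortingEnv([3, 1, 2])
    trace = teacher_sort(env, 3)
    print(env.result())  # the sorted list
    print(env.ops)       # operation count: the quantity RL would minimize
```

Under this framing, imitation learning fits a policy to the teacher's `trace`, and a reinforcement-learning phase can then reward lower `env.ops` on correct outputs, allowing the learned program to undercut the teacher's operation count.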

Updated: 2020-07-09