TorchDyn: A Neural Differential Equations Library
arXiv - CS - Neural and Evolutionary Computing Pub Date : 2020-09-20 , DOI: arxiv-2009.09346
Michael Poli, Stefano Massaroli, Atsushi Yamashita, Hajime Asama, Jinkyoo Park

Continuous-depth learning has recently emerged as a novel perspective on deep learning, improving performance in tasks related to dynamical systems and density estimation. Core to these approaches is the neural differential equation, whose forward pass is the solution of an initial value problem parametrized by a neural network. Unlocking the full potential of continuous-depth models requires a different set of software tools, owing to key differences from standard discrete neural networks: inference, for example, must be carried out via numerical solvers. We introduce TorchDyn, a PyTorch library dedicated to continuous-depth learning, designed to make neural differential equations as accessible as regular plug-and-play deep learning primitives. This objective is achieved by identifying common essential components across model variants, which can be combined and freely repurposed to obtain complex compositional architectures. TorchDyn further offers step-by-step tutorials and benchmarks designed to guide researchers and contributors.
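To make the core idea concrete, the sketch below shows what "the forward pass is the solution of an initial value problem" means, using plain NumPy rather than TorchDyn's actual API (so as not to misstate library calls). The one-layer vector field, its weights, and the fixed-step RK4 solver are all illustrative assumptions; a real continuous-depth model would use a trainable network and an adaptive differentiable solver.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "vector field" parametrized by a one-layer network: f(t, x) = tanh(W x + b).
# In a neural ODE, W and b would be learned by backpropagating through the solver.
W = rng.normal(scale=0.5, size=(2, 2))
b = rng.normal(scale=0.1, size=2)

def vector_field(t, x):
    return np.tanh(W @ x + b)

def rk4_solve(f, x0, t0=0.0, t1=1.0, steps=20):
    """Fixed-step RK4 integration of dx/dt = f(t, x) from t0 to t1.

    The returned state x(t1) plays the role of the network's output:
    the 'forward pass' is literally solving this initial value problem.
    """
    x, t = x0, t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h / 2 * k1)
        k3 = f(t + h / 2, x + h / 2 * k2)
        k4 = f(t + h, x + h * k3)
        x = x + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return x

x0 = np.array([1.0, -1.0])   # input, treated as the initial condition
x1 = rk4_solve(vector_field, x0)  # output: the IVP solution at t = 1
```

Depth is thus replaced by integration time: refining `steps` (or using an adaptive solver, as TorchDyn does) changes accuracy without adding parameters.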

Updated: 2020-09-22