Meta-Learning Symmetries by Reparameterization
arXiv - CS - Machine Learning. Pub Date: 2020-07-06, DOI: arxiv-2007.02933
Allan Zhou, Tom Knowles, Chelsea Finn

Many successful deep learning architectures are equivariant to certain transformations in order to conserve parameters and improve generalization: most famously, convolution layers are equivariant to shifts of the input. This approach only works when practitioners know the symmetries of the task and can manually construct an architecture with the corresponding equivariances. Our goal is an approach for learning equivariances from data, without needing to design custom task-specific architectures. We present a method for learning and encoding equivariances into networks by learning corresponding parameter sharing patterns from data. Our method can provably represent equivariance-inducing parameter sharing for any finite group of symmetry transformations. Our experiments suggest that it can automatically learn to encode equivariances to common transformations used in image processing tasks. We provide our experiment code at https://github.com/AllanYangZhou/metalearning-symmetries.
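The abstract describes encoding equivariances through learned parameter-sharing patterns. A minimal NumPy sketch (not the authors' code) of the underlying idea: reparameterize a dense layer's weights as `W = U @ v`, where a "sharing matrix" `U` spreads a small vector of filter parameters `v` into the full weight matrix. Here `U` is hard-coded to the circulant pattern of a 1-D circular convolution, which makes the layer shift-equivariant; in the paper's setting, a pattern like `U` would instead be meta-learned from data.

```python
import numpy as np

# Sketch: a dense layer whose weight matrix is reparameterized as
# W = (U @ v).reshape(n, n). U encodes a parameter-sharing pattern;
# v holds the actual filter values. This U hard-codes the circulant
# (convolutional) pattern W[i, j] = v[(j - i) mod n], so the layer
# commutes with circular shifts of the input.

n = 6                       # input/output dimension
rng = np.random.default_rng(0)
v = rng.standard_normal(n)  # filter parameters, shared across positions

# U maps the n filter values into an n*n circulant weight matrix.
U = np.zeros((n * n, n))
for i in range(n):
    for j in range(n):
        U[i * n + j, (j - i) % n] = 1.0

W = (U @ v).reshape(n, n)

# Shift equivariance: shifting the input and then applying the layer
# gives the same result as applying the layer and then shifting.
x = rng.standard_normal(n)
shift = lambda z: np.roll(z, 1)
assert np.allclose(W @ shift(x), shift(W @ x))
```

With a generic (unshared) `U` the same reparameterization can represent an arbitrary dense layer, which is why learning `U` lets the network interpolate between fully connected and equivariant structure.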

Updated: 2020-10-08