Regularisation of neural networks by enforcing Lipschitz continuity
Machine Learning (IF 4.3), Pub Date: 2020-12-06, DOI: 10.1007/s10994-020-05929-w
Henry Gouk, Eibe Frank, Bernhard Pfahringer, Michael J. Cree

We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with respect to their inputs. To this end, we provide a simple technique for computing an upper bound on the Lipschitz constant of a feed-forward neural network composed of commonly used layer types, and demonstrate inaccuracies in previous work on this topic. We then use this technique to formulate training a neural network with a bounded Lipschitz constant as a constrained optimisation problem that can be solved with projected stochastic gradient methods. Our evaluation study shows that, in isolation, our method performs comparably to state-of-the-art regularisation techniques, and that the performance gains are cumulative when it is combined with existing approaches to regularising neural networks. We also provide evidence that the hyperparameters are intuitive to tune and demonstrate how the choice of norm used to compute the Lipschitz constant affects the resulting model.
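
To make the two ingredients described in the abstract concrete, here is a minimal NumPy sketch: the Lipschitz upper bound of a feed-forward network is taken as the product of the per-layer operator norms (assuming 1-Lipschitz activations such as ReLU), and the projection step rescales a weight matrix whenever its norm exceeds a chosen limit. The function names and the simple rescaling projection are illustrative assumptions drawn from the abstract, not the authors' reference implementation.

```python
import numpy as np

def lipschitz_upper_bound(weights, p=2):
    """Product of per-layer operator norms: an upper bound on the Lipschitz
    constant of a feed-forward network with 1-Lipschitz activations (e.g. ReLU)."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, p)  # induced p-norm of the weight matrix (p in {1, 2, inf})
    return bound

def project_weights(W, max_norm, p=2):
    """Rescale W back into the feasible set {W : ||W||_p <= max_norm} when the
    constraint is violated; applied after each stochastic gradient step."""
    norm = np.linalg.norm(W, p)
    if norm > max_norm:
        W = W * (max_norm / norm)
    return W

# Toy usage: three dense layers (input 32 -> 64 -> 64 -> 10), each constrained to
# operator norm <= 2.0, so the network's Lipschitz upper bound is at most 2.0 ** 3 = 8.0.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((64, 32)),
           rng.standard_normal((64, 64)),
           rng.standard_normal((10, 64))]
weights = [project_weights(W, 2.0) for W in weights]
print(lipschitz_upper_bound(weights))  # <= 8.0
```

Changing p (e.g. to 1 or np.inf) illustrates the abstract's point that the choice of norm used to compute the Lipschitz constant influences the constraint and hence the resulting model.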

Updated: 2020-12-06