Lazily Adapted Constant Kinky Inference for nonparametric regression and model-reference adaptive control
Automatica (IF 4.8) Pub Date: 2020-08-29, DOI: 10.1016/j.automatica.2020.109216
Jan-Peter Calliess, Stephen J. Roberts, Carl Edward Rasmussen, Jan Maciejowski

Techniques known as Nonlinear Set Membership prediction or Lipschitz Interpolation are approaches to supervised machine learning that utilise presupposed Lipschitz properties to perform inference over unobserved function values. Provided a bound on the true best Lipschitz constant of the target function is known a priori, they offer convergence guarantees as well as bounds around the predictions. Considering a more general setting that builds on Lipschitz continuity, we propose a method for estimating the Lipschitz constant online from function value observations that are possibly corrupted by bounded noise. Utilising this estimate as a data-dependent hyper-parameter gives rise to a nonparametric machine learning method for which we establish strong universal approximation guarantees. That is, we show that our prediction rule can learn any continuous function on compact support in the limit of increasingly dense data, up to a worst-case error that can be bounded by the level of observational error. We also consider applications of our nonparametric regression method to learning-based control. For a class of discrete-time settings, we establish convergence guarantees on the closed-loop tracking error of our online learning-based controllers. To provide evidence that our method can be beneficial not only in theory but also in practice, we apply it in the context of nonparametric model-reference adaptive control (MRAC). Across a range of simulated aircraft roll-dynamics scenarios and performance metrics, our approach outperforms recently proposed alternatives based on Gaussian processes and RBF neural networks.
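To make the core idea concrete, the following is a minimal Python sketch of Lipschitz interpolation with a data-driven Lipschitz constant estimate. The function names (`estimate_lipschitz`, `lipschitz_interpolate`), the toy data, and the simple pairwise-slope estimator with a noise discount are illustrative assumptions, not the paper's exact LACKI rule or its MRAC application.

```python
import numpy as np

def estimate_lipschitz(X, y, noise_bound=0.0):
    """Data-driven Lipschitz constant estimate from pairwise slopes.

    A simplified stand-in for an online estimator: the largest pairwise
    slope after discounting an assumed bound on the observational noise.
    """
    L = 0.0
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            dist = np.linalg.norm(X[i] - X[j])
            if dist > 0:
                slope = max(abs(y[i] - y[j]) - 2 * noise_bound, 0.0) / dist
                L = max(L, slope)
    return L

def lipschitz_interpolate(x_query, X, y, L):
    """Lipschitz interpolation (nonlinear set membership) prediction:
    the midpoint of the tightest upper and lower Lipschitz envelopes
    consistent with the data and the constant L."""
    dists = np.linalg.norm(X - x_query, axis=1)
    upper = np.min(y + L * dists)   # ceiling envelope at x_query
    lower = np.max(y - L * dists)   # floor envelope at x_query
    return 0.5 * (upper + lower)

# Toy usage: learn sin(x) from noisy samples on [0, 2*pi].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(50, 1))
y = np.sin(X[:, 0]) + rng.uniform(-0.05, 0.05, size=50)
L_hat = estimate_lipschitz(X, y, noise_bound=0.05)
print(lipschitz_interpolate(np.array([1.0]), X, y, L_hat), np.sin(1.0))
```

As the data become denser, the envelope midpoint tightens around the target function, which is the intuition behind the universal approximation guarantee stated in the abstract.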



