Novel Multitask Conditional Neural-Network Surrogate Models for Expensive Optimization
IEEE Transactions on Cybernetics (IF 9.4), Pub Date: 2020-09-03, DOI: 10.1109/tcyb.2020.3014126
Jianping Luo 1 , Liang Chen 1 , Xia Li 2 , Qingfu Zhang 3
Multiple related tasks can be learned simultaneously by sharing information among them, avoiding tabula rasa learning and improving performance over the no-transfer case (i.e., when each task is learned in isolation). This study investigates multitask learning with conditional neural process (CNP) networks and proposes two CNP-based multitask learning network models: the one-to-many multitask CNP (OMc-MTCNP) and the many-to-many MTCNP (MMc-MTCNP). Compared with existing multitask models, the proposed models add an extensible correlation learning layer that learns the correlations among tasks. Moreover, the proposed multitask CNP (MTCNP) networks are used as surrogate models within a Bayesian optimization framework, replacing the Gaussian process (GP) and thereby avoiding its costly covariance computations. The framework infers multiple tasks simultaneously, exploiting possible dependencies among them to share knowledge across tasks. The proposed surrogate models augment the observed dataset with a number of related tasks so that model parameters can be estimated confidently. Experimental studies under several scenarios indicate that the proposed algorithms are competitive with GP-based, single-task, and other multitask-model-based Bayesian optimization methods.
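To make the surrogate idea concrete, the sketch below shows the generic CNP mechanism the paper builds on (encode each observed context pair, mean-aggregate into a permutation-invariant representation, decode a predictive mean and variance at target points) and how such a network can stand in for a GP posterior inside one Bayesian-optimization step. This is a minimal single-task illustration with untrained random weights, not the authors' OMc-/MMc-MTCNP architectures; all layer sizes, function names, and the lower-confidence-bound acquisition are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, weights):
    """Forward pass through tanh hidden layers and a linear output layer."""
    h = x
    for W, b in weights[:-1]:
        h = np.tanh(h @ W + b)
    W, b = weights[-1]
    return h @ W + b

def init_weights(sizes, rng):
    """Random (untrained) weights; a real surrogate would be fit by gradient descent."""
    return [(rng.normal(0.0, 0.5, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

# Hypothetical sizes: 1-D inputs, 8-D context representation.
enc_w = init_weights([2, 16, 8], rng)   # encoder: (x_i, y_i) -> r_i
dec_w = init_weights([9, 16, 2], rng)   # decoder: (r, x*) -> (mu, log sigma)

def cnp_predict(x_ctx, y_ctx, x_tgt):
    """CNP forward pass: encode context pairs, mean-aggregate, decode targets."""
    pairs = np.column_stack([x_ctx, y_ctx])
    r = mlp(pairs, enc_w).mean(axis=0)            # permutation-invariant summary
    inp = np.column_stack([np.tile(r, (len(x_tgt), 1)), x_tgt])
    out = mlp(inp, dec_w)
    mu, log_sigma = out[:, 0], out[:, 1]
    return mu, np.exp(log_sigma)                  # sigma > 0 by construction

# One Bayesian-optimization step: score candidates with a lower confidence
# bound (minimization) and pick the most promising point to evaluate next.
x_ctx = np.array([0.1, 0.5, 0.9])                 # already-evaluated points
y_ctx = np.sin(3.0 * x_ctx)                       # their expensive observations
x_cand = np.linspace(0.0, 1.0, 50)                # candidate pool
mu, sigma = cnp_predict(x_ctx, y_ctx, x_cand)
x_next = x_cand[np.argmin(mu - 2.0 * sigma)]      # next point to evaluate
```

The key property motivating the paper is visible here: prediction cost is a few matrix multiplies, linear in the number of context points, rather than the cubic-cost covariance inversion a GP posterior requires.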

Updated: 2024-08-22