Lazy Greedy Hypervolume Subset Selection from Large Candidate Solution Sets
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-07-04, DOI: arxiv-2007.02050
Weiyu Chen, Hisao Ishibuchi, and Ke Shang

Subset selection has been a popular topic in recent years, and a number of subset selection methods have been proposed. Among these methods, hypervolume subset selection is widely used. Greedy hypervolume subset selection algorithms can achieve good approximations to the optimal subset. However, when the candidate set is large (e.g., an unbounded external archive with a large number of solutions), these algorithms become very time-consuming. In this paper, we propose a new lazy greedy algorithm that exploits the submodular property of the hypervolume indicator. The core idea is to avoid unnecessary hypervolume contribution calculations when searching for the solution with the largest contribution. Experimental results show that the proposed algorithm is hundreds of times faster than the original greedy inclusion algorithm and several times faster than the fastest known greedy inclusion algorithm on many test problems.
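The lazy evaluation idea can be sketched briefly: keep each candidate's last computed contribution as an upper bound in a max-heap (submodularity guarantees a solution's hypervolume contribution never increases as the selected subset grows), and recompute a contribution only when a stale bound reaches the top of the heap. The following Python sketch is a minimal illustration under assumptions not stated in the abstract (two objectives, a simple sweep-based hypervolume routine, and hypothetical helper names `hypervolume_2d` and `lazy_greedy_hss`); it is not the authors' implementation.

```python
# Minimal sketch of lazy greedy hypervolume subset selection (CELF-style
# lazy evaluation) for a 2-objective minimization problem.
# Helper names and the 2-D restriction are illustrative assumptions.
import heapq


def hypervolume_2d(points, ref):
    """Area dominated by `points` (minimization) w.r.t. reference point `ref`."""
    hv, prev_y = 0.0, ref[1]
    for x, y in sorted(points):          # sweep along the first objective
        if y < prev_y:                   # non-dominated point adds a new strip
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv


def lazy_greedy_hss(candidates, k, ref):
    """Greedily pick k points maximizing hypervolume, with lazy evaluation.

    Submodularity of the hypervolume indicator means a contribution computed
    in an earlier iteration can only shrink, so it is a valid upper bound.
    A candidate is re-evaluated only if its stale bound still tops the heap.
    """
    selected, hv_sel = [], 0.0
    # Max-heap of (-upper_bound, candidate index, iteration the bound was computed in).
    heap = [(-hypervolume_2d([p], ref), i, 1) for i, p in enumerate(candidates)]
    heapq.heapify(heap)
    for it in range(1, min(k, len(candidates)) + 1):
        while heap:
            neg_bound, i, stamp = heapq.heappop(heap)
            if stamp == it:
                # A fresh bound is the exact contribution; every other candidate's
                # exact contribution is at most its (stale) bound, hence at most
                # this value, so i is the greedy choice without further checks.
                selected.append(candidates[i])
                hv_sel = hypervolume_2d(selected, ref)
                break
            # Stale bound: recompute the exact contribution and push it back.
            contrib = hypervolume_2d(selected + [candidates[i]], ref) - hv_sel
            heapq.heappush(heap, (-contrib, i, it))
    return selected


if __name__ == "__main__":
    pts = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0), (2.5, 2.5)]
    print(lazy_greedy_hss(pts, 2, ref=(4.0, 4.0)))
```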

Updated: 2020-07-07