RHOASo: An Early Stop Hyper-Parameter Optimization Algorithm
Mathematics (IF 2.4), Pub Date: 2021-09-20, DOI: 10.3390/math9182334
Ángel Luis Muñoz Castañeda , Noemí DeCastro-García , David Escudero García

This work proposes RHOASo, a new algorithm for optimizing the hyper-parameters of a machine learning algorithm, based on the conditional optimization of concave asymptotic functions. A comparative analysis of the algorithm is presented, with particular emphasis on two important properties: its ability to work efficiently with only a small part of a dataset, and its ability to finish the tuning process automatically, that is, without the user having to specify the number of iterations the algorithm must perform. Statistical analyses over 16 public benchmark datasets were carried out, comparing the performance of seven hyper-parameter optimization algorithms with that of RHOASo. The efficiency of RHOASo shows statistically significant positive differences with respect to the other hyper-parameter optimization algorithms considered in the experiments. Furthermore, on average, the algorithm needs only around 70% of the iterations required by the other algorithms to achieve competitive performance. The results also show that the algorithm is notably stable with respect to the size of the dataset partition used.

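The abstract does not reproduce the algorithm itself. As a rough illustration of the general idea it describes, namely hyper-parameter tuning that terminates on its own rather than after a user-fixed iteration budget, here is a minimal Python sketch. The plateau-based stopping rule, the `plateau_search` function, and the search grid are illustrative assumptions, not the RHOASo procedure.

```python
# Hypothetical sketch of hyper-parameter tuning with an automatic
# (early-stop) termination criterion, in the spirit of the abstract.
# This is NOT the RHOASo algorithm: the plateau-based stopping rule
# and all names here are illustrative assumptions.
import itertools
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def plateau_search(X, y, grid, patience=3):
    """Sweep a hyper-parameter grid, stopping on its own once
    `patience` consecutive configurations fail to improve the best
    cross-validated score (no user-set number of iterations)."""
    best_score, best_params, stall = -np.inf, None, 0
    for params in grid:
        clf = RandomForestClassifier(random_state=0, **params)
        score = cross_val_score(clf, X, y, cv=3).mean()
        if score > best_score:
            best_score, best_params, stall = score, params, 0
        else:
            stall += 1
            if stall >= patience:  # early stop: progress has plateaued
                break
    return best_params, best_score

# Small synthetic problem; the abstract notes RHOASo also works well
# with a small partition of the data, which this demo loosely mimics.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
grid = [{"n_estimators": n, "max_depth": d}
        for n, d in itertools.product((10, 50, 100, 200), (2, 4, 8, None))]
best_params, best_score = plateau_search(X, y, grid)
print(best_params, round(best_score, 3))
```

The design point mirrored here is that the search decides when to stop (the `patience` counter) instead of running for a number of iterations fixed in advance by the user, which is the "early stop" property the abstract emphasizes.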
Last updated: 2021-09-20