Hyperparameter Importance Analysis based on N-RReliefF Algorithm

Authors

  • Yunlei Sun, China University of Petroleum (East China) http://orcid.org/0000-0003-3745-6899
  • Huiquan Gong, Faculty of Information Technology, Beijing University of Technology, No.100, Pingleyuan, Chaoyang District, Beijing 100124, China xinel_ghq@126.com
  • Yucong Li, College of Computer & Communication Engineering, China University of Petroleum (East China), No.66, West Changjiang Road, Huangdao District, Qingdao 266580, China
  • Dalin Zhang, Beijing Jiaotong University

Keywords:

Hyperparameter optimization, Bayesian optimization, RReliefF Algorithm

Abstract

Hyperparameter selection has always been key to machine learning. Bayesian optimization has recently achieved great success in this task, but it is subject to certain constraints and limitations when selecting hyperparameters. In response to these constraints and limitations, this paper proposes the N-RReliefF algorithm, which evaluates the importance of individual hyperparameters and the importance weights between hyperparameters. N-RReliefF estimates the contribution of each hyperparameter to performance from the degree to which that hyperparameter influences performance, and computes the importance weights between hyperparameters with an improved normalization formula. Applying N-RReliefF to the sets of hyperparameter configurations and corresponding performance scores generated by Bayesian optimization, we identify the most important hyperparameters of the random forest and SVM algorithms. The experimental results verify the effectiveness of the N-RReliefF algorithm.
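The estimation step described in the abstract builds on the RReliefF scheme of Robnik-Sikonja and Kononenko (2003), treating hyperparameter configurations as instances and performance scores as the regression target. A minimal sketch of that idea follows; the function names, the k-nearest-neighbour details, and the simple sum-to-one `normalize` step are illustrative assumptions (the paper's own "improved normalization formula" is not reproduced in this abstract):

```python
import numpy as np

def rrelieff_weights(X, y, n_neighbors=10):
    """Simplified RReliefF estimate over (configuration, performance) pairs.

    X: (m, d) array of hyperparameter configurations, scaled to [0, 1].
    y: (m,) array of performance scores.
    Returns one weight per hyperparameter (higher = more influential).
    """
    m, d = X.shape
    y_range = (y.max() - y.min()) or 1.0
    n_dc = 0.0             # accumulated prob. of a different prediction
    n_da = np.zeros(d)     # accumulated prob. of a different attribute value
    n_dc_da = np.zeros(d)  # both differ at once
    for i in range(m):
        dist = np.abs(X - X[i]).sum(axis=1)  # L1 distance to all instances
        dist[i] = np.inf                     # exclude the instance itself
        for j in np.argsort(dist)[:n_neighbors]:
            diff_y = abs(y[i] - y[j]) / y_range  # normalized performance gap
            diff_a = np.abs(X[i] - X[j])         # per-hyperparameter gap
            n_dc += diff_y
            n_da += diff_a
            n_dc_da += diff_y * diff_a
    # Standard RReliefF weight update (Robnik-Sikonja & Kononenko, 2003)
    return n_dc_da / n_dc - (n_da - n_dc_da) / (m * n_neighbors - n_dc)

def normalize(w):
    # Hypothetical stand-in for the paper's improved normalization formula:
    # shift weights to be non-negative, then scale them to sum to 1.
    w = w - w.min()
    return w / w.sum()
```

A hyperparameter whose value changes track performance changes across neighbouring configurations accumulates a large `n_dc_da` term and hence a large weight, which is the intuition behind ranking hyperparameters by importance.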

References

Bartz-Beielstein, T. (2006). Experimental research in evolutionary computation: The New Experimentalism, Springer Berlin Heidelberg, 2006.

Bergstra, J.; Bengio, Y. (2012). Random search for hyper-parameter optimization, Journal of Machine Learning Research, 13, 281-305, 2012.

Bischl, B.; Casalicchio, G.; Feurer, M. et al. (2017). OpenML benchmarking suites and the OpenML100, arXiv preprint, 2017.

Ansótegui, C.; Sellmann, M.; Tierney, K. (2009). A gender-based genetic algorithm for the automatic configuration of algorithms, Principles and Practice of Constraint Programming - CP 2009, Lisbon, Portugal, 142-157, 2009. https://doi.org/10.1007/978-3-642-04244-7_14

Chang, C.-C.; Lin, C.-J. (2011). LIBSVM: A library for support vector machines, ACM Transactions on Intelligent Systems and Technology, 2(3), 27:1-27:27, 2011. https://doi.org/10.1145/1961189.1961199

Chiarandini, M.; Goegebeur, Y. (2009). Mixed Models for the analysis of optimization algorithms, Experimental Thermal & Fluid Science, 34(7), 972-978, 2009.

Cui, J.; Yang, B. (2018). Survey on Bayesian optimization methodology and applications, Journal of Software, 29(10), 3068-3090, 2018.

Daolio, F.; Liefooghe, A.; Sebastien, V.; Aguirre, H.; Tanaka, K. (2017). Problem Features vs. Algorithm Performance on Rugged Multi-objective Combinatorial Fitness Landscapes, Acm Sigevolution, 9(3), 21-21, 2017. https://doi.org/10.1145/3066862.3066868

Deng, S. (2019). Hyper-parameter optimization of CNN based on improved Bayesian optimization algorithm, Application Research of Computers, 2019(7), 2019.

Falkner, S.; Klein, A.; Hutter, F. (2018). BOHB: Robust and efficient hyperparameter optimization at scale, Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, 2018.

Feurer, M.; Springenberg, J.T.; Hutter, F. (2015). Initializing bayesian hyperparameter optimization via meta-learning, Twenty-Ninth AAAI Conference on Artificial Intelligence. AAAI Press, 1128-1135, 2015.

Florea, A.-C.; Andonie, R. (2019). Weighted Random Search for Hyperparameter Optimization, International Journal of Computers Communications & Control, 14(2), 154-169, 2019. https://doi.org/10.15837/ijccc.2019.2.3514

Gomes, T.A.F.; Soares, C. (2012). Combining meta-learning and search techniques to select parameters for support vector machines, Neurocomputing, 75(1), 3-13, 2012. https://doi.org/10.1016/j.neucom.2011.07.005

Hutter, F.; Hoos, H. H.; Leyton-Brown, K. (2011). Sequential model-based optimization for general algorithm configuration, Learning and Intelligent Optimization: 5th International Conference, LION 5, Rome, Italy, January 17-21, 2011, Selected Papers, 507-523, 2011. https://doi.org/10.1007/978-3-642-25566-3_40

Hutter, F.; Hoos, H. H.; Leyton-Brown, K. (2013). Identifying key algorithm parameters and instance features using forward selection, Learning and Intelligent Optimization, 7997, 364-381, 2013. https://doi.org/10.1007/978-3-642-44973-4_40

Hutter, F.; Hoos, H. H.; Leyton-Brown, K. (2014). An efficient approach for assessing hyperparameter importance, In Proc. of ICML 2014, 754-762, 2014.

Kira, K.; Rendell, L. A. (1992). A practical approach to feature selection, Proceedings of the Ninth International Conference on Machine Learning, 249-256, 1992. https://doi.org/10.1016/B978-1-55860-247-2.50037-1

Lin, S. W.; Ying, K. C.; Chen, S. C.; Lee, Z. J. (2008). Particle swarm optimization for parameter determination and feature selection of support vector machines, Expert Systems with Applications, 35(4), 1817-1824, 2008. https://doi.org/10.1016/j.eswa.2007.08.088

Mockus, J.; Tiesis, V.; Zilinskas, A. (1978). The application of Bayesian methods for seeking the extremum, Towards Global Optimization, 2, 117-129, 1978.

Nannen, V.; Eiben, A. E. (2007). Relevance estimation and value calibration of evolutionary algorithm parameters, Proceedings of the International Joint Conference on Artificial Intelligence, Morgan Kaufmann Publishers Inc., 103-110, 2007. https://doi.org/10.1109/CEC.2007.4424460

Probst, P.; Bischl, B.; Boulesteix, A. L. (2018). Tunability: Importance of Hyperparameters of Machine Learning Algorithms, arXiv:1802.09596 [stat.ML], 2018.

Reif, M.; Shafait, F.; Dengel, A. (2012). Meta-learning for evolutionary parameter optimization of classifiers, Machine Learning, 87(3), 357-380, 2012. https://doi.org/10.1007/s10994-012-5286-7

Robnik-Sikonja, M.; Kononenko, I. (1997). An adaptation of Relief for attribute estimation in regression, Fourteenth International Conference on Machine Learning, Morgan Kaufmann Publishers, 1997.

Robnik-Sikonja, M.; Kononenko, I. (2003). Theoretical and empirical analysis of ReliefF and RReliefF, Machine Learning, 53(1-2), 23-69, 2003. https://doi.org/10.1023/A:1025667309714

Snoek, J.; Larochelle, H.; Adams, R.P. (2012). Practical bayesian optimization of machine learning algorithms, International Conference on Neural Information Processing Systems. Curran Associates Inc, 2951-2959, 2012.

Wang, J.; Zhang, L.; Chen, G. (2012). A parameter optimization method for an SVM based on improved grid search algorithm, Applied Science and Technology, 39(3), 28-31, 2012.

Wu, J. (2017). Complex network link classification based on RReliefF feature selection algorithm, Computer Engineering, 43(8), 208-214, 2017.

Zhang, D. (2017). High-speed Train Control System Big Data Analysis Based on Fuzzy RDF Model and Uncertain Reasoning, International Journal of Computers Communications & Control, 12(4), 577-591, 2017. https://doi.org/10.15837/ijccc.2017.4.2914

Zhang, D.; Jin, D.; Gong, Y.; Chen, S.; Wang, C. (2015). Research of alarm correlations based on static defect detection, Tehnicki vjesnik, 22(2), 311-318, 2015. https://doi.org/10.17559/TV-20150317102804

Zhang, D.; Sui, J.; Gong, Y. (2017). Large scale software test data generation based on collective constraint and weighted combination method, Tehnicki Vjesnik, 24(4), 1041-1050, 2017. https://doi.org/10.17559/TV-20170319045945

Published

2019-08-05
