
A multi-objective optimization algorithm for feature selection problems

  • Original Article
  • Published in Engineering with Computers

Abstract

Feature selection (FS) is a critical step in data mining and machine learning, and the selected features play a crucial role in algorithm performance: a good subset reduces processing time and improves classification accuracy. In this paper, three solutions for FS are proposed. In the first, the Harris Hawks Optimization (HHO) algorithm is extended to a multi-objective form; in the second, the Fruit Fly Optimization Algorithm (FOA) is extended in the same way; and in the third, the two are hybridized into an algorithm named MOHHOFOA. The proposed solutions were compared with the MOPSO, NSGA-II, BGWOPSOFS and B-MOABC algorithms for FS on 15 standard data sets using the mean, best, worst and standard deviation (STD) criteria. The Wilcoxon statistical test was applied at a significance level of 5%, with the Bonferroni–Holm method used to control the family-wise error rate. The results, shown as Pareto front charts, indicate that the proposed solutions perform promisingly on these data sets.
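To make the bi-objective formulation behind the Pareto fronts concrete, the sketch below is an illustration under assumed objectives, not the authors' implementation: each candidate feature subset is a binary NumPy mask scored on two minimized objectives, taken here to be classification error and the fraction of selected features, and the Pareto front is the set of non-dominated candidates. The k-NN classifier, the hold-out split and the helper names are placeholder choices.

```python
# Sketch of bi-objective feature-subset evaluation and Pareto filtering.
# Assumed objectives (both minimized): classification error, ratio of selected features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier


def evaluate(mask, X, y):
    """Score a binary feature mask as (error, feature_ratio)."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:                         # an empty subset is useless
        return 1.0, 0.0
    Xtr, Xte, ytr, yte = train_test_split(X[:, idx], y, test_size=0.3, random_state=0)
    clf = KNeighborsClassifier(n_neighbors=5).fit(Xtr, ytr)  # placeholder wrapper classifier
    return 1.0 - clf.score(Xte, yte), idx.size / mask.size


def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def pareto_front(population, X, y):
    """Return the non-dominated (mask, objectives) pairs of a population of masks."""
    scored = [(m, evaluate(m, X, y)) for m in population]
    return [(m, f) for m, f in scored
            if not any(dominates(g, f) for _, g in scored)]
```

A multi-objective wrapper such as MOPSO, NSGA-II or the hybrid proposed here would evolve the population of masks and maintain an archive of non-dominated solutions of this kind; the mean, best, worst and STD statistics and the Wilcoxon comparisons mentioned in the abstract would typically be computed over repeated independent runs of such a procedure.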

Author information

Corresponding author

Correspondence to Farhad Soleimanian Gharehchopogh.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Abdollahzadeh, B., Gharehchopogh, F.S. A multi-objective optimization algorithm for feature selection problems. Engineering with Computers 38 (Suppl 3), 1845–1863 (2022). https://doi.org/10.1007/s00366-021-01369-9
