
Modified forensic-based investigation algorithm for global optimization

Original Article · Published in Engineering with Computers

Abstract

Forensic-based investigation (FBI) is a recently developed metaheuristic algorithm inspired by the suspect investigation–location–pursuit operations of police officers. This study focuses on the two search processes of the FBI algorithm, called Step A and Step B, in order to improve its performance. Opposition-based learning is applied to Step A to enhance population diversity, while Cauchy-based mutation is integrated into Step B to guide the search toward unexplored regions and to escape local minima. To demonstrate the effectiveness of these improvements, the proposed algorithm is tested on two different benchmark sets, and its performance is verified with statistical tests on the numerical functions. The study also applies the proposed algorithm to a set of six real-world problems. The adopted and integrated methods have a significant impact on the FBI algorithm, augmenting its performance and yielding better solutions than the compared algorithms on most of the functions and real-world problems.
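The two modifications described above can be sketched in a few lines of NumPy. This is a minimal illustration only, assuming a real-valued search space with box bounds; the function and variable names are illustrative and do not come from the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def opposition(pop, lower, upper):
    """Opposition-based learning: for a candidate x in [lower, upper],
    form the opposite point lower + upper - x. The candidate and its
    opposite are then compared and the better one is retained."""
    return lower + upper - pop

def cauchy_mutation(solution, scale=1.0):
    """Cauchy-based mutation: perturb a solution with a heavy-tailed
    Cauchy step, which occasionally produces large jumps that can
    move the search out of a local minimum."""
    return solution + scale * rng.standard_cauchy(size=solution.shape)

# Tiny demonstration on the sphere function f(x) = sum(x_i^2).
sphere = lambda x: float(np.sum(x ** 2))
lower, upper = -100.0, 100.0
x = rng.uniform(lower, upper, size=5)
x_opp = opposition(x, lower, upper)
# Greedy selection between the point and its opposite.
better = x if sphere(x) <= sphere(x_opp) else x_opp
mutant = cauchy_mutation(better)
```

In a full algorithm these operators would be embedded in the respective FBI steps, with the greedy selection repeated each iteration over the whole population.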


[Figs. 1–6]


Author information

Corresponding author

Correspondence to Yiğit Çağatay Kuyu.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

| Equations | Dim | Range | Min |
| --- | --- | --- | --- |
| \(f_{1}(x) = \sum_{i=1}^{n} x_{i}^{2}\) | 50 | \([-100, 100]\) | 0 |
| \(f_{2}(x) = \sum_{i=1}^{n} \left| x_{i} \right| + \prod_{i=1}^{n} \left| x_{i} \right|\) | 50 | \([-10, 10]\) | 0 |
| \(f_{3}(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_{j} \right)^{2}\) | 50 | \([-100, 100]\) | 0 |
| \(f_{4}(x) = \max_{i} \left\{ \left| x_{i} \right|,\ 1 \le i \le n \right\}\) | 50 | \([-100, 100]\) | 0 |
| \(f_{5}(x) = \sum_{i=1}^{n-1} \left[ 100\left( x_{i+1} - x_{i}^{2} \right)^{2} + \left( x_{i} - 1 \right)^{2} \right]\) | 50 | \([-30, 30]\) | 0 |
| \(f_{6}(x) = \sum_{i=1}^{n} \left( \left[ x_{i} + 0.5 \right] \right)^{2}\) | 50 | \([-100, 100]\) | 0 |
| \(f_{7}(x) = \sum_{i=1}^{n} i x_{i}^{4} + \mathrm{random}[0, 1)\) | 50 | \([-1.28, 1.28]\) | 0 |
| \(f_{8}(x) = \sum_{i=1}^{n} -x_{i} \sin\left( \sqrt{\left| x_{i} \right|} \right)\) | 50 | \([-500, 500]\) | \(-418.9829 \times \mathrm{Dim}\) |
| \(f_{9}(x) = \sum_{i=1}^{n} \left[ x_{i}^{2} - 10\cos\left( 2\pi x_{i} \right) + 10 \right]\) | 50 | \([-5.12, 5.12]\) | 0 |
| \(f_{10}(x) = -20\exp\left( -0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_{i}^{2}} \right) - \exp\left( \frac{1}{n}\sum_{i=1}^{n} \cos\left( 2\pi x_{i} \right) \right) + 20 + e\) | 50 | \([-32, 32]\) | 0 |
| \(f_{11}(x) = \frac{1}{4000}\sum_{i=1}^{n} x_{i}^{2} - \prod_{i=1}^{n} \cos\left( \frac{x_{i}}{\sqrt{i}} \right) + 1\) | 50 | \([-600, 600]\) | 0 |
| \(f_{12}(x) = \frac{\pi}{n}\left\{ 10\sin^{2}\left( \pi y_{1} \right) + \sum_{i=1}^{n-1} \left( y_{i} - 1 \right)^{2} \left[ 1 + 10\sin^{2}\left( \pi y_{i+1} \right) \right] + \left( y_{n} - 1 \right)^{2} \right\} + \sum_{i=1}^{n} u\left( x_{i}, 10, 100, 4 \right)\), where \(y_{i} = 1 + \frac{x_{i} + 1}{4}\) and \(u\left( x_{i}, a, k, m \right) = \begin{cases} k\left( x_{i} - a \right)^{m} & x_{i} > a \\ 0 & -a \le x_{i} \le a \\ k\left( -x_{i} - a \right)^{m} & x_{i} < -a \end{cases}\) | 50 | \([-50, 50]\) | 0 |
| \(f_{13}(x) = 0.1\left\{ \sin^{2}\left( 3\pi x_{1} \right) + \sum_{i=1}^{n-1} \left( x_{i} - 1 \right)^{2} \left[ 1 + \sin^{2}\left( 3\pi x_{i+1} \right) \right] + \left( x_{n} - 1 \right)^{2} \left[ 1 + \sin^{2}\left( 2\pi x_{n} \right) \right] \right\} + \sum_{i=1}^{n} u\left( x_{i}, 5, 100, 4 \right)\) | 50 | \([-50, 50]\) | 0 |
| \(f_{14}(x) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2} \left( x_{i} - a_{ij} \right)^{6}} \right)^{-1}\) | 2 | \([-65.536, 65.536]\) | 1 |
| \(f_{15}(x) = \sum_{i=1}^{11} \left[ a_{i} - \frac{x_{1}\left( b_{i}^{2} + b_{i} x_{2} \right)}{b_{i}^{2} + b_{i} x_{3} + x_{4}} \right]^{2}\) | 4 | \([-5, 5]\) | 0.0003 |
| \(f_{16}(x) = 4x_{1}^{2} - 2.1x_{1}^{4} + \frac{1}{3}x_{1}^{6} + x_{1} x_{2} - 4x_{2}^{2} + 4x_{2}^{4}\) | 2 | \([-5, 5]\) | −1.0316 |
| \(f_{17}(x) = \left( x_{2} - \frac{5.1}{4\pi^{2}}x_{1}^{2} + \frac{5}{\pi}x_{1} - 6 \right)^{2} + 10\left( 1 - \frac{1}{8\pi} \right)\cos x_{1} + 10\) | 2 | \(x_{1} \in [-5, 10],\ x_{2} \in [0, 15]\) | 0.398 |
| \(f_{18}(x) = \left[ 1 + \left( x_{1} + x_{2} + 1 \right)^{2} \left( 19 - 14x_{1} + 3x_{1}^{2} - 14x_{2} + 6x_{1} x_{2} + 3x_{2}^{2} \right) \right] \times \left[ 30 + \left( 2x_{1} - 3x_{2} \right)^{2} \left( 18 - 32x_{1} + 12x_{1}^{2} + 48x_{2} - 36x_{1} x_{2} + 27x_{2}^{2} \right) \right]\) | 2 | \([-2, 2]\) | 3 |
| \(f_{19}(x) = -\sum_{i=1}^{4} c_{i} \exp\left( -\sum_{j=1}^{3} a_{ij} \left( x_{j} - p_{ij} \right)^{2} \right)\) | 3 | \([0, 1]\) | −3.86 |
| \(f_{20}(x) = -\sum_{i=1}^{4} c_{i} \exp\left( -\sum_{j=1}^{6} a_{ij} \left( x_{j} - p_{ij} \right)^{2} \right)\) | 6 | \([0, 10]\) | −3.32 |
| \(f_{21}(x) = -\sum_{i=1}^{5} \left[ \left( X - a_{i} \right)\left( X - a_{i} \right)^{T} + c_{i} \right]^{-1}\) | 4 | \([0, 10]\) | −10.1532 |
| \(f_{22}(x) = -\sum_{i=1}^{7} \left[ \left( X - a_{i} \right)\left( X - a_{i} \right)^{T} + c_{i} \right]^{-1}\) | 4 | \([0, 10]\) | −10.4028 |
| \(f_{23}(x) = -\sum_{i=1}^{10} \left[ \left( X - a_{i} \right)\left( X - a_{i} \right)^{T} + c_{i} \right]^{-1}\) | 4 | \([0, 10]\) | −10.5363 |
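As a cross-check of the tabulated minima, some of the benchmark functions above can be implemented directly. The sketch below (NumPy, illustrative names) evaluates \(f_{1}\), \(f_{9}\), and \(f_{10}\) at the origin, where each attains its listed minimum of 0.

```python
import numpy as np

def sphere(x):
    """f1: unimodal baseline; global minimum 0 at x = 0."""
    return float(np.sum(x ** 2))

def rastrigin(x):
    """f9: highly multimodal; global minimum 0 at x = 0."""
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def ackley(x):
    """f10: global minimum 0 at x = 0."""
    n = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n)
                 + 20.0 + np.e)

# At the origin all three functions attain their tabulated minimum (≈ 0,
# up to floating-point rounding for Ackley).
z = np.zeros(50)
values = [sphere(z), rastrigin(z), ackley(z)]
```

The same pattern extends to the remaining functions; only the fixed-dimension ones (f14 onwards) additionally require their constant matrices \(a\), \(b\), \(c\), and \(p\).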


About this article

Cite this article

Kuyu, Y.Ç., Vatansever, F. Modified forensic-based investigation algorithm for global optimization. Engineering with Computers 38, 3197–3218 (2022). https://doi.org/10.1007/s00366-021-01322-w

