Article

Chaotic Search Based Equilibrium Optimizer for Dealing with Nonlinear Programming and Petrochemical Application

1 Department of Mathematics and Statistics, College of Science, Taif University, P.O. Box 11099, Taif 21944, Saudi Arabia
2 Department of Mathematics, College of Science and Humanities in Al-Kharj, Prince Sattam bin Abdulaziz University, Al-Kharj 11942, Saudi Arabia
3 Department of Basic Engineering Science, Faculty of Engineering, Menofia University, Shebin El-Kom 32511, Egypt
4 Department of Biomedical Engineering, Faculty of Engineering at Helwan, Helwan University, Cairo, Helwan 11795, Egypt
* Author to whom correspondence should be addressed.
Processes 2021, 9(2), 200; https://doi.org/10.3390/pr9020200
Submission received: 13 December 2020 / Revised: 16 January 2021 / Accepted: 18 January 2021 / Published: 21 January 2021

Abstract

In this article, a chaotic search based constrained equilibrium optimizer algorithm (CS-CEOA) is suggested by integrating a novel heuristic approach called the equilibrium optimizer with a chaos theory-based local search algorithm for solving general non-linear programming problems. CS-CEOA consists of two phases: the first (phase I) aims to detect an approximate solution while avoiding being stuck in local minima; in phase II, the chaos-based search algorithm improves local search performance to obtain the best optimal solution. For every infeasible solution, a repair function is applied so that a new feasible solution is created on the line segment defined by a feasible reference point and the infeasible solution itself. Owing to the fast global convergence of evolutionary algorithms and the exhaustive nature of the chaotic search, CS-CEOA can locate the true optimal solution by applying an exhaustive local search over a limited area defined in phase I. The efficiency of CS-CEOA is studied over multiple benchmark suites, including constrained, unconstrained, and CEC’05 problems, as well as an application of blending four ingredients, three feed streams, one tank, and two products to create certain products with specific chemical properties while satisfying target costs. The results were compared with standard evolutionary algorithms such as PSO and GA, and with many hybrid algorithms in the same simulation environment, confirming CS-CEOA’s superiority in detecting the optimal solution over the selected counterparts.

1. Introduction

An exhaustive investigation of both the theoretical and practical aspects of constrained non-linear programming problems (CNPPs) is the subject matter of this paper. CNPPs have many characteristics, such as non-differentiability, non-convexity, unimodality, and multimodality. Owing to the complexities that often occur in CNPPs, researchers are trying to implement efficient optimizers to deal with non-linear programming (NLP) problems.
From the viewpoint of mathematical optimization methods, there are two main classifications: (1) deterministic optimization techniques and (2) stochastic optimization techniques. Linear programming and non-linear programming methods [1,2] are among the most common deterministic methods; they search the space and find a solution using gradient knowledge of the problem. These methods are useful for problems with linear search areas (unimodal functions), but for problems with non-linear search areas, such as real-world applications with non-convex formulations, they are vulnerable to entrapment in local optima [3,4]. This problem can be combated by modifying or hybridizing the algorithm [5] with different initial designs. An alternative to these traditional methods is the class of stochastic optimization methods that employ random variables. These methods explore the search space globally to detect the global optimum or a close-to-global solution. Their advantages include simplicity, problem independence, flexibility, and their gradient-free nature [6].
Among the existing stochastic methods are famous algorithms such as genetic algorithm based approaches [7,8,9,10], the artificial immune system [11], neural network-based methods [12], particle swarm based methods [13,14,15], ant colony based methods [16], artificial bee colony based methods [17], the bacterial foraging based algorithm (BFA) [18,19], the cat swarm based optimization algorithm (CSO) [20], the glowworm swarm based optimization algorithm (GSOA) [21], the firefly-based optimization algorithm (FOA) [22], the monkey-based algorithm (MA) [23], the krill herd algorithm (KHA) [24], the cuckoo search based algorithm [25], the whale optimization algorithm (WOA) [26], the sine cosine algorithm [27,28], the grasshopper based optimization algorithm (GOA) [29], the salp swarm based algorithm [30], the equilibrium optimizer based optimization algorithm (EOA) [31], the gradient-based optimizer (GBO) [32], the slime mold-based algorithm (SMA) [33], and Harris hawks optimization (HHO) [34], among others.
Many stochastic-based methods have recently been used to deal with CNPPs, such as the carnivorous plant algorithm [35]; a modified sine cosine algorithm (SCA) [36] enhanced with a novel mutation operator and a transition parameter; the turbulent flow of water-based optimization (TFWO) algorithm [37]; a chaos mechanism based on quasi-opposition [38]; an ABC algorithm with adaptive heterogeneous competition [39]; an improved FOA [40]; a bare-bones based SCA [41]; the group teaching optimization algorithm (GTOA) [42]; the political optimizer (PO) [43]; and a refined selfish herd optimizer [44].
Many scientists and researchers have investigated the hybridization of chaos theory with evolutionary optimization techniques to enhance optimization algorithm performance. Wei et al. [45] implemented the tent chaotic map to randomly generate the initial population for the genetic algorithm, guaranteeing a well-distributed population throughout the search space. Dash et al. [46] presented a novel hybrid evolutionary swarm algorithm by combining conditional mutual information maximization with a chaotic firefly approach. Fuertes et al. [47] proposed a new contribution to the chaos-based genetic algorithm and investigated the effectiveness of entropy in the initial population. Mousa et al. [48] presented a hybrid evolutionary algorithm based on the sinusoidal chaotic mapping. El-Shorbagy et al. [49] presented an interesting comparison between 14 chaotic mappings used as chaotic local search. Abo-Elnaga et al. [50] presented a chaos local search based genetic algorithm for dealing with bilevel programming problems.
EOA, proposed by Faramarzi et al. [31], is a novel optimization approach that simulates the control volume mass balance models implemented to determine both dynamic and equilibrium states. Each particle (solution) represents a search agent with its position (concentration) in the EOA. In order to ultimately achieve an equilibrium (optimum result), search agents randomly update their concentrations with respect to the best available candidates, the so-called equilibrium candidates. It has been shown that the "generation rate" term improves the capability of EOA to escape local minima and to establish a balance between the exploitation and exploration operators.
The rising literature shows that EOA is becoming more common in different fields. For example, a binary EOA for the 0–1 knapsack problem was proposed in [51], while an effective EOA with a mutation strategy for numerical optimization was provided in [52]. In addition, it was used to compute the optimal estimate of Schottky diode parameters [53], to determine solar photovoltaic parameters [54], and to reconfigure and distribute generation in power systems [55], etc. The efficacy of EOA encourages multidisciplinary researchers to further improve its applicability. There are several ways to strengthen the initial EOA, as follows:
  • Changing EOA parameters or algorithmic procedures to boost algorithm performance;
  • The development of the EOA by modern learning methods to improve the use of information;
  • Hybridization of the EOA by other search methods;
  • Combining EOA with chaotic search methods.
In this study, CS-CEOA is suggested to solve non-linear programming problems and an application from petrochemical engineering. CS-CEOA is an integration of a chaos-based local search algorithm and a new heuristic approach called the equilibrium optimizer algorithm (EOA). The principles of co-evolution, reparation, elitism, and chaotic search are the main features of the proposed method. The repair method co-evolves any infeasible solution until it becomes feasible, in such a way that a new feasible solution is created on the segment defined by the feasible reference point and the infeasible solution itself. The elitist strategy retains the best-found solution across all generations, which gives the proposed algorithm faster convergence to the optimal solution, while the chaotic search increases CS-CEOA's capability to reach the global solution. CS-CEOA is examined using a set of the most well-known benchmark test problems, "CEC'05", and eight constrained benchmark problems elicited from the literature [56,57]. Further, the proposed algorithm is applied to solve an application of blending four ingredients, three feed streams, one tank, and two products to obtain certain products with required chemical properties while satisfying target costs. The efficiency of our algorithm was demonstrated by comparison with other algorithms in the literature.
This paper is structured as follows. In Section 2, we explain the standard formulation of constrained non-linear programming problems. The suggested algorithm is investigated in Section 3. Section 4 addresses the simulation experiments. The limitations of the proposed study are presented in Section 5. Finally, our observations and future work are discussed in Section 6.

2. Constrained Non-linear Programming Problem (CNPP)

In mathematics, a constrained non-linear programming problem (CNPP) is the process of handling an optimization problem where some of the constraints or the objective function are nonlinear. The linear programming problem is a special case of the CNPP.
The general CNPP is written generally as [58]:
$$\min f(x) \quad \text{subject to} \quad c_m(x) = 0,\ m \in E; \qquad c_m(x) \ge 0,\ m \in I; \qquad l_i \le x_i \le u_i,\ i = 1, \dots, n,$$
where $x \in \mathbb{R}^n$ are the decision variables; $l \in \mathbb{R}^n$ and $u \in \mathbb{R}^n$ represent the lower and upper bounds of the decision variables; $E$ is the set of equality constraints and $I$ is the set of inequality constraints; $f$ is the objective function; and $c_m,\ m \in E \cup I$, are the constraint functions. The functions $f$ and $c_m$ map $\mathbb{R}^n$ to $\mathbb{R}$.
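To make the standard form concrete, the sketch below represents a CNPP programmatically, with a feasibility check matching the definition above. The experiments in this paper were run in Matlab, so this Python rendering is only an illustration; the `CNPP` container and its method names are assumptions of this sketch, not part of the paper's code.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

import numpy as np

@dataclass
class CNPP:
    """Toy container for the standard form: min f(x) subject to
    c_m(x) = 0 (m in E), c_m(x) >= 0 (m in I), and box bounds l <= x <= u."""
    f: Callable[[np.ndarray], float]
    eq: List[Callable[[np.ndarray], float]] = field(default_factory=list)    # c_m(x) = 0
    ineq: List[Callable[[np.ndarray], float]] = field(default_factory=list)  # c_m(x) >= 0
    lb: Optional[np.ndarray] = None
    ub: Optional[np.ndarray] = None

    def is_feasible(self, x: np.ndarray, tol: float = 1e-8) -> bool:
        ok = (all(abs(c(x)) <= tol for c in self.eq)
              and all(c(x) >= -tol for c in self.ineq))
        if self.lb is not None:
            ok = ok and bool(np.all(x >= self.lb - tol))
        if self.ub is not None:
            ok = ok and bool(np.all(x <= self.ub + tol))
        return bool(ok)
```

For example, minimizing $\|x\|^2$ subject to $x_1 + x_2 \ge 1$ and $0 \le x_i \le 5$ is expressed by passing `f`, one `ineq` function, and the bound vectors.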

3. The Suggested Algorithm (CS-CEOA)

In this section, the suggested algorithm, a chaotic search based on a constrained equilibrium optimizer algorithm (CS-CEOA) is presented.

3.1. Brief Description of the Equilibrium Optimizer Algorithm

The equilibrium optimizer algorithm (EOA) is a simulated optimizer that was originally presented by Faramarzi et al. [31] in 2020. It simulates the equilibrium and dynamic states of mass balance models, where each particle's concentration (position) is updated randomly with the target of reaching the equilibrium state (best fitness). The equilibrium optimizer has a very simple procedure and an adaptive dynamic control parameter. It is initialized with the initial positions of the particles $C_i,\ i = 1, 2, \dots, \text{No. of particles}$, for a given number of particles and problem dimension (dim), as in the following equation:
$$C_i^{\text{initial}} = \text{rand}(\text{No. of particles}, \dim) \times (ub - lb) + lb,$$
where $C_i^{\text{initial}}$ denotes the initial positions of the particles, and $lb$ and $ub$ are the specified lower and upper bounds of the decision variables, respectively.
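Equation (1) is a uniform random draw inside the search box. A minimal Python sketch (the paper's implementation is in Matlab; the function name and the use of NumPy's random generator are illustrative):

```python
import numpy as np

def initialize_population(n_particles, dim, lb, ub, seed=None):
    """Equation (1): draw uniform random positions inside the box [lb, ub]."""
    rng = np.random.default_rng(seed)
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    # rand(n_particles, dim) scaled and shifted into [lb, ub], per dimension
    return rng.random((n_particles, dim)) * (ub - lb) + lb
```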
  • Equilibrium pool and candidates (Ceq)
The final convergence state of the EOA is called the equilibrium state. At the initialization of the algorithm, equilibrium candidates are assigned to support a search pattern for the particles. Four best-so-far particles are identified during the optimization process, together with another particle whose position is the arithmetic mean of the other four. The EOA explores using the four candidates and exploits using the averaged one. These five particles, called the equilibrium candidates, are used to construct the equilibrium pool:
$$C_{eq,pool} = \{ C_{eq,1}, C_{eq,2}, C_{eq,3}, C_{eq,4}, C_{eq,av} \},$$
In each iteration, the position of every particle is updated using the equilibrium pool by randomly selecting one of the candidates with equal probability. The particle positions are thus repeatedly updated with respect to the equilibrium pool, which holds the best-so-far candidates. The update mechanism of the EOA is given by the following equations:
$$C_{new} = C_{eq} + \frac{G}{\lambda}(1 - F) + (C_{old} - C_{eq}) \times F,$$
$$F = a_1\, \text{sign}(r - 0.5)\,\big(e^{-\lambda t} - 1\big),$$
$$G = \begin{cases} 0.5\, r_1 & \text{if } r_2 \ge GP, \\ 0 & \text{if } r_2 < GP, \end{cases}$$
$$t = \left(1 - \frac{T}{T_{\max}}\right)^{a_2 \frac{T}{T_{\max}}};$$
where $C_{old}$ is the current position (concentration) vector and $C_{new}$ is the updated position vector of the particle. From the equilibrium pool, one concentration vector, denoted by $C_{eq}$, is picked at random. $\lambda$ is a random vector between 0 and 1; $a_1$ and $a_2$ are constants ($a_1 = 2$ and $a_2 = 1$); $r$, $r_1$, and $r_2$ are random numbers generated between 0 and 1; $GP$ is the generation probability; $T$ is the current iteration counter; and $T_{\max}$ is the predetermined maximum number of iterations. In each iteration, the objective function is evaluated at each particle's position to determine its state. In addition, the equilibrium pool $C_{eq,pool} = \{ C_{eq,1}, C_{eq,2}, C_{eq,3}, C_{eq,4}, C_{eq,av} \}$ is updated every iteration to contain the four best-so-far particles.
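A single concentration update following Equations (4)–(7) can be sketched as below. This is a Python illustration, not the authors' Matlab code; it uses the simplified scalar generation-rate term $G$ exactly as written in Equation (6).

```python
import numpy as np

def eo_update(C_old, C_eq, T, T_max, a1=2.0, a2=1.0, GP=0.5, seed=None):
    """One EOA concentration update, Equations (4)-(7)."""
    rng = np.random.default_rng(seed)
    dim = C_old.size
    t = (1.0 - T / T_max) ** (a2 * T / T_max)                 # Eq. (7)
    lam = rng.random(dim)                                      # random vector in (0, 1)
    r = rng.random(dim)
    F = a1 * np.sign(r - 0.5) * (np.exp(-lam * t) - 1.0)       # Eq. (5)
    r1, r2 = rng.random(), rng.random()
    G = 0.5 * r1 if r2 >= GP else 0.0                          # Eq. (6)
    return C_eq + (G / lam) * (1.0 - F) + (C_old - C_eq) * F   # Eq. (4)
```

Note how the exponent in Equation (7) drives $t \to 0$ as $T \to T_{\max}$, shrinking $F$ and shifting the search from exploration toward exploitation around the pool candidates.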

3.2. Basic Algorithm

The combined algorithm CS-CEOA is constructed of two phases: the first (phase I) aims to locate an approximate solution while avoiding being stuck in local minima; in phase II, the chaos-based search algorithm improves CS-CEOA's performance and obtains the best optimal solution. CS-CEOA's main steps are defined as follows:
  • Phase I: Constrained equilibrium optimizer algorithm
Step 1. Initialization stage: The initial population in the first generation is randomly initialized according to Equation (1).
Step 2. Initial feasible particle: The algorithm requires at least one initial feasible reference point (satisfying the set of constraints) to evolve the algorithm process. If the algorithm has difficulties in finding such an initial reference point (RP), it implements one of the following two strategies: (1) doubling the number of trials to obtain the initial reference point, or (2) temporarily enlarging the feasible space [59].
Step 3. Repairing infeasible particles: This step co-evolves any infeasible solution until it becomes feasible. A feasible solution is created on the segment defined by the feasible reference point and the infeasible solution [60].
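The segment-based repair of Step 3 can be sketched as follows. This is an illustrative Python rendering under stated assumptions: `is_feasible` is a user-supplied constraint check, the convex-combination weight is drawn uniformly at random, and the retry limit and fallback to the reference point are choices of this sketch rather than details from [60].

```python
import numpy as np

def repair(x, ref, is_feasible, max_tries=100, seed=None):
    """Step 3 sketch: pull an infeasible point x toward a feasible reference
    point ref along the connecting line segment until it becomes feasible."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    ref = np.asarray(ref, dtype=float)
    for _ in range(max_tries):
        if is_feasible(x):
            return x
        a = rng.random()             # random point on the segment [ref, x]
        x = a * ref + (1.0 - a) * x
    return ref.copy()                # give up: fall back to the reference point
```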
Step 4. Elitist strategy for selection: To make the algorithm converge faster to the optimal solution, the elitist strategy is used. The elitist particle represents the best solution in the population. Under the elitist strategy, the best fitness value can never deteriorate from one generation to the next until the optimization process is over.
Step 5. Evolution process stage: The algorithm applies EOA procedures to create a new population using Equations (4)–(7).
Step 6. Stopping criteria: The proposed algorithm stops when either of the following two conditions is met:
- Reaching the maximum predetermined number of generations $T_{\max}$.
- Convergence of the population's particles, which happens when all solutions in the population are similar.
Optimization using phase I yields an approximate solution $x^* = (x_1^*, x_2^*, \dots, x_n^*)$ close to the true global solution. The chaotic local search (CLS) has the capability to perturb the position $x^*$, so that the local zone around $x^*$ is exhaustively explored. Various chaotic maps have been used in optimization algorithms to enhance their efficiency; in the suggested algorithm, the chaotic circle map is used in the CLS phase. The detailed procedure of the CLS scheme is presented as follows:
  • Phase II: Chaotic local search (CLS):
Step 1. Determine the range of the CLS $[a_i, b_i],\ i = 1, 2, \dots, n$, by $a_i = x_i^* - \varepsilon$ and $b_i = x_i^* + \varepsilon$, where $\varepsilon$ is the predetermined radius of the chaotic local search.
Step 2. Chaotic random numbers $z_L$ are generated using the chaotic circle map, with $\alpha = 0.5$ and $\beta = 0.2$, as follows:

$$z_{L+1} = \left( z_L + \beta - \frac{\alpha}{2\pi} \sin(2\pi z_L) \right) \bmod 1,$$
where $L$ is the CLS iteration counter and mod denotes the modulo operation, which returns the remainder after one number is divided by another.
Step 3. Map the chaos variable $z_L$ into the decision variable range $[a_i, b_i]$ by

$$x_i^L = a_i + (b_i - a_i) z_L,$$
By substituting $a_i = x_i^* - \varepsilon$ and $b_i = x_i^* + \varepsilon$, Equation (9) can be rewritten as:

$$x_i^L = x_i^* - \varepsilon + 2\varepsilon z_L, \quad i = 1, \dots, n,$$
Step 4. If $f(x^L) < f(x^*)$, then set $x^* = x^L$; otherwise, continue the iteration.
Step 5. If $f(x^*)$ has not been improved after all $L$ iterations, terminate the chaos search algorithm and output $x^*$ as the best optimal global solution.
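Steps 1–5 of the CLS phase can be sketched as below. This Python illustration follows Equations (8) and (10); the per-dimension starting values for $z$ and the fixed iteration budget are assumptions of the sketch, not prescribed by the paper.

```python
import numpy as np

def chaotic_local_search(f, x_star, eps, n_iter=200, alpha=0.5, beta=0.2):
    """Phase II sketch: explore the box [x* - eps, x* + eps] with circle-map
    chaotic numbers (Equation 8) and keep any improvement (Steps 1-5)."""
    x_best = np.asarray(x_star, dtype=float)
    f_best = f(x_best)
    z = np.linspace(0.1, 0.9, x_best.size)      # distinct starting z per dimension
    for _ in range(n_iter):
        # Equation (8): z_{L+1} = (z_L + beta - (alpha / 2*pi) * sin(2*pi*z_L)) mod 1
        z = np.mod(z + beta - (alpha / (2.0 * np.pi)) * np.sin(2.0 * np.pi * z), 1.0)
        x_trial = x_best - eps + 2.0 * eps * z  # Equation (10)
        f_trial = f(x_trial)
        if f_trial < f_best:                    # Step 4: accept improvements only
            x_best, f_best = x_trial, f_trial
    return x_best, f_best
```

Because the trial points are accepted only when they improve $f$, the returned value is never worse than the phase I solution passed in.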
The proposed algorithm is said to have convergence if
$$\frac{\| X^{T+1} - X^* \|}{\| X^T - X^* \|} \le \tau, \quad \tau \ge 0,$$
where $X^T$ and $X^{T+1}$ denote the solutions obtained at the end of iterations $T$ and $T+1$, respectively, $X^*$ represents the optimum solution, and $\| X \|$ denotes the length (norm) of the vector $X$. The proposed optimization method is said to have super-linear convergence (corresponding to fast convergence) if:
$$\lim_{T \to \infty} \frac{\| X^{T+1} - X^* \|}{\| X^T - X^* \|} = 0.$$
The pseudo-code of the chaotic local search is given in Figure 1, while the flowchart of the proposed algorithm is shown in Figure 2.

4. Experimental Findings

This section validates the proposed algorithm on non-linear programming problems and on a petrochemical engineering application: it is tested on the set of well-known benchmark test problems "CEC'05", a set of eight constrained benchmark test problems [56,57], and the petrochemical engineering application. The efficiency of our algorithm is demonstrated by comparison with other recent algorithms in the literature. All the experiments are coded in Matlab 14.0, and the numerical simulations are performed on an Intel Core machine (Intel i7, 2.9 GHz, 16 GB DDR4 RAM). The control parameters of the proposed algorithm are shown in Table 1.

4.1. Benchmark Unconstrained Problem Suite

This subsection focuses on the reliability and robustness of the proposed algorithm (CS-CEOA), evaluated on 17 unconstrained benchmark functions. The results are compared against integrated particle swarm with genetic algorithms (Integrated PSO-GAs) [15], a hybrid optimization algorithm from PSO and GA (H_PSO_GA) [61], a continuous genetic algorithm (CGA) [62], a continuous hybrid algorithm (CHA) [63], a PSO based hybrid GA (GA-PSO) [64], and the original constrained equilibrium optimizer algorithm (CEOA). To avoid biasing the optimization results by the randomness of the initial population and to make unbiased comparisons, we run each problem 30 times, starting from various randomly selected positions in the hyperrectangular search space. The numerical comparison between the results calculated by the proposed algorithm and the global optimal solutions is shown in Table 2, while Table 3 illustrates the experimental results obtained using the proposed algorithm versus five recent evolutionary algorithms according to the average error. The numerical simulations demonstrate the superiority of the proposed approach in locating the global optimal solution.

4.2. Benchmark Constrained Problem Suite

This subsection focuses on the reliability, robustness, and ability of CS-CEOA to solve constrained problems, as it is evaluated on 8 constrained standard functions [57]. For comparison, we have chosen the constrained PSO algorithm from [57]. Table 4 shows a comparison between the constrained PSO algorithm [57], the original constrained equilibrium optimizer algorithm (CEOA), and our approach CS-CEOA according to the absolute error. It is observed that CS-CEOA optimized the constrained problems effectively; the average error of our solutions is less than that obtained by the constrained PSO algorithm in most problems.
Additionally, by comparing the proposed algorithm (CS-CEOA) with the original CEOA, it can be seen that the chaotic search (CS) improves the outcomes for both the unconstrained benchmark suite (Table 2 and Table 3) and the constrained benchmark suite (Table 4). On the other hand, implementing the chaotic local search significantly influences the algorithm's convergence time, saving up to 12% of the time without affecting the accuracy of the results.

4.3. CEC 2005 Benchmark Unconstrained Problems

The proposed approach is tested on the 25 problems of the CEC'05 set ("special session 2005 on real-parameter optimization problems") [56]. Table 5 shows the comparison between the average error obtained by CS-CEOA and nine other optimization algorithms reported in the literature [65,66,67,68,69,70,71,72,73,74], where all reported algorithms were run fifty times for each test problem. The algorithm stops either when the maximal number of evaluations (1 × 10^5) is reached or when the obtained error is less than 1 × 10^(−8). Further, for each problem, we ranked the various methods according to the average error values obtained, as in Table 6 and Table 7. Figure 3 shows the relative weight of each algorithm, computed according to its rank. On the other hand, Figure 4 compares the different algorithms across problems according to their ranks. Overall, the proposed algorithm CS-CEOA performs well on almost all the test problems used in this suite.

4.4. Petrochemical Engineering Application (Blending Four Ingredients, Three Feed Streams, One Pool, and Two Products)

Optimization has many applications in various fields of chemical and petroleum engineering, such as the design, development, scheduling, analysis, planning, and operation of chemical processes. It is helpful as it enables the formulation of unstable systems while exploiting sparsity in developing process models. A pooling network system is constructed of any number of feed streams, pools (tanks), and products, in which any feed stream may connect to any tank and any product. These applications are familiar in chemical and petrochemical engineering. Figure 5 illustrates the graphical structure of a simple pooling network system involving three feed streams, one blending tank, and two products.
The purpose of this application is to calculate the flow streams of the four ingredients through the various pools in order to obtain certain products with required chemical properties while satisfying target costs. The four ingredients can be blended in the pool or directly reach any of the final products. These applications were investigated in [75,76,77,78,79], among others. The vectors $x_{il}$, $y_{lj}$, and $z_{ij}$ represent the flow streams between feed $i$ and pool $l$, pool $l$ and product $j$, and feed $i$ and product $j$, respectively. Ben-Tal et al. [75] presented the substitution of the flowrate $x_{il}$, which represents the flow stream from feed $i$ to pool $l$, with a fractional flowrate $q_{il}$, representing the fraction of the flow stream from feed $i$ to pool $l$. With these notions, we can define the following sets.
$I$ is the set of feed streams, $J$ is the set of required products, $L$ is the set of mixing tanks, and $K$ is the set of components whose quality is being monitored. For this petrochemical engineering application, we define the parameters of the physical problem as follows:
  • $A_i$ is the maximum output flow of feed $i$;
  • $D_j$ is the maximum predicted demand for product $j$;
  • $S_l$ is the size of tank $l$;
  • $C_{ik}$ is the percentage of component $k$ in feed $i$;
  • $P_{jk}$ is the maximum percentage of component $k$ in product $j$;
  • $c_i$ is the unit price of feed $i$, and $d_j$ is the unit price of final product $j$.
The mathematical formulation of a pooling system application is formulated as follows:
$$\begin{aligned}
\max \quad & \sum_{j=1}^{J} \sum_{l=1}^{L} \Big( d_j - \sum_{i \in I} c_i q_{il} \Big) y_{lj} + \sum_{i=1}^{I} \sum_{j=1}^{J} (d_j - c_i) z_{ij} \\
\text{subject to} \quad & \sum_{l=1}^{L} q_{il} \sum_{j=1}^{J} y_{lj} + \sum_{j=1}^{J} z_{ij} \le A_i, \quad i \in I, \\
& \sum_{j=1}^{J} y_{lj} \le S_l, \quad l \in L, \\
& \sum_{l=1}^{L} y_{lj} + \sum_{i=1}^{I} z_{ij} \le D_j, \quad j \in J, \\
& \sum_{l=1}^{L} \Big( \sum_{i=1}^{I} C_{ik} q_{il} - P_{jk} \Big) y_{lj} + \sum_{i=1}^{I} (C_{ik} - P_{jk}) z_{ij} \le 0, \quad j \in J,\ k \in K, \\
& 0 \le q_{il} \le 1, \quad i \in I,\ l \in L, \\
& 0 \le y_{lj} \le D_j, \quad l \in L,\ j \in J, \\
& 0 \le z_{ij} \le D_j, \quad i \in I,\ j \in J.
\end{aligned}$$
Figure 6 shows the application network system with four feeder streams, one pool, and two products.
The data for this application [79] are as follows:

$$A = (\infty, \infty, \infty, 50), \quad D = (100, 200), \quad C = (3, 1, 2, 1), \quad P = (5/2, 3/2), \quad c = (6, 16, 10, 15), \quad d = (9, 15), \quad S_1 = \infty.$$
The mathematical formulation of this network system is as follows:
$$\begin{aligned}
\max \quad & (9 - 6q_{11} - 16q_{21} - 15q_{41})\, y_{11} + (15 - 6q_{11} - 16q_{21} - 15q_{41})\, y_{12} - z_{31} + 5 z_{32} \\
\text{subject to} \quad & q_{41} y_{11} + q_{41} y_{12} \le 50, \\
& y_{11} + z_{31} \le 100, \\
& y_{12} + z_{32} \le 200, \\
& (3q_{11} + q_{21} + q_{41} - 2.5)\, y_{11} - 0.5 z_{31} \le 0, \\
& (3q_{11} + q_{21} + q_{41} - 1.5)\, y_{12} + 0.5 z_{32} \le 0, \\
& q_{11} + q_{21} + q_{41} = 1, \\
& 0 \le y_{11} \le 100, \quad 0 \le y_{12} \le 200, \\
& 0 \le z_{31} \le 100, \quad 0 \le z_{32} \le 200, \\
& 0 \le q_{11} \le 1, \quad 0 \le q_{21} \le 1, \quad 0 \le q_{41} \le 1.
\end{aligned}$$
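The concrete network model above can be evaluated directly for any candidate $(q_{11}, q_{21}, q_{41}, y_{11}, y_{12}, z_{31}, z_{32})$, which is exactly what a population-based solver needs. The sketch below encodes the objective and a feasibility check; the function names and the tolerance `tol` are assumptions of this illustration, not part of the paper's code.

```python
def pooling_objective(q11, q21, q41, y11, y12, z31, z32):
    """Profit of the four-feed, one-pool, two-product network."""
    pool_cost = 6 * q11 + 16 * q21 + 15 * q41   # unit cost of the blended pool stream
    return (9 - pool_cost) * y11 + (15 - pool_cost) * y12 - z31 + 5 * z32

def pooling_feasible(q11, q21, q41, y11, y12, z31, z32, tol=1e-9):
    """Check all constraints of the network model."""
    s = 3 * q11 + q21 + q41                      # component level of the pool blend
    cons = [
        q41 * (y11 + y12) <= 50 + tol,           # availability of feed 4
        y11 + z31 <= 100 + tol,                  # demand for product 1
        y12 + z32 <= 200 + tol,                  # demand for product 2
        (s - 2.5) * y11 - 0.5 * z31 <= tol,      # quality spec, product 1
        (s - 1.5) * y12 + 0.5 * z32 <= tol,      # quality spec, product 2
        abs(q11 + q21 + q41 - 1) <= tol,         # pool fractions sum to one
        0 <= y11 <= 100, 0 <= y12 <= 200,
        0 <= z31 <= 100, 0 <= z32 <= 200,
        all(0 <= q <= 1 for q in (q11, q21, q41)),
    ]
    return all(cons)
```

Such an evaluator would serve as the `is_feasible` check in Step 3 of phase I, with the objective negated for minimization.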
The results of this problem are presented in Table 8, which demonstrate the validity of the proposed algorithm to solve real-life applications.
In this subsection, a comparative study has been carried out to examine the proposed algorithm with respect to solution quality. First, evolutionary-based approaches suffer in terms of solution quality, as they obtain an approximate optimal solution; thus, CS-CEOA improves the quality of the obtained solution by applying a chaotic local search that guarantees fast convergence towards the true optimal solution. On the other hand, unlike conventional approaches, CS-CEOA searches using a population of particles, not a single point, so it can be regarded as a global search algorithm that can locate the global zone of the search space. In addition, CS-CEOA uses only the objective function values, not derivatives or any other auxiliary knowledge; therefore, it can handle the non-continuous, non-smooth, and non-differentiable functions that appear in practical real-life optimization problems. Furthermore, the equilibrium optimizer can be hybridized with other search processes, and its parameters can be modified to improve the efficiency of CS-CEOA. The simulation findings also show the superiority of CS-CEOA over the methods stated in the literature, as it is substantially better than the other methods. Finally, owing to the simplicity of its procedures, the practicality of using CS-CEOA to deal with complex problems of realistic dimensions has been confirmed.

5. Limitations of the Proposed Algorithm

The core advantage of traditional optimization techniques is that, unlike population-based approaches, they guarantee finding the truly global solution; however, they have critical limitations with large-scale real-life, non-differentiable, non-convex, ill-defined, and non-formulated problems. Population-based methods are usually very efficient and robust in finding near-global solutions, especially for complex problems. Nevertheless, the proposed technique has some critical limitations. It randomly generates the positions of agents, which can produce degeneracy. Degeneracy occurs when multiple agents represent the same position, which may lead to an inefficient search. To date, the mathematical convergence analysis of population-based algorithms is still at an early stage and has only been studied experimentally in the literature, making mathematical convergence analysis an important subject for future study. The advantages and disadvantages of the proposed algorithm are stated in Table 9.

6. Conclusions

Recently developed nature-inspired optimization approaches are good techniques for finding global solutions to real-life optimization applications. The equilibrium optimizer algorithm (EOA) is a novel optimization approach inspired by the control volume mass balance models implemented to determine both dynamic and equilibrium states. In this paper, the chaotic search-based constrained equilibrium optimizer algorithm (CS-CEOA) has been proposed as a new algorithm for solving constrained optimization problems. CS-CEOA integrates the evolution of individuals modeled by EOA with the local improvement of the chaotic local search (CLS); thus, CS-CEOA synthesizes the merits of both EOA and chaotic search, and it is a simple yet robust model for dealing with different types of optimization problems. CS-CEOA is computed in two phases: the first (phase I) intends to locate the approximate optimal solution while avoiding being trapped in local minima, while in phase II the chaos-based search algorithm improves local search performance and obtains the best optimal solution. In addition, a repair function was implemented to co-evolve any infeasible solution until it becomes feasible, in such a way that a new feasible solution is created on the segment defined by the feasible reference point and the infeasible solution itself. Due to the fast global convergence of evolutionary algorithms and the exhaustive nature of the chaotic search, CS-CEOA was able to locate the true optimal solution by implementing an exhaustive local search on a small zone.
The superior performance of CS-CEOA in comparison with recent competitive algorithms has been validated on multiple benchmark suites, including constrained, unconstrained, and CEC'05 problems, and on an application of blending four ingredients, three feed streams, one tank, and two products to obtain certain products with specific chemical properties while satisfying target costs. The results were compared with standard evolutionary algorithms, confirming the superiority of CS-CEOA in handling non-linear programming problems. The following observations reveal some major benefits of the proposed approach:
  • CS-CEOA has been used to increase the solution quality by combining the merits of EOA and CLS.
  • Implementing chaotic local search influences the algorithm convergence time, saving up to 12% of the time.
  • Unlike traditional techniques, CS-CEOA searches using a population of particles, therefore it can be considered as a global search algorithm.
  • CS-CEOA uses only the objective function values; therefore, it can handle all types of functions that exist in practical real-life optimization problems.
  • The numerical simulations confirm the superiority of CS-CEOA over the algorithms reported in the literature.
To date, theoretical convergence analysis of evolutionary algorithms is still at an early stage and has been experimentally studied in the literature, making mathematical convergence analysis an important subject in a future study.

Author Contributions

Conceptualization, writing—original draft preparation, methodology, formal analysis, and investigation, A.A.A.M., M.A.E.-S., I.M., and H.A.; writing—review and editing, A.A.A.M., M.A.E.-S., and I.M.; supervision, A.A.A.M. and M.A.E.-S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Acknowledgments

The authors extend their appreciation for the great support of the Deanship of Scientific Research, Taif University, for funding this work through Taif University Researchers Supporting Project number (Tursp-2020/48), Taif University, Taif, Saudi Arabia.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Luenberger, D.G. Linear and Non-Linear Programming, 2nd ed.; Addison-Wesley: Boston, MA, USA, 1984. [Google Scholar]
  2. Boyd, S.; Vandenberghe, L. Convex Optimization; Cambridge University Press: Cambridge, UK, 2004. [Google Scholar]
  3. Afshar, M.H.; Faramarzi, A. Size Optimization of Truss Structures by Cellular Automata. J. Comput. Sci. Eng. 2010, 3, 1–9. [Google Scholar]
  4. Faramarzi, A.; Afshar, M.H. A novel hybrid cellular automata–linear programming approach for the optimal sizing of planar truss structures. Civ. Eng. Environ. Syst. 2014, 31, 209–228. [Google Scholar] [CrossRef]
  5. Faramarzi, A.; Afshar, M.H. Application of cellular automata to size and topology optimization of truss structures. Sci. Iran. 2012, 19, 373–380. [Google Scholar] [CrossRef]
  6. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  7. El-Shorbagy, M.A.; Mousa, A.A.; Farag, M. Solving non-linear single-unit commitment problem by genetic algorithm based clustering technique. Rev. Comput. Eng. Res. 2017, 4, 11–29. [Google Scholar] [CrossRef] [Green Version]
  8. El-Shorbagy, M.A.; Mousa, A.A.; Farag, M.A. An intelligent computing technique based on a dynamic-size subpopulations for unit commitment problem. OPSEARCH 2019, 56, 911–944. [Google Scholar] [CrossRef]
  9. El-Shorbagy, M.A.; Ayoub, A.Y.; Mousa, A.A.; El-Desoky, I.M. An Enhanced Genetic Algorithm with New Mutation for Cluster Analysis. Comput. Stat. 2019, 34, 1355–1392. [Google Scholar] [CrossRef]
  10. El-Desoky, I.M.; El-Shorbagy, M.A.; Nasr, S.M.; Hendawy, Z.M.; Mousa, A.A. A Hybrid Genetic Algorithm for Job Shop Scheduling Problems. Int. J. Adv. Eng. Technol. Comput. Sci. 2016, 3, 6–17. [Google Scholar]
  11. Saurabh, P.; Verma, B. An efficient proactive artificial immune system based anomaly detection and prevention system. Expert Syst. Appl. 2016, 60, 311–320. [Google Scholar] [CrossRef]
  12. Mousa, A.A.; El-Shorbagy, M.A. Identifying a Satisfactory Operation Point for Fuzzy Multiobjective Environmental/Economic Dispatch Problem. Am. J. Math. Comput. Model. 2016, 1, 1–14. [Google Scholar]
  13. Mousa, A.A.; El-Shorbagy, M.A. Enhanced particle swarm optimization based local search for reactive power compensation problem. Appl. Math. 2012, 3, 1276–1284. [Google Scholar] [CrossRef] [Green Version]
  14. El-Shorbagy, M.A.; Mousa, A.A. Chaotic Particle Swarm Optimization for Imprecise Combined Economic and Emission Dispatch Problem. Rev. Inf. Eng. Appl. 2017, 4, 20–35. [Google Scholar]
  15. El-Wahed, W.F.A.; Mousa, A.A.; El-Shorbagy, M.A. Integrating particle swarm optimization with genetic algorithms for solving non-linear optimization problems. J. Comput. Appl. Math. 2011, 235, 1446–1453. [Google Scholar] [CrossRef]
  16. Mousa, A.A.; El_Desoky, I.M. Stability of Pareto optimal allocation of land reclamation by multistage decision-based multipheromone ant colony optimization. Swarm Evol. Comput. 2013, 13, 13–21. [Google Scholar] [CrossRef]
  17. Karaboga, D. An Idea Based on Honey Bee Swarm for Numerical Optimization; Technical Report-TR06; Erciyes University, Engineering Faculty, Computer Engineering Department: Kayseri, Turkey, 2005. [Google Scholar]
  18. Passino, K.M. Biomimicry of Bacteria Foraging for Distributed Optimization and Control. IEEE Control Syst. Mag. 2002, 22, 52–67. [Google Scholar]
  19. Zhao, W.; Wang, L. An effective bacterial foraging optimizer for global optimization. Inf. Sci. 2016, 329, 719–735. [Google Scholar] [CrossRef]
  20. Guo, L.; Meng, Z.; Sun, Y.; Wang, L. Parameter identification and sensitivity analysis of solar cell models with cat swarm optimization algorithm. Energy Convers Manag. 2016, 108, 520–528. [Google Scholar] [CrossRef]
  21. Marinaki, M.; Marinakis, Y. A Glowworm Swarm Optimization algorithm for the Vehicle Routing Problem with Stochastic Demands. Expert Syst. Appl. 2016, 46, 145–163. [Google Scholar] [CrossRef]
  22. Verma, S.; Mukherjee, V. Firefly algorithm for congestion management in deregulated environment. Eng. Sci. Technol. Int. J. 2016, 19, 1254–1265. [Google Scholar] [CrossRef] [Green Version]
  23. Zhou, Y.; Chen, X.; Zhou, G. An improved monkey algorithm for a 0–1 knapsack problem. Appl. Soft Comput. 2016, 38, 817–830. [Google Scholar] [CrossRef] [Green Version]
  24. Bolaji, A.L.; Al-Betar, M.A.; Awadallah, M.A.; Khader, A.T.; Abualigah, L.M. A comprehensive review: Krill Herd algorithm (KH) and its applications. Appl. Soft Comput. 2016, 49, 437–446. [Google Scholar] [CrossRef]
  25. Shehab, M.; Khader, A.T.; Laouchedi, M.; Alomari, O.A. Hybridizing cuckoo search algorithm with bat algorithm for global numerical optimization. J. Supercomput. 2019, 75, 2395–2422. [Google Scholar] [CrossRef]
  26. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  27. Farag, M.A.; El-Shorbagy, M.A.; Mousa, A.A.; El-Desoky, I.M. A New Hybrid Metaheuristic Algorithm for Multiobjective Optimization Problems. Int. J. Comput. Intell. Syst. 2020, 13, 920–940. [Google Scholar] [CrossRef]
  28. El-Shorbagy, M.A.; Farag, M.A.; Mousa, A.A.; El-Desoky, I.M. A Hybridization of Sine Cosine Algorithm with Steady State Genetic Algorithm for Engineering Design Problems. In Proceedings of the International Conference on Advanced Machine Learning Technologies and Applications AMLTA 2019, AISC 921, Cairo, Egypt, 28–30 March 2019; pp. 1–13. [Google Scholar]
  29. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper Optimisation Algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef]
  30. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  31. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium optimizer: A novel optimization algorithm. Knowl.-Based Syst. 2020, 191, 105190. [Google Scholar] [CrossRef]
  32. Ahmadianfar, I.; Bozorg-Haddad, O.; Chu, X. Gradient-based optimizer: A new Metaheuristic optimization algorithm. Inf. Sci. 2020, 540, 131–159. [Google Scholar] [CrossRef]
  33. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  34. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  35. Meng, O.K.; Pauline, O.; Kiong, S.C. A carnivorous plant algorithm for solving global optimization problems. Appl. Soft Comput. J. 2020, 98, 106833. [Google Scholar] [CrossRef]
  36. Gupta, S.; Deep, K.; Mirjalili, S.; Kim, J.H. A modified Sine Cosine Algorithm with novel transition parameter and mutation operator for global optimization. Expert Syst. Appl. 2020, 154, 113395. [Google Scholar] [CrossRef]
  37. Ghasemi, M.; Davoudkhani, I.F.; Akbari, E.; Rahimnejad, A.; Ghavidel, S.; Li, L. A novel and effective optimization algorithm for global optimization and its engineering applications: Turbulent Flow of Water-based Optimization (TFWO). Eng. Appl. Artif. Intell. 2020, 92, 103666. [Google Scholar] [CrossRef]
  38. Chen, H.; Li, W.; Yang, X. A whale optimization algorithm with chaos mechanism based on quasi-opposition for global optimization problems. Expert Syst. Appl. 2020, 158, 113612. [Google Scholar] [CrossRef]
  39. Chu, X.; Cai, F.; Gao, D.; Li, L.; Cui, J.; Xu, S.X.; Qin, Q. An artificial bee colony algorithm with adaptive heterogeneous competition for global optimization problems. Appl. Soft Comput. J. 2020, 93, 106391. [Google Scholar] [CrossRef]
  40. Wu, J.; Wang, Y.-G.; Burrage, K.; Tian, Y.-C.; Lawson, B.; Ding, Z. An improved firefly algorithm for global continuous optimization problems. Expert Syst. Appl. 2020, 149, 113340. [Google Scholar] [CrossRef]
  41. Li, N.; Wang, L. Bare-Bones Based Sine Cosine Algorithm for global optimization. J. Comput. Sci. 2020, 47, 101219. [Google Scholar] [CrossRef]
  42. Zhang, Y.; Jin, Z. Group teaching optimization algorithm: A novel metaheuristic method for solving global optimization problems. Expert Syst. Appl. 2020, 148, 113246. [Google Scholar] [CrossRef]
  43. Askari, Q.; Younas, I.; Saeed, M. Political Optimizer: A novel socio-inspired meta-heuristic for global optimization. Knowl.-Based Syst. 2020, 195, 105709. [Google Scholar] [CrossRef]
  44. Yimit, A.; Iigura, K.; Hagihara, Y. Refined selfish herd optimizer for global optimization problems. Expert Syst. Appl. 2020, 139, 112838. [Google Scholar] [CrossRef]
  45. Wei, X.; Yuan, S.; Ye, Y. Optimizing facility layout planning for reconfigurable manufacturing system based on chaos genetic algorithm. Prod. Manuf. Res. 2019, 7, 109–124. [Google Scholar] [CrossRef]
  46. Dash, S.; Thulasiram, R.; Thulasiraman, P. Modified Firefly Algorithm with Chaos Theory for Feature Selection: A Predictive Model for Medical Data. Int. J. Swarm Intell. Res. 2019, 10, 1–20. [Google Scholar] [CrossRef]
  47. Fuertes, G.; Vargas, M.; Alfaro, M.; Soto-Garrido, R.; Sabattin, J.; Peralta, M.A. Chaotic genetic algorithm and the effects of entropy in performance optimization. Chaos 2019, 29, 013132. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Mousa, A.A.; Qassm, S.M.; Alnefaie, A.A. Sinusoidal Chaotic Genetic Algorithm for Constrained Optimization: Recent Trends in Applied Optimization. Int. J. Eng. Res. Technol. 2019, 12, 2787–2802. [Google Scholar]
  49. El-Shorbagy, M.A.; Mousa, A.A.; Nasr, S.M. A chaos-based evolutionary algorithm for general non-linear programming problem. Chaos Solitons Fractals 2016, 85, 8–21. [Google Scholar] [CrossRef]
  50. Abo-Elnaga, Y.; Nasr, S.M.; El-Desoky, I.M.; Hendawy, Z.M.; Mousa, A.A. Enhanced Genetic Algorithm and Chaos Search for Bilevel Programming Problems. In Proceedings of the International Conference on Advanced Machine Learning Technologies and Applications, Cairo, Egypt, 28–30 March 2019; pp. 478–487. [Google Scholar]
  51. Abdel-Basset, M.; Mohamed, R.; Mirjalili, S. A Binary Equilibrium Optimization Algorithm for 0–1 Knapsack Problems. Comput. Ind. Eng. 2020. [Google Scholar] [CrossRef]
  52. Gupta, S.; Deep, K.; Mirjalili, S. An efficient equilibrium optimizer with mutation strategy for numerical optimization. Appl. Soft Comput. J. 2020, 96, 106542. [Google Scholar] [CrossRef]
  53. Rabehi, A.; Nail, B.; Helal, H.; Douara, A.; Ziane, A.; Amrani, M.; Akkal, B.; Benamara, Z. Optimal estimation of Schottky diode parameters using a novel optimization algorithm: Equilibrium optimizer. Superlattices Microstruct. 2020, 146, 106665. [Google Scholar] [CrossRef]
  54. Abdel-Basset, M.; Mohamed, R.; Mirjalili, S.; Chakrabortty, R.K.; Ryan, M.J. Solar photovoltaic parameter estimation using an improved equilibrium optimizer. Sol. Energy 2020, 209, 694–708. [Google Scholar] [CrossRef]
  55. Shaheen, A.M.; Elsayed, A.M.; El-Sehiemy, R.A.; Abdelaziz, A.Y. Equilibrium optimization algorithm for network reconfiguration and distributed generation allocation in power systems. Appl. Soft Comput. J. 2020, 98, 106867. [Google Scholar] [CrossRef]
  56. Suganthan, P.N.; Hansen, N.; Liang, J.J.; Deb, K.; Chen, Y.P.; Auger, A.; Tiwari, S. Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization; Kanpur Genetic Algorithms Laboratory (KanGAL) Report 2005005, No. 2005; Indian Institute of Technology Kanpur (IIT Kanpur): Kanpur, India, 2005; Available online: https://bee22.com/resources/Liang%20CEC2014.pdf (accessed on 20 October 2020).
  57. Sedlaczek, K.; Eberhard, P. Constrained Particle Swarm Optimization of Mechanical Systems. In Proceedings of the 6th World Congresses of Structural and Multidisciplinary Optimization, Rio de Janeiro, Brazil, 30 May–3 June 2005; Available online: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.60.4525&rep=rep1&type=pdf (accessed on 20 October 2020).
  58. Mousa, A.A.; El-Shorbagy, M.A.; Farag, M.A. Steady-State Sine Cosine Genetic Algorithm Based Chaotic Search for non-linear Programming and Engineering Applications. IEEE Access 2020, 8, 212036–212054. [Google Scholar] [CrossRef]
  59. Osman, M.S.; Abo-Sinna, M.A.; Mousa, A.A. A Combined Genetic Algorithm-Fuzzy Logic Controller (GA-FLC) In non-linear Programming. J. Appl. Math. Comput. 2005, 170, 821–840. [Google Scholar] [CrossRef]
  60. Osman, M.S.; Abo-Sinna, M.A.; Mousa, A.A. IT-CEMOP: An Iterative Co-evolutionary Algorithm for Multiobjective Optimization Problem with non-linear Constraints. J. Appl. Math. Comput. 2006, 183, 373–389. [Google Scholar] [CrossRef]
  61. El-Shorbagy, M.A.; Mousa, A.A. A Hybrid Optimization System Coupling Particle Swarm Optimization Algorithm and Genetic Algorithm Applied to non-linear Optimization Problems. Online J. Math. Stat. 2015, 6, 118–125. [Google Scholar]
  62. Chelouah, R.; Siarry, P. A continuous genetic algorithm designed for the global optimization of multimodal functions. J. Heurist. 2000, 6, 191–213. [Google Scholar] [CrossRef]
  63. Chelouah, R.; Siarry, P. Genetic and Nelder–Mead algorithms hybridized for a more accurate global optimization of continuous multiminima functions. Eur. J. Oper. Res. 2003, 148, 335–348. [Google Scholar] [CrossRef]
  64. Kao, Y.; Zahara, E. A hybrid genetic algorithm and particle swarm optimization for multimodal functions. Appl. Soft Comput. 2008, 8, 849–857. [Google Scholar] [CrossRef]
  65. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of IV IEEE International Conference on Neural Networks; IEEE: Piscataway, NJ, USA, 1995; pp. 1942–1948. [Google Scholar]
  66. Auger, A.; Hansen, N. A restart CMA evolution strategy with increasing population size. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, UK, 2–5 September 2005; pp. 1769–1776. [Google Scholar]
  67. Eshelman, L.J. The CHC adaptative search algorithm: How to have safe search when engaging in nontraditional genetic recombination. In Foundations of Genetic Algorithms; Rawlins, G.J.E., Ed.; Morgan Kaufmann: San Mateo, CA, USA, 1991; pp. 265–283. [Google Scholar]
  68. Eshelman, L.J.; Schaffer, J.D. Real-coded genetic algorithms and interval schemata. In Foundations of Genetic Algorithms; Whitley, D., Ed.; Morgan Kaufmann: San Mateo, CA, USA, 1993; pp. 187–202. [Google Scholar]
  69. Fernandes, C.; Rosa, A. A study of non-random matching and varying population size in genetic algorithm using a royal road function. In Proceedings of the 2001 Congress on Evolutionary Computation, Seoul, Korea, 27–30 May 2001; pp. 60–66. [Google Scholar]
  70. Mülenbein, H.; Schlierkamp-Voosen, D. Predictive models for the breeding genetic algorithm in continuous parameter optimization. Evol. Comput. 1993, 1, 25–49. [Google Scholar] [CrossRef]
  71. Herrera, F.; Lozano, M.; Molina, D. Continuous scatter search: An analysis of the integration of some combination methods and improvement strategies. Eur. J. Oper. Res. 2006, 169, 450–476. [Google Scholar] [CrossRef]
  72. Laguna, M.; Marti, R. Scatter Search. Methodology and Implementation in C; Kluwer Academic Publishers: Dordrecht, The Netherlands, 2003. [Google Scholar]
  73. Price, K.V.; Rainer, M.; Lampinen, J.A. Differential Evolution: A Practical Approach to Global Optimization; Springer: Berlin/Heidelberg, Germany, 2005. [Google Scholar]
  74. Qin, A.K.; Suganthan, P.N. Self-adaptive differential evolution algorithm for numerical optimization. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, Scotland, UK, 2–5 September 2005; Volume 2, pp. 1785–1791. [Google Scholar]
  75. Ben-Tal, A.; Eiger, G.; Gershovitz, V. Global minimization by reducing the duality gap. Math. Program. 1994, 63, 193–212. [Google Scholar] [CrossRef]
  76. Haverly, C. Studies of the behavior of recursion for the pooling problem. ACM-Sigmap Bull. 1978, 25, 19–28. [Google Scholar] [CrossRef]
  77. Lasdon, L.; Waren, A.; Sarkar, S.; Palacios, F. Solving the pooling problem using generalized reduced gradient and successive linear programming algorithms. ACM-Sigmap Bull. 1979, 26, 9–15. [Google Scholar] [CrossRef]
  78. Visweswaran, V.; Floudas, C.A. Computational results for an efficient implementation of the GOP algorithm and its variants. In Global Optimization in Engineering Design; Grossmann, I.E., Ed.; Kluwer Book Series in Nonconvex Optimization and Its Applications; Kluwer Academic: Dordrecht, The Netherlands, 1996; Chapter 4. [Google Scholar]
  79. Andrei, N. Non-Linear Optimization Applications Using the GAMS Technology; Springer: New York, NY, USA; Berlin/Heidelberg, Germany; Dordrecht, The Netherlands; London, UK, 2013; Available online: https://link.springer.com/book/10.1007/978-1-4614-6797-7 (accessed on 20 October 2020).
Figure 1. The pseudo-code of the chaotic local search (CLS).
Figure 2. Flowchart of the proposed algorithm.
Figure 3. The relative weight of each algorithm by rank.
Figure 4. Comparison between the different algorithms according to their ranks on the different problems.
Figure 5. Structure representation of the four-ingredient pooling problem.
Figure 6. Structure representation of the petrochemical pooling problem.
Table 1. The parameters setting.
Constrained Equilibrium Optimizer Algorithm | Chaotic Local Search (CLS)
Parameter | Value | Parameter | Value/Description
Number of particles | 50 | Chaotic mapping | Circle map
Maximum number of iterations T_max | 100 | Specified neighborhood radius (ε) | 1 × 10−6
Probability of generation (GP) | 0.5 | α, β | 0.5, 0.2
a1, a2 | 2, 1 | Chaos search iterations (L) | 100
Table 2. Calculated solution versus Global optimal solution.
Test Problem | F Optimal | Integrated PSO-GAs | H_PSO_GA | CEOA | CS-CEOA
RC | 0.397887 | 0.397887 | 0.397887 | 0.398019 | 0.397887
B2 | 0 | 0 | 0 | 0.001785 | 0
ES | −1 | −1 | −1 | −0.999993 | −1
GP | 3 | 3 | 3 | 3.0000478 | 3
SH | −186.7309 | −186.7309 | −186.7309 | −186.6298 | −186.731
DJ | 0 | 2.6022 × 10−64 | 0 | 9.3799 × 10−53 | 0
H3,4 | −3.86278 | −3.86343347 | −3.86343347787 | −3.861023 | −3.86278
H6,4 | −3.32237 | −3.322368 | −3.322368 | −3.32226 | −3.32237
S4,5 | −10.1532 | −10.1532 | −10.1532 | −10.0487 | −10.1532
S4,7 | −10.40294 | −10.402916 | −10.40291634 | −10.179683 | −10.403
S4,10 | −10.53641 | −10.5363855 | −10.53638559 | −9.998507 | −10.5365
R2 | 0 | 1.38584 × 10−21 | 1.5061 × 10−24 | 3.50704 × 10−13 | 1 × 10−30
R5 | 0 | 1.7476 × 10−11 | 1.7634 × 10−13 | 8.4143 × 10−4 | 0
R10 | 0 | 1.1367 × 10−9 | 2.3369 × 10−13 | 6.4923 × 10−5 | 0
Z2 | 0 | 1.8461 × 10−18 | 0 | 4.2805 × 10−14 | 0
Z5 | 0 | 3.8176 × 10−9 | 0 | 6.1409 × 10−6 | 0
Z10 | 0 | 2.0996 × 10−9 | 0 | 5.2118 × 10−7 | 0
Table 3. Results provided by CS-CEOA, CEOA, H_PSO_GA, Integrated PSO-GAs, CGA, CHA and GA-PSO.
Average Error
Test Function | CS-CEOA | CEOA | H_PSO_GA [61] | Integrated PSO-GAs [15] | CGA [62] | CHA [63] | GA-PSO [64]
RC | 0.0 | 0.0 | 0.0 | 4.59 × 10−7 | 0.0001 | 0.0001 | 0.00009
B2 | 0.0 | 0.0 | 0.0 | 1 × 10−25 | 0.0003 | 0.0000002 | 0.00001
ES | 0.0 | 0.0 | 0.0 | 1 × 10−30 | 0.0010 | 0.0010 | 0.00003
GP | 0.0 | 0.0 | 0.0 | −6.3060 × 10−14 | 0.0010 | 0.0010 | 0.00012
SH | 0.0 | 0.0 | 0.0 | 8.83064 × 10−6 | 0.0050 | 0.0050 | 0.00007
DJ | 0.0 | 0.0 | 0.0 | 8.443663 × 10−15 | 0.0002 | 0.0002 | 0.00004
H3,4 | 3 × 10−6 | 3 × 10−6 | 0.00002 | 0.00003 | 0.0050 | 0.0050 | 0.00020
H6,4 | 4 × 10−8 | 4 × 10−8 | 5 × 10−7 | 2 × 10−6 | 0.0400 | 0.0080 | 0.00024
S4,5 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1400 | 0.0090 | 0.00014
S4,7 | 0.000017 | 0.000017 | 0.000013 | 0.00002 | 0.1200 | 0.0100 | 0.00015
S4,10 | 0.000091 | 0.000091 | 0.000011 | 0.00002 | 0.1500 | 0.0150 | 0.00012
R2 | 1 × 10−30 | 1 × 10−30 | 1 × 10−32 | 1 × 10−30 | 0.0040 | 0.0040 | 0.00064
R5 | 0.0 | 0.0 | 1 × 10−25 | 1 × 10−20 | 0.1500 | 0.0180 | 0.00013
R10 | 0.0 | 0.0 | 1 × 10−20 | 1 × 10−18 | 0.0200 | 0.0080 | 0.00005
Z2 | 0.0 | 0.0 | 0.0 | 1 × 10−15 | 0.000003 | 0.000003 | 0.00005
Z5 | 0.0 | 0.0 | 0.0 | 1 × 10−17 | 0.0004 | 0.00006 | 0.00000
Z10 | 0.0 | 0.0 | 0.0 | 1 × 10−25 | 0.000001 | 0.000001 | 0.00000
Table 4. Error provided by constrained PSO, CEOA, and CS-CEOA.
Error = |Optimal Value − Best Found Value|
Benchmark Problem | CS-CEOA | CEOA | Constrained PSO [57]
P_1 | 10 × 10−30 | 10 × 10−17 | 5 × 10−4
P_2 | 1 × 10−12 | 1 × 10−4 | 2 × 10−5
P_3 | 0.00 | 0.00 | 0.00
P_4 | 10 × 10−3 | 0.01 | 1.76
P_5 | 0.00 | 10 × 10−9 | 0.00
P_6 | 0.00 | 10 × 10−8 | 0.00
P_7 | 0.00 | 10 × 10−5 | 0.00
P_8 | 0.00 | 10 × 10−11 | 0.00
Table 5. Average error of CEC’05 obtained by CS-CEOA versus other optimization algorithms.
Problem | PSO [65] | IPOP-CMA-ES [66] | CHC [67,68] | SSGA [69,70] | SS-BLX [71] | SS-Arit [72] | DE-Bin [73] | DE-Exp [73] | SaDE [74] | CS-CEOA
F_1 | 1.23 × 10−4 | 0 | 2.46 | 8.42 × 10−9 | 3.40 × 101 | 1.06 | 7.72 × 10−9 | 8.26 × 10−9 | 8.42 × 10−9 | 0.000
F_2 | 2.60 × 10−2 | 0 | 1.18 × 102 | 8.72 × 10−5 | 1.73 | 5.28 | 8.34 × 10−9 | 8.18 × 10−9 | 8.21 × 10−9 | 0.000
F_3 | 5.17 × 104 | 0 | 2.70 × 105 | 7.95 × 104 | 1.84 × 105 | 2.54 × 105 | 4.23 × 101 | 9.94 × 101 | 6.56 × 103 | 0.000
F_4 | 2.488 | 2.93 × 103 | 9.19 × 101 | 2.59 × 10−3 | 6.23 | 5.76 | 7.69 × 10−9 | 8.35 × 10−9 | 8.09 × 10−9 | 0.000
F_5 | 4.10 × 102 | 8.10 × 10−10 | 2.64 × 102 | 1.34 × 102 | 2.19 | 1.44 × 101 | 8.61 × 10−9 | 8.51 × 10−9 | 8.64 × 10−9 | 0.000
F_6 | 7.31 × 102 | 0 | 1.42 × 106 | 6.17 | 1.15 × 102 | 4.95 × 102 | 7.96 × 10−9 | 8.39 × 10−9 | 1.61 × 10−2 | 0.000
F_7 | 2.68 × 101 | 1.27 × 103 | 1.27 × 103 | 1.27 × 103 | 1.97 × 103 | 1.91 × 103 | 1.27 × 103 | 1.27 × 103 | 1.26 × 103 | 1.6231
F_8 | 2.043 × 101 | 2.00 × 101 | 2.03 × 101 | 2.04 × 101 | 2.04 × 101 | 2.04 × 101 | 2.03 × 101 | 2.04 × 101 | 2.03 × 101 | 2.025
F_9 | 1.44 × 101 | 2.84 × 101 | 5.89 | 7.29 × 10−9 | 4.20 | 5.96 | 4.55 | 8.15 × 10−9 | 8.33 × 10−9 | 5.523 × 10−9
F_10 | 1.40 × 101 | 2.33 × 101 | 7.12 | 1.71 × 101 | 1.24 × 101 | 2.18 × 101 | 1.23 × 101 | 1.12 × 101 | 1.55 × 101 | 1.7632
F_11 | 5.590 | 1.34 | 1.60 | 3.26 | 2.93 | 2.86 | 2.43 | 2.07 | 6.80 | 1.9390
F_12 | 6.36 × 102 | 2.13 × 102 | 7.06 × 102 | 2.79 × 102 | 1.51 × 102 | 2.41 × 102 | 1.06 × 102 | 6.31 × 101 | 5.63 × 101 | 5.98530
F_13 | 1.503 | 1.13 | 8.30 × 101 | 6.71 × 101 | 3.25 × 101 | 5.48 × 101 | 1.57 | 6.40 × 101 | 7.07 × 101 | 1.4434
F_14 | 3.304 | 3.78 | 2.07 × 101 | 2.26 | 2.80 | 2.97 | 3.07 | 3.16 | 3.42 | 2.7518
F_15 | 3.398 × 102 | 1.93 × 102 | 2.75 × 102 | 2.92 × 102 | 1.14 × 102 | 1.29 × 102 | 3.72 × 102 | 2.94 × 102 | 8.42 × 101 | 7.124 × 103
F_16 | 1.33 × 102 | 1.17 × 102 | 9.73 × 101 | 1.05 × 102 | 1.04 × 102 | 1.13 × 102 | 1.12 × 102 | 1.13 × 102 | 1.23 × 102 | 1.179 × 102
F_17 | 1.50 × 102 | 3.39 × 102 | 1.05 × 102 | 1.19 × 102 | 1.18 × 102 | 1.28 × 102 | 1.42 × 102 | 1.31 × 102 | 1.39 × 102 | 1.269 × 102
F_18 | 8.51 × 102 | 5.57 × 102 | 8.80 × 102 | 8.06 × 102 | 7.67 × 102 | 6.58 × 102 | 5.10 × 102 | 4.48 × 102 | 5.32 × 102 | 4.043 × 102
F_19 | 8.50 × 102 | 5.29 × 102 | 8.80 × 102 | 8.90 × 102 | 7.56 × 102 | 7.01 × 102 | 5.01 × 102 | 4.34 × 102 | 5.20 × 102 | 7.650 × 102
F_20 | 8.51 × 102 | 5.26 × 102 | 8.96 × 102 | 8.89 × 102 | 7.46 × 102 | 6.41 × 102 | 4.93 × 102 | 4.19 × 102 | 4.77 × 102 | 8.100 × 102
F_21 | 9.14 × 102 | 4.42 × 102 | 8.16 × 102 | 8.52 × 102 | 4.85 × 102 | 5.01 × 102 | 5.24 × 102 | 5.42 × 102 | 5.14 × 102 | 4.0111 × 102
F_22 | 8.07 × 102 | 7.65 × 102 | 7.74 × 102 | 7.52 × 102 | 6.83 × 102 | 6.94 × 102 | 7.72 × 102 | 7.72 × 102 | 7.66 × 102 | 7.505 × 102
F_23 | 1.03 × 102 | 8.54 × 102 | 1.08 × 103 | 1.0 × 103 | 5.74 × 102 | 5.83 × 102 | 6.34 × 102 | 5.82 × 102 | 6.51 × 102 | 1.201 × 102
F_24 | 4.12 × 103 | 6.10 × 102 | 2.96 × 102 | 2.36 × 102 | 2.51 × 102 | 2.01 × 102 | 2.06 × 102 | 2.02 × 102 | 2.00 × 102 | 3.040 × 102
F_25 | 5.10 × 102 | 1.82 × 103 | 1.76 × 103 | 1.75 × 103 | 1.79 × 102 | 1.80 × 102 | 1.74 × 103 | 1.74 × 103 | 1.74 × 103 | 4.120 × 102
Table 6. Average error ranking of the 25 CEC’05 problems for all algorithms.
Problem | PSO [65] | IPOP-CMA-ES [66] | CHC [67,68] | SSGA [69,70] | SS-BLX [71] | SS-Arit [72] | DE-Bin [73] | DE-Exp [73] | SaDE [74] | CS-CEOA
F_1 | 4 | 1 | 3 | 9 | 5 | 2 | 6 | 7 | 8 | 1
F_2 | 5 | 1 | 4 | 9 | 2 | 3 | 8 | 6 | 7 | 1
F_3 | 6 | 1 | 4 | 8 | 2 | 3 | 5 | 9 | 7 | 1
F_4 | 6 | 2 | 10 | 5 | 4 | 3 | 7 | 9 | 8 | 1
F_5 | 6 | 7 | 5 | 3 | 2 | 4 | 9 | 8 | 10 | 1
F_6 | 7 | 1 | 4 | 2 | 3 | 6 | 8 | 9 | 5 | 1
F_7 | 10 | 5 | 6 | 7 | 9 | 8 | 4 | 3 | 2 | 1
F_8 | 10 | 1 | 5 | 8 | 6 | 7 | 4 | 9 | 3 | 2
F_9 | 5 | 6 | 3 | 8 | 1 | 4 | 2 | 9 | 10 | 7
F_10 | 6 | 10 | 2 | 8 | 5 | 9 | 4 | 3 | 7 | 1
F_11 | 9 | 1 | 2 | 8 | 7 | 6 | 5 | 4 | 10 | 3
F_12 | 9 | 4 | 10 | 6 | 3 | 5 | 8 | 2 | 7 | 1
F_13 | 3 | 1 | 10 | 8 | 5 | 6 | 4 | 7 | 9 | 2
F_14 | 8 | 8 | 1 | 2 | 4 | 5 | 6 | 7 | 9 | 3
F_15 | 7 | 7 | 4 | 5 | 1 | 2 | 8 | 6 | 10 | 9
F_16 | 8 | 6 | 10 | 2 | 1 | 5 | 3 | 4 | 7 | 9
F_17 | 9 | 10 | 1 | 3 | 2 | 5 | 8 | 6 | 7 | 4
F_18 | 9 | 5 | 10 | 8 | 7 | 6 | 3 | 2 | 4 | 1
F_19 | 8 | 4 | 9 | 10 | 6 | 5 | 2 | 1 | 3 | 7
F_20 | 8 | 4 | 10 | 9 | 6 | 5 | 3 | 1 | 2 | 7
F_21 | 10 | 2 | 8 | 9 | 3 | 4 | 6 | 7 | 5 | 1
F_22 | 10 | 5 | 9 | 4 | 1 | 2 | 7 | 8 | 6 | 3
F_23 | 2 | 10 | 3 | 1 | 5 | 7 | 8 | 6 | 9 | 4
F_24 | 9 | 10 | 7 | 5 | 6 | 2 | 4 | 3 | 1 | 8
F_25 | 10 | 8 | 5 | 4 | 6 | 7 | 3 | 2 | 1 | 9
Table 7. Statistical frequency table of ranking values.
Method | R_1 | R_2 | R_3 | R_4 | R_5 | R_6 | R_7 | R_8 | R_9 | R_10
PSO [65] | 0 | 1 | 1 | 1 | 2 | 4 | 2 | 4 | 5 | 5
IPOP-CMA-ES [66] | 7 | 2 | 0 | 3 | 3 | 2 | 2 | 2 | 0 | 4
CHC [67,68] | 2 | 2 | 3 | 4 | 3 | 1 | 1 | 1 | 2 | 6
SSGA [69,70] | 1 | 3 | 2 | 2 | 3 | 1 | 1 | 7 | 4 | 1
SS-BLX [71] | 4 | 4 | 3 | 2 | 4 | 5 | 2 | 0 | 1 | 0
DE-Exp [73] | 2 | 3 | 3 | 2 | 0 | 4 | 4 | 2 | 5 | 0
DE-Bin [73] | 0 | 2 | 4 | 5 | 2 | 3 | 2 | 6 | 1 | 0
SS-Arit [72] | 0 | 4 | 3 | 3 | 6 | 4 | 3 | 1 | 1 | 0
SaDE [74] | 2 | 2 | 2 | 1 | 2 | 1 | 6 | 2 | 3 | 4
CS-CEOA | 11 | 2 | 3 | 2 | 0 | 0 | 3 | 1 | 3 | 0
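Table 7 tallies, for each algorithm, how many of the 25 CEC'05 problems it finishes at each rank in Table 6 (each row therefore sums to 25). A minimal sketch of that tally, shown on hypothetical toy data (three methods, three problems) rather than the paper's full rank matrix:

```python
def rank_frequency(rank_matrix, n_methods):
    """Count, for each method, how often it attains each rank across
    problems; freq[m][r-1] is the number of problems on which method m
    is ranked r. This reproduces the construction behind Table 7."""
    freq = [[0] * n_methods for _ in range(n_methods)]
    for ranks in rank_matrix:          # one row of per-method ranks per problem
        for m, r in enumerate(ranks):
            freq[m][r - 1] += 1
    return freq

# Toy data: 3 problems, 3 methods (illustrative only).
toy_ranks = [[1, 2, 3],
             [2, 1, 3],
             [1, 3, 2]]
toy_freq = rank_frequency(toy_ranks, 3)
```

Here method 0 is ranked first on two of the three toy problems, so its frequency row is [2, 1, 0]; each row sums to the number of problems, mirroring the row sums of 25 in Table 7.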
Table 8. Results of the petrochemical pooling problem presented in Figure 6.
Parameters | SNOPT [79] | MINOS [79] | KNITRO [79] | CONOPT [79] | CS-CEOA
q11 | 1 | 1 | 1 | 1 | 1
q21 | 0 | 0 | 1.8754 × 10−7 | 0 | 0
q41 | 0 | 0 | 6.6576 × 10−8 | 0 | 0
y11 | 50 | 50 | 50 | 50 | 50
y12 | 50 | 50 | 50 | 50 | 50
z31 | 50 | 50 | 50 | 50 | 50
z32 | 150 | 150 | 150 | 150 | 150
Objective function | 1300 | 1300 | 1299.9995 | 1300 | 1300
Table 9. Advantages and Disadvantages.
Advantages
  • Greater success at locating the global solution for small problems, and a near-global optimal solution for very large complex problems.
  • Does not require the problem to be convex, continuous, differentiable, or even well formulated.
  • Can be applied to discrete, continuous, and mixed variables.
Disadvantages
  • Time-consuming, especially for very large-scale and complex problems.
  • Mathematical theoretical analysis of convergence to the global solution is still at an early stage and needs further investigation.
  • Produces only a near-global solution for very large-scale complex problems.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
