Information Sciences, Volume 541, December 2020, Pages 345–361

Evolutionary algorithm with multiobjective optimization technique for solving nonlinear equation systems

https://doi.org/10.1016/j.ins.2020.06.042

Abstract

The challenge of solving nonlinear equation systems (NESs) is to locate multiple optimal solutions simultaneously in a single run. To address this issue, this paper proposes a novel algorithm that combines a diversity indicator, a multiobjective optimization technique, and a clustering technique. Firstly, a diversity indicator is designed to maintain the diversity of the population. Then, a K-means clustering-based selection strategy is introduced to locate the promising solutions. Finally, a local search is used to accelerate the convergence of the population. The experimental results on 30 nonlinear equation systems show that the proposed algorithm outperforms six state-of-the-art algorithms in terms of convergence rate and success rate.

Introduction

Nonlinear equation systems (NESs) appear in many areas such as robotics [1], physics [2], engineered materials [3], chemical processing [4], economics [5], and so on [6], [7]. In general, a NES can be defined as follows [8]:

$$\begin{cases} e_1(\mathbf{X}) = 0 \\ e_2(\mathbf{X}) = 0 \\ \;\;\vdots \\ e_m(\mathbf{X}) = 0 \end{cases}$$

where $\mathbf{X} = (x_1, x_2, \ldots, x_D) \in \mathbb{S}$ denotes a $D$-dimensional decision vector, $\mathbb{S} \subseteq \mathbb{R}^D$ is the decision space, the $i$th equation is $e_i(\mathbf{X}) = 0$ ($i \in \{1, 2, \ldots, m\}$), and $m$ is the number of equations. $\mathbb{S}$ can be defined as

$$\mathbb{S} = \prod_{j=1}^{D} [L_j, U_j]$$

where $L_j$ and $U_j$ denote the lower and upper bounds of $x_j$, respectively.

If $e_i(\mathbf{X}) = 0$ for all $i \in \{1, 2, \ldots, m\}$, then $\mathbf{X}$ is called an optimal solution (or a root) of the NES. Generally speaking, a NES may contain multiple optimal solutions, and all of these optimal solutions are equally important. Fig. 1 shows a NES (i.e., F03) that has two nonlinear equations and 15 optimal solutions. The main task of solving NESs is to locate these optimal solutions simultaneously in a single run.
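To make the notation concrete, the following minimal Python sketch defines a small hypothetical two-equation system (not one of the paper's benchmarks) as a residual function and checks whether a given point is a root; the tolerance `tol` is an illustrative choice, not a value from the paper.

```python
import numpy as np

def residuals(x):
    """Hypothetical 2-equation NES in D = 2 variables (not one of the paper's F01-F30)."""
    x1, x2 = x
    return np.array([
        x1**2 + x2**2 - 1.0,   # e1(X) = 0: points on the unit circle
        x1 - x2,               # e2(X) = 0: the line x1 = x2
    ])

def is_root(x, tol=1e-8):
    """X is a root iff every |e_i(X)| is (numerically) zero."""
    return bool(np.all(np.abs(residuals(x)) <= tol))

# This toy system has two roots; both are equally valid solutions of the NES.
print(is_root(np.array([ np.sqrt(0.5),  np.sqrt(0.5)])))  # True
print(is_root(np.array([-np.sqrt(0.5), -np.sqrt(0.5)])))  # True
```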

To solve NESs, many classical methods have been proposed based on traditional optimization [9], [10], [11], [12], such as the steepest descent method, the Newton method, the conjugate gradient method, the quasi-Newton method, and so on. However, these methods have some disadvantages, mainly concerning the initial point, differentiability, and convergence. Specifically, the initial point plays an important role in these methods: if it is not selected properly, the result may be poor. Moreover, these classical methods usually require gradient information, and only one optimal solution can be found in a single run. In contrast, evolutionary algorithms (EAs) can overcome these drawbacks and may locate multiple optimal solutions in a single run.

In general, the methods of solving NESs by EAs consist of two steps. In the first step, a transformation technique is designed to transform a NES into an optimization problem. In the next step, the transformed optimization problem is solved by an optimization algorithm.

Currently, the main transformation techniques fall into three categories: 1) the transformation technique based on single-objective optimization [13], [14], [15], [16], [17], [18], [19], [20], [21], [22]; 2) the transformation technique based on constrained optimization [23]; 3) the transformation technique based on multiobjective optimization [24], [25], [26], [27].

A single-objective optimization problem is built through the first transformation technique as follows [16]:

$$\min \sum_{i=1}^{m} e_i(\mathbf{X})^2 \quad \text{or} \quad \min \sum_{i=1}^{m} |e_i(\mathbf{X})|$$

Combined with this transformation technique, many algorithms have been proposed to deal with NESs. For example, Liao et al. [21] proposed a memetic niching-based evolutionary algorithm to solve NESs, and Ariyaratne et al. [28] developed a modified firefly algorithm for NESs.
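As a quick illustration of this first transformation, the sketch below turns any residual function into a scalar merit value; `residuals` stands for a user-supplied callable returning the vector $(e_1(\mathbf{X}), \ldots, e_m(\mathbf{X}))$, and the two functions mirror the two variants above.

```python
import numpy as np

def merit_squared(x, residuals):
    """First transformation, squared form: min sum_i e_i(X)^2."""
    return float(np.sum(residuals(x) ** 2))

def merit_abs(x, residuals):
    """Equivalent absolute-value form: min sum_i |e_i(X)|."""
    return float(np.sum(np.abs(residuals(x))))

# Any single-objective optimizer can now be applied; a global minimum of 0 marks a root.
```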

In addition, the second transformation technique transforms a NES into a constrained optimization problem as follows [23]:

$$\begin{aligned} \min \;\; & \sum_{i \in S_1} |e_i(\mathbf{X})| \\ \text{s.t.} \;\; & e_j(\mathbf{X}) = 0 \quad (j \in S_2) \end{aligned}$$

where $S_1 \subseteq \{1, 2, \ldots, m\}$ and $S_2 = \{1, 2, \ldots, m\} \setminus S_1$.
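A minimal sketch of this constrained view is given below; the index sets `s1` and `s2` and the callable `residuals` are illustrative inputs, and how the returned violations are handled is left to whichever constraint-handling technique the solver uses.

```python
import numpy as np

def constrained_transform(x, residuals, s1, s2):
    """Second transformation: minimize sum of |e_i(X)| over i in S1,
    treating the remaining equations (j in S2) as equality constraints."""
    e = residuals(x)
    objective = float(np.sum(np.abs(e[s1])))
    violations = np.abs(e[s2])  # a constraint handler should drive these to zero
    return objective, violations
```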

The third transformation technique transforms a NES into a multiobjective optimization problem as follows [24]:

$$\min \; \big( |e_1(\mathbf{X})|, |e_2(\mathbf{X})|, \ldots, |e_m(\mathbf{X})| \big)$$

This is a simple yet direct transformation, which treats each equation in a NES as an objective function. However, its performance decreases as the number of equations grows: the transformed problem has $m$ objectives [24] and may suffer from the "curse of dimensionality" when $m$ is large. To alleviate this problem, Song et al. [25] presented an algorithm called MONES, in which a NES is transformed into a biobjective optimization problem composed of two parts: the system function and the location function. This biobjective problem is given by

$$\begin{aligned} \min \; f_1(\mathbf{X}) &= x_1 + \sum_{i=1}^{m} |e_i(\mathbf{X})| \\ \min \; f_2(\mathbf{X}) &= 1 - x_1 + m \cdot \max\big(|e_1(\mathbf{X})|, \ldots, |e_m(\mathbf{X})|\big) \end{aligned}$$

where $x_1$ denotes the first decision variable.
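The MONES bi-objective values follow directly from the two formulas above; the sketch below is only an illustration, with `residuals` again standing for a user-supplied callable and `m` the number of equations.

```python
import numpy as np

def mones_objectives(x, residuals, m):
    """MONES transformation [25]: f1 uses the first decision variable as a
    location term, f2 mirrors it and penalizes the worst residual."""
    e = np.abs(residuals(x))
    f1 = x[0] + np.sum(e)
    f2 = 1.0 - x[0] + m * np.max(e)
    return f1, f2
```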

However, since only one decision variable is used to design the location function, MONES may lose some optimal solutions. Motivated by this observation, Gong et al. [27] proposed an algorithm called A-WeB, in which a weighted biobjective optimization problem is defined as follows:

$$\begin{aligned} \min \; f_1(\mathbf{X}) &= \frac{\sum_{j=1}^{D} \omega_j \cdot x_j}{\sum_{j=1}^{D} \omega_j} + \sum_{i=1}^{m} |e_i(\mathbf{X})| \\ \min \; f_2(\mathbf{X}) &= 1 - \frac{\sum_{j=1}^{D} \omega_j \cdot x_j}{\sum_{j=1}^{D} \omega_j} + \sum_{i=1}^{m} |e_i(\mathbf{X})| \end{aligned}$$

where $x_j$ denotes the $j$th decision variable, $j = 1, 2, \ldots, D$, and $\boldsymbol{\omega} = (\omega_1, \omega_2, \ldots, \omega_D)$ is the weight vector.
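Analogously, the A-WeB objectives can be computed from the weighted formulas above; in the sketch below the weight vector `weights` is assumed to be given (its construction in A-WeB is not shown here).

```python
import numpy as np

def aweb_objectives(x, residuals, weights):
    """A-WeB transformation [27]: a weighted combination of all decision
    variables replaces the single location variable used in MONES."""
    e_sum = np.sum(np.abs(residuals(x)))
    wx = np.dot(weights, x) / np.sum(weights)  # weighted location term
    return wx + e_sum, 1.0 - wx + e_sum
```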

In this paper, we present a new evolutionary algorithm with a multiobjective optimization technique for NESs, called MOPEA. In MOPEA, a NES is transformed into a single-objective optimization problem as shown in Eq. (4), and a diversity indicator is introduced to maintain the diversity of the population. The main contributions of this paper are as follows:

  • (1) To balance diversity and convergence, the multiobjective optimization technique is applied to solve NESs. Specifically, NSGA-II is employed to obtain a large set of candidate solutions, and these candidate solutions together provide an approximate view of the fitness landscape.

  • (2) A K-means clustering-based selection strategy is used to divide all candidate solutions into several subregions. This selection strategy can screen out the promising solutions and accelerate the convergence of the algorithm. Subsequently, a local search is designed to locate the optimal solutions quickly (a minimal sketch of this step is given after the list).
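The sketch below illustrates, under simplifying assumptions, how such a clustering-based selection and local search could be wired together: a set of candidate solutions (e.g., produced by NSGA-II) is partitioned with K-means, the best point of each cluster is kept, and each is refined with an off-the-shelf local optimizer. This is only an illustration of the idea, not the paper's exact procedure; the number of clusters and the choice of Nelder–Mead are arbitrary here.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.optimize import minimize

def select_and_refine(candidates, residuals, n_clusters=5):
    """candidates: array of shape (N, D) of decision vectors.
    Cluster them, keep the best point of each cluster (lowest summed residual),
    and polish it with a local search."""
    merit = lambda x: float(np.sum(np.abs(residuals(x))))
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(candidates)
    roots = []
    for k in range(n_clusters):
        members = candidates[labels == k]
        best = min(members, key=merit)                      # promising solution of this subregion
        res = minimize(merit, best, method="Nelder-Mead")   # local search for fast convergence
        roots.append(res.x)
    return np.array(roots)
```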

The rest of this paper is organized as follows. The basic background on multiobjective optimization and clustering is introduced in Section 2. Section 3 discusses the motivation of this work and describes the proposed algorithm in detail. The experimental studies and parameter analysis are presented in Section 4. Finally, Section 5 concludes this paper.


Multiobjective optimization (MOP)

In general, the form of a MOP can be defined as follows [25], [29]:

$$\min \; \mathbf{f}(\mathbf{X}) = \big(f_1(\mathbf{X}), f_2(\mathbf{X}), \ldots, f_M(\mathbf{X})\big)$$

where $\mathbf{X} = (x_1, x_2, \ldots, x_D) \in \mathbb{S}$ denotes a decision vector with $D$ decision variables and $\mathbb{S} \subseteq \mathbb{R}^D$ denotes the decision space. $\mathbf{f}(\mathbf{X}) \in \mathbb{Y}$ denotes an objective vector and $\mathbb{Y} \subseteq \mathbb{R}^M$ denotes the objective space. $M$ denotes the number of objectives.

For a MOP, two candidate solutions are compared by Pareto dominance. To be specific, let $\mathbf{X}_1$ and $\mathbf{X}_2$ be two candidate solutions; $\mathbf{X}_1$ is said to Pareto dominate $\mathbf{X}_2$ iff $f_i(\mathbf{X}_1) \le f_i(\mathbf{X}_2)$ for all $i \in \{1, 2, \ldots, M\}$ and $f_j(\mathbf{X}_1) < f_j(\mathbf{X}_2)$ for at least one $j \in \{1, 2, \ldots, M\}$.
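The dominance check itself is straightforward; a minimal sketch over objective vectors is shown below (this is the standard definition, not code from the paper).

```python
import numpy as np

def dominates(f_a, f_b):
    """True iff f_a Pareto-dominates f_b: no worse in every objective
    and strictly better in at least one."""
    f_a, f_b = np.asarray(f_a, dtype=float), np.asarray(f_b, dtype=float)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))
```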

Motivations

As mentioned above, this paper presents an algorithm for NESs based on a multiobjective optimization technique. The motivations of the proposed algorithm mainly include the following two points:

  • (1) The classical methods for solving a NES have some weaknesses. For instance, they depend on differentiability and are highly sensitive to the initial point. Furthermore, they easily get trapped in local optima and obtain only one optimal solution per run. Based on these …

Benchmark test functions

In this paper, 30 NESs (denoted as F01–F30) with different characteristics are chosen from the literature [17] to test the performance of different methods. Some of them are derived from the real world, such as the multiple steady states problem (F08) [38], the robot kinematics problem (F17) [39], and so on. Table 1 summarizes the 30 NESs. D is the number of decision variables, LE and NE are the numbers of linear and nonlinear equations, respectively, and NOP is the number of known optimal solutions.

Conclusion

Many real-world problems are NESs with multiple optimal solutions, and it is a challenge to locate these optimal solutions simultaneously in a single run. In this paper, MOPEA is presented for solving NESs. A NES is transformed into a single-objective optimization problem, and a diversity indicator is designed to maintain the diversity of the population. Then, NSGA-II is employed to obtain a large set of candidate solutions. Afterward, the selection strategy is applied to select the promising solutions from all the candidate solutions, and a local search is used to accelerate the convergence of the population.

CRediT author statement

Weifeng Gao: Writing - original draft. Yuting Luo: Writing - original draft. Jingwei Xu: Writing - original draft. Shengqi Zhu: Writing - original draft.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References (44)

  • M. Ariyaratne et al., Solving systems of nonlinear equations using a modified firefly algorithm (MODFA), Swarm Evol. Comput. (2019)
  • X.H. Wu et al., Mixed fuzzy inter-cluster separation clustering algorithm, Appl. Math. Model. (2011)
  • I. Tsoulos et al., On locating all roots of systems of nonlinear equations inside bounded domain using global optimization methods, Nonlinear Anal. Real World Appl. (2010)
  • W.F. Sacco et al., Finding all solutions of nonlinear systems using a hybrid metaheuristic with fuzzy clustering means, Appl. Soft Comput. (2011)
  • C. Wang et al., A new filled function method for an unconstrained nonlinear equation, J. Comput. Appl. Math. (2011)
  • N. Henderson et al., Calculation of critical points of thermodynamic mixtures with differential evolution algorithms, Ind. Eng. Chem. Res. (2010)
  • I. Amaya et al., Solution of the mathematical model of a nonlinear direct current circuit using particle swarm optimization, Dyna (2012)
  • J. Xu et al., Joint range and angle estimation using MIMO radar with frequency diverse array, IEEE Trans. Signal Process. (2015)
  • J. Xu et al., An adaptive range-angle-Doppler processing approach for FDA-MIMO radar using three-dimensional localization, IEEE J. Sel. Top. Signal Process. (2017)
  • M.A. Gomes-Ruggiero, Solving nonlinear systems of equations by means of quasi-Newton methods with a nonmonotone strategy, Optim. Methods Softw. (2012)
  • C.G. Broyden, A class of methods for solving nonlinear simultaneous equations, Math. Comput. (1965)
  • M.D. González-Lima et al., A Newton-like method for nonlinear system of equations, Numer. Algorithms (2009)