
A memetic algorithm with optimal recombination for the asymmetric travelling salesman problem


Abstract

We propose a new memetic algorithm with optimal recombination for the asymmetric travelling salesman problem (ATSP). The optimal recombination problem (ORP) is solved within a crossover operator based on a new exact algorithm that solves the ATSP on cubic digraphs. A new mutation operator makes random jumps in 3-opt or 4-opt neighborhoods. The initial population is constructed by means of greedy constructive heuristics. The 3-opt local search is used to improve the initial and the final populations. A computational experiment on TSPLIB instances shows that the proposed algorithm yields results competitive with those of other well-known algorithms for the ATSP and confirms that the ORP may be used successfully in memetic algorithms.


References

  1. Brown BW, Hollander M (1977) Statistics: a biomedical introduction. Wiley, New York

  2. Buriol LS, Franca PM, Moscato P (2004) A new memetic algorithm for the asymmetric traveling salesman problem. J Heuristics 10:483–506

  3. Burke EK, Cowling PI, Keuthen R (2001) Effective local and guided variable neighbourhood search methods for the asymmetric travelling salesman problem. In: Boers EJW (ed) EvoWorkshop 2001, applications of evolutionary computing, LNCS, vol 2037. Springer, Berlin, pp 203–212

  4. Dongarra JJ (2014) Performance of various computers using standard linear equations software. Technical report CS-89-85. University of Manchester

  5. Eppstein D (2007) The traveling salesman problem for cubic graphs. J Graph Algorithms Appl 11(1)

  6. Eremeev AV (2019) A restarting rule based on the Schnabel census for genetic algorithms. In: Battiti R, Brunato M, Kotsireas I, Pardalos PM (eds) Learning and intelligent optimization, LNCS, vol 11353. Springer, Cham, pp 337–351

  7. Eremeev AV, Kovalenko JV (2014) Optimal recombination in genetic algorithms for combinatorial optimization problems: part II. Yugoslav J Oper Res 24(2):165–186

  8. Eremeev AV, Kovalenko JV (2016) Experimental evaluation of two approaches to optimal recombination for permutation problems. In: Chicano F, Hu B, García-Sánchez P (eds) Evolutionary computation in combinatorial optimization, LNCS, vol 9595. Springer, Cham, pp 138–153

  9. Eremeev AV, Kovalenko YV (2018) Genetic algorithm with optimal recombination for the asymmetric travelling salesman problem. In: Lirkov I, Margenov S (eds) Large-scale scientific computing, LNCS, vol 10665. Springer, Cham, pp 341–349

  10. Freisleben B, Merz P (1996) A genetic local search algorithm for solving symmetric and asymmetric traveling salesman problems. In: IEEE international conference on evolutionary computation. IEEE Press, pp 616–621

  11. Garey MR, Johnson DS (1979) Computers and intractability: a guide to the theory of NP-completeness. W. H. Freeman and Company, San Francisco

  12. Goldberg D, Thierens D (1994) Elitist recombination: an integrated selection recombination GA. In: First IEEE world congress on computational intelligence, vol 1. IEEE Service Center, Piscataway, pp 508–512

  13. Johnson DS, McGeoch LA (1997) The traveling salesman problem: a case study. In: Aarts E, Lenstra JK (eds) Local search in combinatorial optimization. Wiley, New York, pp 215–336

  14. Kanellakis PC, Papadimitriou CH (1980) Local search for the asymmetric traveling salesman problem. Oper Res 28:1086–1099

  15. Mood AM, Graybill FA, Boes DC (1974) Introduction to the theory of statistics. McGraw-Hill, New York

  16. Nagata Y, Soler D (2012) A new genetic algorithm for the asymmetric TSP. Expert Syst Appl 10:8947–8953

  17. Neri F, Cotta C (2012) Memetic algorithms and memetic computing optimization: a literature review. Swarm Evol Comput 2:1–14

  18. Neri F, Cotta C, Moscato P (2012) Handbook of memetic algorithms. Springer, Berlin, Heidelberg

  19. Neri F, Toivanen J, Cascella GL, Ong YS (2007) An adaptive multimeme algorithm for designing HIV multidrug therapies. IEEE/ACM Trans Comput Biol Bioinf 4:264–278

  20. Norman M, Moscato P (1991) A competitive and cooperative approach to complex combinatorial search. In: 20th joint conference on informatics and operations research. Buenos Aires, pp 3.15–3.29

  21. Radcliffe NJ (1994) The algebra of genetic algorithms. Ann Math Artif Intell 10(4):339–384

  22. Reeves CR (1997) Genetic algorithms for the operations researcher. INFORMS J Comput 9(3):231–250

  23. Reeves CR, Eremeev AV (2004) Statistical analysis of local search landscapes. J Oper Res Soc 55(7):687–693

  24. Rego C, Gamboa D, Glover F (2016) Doubly-rooted stem-and-cycle ejection chain algorithm for the asymmetric traveling salesman problem. Networks 68(1):23–33

  25. Reinelt G (1991) TSPLIB—a traveling salesman problem library. ORSA J Comput 3(4):376–384

  26. Tinós R, Whitley D, Ochoa G (2014) Generalized asymmetric partition crossover (GAPX) for the asymmetric TSP. In: The 2014 annual conference on genetic and evolutionary computation. ACM, New York, pp 501–508

  27. Turkensteen M, Ghosh D, Goldengorin B, Sierksma G (2008) Tolerance-based branch and bound algorithms for the ATSP. Eur J Oper Res 189:775–788

  28. Whitley D, Starkweather T, Shaner D (1991) The traveling salesman and sequence scheduling: quality solutions using genetic edge recombination. In: Davis L (ed) Handbook of genetic algorithms. Van Nostrand Reinhold, New York, pp 350–372

  29. Xing LN, Chen YW, Yang KW, Hou F, Shen XS, Cai HP (2008) A hybrid approach combining an improved genetic algorithm and optimization strategies for the asymmetric TSP. Eng Appl Artif Intell 21(8):1370–1380

  30. Yagiura M, Ibaraki T (1996) The use of dynamic programming in genetic algorithms for permutation problems. Eur J Oper Res 92:387–401

  31. Zhang W (2000) Depth-first branch-and-bound versus local search: a case study. In: 17th national conference on artificial intelligence. Austin, pp 930–935


Acknowledgements

The research was supported by the Russian Science Foundation (the work on Sect. 2 was supported by Grant 15-11-10009, the work on the other sections was supported by Grant 17-18-01536). The authors thank Prof. Yuri Kochetov for his valuable advice on the usage of local search.

Author information


Correspondence to Yulia V. Kovalenko.


Appendix

The appendix contains an outline of the algorithm of Eppstein from [5], a proof of Lemma 1 (which uses some ideas from that algorithm), and the details of the restarting rule from [6].

1.1 Algorithm of D. Eppstein

For the sake of completeness, here we present the recursive algorithm of D. Eppstein proposed in [5] for listing all Hamiltonian cycles in a graph of maximum degree three.

Let \(G=(V,E)\) be a simple graph with maximum degree three, let F be the set of forced edges and \({U:=E\backslash F}\). The algorithm of D. Eppstein enumerates all Hamiltonian cycles of G containing all forced edges, as shown in Algorithm 2. At each stage we first reduce the size of the input graph without branching (Step 1). Then we choose an edge to branch on and consider two subproblems (Step 2): one in which the edge is forced to be in the solution, and one in which it is excluded. These two subproblems are solved recursively. Each step of the algorithm either returns or reduces the input graph to one or two smaller cubic graphs. A simplified code sketch of the recursion is given after the listing.

Algorithm 2. Listing all Hamiltonian Cycles Given the Forced Edges

Input: Simple cubic graph G with a set of forced edges F.

Output: All Hamiltonian cycles of the graph G, containing all forced edges.

  1. Repeat the following steps until one of the steps returns or none of them applies:

     (a) If G contains a vertex with degree zero or one, or if F contains three edges meeting at a vertex, backtrack.

     (b) If G contains only three vertices and F consists of a Hamiltonian cycle, output the cycle and backtrack.

     (c) If G contains a vertex with degree two, add its incident edges to F.

     (d) If G contains a triangle xyz, and the non-triangle edge incident to x belongs to F, add edge \(\{y,z\}\) to F.

     (e) If F contains exactly two edges \(\{x,y\}\) and \(\{y,z\}\) meeting at some vertex y, remove that vertex and any other edge incident to it from G, and replace the two edges by a single edge \(\{x,z\}\). If this contraction would lead to two parallel edges in G, remove the other edge from G.

  2. If F is nonempty, let \(\{x,y\}\) be any edge in F and \(\{y,z\}\) be an adjacent edge in \(G\backslash F\). Otherwise, if F is empty, let \(\{y,z\}\) be any edge in G. Call the algorithm recursively on the two graphs \(G,\ F \cup \{y,z\}\) and \(G \backslash \{y,z\},\ F\).
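For illustration, the following is a minimal Python sketch of this recursion; the names ham_cycles, adj and forced are our own. It keeps the pruning and degree-two rules (Steps 1a–1c) and the branching rule (Step 2), but omits the triangle and contraction reductions (Steps 1d–1e); it therefore lists the same cycles, yet does not attain the running-time bound of Theorem 2 below.

```python
def _trace(adj, forced):
    """Return the vertex order if the forced edges form one Hamiltonian cycle, else None."""
    nbr = {v: [u for u in adj[v] if frozenset((u, v)) in forced] for v in adj}
    start = next(iter(adj))
    cycle, prev, cur = [start], None, start
    while True:
        a, b = nbr[cur]
        nxt = a if a != prev else b
        if nxt == start:
            return cycle if len(cycle) == len(adj) else None
        cycle.append(nxt)
        if len(cycle) > len(adj):
            return None
        prev, cur = cur, nxt


def ham_cycles(adj, forced=frozenset()):
    """List all Hamiltonian cycles of the graph `adj` (vertex -> set of neighbours)
    that contain every edge of `forced` (a set of frozenset({u, v}) edges)."""
    adj = {v: set(nb) for v, nb in adj.items()}   # work on copies
    forced = set(forced)

    def fdeg(v):
        return sum(1 for u in adj[v] if frozenset((u, v)) in forced)

    while True:                                    # Step 1: reductions
        # Step 1a: no Hamiltonian cycle can pass through such a vertex
        if any(len(adj[v]) <= 1 or fdeg(v) >= 3 for v in adj):
            return
        # Counterpart of Step 1b: every vertex is saturated by forced edges
        if all(fdeg(v) == 2 for v in adj):
            cycle = _trace(adj, forced)
            if cycle is not None:                  # a single cycle through all vertices
                yield cycle
            return
        # Step 1c: both edges of a degree-two vertex must be used
        new = [frozenset((v, u)) for v in adj if len(adj[v]) == 2
               for u in adj[v] if frozenset((v, u)) not in forced]
        if not new:
            break
        forced.update(new)

    # Step 2: branch on an unforced edge, preferably one adjacent to a forced edge
    pick = None
    for e in forced:
        for v in e:
            for z in adj[v]:
                if frozenset((v, z)) not in forced:
                    pick = frozenset((v, z))
                    break
            if pick:
                break
        if pick:
            break
    if pick is None:                               # F is empty: take any unforced edge
        pick = next(frozenset((v, u)) for v in adj for u in adj[v]
                    if frozenset((v, u)) not in forced)

    yield from ham_cycles(adj, forced | {pick})    # the edge is forced into the cycle
    y, z = tuple(pick)
    adj[y].discard(z)
    adj[z].discard(y)
    yield from ham_cycles(adj, forced)             # the edge is excluded
```

For example, list(ham_cycles({1: {2, 3, 4}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {1, 2, 3}})) should list the three Hamiltonian cycles of the complete graph on four vertices.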

Theorem 2 is a combination of results proved in [5].

Theorem 2

[5] The number of Hamiltonian cycles in \(G=(V,F\cup U)\) is \(O(2^{|U|/4})\) in the worst case, and these cycles can be listed by means of Algorithm 2 in \(O(2^{|U|/4})\) time and linear space. Each step of Algorithm 2 can be implemented in constant time.

1.2 Proof of Lemma 1 from Sect. 2.1

Consider the forced ATSP on a directed cubic graph \({\mathcal G}=({{\mathcal {V}}},{{\mathcal {A}}})\), where \({{\mathcal {V}}}\) is the set of vertices and \({{\mathcal {A}}}\) is the set of arcs. Each vertex \(v\in {{\mathcal {V}}}\) has degree at most three. Every arc \((v_i,v_j)\in {{\mathcal {A}}}\) is associated with a length (weight) \(c(v_i,v_j)\ge 0\). Let \({{\mathcal {F}}}\) be the set of forced arcs in the graph \({{\mathcal {G}}}\). We propose an exact algorithm for finding a minimum length Hamiltonian circuit in \({{\mathcal {G}}}\).

Initially, the graph \({{\mathcal {G}}}\) is modified by means of the following preprocessing procedure. For each arc \((u,v)\in {{\mathcal {A}}}\), we check whether the opposite arc \((v,u)\) is present in \({{\mathcal {A}}}\). If such an arc \((v,u)\) is not present in \({{\mathcal {A}}}\), then we add it to the graph and set \({c(v,u):=+\infty }\). If such an arc \((v,u)\) is in \({{\mathcal {A}}}\), then we distinguish the following four cases:

  1) if \((u,v)\in {{\mathcal {F}}}\) and \((v,u)\in {{\mathcal {F}}}\), then we report that the forced ATSP on digraph \({{\mathcal {G}}}\) is infeasible;

  2) if \((u,v)\in {{\mathcal {F}}}\), but \((v,u)\notin {{\mathcal {F}}}\), then \({c(v,u):=+\infty }\);

  3) if \((v,u)\in {{\mathcal {F}}}\), but \((u,v)\notin {{\mathcal {F}}}\), then \({c(u,v):=+\infty }\);

  4) if \((u,v)\notin {{\mathcal {F}}}\) and \((v,u)\notin {{\mathcal {F}}}\), then do nothing.

If an arc \((u,v)\) is forced in \({{\mathcal {G}}}\) after the preprocessing, then the length of \((v,u)\) is infinite. A solution to the modified problem may be obtained through enumeration of all feasible solutions to the TSP with forced edges on a supplementary graph \({\bar{G}} =({{\mathcal {V}}}, {\bar{E}})\), where a pair of vertices u, v is connected iff these vertices were connected by a pair of arcs in the digraph \({{\mathcal {G}}}\). An edge \(\{u, v\}\in {\bar{E}}\) is assumed to be forced if \((u, v)\in {{\mathcal {F}}}\) or \((v, u)\in {{\mathcal {F}}}\). The set of forced edges in \({\bar{G}}\) is denoted by \({\bar{F}}\). Given a set of forced arcs \({{\mathcal {F}}}\), a minimum length Hamiltonian circuit in \({{\mathcal {G}}}\) may be found using Algorithm 3, which is an adaptation of the algorithm proposed by Eppstein [5] (see Algorithm 2). A code sketch of the preprocessing and of the construction of \({\bar{G}}\) is given below.
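The following sketch implements the preprocessing and the construction of \({\bar{G}}\) and \({\bar{F}}\), assuming the digraph is given as a dictionary mapping arcs (u, v) to lengths c(u, v); the function name preprocess and the data layout are illustrative choices, not the authors' implementation.

```python
import math

def preprocess(arcs, forced):
    """Preprocessing for the forced ATSP on a cubic digraph (a sketch).
    `arcs` maps (u, v) to its length c(u, v); `forced` is the set of forced arcs.
    Returns the completed cost dictionary, the supplementary undirected graph
    (adjacency sets) and its forced edge set, or None if the instance is infeasible."""
    cost = dict(arcs)
    for (u, v) in list(arcs):
        if (v, u) not in cost:
            cost[(v, u)] = math.inf            # add the missing opposite arc
        else:
            if (u, v) in forced and (v, u) in forced:
                return None                    # case 1: both directions forced, infeasible
            if (u, v) in forced:
                cost[(v, u)] = math.inf        # case 2
            elif (v, u) in forced:
                cost[(u, v)] = math.inf        # case 3
            # case 4: neither arc is forced, keep both lengths
    # supplementary undirected graph \bar G and its forced edge set \bar F
    adj, fbar = {}, set()
    for (u, v) in cost:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
        if (u, v) in forced or (v, u) in forced:
            fbar.add(frozenset((u, v)))
    return cost, adj, fbar
```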

Algorithm 3. Finding a Minimum Length Hamiltonian Circuit Given the Forced Arcs

Input: Directed cubic graph \({{\mathcal {G}}}\) and the corresponding undirected graph \({\bar{G}}\) with a set of forced edges \({\bar{F}}\).

Output: A minimum length Hamiltonian circuit of the graph \({{\mathcal {G}}}\), containing all forced arcs, if such a circuit exists.

  1. Let \(c_{\mathrm {bst}}\) denote the best found value of the objective function. Initially set \(c_{\mathrm {bst}}:=+\infty \).

  2. Repeat the following steps until one of the steps returns or none of them applies:

     (a) If \({\bar{G}}\) contains a vertex with degree zero or one, or if \({\bar{F}}\) contains three edges meeting at a vertex, backtrack.

     (b) If graph \({\bar{G}}\) contains only three vertices and \({\bar{F}}\) consists of a Hamiltonian cycle, compute the lengths of the circulations within this cycle in both directions using arc lengths from graph \({{\mathcal {G}}}\). If necessary, update the value of \(c_{\mathrm {bst}}\) and the circulation corresponding to this value, and backtrack. Deleted vertices of the original input graph \({{\mathcal {G}}}\) will be added to the best found circuit at backtracking stages.

     (c) If \({\bar{G}}\) contains a vertex with degree two, add its incident edges to \({\bar{F}}\).

     (d) If \({\bar{G}}\) contains a triangle xyz, and the non-triangle edge incident to x belongs to \({\bar{F}}\), add edge \(\{y,z\}\) to \({\bar{F}}\).

     (e) If \({\bar{F}}\) contains exactly two edges \(\{x,y\}\) and \(\{y,z\}\) meeting at some vertex y, add edge \(\{x,z\}\) to \({\bar{G}}\), and add arcs \((x,z)\), \((z,x)\) to \({{\mathcal {G}}}\). If this leads to two parallel edges in \({\bar{G}}\) (or parallel arcs in \({{\mathcal {G}}}\)), remove the other edge from \({\bar{G}}\) (or the other arc from \({{\mathcal {G}}}\)). The lengths of arcs \((x,z)\) and \((z,x)\) are computed as \(c(x,z):=c(x,y)+c(y,z)\) and \(c(z,x):=c(z,y)+c(y,x)\). Remove vertex y from \({\bar{G}}\) and \({{\mathcal {G}}}\).

  3. If \({\bar{F}}\) is nonempty, let \(\{x,y\}\) be any edge in \({\bar{F}}\) and let \(\{y,z\}\) be an adjacent edge in \({\bar{G}}\backslash {\bar{F}}\). Otherwise, if \({\bar{F}}\) is empty, let \(\{y,z\}\) be any edge in \({\bar{G}}\). Call the algorithm recursively on the following two input data sets: \({{\mathcal {G}}}\), \({\bar{G}}\), \({\bar{F}} \cup \{y,z\}\) and \({{\mathcal {G}}}\backslash \{(y,z), (z,y)\}\), \({\bar{G}}\backslash \{y,z\}\), \({\bar{F}}\).

  4. If \(c_{\mathrm {bst}}<+\infty \), then return the Hamiltonian circuit corresponding to \(c_{\mathrm {bst}}\). Otherwise, report that no feasible circuit exists.

Algorithm 3 is recursive: at each call it either returns or reduces the input graph to two smaller cubic graphs by choosing arcs to branch on. In contrast to the algorithm from [5], our algorithm operates on a directed graph. It enumerates all Hamiltonian circuits of this graph and finds one of minimum length. Due to the structure of the initial graph \({{\mathcal {G}}}\), at most one circuit of finite length on three vertices is identified each time Step 2b is reached. This circuit corresponds to a circulation in the initial graph \({{\mathcal {G}}}\) containing all forced arcs.

Whenever backtracking occurs in Algorithm 3, all vertices, edges and arcs deleted at the current stage are restored, and these vertices are added to the best found circuit if needed. This requires the same number of additional steps as have been performed at the current stage. So, each step of Algorithm 3 has time complexity O(1), and the total number of steps is \(O(2^{ (|{\bar{E}}|-|{\bar{F}}|)/4 })\), as for the algorithm from [5] (see Theorem 2). The time complexity of the preprocessing is just \(O(|{\mathcal V}|)\); therefore, Lemma 1 holds.
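Combining the two sketches above gives a simple brute-force reference solver for the forced ATSP on a cubic digraph. It is not Algorithm 3 itself (Algorithm 3 interleaves the cost evaluation with the branch-and-reduce search), but under the stated assumptions it returns the same optimal value on small instances and can be used for testing. Note that, because the reverse of every forced arc has infinite length after preprocessing, the circulation violating a forced arc is discarded automatically.

```python
import math

def solve_forced_atsp(arcs, forced):
    """Brute-force reference for the forced ATSP on a cubic digraph (a sketch),
    reusing preprocess() and ham_cycles() from the sketches above: preprocess,
    enumerate the Hamiltonian cycles of the supplementary graph, evaluate both
    circulations of each cycle with the directed costs and keep the best."""
    pre = preprocess(arcs, forced)
    if pre is None:
        return None                                       # conflicting forced arcs
    cost, adj, fbar = pre
    best_len, best_tour = math.inf, None
    for cyc in ham_cycles(adj, fbar):
        for tour in (cyc, cyc[::-1]):                     # the two circulations of the cycle
            length = sum(cost[(tour[i], tour[(i + 1) % len(tour)])]
                         for i in range(len(tour)))
            if length < best_len:
                best_len, best_tour = length, tour
    return (best_tour, best_len) if best_len < math.inf else None
```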

1.3 Restarting rule

The Schnabel census method was developed in biometrics for the statistical estimation of the size of animal populations. According to this method, one takes repeated samples of size \(n_0\) from a population and counts the number of distinct animals seen. It is often assumed that the probability of catching any particular animal is the same. The sampled animals are marked, unless they were marked previously, and returned to the population. Then a statistical estimate of the total number \(\nu \) of individuals in the population is computed on the basis of the total number of distinct animals marked in all the samples. In this paper, we apply the Schnabel census method to estimate the number of values that a discrete random variable may take with non-zero probability.

Let a parameter r define the length of the historical period considered for statistical analysis in the MA restart rule. Given a value of r, we assume that during the r latest iterations all new offspring in the MA obeyed the same distribution, so they may be treated as the sampled animals in the Schnabel census method. Then we apply the Schnabel census method in order to estimate the number \(\nu \) of different solutions that may be visited with positive probability, assuming that the current distribution of offspring remains unchanged.

We also assume that in the latest r iterations of the MA, the observed sample consists of r independent offspring solutions. Let us define the random variable K as the number of distinct solutions in this sample. We make another simplifying assumption that all solutions that may be generated with the current distribution have equal probabilities. Then, for any fixed \(\nu \), the random variable K has the following distribution:

$$\begin{aligned} \Pr \{K=k\} = \frac{\nu !}{(\nu -k)!}\frac{S(r,k)}{\nu ^r} , \end{aligned}$$

where \(S(r,k)=\frac{1}{k!} \sum _{s=0}^k (-1)^s \binom{k}{s} (k-s)^r\) is the Stirling number of the second kind. This distribution is also known as the Arfwedson distribution. The maximum likelihood estimate \({\hat{\nu }}^{\mathrm{ML}}\) of the unknown \(\nu \) is

$$\begin{aligned} {\hat{\nu }}(r,k)=\mathop {\mathrm {argmax}}\limits _{\nu \ge k} \left\{ \frac{\nu !}{(\nu -k)!\, \nu ^r}\right\} , \end{aligned}$$
(1)

where k is the number of different solutions actually generated over the latest r iterations. The value \({\hat{\nu }}^{\mathrm{ML}}={\hat{\nu }}(r,k)\) may be found from (1) by standard one-dimensional optimization methods; a short code sketch is given below. The technical details as well as further literature references may be found, e.g., in [23].
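A sketch of this computation follows (the function name is our illustrative choice). The log-likelihood \(\log L(\nu )=\sum _{i=0}^{k-1}\log (\nu -i)-r\log \nu \) is unimodal in \(\nu \ge k\), so an upward scan may stop as soon as it starts to decrease; when \(k=r\), every sampled offspring was distinct, the likelihood increases without bound and no finite estimate exists.

```python
import math

def schnabel_ml_estimate(r, k):
    """Maximum likelihood estimate (1) of nu, given that k distinct solutions
    were observed among r sampled offspring.  Returns None when k == r (or k < 1),
    since then the likelihood has no finite maximizer."""
    if not 0 < k < r:
        return None
    def log_l(nu):
        # log L(nu) = sum_{i=0}^{k-1} log(nu - i) - r * log(nu)
        return sum(math.log(nu - i) for i in range(k)) - r * math.log(nu)
    best_nu, best_val = k, log_l(k)
    nu = k + 1
    while True:
        val = log_l(nu)
        if val < best_val:
            return best_nu          # the log-likelihood is unimodal: past the maximum
        best_nu, best_val = nu, val
        nu += 1
```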

In this paper, we use the restart rule proposed in [6], which restarts the algorithm as soon as the estimate \({\hat{\nu }}^{\mathrm{ML}}\) becomes equal to k. The rationale behind this rule is that once the equality \({\hat{\nu }}^{\mathrm{ML}}=k\) is satisfied, most likely there are no more non-visited solutions in the area where the GA population has spent the latest r iterations. In such a situation, it is more appropriate to restart the GA than to wait until the population distribution is changed significantly by the evolutionary mechanisms.

The value of the parameter r is chosen adaptively as follows: whenever the best found solution is improved, we set r to the population size N. If the best incumbent has not been improved during the latest 2r iterations, then the value of r is doubled. We reset r to the population size upon an improvement because an improved incumbent indicates that the population has reached a new, unexplored area, so the length of the historical period used for the analysis should be reduced. To reduce the CPU cost, the restart condition is checked only when the value of r is updated. A sketch of this bookkeeping is given below.
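A minimal sketch of this bookkeeping, reusing schnabel_ml_estimate from the previous sketch. It assumes the MA main loop reports each new offspring together with a flag indicating whether the best found solution was improved; the class name and the exact moment at which the estimate is evaluated are our guesses at the implementation details, not the authors' code.

```python
class RestartRule:
    """Adaptive restart check (a sketch).  `n_pop` is the population size N."""

    def __init__(self, n_pop):
        self.n_pop = n_pop
        self.r = n_pop            # length of the historical period
        self.stall = 0            # iterations since the last improvement
        self.history = []         # offspring seen so far (e.g. tours as tuples)

    def offspring_seen(self, tour, improved):
        """Record one offspring; return True if the MA should be restarted."""
        self.history.append(tuple(tour))
        if improved:
            self.r = self.n_pop   # a new area was reached: shorten the history
            self.stall = 0
            return False
        self.stall += 1
        if self.stall < 2 * self.r:
            return False          # the rule is checked only when r is updated
        self.r *= 2
        self.stall = 0
        sample = self.history[-self.r:]
        k = len(set(sample))
        estimate = schnabel_ml_estimate(len(sample), k)
        return estimate is not None and estimate == k
```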

Cite this article

Eremeev, A.V., Kovalenko, Y.V. A memetic algorithm with optimal recombination for the asymmetric travelling salesman problem. Memetic Comp. 12, 23–36 (2020). https://doi.org/10.1007/s12293-019-00291-4
