A new particle swarm optimization algorithm for noisy optimization problems

Abstract

We propose a new particle swarm optimization (PSO) algorithm for problems whose objective functions are subject to zero-mean, independent, and identically distributed stochastic noise. While PSO has been applied successfully to many complex deterministic nonlinear optimization problems, straightforward applications of PSO to noisy optimization problems are prone to failure because noise in the objective function values can lead the algorithm to incorrectly identify positions as the global/personal best positions. Instead of having the entire swarm follow a single global best position selected by the sample average of objective function values, the proposed algorithm works with a set of statistically global best positions, that is, one or more positions whose objective function values are statistically equivalent; this set is constructed using a combination of statistical subset selection and clustering analysis. The new PSO algorithm can be seamlessly integrated with adaptive resampling procedures to further enhance its ability to cope with noisy objective functions. Numerical experiments demonstrate that, across different resampling procedures, the new algorithm consistently finds better solutions than the canonical PSO algorithm in the presence of stochastic noise in objective function values.
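
The abstract describes the mechanism only in words; the sketch below is a minimal Python illustration of the general idea, not the authors' exact procedure. It assumes a noisy sphere objective, a fixed number of resamples per evaluation, and a simple confidence-interval overlap test as a stand-in for the statistical subset selection and clustering used in the paper; all names (`sphere_noisy`, `statistically_equivalent_best`, `noisy_pso`) and parameter values are hypothetical.

```python
import numpy as np


def sphere_noisy(x, rng, noise_std=1.0):
    """Noisy test objective: the sphere function plus zero-mean Gaussian noise."""
    return float(np.sum(x ** 2)) + rng.normal(0.0, noise_std)


def statistically_equivalent_best(means, variances, counts, z=1.96):
    """Indices whose sample means are statistically indistinguishable from the best
    sample mean, via a confidence-interval overlap test (a crude stand-in for the
    paper's subset-selection and clustering step)."""
    best = int(np.argmin(means))
    half = z * np.sqrt(variances / np.maximum(counts, 1))
    keep = (means - half) <= (means[best] + half[best])
    return np.flatnonzero(keep)


def noisy_pso(dim=10, n_particles=20, iters=200, resamples=5,
              w=0.729, c1=1.49445, c2=1.49445, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
    v = np.zeros_like(x)

    def evaluate(pos):
        # fixed-budget resampling; the paper instead allows adaptive resampling
        samples = [sphere_noisy(pos, rng) for _ in range(resamples)]
        return np.mean(samples), np.var(samples, ddof=1)

    pbest = x.copy()
    stats = np.array([evaluate(p) for p in pbest])
    pbest_mean, pbest_var = stats[:, 0].copy(), stats[:, 1].copy()
    counts = np.full(n_particles, resamples)

    for _ in range(iters):
        # set of "statistically global best" positions among the personal bests
        gset = statistically_equivalent_best(pbest_mean, pbest_var, counts)

        # each particle follows a randomly chosen member of that set,
        # rather than a single global best position
        leaders = pbest[rng.choice(gset, size=n_particles)]
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (leaders - x)
        x = x + v

        # update personal bests by comparing noisy sample means
        for i in range(n_particles):
            m, s2 = evaluate(x[i])
            if m < pbest_mean[i]:
                pbest[i], pbest_mean[i], pbest_var[i] = x[i], m, s2
                counts[i] = resamples

    best = int(np.argmin(pbest_mean))
    return pbest[best], pbest_mean[best]


if __name__ == "__main__":
    solution, estimated_value = noisy_pso()
    print(estimated_value)
```

The key departure from canonical PSO in this sketch is that `leaders` is drawn from a set of statistically equivalent best positions rather than a single global best, so the swarm does not over-commit to a position whose apparent superiority may be an artifact of noise.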

Author information

Correspondence to Jie Xu.

Additional information

This work was supported in part by the National Science Foundation under Grant CMMI-1233376.

About this article

Cite this article

Taghiyeh, S., Xu, J. A new particle swarm optimization algorithm for noisy optimization problems. Swarm Intell 10, 161–192 (2016). https://doi.org/10.1007/s11721-016-0125-2
