
Evaluating performance of super-efficiency models in ranking efficient decision-making units based on Monte Carlo simulations

  • Original Research
  • Published in: Annals of Operations Research

Abstract

In response to the limitations of classical Data Envelopment Analysis (DEA) models, super-efficiency DEA models, including Andersen and Petersen (Manag Sci 39(10): 1261–1264, 1993)’s model (hereafter called the AP model) and Li et al. (Eur J Oper Res 255(3): 884–892, 2016)’s cooperative-game-based model (hereafter called the L–L model), have been proposed to rank efficient decision-making units (DMUs). Although both models have been widely applied in practice, there is a paucity of research examining their performance in ranking efficient DMUs. Consequently, it is unclear how close the rankings obtained by the two models are to the “true” ones. Among the very few studies, Banker et al. (Ann Oper Res 250(1): 21–35, 2017) pointed out that the ranking performance of the AP model is unsatisfactory; Li et al. (Eur J Oper Res 255(3): 884–892, 2016) and Hinojosa et al. (Exp Syst Appl 80(9): 273–283, 2017) demonstrated the L–L model’s capability of ranking efficient DMUs without addressing its ranking performance. In this study, we therefore examine the ranking performance of the two super-efficiency models. In evaluating their performance, we carry out Monte Carlo simulations based on the well-known Cobb–Douglas production function and adopt the Kendall rank correlation coefficient. Unlike Banker et al. (Ann Oper Res 250(1): 21–35, 2017), we use the closeness between the rankings obtained by the two models and the “true” ones as the basis of performance evaluation in our simulations. Moreover, we consider several types of returns to scale (RS) and study the impact of changes in several parameters on the ranking performance. Given its importance, we also carry out additional simulations to examine the influence of technical inefficiency on the two models’ ranking performance.
Based on the simulation results, we conclude: (1) Under different RS, the ranking performance of the two models remains the same when parameters, e.g., the distribution of input variables, are changed; (2) Under different RS, when technical inefficiency (in comparison with random noise) is more important, the two models perform satisfactorily by providing rankings that are close to, or the same as, the “true” ones; (3) The L–L model performs better than the AP model and is more robust, especially when technical inefficiency is less important; (4) Under different RS, when technical inefficiency is less important, both models have unsatisfactory ranking performance; and (5) The relative importance of technical inefficiency plays a prominent role in ranking efficient DMUs.
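The simulation logic described above — Cobb–Douglas outputs degraded by technical inefficiency and random noise, with rankings compared via the Kendall rank correlation coefficient — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact procedure: the sample sizes, exponents, and the variances of the inefficiency and noise terms are assumptions, and the efficiency proxy stands in for a full DEA model.

```python
import math
import random
from itertools import combinations

def kendall_tau(a, b):
    """Kendall rank correlation between two score lists (no ties assumed)."""
    pairs = list(combinations(range(len(a)), 2))
    s = sum(1 if (a[i] - a[j]) * (b[i] - b[j]) > 0 else -1 for i, j in pairs)
    return s / len(pairs)

# Identical orderings give tau = 1; reversed orderings give tau = -1.
assert kendall_tau([1, 2, 3, 4], [1, 2, 3, 4]) == 1.0
assert kendall_tau([1, 2, 3, 4], [4, 3, 2, 1]) == -1.0

random.seed(42)
n, m = 30, 2                 # number of DMUs and inputs (assumed sizes)
alpha = [0.4, 0.4]           # Cobb-Douglas exponents; sum < 1 gives DRS
true_ineff, observed_eff = [], []
for _ in range(n):
    x = [random.uniform(1, 6) for _ in range(m)]   # input levels
    u = abs(random.gauss(0, 0.3))                  # technical inefficiency
    v = random.gauss(0, 0.05)                      # random noise
    frontier = math.prod(xj ** aj for xj, aj in zip(x, alpha))
    y = frontier * math.exp(v - u)                 # observed output
    true_ineff.append(-u)              # "true" score: less inefficiency is better
    observed_eff.append(y / frontier)  # estimated efficiency proxy = exp(v - u)

# When inefficiency dominates noise, estimated rankings track the "true" ones,
# so tau should be close to 1.
tau = kendall_tau(true_ineff, observed_eff)
```

Raising the noise variance relative to the inefficiency variance in this sketch lowers tau, mirroring the paper's finding that ranking performance deteriorates when technical inefficiency is less important.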




Notes

  1. To be consistent with the literature, in this study, efficient DMUs can be either strongly or weakly efficient.

  2. The simulation procedure in this study can also be used to rank inefficient DMUs.

  3. There are some exceptional cases, e.g., when n = 10 and m = 10.

  4. The results and conclusions remain the same when different intervals, e.g., (0, 2), (2, 11), are used.

References

  • Adhikari, A., Majumdar, A., Gupta, G., & Bisi, A. (2020). An innovative super-efficiency data envelopment analysis, semi-variance, and Shannon-entropy-based methodology for player selection: Evidence from cricket. Annals of Operations Research, 284(1), 1–32.

  • Adler, N., Friedman, L., & Sinuany-Stern, Z. (2002). Review of ranking methods in the data envelopment analysis context. European Journal of Operational Research, 140(2), 249–265.

  • Andersen, P., & Petersen, N. C. (1993). A procedure for ranking efficient units in data envelopment analysis. Management Science, 39(10), 1261–1264.

  • Avkiran, N. K. (2011). Association of DEA super-efficiency estimates with financial ratios: Investigating the case for Chinese banks. Omega, 39(3), 323–334.

  • Banker, R. D., & Chang, H. (2006). The super-efficiency procedure for outlier identification, not for ranking efficient units. European Journal of Operational Research, 175(2), 1311–1320.

  • Banker, R. D., Chang, H., & Zheng, Z. (2017). On the use of super-efficiency procedures for ranking efficient units and identifying outliers. Annals of Operations Research, 250(1), 21–35.

  • Banker, R. D., Charnes, A., & Cooper, W. W. (1984). Some models for estimating technical and scale inefficiencies in data envelopment analysis. Management Science, 30(9), 1078–1092.

  • Banker, R. D., Cooper, W. W., Seiford, L. M., Thrall, R. M., & Zhu, J. (2004). Returns to scale in different DEA models. European Journal of Operational Research, 154(2), 345–362.

  • Banker, R. D., & Natarajan, R. (2008). Evaluating contextual variables affecting productivity using data envelopment analysis. Operations Research, 56(1), 48–58.

  • Bjurek, H., Hjalmarsson, L., & Forsund, F. R. (1990). Deterministic parametric and nonparametric estimation of efficiency in service production: A comparison. Journal of Econometrics, 46(1–2), 213–227.

  • Charnes, A., Cooper, W. W., & Rhodes, E. (1978). Measuring the efficiency of decision making units. European Journal of Operational Research, 2(6), 429–444.

  • Chen, C. M., & Delmas, M. A. (2012). Measuring eco-inefficiency: A new frontier approach. Operations Research, 60(5), 1064–1079.

  • Coelli, T. J., Rao, D. S. P., O’Donnell, C. J., & Battese, G. E. (2005). An introduction to efficiency and productivity analysis. Springer.

  • Cook, W. D., Roll, Y., & Kazakov, A. (1990). DEA model for measuring the relative efficiencies of highway maintenance patrols. Information Systems and Operational Research, 28(2), 113–124.

  • Dai, Q., Li, Y., & Liang, L. (2016). Allocating fixed costs with considering the return to scale: A DEA approach. Journal of Systems Science and Complexity, 29(5), 1320–1341.

  • Doyle, J., & Green, R. (1994). Efficiency and cross-efficiency in DEA: Derivations, meanings and uses. Journal of the Operational Research Society, 45(5), 567–578.

  • Du, J., Wang, J., Chen, Y., Chou, S. Y., & Zhu, J. (2014). Incorporating health outcomes in Pennsylvania hospital efficiency: An additive super-efficiency DEA approach. Annals of Operations Research, 221(1), 161–172.

  • Düzakın, E., & Düzakın, H. (2007). Measuring the performance of manufacturing firms with super slacks based model of data envelopment analysis: An application of 500 major industrial enterprises in Turkey. European Journal of Operational Research, 182(3), 1412–1432.

  • Giraleas, D., Emrouznejad, A., & Thanassoulis, E. (2012). Productivity change using growth accounting and frontier-based approaches: Evidence from a Monte Carlo analysis. European Journal of Operational Research, 222(3), 673–683.

  • Hinojosa, M. A., Lozano, S., Borrero, D. V., et al. (2017). Ranking efficient DMUs using cooperative game theory. Expert Systems with Applications, 80(9), 273–283.

  • Jenkins, L., & Anderson, M. (2003). Multivariate statistical approach to reducing the number of variables in data envelopment analysis. European Journal of Operational Research, 147(1), 51–61.

  • Kendall, M. G., & Gibbons, J. D. (1990). Rank correlation methods. Oxford University Press.

  • Li, H., & Shi, J. F. (2014). Energy efficiency analysis on Chinese industrial sectors: An improved Super-SBM model with undesirable outputs. Journal of Cleaner Production, 65, 97–107.

  • Li, J. Y., & Cheng, Y. X. (2012). Total-factor energy efficiency analysis of Hebei province based on super efficiency DEA. Industrial Engineering Journal, 15(1), 87–92.

  • Li, L., & Liang, L. (2010). A Shapley value index on the importance of variables in DEA models. Expert Systems with Applications, 37(9), 6287–6292.

  • Li, Y., Xie, J., Wang, M., & Liang, L. (2016). Super efficiency evaluation using a common platform on a cooperative game. European Journal of Operational Research, 255(3), 884–892.

  • Li, Y., Yang, F., Liang, L., & Hua, Z. (2009). Allocating the fixed cost as a complement of other cost inputs: A DEA approach. European Journal of Operational Research, 197(1), 389–401.

  • Liang, L., Wu, J., Cook, W. D., & Zhu, J. (2008). Alternative secondary goals in DEA cross-efficiency evaluation. International Journal of Production Economics, 113(2), 1025–1030.

  • Lin, R., & Chen, Z. (2016). Fixed input allocation methods based on super CCR efficiency invariance and practical feasibility. Applied Mathematical Modelling, 40(9–10), 5377–5392.

  • Minh, N. K., Long, G. T., & Hung, N. V. (2013). Efficiency and super-efficiency of commercial banks in Vietnam: Performances and determinants. Asia Pacific Journal of Operational Research, 30(1), 1250047.

  • O’Neill, L. (2005). Methods for understanding super-efficient data envelopment analysis results with an application to hospital inpatient surgery. Health Care Management Science, 8(4), 291–298.

  • Pastor, J. T., Ruiz, J. L., & Sirvent, I. (2002). A statistical test for nested radial DEA models. Operations Research, 50, 728–735.

  • Sexton, T. R., Silkman, R. H., & Hogan, A. J. (1986). Data envelopment analysis: Critique and extensions. New Directions for Evaluation, 32, 73–105.

  • Shapley, L. S. (1953). A value for n-person games. In Contributions to the theory of games (Vol. 2, pp. 307–317). Princeton University Press.

  • Tsagris, M., Beneki, C., & Hassani, H. (2014). On the folded normal distribution. Mathematics, 2(1), 12–28.

  • Uzawa, H. (1962). Production functions with constant elasticities of substitution. The Review of Economic Studies, 29(4), 291–299.

  • Xue, M., & Harker, P. T. (2002). Note: Ranking DMUs with infeasible super-efficiency DEA models. Management Science, 48(5), 705–710.

  • Yawe, B. (2010). Hospital performance evaluation in Uganda: A super-efficiency data envelope analysis model. Zambia Social Science Journal, 1(1), 79–105.

  • Zhang, Q. Z., He, F., & Zhao, X. (2012). Analysis of Chinese energy efficiency of iron and steel industry based on super-efficiency DEA. Soft Science, 26(2), 65–68.

  • Zimková, E. (2014). Technical efficiency and super-efficiency of the banking sector in Slovakia. Procedia Economics and Finance, 12(6), 780–787.


Acknowledgement

The authors would especially like to thank the reviewers for their helpful comments and suggestions. This work is supported by the National Natural Science Foundation of China under grant (Nos. 61673381, 71701060, 72071192, 71671172, 71631006), Project of Great Wall Scholar, Beijing Municipal Commission of Education (No. CITTCD20180305), Humanities and Social Science Fund (Beijing University of Technology, No. 011000546318525), Natural Science Foundation of Beijing Municipality (No. 9202002), the Anhui Provincial Quality Engineering Teaching and Research Project (No. 2020JYXM2279), and the Anhui University and Enterprise Cooperation Practice Education Base Project (No. 2019SJJD02).

Author information


Corresponding author

Correspondence to Qiwei Xie.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix 1: methods for ranking DMUs and their limitations

The cross-efficiency ranking method was first proposed by Doyle and Green (1994). In the DEA setting, this method ranks DMUs based on a cross-efficiency matrix, which was originally developed by Sexton et al. (1986), who initiated the theme of DEA ranking. Doyle and Green (1994) pointed out that decision makers do not always have a reasonable mechanism for ranking units and, thus, recommended cross-efficiency evaluation as a ranking device. The method uses the optimal weights obtained from n linear programs (LPs) to calculate the efficiency score of each DMU n times; all the resulting cross-efficiency scores are organized into the cross-efficiency matrix. The cross-efficiency ranking method allows each DMU to selfishly choose an optimal set of input and output weights and defines a DMU’s cross efficiency as the average of its efficiencies under these optimal weights. However, cross-efficiency scores are generally not unique and depend on the alternative optimal solutions to the LPs used (Liang et al., 2008). The efficiency obtained by this method is, thus, not unique.
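The aggregation step of cross-efficiency ranking can be illustrated with a toy cross-efficiency matrix. The matrix entries below are made-up numbers; in practice each entry comes from solving one of the n LPs, which is omitted here.

```python
# E[k][j] = efficiency of DMU j evaluated with DMU k's optimal weights
# (hypothetical values; the diagonal holds the usual self-appraised scores).
E = [
    [1.00, 0.80, 0.70],
    [0.90, 1.00, 0.60],
    [0.85, 0.75, 1.00],
]
n = len(E)

# The cross efficiency of DMU j is the average of column j of the matrix.
cross_eff = [sum(E[k][j] for k in range(n)) / n for j in range(n)]

# Rank DMUs by cross efficiency, best first.
ranking = sorted(range(n), key=lambda j: -cross_eff[j])
```

Because the optimal weights chosen by each LP need not be unique, different optimal solutions can change the off-diagonal entries of `E`, and hence the ranking, which is the non-uniqueness problem noted above.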

The common-weight-based method, proposed by Cook et al. (1990), is also used for ranking DMUs. It attempts to find a common set of weights with which to calculate the efficiencies of all DMUs and then ranks the DMUs by these efficiencies. Compared with the cross-efficiency method, this method evaluates the DMUs on a common platform, i.e., the set of common weights, so the efficiency scores of the units are directly comparable. The problem with this ranking method is that it is very difficult to find a generally accepted principle for choosing the common weight set. Consequently, different principles result in different common weight sets and, hence, different rankings of the DMUs.
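Once a common weight set is fixed, each DMU's ratio efficiency is directly comparable. A minimal sketch, with made-up weights and data (one output, two inputs per DMU):

```python
# Hypothetical common weights: u for the output, v for the two inputs.
u = [1.0]
v = [0.5, 0.5]

# Hypothetical DMU data: name -> (outputs, inputs).
dmus = {
    "A": ([4.0], [2.0, 2.0]),
    "B": ([3.0], [1.0, 1.0]),
    "C": ([5.0], [3.0, 4.0]),
}

def ratio_efficiency(outputs, inputs):
    """Weighted-output / weighted-input ratio under the common weights."""
    return (sum(uo * yo for uo, yo in zip(u, outputs))
            / sum(vi * xi for vi, xi in zip(v, inputs)))

scores = {name: ratio_efficiency(y, x) for name, (y, x) in dmus.items()}
ranking = sorted(scores, key=scores.get, reverse=True)  # best DMU first
```

A different choice of `u` and `v` would generally reorder `scores`, which is exactly why the choice of principle for the common weight set matters.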

Adler et al. (2002) discussed the benchmark ranking method, which sorts DMUs in two stages. In the first stage, the efficient units are ranked by simply counting the number of times they appear in the reference sets of inefficient units. In the second stage, the inefficient units are ranked by counting the number of DMUs that need to be removed from the analysis before they are considered efficient. However, a complete ranking cannot be guaranteed because many DMUs may receive the same ranking score. Another problem with this ranking method is that a DMU may be highly ranked merely because it is chosen as a benchmark by many other DMUs.
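The first-stage count can be sketched with hypothetical reference sets. Note how the tie in the counts leaves the ranking incomplete, which is the limitation of the benchmark method.

```python
from collections import Counter

# Hypothetical reference sets of four inefficient DMUs; the peers "A", "B",
# "C" are efficient DMUs serving as benchmarks.
reference_sets = [{"A", "B"}, {"A"}, {"A", "C"}, {"B", "C"}]

# First stage: count how often each efficient DMU appears as a benchmark.
counts = Counter(peer for refs in reference_sets for peer in refs)

# Rank efficient units by appearance count, higher is better.
ranking = sorted(counts, key=lambda d: -counts[d])
# "B" and "C" tie at 2 appearances, so their relative order is arbitrary
# and the ranking is not complete.
```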

Appendix 2: Results of simulation where parameters are changed under DRS

See Figs. 15, 16, 17 and 18 and Tables 16, 17, 18, 19.

Fig. 15
figure 15

Probability bar charts corresponding to the fifth set of experiments

Fig. 16
figure 16

Probability bar charts corresponding to the sixth set of experiments

Fig. 17
figure 17

Probability bar charts corresponding to the seventh set of experiments

Fig. 18
figure 18

Probability bar charts corresponding to the eighth set of experiments

Table 16 Results of the fifth set of experiments (x ~ FN[0, 2.5])
Table 17 Results of the sixth set of experiments (x ~ FN[0, 2.5] and α ~ FN[1/2m, 2/m])
Table 18 Results of the seventh set of experiments (x ~ U(1, 6))
Table 19 Results of the eighth set of experiments (x ~ U(1, 6) and α ~ U(1/3m, 3/m))

Appendix 3: Results of simulation where parameters are changed under IRS

See Figs. 19, 20, 21 and 22 and Tables 20, 21, 22 and 23.

Fig. 19
figure 19

Probability bar charts corresponding to the ninth set of experiments

Fig. 20
figure 20

Probability bar charts corresponding to the tenth set of experiments

Fig. 21
figure 21

Probability bar charts corresponding to the eleventh set of experiments

Fig. 22
figure 22

Probability bar charts corresponding to the twelfth set of experiments

Table 20 Results of the ninth set of experiments (x ~ FN[0, 2.5])
Table 21 Results of the tenth set of experiments (x ~ FN[0, 2.5] and α ~ FN[1/2m, 2/m])
Table 22 Results of the eleventh set of experiments (x ~ U(1, 6))
Table 23 Results of the twelfth set of experiments (x ~ U(1, 6) and α ~ U(1/3m, 3/m))


About this article


Cite this article

Xie, Q., Zhang, L.L., Shang, H. et al. Evaluating performance of super-efficiency models in ranking efficient decision-making units based on Monte Carlo simulations. Ann Oper Res 305, 273–323 (2021). https://doi.org/10.1007/s10479-021-04148-3
