Does the Financial Times FT50 journal list select the best management and economics journals?


Abstract

The ranking of academic journals and the considerable influence of journal lists have been increasingly criticized, especially in management research. To assess the effectiveness of peer-review selection versus bibliometric assessment, we benchmark the best management and economics journals. Based on multiple indicators, the Financial Times FT50 list is compared to the 100 best-ranked journals. The position of the FT journals in our aggregate bibliometric ranking confirms the effectiveness of the FT's peer review. It also highlights the complementarity of peer review and metrics for research assessment. The h-index emerges as the best predictor of FT membership, followed by the average number of citations. The Journal Impact Factor is rejected as a predictor of FT inclusion, while the SJR indicator appears representative only for economics journals. The FT50 confirms the stability of the top-tier journals and crowns their long-term reputation. It covers a good balance of subfields and leaves room for a practitioner journal and openness to innovation. The dichotomy between economics and general management journals remains a concern for rankings and journal lists, as do the size of the selection and the quest for a balanced subdivision among subfields.


Notes

  1. https://www.gate.cnrs.fr/IMG/pdf/KeeleEconJournalRanks442.pdf. All URLs in these notes were last retrieved on August 18, 2020.

  2. https://ideas.repec.org/.

  3. https://charteredabs.org/academic-journal-guide-2018/.

  4. https://abdc.edu.au/research/abdc-journal-list/.

  5. https://www.gate.cnrs.fr/IMG/pdf/categorisation37_liste_juin_2020-2.pdf.

  6. https://harzing.com/resources/journal-quality-list.

  7. https://jindal.utdallas.edu/the-utd-top-100-business-school-research-rankings/.

  8. https://www.ft.com/content/3405a512-5cbb-11e1-8f1f-00144feabdc0.

  9. http://help.incites.clarivate.com/incitesLiveJCR/glossaryAZgroup/g4/7790-TRS.html.

  10. Supplementary information in Excel file.


Acknowledgements

The author thanks Emeritus Professor Marc Buelens (Vlerick Business School and Ghent University) for his advice on statistical methodology and Dr. Emmanuel Abatih of FIRE for performing the PCA analysis. FIRE (Fostering Innovative Research based on Evidence) is the statistical consulting office of Ghent University.

Author information

Correspondence to Yves Fassin.

Supplementary Information

Appendices

Appendix 1: The search and selection process for the 100 best-ranked journals

The search process to select the 100 best-ranked journals in management and economics was performed in successive iterative steps, first by increasing the number of journals under study (from 177 to 475) and then by gradually reducing the selection (to 300 and finally 158). In a first step, we searched the Web of Science for journals that published on the 'topics' management, marketing, finance, organization, entrepreneurship and ethics, thus concentrating on management journals. For each of these topics, which correspond to the subfields of management, we used the 'Analysis—Source titles' function to select journals in order of productivity, based on their number of articles. A second search on the same topics was performed on the h-core, i.e. the articles with at least as many citations as the h-index of that dataset. This second search selected journals on impact: where are the articles with the highest impact published? The procedure yielded a mix of 177 journals of high and lower impact: the FT50 and 127 other management journals. We defined the thresholds for the top 50 and the top 100 according to both the JIF and SJR criteria.
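
For illustration, the h-core selection used in the second search can be sketched as follows. This is a minimal sketch; the input format (a list of (journal, citations) pairs) and the example data are assumptions, not the actual Web of Science export used in the study.

```python
# Illustrative sketch of the h-core selection described above.
# The input format and example data are assumptions, not the actual
# Web of Science export used in the study.

from collections import Counter


def h_index(citation_counts):
    """Largest h such that at least h items have at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h


def h_core(articles):
    """Articles cited at least as often as the dataset's h-index."""
    h = h_index([cites for _, cites in articles])
    return [(journal, cites) for journal, cites in articles if cites >= h]


# Hypothetical (journal, citations) pairs returned by one topic search.
articles = [("Journal A", 120), ("Journal B", 45), ("Journal C", 3),
            ("Journal A", 60), ("Journal D", 2)]
core = h_core(articles)
source_titles = Counter(journal for journal, _ in core)
print(source_titles.most_common())  # journals ordered by their presence in the h-core
```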

In a second round we searched for missing journals in the InCites Journal Citation Reports and in the SCImago Journal and Country Rank (SJR) databases, in different categories related to management, business and finance. The two databases use different categories, and some journals appear in several categories of each database, for example in Management as well as in Economics or Finance. We selected the missing journals with a JIF above the top-100 threshold and with more than 100 articles or an h-index higher than 30, insofar as management or economics was the main theme of the journal. For example, we did not select Renewable and Sustainable Energy Reviews, with a JIF of 10.556 and an h-index of 231, because it focuses on energy and technology issues and lies too far from management. Nor did we include the Journal of Conflict Resolution, with an SJR of 4.241 and an h-index of 94, despite its 5th place in the SJR category Business, Management and Accounting, as it focuses on international conflicts between nations. We followed the same procedure for the SJR rankings and adapted the thresholds of the top 100 accordingly. In order to cover all the disciplines represented in the FT50, we performed a third round of searches focused on economics, finance and accountancy, and also explored a few other specialized subfields, such as economics or management in transport, tourism and sports, that were represented in the JIF or SJR rankings.
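
A minimal sketch of this kind of inclusion filter is shown below. The column names, example data and the numerical value of the top-100 JIF cut-off are assumptions for illustration; only the article-count and h-index criteria, and the two example journals, follow the text above.

```python
# Hypothetical sketch of the inclusion filter used in the second round.
# Column names, example data and the JIF cut-off value are assumptions;
# the article-count and h-index criteria follow the text above.

import pandas as pd

journals = pd.DataFrame({
    "title":    ["Journal A", "Journal B", "Renewable and Sustainable Energy Reviews"],
    "jif":      [6.2, 2.1, 10.556],
    "articles": [250, 80, 1200],
    "h_index":  [95, 25, 231],
    "in_scope": [True, True, False],  # manual check: is management/economics the main theme?
})

jif_top100_threshold = 3.0  # assumed value of the top-100 JIF cut-off

selected = journals[
    (journals["jif"] > jif_top100_threshold)
    & ((journals["articles"] > 100) | (journals["h_index"] > 30))
    & journals["in_scope"]
]
print(selected["title"].tolist())
```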

An additional check was performed based on the Journal Quality List (www.harzing.com). All missing 4-rated journals of the UK ABS list and all A*-rated journals of the Australian ABDC list were checked against the h-index and the SJR indicator, as well as against the IDEAS/RePEc and Keele lists. Most of the missing journals were economics journals; the best-ranked among them were journals with a special focus, on statistics or on a specific management domain such as tourism or public administration, which is not the focus of the FT50 list. The selected journals were further included in the subfield analysis.
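
In essence this cross-check is a set difference followed by a metric lookup; the sketch below is purely illustrative and the list contents are placeholders, not the actual ABS or ABDC entries.

```python
# Minimal sketch of the cross-check against other journal lists.
# The list contents below are placeholders, not the actual ABS or ABDC entries.

current_selection = {"Journal A", "Journal B"}
abs_4_rated = {"Journal A", "Journal D"}                # hypothetical ABS 4/4*-rated journals
abdc_a_star = {"Journal B", "Journal D", "Journal E"}   # hypothetical ABDC A*-rated journals

# Journals on the external lists but missing from the current selection are then
# looked up on their h-index and SJR (and on the IDEAS/RePEc and Keele lists).
missing = (abs_4_rated | abdc_a_star) - current_selection
print(sorted(missing))
```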

With the new InCites 2019 data available, the dataset was further expanded to 475 journals and then, in three successive stages, reduced to 158 journals. We eliminated book series and proceedings in order to concentrate on genuine scholarly journals; we also excluded some popularizing and executive magazines such as Forbes and Fortune, and a few journals that were too distant from a management perspective, mainly in psychology and political science.

Multiple comparative rankings were established following five indicators: JIF, h-index, SJR, average citations per year and total citations. An aggregated ranking allowed us to retain the 300 best-ranked journals. The same procedure was repeated in the second round: the 125 best-ranked journals were selected, as well as the 80 best-ranked management journals and the 60 best-ranked economics journals. In addition, journals in the top 10 of any of the individual indicator rankings were also retained (such as the Annual Review of Organizational Psychology), besides the 9 FT50 journals and 5 former FT journals that did not reach the top 100. In this last stage, 5 additional journals were eliminated despite their place in the top 5 of the productivity and total-citation rankings, as their major focus was not on managerial topics but on technological or more specific ones: the Journal of Cleaner Production, the European Journal of Operational Research, the Journal of Environmental Management, Sustainability (all publishing around 1000 articles per year) and Knowledge-Based Systems.
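
A sketch of such an aggregation step is given below. Averaging the per-indicator ranks is an assumption about how the aggregated ranking is computed, as the text does not spell out the exact formula, and the example values are invented.

```python
# Sketch of an aggregated ranking across the five indicators named above.
# Averaging the per-indicator ranks is an assumption about the aggregation;
# the example values are invented.

import pandas as pd

indicators = ["jif", "h_index", "sjr", "avg_citations_per_year", "total_citations"]

df = pd.DataFrame(
    [[6.2, 95, 5.1, 4.0, 30000],
     [2.1, 40, 1.2, 1.1, 4000],
     [4.8, 120, 3.9, 3.2, 52000]],
    columns=indicators,
    index=["Journal A", "Journal B", "Journal C"],
)

# Rank each indicator separately (rank 1 = best), then average the ranks.
ranks = df[indicators].rank(ascending=False)
df["aggregate_rank"] = ranks.mean(axis=1)
top = df.sort_values("aggregate_rank").head(300)  # retain the best-ranked journals
print(top["aggregate_rank"])
```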

New rankings (over the 158 journals) were then established following the 9 chosen metrics, together with an aggregated ranking. One can assume with high probability that this selection of 158 journals includes the top 100 management and economics journals under the various categorizations. In the third round of the search process, only one missing journal reached the top 50 (between places 40 and 50), and four other journals joined the top 100 on one of the criteria.

The ranking at the different steps differs slightly, because of diminishing differentials when ranking over 300 or over 158 journals. A sensitivity analysis reveals that the ranking can also differ slightly according to the number of indicators taken into account and to whether extreme values are dropped. Those differences are minimal, however, and the purpose of this study is not to develop an exact ranking but to position the FT50 journals within the overall ranking. A second objective was to obtain a good spread of journals over the lower categories in order to analyze the best-ranked journals in the subfields. The update with the new 2019 data enabled us to perform a sensitivity analysis; it also provided a control in which potential deviations could be identified.
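
One way to carry out such a sensitivity check is to recompute the aggregate ranking on subsets of the indicators and compare the resulting orderings. The sketch below uses a Spearman rank correlation for the comparison, which is an assumed choice of statistic, and invented example values.

```python
# Illustrative sensitivity check: how stable is the aggregate ranking when
# one indicator is dropped?  The Spearman correlation as comparison statistic
# is an assumed choice; the example values are invented.

from itertools import combinations

import pandas as pd
from scipy.stats import spearmanr

indicators = ["jif", "h_index", "sjr", "avg_citations_per_year", "total_citations"]
df = pd.DataFrame(
    [[6.2, 95, 5.1, 4.0, 30000],
     [2.1, 40, 1.2, 1.1, 4000],
     [4.8, 120, 3.9, 3.2, 52000]],
    columns=indicators,
    index=["Journal A", "Journal B", "Journal C"],
)

def aggregate_rank(data, cols):
    """Average of per-indicator ranks (rank 1 = best), as in the sketch above."""
    return data[cols].rank(ascending=False).mean(axis=1)

full = aggregate_rank(df, indicators)
for subset in combinations(indicators, len(indicators) - 1):  # drop one indicator at a time
    reduced = aggregate_rank(df, list(subset))
    rho, _ = spearmanr(full, reduced)
    dropped = (set(indicators) - set(subset)).pop()
    print(f"dropping {dropped}: Spearman rho = {rho:.3f}")
```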

Appendix 2

See Table 9.

Table 9 Correlation matrix of the selection of 158 journals
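
A correlation matrix of this kind can be computed directly from the journal-level indicator values, as in the sketch below. Pearson correlation (pandas' default) and the example values are assumptions; the actual Table 9 is based on the 158 selected journals.

```python
# Sketch of how a correlation matrix like Table 9 can be computed from the
# journal-level indicator values.  Pearson (pandas' default) and the example
# data are assumptions; the actual table covers the 158 selected journals.

import pandas as pd

indicators = ["jif", "h_index", "sjr", "avg_citations_per_year", "total_citations"]
df = pd.DataFrame(
    [[6.2, 95, 5.1, 4.0, 30000],
     [2.1, 40, 1.2, 1.1, 4000],
     [4.8, 120, 3.9, 3.2, 52000]],
    columns=indicators,
    index=["Journal A", "Journal B", "Journal C"],
)

correlation_matrix = df[indicators].corr()
print(correlation_matrix.round(2))
```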

About this article

Cite this article

Fassin, Y. Does the Financial Times FT50 journal list select the best management and economics journals?. Scientometrics 126, 5911–5943 (2021). https://doi.org/10.1007/s11192-021-03988-x
