Introduction

Due to the widespread prevalence of the “publish or perish” paradigm in most international universities and research institutions [1], the analysis of the publication trajectories of academics has become an essential line of inquiry in general, and in communication studies in particular [2,3,4,5,6,7,8]. Prior studies typically portray excellence as a function of productivity and impact: the former is thought to be expressed by the number of published papers, while the latter is manifested in the number, and sometimes the weight, of citations [9]. The mushrooming body of literature on publication patterns has examined a plethora of issues, including leading journals, co-citation and collaboration networks, editorial boards, and publishing houses within communication studies [10,11,12,13,14]. However, while top-performing scholars typically serve as role models for the scientific community [15], thus far only limited scholarly attention has been directed towards their publication strategies.

Drawing upon an illustrative network analysis of the publication trajectories of the 100 most productive communication scholars between 2015 and 2019, this study analyzes both the publication paths of top-performing scholars and the network of communication journals in which they publish. Results show that the scholars who publish the most papers also tend to publish in top journals, lending further support to the idea that quantity is generally aligned with quality and impact. Our findings also show that the most popular journals amongst these top scholars form a connected network in which, based on the strong interconnections between journals in journalism studies, health communication and public relations, some relatively autonomous subnetworks have evolved.

“Publish or Perish” in Communication Studies

Due to the growing internationalization of higher education and scholarly research [16], international productivity (measured by publications in top-tier journals) and impact (measured by citations) have become the most salient factors in research assessments [17]. Moreover, as a consequence of the internationalization of the labor force and the growing mobility of researchers [18, 19], the infamous publish or perish paradigm [1, 20,21,22] has been slightly modified into a “publish in internationally visible journals or perish” paradigm [23]. Accordingly, publishing in indexed, preferably Scopus-indexed, journals has become the gold standard in academia [24].

Several assessment systems across countries aim to analyze and qualify academic productivity, including the Research Excellence Framework (REF) in the UK, the Spanish Agencia Nacional de Evaluación de la Calidad y Acreditación (ANECA), AERES in France, ANVUR in Italy, and SciELO in Latin America. Not surprisingly, there is even a research assessment framework for the evaluation of European research projects, namely the Science and Technology Options Assessment (STOA) [25]. Most of these evaluation systems (with the sole exception of SciELO) work only with papers published in internationally visible journals, especially those indexed in salient databases such as Scopus or the Web of Science. The rationale behind this exclusive selection is that most research institutions, international rankings, funding agencies, and even policy makers assume that publishing in leading journals is a significant antecedent of research quality and future impact [26,27,28].

Beyond traditional scientometric works, laudable efforts have also been made to account specifically for research patterns in communication studies. For instance, in 1989 a special issue of Communication Research was entirely dedicated to the analysis of publication patterns within the discipline, and Journal of Communication, the flagship journal of the International Communication Association (ICA), published three special issues examining publication patterns in the communication field (Vol. 43, Issue 3; Vol. 54, Issue 4; and Vol. 55, Issue 3). Citation analyses have also been common, flourishing from the 1980s onwards [29,30,31,32,33].

Later studies provide extensive empirical evidence of how the field has evolved in terms of methodological sophistication and thematic patterns [2,3,4,5,6,7,8, 34,35,36,37]. Other scholars have focused on more specific publication trends. For instance, Freelon [38] conducted a detailed analysis of top research clusters, co-citation networks and geographical diversity through a network analysis of nine top-tier communication journals. His research was later extended by Günther and Domahidi [39], who extensively examined the range of topics that form the field of communication. The main topics of the field have also been widely examined through citation patterns, as a score of studies argue that the number of citations is a proxy for the popularity of a given research field [12,13,14].

Finally, a recent line of research concentrates on the “fragmentation” or “balkanization” of communication scholarship by analyzing how different communication subfields are (or are not) interconnected [40, 41]. Focusing on publication patterns in communication studies, this study contributes to the growing literature on publication trends within the discipline by presenting an analysis of the publication trajectories of the 100 most productive scholars in communication. In line with former studies [38], we use methods inspired by network science to identify the research clusters in which top scholars typically publish. Accordingly, we pose the following research questions:

RQ1: What are the publication trajectories and most popular publication outlets of the most productive scholars in communication?

RQ2: What are the main thematic clusters in which the most productive communication scholars publish?

Protocol of Data Collection and Analysis

Sampling and calculations for this study were based on SciVal, Scopus and Scimago. These platforms were chosen for three main reasons. First, the three are interrelated, since both Scimago and SciVal work with Scopus data. For journal rankings, the Scimago Journal List was used; for publication records, we relied on Scopus; and SciVal was used for the selection of the most productive scholars. This combination makes it possible to examine publication trajectories consistently at the level of journals, publications, and leading scholars (see [42,43,44,45]). Second, Scimago and Scopus are more inclusive databases than, for instance, the Web of Science, so a more detailed picture of publication trajectories within the field can be drawn. Finally, as opposed to earlier research that mainly focused on a predetermined set of journals [4, 38, 39], this study samples, based on the actual publication patterns of top scholars, a pool of journals containing only those periodicals in which the 100 most productive communication scholars published between 2015 and 2019. By doing so, we were able to detect several types of potential bias that cannot be perceived when researchers only analyze the best-known journals of the field.

To answer the research questions, we first developed a pool of the most productive researchers in communication. For this purpose, we used SciVal’s Top 500 Authors Worldwide list, which covers research productivity between 2015 and 2019. We chose this five-year frame in order to have an overarching longitudinal view of publication trends (as 2020 data were incomplete at the time of data collection, that year was discarded from the analysis). For the formal analysis, we aimed to concentrate on the most productive authors only; we therefore selected the 100 most productive researchers and removed cases that did not fit our research aims. Two types of researchers in particular were discarded.

First, we eliminated those scholars who published exclusively in engineering journals and not in the social sciences, since they work in quite a different field than mainstream communication scholars, with different platforms and publication trends. Communication engineers, even if their papers are categorized under both engineering and communication, work in a separate publication cluster with their own platforms, typically IEEE conference proceedings. Accordingly, they do not publish in mainstream communication journals, just as social scientists within communication do not publish in engineering journals. Second, we eliminated those scholars who most likely game the publication system. These scholars publish dozens of papers annually in their own journals, where they are typically the editors-in-chief, and they rarely publish elsewhere; some of these questionable scholars have published more than 120 papers in their own journals over the last five years while mostly not publishing anywhere else. We consider this to be malpractice, and therefore they were discarded from the sample. Amongst the 100 most productive scholars in communication, we found 29 who should be classified in engineering (and not communication), 5 scholars with a questionable publication record, and one case in which we could not identify the appropriate scholar since SciVal returned thousands of entries with the same name.

After cleaning the list of top authors (n = 66), we computed their record of publications for each year between 2015 and 2019. We only considered papers published in Scopus-indexed journals in full-paper format; accordingly, we excluded editorials, book reviews and conference papers. The final sample contained 1482 papers published in 126 different journals. Furthermore, based on the Scimago Journal List, we added ranking positions (range = 1-445) for all the journals in our sample. Consequently, we could calculate the prestige of the published papers based on the ranking of the journals in which they appeared.
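As an illustration, the prestige calculation can be sketched as follows, assuming a hypothetical mapping from journal names to Scimago ranking positions; the toy records and variable names are not the actual dataset, and the yearly mean rank computed here is the kind of summary later reported in Fig. 4.

```python
# Hypothetical Scimago positions (1 = best of the 445 ranked journals).
scimago_rank = {
    "Journal of Advertising": 2,
    "Journalism Studies": 30,
    "Health Communication": 45,
}

# Papers as (author, year, journal) tuples, as in the sketches below.
papers = [
    ("Author A", 2015, "Journalism Studies"),
    ("Author A", 2016, "Journal of Advertising"),
    ("Author B", 2016, "Health Communication"),
]

# Mean ranking position of the journals in which authors published, per year
# (lower numbers represent more prestigious journals).
by_year = {}
for _, year, journal in papers:
    by_year.setdefault(year, []).append(scimago_rank[journal])

mean_rank_per_year = {year: sum(ranks) / len(ranks) for year, ranks in by_year.items()}
print(mean_rank_per_year)  # {2015: 30.0, 2016: 23.5}
```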

To examine the publication trajectories of our sample, we performed a network analysis in which journals are nodes and publication trajectories are edges. All papers were coded on the basis of the journal in which they appeared; as a result, the 1482 papers received 126 different codes. Edges were coded as directed edges in a source-target structure: for each author, every paper from year n (source) was connected to every paper from year n+1 (target). For example, all the papers that an author X published in 2015 (source) were connected to all the papers the same author published in 2016 (target); this procedure was applied to every author and every subsequent pair of years. The protocol enables us to build a graph representing research trajectories across years: it shows whether publishing in a given journal anticipates a publication in another, as well as cumulative in-degree and out-degree values. Following this protocol, we obtained 6042 individual edges between 126 nodes.
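For illustration, the following minimal sketch reproduces this edge-construction step in Python; the toy records and variable names are hypothetical and stand in for the cleaned (author, year, journal) publication data.

```python
from itertools import product

# Hypothetical records: one (author, year, journal) tuple per published paper.
papers = [
    ("Author A", 2015, "Journalism Studies"),
    ("Author A", 2015, "Journalism Practice"),
    ("Author A", 2016, "Journalism"),
    ("Author B", 2015, "Health Communication"),
    ("Author B", 2016, "Health Communication"),
]

# Group the journals of each author's papers by publication year.
by_author_year = {}
for author, year, journal in papers:
    by_author_year.setdefault((author, year), []).append(journal)

# For each author, connect every paper from year n (source) to every paper
# from year n+1 (target): one directed edge per pair of papers.
individual_edges = []
for (author, year), sources in by_author_year.items():
    targets = by_author_year.get((author, year + 1), [])
    individual_edges.extend(product(sources, targets))

print(individual_edges)
# [('Journalism Studies', 'Journalism'), ('Journalism Practice', 'Journalism'),
#  ('Health Communication', 'Health Communication')]
```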

When constructing the network, we merged edges that connected the same nodes, adding a weight to each edge equal to the number of individual edges between the corresponding pair of nodes. For example, suppose there were 15 instances in which authors published in Journalism and Mass Communication Quarterly (journal X) and then, in the next year, published in Journalism (journal Y), and 10 instances in which authors published in Journalism and then, in the next year, published in Journalism and Mass Communication Quarterly. The total weight of the edges between X and Y would then be 25, X would have an out-degree of 15 and an in-degree of 10, and Y would have an out-degree of 10 and an in-degree of 15 (see Fig. 1).

Fig. 1

Explanation of edge weighting

As a result, the 6042 individual edges were collapsed into 1780 edges with different weights. With directed edges connecting different journals, we developed a network in which edge arrows show sequential order: the edge goes from the source journal (where the author published in year n) to the target journal (where the author published in year n+1). Thus, apart from 2015, when journals can only be sources, and 2019, when journals can only be targets, each journal can be both target and source, depending on the relation in question (see Fig. 2). Network calculations, visualization and analysis were conducted with Gephi.

Fig. 2

Examples of how edges were constructed. Numbers under a given year refer to the Scimago positions of the journals in which a given author published. For every subsequent pair of years, we calculated all possible combinations
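The subsequent aggregation of these individual edges into weighted directed edges, and the weight > 5 filter used for the reduced graph described below, can be sketched as follows; this is an illustrative Python equivalent of steps we carried out in Gephi, and the toy edge list reproduces the output of the previous snippet. Applied to the example of Fig. 1, the two directed edges between journals X and Y would receive weights of 15 and 10, respectively.

```python
from collections import Counter

import networkx as nx

# Individual directed edges, as produced by the previous sketch (toy data).
individual_edges = [
    ("Journalism Studies", "Journalism"),
    ("Journalism Practice", "Journalism"),
    ("Health Communication", "Health Communication"),
]

# Collapse parallel edges between the same ordered pair of journals into a
# single directed edge whose weight is the number of individual edges.
weights = Counter(individual_edges)

G = nx.DiGraph()
for (source, target), weight in weights.items():
    G.add_edge(source, target, weight=weight)

# Weighted out-degree and in-degree per journal (sums of edge weights).
out_degree = dict(G.out_degree(weight="weight"))
in_degree = dict(G.in_degree(weight="weight"))

# Keep only the strongest trajectories (edge weight > 5) for a reduced graph.
strong_edges = [(u, v, d["weight"])
                for u, v, d in G.edges(data=True) if d["weight"] > 5]
```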

To obtain a clearer picture of the most significant trends, we developed another graph that visualizes only those edges with a weight greater than 5. This graph displays the most typical publication trajectories within the field, as measured by the publication patterns of the best-performing scholars in the sample.

Results

When we compared the scholarly output of the top 100 researchers with the total pool of researchers in SciVal (n = 500), we found that the production of the top scholars clearly stands out from the general trend (Fig. 3). To appear on the top 500 list, communication scholars had to publish at least 11 Scopus-indexed papers between 2015 and 2019; to enter the top 100, authors had to publish at least 18 papers over the same five years. On average, the most productive scholars published 10 papers more than the top 500 group as a whole; in short, the top 100 scholars published, on average, about 5 papers per year in Scopus-indexed journals. In both groups, however, the distribution of productivity is long-tailed: only a few of the top scholars published more than 40 papers in 5 years, and only a few of the top 500 published more than 18 papers in the same period.

Fig. 3

The productivity of top 100 and top 500 researchers in communication (2015–2019)

After data cleaning, we serendipitously found that 34 percent of the top 100 high-performing authors are, in one way or another, irrelevant to mainstream communication scholarship. This is a severe distortion that raises concerns over research productivity assessments in communication; its implications are considered further in the discussion section.

We also found that, contrary to the expectation that quantity might lead to lower quality, top-performing scholars not only publish the most papers but also publish in top-ranked journals. Specifically, 87 percent of all the analyzed papers were published in the first quartile of Scopus, 11 percent in the second quartile, 2 percent in the third quartile, and only 4 papers (0.2 percent) in the fourth quartile. This pattern means that 98 percent of the overall production of top-performing scholars was published in the upper half of the Scimago Journal Rank. We also found that all the authors had at least one paper in a Q1 journal, 97 percent had at least one paper in a D1 journal (representing the top 10 percent of communication journals), and 74 percent of the authors had at least one paper in one of the top 10 journals of the field. Moreover, there was a slight increase over the examined time frame in the prestige of the journals in which the top-ranked scholars published (Fig. 4).

Fig. 4

The mean position of the journals in which our authors published in a given year. Ranking numbers are reversed, so lower numbers represent higher positions

The most frequent connections were those connecting journals with themselves: it is common for an author to publish in the same journal year after year (Table 1). For example, the weight = 150 value of Communication Education means that, in 150 instances, authors published in this specific journal in two consecutive years.

Table 1 Journals with most self-loops in publication patterns

Naturally, journals with a similar focus are strongly interconnected, but there are some journals that especially tend to form publication networks (Table 2).

Table 2 Journals with most edges in publication patterns (ordered by direction)

Our calculations also showed that, while the journals with the most connections (as indicated by high total degree) have both high in-degree and high out-degree, two kinds of journals can be differentiated. In the first, in-degree exceeds out-degree, which suggests that it might be easier to publish in these journals than in journals with a higher out-degree; it may also indicate that these journals publish relatively more papers than others in the field. A typical example is Journalism Studies, which is more likely to be the target of a subsequent publication than a source from which other publications follow. In the second, out-degree exceeds in-degree. These are rather selective journals that publish a relatively small number of papers but also serve as gatekeepers, because publication in them is often followed by a large number of publications elsewhere. A typical example is Journal of Communication, with a relatively low in-degree and a high out-degree (Table 3).

Table 3 Journals with top degrees (ranked by in-degree)
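As a small illustration of this distinction, the following sketch labels each journal in a toy weighted network according to whether its weighted in-degree or out-degree dominates; the journal names, weights and labels are illustrative only, following the logic of Fig. 1.

```python
import networkx as nx

# Toy weighted, directed journal network (values are hypothetical).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("Journal X", "Journal Y", 15),
    ("Journal Y", "Journal X", 10),
    ("Journal Z", "Journal Y", 7),
])

# Classify journals by the balance of weighted in-degree and out-degree.
for journal in G.nodes:
    d_in = G.in_degree(journal, weight="weight")
    d_out = G.out_degree(journal, weight="weight")
    if d_in > d_out:
        role = "frequent target (easier to enter / larger annual output)"
    elif d_out > d_in:
        role = "gatekeeper (selective, often followed by papers elsewhere)"
    else:
        role = "balanced"
    print(f"{journal}: in={d_in}, out={d_out} -> {role}")
```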

To answer our first research question, we calculated a network of the journals in which our pool of authors published between 2015 and 2019 (Fig. 5). The graph shows a strong concentration of publications in a relatively small number of leading Q1 journals. Around the center, we can observe a denser network of other Q1 journals, while the position of Q2 and Q3 journals is peripheral and Q4 journals are almost absent.

Fig. 5

The whole graph. Red represents the top quartile (Q1), green the Q2, and green the Q3 quartile. Edge colors refer to the target journals (Color figure online)

Network properties are presented in Table 4. The publication network of the most productive scholars is relatively well connected, with a diameter of 4, meaning that the longest shortest path between any two nodes consists of four edges. Nodes have a relatively high average degree (14), and the clustering coefficient (0.497) also indicates high connectedness. Density measures how close the network is to being a complete graph: a complete graph contains all possible edges and has a density of 1. The clustering coefficient captures how strongly nodes are embedded in their neighborhood, and a high average clustering coefficient points to a “small world” effect in the network overall.

Table 4 Network properties of the full graph (nodes = 126, edges = 1780)
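For readers who wish to reproduce such measures, the following is a hedged sketch using networkx on a toy graph. Note that the average degree here follows Gephi's convention for directed graphs (edges per node), and that diameter and clustering are computed on an undirected projection without self-loops, so exact values may differ slightly from those obtained in Gephi.

```python
import networkx as nx

# Toy directed, weighted publication graph standing in for the full network
# (126 journals, 1780 weighted edges); names and values are illustrative only.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("Journal X", "Journal Y", 15),
    ("Journal Y", "Journal X", 10),
    ("Journal Z", "Journal Y", 7),
    ("Journal Y", "Journal Y", 12),   # self-loop
])

# Undirected projection without self-loops, used for diameter and clustering.
U = nx.Graph(G)
U.remove_edges_from(list(nx.selfloop_edges(U)))

n_nodes = G.number_of_nodes()
n_edges = G.number_of_edges()
avg_degree = n_edges / n_nodes          # Gephi-style average degree (directed graph)
density = nx.density(G)                 # 1.0 would mean a complete graph
avg_clustering = nx.average_clustering(U)

# Diameter: the longest shortest path, computed on the largest component.
largest_cc = max(nx.connected_components(U), key=len)
diameter = nx.diameter(U.subgraph(largest_cc))

print(f"nodes={n_nodes}, edges={n_edges}, avg degree={avg_degree:.1f}, "
      f"density={density:.3f}, clustering={avg_clustering:.3f}, diameter={diameter}")
```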

To answer our second research question, we developed a reduced version of the network by removing journals with a degree below 6. The resulting graph shows the main hubs in the publication network (Fig. 6). Of these hubs, the “journalism hub” is the most salient (Fig. 7), but three additional hubs should also be discussed. The first can be labeled the “education hub”, since it comprises journals focused on communication education, teaching and education research. The main journal of this hub is Communication Education, which has a strong tie to Communication Research Reports (weight = 94); this tie connects the education hub to significant journals beyond education, such as Communication Research and Health Communication. These connections show that communication education is a field with strong ties to quantitative research, but, apart from a relatively thin tie to the top-ranked journal Communication Research, this network consists of journals from the lower echelons of the Q1 list. Another hub consists of journals with a specific focus on public relations and advertising; these journals rank highly on the Scimago list, including the Journal of Advertising (2/445), Public Relations Review (40/445), the International Journal of Advertising (15/445) and the International Journal of Strategic Communication (54/445).

Fig. 6

Cleaned network with edge weight over 5. Arrow size represents in-degree

Fig. 7

Journalism hub with edge weight over 5. Arrow size represents in-degree

The “journalism hub” is the most salient cluster in the graph with well-connected journals (Fig. 7). This means that this specific discipline is extremely popular amongst the best performing scholars in communication and media studies. Three main journals – Journalism, Journalism Practice and Journalism Studies – constitute the center of the hub, but New Media and Society and Digital Journalism are also well embedded in the network. The strongest tie connects Journalism Studies to Journalism Practice (weight = 77), followed by the edges between Journalism and Journalism Studies (weight = 66) and between Journalism and Journalism Practice (weight = 56). The most typical directed edge goes from Journalism Practice to Journalism Studies (directed weight = 43), followed by the edge from Journalism to Journalism Studies (directed weight = 36). Journalism Studies is also the most popular target from Digital Journalism (directed weight = 33) and from New Media and Society (directed weight = 24).

Discussion and Conclusions

Within the framework of global knowledge production in communication [10], both the productivity of individual scholars and the prestige positions of the field’s academic journals have been widely investigated [2, 34, 46]. An extensive body of research deals with the content analysis of peer-reviewed international journals [3, 29], citation networks [38, 47], and the topical fragmentation of the field [40, 41]. However, a detailed analysis of the actual publication trajectories of the best-performing researchers, and an inductive determination of their most popular publication outlets, is still missing. Assuming that top-performing scholars might serve as role models for the whole scientific community [15], this paper scrutinizes the publication trajectories, the most popular publication outlets and the main thematic clusters (research hubs) of the most productive communication scholars. Our results provide two general contributions to the ongoing discussion of publication trends, productivity, and research assessment within communication studies and beyond.

First, when research excellence is discussed in the literature, the question of quality versus quantity frequently emerges. Common sense would suggest that publishing fewer papers should correlate with higher quality, since scholars with an infrequent publication pace can dedicate more time to their research and can consequently publish richer and more sophisticated papers. If this were the case, empirical evidence should show that highly productive scholars publish in less selective journals. However, several studies refute this expectation by providing empirical evidence that higher productivity is positively associated with higher impact, as measured by citation counts. In line with this research, we scrutinized the publication strategies of the most productive scholars in communication, and our empirical analysis fully supports the assumption that quantity goes hand in hand with quality.

Specifically, almost 90 percent of the papers by top authors were published in the top (Q1) quartile of Scopus, while a minimal proportion (less than 1 percent) appeared in the lowest (Q4) quartile. Moreover, all the top-performing authors had at least one paper in a Q1 journal, 97 percent had at least one paper in a D1 journal (representing the top 10 percent of communication journals), and three-quarters of the authors had at least one paper in one of the top 10 journals in the field (representing the top 2 percent of all indexed periodicals). The results therefore show not only that all the most productive authors were able to publish at the top level, but also that most of them were able to publish in the most selective journals of the field.

When compiling the pool of the most productive scholars, we serendipitously encountered two challenging distortions. Accordingly, our second contribution lies in demonstrating how research assessments can be misleading when scholars take scientometric data at face value without careful scrutiny. The first type of distortion relates to categorization itself, since very different branches of research can be classified under the single label of “communication”. Specifically, scholars in engineering and computer science, who typically work on information-processing topics, are classified as communication scholars even though they are not. Not surprisingly, the number of engineers amongst the top 100 scholars was significant (n = 29). This pattern may be explained by the fact that engineers publish not only in journals but also in Scopus-indexed conference proceedings [48].

Communication engineers form an academic network distinct from communication studies. This assumption is backed up by two observations: 1) none of the top 100 scholars who publish in engineering journals and conference proceedings has ever published in communication journals within the social sciences, and vice versa; and 2) no prior scholarship dealing with the classification of scholarly fields within communication considers communication engineering a subfield [2, 3, 34, 35, 38, 41].

A further contribution of this paper relates to the research tradition that investigates the so-called fragmentation [41] or balkanization [40] of communication studies. Our results show robust self-loops in the publication trajectories of several high-profile journals. Self-loops refer to cases where authors publish in the same journal repeatedly; when they are strongly present, this may indicate that a notable number of platforms publish specific research that is not typically published elsewhere.

However, explanations of strong self-loops should not be simplistic, since the phenomenon has several potential causes. For example, in the case of Communication Sciences and Disorders, self-loops might be due to the fact that the journal is published in Korea and, according to Scopus data, 97 percent of the papers it publishes are written by Korean scholars. In this case, cultural and regional impermeability can explain the self-loops of both the journal and its most productive authors. In our sample, the authors who publish in this journal might be very productive, but their production is limited to intensive publication in Communication Sciences and Disorders, without publishing elsewhere. Another example is Communication Education: while 90 percent of its authors work in the US, it is the journal’s narrow topical focus, rather than geopolitical closedness, that most likely explains its self-loops.

Finally, there are instances where self-loops might be an indicator of selectivity bias; in other words, some journals publish many more papers than similar journals in the field. This is the case for Health Communication, which publishes 5 or 6 times more papers annually than the Journal of Health Communication, and for the International Journal of Advertising, which publishes, on average, twice as many papers as the Journal of Advertising. Consequently, less selective journals tend to form self-loops because authors prefer them, given the greater likelihood of getting a paper published there compared with more selective periodicals.

Besides the presence of self-loops, relatively autonomous hubs also point to the balkanization of the field [40]. Our results show that the publication trajectories of the most productive scholars form four interconnected networks, of which three are relatively separate hubs and one is more embedded. The first relatively separate hub is formed around Health Communication, a journal that also has very strong self-loops; it is strongly interconnected with the International Journal of Health Communication, a less selective journal with the same focus. More interestingly, it also has relatively tight links to the top-tier journal Human Communication Research, most likely because both focus on human communication. While both human communication and health communication take the human agent as their focal point, earlier studies typically do not mention human communication as a distinct cluster in communication research. Waisbord [41], for example, enumerates several research clusters that deal with human communication, but his classification is based on the type, the mode, the medium or the aim of communication, not on the main agents involved.

Another relatively autonomous cluster developed around Communication Education, which has strong ties with Communication Teacher, a journal with a similar focus (both are published by the National Communication Association). Interestingly, several journals with a more general focus are also connected to Communication Education, and it is noteworthy that all of them are published by one of the regional American communication associations: Western Journal of Communication by the Western States Communication Association; Communication Research Reports, Communication Quarterly and Qualitative Research Reports in Communication by the Eastern States Communication Association; and Communication Studies by the Central States Communication Association. The existence of this cluster can thus be explained by the fact that these communication associations are committed not just to communication research but also to communication education, and their journals are therefore open to papers on this specific topic.

Journals that focus on advertising and public relations form another distinct cluster with two main journals, the Journal of Advertising and the International Journal of Advertising, alongside the Journal of Advertising Research. Through the International Journal of Advertising, there is a strong connection with a relatively separate sub-cluster with a public relations focus; this sub-cluster is formed around Public Relations Review and involves journals such as the International Journal of Strategic Communication and the Journal of Communication Management.

Finally, the most complex cluster focuses on journalism and media studies. In terms of publication outlets for the most productive scholars, this is the most important and most extensive cluster within the field. It contains more than a dozen high-profile journals with strong ties between them, and the cluster is also connected with most journals with different research foci. Four journals occupy a salient position within the cluster, namely Journalism, Journalism Studies, Journalism Practice and Digital Journalism. Their very strong connections suggest that authors who publish in the field of journalism most likely publish in these journals alternately, and that the most productive authors within this specific field publish in all of them.

A fifth journal, New Media and Society, should also be mentioned, although its position is slightly different, since its ties to the other four journals are looser, the strongest connections being to Journalism Studies and Digital Journalism. It is noteworthy that the oldest journal in the field, Journalism and Mass Communication Quarterly, is not as salient a part of this cluster as its research focus might suggest: it has ties, albeit low-weighted ones, to Journalism and Journalism Studies, but not to other important journals of the field such as Journalism Practice or New Media and Society.

It is also noteworthy that, while the journalism hub has extensive connections to journals beyond this specific hub, it is relatively closed when only primary relations are considered (Fig. 8). In this case, the journalism hub consists of only 14 journals; all other journals connect to the hub through general journals such as the International Journal of Communication and Communication Research. Moreover, Global South journals such as the Chinese Journal of Communication, the Asian Journal of Communication and African Journalism Studies, and their Latin American counterparts such as Brazilian Journalism Research or Comunicacion y Sociedad Mexico, connect to the hub through relatively weak ties. Thus, when we consider only those journals that have at least 5 primary connections to the journalism hub, all Global South journals disappear (Fig. 8).

Fig. 8

The journalism hub with primary connections only. Arrow size represents in-degree

Besides outlining the main clusters in which the most productive scholars publish, our analysis also offers information on possible trajectories between journals. Clusters indicate in which journals a specific kind of author publishes regularly, and directed edges indicate the most typical order of publication. Journals with higher in-degree but lower out-degree are those in which it is easier to publish than in other journals with a similar focus; the position of these target journals can be explained either by their higher annual number of publications or by their lower ranking position (two features that are typically, but not always, interrelated). The opposite holds for periodicals with higher out-degree than in-degree values, such as the Journal of Communication. These assumptions are nevertheless tentative, since we have information only on the number of published articles, not on the number of submissions. Still, if we hypothesize that authors’ intention to publish is the same for all journals, and assuming similar ranking positions and selectivity, then, within the same field, it is easier to publish in those journals that publish more papers.

In conclusion, given the growing importance of productivity as a currency for assessing research excellence, this study provides an overview of the publication trajectories of the most productive communication scholars. We found that the scholars who publish the most also publish in the best journals; thus the “publish or perish” paradigm calls not only for quantity but also for quality. We also found that the balkanization of the field can be spotted not just at a general level but also at the level of the most productive scholars. As opposed to previous studies that build their analyses on a predetermined set of journals, we identified the most salient journals in which leading scholars publish. The results support the relevance of this approach in two respects.

First, we found that top scholars generally publish in periodicals that are considered top-tier journals; our analysis thus confirms the appropriateness of journal rankings in communication studies. Second, our analysis uncovers some distortions in current methods of research assessment stemming from categorization problems. Based on our results, we suggest that when evaluating scientific performance, special care should be taken to examine not only scientometric records but also individual publication trajectories, which should be compared with the general publication pattern of the discipline.

Limitations and Further Studies

Several limitations of this analysis are noteworthy. First, we decided to use SciVal to identify the most productive scholars, which means that we counted only papers published in Scopus-indexed journals. This selection has obvious limitations, since we might have obtained a different pool of authors had we used another tool such as Web of Science’s InCites, JCR or Google Scholar. However, JCR is frequently criticized for its strong bias in favor of the English language, Western topics, and quantitative approaches [46], while Google Scholar does not yet offer an appropriate scientometric tool for an analysis of this kind. We suggest that future studies conduct similar analyses to determine the most typical publication outlets of the best-performing scholars, instead of working with a predetermined set of well-known periodicals. Second, we decided to rank authors by the number of published papers rather than by other scientometric values such as the Hirsch index, citation counts, altmetric measures or field-weighted citation counts. For our purposes, it was reasonable to work with the number of publications, since it represents the productivity of the selected scholars, whereas the other measures tend to capture impact; moreover, SciVal ranks top authors by their productivity and not by impact metrics. Future studies could, however, analyze the publication trajectories of the scholars with the greatest impact, which might lead to different results.