Abstract
Cognitive ability is one of the best predictors of performance on the job, and past research has seemingly converged on the idea that narrow cognitive abilities do not add incremental validity over general mental ability (GMA) for predicting job performance. In the present study, we propose that the reason for the lack of incremental validity in previous research is that the narrow cognitive abilities that have been assessed most frequently are also the abilities that are most highly correlated with GMA. Therefore, we expect that examining a broader range of narrow cognitive abilities that are less highly correlated with GMA will demonstrate incremental validity for narrow abilities. To examine this prediction, we conducted an updated meta-analysis of the relationship between cognitive ability and a multidimensional conceptualization of job performance (task performance, training performance, organizational citizenship behavior, counterproductive work behavior, withdrawal). Using several different methods of analyzing the data, results indicated that the narrow cognitive abilities that are the least highly correlated with GMA added substantial incremental validity for predicting task performance, training performance, and organizational citizenship behavior. These results have important implications for the assessment of cognitive ability and the employee selection process.
Notes
Here, we differentiate between overall job performance and specific dimensions of job performance. Overall job performance is the higher-order construct while the specific dimensions of job performance are the narrower forms of performance such as task performance, training performance, OCB, CWB, and withdrawal. In the present study, we focus on the specific dimensions of performance to examine differences in the prediction of these behaviors rather than the higher-order performance construct.
To estimate the intercorrelations, we examined the correlation matrices provided in the original studies in our database. If at least two different cognitive abilities were measured in the same study, we coded the intercorrelations between those abilities when they were reported in the article. For studies that did not report these intercorrelations, we searched the technical manuals (if available) for the cognitive ability measure that was used and included the correlations reported in the manuals whenever possible. We identified 24 studies that reported correlations between two or more cognitive abilities. In addition, to supplement these 24 studies, we also included information reported by Carroll (1993), which is the most comprehensive and widely used study on the structure of cognitive ability to date. Carroll (1993) reported results from 135 additional samples that we used to calculate correlations between cognitive abilities. These correlations were then corrected for range restriction but not unreliability because we were interested in the operational validities. The procedures for correcting these correlations are described in the online supplemental material.
We could not estimate latent factors for the narrow abilities because we did not have item-level data in our meta-analytic database.
To calculate the meta-analytic correlations, the relevant regression weights for different conditions would simply be added to the intercept to estimate the overall effect size. For example, the meta-analytic estimate of the correlation between GMA and an objective measure of training performance is equal to .23 (intercept) + .02 (weight for GMA) + .10 (weight for training performance) + .14 (weight for objective performance) = .49 (see Table 4).
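The arithmetic in this note can be sketched as a small helper; the weight values are taken directly from the worked example above (see Table 4), while the function itself is a hypothetical illustration, not the authors' code:

```python
# Illustrative sketch of reconstructing a meta-analytic correlation from the
# meta-regression intercept plus the weights for the active conditions.
def meta_analytic_r(intercept, *weights):
    """Sum the intercept and the regression weights for the conditions that apply."""
    return round(intercept + sum(weights), 2)

# GMA predicting an objective measure of training performance:
# intercept + weight(GMA) + weight(training performance) + weight(objective criterion)
r = meta_analytic_r(0.23, 0.02, 0.10, 0.14)
print(r)  # 0.49
```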
To determine how much of an effect the smaller SD ratio had on the results of the present study, we re-ran our analyses using the smaller SD ratio to determine the meta-analytic correlation between GMA and performance. These exploratory results indicated that the meta-analytic correlation between GMA and task performance was .25 for subjective criteria and .40 for objective criteria. In addition, the meta-analytic correlation between GMA and training performance was .36 for subjective criteria and .51 for objective criteria. Despite these stronger correlations, we continue to use the SD ratio of .89 to correct for range restriction in all subsequent analyses because that correction corresponded to the data from the studies incorporated in this meta-analysis.
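For context on why the choice of SD ratio matters, the standard Case II (Thorndike) formula for correcting a correlation for direct range restriction can be sketched as follows. This is the generic textbook correction, not necessarily the exact procedure the authors applied (their procedures are described in the online supplemental material), and the observed r of .30 in the example is hypothetical:

```python
import math

def correct_range_restriction(r, u):
    """Case II (Thorndike) correction for direct range restriction.

    r: observed (range-restricted) correlation.
    u: SD ratio (restricted SD / unrestricted SD); smaller u means more restriction.
    """
    return r / math.sqrt(u**2 + r**2 * (1 - u**2))

# With the SD ratio of .89 used in the present analyses, a hypothetical
# observed r of .30 corrects upward only modestly:
print(round(correct_range_restriction(0.30, 0.89), 2))  # 0.33
```

A smaller SD ratio (more severe restriction) yields a larger upward correction, which is why the exploratory analyses with the smaller ratio produced stronger meta-analytic correlations.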
Although estimating a latent factor using these three narrow abilities may be methodologically preferable to correlating errors, we used correlated errors so that we could examine the incremental validity of quantitative knowledge, reading and writing, and general knowledge separately in this model.
References
Ackerman, P. L. (1996). A theory of adult intellectual development: Process, personality, interests, and knowledge. Intelligence, 22, 227–257.
Alvarez, K. M., & Hulin, C. L. (1972). Two explanations of temporal changes in ability-skill relationships: A literature review and theoretical analysis. Human Factors, 14, 295–308.
Bertua, C., Anderson, N., & Salgado, J. F. (2005). The predictive validity of cognitive ability tests: A UK meta-analysis. Journal of Occupational and Organizational Psychology, 78, 387–409.
Bommer, W. H., Johnson, J. L., Rich, G. A., Podsakoff, P. M., & MacKenzie, S. B. (1995). On the interchangeability of objective and subjective measures of employee performance: A meta-analysis. Personnel Psychology, 48, 587–605.
Borman, W. C., & Motowidlo, S. J. (1997). Task performance and contextual performance: The meaning for personnel selection research. Human Performance, 10, 99–109.
Brown, K. G., Le, H., & Schmidt, F. L. (2006). Specific aptitude theory revisited: Is there incremental validity for training performance? International Journal of Selection and Assessment, 14, 87–100.
Call, M. L., Nyberg, A. J., & Thatcher, S. (2015). Stargazing: An integrative conceptual review, theoretical reconciliation, and extension for star employee research. Journal of Applied Psychology, 100, 623–640.
Campbell, J. P. (2012). Behavior, performance, and effectiveness in the twenty-first century. The Oxford Handbook of Organizational Psychology, 1, 159–194.
Carpenter, N. C., & Berry, C. M. (2017). Are counterproductive work behavior and withdrawal empirically distinct? A meta-analytic investigation. Journal of Management, 43, 834–863.
Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. Cambridge University Press.
Cattell, R. B. (1943). The measurement of adult intelligence. Psychological Bulletin, 40, 153–193.
Cattell, R. B. (1971). Abilities: Their structure, growth, and action. Houghton Mifflin.
Cochran, W. G. (1977). Sampling techniques (3rd ed.). Wiley.
Conway, J. M., & Huffcutt, A. I. (1997). Psychometric properties of multisource performance ratings: A meta-analysis of subordinate, supervisor, peer, and self-ratings. Human Performance, 10, 331–360.
Dilchert, S., Ones, D. S., Davis, R. D., & Rostow, C. D. (2007). Cognitive ability predicts objectively measured counterproductive work behaviors. Journal of Applied Psychology, 92, 616–627.
Drasgow, F. (2013). Intelligence and the workplace. In N. W. Schmitt & S. Highhouse (Eds.), Handbook of psychology: Industrial and organizational psychology, Vol. 12 (pp. 184–210). John Wiley & Sons Inc.
Fabrigar, L. R., Wegener, D. T., MacCallum, R. C., & Strahan, E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychological Methods, 4, 272–299.
Gignac, G., & Szodorai, E. (2016). Effect size guidelines for individual differences researchers. Personality and Individual Differences, 102, 74–78.
Gonzalez-Mulé, E., & Aguinis, H. (2018). Advancing theory by assessing boundary conditions with metaregression: A critical review and best-practice recommendations. Journal of Management, 44, 2246–2273.
Gonzalez-Mulé, E., Mount, M. K., & Oh, I. S. (2014). A meta-analysis of the relationship between general mental ability and nontask performance. Journal of Applied Psychology, 99, 1222–1243.
Gruys, M. L., & Sackett, P. R. (2003). Investigating the dimensionality of counterproductive work behavior. International Journal of Selection and Assessment, 11, 30–42.
Hambrick, D. Z., Oswald, F. L., Altmann, E. M., Meinz, E. J., Gobet, F., & Campitelli, G. (2014). Deliberate practice: Is that all it takes to become an expert? Intelligence, 45, 34–45.
Hoerl, A. E., & Kennard, R. W. (1970). Ridge regression: Applications to nonorthogonal problems. Technometrics, 12, 69–82.
Horn, J. L., & Blankson, N. (2005). Foundations for better understanding of cognitive abilities. In D. P. Flanagan & P. L. Harrison (Eds.), Contemporary intellectual assessment: Theories, tests, and issues (2nd ed., pp. 41–68). Guilford Press.
Humphreys, L. G. (1994). Intelligence from the standpoint of a (pragmatic) behaviorist. Psychological Inquiry, 5, 179–192.
Hunter, J. E. (1986). Cognitive ability, cognitive aptitudes, job knowledge, and job performance. Journal of Vocational Behavior, 29, 340–362.
Hunter, J. E., & Hunter, R. F. (1984). Validity and utility of alternate predictors of job performance. Psychological Bulletin, 96, 72–98.
Hunter, J. E., & Schmidt, F. L. (2004). Methods of meta-analysis: Correcting error and bias in research findings (2nd ed.). Sage.
Hunter, J. E., Schmidt, F. L., & Le, H. (2006). Implications of direct and indirect range restriction for meta-analysis methods and findings. Journal of Applied Psychology, 91, 594–612.
Jackson, D. J., Putka, D. J., & Teoh, K. R. (2015). The first principal component of multifaceted variables: It’s more than a G thing. Industrial and Organizational Psychology, 8, 446–452.
Jensen, A. R. (1998). The g factor: The science of mental ability. Praeger.
Johnson, J. W. (2000). A heuristic method for estimating the relative weight of predictor variables in multiple regression. Multivariate Behavioral Research, 35, 1–19.
Krumm, S., Schmidt-Atzert, L., & Lipnevich, A. A. (2014). Specific cognitive abilities at work. Journal of Personnel Psychology, 13, 117–122.
Lang, J. W., Kersting, M., Hülsheger, U. R., & Lang, J. (2010). General mental ability, narrower cognitive abilities, and job performance: The perspective of the nested-factors model of cognitive abilities. Personnel Psychology, 63, 595–640.
Lubinski, D. (2006). Ability tests. In M. Eid & E. Diener (Eds.), Handbook of multimethod measurement in psychology (pp. 101–114). American Psychological Association.
Maltarich, M. A., Nyberg, A. J., & Reilly, G. (2010). A conceptual and empirical analysis of the cognitive ability–voluntary turnover relationship. Journal of Applied Psychology, 95, 1058–1070.
McGrew, K. S. (1997). Analysis of the major intelligence batteries according to a proposed comprehensive Gf–Gc framework. In D. P. Flanagan, J. L. Genshaft, & P. L. Harrison (Eds.), Contemporary intellectual assessment: Theories, tests, and issues (pp. 151–179). Guilford.
McGrew, K. S. (2009). CHC theory and the human cognitive abilities project: Standing on the shoulders of the giants of psychometric intelligence research. Intelligence, 37, 1–10.
Mount, M. K., Oh, I. S., & Burns, M. (2008). Incremental validity of perceptual speed and accuracy over general mental ability. Personnel Psychology, 61, 113–139.
Murphy, K. R. (2008). Explaining the weak relationship between job performance and ratings of job performance. Industrial and Organizational Psychology, 1, 148–160.
Murphy, K. R. (2009). Content validation is useful for many things, but validity isn’t one of them. Industrial and Organizational Psychology, 2, 453–464.
Murphy, K., Dzieweczynski, J. L., & Zhang, Y. (2009). Positive manifold limits the relevance of content-matching strategies for validating selection test batteries. Journal of Applied Psychology, 94, 1018–1031.
Nye, C. D., Su, R., Rounds, J., & Drasgow, F. (2017). Interest congruence and performance: Revisiting recent meta-analytic findings. Journal of Vocational Behavior, 98, 138–151.
Nye, C. D., Chernyshenko, O. S., Stark, S., Drasgow, F., Phillips, H. L., Phillips, J. B., & Campbell, J. S. (2020). More than g: Evidence for the incremental validity of performance-based assessments for predicting training performance. Applied Psychology, 69, 302–324.
Ones, D. S., Dilchert, S., & Viswesvaran, C. (2012). Cognitive abilities. In N. Schmitt (Ed.), The Oxford handbook of personnel assessment and selection (pp. 179–224). Oxford University Press.
Organ, D. W., Podsakoff, P. M., & MacKenzie, S. B. (2006). Organizational citizenship behavior: Its nature, antecedents, and consequences. Sage.
Ree, M. J., & Carretta, T. R. (2002). g2k. Human Performance, 15, 3–23.
Ree, M. J., & Earles, J. A. (1991). Predicting training success: Not much more than g. Personnel Psychology, 44, 321–332.
Ree, M. J., Earles, J. A., & Teachout, M. S. (1994). Predicting job performance: Not much more than g. Journal of Applied Psychology, 79, 518–524.
Richman, W. L., Kiesler, S., Weisband, S., & Drasgow, F. (1999). A meta-analytic study of social desirability distortion in computer-administered questionnaires, traditional questionnaires, and interviews. Journal of Applied Psychology, 84, 754–775.
Salgado, J. F., Anderson, N., Moscoso, S., Bertua, C., De Fruyt, F., & Rolland, J. P. (2003). A meta-analytic study of general mental ability validity for different occupations in the European community. Journal of Applied Psychology, 88, 1068–1081.
Schmidt, F. L. (2002). The role of general cognitive ability and job performance: Why there cannot be a debate. Human Performance, 15, 187–210.
Schmidt, F. L. (2014). A general theoretical integrative model of individual differences in interests, abilities, personality traits, and academic and occupational achievement: A commentary on four recent articles. Perspectives on Psychological Science, 9, 211–218.
Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124, 262–274.
Schmidt, F. L., Hunter, J. E., & Outerbridge, A. N. (1986). Impact of job experience and ability on job knowledge, work sample performance, and supervisory ratings of job performance. Journal of Applied Psychology, 71, 432–439.
Schmitt, N. (2014). Personality and cognitive ability as predictors of effective performance at work. Annual Review of Organizational Psychology and Organizational Behavior, 1, 45–65.
Schneider, W. J., & Newman, D. A. (2015). Intelligence is multidimensional: Theoretical review and implications of specific cognitive abilities. Human Resource Management Review, 25, 12–27.
Stanhope, D. S., & Surface, E. A. (2014). Examining the incremental validity and relative importance of specific cognitive abilities in a training context. Journal of Personnel Psychology, 13, 146–156.
Sturman, M. C., Cheramie, R. A., & Cashen, L. H. (2005). The impact of job complexity and performance measurement on the temporal consistency, stability, and test-retest reliability of employee job performance ratings. Journal of Applied Psychology, 90, 269–283.
Tett, R. P., Jackson, D. N., & Rothstein, M. (1991). Personality measures as predictors of job performance: A meta-analytic review. Personnel Psychology, 44, 703–742.
Tonidandel, S., & LeBreton, J. M. (2015). RWA web: A free, comprehensive, web-based, and user-friendly tool for relative weight analyses. Journal of Business and Psychology, 30, 207–216.
Wee, S., Newman, D. A., & Joseph, D. L. (2014). More than g: Selection quality and adverse impact implications of considering second-stratum cognitive abilities. Journal of Applied Psychology, 99, 547–563.
Woodcock, R. W., & Johnson, M. B. (1989). Woodcock-Johnson Psycho-Educational Battery—Revised. Riverside.
Nye, C.D., Ma, J. & Wee, S. Cognitive Ability and Job Performance: Meta-analytic Evidence for the Validity of Narrow Cognitive Abilities. J Bus Psychol 37, 1119–1139 (2022). https://doi.org/10.1007/s10869-022-09796-1