Abstract

To follow best practices in creating selection tools, an important phase in job analysis is gathering linkage ratings between knowledge, skills, abilities, and other characteristics (KSAOs) and job tasks. However, the literature provides little guidance on best practices for collecting linkage ratings. Two studies were conducted to contribute to this limited research. Study 1 examined the interrater agreement of different types of raters: job incumbents, managers, and job analysts. Results revealed that job analysts had the highest interrater agreement. Study 2 examined the impact of frame-of-reference (FOR) training for raters. Results suggested that a brief consensus training session substantially improved agreement among raters. Together, these studies provide further guidance on best practices for obtaining high-quality linkage ratings.