Abstract
Recent research has uncovered significant concerns about the validity of some types of college student self-reports. This study examines the extent to which student reports about a critical type of college experience—good teaching practices—may be biased as a function of students’ intellectual orientations and cognitive reasoning abilities. Perceptions of instruction and instructional practices are especially important in higher education, given their increasing use for institutional quality assurance, as well as faculty rehiring and promotion processes. Using a large, multi-institutional, longitudinal dataset of first-year students, this study shows that several cognitive indicators predict perceptions of six different sets of good teaching practices and that these relationships do not seem to be explained by actual differences in students’ experiences. Additional analyses indicate that halo effects, in which global evaluations of instructor quality and institutional satisfaction affect students’ perceptions of their engagement with good practices, may partially explain these findings. The results offer important implications for practice and research related to college student survey data, including ways that these biases can be reduced or eliminated to more accurately capture students’ engagement in good practices and the factors that may contribute to students’ perceptions of their environment.
Appendices
Appendix A Variable descriptions, descriptive statistics, and reliabilities for ACT sample (N = 6531).
Variable | Definition | Mean | Standard deviation | Min | Max |
---|---|---|---|---|---|
Intellectual orientations and cognitive reasoning abilities | |||||
Precollege academic motivation | Mean-based scale measuring academic motivation; 8-item scale, α = 0.69; standardized | 0.00 | 1.00 | −4.47 | 2.50 |
Precollege need for cognition | The degree to which one enjoys engaging in effortful cognitive activities; 18-item scale, α = 0.89; standardized | 0.00 | 1.00 | −3.82 | 2.50 |
Precollege ability (ACT or equivalent) | Composite ACT or SAT equivalent score converted to an ACT metric; standardized | 0.00 | 1.00 | −3.77 | 2.22 |
Precollege critical thinking | Critical thinking skills; 32-item scale, α = 0.81–0.82; standardized | 0.00 | 1.00 | −2.85 | 1.94 |
Good teaching practices | |||||
Student-faculty interactions | Students’ perceptions of faculty's interest and willingness to interact with students outside the classroom and the frequency with which students interacted with faculty outside the classroom; 9-item scale, α = 0.77; standardized | 0.00 | 1.00 | −3.67 | 2.85 |
Active learning | Students’ perceptions of the frequency that faculty used various active learning techniques (including class discussions, class presentations, and assignments that required critique of an argument); 10-item scale, α = 0.77; standardized | 0.00 | 1.00 | −3.63 | 2.34 |
Collaborative learning | Students' perceptions of the frequency that faculty integrated collaborative learning approaches inside and outside of class (i.e., group projects, study groups); 7-item scale, α = 0.71; standardized | 0.00 | 1.00 | −3.50 | 2.63 |
Prompt feedback | Students’ perceptions that faculty provided timely written or oral feedback and evaluated student learning informally in the classroom; 3-item scale, α = 0.67; standardized | 0.00 | 1.00 | −3.00 | 2.27 |
Time on task | Students’ perceptions that instruction was relevant, organized, and helpful for achieving clearly defined course goals and the amount of effort students put forth on studying as a result of institutional and instructor expectations; 8-item scale, α = 0.72; standardized | 0.00 | 1.00 | −5.91 | 2.49 |
High expectations | Students’ perceptions of the frequency that faculty challenged students intellectually, especially through the use of techniques requiring higher-order thinking (i.e., applying, critiquing, and/or arguing); 5-item scale, α = 0.81; standardized | 0.00 | 1.00 | −3.36 | 2.00 |
Potential mechanisms | |||||
Perceptions of instructor quality | Generalized assessment of teaching ability and interest in teaching among instructors; 2-item scale, α = 0.89; standardized | 0.00 | 1.00 | −3.94 | 1.41 |
Overall college satisfaction | Overall satisfaction with educational experience at this institution; 2-item scale, α = 0.72; standardized | 0.00 | 1.00 | −3.68 | 1.10 |
Student background characteristics | |||||
Race/Ethnicity: Black; African American | 0 = no; 1 = yes | 0.09 | 0.28 | 0.00 | 1.00 |
Race/Ethnicity: Asian; Pacific Islander | 0 = no; 1 = yes | 0.05 | 0.23 | 0.00 | 1.00 |
Race/Ethnicity: Latinx | 0 = no; 1 = yes | 0.05 | 0.21 | 0.00 | 1.00 |
Race/Ethnicity: Other; Race/Ethnicity Unknown | 0 = no; 1 = yes | 0.03 | 0.18 | 0.00 | 1.00 |
Sex: Male | 0 = no; 1 = yes | 0.38 | 0.48 | 0.00 | 1.00 |
First-Generation Student (no parent attended college) | 0 = no; 1 = yes | 0.11 | 0.31 | 0.00 | 1.00 |
H.S. Interactions with Teachers | Frequency of interacting with teachers outside of class during high school; (1 = never, to 5 = very often); standardized | 0.00 | 1.00 | −2.06 | 0.49 |
First-year college experiences | |||||
Paradigmatic development of courses taken | 0.00 | 1.00 | −1.48 | 2.68 | |
Intended major: biological sciences | 0 = no; 1 = yes | 0.11 | 0.31 | 0.00 | 1.00 |
Intended major: business | 0 = no; 1 = yes | 0.11 | 0.32 | 0.00 | 1.00 |
Intended major: education | 0 = no; 1 = yes | 0.07 | 0.26 | 0.00 | 1.00 |
Intended major: engineering | 0 = no; 1 = yes | 0.05 | 0.21 | 0.00 | 1.00 |
Intended major: physical science | 0 = no; 1 = yes | 0.06 | 0.24 | 0.00 | 1.00 |
Intended major: professional | 0 = no; 1 = yes | 0.11 | 0.32 | 0.00 | 1.00 |
Intended major: social science | 0 = no; 1 = yes | 0.17 | 0.37 | 0.00 | 1.00 |
Courses: diverse cultures and perspectives (e.g., ethnic studies) | Number of courses taken focusing on diverse cultures and perspectives (1 = 0 courses, to 5 = 4 or more courses) | 0.00 | 1.00 | −0.71 | 3.67 |
Courses: women’s/gender studies | Number of courses taken focusing on women’s/gender studies (1 = 0 courses, to 5 = 4 or more courses) | 0.00 | 1.00 | −0.42 | 5.86 |
Courses: focus on issues of equality and/or social justice | Number of courses taken focusing on issues of equality and/or social justice (1 = 0 courses, to 5 = 4 or more courses) | 0.00 | 1.00 | −0.67 | 4.14 |
Honors program or college | 0 = no; 1 = yes | 0.16 | 0.37 | 0.00 | 1.00 |
First-year seminar | 0 = no; 1 = yes | 0.67 | 0.47 | 0.00 | 1.00 |
Learning community | 0 = no; 1 = yes | 0.32 | 0.46 | 0.00 | 1.00 |
Service learning | 0 = no; 1 = yes | 0.45 | 0.59 | 0.00 | 1.00 |
Undergraduate research | 0 = no; 1 = yes | 0.05 | 0.22 | 0.00 | 1.00 |
College grades | Students’ self-reported grades (1 = C- or lower, to 8 = A) | 0.00 | 1.00 | −3.24 | 1.22 |
Hours working for pay | How many hours per week students spend working for pay (1 = 0 h to 8 = more than 30 h) | 0.00 | 1.00 | −0.68 | 8.12 |
Appendix B Items included in each scale of good teaching practices
Faculty-student interactions (α = 0.76) |
---|
Most faculty with whom I have had contact are genuinely interested in studentsa |
Most faculty with whom I have had contact are willing to spend time outside of class to discuss issues of interest and importance to studentsa |
Most faculty with whom I have had contact are interested in helping students grow in more than just academic areasa |
During current school year, how often have you discussed grades or assignments with an instructor?b |
During current school year, how often have you talked about career plans with a faculty member or advisor?b |
During current school year, how often have you discussed ideas from readings or classes with faculty members outside of class?b |
During current school year, how often have you worked with faculty members on activities other than coursework (committees, orientation, student life activities, etc.)?b |
Indicate the extent to which you agree/disagree that I am satisfied with the opportunities to meet and interact informally with faculty membersa |
Collaborative learning (α = 0.70) |
---|
In my classes, students taught each other in addition to faculty teachingc |
Faculty encouraged me to participate in study groups outside of classc |
I have participated in one or more study group(s) outside of classc |
In your experience at your institution during the current school year, about how often have you worked with classmates outside of class to prepare class assignments?b |
Active learning (α = 0.73) |
---|
In your experience at your institution during the current school year, about how often have you asked questions in class or contributed to class discussions?b |
In your experience at your institution during the current school year, about how often have you made a class presentation?b |
During the current school year, how much has your coursework emphasized synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships?d |
During the current school year, how much has your coursework emphasized making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions?d |
How often have exams or assignments required me to write essaysc |
How often have exams or assignments required me to use course content to address a problem not presented in the coursec |
How often have exams or assignments required me to compare or contrast topics or ideas from a coursec |
How often have exams or assignments required me to point out the strengths and weaknesses of a particular argument or point of viewc |
How often have exams or assignments required me to argue for or against a particular point of view and defend my argument?c |
Time on task (α = 0.72) |
---|
About how many hours in a typical week do you spend preparing for class (studying, reading, writing, doing homework or lab work, analyzing data, rehearsing, and other academic activities)?f |
Frequency that faculty gave assignments that helped in learning the course materialc |
Frequency that class time was used effectivelyc |
Frequency that the presentation of material was well organizedc |
Frequency that course goals and requirements were clearly explainedc |
High expectations (α = 0.71) |
---|
How often have faculty asked challenging questions in class?c |
How often have faculty asked you to argue for or against a particular point of view?c |
How often have faculty challenged your ideas in class?c |
Mark the box that best represents the extent to which your examinations during the current school year challenged you to do your best worke |
In your experience at your institution during the current school year, about how often have you worked harder than you thought you could to meet an instructor's standards or expectations?b |
How often have students challenged each other's ideas in class?c |
Prompt feedback (α = 0.67) |
---|
How often have faculty informed you of your level of performance in a timely manner?c |
How often have faculty checked to see if you had learned the material well before going on to new material?c |
In your experience at your institution during the current school year, about how often have you received prompt written or oral feedback from faculty on your academic performance?b |
Cite this article
Culver, K.C., Bowman, N.A., & Pascarella, E.T. How Students’ Intellectual Orientations and Cognitive Reasoning Abilities May Shape Their Perceptions of Good Teaching Practices. Res High Educ 62, 765–788 (2021). https://doi.org/10.1007/s11162-021-09625-z