
Investigating the correlation between students’ perception of authenticity in assessment and their academic achievement in the associated assessment tasks

Published online by Cambridge University Press:  01 December 2020

Samrat Ghosh*
Affiliation:
National Centre for Ports & Shipping, Australian Maritime College, University of Tasmania, Launceston, Tasmania, Australia
Benjamin Brooks
Affiliation:
National Centre for Ports & Shipping, Australian Maritime College, University of Tasmania, Launceston, Tasmania, Australia
Dev Ranmuthugala
Affiliation:
National Centre for Ports & Shipping, Australian Maritime College, University of Tasmania, Launceston, Tasmania, Australia
Marcus Bowles
Affiliation:
School of Engineering and Built Environment, Deakin University, Burwood, Victoria, Australia
*Corresponding author. E-mail: sghosh@utas.edu.au

Abstract

The objective of this research was to investigate which factors of assessment students undergoing authentic assessment perceived as significant to their academic achievement. The project advanced past research by the authors, which found that the academic achievement of seafarer students was significantly higher in a formatively implemented authentic assessment than in a summative traditional assessment. Academic achievement (assessment scores) was based on the students’ performance in analysing information presented in a real-world context (authentic assessment), as opposed to analysing information presented devoid of a real-world context (traditional assessment). Using data obtained from students undergoing the authentic assessment, the project correlated their perceptions of authenticity for each factor of assessment with their scores in the associated task. Stage 1 derived the factors conceptually from the authors’ definition of authentic assessment, on the basis of which a perception survey questionnaire was designed. Stage 2 extracted new factors through a factor analysis conducted in SPSS. Both stages of the investigation found that transparency of criteria was a significant predictor of the students’ academic achievement.
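The core analysis described above — correlating students’ perceived authenticity for a factor of assessment with their scores in the associated task — can be sketched in a few lines. The sketch below is illustrative only: the ratings and scores are hypothetical, and the study itself used SPSS on its own survey data, none of which is reproduced here.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def ols_fit(x, y):
    """Intercept and slope of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

# Hypothetical Likert ratings (1-5) of perceived "transparency of criteria"
# and hypothetical assessment scores (%) for eight students.
perception = [3, 4, 2, 5, 4, 3, 5, 2]
scores = [65, 72, 58, 88, 75, 62, 90, 55]

r = pearson_r(perception, scores)
intercept, slope = ols_fit(perception, scores)
print(f"r = {r:.2f}; predicted score = {intercept:.1f} + {slope:.2f} * rating")
# -> r = 0.98; predicted score = 33.0 + 10.75 * rating
```

With these made-up numbers the perception rating strongly predicts the task score; the study's Stage 2 additionally ran a factor analysis before regressing scores on the extracted factors, a step omitted from this sketch.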

Type: Research Article
Copyright © The Royal Institute of Navigation 2020


References

Abeywickrama, P. (2012). Rethinking traditional assessment concepts in classroom-based assessment. The CATESOL Journal, 23(1), 205–214.
AMC. (2011). Report on the Review of the National Centre for Ports & Shipping: Deck Officer Courses. Launceston, Tasmania: Australian Maritime College.
Ashford-Rowe, K., Herrington, J. and Brown, C. (2014). Establishing the critical elements that determine authentic assessment. Assessment & Evaluation in Higher Education, 39(2), 205–222. doi:10.1080/02602938.2013.819566
Bailey, K. M. (1998). Learning About Language Assessment: Dilemmas, Decisions, and Directions. New York: Heinle & Heinle.
Bhardwaj, S. (2009). Quality Maritime Education and Training. In: Loginovsky, V. (ed.) MET Trends in the XXI Century: Shipping Industry and Training Institutions in the Global Environment – Area of Mutual Interests and Cooperation, 19–21 September. Saint Petersburg, Russia: Admiral Makarov State Maritime Academy, 29–32.
Biggs, J. and Tang, C. (2011). Teaching for Quality Learning at University: What the Student Does. Maidenhead, England: McGraw-Hill.
Black, P. and Wiliam, D. (1998a). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. doi:10.1080/0969595980050102
Black, P. and Wiliam, D. (1998b). Inside the black box: raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139–144.
Blondy, L. C. (2007). A Correlational Study Between the Critical Thinking Skills of Nursing Faculty and Their Perceived Barriers to Teaching Critical Thinking Skills to Nursing Students. Minneapolis, MN: Capella University.
Boud, D. and Falchikov, N. (2006). Aligning assessment with long-term learning. Assessment & Evaluation in Higher Education, 31(4), 399–413. doi:10.1080/02602930600679050
Clark, R. A. (2014). Correlation Study: The Effect of Student-Teacher Rapport on High School Student Performance Rate. Lynchburg, VA: Liberty University.
Dikli, S. (2003). Assessment at a distance: traditional vs. alternative assessments. The Turkish Online Journal of Educational Technology, 2(3), 13–19.
Emad, G. and Roth, W. M. (2007). Evaluating the competencies of seafarers: challenges in current practice. In: Pelton, T., Reis, G. and Moore, K. (eds.). Proceedings of the University of Victoria Faculty of Education Research Conference - Connections'07, Victoria, BC: University of Victoria, 71–76.
Ghosh, S., Bowles, M., Ranmuthugala, D. and Brooks, B. (2016). Authentic assessment in seafarer education: Using literature review to investigate its validity and reliability through rubrics. WMU Journal of Maritime Affairs, 15(2), 317–330. doi:10.1007/s13437-015-0094-0
Ghosh, S., Bowles, M., Ranmuthugala, D. and Brooks, B. (2017). Improving the validity and reliability of authentic assessment in seafarer education: A conceptual and practical framework to enhance resulting assessment outcomes. WMU Journal of Maritime Affairs, 16(3), 455–472. doi:10.1007/s13437-017-0129-9
Ghosh, S., Brooks, B., Ranmuthugala, D. and Bowles, M. (2020). Authentic versus traditional assessment: An empirical study investigating the difference in seafarer students’ academic achievement. Journal of Navigation. Published online. doi:10.1017/S0373463319000894
Goodwin, L. D. and Leech, N. L. (2006). Understanding correlation: factors that affect the size of r. The Journal of Experimental Education, 74(3), 251–266. doi:10.3200/JEXE.74.3.249-266
Graziano, A. M. and Raulin, M. L. (2000). Research Methods: A Process of Inquiry. New York: Pearson Education.
Gulikers, J. T. M. (2006). Authenticity Is in the Eye of the Beholder: Beliefs and Perceptions of Authentic Assessment and the Influence on Student Learning. Heerlen: Open Universiteit Nederland.
Hattie, J. A. C. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. London: Routledge.
Hattie, J. A. C. and Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. doi:10.3102/003465430298487
Herrington, J. A. (1997). Authentic Learning in Interactive Multimedia Environments. Perth, Western Australia: Edith Cowan University.
IMO. (2011). International Convention on Standards of Training, Certification, and Watchkeeping for Seafarers. London: International Maritime Organization.
Jonsson, A. (2008). Educative Assessment for/of Teacher Competency. A Study of Assessment and Learning in the ‘Interactive Examination’ for Student Teachers. Malmo, Sweden: Malmo University.
Law, B. and Eckes, M. (1995). Assessment and ESL: On the Yellow Big Road to the Withered of Oz. Winnipeg, Manitoba: Peguis.
Lukacs, P. M., Burnham, K. P. and Anderson, D. R. (2010). Model selection bias and Freedman's paradox. Annals of the Institute of Statistical Mathematics, 62, 117–125. doi:10.1007/s10463-009-0234-4
Maltby, A. and Mackie, S. (2009). Virtual learning environments – help or hindrance for the ‘disengaged’ student? ALT-J, 17(1), 49–62. doi:10.1080/09687760802657577
Maringa, M. T. H. (2015). Assessment of Quality of Training and Education of Seafarers in South Africa and Ghana. Malmo, Sweden: World Maritime University.
Morrissey, P. E. (2014). Investigating how an Authentic Task can Promote Student Engagement when Learning About Australian History. Wollongong, NSW, Australia: University of Wollongong.
Mueller, J. (2006). Authentic assessment toolbox. Retrieved from http://jonathan.mueller.faculty.noctrl.edu/toolbox/whatisit.htm#looklike
Neill, J. (2008). Writing up a Factor Analysis. Centre for Applied Psychology. Canberra, Australia: University of Canberra.
Reddy, M. Y. (2007). Rubrics and the enhancement of student learning. Educate, 7(1), 3–17.
Robson, C. (2011). Real World Research. Chichester, UK: John Wiley & Sons.
Robson, C. S. (2007). Toward an International Rubric: A Compilation of STCW Competency Assessment Methodologies. In: Zhukov, D. (ed.), 8th IAMU Annual General Assembly. Odesa, Ukraine: Odesa National Maritime Academy, 247–258.
Sadler, P. M. and Good, E. (2006). The impact of self- and peer-grading on student learning. Educational Assessment, 11(1), 1–31. doi:10.1207/s15326977ea1101-1
Sarkar, E., Keskin, S. and Unver, H. (2011). Using of factor analysis scores in multiple linear regression model for prediction of kernel weight in Ankara walnuts. The Journal of Animal & Plant Sciences, 21(2), 182–185.
Suresh, K. P. and Chandrashekara, S. (2012). Sample size estimation and power analysis for clinical research studies. Journal of Human Reproductive Sciences, 8(3), 186–192. doi:10.4103/0974-1208.97779
Tan, S. H. and Tan, S. B. (2011). The correct interpretation of confidence intervals. Proceedings of Singapore Healthcare, 19(3), 276–278.
Tavakol, M. and Dennick, R. (2011). Making sense of Cronbach's alpha. International Journal of Medical Education, 2, 53–55. doi:10.5116/ijme.4dfb.8dfd
Wiggins, G. (1989). A true test: toward more authentic and equitable assessment. The Phi Delta Kappan, 70(9), 703–713. doi:10.1177/003172171109200721
Williams, B., Brown, T. and Onsman, A. (2010). Exploratory factor analysis: A five-step guide for novices. Journal of Emergency Primary Health Care, 8(3), 1–13.
Yong, A. G. and Pearce, S. (2013). A beginner's guide to factor analysis: focusing on exploratory factor analysis. Tutorials in Quantitative Methods for Psychology, 9(2), 79–94. doi:10.20982/tqmp.09.2.p079
Zhang, L. and Zheng, Y. (2018). Feedback as an assessment for learning tool: How useful can it be? Assessment & Evaluation in Higher Education, 43(7), 1120–1132. doi:10.1080/02602938.2018.1434481