Analysis of Student Responses to Constructed Response Items in the Science Assessment of Educational Achievement in South Korea

Published in the International Journal of Science and Mathematics Education

Abstract

This study analyzes student responses to a chemistry constructed response item in order to obtain detailed information from the science NAEA (National Assessment of Educational Achievement) in South Korea and to draw suggestions for improving curriculum, teaching, and learning. For this purpose, we analyzed 7,444 answers from a sample, drawn by the two-stage stratified cluster sampling method, that can be generalized to the ninth-grade population (1.29% of ninth-grade students). The types of answers to the constructed response item were classified, and the response rate distribution curve across achievement scores was drawn and analyzed for each achievement level. Such analysis of descriptive answers is an advantage of constructed response items: because students' responses vary widely, the various types of misconceptions they hold can be analyzed systematically. Students at the basic level tended to hold misconceptions about the quantity of thermal energy, whereas students at or above the proficient level held misconceptions about thermal equilibrium. Because both understanding and misconceptions differed by achievement level, customized teaching and learning tailored to achievement levels should be developed. Implications for curricula and for improving teachers' expertise in assessing constructed response items are also suggested.


[Figs. 1–6 omitted]


Acknowledgements

We are grateful to the Korea Institute for Curriculum and Evaluation (KICE) for collecting the NAEA data used in this study.

Funding

This work was supported by the international research funds for the humanities and social sciences of Jeonbuk National University in 2020.

Author information


Corresponding author

Correspondence to Hyun-Kyung Kim.


About this article


Cite this article

Kim, H.-K., & Kim, H. A. Analysis of Student Responses to Constructed Response Items in the Science Assessment of Educational Achievement in South Korea. International Journal of Science and Mathematics Education, 20, 901–919 (2022). https://doi.org/10.1007/s10763-021-10198-7
