Learners’ beliefs about the functions of proof: building an argument for validity


Abstract

Learners’ difficulties with proof have been ascribed to their lack of understanding of the functions that proof performs in mathematics, namely, verification, explanation, communication, discovery, and systematization. However, the extant mathematics education literature on the validation of instruments designed to measure learners’ beliefs about the functions of proof is scant. The purpose of this mixed methods study was to use the Standards for Educational and Psychological Testing (American Educational Research Association/American Psychological Association/National Council on Measurement in Education [AERA/APA/NCME], 2014) as a theoretical and methodological platform to support arguments for the validity of the learners’ beliefs about the functions of proof (BAFP) instrument. Scale items were generated from de Villiers’ model, a review of the literature, a panel of experts, and key informants. The resulting instrument, comprising 28 Likert-scale items and 5 open questions that assessed the five functions of proof, was administered to 87 grade 11 learners in one high school in the Pinetown Education District, KwaZulu-Natal, South Africa. The results put the validity of the scores on the BAFP instrument into question. The study was an exploration of validity arguments based on evidence from various data sources. Confirmatory factor analysis with a larger sample in a different context needs to be conducted to warrant the use of the BAFP instrument. The key contribution of this study to the field is that it sheds light on the complexities of instrument validation; the scope of the effort may explain why comprehensive validation efforts have not been documented extensively in the literature.
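The abstract recommends confirmatory factor analysis (CFA) of the hypothesized five-factor structure with a larger sample. As a rough illustration of what such an analysis involves, the sketch below specifies a five-factor measurement model for 28 Likert items. It is a minimal sketch under assumptions: the item names (v1–v28), their assignment to factors, and the use of the Python semopy package are not taken from the study.

```python
# Hypothetical sketch of the CFA the abstract recommends: a five-factor model
# (verification, explanation, communication, discovery, systematization) for
# 28 Likert items. Item names v1-v28, the item-to-factor assignment, and the
# use of the semopy package are all assumptions, not details from the study.
import pandas as pd
import semopy

# Model syntax: each latent function of proof is measured by a block of items.
MODEL_DESC = """
verification    =~ v1 + v2 + v3 + v4 + v5 + v6
explanation     =~ v7 + v8 + v9 + v10 + v11 + v12
communication   =~ v13 + v14 + v15 + v16 + v17
discovery       =~ v18 + v19 + v20 + v21 + v22 + v23
systematization =~ v24 + v25 + v26 + v27 + v28
"""

def fit_bafp_cfa(responses: pd.DataFrame) -> pd.DataFrame:
    """Fit the hypothesized five-factor model and return global fit indices."""
    model = semopy.Model(MODEL_DESC)
    model.fit(responses)              # responses: one column per item, one row per learner
    return semopy.calc_stats(model)   # chi-square, CFI, TLI, RMSEA, etc.

# Usage (with a hypothetical CSV of item responses):
# stats = fit_bafp_cfa(pd.read_csv("bafp_responses.csv"))
# print(stats[["chi2", "CFI", "RMSEA"]])
```

Global fit indices such as CFI and RMSEA from a model of this kind would supply the internal-structure evidence that the abstract identifies as still outstanding.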


Notes

  1. I join Tall et al. (2012) in defining proof as an activity that “involves thinking about new situations, focusing on significant aspects, using previous knowledge to put new ideas together in new ways, consider relationships, make conjectures, formulate definitions as necessary and to build a valid argument” (p. 15).

  2. The word “Dinaledi” means “stars” in the Sesotho language, which, along with IsiZulu, is one of the indigenous languages commonly spoken in South Africa.

  3. The term “Colored” is used in the same sense as Isaacs-Martin and Petrus (2012): it identifies a specific group in South Africa, most often learners popularly perceived as being of mixed racial and ethnic descent, whose perceptions of their identity as Colored have shifted over time owing to specific historical, cultural, social, and other factors.

  4. I use the phrase “group discussion” as synonymous with “focus group.”

  5. It is worth noting that after an EFA rerun, some of the items remained negatively loaded onto their factors. However, changing the rotation method from oblimin to promax changed these loadings to positive. An examination of item-rest correlations showed that these items correlated poorly with their factors. The source of the problem could be that, despite group discussion, the participants found the items to be highly ambiguous and confusing or that the data are inconsistent with the hypothesized factor structure. Cognitive interviews could shed more light on this problem.
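To make the diagnostic checks described in note 5 above concrete, the sketch below refits an exploratory factor analysis under oblimin and then promax rotation and computes item-rest correlations for a block of items. It is an illustrative reconstruction under assumptions, not the analysis script used in the study: the Python factor_analyzer package, the generic item names, and the five-factor extraction are all assumed.

```python
# Illustrative sketch of the checks described in note 5, under assumptions:
# the factor_analyzer package, generic item names, and a five-factor solution.
import pandas as pd
from factor_analyzer import FactorAnalyzer

def compare_rotations(responses: pd.DataFrame, n_factors: int = 5) -> dict:
    """Extract the same number of factors under oblimin and promax rotations
    so sign changes in the loadings can be inspected side by side."""
    loadings = {}
    for rotation in ("oblimin", "promax"):
        fa = FactorAnalyzer(n_factors=n_factors, rotation=rotation)
        fa.fit(responses)
        loadings[rotation] = pd.DataFrame(
            fa.loadings_,
            index=responses.columns,
            columns=[f"F{i + 1}" for i in range(n_factors)],
        )
    return loadings

def item_rest_correlations(responses: pd.DataFrame, factor_items: list[str]) -> pd.Series:
    """Correlate each item with the sum of the *other* items assigned to its
    factor; low values flag items that sit poorly with their factor."""
    subscale = responses[factor_items]
    return pd.Series(
        {item: subscale[item].corr(subscale.drop(columns=item).sum(axis=1))
         for item in factor_items}
    )

# Usage (hypothetical data and item assignment):
# responses = pd.read_csv("bafp_responses.csv")
# oblimin_vs_promax = compare_rotations(responses)
# print(item_rest_correlations(responses, ["v1", "v2", "v3", "v4", "v5", "v6"]))
```

Comparing the two loading tables shows which sign changes are artifacts of the rotation, while items with low item-rest correlations remain suspect under either rotation.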

References

  • American Educational Research Association/American Psychological Association/National Council on Measurement in Education [AERA/APA/NCME]. (2014). Standards for educational and psychological testing. American Educational Research Association.

  • Bearden, W. O., Sharma, S., & Teel, J. E. (1982). Sample size effects on chi square and other statistics used in evaluating causal models. Journal of Marketing Research, 19, 425–430.

  • Bell, A. W. (1976). A study of pupils’ proof-explanations in mathematical situations. Educational Studies in Mathematics, 7, 23–40.

  • Brombacher, A. (2007). Mathematical literacy: A reader. Bateleur Books.

  • Byrne, B. M. (1994). One application of structural equation modeling from two perspectives: Exploring the EQS and LISREL strategies. In R. H. Hoyle (Ed.), Structural equation modeling: Concepts, issues, and applications (pp. 138–157). Sage.

  • Chazan, D., & Yerushalmy, M. (1998). Charting a course for secondary geometry. In R. Lehrer & D. Chazan (Eds.), Designing learning environments for developing understanding of geometry and space (pp. 67–90). Erlbaum.

  • Cohen, J. A. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37–46.

  • Common Core State Standards Initiative [CCSSI]. (2010). Common Core State Standards for Mathematics (CCSSM). National Governors Association Center for Best Practices and Council of Chief State School Officers. Retrieved from http://www.corestandards.org/wp-content/uploads/Math_Standards.pdf

  • Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research and Evaluation, 10(7), 1–9. https://doi.org/10.7275/jyj1-4868

  • Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Sage.

  • Creswell, J. W., Fetters, M. D., & Ivankova, N. V. (2004). Designing a mixed methods study in primary care. Annals of Family Medicine, 2(1), 7–12.

  • de Villiers, M. D. (1990). The role and function of proof in mathematics. Pythagoras, 24, 17–24.

  • de Villiers, M. D. (2012). Rethinking proof with the Geometer’s Sketchpad (vol. 5). Key Curriculum Press.

  • de Winter, J. C., Dodou, D., & Wieringa, P. A. (2009). Exploratory factor analysis with small sample sizes. Multivariate Behavioral Research, 44(2), 147–181. https://doi.org/10.1080/00273170902794206

  • Department of Basic Education [DBE]. (2009). The Dinaledi Schools Project: Report from a strategic engagement between the national department of education and business on increasing support for mathematics and science in education in schools. Department of Basic Education.

  • DeVellis, R. F. (1991). Scale development: Theory and applications. Sage.

  • Dickey, D. (1996). Testing the fit of our models of psychological dynamics using confirmatory methods: An introductory primer. In B. Thompson (Ed.), Advances in social science methodology (vol. 4, pp. 219–227). JAI.

  • Ellis, A. B., Ozgur, Z., Vinsonhaler, R., Dogan, M. F., Carolan, T., Lockwood, E., … Zaslavsky, O. (2019). Student thinking with examples: The criteria-affordances-purposes-strategies framework. The Journal of Mathematical Behavior, 53, 263–283.

  • Gillespie, D. F., & Perron, B. E. (2015). Key concepts in measurement. Oxford University Press.

  • Goldin, G. (2002). Affect, meta-affect, and mathematical belief structures. In G. Leder, E. Pehkonen, & G. Törner (Eds.), Beliefs: A hidden variable in mathematics education? (pp. 59–72). Kluwer.

  • Goldin, G., Rösken, B., & Törner, G. (2009). Beliefs – no longer a hidden variable in mathematical teaching and learning processes. In J. Maaß & W. Schlöglmann (Eds.), Beliefs and attitudes in mathematics education: New research results (pp. 9–28). Sense.

  • Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis (7th ed.). Pearson Education Limited.

  • Healy, L., & Hoyles, C. (1998). Justifying and proving in school mathematics (Technical Report). Institute of Education, University of London.

  • Isaacs-Martin, W., & Petrus, T. (2012). The multiple meanings of coloured identity in South Africa. Africa Insight, 42(1), 87–102.

  • Jarvenpaa, S. (1989). The effect of task demands and graphical format on information processing strategies. Management Science, 35(3), 285–303.

  • Kane, M. T. (2010). Validity and fairness. Language Testing, 27(2), 177–182.

  • Kane, M. T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50(1), 1–73.

  • Kline, R. B. (2016). Principles and practice of structural equation modeling (4th ed.). Guilford Press.

  • Knuth, E. J. (2002). Teachers’ conceptions of proof in the context of secondary school mathematics. Journal of Mathematics Teacher Education, 5, 61–88.

  • McMillan, J. H., & Schumacher, S. (2010). Research in education: Evidence-based inquiry (7th ed.). Pearson Education, Inc.

  • Miller, J. C., Meier, E., Muehlenkamp, J., & Weatherly, J. N. (2009). Testing the construct validity of Dixon and Johnson’s (2007) gambling functional assessment. Behavior Modification, 33(2), 156–174.

  • Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2(3), 192–222.

  • National Council of Teachers of Mathematics [NCTM]. (2000). Principles and Standards for School Mathematics. NCTM.

  • Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). McGraw-Hill.

  • Ryan, K., Gannon-Slater, N., & Culbertson, M. J. (2012). Improving survey methods with cognitive interviews in small- and medium-scale evaluations. American Journal of Evaluation, 33(3), 414–430.

  • Smith, G. T., & McCarthy, D. M. (1995). Methodological considerations in the refinement of clinical assessment instruments. Psychological Assessment, 7(3), 300–308. https://doi.org/10.1037/1040-3590.7.3.300

  • Stevens, J. P. (1992). Applied multivariate statistics for the social sciences. Erlbaum.

  • Straub, D., Boudreau, M.-C., & Gefen, D. (2004). Validation guidelines for IS positivist research. Communications of the Association for Information Systems, 13(24), 380–427.

  • Stylianides, A. J., & Stylianides, G. J. (2018). Addressing key and persistent problems of students’ learning: The case of proof. In A. J. Stylianides & G. Harel (Eds.), Advances in mathematics education research on proof and proving (pp. 99–113). Springer.

  • Tall, D., Yevdokimov, O., Koichu, B., Whiteley, W., Kondratieva, M., & Cheng, Y.-H. (2012). The cognitive development of proof. In G. Hanna & M. D. de Villiers (Eds.), ICME 19: Proof and proving in mathematics education (pp. 13–49). Springer.

  • Taut, S., Santelices, M. V., & Stecher, B. (2012). Validation of a national teacher assessment and improvement system. Educational Assessment, 17(4), 163–199. https://doi.org/10.1080/10627197.2012.735913

  • Watson, J. C. (2017). Establishing evidence for internal structure using exploratory factor analysis. Measurement and Evaluation in Counseling and Development, 50(4), 232–238. https://doi.org/10.1080/07481756.2017.1336931

  • Worthington, R. L., & Whittaker, T. A. (2006). Scale development research: A content analysis and recommendations for best practices. The Counseling Psychologist, 34(6), 806–838.

  • Yaghmale, F. (2003). Content validity and its estimation. Journal of Medical Education, 3(1), 25–27.

  • Zaslavsky, O., Nickerson, S. D., Stylianides, A. J., Kidron, I., & Winicki-Landman, G. (2012). The need for proof and proving: Mathematical and pedagogical perspectives. In G. Hanna & M. D. de Villiers (Eds.), Proof and proving in mathematics (pp. 215–229). Springer.

  • Zumbo, B. D., & Hubley, A. M. (Eds.). (2017). Understanding and investigating response processes in validation research (vol. 16). Springer.

Acknowledgements

I wish to thank Vilma Mesa and the various anonymous reviewers for their helpful comments on earlier versions of this manuscript.

Author information

Corresponding author

Correspondence to Benjamin Shongwe.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Beliefs about the functions of proof instrument


Cite this article

Shongwe, B. Learners’ beliefs about the functions of proof: building an argument for validity. Educ Stud Math 107, 503–523 (2021). https://doi.org/10.1007/s10649-021-10047-y
