Introduction

The transition from mathematics as a school subject to mathematics as a scientific discipline in academic mathematics programs and mathematics teacher education programs is challenging for most students. In Germany, Dieter (2012) reports a dropout rate of over 30% among first-year students with a major in mathematics (see OECD 2010). In general, high dropout rates are considered serious problems for the individual student and for society (Rasmussen and Ellis 2013). Empirical studies have shown that students’ cognitive prerequisites, such as prior knowledge, are the most important determinants of study success (e.g. Hailikari et al. 2008; Kosiol et al. 2019). Successful students differ from failing students in their mathematical knowledge at the beginning of their studies, whereas motivational variables seem to play a comparatively minor role (Kosiol et al. 2019).

This pronounced role of prior knowledge is plausible from a theoretical perspective. Under the term “prior knowledge”, existing studies on the transition to university mathematics subsume individual knowledge about mathematical concepts that has been acquired by the end of secondary school and that is used, extended, or reconceptualized during university mathematics studies. From cognitivist and constructivist perspectives on learning, learners individually reconstruct new information encountered in education, using their existing knowledge about concepts related to the new information. Thus, to study mathematics at university, learners most likely need appropriate prior knowledge to benefit from academic learning opportunities.

Prediction studies in the field of higher education analyze the role of learning prerequisites for successful learning processes but often remain on a domain-independent level and ignore domain-specific aspects (e.g. Valle et al. 2003). Only a few studies concentrate on mathematics learning at university and relate individual prerequisites of study success to domain-specific features of the learning environment (e.g. Hailikari et al. 2008). We assume that such a connection is necessary for a profound interpretation of the identified prerequisites and, subsequently, for developing ideas about how learning processes at university can be optimized. Especially in the case of mathematics, its specific character as a school subject and as an academic discipline must be taken into account when investigating the role of knowledge for individual study success. Learners have to substantially reconstruct their knowledge when they start to deal with mathematics as an academic discipline. Prior research suggests that appropriate prior knowledge for dealing with mathematics as an academic discipline includes deep conceptual understanding and argumentation skills (Rach and Heinze 2017). However, the design of existing studies on the role of prior knowledge does not allow interpretations deeper than statements along the lines of “more or deeper knowledge is better” when trying to predict study success. Our main goal in this article is to advance research on the role of students’ mathematical knowledge in succeeding or failing at the transition to tertiary mathematics courses. In particular, it is an open question which type or level of mathematical knowledge is necessary or sufficient for study success. In this contribution, we derive a model that describes different levels of mathematical knowledge which have been found to be important for success in the first semester.

Starting from a theoretical comparison of mathematics as it is learnt at school and university (in Germany), we describe a shift in the character of mathematics at the transition to university. By looking at the case of the transition in Germany, we see a particular case of a broader phenomenon that is known in many other countries (e.g. Canada & New Zealand: Clark and Lovric 2009; South Africa: Engelbrecht 2010; UK: Hoyles et al. 2001).

Then, we present existing models that describe the structure and levels of (mathematical) knowledge. Based on these ideas, we reanalyze existing data from a mathematical prior knowledge test used in several recent longitudinal studies focusing on the study entrance phase. The test mainly assesses basic technical skills, conceptual understanding, and argumentation skills on topics that are the focus of first semester Analysis lectures, but which are also studied in secondary school in a more informal way (such as functions, limits, and derivatives). Based on data from 1553 first-year students of five cohorts at two universities, we establish an exploratory model comprising four levels of prior knowledge and investigate to what extent this model allows differentiating between students who complete a first semester Analysis course successfully and those who fail the final exam. Through these analyses, we address our main research question: How can we describe the level of prior knowledge that differentiates between students who pass first semester Analysis courses successfully and those who fail?

Background

The Transition from School to University in Mathematics

High drop-out rates in study programs with a major in mathematics (e.g. Dieter 2012) have drawn attention to the transition from school to university mathematics. For many countries, researchers postulate two changes in the learning environment between the institutions involved: a shift in the character of the learning domain, mathematics, and a change from guided learning to self-regulated learning (cf. Rach and Heinze 2017). In Germany, as in some other countries, the change of institution – from school to university – coincides with a shift of the learning domain from an application-oriented form of mathematics to advanced mathematics. As our study is situated in Germany, we briefly describe some characteristics of the German educational system. In many federal states in Germany, there are two or three different secondary school tracks aiming at: entering employment in craft and trade (Hauptschule); further vocational education (Realschule); and further academic education at universities (Gymnasium). After completing the Hauptschule or Realschule, students start working, enter vocational education, or continue upper secondary education, e.g. at a Gymnasium. In upper secondary school, in the last two years of the Gymnasium, all students take courses in basic Calculus (e.g. one-dimensional differential and integral calculus), Linear Algebra (with a focus on matrices or Algebraic Geometry), and Stochastics. These contents are treated mostly based on an informal understanding of basic concepts, e.g. an intuitive limit concept not based on an ε-N definition. After successfully completing upper secondary school, students obtain a certificate to enter university at the age of 18–19. This certificate is sufficient to study mathematics at university, and no other conditions, e.g. good grades in mathematics, are required. In particular, most universities do not apply specific entry assessments to select their students.

Advanced Mathematics as a Learning Content

Teaching mathematics in school in Germany is primarily focused on the goal of general education: Mathematical concepts and procedures are useful tools for describing the world and solving real world problems (e.g. Alcock and Simpson 2002; Gueudet 2008; Hoyles et al. 2001; Witzke 2015). Thereby, technical aspects (e.g. solving equations) as well as dealing with problems situated in real contexts are prominent in curricula. A scientific perspective on mathematics, namely as an academic discipline based on concept definitions and proofs, is integrated in some curricula. However, this perspective is underrepresented in classroom instruction in Germany (Jordan et al. 2008; Witzke 2015). In contrast, mathematics is treated from a scientific perspective in many tertiary mathematics courses. Mathematics as a scientific discipline is characterized by formally defined, abstract concepts and formal, deductive proofs of mathematical propositions (Engelbrecht 2010; Gueudet 2008). Moreover, this character of mathematics as a scientific discipline strongly shapes teaching at university, e.g. by a strong focus on the DTP (Definition–Theorem–Proof) structure (Engelbrecht 2010; Hoyles et al. 2001). Indeed, Witzke (2015) interviewed students about differences and similarities between school and university mathematics and reports that students highlighted everyday situations as a reference for problems in school mathematics, whereas rigor is central for university mathematics. Jordan et al.’s (2008) analysis of mathematics tasks in secondary school showed a marginal role of mathematical argumentation in school tasks, which contrasts strongly with the central role of proof in university mathematics teaching. Other results in this field are based on theoretical analyses, influenced by personal experiences (Engelbrecht 2010) or by the perspective of seeing the transition as a rite of passage (Clark and Lovric 2009). However, they all put forward similar substantial differences between mathematics in school and in university education.

These differences in the character of mathematics at school and at university become especially visible when mathematical concepts are introduced or statements are supported by argumentation. Although there are many mathematical concepts that are treated in upper secondary classes as well as in university courses, e.g. real numbers, limits, or differentiable functions (Artigue 1999; Kidron 2018), there are substantial differences in how these concepts are treated in the two learning institutions. So, one can assume that students construct different types of knowledge concerning these concepts in the different institutions. At school, concepts are usually introduced starting from an initial concept image. The meaning of a concept is rooted in students’ experiences from real life in the school context (Engelbrecht 2010; Tall and Vinner 1981). For example, limits are usually introduced in secondary school using an informal characterization of “approaching closer and closer towards a certain number”. Depending on the specific school context and related educational standards, a formalized description of the concept (a formal definition) is considered less important (e.g. Ufer and Kramer 2015). In contrast, at university, concepts are often formally defined in a de-ontologized sense by their characteristic properties, i.e. the concept definition plays the essential role (Alcock and Simpson 2002; di Martino and Gregorio 2019; Engelbrecht 2010; Gueudet 2008; Sfard 1991). The limit concept, for example, is usually introduced (again) in university mathematics lectures in Germany using an ε-N definition. In particular, concept images, in the sense of specific mental representations, are rarely introduced by university teachers or academic textbooks (Vollstedt et al. 2014), so that students must apply specific elaboration strategies to reconstruct the meaning of formally defined concepts individually (e.g.
Artigue 1999) or connect them to the concept images constructed in secondary school. A second consequence of the different aims of school and university mathematics education is the different role of proofs (Gueudet 2008). From a utilitarian perspective, it is sufficient that mathematical concepts and rules are reliable (Witzke 2015). This means, for example, that a person who wants to apply a mathematical rule in a specific context must be sure that this rule provides a result that can be considered acceptably accurate for this specific context. So, for many mathematical statements in school mathematics, empirical and authoritative evidence is sufficient to take validity for granted, and no (formal) proofs are needed. In contrast, for the scientific discipline of mathematics, proofs are the central instruments of evidence (Epp 2003; Gueudet 2008). In a previous study, we compared the mathematical practices that are predominant in popular school and university textbooks in Germany. In line with the results reported above, German school textbooks contain only a few opportunities to prove statements, whereas in university textbooks, proving is a central practice (Vollstedt et al. 2014).
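To make the contrast concrete: the informal school characterization of “approaching closer and closer towards a certain number” corresponds, at university, to a formal ε-N definition of the limit of a sequence. A standard formulation (notation varies between lectures and textbooks) is:

```latex
% Formal (eps-N) definition of the limit of a real number sequence (a_n)
\lim_{n \to \infty} a_n = a
\quad :\Longleftrightarrow \quad
\forall \varepsilon > 0 \;\; \exists N \in \mathbb{N} \;\; \forall n \geq N : \;
|a_n - a| < \varepsilon .
```

Whereas the informal characterization supports a dynamic mental image, the formal definition specifies the concept purely through a quantified property, exemplifying the de-ontologized introduction of concepts described above.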

In the following, we exemplify these differences between mathematics at school and at university with a focus on the topic of differentiation, one of the central topics used in our empirical study. The curriculum of the federal state in Germany where some of the re-analyzed studies were conducted describes the mathematics course at school in the following way (ISB 2004): first, students describe the curve of a rational function using the concept image of the limit concept; then, the differential quotient is introduced as the slope of a tangent at a point of the function graph or as the local rate of change of a function. The differential quotient is introduced based on an informal limit concept. Starting from local differentiation, students become familiar with the derivative function of a function and compute the derivatives of power functions using corresponding rules. At school, applying the concept of differentiation often means using it to solve more or less realistic real-world problems. A typical task in the school context is: “Using metal, you should produce a cylindrical can with a prescribed volume. For which radius is the material consumption minimal?” (Rach et al. 2018).
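In both institutions, the differential quotient can be written as the limit of difference quotients; the difference lies in whether this limit is understood informally or via an ε-based definition:

```latex
% The derivative of f at a as the limit of difference quotients
f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}
```

At school, this limit is typically read dynamically as “the slope of the secant approaches the slope of the tangent”, while at university it rests on the formally defined limit concept.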

Analysis courses in university approach the topic of differentiation substantially differently. First, students get to know rational or real number sequences and infinite sums; the limit concept is introduced based on an ε-N definition, together with techniques for proving that a sequence or infinite sum converges or diverges (Alcock and Simpson 2005). In many courses, Cauchy sequences are used to introduce the real numbers. Then, lecturers introduce continuous and differentiable functions based on formal ε-δ definitions (Ghedamsi and Lecorre 2018) and limits, and use number sequences to prove that a function is or is not continuous or differentiable. Applying the concept of differentiation in university courses often means that the concept definition is used to prove further statements that build up a mathematical theory. A typical task in the university context is: “Let f be a differentiable function. Show that f is continuous.” (Rach et al. 2018). In a nutshell, mathematics at upper secondary school mainly addresses what is usually summarized under the term Calculus, while mathematics at university addresses what is summarized under the term Analysis.
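The university task cited above (“Let f be a differentiable function. Show that f is continuous.”) illustrates how the concept definition is used deductively. A standard proof sketch (not taken from the cited sources) runs as follows: for any point a in the domain,

```latex
\lim_{x \to a} \bigl( f(x) - f(a) \bigr)
  = \lim_{x \to a} \frac{f(x) - f(a)}{x - a} \cdot (x - a)
  = f'(a) \cdot 0 = 0 ,
```

hence \( \lim_{x \to a} f(x) = f(a) \), which is exactly the limit characterization of continuity at a. No real-world context is involved; the argument builds theory directly from the definitions.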

Learning Advanced Mathematics: Reconstruction of Knowledge

After summarizing some important differences between the character of mathematics at school and at university, we now analyze in which way these differences influence students’ learning processes at university.

Some authors assume that learning processes are not continuous, but contain breaks or epistemological obstacles (Sierpinska 1987) at points where further learning content is not well aligned with a student’s prior understanding of the respective or more basic concepts. As mentioned before, gaps might occur in individual learning paths at the transition to advanced mathematics if concept images established at school do not fit the concept definitions. Many concepts are introduced in school mathematics in an informal way, e.g. real numbers, functions, tangents, limits, etc., but are reintroduced using formal definitions in advanced mathematics courses, e.g. real numbers as equivalence classes of Cauchy sequences over the rational numbers. While rich and meaningful concept images are considered crucial for conceptual understanding also at the university level (Tall and Vinner 1981), the strong focus on formal aspects in university courses most likely poses further challenges for beginning mathematics students: even if alternative, more informal and meaningful representations are treated in university lectures, students still need to integrate these representations into their existing concept images. Moreover, they will often also have to restructure their own understanding of concepts where it is not in line with a formal concept definition. The latter occurs in particular when concepts are defined differently in different fields of mathematics. Selden (2005) exemplifies this phenomenon by referring to the concept “tangent”, which in the field of Geometry is a straight line that touches a given circle at exactly one point, whereas a tangent of a function graph may touch or intersect the graph more than once, e.g. the tangent of the function \( f: \mathbb{R} \to \mathbb{R} \), \( f(x) = \sin(x) \) at the point \( \left(\frac{\pi}{2},1\right) \). A third process of reconstruction has to take place at university for those facets of a concept that could not be learned at school.
Artigue (1999) refers to these different reconstruction processes as “integrating new facets of a concept”, “reconstruction of relationships to familiar objects”, and “changes of level of conceptualization”.

Summarizing, the specific character of academic mathematics is unfamiliar to most students when they enter a university mathematics program, so that these learners are faced with unfamiliar practices of mathematical thinking and learning. In Germany, the institutional transition to university coincides with this transition to advanced mathematics. While researchers agree that the necessary processes of restructuring existing and integrating new mathematical knowledge are demanding for students, little is known about which prior mathematical knowledge might help students to master this demand. Which prior mathematical knowledge is useful for the successful learning of advanced mathematics? One might argue that students with little experience concerning important concepts like real numbers or limits would not encounter such problems, because they do not have to modify their existing concept images and all (formally relevant) information is introduced in the lectures. However, this argument is invalid, and theoretical arguments (e.g. Artigue 1999) as well as empirical results support this impression (Hailikari et al. 2008; Rach and Heinze 2017). Of course, students bring some, and individually different, prior knowledge to university mathematics courses. They have to integrate new information into this network of prior knowledge, possibly modifying or extending their existing concept images to match them with a formal concept definition or with new facets of the concepts introduced in university courses. In this contribution, we ask how we can characterize, in terms of measurable characteristics, the mathematical prior knowledge that is required to master the unfamiliar challenges of mathematics learning during a first semester university course.

Prior Knowledge as a Learning Prerequisite

Which prerequisites students need to cope successfully with the specific challenges of learning mathematics in this new university context is a critical question. In the literature, study success is often conceptualized as a multidimensional construct (e.g. Nagy 2006). Whereas subjective criteria of study success comprise study satisfaction or (de-)motivation regarding the study program, objective criteria refer to acquired knowledge, grades in university courses, as well as drop-out (see Nagy 2006). In this contribution, we focus on successful course completion as one example of an objective criterion.

When trying to identify possible reasons for learning problems and drop-out in university mathematics programs, prior empirical research has focused on learners’ prerequisites as one of several causes. In this tradition, a wide range of cognitive and affective-motivational prerequisites has been studied (e.g. Hailikari et al. 2008; Schiefele et al. 1995; Richardson et al. 2012). Frameworks on complex mathematical practices, such as proving and problem solving, describe a set of individual variables that influence students’ ability to engage in these practices, e.g. metacognition, self-regulation, interest, content knowledge (de Corte et al. 2000; Schoenfeld 1992). Apart from the theoretical considerations in the previous section, empirical results also indicate that, among these variables, prior knowledge plays a key role for successful learning in the mathematics study entrance phase (Chinnappan et al. 2012; Sommerhoff 2017; Ufer et al. 2008).

Affective-motivational learning prerequisites such as interest (Laging and Voßkamp 2017; Rach and Heinze 2017), self-concept (Hailikari et al. 2008; Robbins et al. 2004), and study choice motives (Ufer 2015) have often shown only weak relations to objective criteria of study success in past university studies. However, relations to study satisfaction, study motivation, and drop-out have been found (Blüthmann 2012; Schiefele et al. 2007). Since in this contribution we are interested in determinants of students’ academic success, mainly in passing university courses, we focus on cognitive learning prerequisites.

The school qualification grade, averaged over all school subjects, is perhaps the most studied indicator of students’ cognitive learning prerequisites, even though it can be argued that it is an amalgam of prior school achievement with affective-motivational characteristics such as learning motivation (Richardson et al. 2012; Trapmann et al. 2007). In particular, the school qualification grade has been found to predict objective criteria of study success in general (Robbins et al. 2004), as well as in mathematics courses specifically (Laging and Voßkamp 2017). However, since this grade predicts academic success over a range of subjects and includes prior achievement over the whole range of school subjects, it has little power to explain problems that are specific to university mathematics studies. Fewer results exist for mathematics school grades, which represent domain-specific prior school achievement (Laging and Voßkamp 2017). Possible reasons for this impact are that prior achievement may influence other learning prerequisites (such as individual self-concept) that enhance the current learning process, or that the variables underlying prior achievement also influence the current learning process.

In this contribution, we concentrate on prior mathematical knowledge that students bring from secondary school as one domain-related cognitive learning prerequisite (see Dochy et al. 2002 with respect to cognitive entry behaviors). Its influence may be direct, in the sense that, from a cognitive-constructivist perspective, building new knowledge depends on well-connected prior knowledge in any learning process, or mediated by students’ learning behavior (e.g. choice of learning strategies, Trigwell et al. 2013). In the following paragraphs, we first present results of theoretical and empirical studies that analyze the impact of learning prerequisites, in particular prior knowledge, on successful learning in the study entrance phase. Second, we discuss models describing the quality and structure of mathematical knowledge.

Empirical Results Concerning the Role of Prior Mathematical Knowledge in the Study Entrance Phase

As pointed out in the “Advanced Mathematics as a Learning Content” section, mathematics in the study entrance phase is based on formally defined concepts, theorems, and deductive proofs, which constitute the scientific discipline of mathematics. Given these differences between the scientific discipline of mathematics and school mathematics, it seems plausible that specific prior knowledge acquired at school has only a small influence on the acquisition of mathematical knowledge, and therefore on study success at university, beyond the school qualification grade. In addition, theories that take social and cultural dimensions of learning into account come to the conclusion that “knowledge remains strongly contextual” (Artigue 1999, p. 1378).

However, the concepts learnt at school and at university in the domain “Calculus/Analysis” are often the same, and – following cognitive-constructivist learning theories, according to which new knowledge has to be integrated into existing knowledge – we argue that learners’ prior mathematical knowledge is an important precondition for successful learning processes. With a higher level of prior knowledge, this integration succeeds – but only if the prior knowledge fits the new information. Fit in this sense means that the prior knowledge can be extended by new concepts or ideas and that these concepts and ideas do not contradict prior knowledge (see the “Learning Advanced Mathematics: Reconstruction of Knowledge” section). Results of empirical studies at school (Köller et al. 2001, 2006) and university (Greefrath et al. 2017; Hailikari et al. 2007; Hailikari et al. 2008; Halverscheid and Pustelnik 2013) confirm the influence of prior mathematical knowledge on learning success. As an example of the studies conducted in the field of predicting study success at university, consider the work of Hailikari and colleagues (Hailikari et al. 2007, 2008). With a sample of 202 students of two mathematics courses, mainly with a major in mathematics, the authors confirm that prior knowledge, mainly procedural knowledge, predicts success in a mathematics course (operationalized by the final grade). Halverscheid and Pustelnik (2013) were able to explain 40% of the variance in the performance of Bachelor mathematics, physics, and teacher education students in the courses “Analysis” and “Linear Algebra” by students’ prior declarative knowledge. These two studies differ in the type of prior knowledge that seems relevant for being successful in the first semester. Other studies (e.g. Greefrath et al. 2017) integrate their work into other models of cognitive entry behaviors, e.g. mathematical competences.

Summarizing, there is ample evidence that mathematical knowledge is an important predictor of successful learning processes, not only at school but also at university. However, beyond the simple answer “the more knowledge, the better”, it remains an open question which form of knowledge is relevant when reconstructing existing and integrating new knowledge during the transition to university mathematics (cf. the “Learning Advanced Mathematics: Reconstruction of Knowledge” section). In the next section, we present works that conceptualize (mathematical) knowledge in different ways.

Conceptualizations of Mathematical Knowledge

The acquisition of (applicable) mathematical knowledge is a central goal of mathematics education and has thus been a focus of mathematics education and related psychological research. Different approaches have been developed over time to characterize mathematical knowledge and its development, leading to a multitude of terms (e.g. De Jong and Ferguson-Hessler 1996) and perspectives. The main goal of our current work is to generate a local model of different levels of the mathematical knowledge that serves as a prerequisite for university Analysis lectures. In what follows, we highlight three perspectives that we used as a basis when deriving our model.

Conceptual Vs. Procedural Distinction

Differentiating knowledge of facts and knowledge of procedures has a long tradition in psychology (e.g. Anderson 1983) and resonates in mathematics education, e.g. in Skemp’s (1976) distinction between relational understanding (similar to conceptual knowledge) and instrumental understanding (similar to a superficial form of procedural knowledge). Even though concrete definitions vary, conceptual knowledge usually refers to a network of general facts, concepts, and principles, while procedural knowledge covers sequences of mental or concrete actions to achieve a specific goal (cf. Rittle-Johnson et al. 2015). With respect to measurement, it is widely agreed that procedural knowledge shows on tasks in a domain that participants have solved frequently before, whereas conceptual knowledge is best assessed with unfamiliar tasks (Rittle-Johnson et al. 2015). While procedural knowledge is usually restricted to solving well-delineated types of problems or sub-problems, conceptual knowledge can be applied more flexibly and broadly across a range of familiar and unfamiliar tasks. Other works use the distinction between “declarative knowledge” and “compiled knowledge” or “encapsulated knowledge” (De Jong and Ferguson-Hessler 1996; Schmidt and Rikers 2007). Both conceptual knowledge and procedural knowledge are assumed to occur in declarative as well as compiled or encapsulated forms. Similar to the work of Jukic and Dahl (2012), we are interested in distinguishing different types of knowledge and therefore use the terms conceptual and procedural knowledge. Approaches from mathematics education have proposed an integrated modelling of conceptual and procedural knowledge, for example as procepts (Gray and Tall 1994), or of their mutual relations, for example the process of treating a known procedure as a new mental object (reification, Sfard 1991).

Conceptual Knowledge as a Network of Representations

Being able to transfer between different representations when dealing with mathematical concepts has repeatedly been put forward as a central aspect of conceptual mathematical knowledge (Gagatsis et al. 2006; Kaput 1989; Nistal et al. 2014). This resonates with and extends, e.g., Vinner’s (1983) notion of concept images. Following this perspective, a major aspect of conceptual knowledge is being able to change flexibly between different structured external (e.g. graphical) or mental representations of a concept. For the function concept, for example, several works consider using different representations and linking different representations (Gagatsis et al. 2006; Ronda 2015). Ronda (2015) shows that only a few students gain the ability to deal flexibly with different representations in school, because they see the function more as a process than as a new object. This is of particular importance, since Gagatsis and Shiakalli (2004) have found that university students’ ability to change between representations of functions is positively correlated with their problem-solving skills in the area of functions. Specifically, at the transition to university mathematics, the increased role of more formal and symbolic representations is often mentioned as a key challenge for students (Clark and Lovric 2009; Tall 2008). In this vein, making sense of symbolic representations by activating more semantic representations, such as graphs, tables, and prototypical (generic) examples, might play a key role in the transition to university mathematics. For the concept of differentiation, Jones and Watson (2018) propose instructional goals that should be achieved in first-semester undergraduate studies. They assume that students should be able to describe all aspects of the derivative (ratio, limit, and function) in different representation formats. Attention has to be paid to ensuring that links between different representations are fruitful and not misleading.
Juter (2011) pointed out that pre-service teachers in Sweden reported invalid links between concepts such as limit, derivative, integral, and continuity, which might also lead to conceptual misunderstandings. She warns that some of these students, who assume invalid links between such concepts, might wrongly believe that they understand the concepts very well. Thus, the quality of links between different representations of a concept has to be taken into account when describing the conceptual knowledge of a person.

Levels and Dimensions of Conceptual Knowledge

Characterizing different levels of conceptual knowledge has been a focus of mathematics education research for decades. A famous example is the level model by Dina van Hiele-Geldof and Pierre van Hiele, which describes the acquisition of geometric concepts from an intuitive, visual understanding of geometric objects, via an apprehension of their properties and of relations between properties and between classes of geometric objects, to an increasingly global ordering of geometry (van Hiele 1957). However, such models mostly describe knowledge of a given concept or conceptual field. Other approaches, instead, focus on the knowledge necessary for coping with specific situations and demands, for example with respect to the professional work of mathematics teachers (Heinze et al. 2016; COACTIV: Krauss et al. 2013; Michigan group: Hill et al. 2008; TEDS-M: Buchholtz and Kaiser 2013) or, in our case, for mastering an undergraduate mathematics course. Tests assessing the mathematical knowledge necessary for a successful start into a science, engineering, or economics program often focus on computational, routine tasks and basic school mathematics (Greefrath et al. 2017; Laging and Voßkamp 2017), and rarely include more conceptual aspects (Hailikari et al. 2008; Rach and Heinze 2017). However, one central prerequisite for generating such level models is that the knowledge construct can be modelled by one single dimension that describes students’ performance over all items in a test.

Few studies use a multi-dimensional conceptualization of mathematics knowledge in the study entrance phase. Hailikari et al. (2007) distinguish between different types of knowledge, which the researchers measured with different instruments. The approach of separating prior mathematical knowledge into different dimensions, however, often leads to high correlations between the different measures of knowledge (Heinze et al. 2016). Hence, most of the studies in the field measure conceptual knowledge unidimensionally, and these studies have shown that such a conceptualization has good predictive power for students' success (e.g. Greefrath et al. 2017; Köller et al. 2006).

In the end, the choice of a unidimensional or multidimensional model to measure mathematical knowledge in the study entrance phase is not merely an empirical question, but also a question of the purpose of the measurement (Ufer and Neumann 2018). Since distinguishing different dimensions of mathematical knowledge empirically has been a challenge due to high correlations in the past (Heinze et al. 2016), a unidimensional model with different levels of knowledge might be the more feasible approach to characterize the knowledge that is necessary to succeed in first-semester mathematics courses. Apart from empirical considerations, level models have one more advantage. Existing studies such as the one of Greefrath et al. (2017) report and interpret positive correlations between the performance in a mathematics test at study entrance and study success after the first semester. These kinds of studies have argued that more prior knowledge leads to better study success. However, level models may provide tools for a criterial interpretation of students' test scores in terms of the demands a student can master. Level models that differentiate levels of knowledge of a broader range of concepts that are relevant in the transition to university mathematics may help to describe the type, quality, and content of knowledge that is necessary for success in the study entrance phase.

Summary: The Relation of Prior Mathematical Knowledge to Success in the Study Entrance Phase

Summarizing, it is well established from a theoretical as well as from an empirical perspective that prior knowledge of mathematical concepts is an important prerequisite for objective criteria of success in the study entrance phase. Indeed, the mathematical concepts students encountered at school and those analyzed in the first semesters of university studies are partly the same, in particular in Analysis lectures, e.g. limits or differentiable functions. However, the practices of dealing with these concepts change, so that students have to reorganize and restructure their mental representation of these mathematical concepts: "Such new ways of thinking often require students to make quite difficult reconstructions of their mathematical knowledge" (Selden 2005, p. 134). However, beyond a broad "more is better" statement, we can currently only hypothesize what characterizes a sufficient level of prior knowledge in this context. For example, the fact that many tests of prior knowledge used in research on the transition to university focus on computational, routine tasks and basic school mathematics (Greefrath et al. 2017; Laging and Voßkamp 2017) seems to indicate that a lack of these skills is perceived as relevant in research and practice. On the one hand, this resonates in statements that see "a serious lack of essential technical facility" as a major reason for students' problems (London Mathematical Society, Institute of Mathematics and its Applications, Royal Statistical Society 1995; cited in Clark and Lovric 2009, p. 757). On the other hand, a "marked decline in analytical powers when faced with simple problems requiring more than one step" (London Mathematical Society, Institute of Mathematics and its Applications, Royal Statistical Society 1995; cited in Clark and Lovric 2009, p. 757) is also identified as a cause.
This may suggest a stronger relation to conceptual knowledge, representational flexibility, and levels of conceptual knowledge that go beyond a mere description of properties. Finally, the central role of proof is highlighted as a cause for problems in the transition to university mathematics (Selden 2005). Being prepared for proofs would relate to the highest level in van Hiele's (1957) model, and students' proof validation and construction skills have been found to relate strongly to conceptual knowledge as well (Chinnappan et al. 2012; Sommerhoff 2017; Ufer et al. 2008; Weber 2001).

The Current Study

To approach the question of what constitutes sufficient prior knowledge in Analysis courses, we reanalyzed data from a set of existing studies that had used items from a common item pool (Knowledge for University Mathematics – Analysis; KUMA). These studies survey students’ prior knowledge for Analysis courses in mathematics programs at two German universities (Rach and Heinze 2017; Ufer 2015; Ufer et al. 2015), including pure and applied mathematics bachelor programs and mathematics teacher education programs. Data was also available on students’ success in first semester Analysis courses.

In particular, we addressed the following questions:

  1. Does a unidimensional model of mathematical knowledge fit the empirical test data?

We acknowledge that distinguishing different dimensions of mathematical knowledge might be feasible and desirable in some situations. However, we expected that a one-factor structure would fit the data sufficiently well (see Köller et al. 2006), so that we could identify hierarchical levels of prior knowledge using statistical models.

  2. Is it possible to identify different, coherent levels of mathematical knowledge?

The KUMA test items had been developed to cover a broad range of complexity, ranging from routine tasks, via tasks surveying conceptual understanding, to argumentation tasks. Therefore, we assumed that the items would span a range of difficulty levels. It was an exploratory question of the study whether items could be grouped so that each group would describe one of several levels of demands on prior knowledge, reflecting similar types of knowledge in terms of the perspectives described in the "Conceptualizations of Mathematical Knowledge" section.

  3. Is it possible to identify a level of prior knowledge that differentiates between students who pass the first semester Analysis courses successfully and those who fail?

Based on results from prior studies (Rach and Heinze 2017; Ufer 2015; Ufer et al. 2015), in which we were particularly interested in the influence of motivational variables on subjective and objective criteria of study success, we strongly expected that higher prior knowledge would go along with a higher passing rate. Beyond this well-known pattern (cf. "Empirical Results Concerning the Role of Prior Mathematical Knowledge in the Study Entrance Phase" section), our main question was which kind of knowledge would differentiate between students who master the transition (cf. "Advanced Mathematics as a Learning Content" section) and the knowledge reconstruction processes (cf. "Learning Advanced Mathematics: Reconstruction of Knowledge" section) in the first semester of undergraduate mathematics studies more or less successfully. We did not expect a discrete level of prior knowledge to definitively separate passing from failing students, but rather a marked yet gradual increase of the passing rate within a restricted span of the knowledge scale.

Methodology

The central question of our project is in which way successful students differ from failing students with respect to their prior knowledge. With such a characterization of the necessary prior knowledge, we might better inform, advise, and support students at the transition to advanced mathematical thinking. There are many inspiring (case) studies which analyze in detail learners' (re-)construction of knowledge concerning certain concepts and practices under the influence of certain teaching strategies (Alcock and Simpson 2002; Dahl 2017; Ghedamsi and Lecorre 2018). In contrast to this approach, we analyze the prior knowledge of a larger sample of students using quantitative methods. This allows us to use statistical tools such as the Rasch model that establish a connection between item properties and student skills.

Previous empirical studies indicate that it is adequate to measure prior knowledge in a single topic (here: differentiation) as a unidimensional construct. This is a basic prerequisite for deriving meaningful levels of prior knowledge. We tested this assumption for our instrument by answering research question 1.

As we were interested in distinguishing between qualities of knowledge, we asked whether it is possible to identify and characterize levels of knowledge. A prominent approach to develop such levels is the bookmark procedure: The items were sorted by their empirical difficulties and we analyzed contrasting demands of the items against the background of theoretical frameworks (described in the "Conceptualizations of Mathematical Knowledge" section). This method leads to a verbal description of the knowledge levels and a list of corresponding items which can be solved using knowledge on the respective level. The Rasch model then allows us to link students' knowledge scores to the demands of the items they were able or unable to solve. Using this link, it is possible to assign each student to a level of prior knowledge that he or she can master.

Our last question dealt with the relation between the individual knowledge level and success in the Analysis I course. As we wanted to analyze a broad sample of students from multiple Analysis I courses, we could not use scores or grades from the different exams of these courses. Instead, we used a (dichotomous) pass/failure indicator from each course as an indicator of study success. Logistic regression is the appropriate statistical model to relate the continuous knowledge scores to the dichotomous variable "course success", and in particular to indicate which level of knowledge is necessary to complete the course successfully with a specific probability (e.g. 50%). A detailed, descriptive analysis of this relation gives us further information on which level of knowledge is sufficient for students to be rather reliably successful in an Analysis I course.

Design

Our sample consists of 1553 students in the first semester of a mathematics program: 483 in pure mathematics or applied mathematics programs, 743 in teacher education programs for higher secondary schools, 78 in other programs (such as computer science or teacher education for other school tracks), and for 249 students we have no information about the study program. For this re-analysis, we combined data from previous projects (see Kosiol et al. 2019; Rach and Heinze 2017; Ufer 2015). The sample consists of five cohorts who started a university mathematics program in autumn 2010, 2011, 2013, 2014, and 2015. The first cohort was drawn from a university in a medium-sized city in Germany, the last four cohorts from a university in a large city in Germany. Students participated voluntarily in the mathematical knowledge test on the first day of a bridging course to university mathematics (only a part of the 2013 sample) or in a first lecture of the Analysis I course.

In these Analysis I courses, the first-semester students were introduced to mathematics as a scientific discipline with formal definitions of abstract concepts, mathematical theorems, and deductive proofs (see "Advanced Mathematics as a Learning Content" section). Accordingly, these courses were an important basis for further studies (see Weber 2008). At the end of the semester, students had to participate in an exam to successfully complete the first semester. For 705 students of our sample, we could obtain information from the course instructors on whether they were successful in the exam: 242 of these students passed the exam, 463 students failed.

The Knowledge for University Mathematics (Analysis) Test (KUMA)

The item pool for the KUMA test of 21 items has been developed by a group of researchers since 2008 and has been used successfully in several longitudinal studies in the study entrance phase (see Kosiol et al. 2019; Rach and Heinze 2017; Ufer 2015).Footnote 1 In each of these studies, we used eight to ten out of the pool of 21 items. The tasks deal with concepts such as real numbers, infinity, limits, and differentiable functions, and mainly assess basic technical skills, conceptual understanding, and argumentation skills related to these concepts. Based on the long-lasting use of the test, the research team has gathered a sound understanding of how students approach these items, e.g. by qualitative analyses of written answers.

For reanalyzing the data, we first examined all 21 items. We then excluded four of these items from the analysis, either because of low item fit or because they had been replaced by substantially revised versions in the meantime. This restriction did not narrow the content of the test substantially. Of the 17 remaining items, 6 were single choice items, 4 complex multiple choice items, and 7 open-ended items. Each item was scored dichotomously, i.e. one point for a solution which was accepted as correct and zero points for other solutions or missing answers, based on an extensive coding scheme with examples of accepted and non-accepted solutions. For every item, at least 172 and at most 1179 answers are available. For each pair of items, at least 126 cases contained data for both items.

The distractors of the single or multiple choice items are based on typical misconceptions that are described in many articles (irrational numbers: Sirotic and Zazkis 2007; see also Kidron 2018; limit and continuity: Bezuidenhout 2001; infinity: Kolar and Cadez 2012; derivation: Orton 1983). For illustration, we present the item “positive real numbers bounded” in detail:

  • Does a smallest positive real number exist? Which of the following statements is true?

  • ⎕ Yes, because you can find a number in ℝ, arbitrarily close to 0.

  • ⎕ No, because for every positive number there exists another number between 0 and this number.

  • ⎕ Yes, because the positive real numbers are bounded below.

  • ⎕ No, because the smallest positive number is not real, but rational.

  • The correct answer is the second statement. Students have to decide between "yes" and "no" and to assess which argumentation supports this decision.

Furthermore, we measured overall school achievement by the final school qualification grade. The scale was used with reversed polarity, so that it ranges from 4.0 (very good) to 1.0 (sufficient).

Statistical Analysis

Since not every student worked on all items of our test, we used the Rasch model (Rasch 1960) and structural equation modelling in MPlus (Muthén and Muthén 1998–2015) with categorical data to analyze the data. The Rasch model is a probabilistic test model that assumes that a student's (v) success on a specific item (i) is determined by two continuous parameters: a student score θv and an item difficulty σi. The Rasch model then assumes that the probability P(Xvi = 1) that student v solves item i correctly is given by \( P\left(X_{vi}=1\right)=\frac{\exp \left({\theta}_{v}-{\sigma}_{i}\right)}{1+\exp \left({\theta}_{v}-{\sigma}_{i}\right)} \).

In our case, this approach provides values for students' knowledge scores θv and for the item difficulties σi on a joint, linear scale. When the difficulty parameter of an item and a student's knowledge score have the same value, the model implies that this student will solve this item with 50% probability. Easier items with a lower difficulty parameter will be solved correctly by this student with a higher probability, and more difficult items with a higher difficulty parameter with a lower probability. Mapping student knowledge scores and difficulty parameters on a joint scale allows us to identify levels of knowledge that can be interpreted in terms of specific content-related demands, and to allocate students to these levels (cf. Ufer and Neumann 2018).
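As a hedged sketch, the Rasch probability can be computed directly; the student score and item difficulties below are illustrative values only, chosen to lie within the threshold range reported for this test:

```python
import math

def rasch_probability(theta: float, sigma: float) -> float:
    """P(X_vi = 1): probability that a student with score theta solves
    an item with difficulty sigma correctly, under the Rasch model."""
    return math.exp(theta - sigma) / (1 + math.exp(theta - sigma))

# When the student score equals the item difficulty, the model
# predicts a solution probability of exactly 50%.
print(rasch_probability(0.5, 0.5))    # 0.5
# An easier item (lower difficulty) yields a higher solution probability.
print(rasch_probability(0.5, -0.71))  # ~0.77
```

Note that only the difference θv − σi matters, which is why person and item parameters can be placed on one joint scale.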

To analyze if and how success in the Analysis I course depends on students' prior knowledge scores, we used logistic regression analysis. Logistic regression allows us to study the relation between one or more continuous independent variables (students' knowledge scores) and a dichotomous dependent variable (course success/failure). Logistic regression with one independent variable assumes that this relation is determined by two parameters B and δ, and that the probability that a student v with knowledge score θv succeeds in the course is given by \( P\left(v\ \mathrm{succeeds}\right)=\frac{\exp \left[B\left({\theta}_{v}-\delta \right)\right]}{1+\exp \left[B\left({\theta}_{v}-\delta \right)\right]} \). Thus, δ denotes the knowledge score of a student who has a 50% probability of succeeding in the course, and B describes how strongly the success probability is related to students' knowledge scores.
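The interpretation of the two parameters can be illustrated with a small sketch; the slope B = 2.0 and threshold δ = 0.4 below are hypothetical values chosen for illustration, not estimates from the study:

```python
import math

def success_probability(theta: float, B: float, delta: float) -> float:
    """Logistic regression: probability that a student with knowledge
    score theta passes the course, given slope B and threshold delta."""
    z = B * (theta - delta)
    return math.exp(z) / (1 + math.exp(z))

# At theta = delta the model predicts exactly 50%, independent of B.
print(success_probability(0.4, 2.0, 0.4))  # 0.5
# Above the threshold, the predicted success probability exceeds 50%,
# and a larger B makes the transition from failing to passing sharper.
print(success_probability(1.0, 2.0, 0.4))  # > 0.5
```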

To analyze both models within a joint framework, we implemented the Rasch model as a measurement model in a structural equation framework in MPlus and added the logistic regression on success in the Analysis I course to this model. Model fit was evaluated using established indicators and cut-off values such as chi-square statistics (criterion for acceptable fit: \( \frac{\upchi^2}{\mathrm{df}}\le 3 \); Schermelleh-Engel et al. 2003), the comparative fit index CFI (criterion for acceptable fit: CFI around .95 or higher; Hu and Bentler 1999), and the root mean square error of approximation RMSEA (criterion for acceptable fit: RMSEA < .08; Schermelleh-Engel et al. 2003).
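These cut-off criteria amount to a simple joint check, sketched below; the values plugged in for illustration are the fit statistics reported for the final unidimensional model in the Results section:

```python
def acceptable_fit(chi2: float, df: int, cfi: float, rmsea: float) -> bool:
    """Check the cut-off criteria used in the text:
    chi^2/df <= 3, CFI around .95 or higher, RMSEA < .08."""
    return (chi2 / df <= 3) and (cfi >= 0.95) and (rmsea < 0.08)

# Fit statistics of the final unidimensional Rasch model (see Results).
print(acceptable_fit(185.685, 134, 0.959, 0.016))  # True
```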

Identifying Levels of Demands

To obtain a deeper insight into the structure of the test, we grouped items together according to their content-related demands based on their difficulty parameters. To achieve this, we followed the bookmark procedure (Mitzel et al. 2001). Items were sorted into an Ordered-Item-Booklet by their empirical difficulty parameters. These Ordered-Item-Booklets were independently inspected by both authors to identify similarities and differences between the ordered items. Items were grouped together so that items in the same level were of similar complexity with respect to the required mathematical knowledge and the way this knowledge must be used to solve the items, but also so that items on different levels contrasted against the items on the neighboring levels in these respects. The second author created a first verbal description of four different knowledge levels and listed the items that belonged to each level. The first author compared these descriptions to her own analysis and made proposals for changes. In two more rounds, each author reconsidered the changed descriptions and item lists, until a consensus was reached. To map students' person parameters to the knowledge levels, cut scores that separated these levels were calculated as the mean value of the difficulty parameter of the hardest item of one level and that of the easiest item of the next higher level.
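The cut-score rule at the end of this procedure can be sketched as follows; the difficulty parameters below are hypothetical values for a four-level model, not the estimated parameters of the KUMA items:

```python
def cut_scores(level_difficulties: list[list[float]]) -> list[float]:
    """Cut scores between adjacent levels: the mean of the hardest item
    of one level and the easiest item of the next higher level.
    level_difficulties holds the item difficulty parameters per level,
    ordered from the lowest to the highest level."""
    return [
        (max(lower) + min(upper)) / 2
        for lower, upper in zip(level_difficulties, level_difficulties[1:])
    ]

# Hypothetical difficulty parameters for a four-level model:
levels = [[-0.7, -0.5], [-0.3, 0.1], [0.3, 0.8], [1.0, 1.3]]
print(cut_scores(levels))  # three cut scores: approx. [-0.4, 0.2, 0.9]
```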

This method allows us to cluster the content-related demands of the different items into different levels of knowledge, as it is surveyed in the test. Using the Rasch model, a student's test performance can also be interpreted in relation to these levels, indicating the border between tasks the student is able to master successfully and tasks (s)he cannot solve reliably. Describing these levels based on the concrete items is, of course, an act of interpretation. This interpretation is based on a) information about the educational context, in particular the mathematical knowledge and strategies which students will most likely have encountered at school (see "The Transition from School to University in Mathematics" section) and b), based on this, a more or less implicit understanding of how students will approach and solve the different items. Thus, the resulting model will, at first, be restricted to this specific educational context, and its validity will depend on how realistic the assumed solution processes are. In particular, the model tries to capture the complexity of the mathematical demands in the items, based on difficulty parameters. Of course, it is possible to construct items that are very difficult to solve, independently of their mathematical complexity (e.g., by including complex language constructs in item formulations, or using very large numbers). So, the interpretation requires that c) the applied items, including the scoring procedures, primarily capture relevant aspects of mathematical complexity but avoid other factors that might make the items difficult without relating to mathematical complexity.

To evaluate the rater reliability of our level model, we provided our items as well as the final level description to members of a group of German experts in mathematics education with specialization in test instruments for the mathematics study entrance phase. We asked each expert to assign each item to one level. There was good consensus. After discussing the anticipated solution processes of the items, all disagreements were resolved. Based on this discussion, the level descriptions were refined slightly. In a last round, the model was used to categorize items from our test and five other tests used in similar projects by two independent raters each, reaching good agreement.

Results

The items were solved correctly by 9.0% to 76.1% of the students who worked on them (Mean solution rate: 40.3%, SD: 21.2%). This indicates that some items in the test were rather hard, but that the test covered an acceptable range of difficulty.

Model-Fit and Unidimensionality

To answer the first research question, we examined the fit of the unidimensional Rasch model to our data. Since the original model showed somewhat low fit indices, we inspected the reasons in detail. Subsequently, we added one residual correlation between two items which deal with very similar content (existence and density of irrational numbers). This model has adequate to good fit indices: χ2(134) = 185.685 (p < .01); CFI = .959, TLI = .958, RMSEA = .016.Footnote 2

The mean of the person parameters was constrained to zero. The item thresholds vary between −0.71 and 1.30. To test to what extent prior knowledge as a construct overlaps with previous school achievement, we analyzed the correlation between students' prior knowledge and their final school qualification grade. Prior knowledge scores correlate positively with the school qualification grade (r = .12, p < .001). The weak correlation indicates that prior mathematical knowledge can be empirically separated from general school achievement. This supports the discriminant validity of the measurement of prior knowledge.

Levels of Demands in the Test

Using the bookmark procedure described above, we analyzed the items based on the difficulty parameters from the statistical model and the mathematical demands posed by the items. Table 1 shows the identified levels and cut scores, each illustrated by an item example. Even though the KUMA test spans a spectrum of mathematical concepts from the domain "Analysis", such as real numbers and differentiable functions, the concepts covered in the items alone were not sufficient to explain the observed item difficulties. To explain these differences, it turned out to be important how students had to deal with these concepts in the different items. Based on the theoretical perspectives concerning "conceptual and procedural knowledge", "conceptual knowledge as a network of representations" and "levels and dimensions of conceptual knowledge" (see "Conceptualizations of Mathematical Knowledge" section), we derived the following level model:

Table 1 Levels of knowledge in the KUMA test with item examples and difficulty parameters

Items on level 1 require students to apply well-known procedures that can be identified directly from the problem presentation, such as calculating the derivative of a polynomial (see item example), or to evaluate mathematical statements mainly concerning simple characteristics of real numbers. These statements are formulated mostly verbally, with little use of symbolic notation beyond that frequently used in secondary school (see item example). In sum, procedural knowledge and knowledge about facts are sufficient to solve items on this level.

In items on level 2, students have, for example, to provide examples of mathematical concepts like irrational numbers (see item example) or to state the maximum domain of the composition of two functions. Items on this level do not clearly indicate a solution procedure, so conceptual knowledge is necessary to identify which known strategy can be applied, for example to generate a list of five irrational real numbers (see a deeper analysis of students' solutions below). In German school classes, a list of five irrational numbers is not a fact that students have to memorize. Only basic changes of representation from the verbal problem statement to one well-known representation of the problem are necessary, e.g. the number line for real numbers.

Level 3 consists of items which require a deeper understanding beyond well-known procedures and representations. These items particularly require students to activate, integrate, and investigate different (mentally or externally drafted) representations of a concept. However, these items mainly focus on working with the concept image of these concepts, not with a formal concept definition. The item example deals with the values of tangent and sine. We expected that students solve this item by picturing the graphs of the tangent and sine functions, or by picturing tangent and sine on the unit circle, and translating the graphical information into symbolic form. Mnemonics for this problem are not common in German classrooms. Solving items on level 2 or on level 3 requires conceptual knowledge. Items on level 3 differ from those on level 2 in the demands of dealing with representations. Items on level 3 require a deeper understanding of the concepts. This means that students need to construct a mental or graphical representation on their own, that is not provided in the item, or to make meaningful links between different representations of one concept to solve the problem (cf. Ronda 2015). For items on level 2, only dealing with one representation of the concept is necessary, which is either presented in the item or can be considered a prototypical representation of the concept (e.g., symbolic representation for a number). So, these two levels differ in the quality of conceptual knowledge which is needed to solve the corresponding items.

Items on level 4 require working with mathematical concepts in a formal way or conducting deductive proofs using formal notation (e.g., variables, function notation, limit notation). For example, students have to prove that the absolute value function is not differentiable at x = 0 (see item example). Solving these items requires well-connected conceptual knowledge including formal representations of concepts.

The students are distributed across the knowledge levels as follows: 251 students on level 1, 811 students on level 2, 480 students on level 3, and 11 students on level 4.

To illustrate our analysis, we present in detail students' solutions concerning two of the open-ended items. One item allocated to level 2 asks students to state five different irrational numbers. Solving this item requires identifying known examples of real numbers that are not rational, but also applying ways of generating new irrational numbers from given ones (e.g., by multiplying with a non-zero rational number). We note that many students provided a selection of a small set of well-known irrational numbers such as \( \sqrt{2} \) (84% of 172 analyzed solutions), \( \sqrt{3} \) (60%), π (60%), e (53%). To generate five numbers, the students had to identify strategies that generate irrational numbers beyond these typical representatives. Here, a range of errors occurred that indicate invalid strategies or basic conceptual misconceptions: \( \sqrt{4} \) (5 answers) or \( \sqrt{9} \) (6 answers), natural numbers (3 answers), proper (3 answers) and periodic decimal rational numbers (3 answers), and fractions (6 answers).

Another item, allocated to level 4, asked students to prove that the absolute value function is not differentiable. This specific problem is part of many curricula in Germany, but proving such statements is surely not a well-known procedure; many students showed serious problems solving this item (one typical student answer is shown in Fig. 1). Some students drew the graph of the absolute value function and argued only with the concept image of a differentiable function: a differentiable function must not have a kink. To receive the point, students had to provide a proof by using the definition of a differentiable function and the limit concept with corresponding notation, so well-connected conceptual knowledge including formal notations is needed.

Fig. 1
figure 1

Example of a student’s answer to an item example of level 4 (see Rach and Heinze 2017)

Prior Knowledge and Study Success in the First Semester

To answer the third research question, we restricted our structural equation analyses to those students for whom we have data on success in the Analysis I module from their first semester (N = 705). We specified a structural equation model with the dichotomous outcome success in the first-semester Analysis I module (0 = fail, 1 = pass) and prior mathematical knowledge as independent variable. The model still shows good to acceptable fit indices: χ2(150) = 202.822 (p < .01); CFI = .940, TLI = .939, RMSEA = .022. Mathematical knowledge turns out to be a strong predictor of this objective criterion of study success (standardized regression coefficient: β = .752, p < .001).Footnote 3

Using exported factor scores from the structural equation model with all students and the estimated logistic regression equation, we computed a predicted success probability for each student based on their prior knowledge scores. We derived a success prediction for each student by checking whether this probability was at least 50% (predicting success) or below 50% (predicting failure). Out of 602 students for whom the model predicted failure, 72.3% failed the Analysis I course. Out of 103 students for whom the model predicted success, 72.8% succeeded in the Analysis I course. Thus, the prediction based on students' prior knowledge is quite accurate when predicting failure or success in the first semester. However, the result also reflects that low prior knowledge might be compensated by other student characteristics, which are beyond the scope of this study.
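The reported accuracy percentages correspond to a simple computation over predicted and actual outcomes; the tiny data set below is synthetic and only illustrates the bookkeeping:

```python
def rate_correct(records: list[tuple[bool, bool]],
                 predicted_success: bool) -> float:
    """Among students with the given prediction, return the fraction
    whose actual outcome matched the prediction.
    records: list of (predicted_success, actual_success) pairs."""
    group = [actual for pred, actual in records if pred == predicted_success]
    matches = sum(1 for actual in group if actual == predicted_success)
    return matches / len(group)

# Synthetic example: 4 predicted failures (3 actually failed),
# 2 predicted successes (1 actually succeeded).
records = [(False, False), (False, False), (False, False), (False, True),
           (True, True), (True, False)]
print(rate_correct(records, False))  # 0.75
print(rate_correct(records, True))   # 0.5
```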

The threshold parameter δ of the logistic regression, i.e. the score of prior knowledge at which the model predicts a 50% probability of succeeding in the Analysis I course, was δ = 0.41 (SE = 0.048). In our level model, this score is in the lower part of level 3. Plotting the observed probability density of success in the Analysis I course by the prior knowledge score (Fig. 2, dashed line; Gaussian kernel with width 0.18) illustrates how well the empirical data fits the logistic regression model (Fig. 2, dotted line). Figure 2 also indicates that students who can master knowledge items on level 3 (scores above 0.93) have a high predicted chance of succeeding in the Analysis I course, while students who have substantial problems with these items (scores below 0.19) have a quite low predicted chance of success. Thus, whether students have the knowledge described by level 3 of our model at their disposal seems to be particularly relevant for success in the first-semester Analysis I courses in our study.

Fig. 2

Plot of observed (dashed line, kernel-density plot with Gaussian kernel with kernel width 0.18) and predicted (dotted line) success probabilities by knowledge score; solid line indicates information density; each dot represents a single case (failure = 0, success = 1)

Discussion

Summary

The paper aims to contribute to understanding students’ problems in the transition to university programs with a major in mathematics. The underlying problems in the first semester can be caused by substantial differences in learning processes between school and university. The subject itself changes from an application-oriented form of mathematics in school to mathematics as a scientific discipline at university (Gueudet 2008), and the learning processes change from a guided to a self-regulated form. Thus, in Germany, the institutional transition from school to university coincides with the transition to advanced mathematics. Empirical research has shown that students’ prior knowledge is a decisive factor for successful learning at this transition (e.g. Greefrath et al. 2017; Hailikari et al. 2008). One explanation for this influence of prior knowledge is that successful learning at the transition requires substantial restructuring of students’ understanding of mathematical concepts and procedures (e.g. Artigue 1999). Our main goal in this study was to analyze whether – and in particular which kind of – prior mathematical knowledge differentiates students who complete a first-semester Analysis I course successfully from those who fail. To answer this question, we reanalyzed data from a test of mathematical knowledge (KUMA) that takes into account the specific form of mathematics in tertiary courses. As a central prerequisite for our approach to the data, we found that a unidimensional model fitted the data.

There are innovative results from our study that go well beyond prior research: (1) We were able to establish a model of four levels of prior knowledge for the transition to university mathematics, which allows us to obtain a criterial interpretation of students’ test scores in terms of the demands they can master using this prior knowledge. To our knowledge, such a model did not exist before in the literature. (2) We used this model to identify a level of knowledge that characterizes those students who have very good chances of succeeding in first-semester mathematics courses.

To achieve (1), we established that an assumption of unidimensionality was justified for the knowledge construct measured by the KUMA test and our student population. Based on this assumption, we generated a model of four levels of prior knowledge for university mathematics using the bookmark procedure. These levels range from procedural knowledge and knowledge about facts (level 1) through basic conceptual knowledge (level 2) and connected conceptual knowledge incorporating multiple mental representations of mathematical concepts (level 3) to knowledge that is connected to formal notations and central mathematical practices like proving and defining formally (level 4). Our sample turned out to be quite heterogeneous with respect to prior mathematical knowledge, but only a small number of students was assigned to the highest level. A substantial number of students, though by far not the majority, were on level 1 and could hardly go beyond reproducing basic conceptual knowledge and well-known procedures.

Regarding (2), our study showed again that prior mathematical knowledge strongly predicts success in first-semester Analysis I courses. Students who enter university with deeper knowledge can probably integrate new information, e.g. formal concept definitions or new aspects of a concept, better than students with a weaker knowledge base. The central innovation of our study is that the level model allows us to go beyond the usual interpretations along the line of “more is better” in prior studies (e.g. Greefrath et al. 2017; Halverscheid and Pustelnik 2013; Köller et al. 2006; Rach and Heinze 2017). Using logistic regression models as well as descriptive analyses, we found that in particular the knowledge described by level 3 of our model differentiates between students who succeed and those who fail in the first semester. This has some central implications: i) Even though proof and formal representations are considered specific characteristics of academic mathematics (Engelbrecht 2010; Gueudet 2008), the corresponding mathematical knowledge including connections to formal representations seems to be a sufficient (with some variation remaining due to factors other than knowledge), but not a necessary prerequisite to succeed in Analysis I courses. ii) Knowledge of mathematical procedures without a substantial conceptual basis is not sufficient to succeed in such courses and to master the knowledge reconstruction processes assumed in the literature (cf. “Prior Knowledge as a Learning Prerequisite” section). On the one hand, this supports mathematicians’ concerns about “lacking technical facility” (London Mathematical Society, Institute of Mathematics and its Applications, Royal Statistical Society 1995, cited in Clark and Lovric 2009); on the other hand, it counters hasty calls for schools to focus on building up these technical skills.
iii) The main difference between students who pass and students who fail Analysis I courses seems to be the availability of well-connected conceptual knowledge without necessarily including formal symbolic representations. Such knowledge is surely within the scope of most upper secondary school curricula focusing on the flexible use of mathematical knowledge, e.g. in practices such as problem solving, argumentation, or communication.

Finally, a more descriptive analysis showed that predicting study success, for example for student advising, is a double-edged sword. The method detects successful and failing students at a good rate, but not perfectly. This is in line with other studies showing that mathematical knowledge – even though important – is not the only relevant determinant of success in the transition to university mathematics (Hailikari et al. 2008; Laging and Voßkamp 2017). It remains an interesting question whether, and to what extent, a wider set of student characteristics allows for more accurate predictions of success and failure.

Limitations

The limitations of this empirical study concern the sample, the exclusion of other control variables, and our general approach. The sample of first-semester students majoring in mathematics from two universities is surely not representative of German mathematics programs. As these students came from many different schools with different classroom learning cultures, we cannot draw any conclusions on the reasons for students’ heterogeneous conceptual and procedural knowledge, for example differences in their classroom instruction at school. Besides, to link mathematical knowledge to objective criteria of study success in the first semester, we could use only a reduced sample because not all students provided the information, for example because they dropped out of the course before the exam took place. It is an open question to what extent the results of our study generalize to other educational systems. On the one hand, substantial differences in the organization of secondary and tertiary education (e.g., Pechar and Andres 2011) make a direct generalization questionable. On the other hand, mathematical work has been described as very coherent across subfields and countries (Heintz 2000), and prior knowledge has been found to predict success in the first mathematics semester in countries other than Germany as well (e.g., Finland: Hailikari et al. 2008). To connect our results with actual teaching, future studies should pay attention to the knowledge-related demands of the teaching-learning processes during the first semester. These demands determine which prior knowledge is necessary for successful further learning at university.

To make an explicit, meaningful link between levels of mathematical knowledge and students’ success, we had to focus our statistical analysis on a single predictor. Of course, many other cognitive as well as non-cognitive student characteristics and features of the university courses have been assumed to influence students’ success at the transition to university mathematics. This includes general school achievement (Trapmann et al. 2007), interest (Kosiol et al. 2019; Schiefele et al. 1995), self-concept (Robbins et al. 2004), study motives (Ufer 2015), learning behaviors (Valle et al. 2003), and others (di Martino and Gregorio 2019), although studies could not always establish the expected relations. Further research is necessary to clarify the interaction between these variables in their relevance for objective and subjective criteria of study success, but this goes beyond the focus of this contribution.

Regarding our main approach, our analysis is based on statistical data aggregation in a first step, combined with an in-depth analysis of the items in the KUMA test. Even though we have illustrated students’ solutions for selected open-ended items, qualitative analyses might back up the assumed solution processes involved in the interpretation of the items’ difficulty parameters. Our study is based on knowledge about such processes from our own research and the literature, e.g. on students’ conceptual understanding and argumentation skills (e.g. Kolar and Cadez 2012; Orton 1983). While such qualitative analyses allow a deeper exploration of concepts, the statistical approach chosen here enables us to integrate data from a large number of students and relate it to students’ success in the Analysis I course in a meaningful way, providing a broader picture of relevant knowledge facets.

Conclusions

Summing up, prior mathematical knowledge predicts exam performance, a so-called objective criterion of study success, in university Analysis I courses. More precisely, well-connected knowledge about concepts of school mathematics is an essential learning prerequisite besides procedural skills, whereas knowledge about formal symbolic representations is not necessarily required. We can describe these different levels of mathematical knowledge with the four-level model presented in this paper. This four-level model shows which knowledge is necessary for a successful start in a mathematics program, beyond just saying that “more knowledge is better”.

We draw three practical implications from this central result. Firstly, by administering a test of prior mathematical knowledge before study entrance, the individual results can support study advice to a certain extent. For example, when providing students with feedback on their prior knowledge, including a description of the student’s knowledge level may make the feedback more useful. It may stimulate a comparison against a well-defined reference, thus going beyond social comparisons – comparisons between students – and dimensional comparisons – comparisons within a student concerning knowledge of different fields. Moreover, specific support might be offered to foster the transition from one level to higher levels of prior knowledge. We are optimistic that similar tests can be developed based on this model for other content areas, such as Linear Algebra. Secondly, the results of the study make transparent which prior knowledge students should ideally bring along from school. This and further research may support the communication between school and university stakeholders by explicating which prior knowledge is helpful or even necessary for a successful transition to mathematics university programs. Thirdly, additional university courses supporting students in their transition to university might focus in particular on flexibly changing between representations and, if necessary, the more basic knowledge facets described by levels 1 and 2. Bridging courses are offered at many German universities before the first semester. If such courses emphasize only a repetition of school mathematics on a technical and basic conceptual level, their effectiveness must be questioned (Clark and Lovric 2009).

Regarding future research, using the Rasch model as an example of item response theory proved helpful to explore the influence of knowledge in more depth than the simple “more is better” statements found in prior studies (cf. Ufer and Neumann 2018). Based on this method and an analysis of the KUMA items, we could propose a possible model of different levels of the prior knowledge that is considered a prerequisite of university mathematics studies, specifically in the field of Analysis. Future research should examine how these levels can be transferred to other contexts; at best, the model may serve as an aid to interpreting students’ performance in tests similar to KUMA. Apart from the concrete levels in our model, we see the main approach as extendable to other areas of mathematics education. Statements about the knowledge necessary to succeed in an educational context or to benefit from a specific intervention are more helpful if they can be connected to concrete demands that can be mastered using this knowledge. The proposed approach may help to arrive at such a deeper understanding beyond a more-is-better interpretation of knowledge scores in other areas of mathematics education as well.