1 Introduction

The value of capturing student sentiment has received increasing attention from researchers in higher education (Knight et al., 2020; Linnenbrink-Garcia & Pekrun, 2011). Apart from its obvious benefits for student well-being, understanding such sentiment is valuable for understanding the changes that could or should be made in curriculum design (Dunbar et al., 2014; Baxter-Magolda, 2003). Students’ sentiment is recognised as an integral part of student learning and a critical element in the learning process (Linnenbrink-Garcia & Pekrun, 2011; Henritius et al., 2019), as it is closely intertwined with students’ motivations and strategies for learning, self-regulation, performance and academic achievement (Mainhard et al., 2018; Mega et al., 2014; Pekrun et al., 2002). There are concerns, however, that research on student sentiment at university is fragmented (Mega et al., 2014). A focus on the perspectives of students is essential to the development of analytics related to their needs, rather than to the needs of institutions (Ferguson, 2012).

LA has emerged as an area with high potential for improving student learning experiences and curriculum design (Henritius et al., 2019; Ferguson, 2012). It belongs to a suite of ‘smart technologies’ (e.g., big data), also referred to as ‘intelligent technologies’, that are increasingly being promoted as the solution to ‘smart education’ (Zhu et al., 2016), as their use focuses on how learning data can be utilised to improve teaching and learning (Mayer-Schönberger & Cukier, 2013; Picciano, 2012). LA involves the use of “analytic techniques integrated with learning outcomes assessment to better understand student learning and more efficiently and meaningfully target instruction, curricula and support” (Bach, 2010, p. 2). Integrating LA with teacher inquiry has been identified as critically important (Bos & Brand-Gruwel, 2016; Lockyer et al., 2013). Yet, there is a scarcity of research that examines the adoption of LA to support teacher inquiry (Dyckhoff et al., 2013; Mor et al., 2015; Sergis & Sampson, 2017). Moreover, educators may not always have the discretion to adopt work-related technologies such as LA, as this decision is usually made at the organisational or departmental level (Orlikowski, 1993; Fichman & Kemerer, 1997). We use the assimilation stages of innovation framework proposed by Gallivan (2001) as it acknowledges that the adoption of technology is not always decided by the individual (Cooper & Zmud, 1990).

Despite the increased attention that LA has received from researchers in recent years, “little research attention has been placed on providing recommendations to educators for translating the analysed data to actionable reflecting actions on their educational design and delivery” (Sergis & Sampson, 2017, p. 20). Further, LA has traditionally been applied to understand and optimise the learning process at module level, even though it can also be used to understand and optimise learning at the program level (Ochoa, 2016).

To address this gap in knowledge, the overarching aim of this study is to explore “how adopting learning analytics can be used to understand students’ sentiment about their learning experience, and to use this understanding to inform teacher inquiry”.

The focus of this study is important as there is a noticeable under-representation of studies that directly engage with students in shaping the curriculum design process and report the value of such engagements (Trowler & Trowler, 2010; Bovill, 2013; Campbell et al., 2007). We also provide recommendations, which is important because the process of obtaining actionable insights for curriculum design is generally considered to be a time-consuming activity for educators (Sergis & Sampson, 2017; Marsh & Farrell, 2014; Mor et al., 2015).

In the context of this study, we adopt the view that the student voice is about actively involving students in evaluating and redesigning curriculum (Bovill et al., 2011; Bovill, 2013; Trowler & Trowler, 2010) as it has a unique perspective on teaching and learning, and therefore, it warrants the attention and response of educators (Rudduck, 2007; Fielding, 2001; Hattie, 2008). Further, although the student voice has increasingly gained prominence in higher education (Campbell et al., 2007), the focus has primarily been on quality assurance with less attention given to active student involvement (Seale, 2009).

The remainder of this paper is structured as follows. First, the theoretical background to LA and teacher inquiry is presented. Next, the research method and background to the case studied are presented. Then, the findings and analysis are presented, followed by the discussion, recommendations, implications, and a research agenda. The paper ends with a conclusion.

2 Theoretical Background

2.1 Overview of Analytics

The term ‘analytics’ is interpreted differently across university stakeholders (Roden et al., 2017), be that across different academics, different academic departments, or different business units. Analytics are categorised as descriptive, diagnostic, predictive, and prescriptive (see Fig. 1). In a broader context, analytics falls under the umbrella of ‘business analytics’, a holistic approach that uses various technologies, methodologies, and applications to manage, process and analyse data in ways that can lead to actionable insights and enable organisations to predict and respond to change. Business analytics has received increased attention from academics and practitioners as a means of generating and using data for operational and strategic purposes to deliver business value (Chatterjee et al., 2021; Gupta et al., 2020; Wamba et al., 2015).

Fig. 1 Types of analytics (Dennehy, 2020)

Understanding different analytics types will also inform the LA initiative, and specifically how the data will be modelled. As the context of this study is teaching and learning, the remainder of this section discusses the role of both academic analytics and LA.

2.2 Academic and Learning Analytics

Academic analytics refers to the use of analytics within academic settings and may be applied at the level of the institution, the department, or the learner, depending on the goals and objectives of the analysis (Van Barneveld et al., 2012; Dunbar et al., 2014). In the context of this study it is used to transform teaching, learning, assessment, and curriculum design (Siemens & Long, 2011).

Academic analytics consist of two types of applied analytics called ‘institutional analytics’ and ‘learning analytics’ (Dunbar et al., 2014). Institutional analytics is generally used to understand factors that relate to running the business of the higher education institution, such as predicting student success and retention rates (Oblinger, 2012). Learning analytics focuses specifically on students and their learning behaviours (van Barneveld et al., 2012; Siemens & Long, 2011). LA was defined in 2011 by the Society for Learning Analytics Research at the 1st International Conference on Learning Analytics and Knowledge as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Ferguson, 2012). This definition includes techniques such as predictive modeling, building learner profiles, personalised and adaptive learning, optimising learner success, early interventions, social network analysis, concept analysis, and sentiment analysis. While there is no generally accepted definition of LA, it can play a critical role in understanding student learning and curriculum design, by supporting evidence-based practices derived from relevant measures of learning processes, outcomes, and activities (Mangaroska & Giannakos, 2018). Siemens and Long (2011) outline the differences between academic and LA (see Table 1).

Table 1 Differences between academic and learning analytics

LA holds the potential to (i) detect and explain unexpected learning behaviours and misplaced efforts, (ii) identify successful learning behaviours and patterns, and (iii) introduce appropriate design interventions (Siemens & Long, 2011; Mangaroska & Giannakos, 2018). The link between LA and teacher inquiry is based on the premise that comprehensive data capturing and analysis is conducted at various levels (i.e., module, programme) to inform and influence the learning experience, the design process (or its ensuing refinement) and the community of curriculum designers (Hernández-Leo et al., 2019). There are several sources of information that can be used to analyse a program curriculum, with surveys about students’ perceptions and sentiments being the most popular tool in curricula analysis (Ochoa, 2016).

2.3 Synergies Between Teacher Inquiry and Learning Analytics

Teacher inquiry is defined as a cyclical process in which “teachers identify questions for investigation in their practice and then design a process for collecting evidence about student learning that informs their subsequent educational designs” (Avramides et al., 2015, pp. 249–250). Teacher inquiry is a process that can guide reflection on, and enhancement of, curriculum design and delivery in a systematic and evidence-based way (Dana & Yendol-Hoppey, 2014). In essence, teacher inquiry is a form of ‘action research’ as educators are in a position to determine questions to critically evaluate the design and delivery of their curriculum and choose appropriate data collection techniques (i.e., learning analytics) to answer these specific questions (Feldman et al., 2018; Sergis & Sampson, 2017). There are a number of generic steps involved in teacher inquiry, which are listed in Table 2 and mapped to LA (Sergis & Sampson, 2017; Timperley et al., 2010; Hansen & Wasson, 2016). To the best of our knowledge, this is the first study to align learning analytics with teacher inquiry.

Table 2 Mapping learning analytics with the steps of teacher inquiry

LA can also support educators in reflecting on and improving curriculum design and delivery through the evidence-based insights it generates (Bakharia et al., 2016; Sergis & Sampson, 2017; Greller et al., 2014). Therefore, LA supports the concept of teacher inquiry (Mor et al., 2015) and can be linked to the teacher inquiry cycle (Sergis & Sampson, 2017).

Despite the critical importance of integrating LA with curriculum design (Bakharia et al., 2016, Lockyer et al., 2013), there is limited research that reports on the actual use of LA to support curriculum design (Bakharia et al., 2016; Dyckhoff et al., 2013; Sergis & Sampson, 2017). Most concerning is that LA are increasingly being implemented in different educational settings, often without the guidance of a research base (Siemens, 2012). In addition, there is very little research on how to analyse the learning process at the program level in order to guide the design or redesign of a curricular program. Research does suggest, however, that educators may lack the skills and knowledge to formulate questions and identify solutions (Olah et al., 2010; Means et al., 2011) or they may not always know how to make sense of the data in order to inform curriculum redesign (Olah et al., 2010; Heritage et al., 2009; Young & Kim, 2010).

While many LA studies identify patterns in students’ learning behaviour, which are then related to academic performance, understanding of the pedagogical context that influences student activities is lacking (Lockyer et al., 2013; Gasevic et al., 2016). A related issue is the need to use the ‘actionable insights’ generated from the use of LA to make appropriate design interventions to improve learning (Clow, 2013; Campbell et al., 2007). Most concerning is that LA, an interdisciplinary field that adopts methods and frameworks from other disciplines, lacks a consolidated model to systematise how those disciplines are merged together (Gašević et al., 2017; Mangaroska & Giannakos, 2018).

2.4 Assimilation Theory as a Means to Examine the Adoption of Learning Analytics

This study draws on innovation assimilation theory. Assimilation is defined by Meyer and Goes (1988, p. 897) as “an organisational process that (i) is set in motion when individual organisation members first hear of an innovations development; (ii) can lead to the acquisition of the innovation; and (iii) sometimes comes to fruition in the innovation’s full acceptance, utilisation, and institutionalisation”. We apply the assimilation framework for innovation adoption proposed by Gallivan (2001), which in turn was heavily influenced by the earlier work of Cooper and Zmud (1990). The framework has been used to study the diffusion and assimilation of information technology innovation (Fichman, 2000), software process innovations (Fichman & Kemerer, 1997), e-business (Zhu et al., 2006), and enterprise information systems (Liang et al., 2007; Saraf et al., 2013).

While several researchers have proposed various frameworks describing the technology implementation process in organisations, Gallivan’s framework is one of the most cited frameworks (Weible & Hess, 2018). In recent years it has been used to study contemporary technologies such as big data (Bharati & Chaudhury, 2019; Weible & Hess, 2018), cloud (Ooi et al., 2018), social media (Cao et al., 2018), and contemporary processes such as agile (Wang et al., 2012), and Kanban (Ahmad et al., 2018). The six assimilation stages of innovation are adapted for this study and are therefore restated in the context of LA rather than technology generally. These adapted definitions are listed and explained in Table 3.

Table 3 Stages of assimilation (adapted from Gallivan, 2001)

A strength of this framework is that it acknowledges the realities of adoption within organisations, particularly when adoption decisions are made at the organisational, departmental, or workgroup level, rather than at the individual level (Orlikowski, 1993; Fichman & Kemerer, 1997).

3 Research Method

3.1 Background to the Case Studied

The research described in this paper follows the principles of a case study method (Yin, 2009). The context of the case is a one-year full-time master’s programme in business analytics at a university in Ireland. The specific case was purposefully chosen because (i) monitoring student sentiment was critical as the programme underwent significant growth each year, (ii) the student population was diverse, so the programme had to be designed for inclusive teaching, and (iii) the programme director was keen that students had a positive learning experience.

Background to the Case

The Master of Science (Business Analytics) is designed as a specialist programme, which assists students to blend their existing talents with the analytical skills and business knowledge needed to use and manage big data and business analytics in knowledge-based companies. The programme is aligned with Ireland’s National Skills Strategy 2025 by placing a strong focus on providing skills development opportunities that are relevant to the needs of learners, society and the economy.

Programme learning outcomes

The learning outcomes are intended to equip students with the required industry-standard skills and knowledge: (A) understand and be able to use specific IT which is used in developing business analytics. (B) analyse and solve business problems using applied data analytics. (C) understand and apply techniques for managing IT in organisations. (D) identify, analyse and solve applied problems in individual and team-based settings. (E) apply effective data-driven decision-making to global business and social problems.

Programme Outline

The programme consists of 90 ECTS (European Credit Transfer and Accumulation System) credits. Modules (see Table 4) are worth 5 ECTS, with the exception of MS5103 Business Analytics Project (30 ECTS) and Business Analytics with third party software (10 ECTS). The programme commences in September and consists of three terms: September to December (Term 1), January to April (Term 2), and April to August (Term 3).

Table 4 Modules offered for 2020-21 academic year

Programme Reputation

The MSc (Business Analytics) programme is the largest of its kind in Ireland and one of only two such programmes in Ireland that qualified to be ranked by the Quacquarelli Symonds (QS) rankings in 2020 and 2021. In 2020, the programme was ranked No. 1 in the world for ‘value for money’ and in 2021 it ranked in the top 43% in Europe for ‘alumni outcomes’ and ‘thought leadership’. The programme was also awarded the Dean’s Award for Inclusive Teaching and Learning (Team Award) in 2019. These endorsements, coupled with an excellent team of academics and administrators, regular engagement with students and alumni, and the sharing of student events, awards, and First Destinations reports on social media, are possible reasons for the continued growth of the programme (see Table 5).

Table 5 Number of applications and enrolments between 2015 and 2019

Student Profile

Students from Ireland, India, UK, France, Pakistan, Nigeria, Greece, Brazil, China, USA, Ghana, Germany, Mexico, Indonesia, and Malaysia are largely represented on the programme each year. Students present with a range of industry experience (e.g., 1–8 years) and their academic background is varied (e.g., engineering, information systems, statistics, economics, sports, arts, business).

3.2 Data Collection and Analysis

To address the concerns mentioned previously, we propose an LA-based curriculum design framework (see Fig. 2). The proposed framework is an adaptation of the Cross Industry Standard Process for Data Mining (CRISP-DM), an industry-standard methodology that prescribes a set of guidelines for the efficient extraction of information from data. The CRISP-DM methodology consists of six cyclical steps, namely (i) Business Understanding, (ii) Data Understanding, (iii) Data Preparation, (iv) Modeling, (v) Evaluation, and (vi) Deployment. We adapt this process methodology to suit the context of our research but do not exclude any of the six phases of CRISP-DM; instead, we merge them into three inter-related activities, namely (i) Problem and Data Understanding, (ii) Modeling (i.e., classification, evaluation, and reflection), and (iii) Actionable Insights. Each of these phases is discussed below.

Fig. 2 Learning analytics-based curriculum design framework

Problem and Data Understanding

This phase involves, firstly, understanding the problem in context and aligning the objective of the LA initiative with this problem. Secondly, it involves understanding which data sources are to be analysed to achieve this objective. In the context of teacher inquiry, input data consists of learning management system data, quantitative data (i.e., surveys), and qualitative data (i.e., interviews, focus groups). Data preparation includes determining which data points to include in the dataset, and extracting and cleaning the data.

Modeling

This phase comprises data classification, the type of analysis (i.e., descriptive, diagnostic, predictive, prescriptive), and internal and/or external evaluation (i.e., instructor, discipline). Evaluation and reflection on the emerging model and findings occur simultaneously, ideally in a collaborative team environment to ensure a shared understanding of, and shared commitment to, the solution.

Actionable Insights

The emerging findings and actionable insights are then applied to the curriculum design problem identified and lessons learned shared with colleagues within the department and wider university setting.

The proposed framework is important, as curriculum design is a “methodology that educators use and communicate with each other to make informed decisions in designing learning activities and interventions with effective use of resources and technologies” (Conole, 2012, p. 121). It must also be conceptualised before it can be utilised as a process that leads to explicit design interventions and outputs.

3.3 Instantiation of the Analytics-based Curriculum Design Framework

This section describes an instantiation of the proposed LA-based curriculum design framework that was previously discussed.

Problem and Data Understanding

In this phase, problem understanding focused on the context, aim and curriculum design problem in order to align these with the LA initiative, and data understanding provided an understanding of the data sources to be analysed. Primary input data that informed curriculum design comprised (i) module data (i.e., 15 modules per year), (ii) programme reviews (i.e., 1 per year), and (iii) interview data. The questions used for both the module and programme reviews are listed in Appendix Table 9 and were informed by the constructive alignment and integrative learning literature (e.g., Biggs, 1996, 1999; Hounsell & Hounsell, 2007). In order to align with international accreditation bodies, student feedback questions for all modules delivered throughout the business school were standardised in 2016. These surveys are administered independently of the module owner (i.e., educator). Secondary data that informed curriculum design included feedback from external examiners, accreditation bodies, and observations of similar programme offerings that are ranked by QS Rankings.

Qualitative data: To gain a rich understanding of the students’ learning context, interviews and observations were used as sources of evidence, as these techniques are particularly suited to increased immersion within the broader context of the case being studied (Yin, 2009; Stake, 2000). This data was collected throughout the academic years in the form of informal interviews with students and class representatives. Staff responsible for the design and delivery of the modules provided insight into the rationale for the current pedagogical design, which enabled the researchers to unearth challenges associated with teaching and learning on this programme.

Data Preparation

This phase included deciding what needed to be included in the dataset, cleaning the data, and all other activities required to process the data that served as input to the modeling tool in the next step. Data extraction and integration were performed using Python scripts, whereby messages were converted from RAR file format into CSV file format. Text was then converted into Pandas DataFrame format for compatibility with the sentiment analysis algorithm. Sentiment analysis refers to a sub-field of natural language processing (NLP) in computer science (Liu, 2010). Commonly, word dictionaries with sentiments pre-classified by linguists are used to determine sentiments in an automated manner using word counts (Liu, 2010; Tausczik & Pennebaker, 2010). Sentiment classification models can also be developed using state-of-the-art machine learning methods based on labelled datasets (Zhang et al., 2018). Sentiment analysis is the task of identifying positive and negative opinions, evaluations, gestures, and cultural meanings organised around a relationship to a social object, usually another person or group (Gordon, 2017; Wilson et al., 2005; Jongeling et al., 2015).
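As a minimal sketch of this extraction step, the following Python fragment assumes the open-source rarfile package (which relies on an external unrar backend) and hypothetical file names; neither is prescribed by the study, and the original scripts are not reproduced here.

```python
import glob

import pandas as pd
import rarfile  # third-party package; requires an unrar backend on the system


def extract_feedback(archive_path: str, out_dir: str) -> pd.DataFrame:
    """Unpack survey exports from a RAR archive and combine them into one DataFrame."""
    archive = rarfile.RarFile(archive_path)
    archive.extractall(out_dir)  # writes the contained .csv exports to out_dir

    frames = []
    for csv_path in glob.glob(f"{out_dir}/*.csv"):
        df = pd.read_csv(csv_path)
        df["source_file"] = csv_path  # keep provenance for later checks
        frames.append(df)
    return pd.concat(frames, ignore_index=True)


# Hypothetical usage with illustrative file names
# feedback = extract_feedback("module_reviews_2016_17.rar", "extracted/")
```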

In this study, we use a different approach that is not based on textual analysis. While textual analysis is a viable method to assess qualitative feedback, we found that the amount of text per answer and the number of students who replied to open-ended questions were not sufficient to make such an analysis credible by itself. Instead, we derive sentiments from the replies to the Likert-type questions. In this study, the five scale options of the Likert questions were rated between -1 and +1: disagree or strongly disagree rated as -1 (negative), neither agree nor disagree rated as 0 (neutral), and agree or strongly agree rated as +1 (positive). For example, if there are three respondents for a given Likert question and the responses are ‘Strongly Disagree’, ‘Agree’, and ‘Strongly Agree’, the aggregate score for that question is +1.
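As a concrete sketch of this scoring rule (the response labels and the data layout below are illustrative assumptions, not taken from the study’s instruments):

```python
import pandas as pd

# Mapping of the five Likert options to the sentiment scores described above
LIKERT_TO_SENTIMENT = {
    "Strongly Disagree": -1,
    "Disagree": -1,
    "Neither Agree nor Disagree": 0,
    "Agree": 1,
    "Strongly Agree": 1,
}


def question_sentiment(responses: pd.Series) -> int:
    """Aggregate sentiment for one Likert question as the sum of per-response scores."""
    return int(responses.map(LIKERT_TO_SENTIMENT).sum())


# Worked example from the text: -1 + 1 + 1 = +1
example = pd.Series(["Strongly Disagree", "Agree", "Strongly Agree"])
print(question_sentiment(example))  # 1
```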

Modeling

Essentially, this phase performed sentiment analysis across four consecutive academic years, namely 2016-17, 2017-18, 2018-19, and 2019-20. To ensure high response rates, all responses were anonymised. The response rate for each end-of-year programme review was 72% (2016-17), 96% (2017-18), 70% (2018-19), and 66% (2019-20). As the response rate varied across the academic years, a number of analytical techniques were applied to calculate an overall rating on a scale of 0 to 5, zero being the lowest overall score the programme could receive and five the highest. To calculate the average of all the scores, we added the individual scores for each of the 10 Likert questions (i.e., Q1 + Q2 + Q3 + Q4 + Q7 + Q8 + Q9 + Q11 + Q12 + Q13) and divided the total by 10. The results were then represented using Tableau, an industry-standard analytical software tool used for interactive data visualisation.
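A minimal sketch of this aggregation step is given below; it assumes each per-question score has already been expressed on the same 0-5 scale (the exact rescaling from per-question sentiment to the 0-5 programme rating is not detailed in the text), and the variable names are illustrative.

```python
import pandas as pd

# The ten Likert questions that contribute to the overall programme rating
RATING_QUESTIONS = ["Q1", "Q2", "Q3", "Q4", "Q7", "Q8", "Q9", "Q11", "Q12", "Q13"]


def overall_programme_rating(scores: pd.DataFrame) -> float:
    """Average the ten per-question scores, each assumed to lie on a 0-5 scale."""
    per_question = scores[RATING_QUESTIONS].mean()  # mean per question across respondents
    return float(per_question.sum() / len(RATING_QUESTIONS))


# Hypothetical usage: `scores_2019_20` holds one row per respondent, one column per question
# rating_2019_20 = overall_programme_rating(scores_2019_20)  # e.g., 4.47 as reported for 2019-20
```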

Evaluation & Reflection

In this phase, the model, data, and emerging findings were analysed in relation to the problem and data understanding (e.g., disconnect between module and programme learning outcomes). This involved meeting with staff, students, and the research team. This iterative process ensured that the emerging findings led to ‘actionable insights’ that informed the curriculum design of the programme.

4 Findings and Analysis

The findings and analysis presented in this section are intended to provide insight into how student sentiment and involvement influenced curriculum redesign, rather than to compare staff.

The 2016-17 end of year programme review was the starting point of our empirical analysis as (i) this was the first programme review conducted since the programme commenced in 2015, (ii) the programme review was conducted by the incoming and newly appointed programme director, and (iii) this dataset provided a baseline from which to compare student sentiment in subsequent academic years. First, we were keen to understand if students were aware of the learning outcomes of the programme (Q1), if the programme delivered the expected learning outcomes (Q2), if the assessment and examination requirements were clearly communicated (Q3), and if the modules on the programme were linked effectively (Q4). The sentiment for each of these metrics is presented in Fig. 3. There was concern about a disconnect between the stated (see purple circles in Fig. 3) and realised learning outcomes and assessment (see black circles in Fig. 3).

Fig. 3 Sentiment of learning outcomes and assessment

To gain deeper insight into the sentiment ratings identified from the baseline survey, informal interviews with students and monthly meetings with the class representatives were conducted during the following academic years. Engagement with students was necessary in order to distinguish whether there was a recurring pattern relating to curriculum design issues or whether the issue was unique to the 2016-17 cohort of students. Engagement with students revealed that the majority of students did not distinguish between programme and module learning outcomes (Q1, Q2) and many students acknowledged that they did not know the programme learning outcomes or where to find them.

A number of initiatives were implemented by the programme director and staff at the business school that have since increased student sentiment for the 2019-20 academic year (see Fig. 4 below). These included (i) designing a standard template for module descriptions with no more than five learning outcomes linked to a module, (ii) basing learning outcomes on Bloom’s taxonomy, (iii) making the programme learning outcomes and the module descriptions with their associated learning outcomes available on the college website for current and prospective students to review, and (iv) incorporating the programme learning outcomes into the programme orientation and discussing their relevance with incoming students.

Fig. 4 Curriculum design and delivery rating

There was a concern that students (2016-17) did not find the programme intellectually stimulating (Q7). See the yellow squares in Fig. 4 for a comparison of each year. Students also reported that they did not receive helpful and/or timely feedback during the programme (Q8). This was surprising considering sentiment remained the same (25 out of 50 points) for the subsequent academic year (2017-18). However, interviews with students indicated that students were unable to identify when educators were providing ‘formative’ assessment compared to ‘summative’ feedback. This was concerning because incorrect assumptions about assessment do more damage by ‘misaligning’ teaching than any other single factor (Biggs, 2003). Staff now explicitly inform students when they are providing formative assessment and this had a considerable impact (rating of 39 out of 50) on student sentiment in the 2019-20 academic year, even though the class size had increased (see purple squares in Fig. 4).

Interviews (see Table 6) with students revealed that many students struggled to grasp ‘threshold concepts’ (Nicola-Richmond et al., 2018; Cousin, 2006; Meyer & Land, 2005) and/or were unable to apply the transferable skills obtained from a module to other modules (i.e., the major project). Threshold concepts are important because business analytics is also a profession (cf. Land et al., 2018) that relies on evidence-based thinking and practices and involves key threshold concepts (e.g., digital literacy, appropriate use of business analytics terminology, critical appraisal of business analytics techniques and practices, and problem solving across technical, people, and process dimensions).

Table 6 Sample of student feedback

The experiences listed above are reflected in Fig. 5, which presents the sentiment trend over the four academic years. To help students ‘connect the dots’ between modules and to grasp threshold concepts, the programme director initiated and supported a number of curriculum design changes. These included inviting industry experts to share their experiences of using analytical tools and techniques in their respective industries, setting up a student-led business analytics society, and appointing an Honorary Professorship of the programme to a Chief Innovation Officer (CIO) of an analytics-centric multinational with a presence in the country. The CIO visits the university to deliver a number of workshops and lectures that demonstrate how data science teams use analytical tools to generate business value at their company.

Fig. 5 Programme rating for each academic year

Inviting relevant guest speakers from industry and making curriculum design changes (e.g., new core modules and new elective modules) had a significant impact on sentiment for the overall programme rating in the 2018-19 and 2019-20 academic years (see Fig. 5). Using the 2016-17 programme rating of 3.5 out of 5 as a baseline, sentiment increased to 4.47 in the 2019-20 academic year. While sentiment for the overall programme rating moved in a positive direction, there was a dip in sentiment in 2017-18, which can be attributed to the timing of interventions and the period of adjustment needed before they had an impact on the programme.

These new module changes were informed by student feedback and used to influence staff responsible for designing and approving new modules. For example, ‘Python’, a popular programming language used in the analytics field, was strongly suggested by students from the 2017-18 cohort to be added as a new module. In response, a new module that includes Python, called ‘Advanced Programming for Business Analytics’ has since been designed and incorporated into the programme.

Although curriculum design changes will continue to be implemented, these changes are not simply to improve a superficial level of student sentiment but rather to provide students with the right content and appropriate supports that will enable them to shift from ‘surface learning’ to ‘deep learning’ of the threshold concepts related to business analytics (Ashwin, 2016). The culmination of the changes made to the curriculum design and delivery have had an overall positive impact on student sentiment and overall academic performance.

5 Discussion, Recommendations, Implications and a Research Agenda

Viewing students as active participants and creators of knowledge is important because the focus of their work in the 21st century will be forging relationships, tackling novel challenges and synthesizing ‘big picture’ scenarios (Pink, 2005). This changing role of the student is akin to ‘self-authorship’ as proposed by Magolda (2004), whereby students develop the capacity to define their beliefs, values, and relationships with others. This is important as it will influence how students spend their time and how they come to see themselves as students and graduates (Brown & Knight, 1994). In addition, the wider context of student learning needs to be considered. For example, a recent study by Foltýnek and Glendinning (2015) identified that only 50 % of students in Ireland confirmed that they received training for scholarly academic writing and avoidance of plagiarism.

As there is a lack of research examining the value of LA to support curriculum design (Dyckhoff et al., 2013; Mor et al., 2015; Sergis & Sampson, 2017), there is a risk that LA is not used appropriately and, thereby, its real value not realised. Our study showed that integrating LA with teacher inquiry usefully informed curriculum redesign. By combining LA with interviews, we actively involved students in evaluating and redesigning the curriculum and therefore gained critical insight into how students experienced their learning (Bovill et al., 2011; Bovill, 2013; Trowler & Trowler, 2010).

In doing so, the student voice (cf. Campbell et al., 2007; Seale, 2009) clarified and challenged our approach to curriculum development. While we cannot assume that students will always appreciate changes in curriculum design (Brooman et al., 2015), the following inter-related recommendations are intended to support educators to realise the value of LA in the context of curriculum design, as well as to provide a more positive student learning experience. In Table 7 below, each recommendation is mapped to the relevant assimilation stage that it relates to if implemented effectively. These recommendations are based on the case studied and the synergies between LA and curriculum design previously outlined.

Table 7 Mapping of recommendations to assimilation stages (X denotes application of LA)

Create an LA Culture

Support educators to adapt, apply, and integrate LA into their teacher inquiry (Mandinach, 2012). This implies that educators will require training in the use of analytical tools and the analysis of data. Tailored training is important as Vatrapu (2011) highlights that LA solutions that do not incorporate diverse “alternates for action” might not achieve the desired results for students and educators. Tailored training will also address the low ‘data literacy’ competency that has hindered the adoption of LA (Marsh & Farrell, 2014). From the analysis conducted in this study, this is a recommendation that, if not followed, can undermine all aspects of assimilation. To even initiate the adoption process, there have to be at least some educators who believe in the value of LA, and who build an awareness of what analytics solutions exist, or which analytics features of currently used technology are not being used. At the advanced, infusion stage of assimilation, there needs to be a culture of analytics experimentation: a trial-and-error use of analytics, where educators are willing to continually revise their analytics design and use, and to continually scrutinise the analytics information for any omissions or misinformation that may undermine the analytics initiative, create cynicism around it, and damage the analytics culture in subsequent learning cycles.

Establish Baseline Learning Analytics

Such a baseline is critically important in learning institutions where an analytics technology investment may need to be justified. Rather than adopting the technology for the sake of it, an analytics initiative can be justified by establishing baseline analytics and then showing the efficacy of the technology through a pilot on one class or module. Then, for subsequent educator and student acceptance, it is important to determine the improvement gleaned from the LA. As student feedback can be emotive, it is critical that educators establish baseline metrics from which to build their analytic capabilities and, over time, identify patterns and trends, rather than prematurely acting on negative or positive feedback. Establishing a baseline has been identified as a useful indicator of progress in other studies.

Use Learning Analytics in Context

Understanding the contextual factors of teaching and learning is critical when determining curriculum changes, rather than relying purely on learning analytics. Of course, the most logical stage of assimilation at which to consider context is adaptation, when such adaptations can adjust to the context of the classroom. However, traditional assimilation theory would suggest that while formal adaptation decisions are certainly important, it is the fluid, minute contextual changes that technology users make that are usually the most impactful and that ultimately lead to acceptance and routinisation (or not) (Gallivan, 2001). Therefore, we suggest that, while the strategy for implementing analytics should certainly have a formal component that makes university-wide decisions about the tailoring of the technology and its use, we would encourage the creation of an environment where educators are free to further tailor to the minutiae of their module, curriculum and student context. LA should also be used to support students to develop their critical thinking and problem solving through the process of reflecting and acting on data, rather than simply as a tool to generate evidence for quality assurance (cf. Tsai et al., 2019).

Create Inclusive Learning Analytics

Educators need to design LA that will facilitate the learning of a more diverse group of learners. Apart from the very obvious reasons why any initiative should be inclusive, it is also clear that such inclusivity improves both acceptance and routinisation metrics: the more educators and learners included in an initiative, the higher the acceptance rate and the more users who potentially use the technology in a routinised way as part of their day-to-day educational activity. This implies that we need to value what individual students bring to the curriculum design process (Sorenson, 2001; Bovill et al., 2011). Specifically, while inclusion in information systems has received significant attention in recent years (Coleman et al., 2017; Trauth, 2017), research on inclusion within IS curriculum design and delivery has not received sufficient attention.

Differentiate Features of Sentiment Data

This study showed that sentiment analysis adds data points and information that offer a different kind of value from other types of information from and about students and their learning. Sentiment analysis can sense issues the students themselves may not even be aware of or know how to articulate through a traditional survey. Traditional surveys are limited and subject to bias (Ochoa, 2016) in that they only elicit what the survey designer asks, and so may miss crucial issues or issues that emerge after the survey was designed. Sentiment analysis can track emerging behaviours and use of keywords in an organic and grounded manner. We therefore recommend that educators consider these differences, use these instruments accordingly, and keep the differences in mind when acting on the emerging sentiment feedback. This can be done not just to enable routinisation but to sustain it. We also propose that emergent analytics can then be used as a seed to initiate new and infused uses of such analytics in ways that may not be obvious, or indeed possible, at the initial point of the technology’s adoption.

5.1 Implications for Teacher Inquiry

We acknowledge that the recommendations provided are not exhaustive, but they do contribute to the wider discourse on the need for more academic research that provides recommendations to educators (Sergis & Sampson, 2017) in order to maximise the use of LA. While this study highlights the value of actively engaging students in curriculum design (Bovill et al., 2011; Bovill, 2013; Trowler & Trowler, 2010), it should not be used to undermine the domain expertise of educators and their role in teacher inquiry.

While LA can be used to support inclusive teaching and learning, it should be used as part of a suite of tools and frameworks rather than be used in isolation. For example, the Application of Good Practice Framework proposed by Chickering and Gamson outlines six powerful forces in teaching: (i) activity, (ii) expectations, (iii) co-operation, (iv) interaction, (v) diversity, and (vi) responsibility. While each of the principles in the framework is in itself beneficial, when all are present they form more than the sum of their individual parts. These forces hold meaning for students from diverse backgrounds who are usually ‘under-represented’ groups, namely international students, mature students, students with (hidden/visible) disabilities, students from minority backgrounds (Ashwin et al., 2020; Larkin & Richardson, 2013). This is important because good teaching needs to provide supportive academic environments that facilitate the learning of a more diverse group of learners (Larkin & Richardson, 2013).

This study reports the positive use of student evaluations to inform teacher inquiry. It is, however, important to highlight that other studies (e.g., Hornstein, 2017; Heffernan, 2021; Westoby et al., 2021) have reported the negative impact of such evaluations, whereby educators have been subject to discriminatory evaluations based on their gender, race and age, and the impact of such discrimination on their workload and mental health.

5.2 Future Research Agenda

We acknowledge three limitations of this study, which also offer directions for future research. First, conventional textual sentiment analysis was not conducted due to the limited number of data points, which would have made reliable predictions difficult. Second, the findings are based on a single case, which by nature limits generalisability (cf. Yin, 2009). The findings were, however, based on four iterations (i.e., within-case analyses) of a one-year master’s programme, and an in-depth background to the case studied and rich contextual data were provided, which can help readers tailor and apply the recommendations to their own educational context. Third, while LA has become increasingly popular, it is only one approach to inform curriculum design. It should, therefore, not be used in isolation but rather to complement other data sources (i.e., academic analytics) and the knowledge possessed by educators and curriculum designers. Based on the analysis and limitations of this study, Table 8 provides a future research agenda. It contains sample research questions associated with each recommendation that, individually and collectively, can improve the efficacy of LA initiatives.

Table 8 Research agenda

6 Conclusions

Learning analytics is a research field that aims to support educators during the process of inquiry. This study reported the value of using sentiment analytics as a form of LA to improve student-learning experiences and inform curriculum design. Sentiment analytics offers a dynamic and evidence-based approach to guide teacher inquiry and inform curriculum design. However, it assumes that educators have the ability to use these types of analytical tools and techniques and align these with their teacher inquiry. This is most likely not the case in many universities, due to a range of factors including, (i) the capacity of the discipline, (ii) availability of funding, (iii) tailored training in the use of LA, and (iv) continuous support in the use of LA and curriculum design.