1 Introduction

Social media sites allow users to share news, information, knowledge, ideas, facts, beliefs, and opinions—in negative, positive, or neutral ways—about products, events, and issues. Zou et al. (2018) suggest that Twitter is an important microblogging platform in people’s daily lives. Social media platforms like Twitter are open spaces for students to share their learning experiences, emotions, and stories, and to discuss their concerns (Chen et al., 2014). The availability of students’ insights on social media provides a new lens through which educational researchers can explore their experiences in an informal environment not controlled by a classroom or institution (Chen et al., 2014). Students’ digital footprints provide a valuable source of information about their perspectives and experiences, which can help educational institutions improve those experiences and the quality of instruction. Various traditional methods have been used to collect data from students and explore their experiences in a controlled, structured way, including surveys, interviews, focus groups, and classroom observations (Chen et al., 2014). However, social media analytics of unstructured data provide a less restricted view of users’ insights, experiences and opinions during national disasters, crisis management, product marketing, health campaigns and learning experiences (Al-Rubaiee et al., 2016; Hiltz et al., 2011; Stieglitz et al., 2018). In this way, useful data can be obtained to understand users, support them, and provide different experiences/services that will more appropriately meet the targeted groups’ needs (Stieglitz et al., 2018). Analysing unstructured Twitter data is one method of attaining such insights.

In times of emergency, social media platforms like Twitter are a hub for sharing information and support. The COVID-19 pandemic has been one such crisis. Emergency online teaching/learning is a significant topic that has gained mass attention amongst policymakers, academics, students, educators, researchers, and parents. This research aims to 1) explore, via their posts on Twitter, the engagement of university students in Saudi Arabia with online learning and assessment during the COVID-19 pandemic and 2) determine the impact of this emergency shift on students’ engagement with learning and assessment.

On 8 March 2020, the Saudi government announced the shift to online teaching, and the decision went into effect the next day in all Saudi K–12 schools, universities, and colleges. The change was rapid in Saudi universities, which already had an online platform (Blackboard) in place. The week of the shift was the second mid-term week in Saudi universities; teaching resumed, but many pressing questions remained about unfinished examinations. On Twitter, a popular platform for sharing opinions in Saudi Arabia, university students protested against these rapid changes in teaching, learning, and assessment. Students not only shared content but also documented their experiences with screenshots as evidence (departments’ and lecturers’ announcements, private messages, group messages, and emails exchanged with professors and lecturers). In response to this unexpected transition, the Ministry of Higher Education (MOHE) encouraged a move towards alternative assessment methods. This study documents students’ reports of their engagement with learning and assessment between March and May 2020. Social media analytics will enhance the current understanding of how students use social media to share their experiences. The findings will be useful for researchers and policymakers who are interested in determining the factors that affect students and how to improve their experiences.

2 Literature review

2.1 Student engagement dimensions

Student engagement is considered important for exploring students’ motivation to complete their education; supporting students whose studies are incomplete and keeping them on track for graduation; addressing issues of burnout, boredom, and low achievement; and taking care of students’ well-being. However, the student engagement construct and its dimensions have been conceptualised in various ways (Appleton et al., 2008; Fredricks et al., 2004; Jimerson et al., 2003). The term has also been broadly applied to include school engagement, academic engagement, educational engagement, engagement in the classroom, and engagement in schoolwork. The present study draws on Fredricks et al. (2004), who established that student engagement is a multifaceted construct that can be explored through behavioural, emotional, and cognitive engagement dimensions. Cognitive engagement is defined as students being self-regulated: invested in using metacognitive strategies and putting in the effort necessary to study complex ideas. Emotional engagement reflects students’ affective reactions (positive and negative emotions) to the classroom, teachers, and school. Behavioural engagement is defined as student participation inside and outside the classroom, including getting involved in learning and submitting required work. Several instruments have been developed to measure these engagement dimensions, such as student self-reports (the most common form), teacher questionnaires and ratings of their students, interviews, and individual and classroom observations (Fredricks et al., 2004).

Face-to-face teaching, learning, and assessment engagement differ from online learning in numerous ways. As O’Shea et al. (2015) assert, ‘When shifting to online contexts, engagement takes on different manifestations, due to the lack of face-to-face contact and the ways in which teaching and learning are mediated through technology’ (p. 43). Online learning comes with various demands; researchers report that online students struggle with having a regular study schedule as they try to balance study, work, and family (Blackmon & Major, 2012; Brown et al., 2015; Buck, 2016). Playing several roles as student, family member, professional, and social participant can lead to a stressful lifestyle (Brown et al., 2015; Stone & O’Shea, 2019).

2.2 The impacts of COVID-19 on university students

The COVID-19 pandemic adds to this load and can intensify the abovementioned impacts on studying. At the time of writing, the COVID-19 pandemic is still raging. Early signs indicate that the impacts of the pandemic on education, emotions, and mental health are noticeable (Alateeq et al., 2020; Araújo, 2020; Sahu, 2020). Recent studies related to COVID-19 have explored the impacts of the pandemic on university students and staff, revealing that international students are concerned about their education, their well-being, and the impacts of travel suspension on themselves and their families (Zhai and Du, 2020). They are also worried about the impact of the virus on their studies (Cornine, 2020) and future employment prospects (Wang et al., 2020), and such worries have raised stress levels amongst medical students, especially females (Al-Rabiaah et al., 2020). University students, in general, are experiencing psychological issues, such as depression, stress, and anxiety. Cao et al. (2020) identified the COVID-19-related stressors that affected Chinese college students during the pandemic: fears of getting sick, worries about family, economic stressors, worries about education, and concerns about assessment performance. Researchers in these studies have also stressed the importance of monitoring student and staff mental health during this crisis to lessen its emotional impact, and they have concluded with suggestions about establishing psychological services to care for vulnerable groups, provide guidance, and suggest coping strategies to help students manage stress.

Research studies have also been conducted in the Middle East (Oman and Saudi Arabia), exploring faculty perspectives towards assessment during the COVID-19 lockdown period. Guangul et al. (2020) surveyed 50 faculty members in a college in Oman to investigate the challenges of online assessment and academic dishonesty, taking this Middle Eastern college as a case study. Their findings suggest that challenges to online learning and assessment included academic dishonesty, a lack of infrastructure, achievement of the learning outcomes, and students’ commitment to submitting assessments. Similarly, Sharadgah and Sa'di (2020) explored the perceptions of faculty members in a Saudi university towards how well Saudi higher education institutions were prepared for virtual assessment during the lockdown period and investigated the challenges of implementing online assessments for formative and summative purposes. They collected perceptions from 96 faculty members via an online survey. Their findings suggest that faculty members were not prepared for online assessments. They believed that the courses’ learning outcomes could not adequately be assessed online, as this was not the intention of the coursework, nor had they been trained for such a shift. They also found the lack of assessment security infrastructure to be a great limitation that defeated the purpose of assessment and raised academic integrity issues. Therefore, they recommended that the university provide the teaching faculty with professional development regarding e-assessments and invest in assessment security software.

As people continue to live through this experience and learn more about how the COVID-19 crisis is reshaping what they know, studies concerning its impact on education have emerged in Asian countries (Bao, 2020; Sintema, 2020; Toquero, 2020; Yan, 2020). The studies conducted during the initial phase of the pandemic were based on surveys with predetermined items and views. Some focused on many identified aspects of the pandemic’s impact on students, while other research considered only one side of the issue, such as the emotional dimension (affective engagement) or the perspectives of teaching staff. The present study, by contrast, used a form of data collection that was not restricted by predetermined items and demanded no effort from students to participate: it drew on the unstructured data students shared via trending hashtags in an attempt to make their voices heard. More studies are needed to fully document these educational experiences, as well as the alterations in teaching, learning, and assessment practices during COVID-19 and the impact they have had on educational systems in different contexts (Toquero, 2020).

The current research explores the unfiltered experiences and opinions shared by students on their Twitter accounts to document their engagement with online learning and assessment during the COVID-19 pandemic. There is much to be learned about the factors that could influence their experiences from a sociocultural perspective. This study is guided by the following research questions.

1. What have university students in Saudi Arabia been sharing on Twitter about their engagement with learning and assessment during the COVID-19 pandemic?

2. What factors influence students’ engagement with online learning and assessment?

3 Methodology

The rationale for collecting the data for this research via Twitter, as opposed to traditional data collection methods (e.g., surveys and interviews), is that Twitter allows the researcher to observe how the situation in question unfolded in real time. This offers a distinct advantage over time-bound data collection. In addition, the unfiltered, lived experiences students shared added value to the study.

3.1 Data collection

The dataset was collected over nine weeks between 18 March 2020 (the second week of the country-wide lockdown and the shift to online teaching) and 17 May 2020 (the official end of examinations and the academic semester). In the first step, the data analysis software MAXQDA was used to query and retrieve Twitter data. A total of 219,000 tweets were collected using keywords such as assessment, evaluation, university, study, exam, exams, homework, assignment, assignments, education, and continuous assessment. Tweets were also gathered from popular hashtags (e.g., #howwasyourexamtoday?), Saudi universities’ hashtags, and MOHE mentions (@mohe.sa). MAXQDA was then used to clean the data by excluding retweets, duplicates, quote tweets, and irrelevant tweets (e.g., ads and spam). Only tweets from unique Twitter users were retained in the analysis to avoid bots and spam. Finally, 124,810 original tweets written in Arabic were included in the analysis (Fig. 1).

Fig. 1 Data Collection Steps
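
To illustrate the cleaning steps described above, the following sketch shows how retweets, duplicates, and spam might be filtered from an exported tweet table. It is a minimal, hypothetical Python example rather than the actual procedure (which was carried out in MAXQDA); the column names, spam terms, and posting-rate threshold are illustrative assumptions.

import pandas as pd

# Hypothetical export with columns: user, text, created_at, is_retweet, is_quote
tweets = pd.read_csv("tweets_export.csv")

# 1) Exclude retweets and quote tweets
tweets = tweets[~tweets["is_retweet"] & ~tweets["is_quote"]]

# 2) Exclude exact duplicates of the tweet text
tweets = tweets.drop_duplicates(subset="text")

# 3) Exclude obvious ads/spam via an illustrative blocklist of promotional terms
spam_terms = ["promo", "discount", "free followers"]  # placeholder terms
spam_mask = tweets["text"].str.contains("|".join(spam_terms), case=False, na=False)
tweets = tweets[~spam_mask]

# 4) One reading of "only unique Twitter users were selected": drop accounts that
#    post at an implausible rate, screening out bot-like behaviour
posts_per_user = tweets.groupby("user")["text"].transform("size")
tweets = tweets[posts_per_user <= 200]  # illustrative threshold

print(f"{len(tweets)} original tweets retained for analysis")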

3.2 Data analysis

Quantitative and qualitative approaches were used to analyse the Twitter data (hashtags, list of keywords, number of tweets, and number of users). Collecting and analysing social media data can be challenging, given that the language used is diverse, and analysing data with just automatic algorithms can lead to faulty results. Therefore, qualitative data analysis was necessary to more closely examine this user-generated content (Chen et al., 2014).

Tweets for manual content analysis were selected from unique users and on the basis of top keywords and the most frequent hashtags. Inductive content analysis was used to systematically label a sample of tweet content and to identify the trending topics, experiences, and concerns shared by the students involved. To make this process manageable and to gain an overview of the data, the researcher created a word cloud of top keywords for each month and a word list covering all months. The top keywords were classified and used as an initial guide for creating categories based on co-occurring keywords. The researcher then collected a minimum sample of 100 tweets for each category; in total, 700 tweets were analysed and coded. The most common keywords associated with these categories were noted to identify similar, representative tweets. Given the large data volume, the researcher created a codebook listing the co-occurring keywords associated with each category and then ran a lookup-dictionary search to find tweets containing the specified keywords. This search was repeated several times, adjusting the words used for each category to obtain more accurate results. The analysis was conducted in Arabic, and the information was translated into English at the writing stage, to ensure that the researcher captured the users’ intended meanings.
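
The keyword-frequency and lookup-dictionary steps can likewise be sketched in code. The example below is hypothetical and simplified (the actual search was run in MAXQDA on Arabic keywords): the codebook entries shown are English placeholders, and a tweet is assigned to every category whose dictionary contains at least one of its keywords.

import re
from collections import Counter

# Placeholder codebook: each category maps to a list of co-occurring keywords.
# The real codebook was in Arabic and was refined over several search passes.
codebook = {
    "examination_issues": ["exam", "test", "questions", "marks"],
    "study_pressure": ["pressure", "assignments", "deadline", "overload"],
    "technical_issues": ["blackboard", "internet", "connection", "froze"],
    "positive_emotions": ["happy", "thanks", "easy", "love"],
    "negative_emotions": ["stress", "anxiety", "unfair", "tired"],
}

def categorise(text):
    """Return every category whose dictionary matches a keyword in the tweet."""
    text = text.lower()
    return [category for category, keywords in codebook.items()
            if any(re.search(r"\b" + re.escape(kw) + r"\b", text) for kw in keywords)]

tweets = [
    "The exam questions were impossible and Blackboard froze twice",
    "Thanks to my professor, the quiz was easy",
]

# Category frequencies across the sample
counts = Counter(cat for tweet in tweets for cat in categorise(tweet))
print(counts)

# Top keywords (the basis of the monthly word clouds) can be obtained with a
# simple token frequency count over that month's tweets
tokens = Counter(word for tweet in tweets for word in re.findall(r"\w+", tweet.lower()))
print(tokens.most_common(10))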

Of the 124,810 tweets, 108,031 were classified into six categories: 1) examination issues (test content, test length, test format/layout), 2) study and coursework pressures (sources of pressure, heavy coursework loads), 3) technical issues (Blackboard, internet connection, equipment), 4) positive emotions, 5) negative emotions, and 6) tweets that could not be categorised into the other five groups (16,779 tweets). Category 6 included topics related to fairness in access to learning and assessment, mental health issues (stress, anxiety, panic attacks), lack of sleep, communication issues (unsupportive professors and departments), students’ demands, and comments on the situation as a whole. A key finding of this study is that 40% of the tweets (50,531) were related to examinations, and a further 31.6% (39,448) discussed study and coursework pressure, with students pointing to the sources of the pressure they felt regarding their studies. Together, these 89,979 tweets related to learning and assessment practices made up 71.6% of the data. A total of 28,588 tweets focused on emotions, making up 22.8% of the total: in 14.9% of the tweets (18,681), students expressed negative statements, while only 7.9% (9,907) contained positive emotions. Finally, 23.8% of the tweets (29,777) were related to technical issues.

4 Findings

4.1 Research question 1

RQ1: What have university students in Saudi Arabia been sharing on Twitter about their engagement with learning and assessment during the COVID-19 pandemic?

The results suggest that students’ engagement (cognitive, behavioural, and affective) with learning has been considerably impacted by online assessment, and online learning quickly became an online assessment crisis. Phase 1 of Saudi Arabia’s shift to online education occurred in March, halfway through the semester, when the lockdown was in place. This was a confusing phase for students and universities, as the sudden shift required major readjustments. Universities’ hashtags were trending as students started to share their experiences, and ‘pressure’ was the top keyword in March. The three main sources of pressure were coursework, lectures, and the uncertainty of the COVID-19 lockdown period, which caused considerable stress and anxiety (see Table 1).

Table 1 Top keywords in the data collected between March and May 2020 during the COVID-19 pandemic

In their tweets, students reported that since the shift to online learning had been announced, they found themselves trying to cope with the situation. For example, they had to navigate accessing stable internet connections, using a laptop to reach their courses, and finding a quiet space at home to participate in lectures. They were also busy with heavy loads of assignments, projects, research papers, presentations, and unfinished mid-term examinations that needed to be rescheduled. In addition to assessments, lectures were rescheduled for some courses as professors shifted their morning classes to afternoons or evenings. Some professors tried to cover too much of the curriculum in a short period, which led to students feeling overwhelmed and overloaded with information they could not digest. Students reported that their learning was impacted by the fast delivery of the course content, information overload, difficulty in following and understanding professors, exhaustion and tiredness from attending back-to-back lectures, and the need to work on several assignments and projects for different courses. These issues were compounded by stresses stemming from family, life, work, and health and financial issues (e.g., job losses and salary cuts), all of which heightened the stress of the COVID-19 pandemic and lockdown.

4.1.1 Emotions

Emotions are an important aspect of students’ engagement and learning experiences, and they were a consistent theme across the data in this research. The emotions students expressed towards learning and assessment during the COVID-19 pandemic were intense and revealed information about the situations that evoked them.

Positive emotions

The top words associated with positive emotions in students’ tweets were ‘surprise’, ‘happy’, ‘love’, ‘appreciation’, ‘satisfied’, ‘success (full marks)’, ‘prayers’, and happy emojis. These tweets were filled with happy faces, happy crying faces, hearts, and funny gifs. Students’ positive tweets celebrated and expressed appreciation for the MOHE, their departments’ and universities’ decisions and procedures, the full marks they received on their examinations, and the access to information they had been granted during their assessments. Students reported positive experiences or displayed positive attitudes when they had had an easy exam, felt supported by their professors, experienced no technical issues, managed to obtain full marks, received peer support, or cheated alone or with peers to answer all the questions. Example tweets included expressions such as ‘easy exam’, ‘thanks to Allah’, and ‘the last exam was like a gift’.

Other sources of positivity were achievable assessments and access to support during exams. Students celebrated being able to use Google or refer to their notes and course books to answer exam questions. In some cases, they created support groups to help each other and cheat during quizzes and examinations; regardless of the security measures taken, such as limited time and shuffled questions, students managed to organise their efforts, collaborating with classmates and sending screenshots of the examination questions to their peers. Others turned to private tutors for help while taking exams. The following tweet exhibits this situation: ‘Full marks, may Allah protect private tutors [laughing with tears emoji]’. Others sought help from friends or relatives: ‘I went to my sister-in-law’s house. I gave her my device and she answered the questions for me. She is an English teacher [happy dance emoji]’. Whole-group support was also available, both when the exam had to be taken at a fixed time and when it was open for 24 h. Students even asked peers who had already taken the exams for the answers. The following tweet illustrates this method: ‘On a day where there was cheating, don’t ask me how my exam went because of course everyone got full marks’. Not all students approved of cheating or the sharing of exam questions, which led to some unpleasant confrontations. The following example exhibits this situation:

No one took the exam, yet we are depending on each other. Only one took it and she is XXXFootnote 1 and she refused to give us the questions or the answers and she avoided our questions….we started a fight. What is going to happen if she helped us? Anyway we will use the book but we need help given that she took the exam already.

The students valued professors who showed understanding of their situations, were considerate and reasonable with course demands, reached out to provide moral support, and provided accessible means of communication and quick responses. Students praised and thanked their professors for being kind and positive. The following is an example of such tweets: ‘The doctor was the most collaborative one with me and my group. May Allah grant her happiness’. Students’ positive attitudes towards the end of the semester (i.e., the final exam period) indicate that the university professors responded well to the MOHE’s demands to ease the exam process for students.

Negative emotions

Students also shared many negative emotions in their tweets, such as sadness, anger, worry, nervousness, stress, defeat, helplessness, upset, panic, disappointment, and hopelessness. The main sources of negative emotions were experiences that increased students’ anxiety and fear of the unknown, as well as their fear that these issues could affect their studies and performance assessments. Students expressed concern about missing exams due to oversleeping, internet issues, and issues with the Blackboard app. Another negative experience was dealing with and having negative exchanges with uncooperative professors. Students with such experiences demanded more help and support from their universities, and some turned to the MOHE’s account to request help:

Please have mercy on us; we are mentally and physically tired. Please pass us or give us an answer, don’t leave us wondering, tell us what are your decisions? And in the midst of this crisis, we demand that you cancel the examinations. I swear the assignments exhausted us.

These interrelated factors, which were associated with or caused negative emotions, will be discussed further in the next section concerning the factors that affected students’ engagement with online learning and assessment.

Students’ affective engagement with learning was highly associated with assessment and was affected by the rapid changes in teaching and assessment setups due to COVID-19. At the end of the semester, some students reported huge relief; they had not expected to do well in the assessment tasks or to survive the year, but things had turned out well. For others, their worst fears had come true, as they faced uncooperative professors and technical issues.

4.2 Research question 2

RQ2: What factors influence students’ engagement with online learning and assessment?

Several factors have influenced students’ engagement with learning and assessment at cognitive, emotional, and behavioural levels. The most common themes found in students’ tweets were communication, fairness, and technical and assessment issues.

4.2.1 Communication issues

COVID-19 expanded teacher–student communication beyond lectures. The boundaries between working hours and rest hours were crossed, which affected students’ relationships with their professors in positive or negative ways, as some grew closer and others more distant. On Twitter, students shared screenshots of tense correspondence with their professors, which led to unpleasant exchanges and to professors blocking students on WhatsApp. The following tweets exemplify students’ disappointment in the lack of much-needed communication with their professors:

No one will understand our problems like those who were deducted deserved points. It was not little. I lost six points, and I keep contacting the professor by email and I get no answers and I was ignored by all. OK, my grades, my effort!!!! But God is above all.

I swear there is no fairness or consideration for our circumstances. We don’t want understanding but our grades. I have submitted everything my professors asked for, and I got 48 out of 80 with no reason. I demand reassessment but there is no response from the dean nor the head of the department or the professor himself is incapable of explaining where my mistake is.

Some students felt that online communication with their professors was more challenging than face-to-face communication, as they often received no answers from their professors, leading them to feel helpless and without a means to discuss their situation. They were anxious about what would happen to their assessments and grades (GPAs), which are important for their futures.

4.2.2 Fairness issues

According to students, fairness was an issue in terms of assessment and their ability to access lectures, information, and the Blackboard app. Students had different experiences in terms of being able to afford a device and maintain reliable internet connections. Unreliable internet was an issue of great concern, both for students and professors. Those with financial problems felt it was unfair that they did not have access to good services and equipment, which affected their learning and assessments. Some universities responded by providing tablets and laptops to those who could not afford them.

Students living in remote areas, villages, or small cities often commented that their internet connections were unstable. Some students reported having attended their lectures but being recorded as absent in the Blackboard system due to technical issues or bad network service. Students also mentioned that they could not follow their professors, as the sound cut out repeatedly due to either bad service in the professors’ area or in the students’ network. As a result, they could neither understand nor absorb most of the materials taught online.

Another point of major concern was the load of information students received in a short period of time. Cognitive engagement was greatly affected, as students felt lost and occupied with assessment tasks after professors abruptly adapted their teaching and assessment plans to virtual classes. Some students described finishing their course one month ahead of the syllabus’s schedule and, afterward, not recalling what they had been taught. The rush to finish the course book and assigned teaching materials was associated with the panic mode some professors and departments went through due to the uncertainty and sudden interruption of their plans. Other professors opted to teach fewer materials and focused on what mattered most for a smooth ending to the semester.

Students also raised the topic of fairness in assessment. They felt they did not have equal chances to succeed, with the following tweets exhibiting this view:

Today was unfair. Instead of answering the questions, we were looking for the questions because we could not see them. The time passed and it was confusing. A nice zero.

20 questions in 20 minutes, am I a robot. Is this leniency and fairness?

Not all students believed they had received equal treatment and chances to succeed. As the following tweet exhibits, students who worked hard felt it was unfair that lazy students were advantaged.

Where is fairness? A careless student who did not take the quizzes and did not do the assignments and research paper in a month and a half and was asleep is treated just like a diligent [student] in the same period, how do you judge?

4.2.3 Technical issues

Most of the technical issues reported in students’ tweets were related to the Blackboard app, which froze constantly during lectures and examinations. During exams, one student wrote that a huge number of students were taking exams at the same time, which affected the network service.

70% [of tweets] with the hashtag is [due to] negativity, technical issues, Blackboard not responding (pressure on the website). This is the biggest evidence on day one of the finals for general subjects. How are we going to do when 300,000 students take exams at the same time? Some students couldn’t take the exam because Blackboard froze or the internet network was bad.

Such technical difficulties increased stress among students, some of whom were told by their professors to take responsibility and handle their own problems. One tweet read, ‘Blackboard froze four times during the exam and the professor told me: it is your problem’. So, while students tried to concentrate on taking their exams, they had to deal with internet and Blackboard issues, which resulted in their spending less time answering the exam questions and more time worrying about external problems. One student explained:

It was the worst exam. Blackboard froze, and I was able to re-enter to continue when half of the exam time was gone. The third issue was that the wrong answer key was entered by professors who were not familiar with online tests: on top of Blackboard not working, if it works and you answer correctly your answer is recorded as incorrect. The answer key was wrong, the [exam] time was up and Blackboard was frozen. What is the solution guys? I missed two exams. Look at our situation please [teary and broken heart emojis].

Some students reported that the Blackboard system counted their first attempt to access the website before it froze and gave them zero points in the exam, even though they could not log in and read the questions: ‘Wallah, zero. I don’t know how I got it while the website was frozen’. Some professors responded to students’ concerns, while others refused: ‘Then the programme didn’t count my correct answers and my professor says the computer is never wrong’. While universities offered workshops and 24/7 support for staff and students, technical support did not help in some cases, and students felt helpless.

Our grades and dreams are flying away just because of something that is out of our hands. How is it our problem if the programme is incomplete or can’t handle the pressure? I am a student, how is it my fault when I study and work hard and at the end I can’t take the exam or I am only allowed [to take it] in the last 5 minutes and I have 30 questions to answer. What can I do in this situation?

4.2.4 Assessment issues

Other common themes included professors suddenly changing their assessment tasks or adding extra tasks and examinations. Students reported having to repeat assignments because their papers were lost or left uncorrected in the university office, with professors unable to access their offices.

Our professor in the postgraduate programme asked us to do a research paper every week and asked us to participate in every lecture, and now he is going to do our examination. I will pray against him during Ramadan.

Yes, [my professor] is staying at home. She forgot her students’ assignments in her desk in her office at the university. She is asking us to repeat the assignment. The same with another two professors. I don’t know why I tried from the beginning of the semester and was prompt in submitting my assignments to find out the professor did not correct our papers earlier. I don’t know what she was busy with. If we were lazy and did not submit on time, she would not be flexible.

Students frequently discussed the content of examinations as being easy, manageable, and doable or as being difficult and hard to answer. Many negative emotions were associated with difficult exams and unexpected questions.

Did we bring Corona? Three exams, and every exam is more difficult than the last. I am not used to this. The finals used to be easier than the mid-term exams.

Done with the first exam, I would like to thank my professor for the XXX questions. I was literally tearing up while answering the questions.

In my university, they do not teach, they give us the lectures and we try our best. We are lost and we are losing grades. They use difficult exam questions to prevent us from high marks [GPAs]. They think we can cheat during exams when [we] have one minute for each question. We can barely read the questions.

4.2.5 Examination time (scheduled date, time, and length)

Final examinations in Saudi Arabia this year were scheduled during Ramadan, which is an annual religious occasion during which Muslims practice fasting. Fasting lasts, on average, 16 h each day, from dawn until about 6–7 p.m. Taking exams while fasting was a challenge reported by students, who stressed the inconvenience of the examination schedule. The following tweet illustrates the challenge this posed, as managing the exam timetable and the lack of sleep affected students most:

The questions were clear, but we have a problem with the time. If they changed to the morning at 8 a.m., it would be better today. I have two exams, one at 10 a.m. and the second one at 1 p.m. I’ll stay up.

Students raised two time-related issues concerning the examinations. The first concerned scheduling: some exams were scheduled to be taken over two days or more, while others were open for 24 h or for a shorter, fixed period, such as 60 min. The specific timing was problematic for some students, as some exams were scheduled to open at midnight (12–1 a.m.), after 8 p.m., or on the weekends. One student tweeted, ‘It [the exam] starts at 11 p.m. [teary face and broken heart emojis]’. Some professors chose to schedule exams at these unusual times and on weekends to avoid technical issues and Blackboard troubleshooting problems, but students still reported experiencing technical issues and were dissatisfied with such inconvenient arrangements, which overlapped with their resting time, sleeping time, family life, care responsibilities, and commitments to other courses. Students often reported that they were scheduled to take two finals in less than 24 h, with one scheduled in the morning and another in the evening.

The second time-related problem concerned the length of the examinations. Students reported cases of unusual exam lengths of seven, ten, 20, 30, or 40 min, which did not provide enough time to think and answer the detailed questions. One student said, ‘My exam was awful, all questions require time to answer them, the system closed and I couldn’t answer all the questions, I did not sleep or eat well and my heart is broken [broken heart emoji]’. Another student mentioned, ‘My 20% flew by, the XXX professor assigned 40 questions to be answered in 30 min’. A third user tweeted, ‘Time was up while I was on question 14. How could I answer 40 questions in 90 min?’ Other illustrative tweets read as follows:

20 questions and a page for each question and one hour. What is this? My time passed while moving to the second page (question 2), Wallah haram. Who listens to us? Who will look at our problem? Oh, OK, our voices are supressed. Please tell me this [is] only our university [that has taken] such a decision.

You study hard, and you hope to get a high GPA and suddenly the professor gives you 20 questions to be answered in 30 minutes and the internet freezes and you can’t review your answers and and and…

Under normal circumstances, there are set rules to be followed: the final exam is worth 40–60% of a student’s final grade, and the time frame for taking it is set to 60–90 min. During the online transition, however, these regulations were changed and shortened. The exam duration may have been decreased because professors thought that less time was sufficient given the new weight of the exams (20%), or perhaps because they sought to prevent their students from cheating. The MOHE guidelines suggested a weight of 20% for the final examinations but made no suggestions regarding exam duration, which was left to the discretion of professors and departments. Students praised and appreciated professors who left exams open for 24 h, thereby allowing students to take the exams at their own convenience, avoid rush hours, and make two attempts to access the exams if necessary. As one student shared, ‘We have two attempts and 24 h to complete our exam in 60 min. Swear to God our university did the best [happy teary face and heart emojis]’.

5 Discussion

This study has explored students’ reports of their engagement with online learning and assessment during the COVID-19 pandemic, investigating the factors that affected students’ experiences. The findings indicate that student engagement was affected at the cognitive, affective, and behavioural levels. Overall, it seems that some professors followed the MOHE guidelines and made their exams reasonable in length, ensured that they were accessible (i.e., not difficult to answer), and allowed a reasonable timeframe for finishing them. Other professors did not follow these guidelines, which led to difficult and stressful student experiences. Technical issues, such as problems with internet connections and Blackboard, persisted throughout the online learning and assessment phases, and some professors did not believe their students’ accounts of these problems and did not give them second chances. This issue was raised by students who harboured hard feelings towards professors who were unresponsive to the problems they faced.

The findings of this study are in line with Thompson’s (2020) results from a survey of 325 Saudi students, which explored the effects of the COVID-19 pandemic on cognitive and behavioural engagement. According to Thompson (2020), students found online learning to be convenient, but they realised that educational quality had decreased, especially in terms of the amount of knowledge they had gained. The students in Thompson’s survey reported three concerns: 1) cognitive disengagement, including an inability to focus and a loss of desire to learn, as well as behavioural disengagement due to the inability to see professors or peers; 2) the delivery of lectures, with a lack of standardisation in online course delivery leading students to call for more structure and standardisation (i.e., a mandatory system for all professors to follow); and 3) online assessments, which might negatively impact their grades, with online tests being the students’ most frequent complaint. These findings coincide with the concerns tweeted by students in the present study.

Other influential factors uncovered in the present research were related to management and organisation at the department and university levels. Students were emotionally attached to their academic performance, which is understandable. Previous research has reported that online learning and assessment pose considerable challenges for students, particularly in terms of their emotional engagement. An increasing number of reports stress the impact of the COVID-19 pandemic on students’ affective engagement. Stress, anxiety, and depression are among students’ top challenges (Al-Rabiaah et al., 2020; Cao et al., 2020; Wang et al., 2020; Zhai and Du, 2020). This result has been confirmed in studies concerning the impact of COVID-19 on students’ mental health in Spain, China, and Saudi Arabia (Alateeq et al., 2020; Cao et al., 2020; Odriozola-González et al., 2020). In the Saudi context, Alateeq et al. (2020) explored the mental health of students in Saudi general and higher education. They found that university students were more impacted than intermediate and secondary school students. The sources of stress university students reported were associated with curricula, parental issues, feelings of loneliness, and worries about their future and their performance in examinations. As in the present study, 58% of the participants were angry about issues beyond their control.

Online teaching, learning, and assessment during the pandemic have undergone unexpected changes for universities, departments, and faculty members. Students experienced stress and anxiety in dealing with the unknown during and after the lockdown. The main challenges and issues were similar across the three months sampled in this study. However, it seems that, during the final examination period in May, both students and lecturers became accustomed to online assessment, and their departments responded well. This was reflected in the data through the increasingly positive tones of students’ tweets and the reported experiences of full marks, high grades, and easy exams. The satisfaction associated with high grades and GPAs in the Saudi context was also highlighted by Thompson (2019). Students are highly concerned with their exams and obtaining high GPAs, and their grades have a great impact on their engagement and how they evaluate their learning experiences.

5.1 Research implications

MOHE’s responses to the situation were rapid, yet officials experienced several challenges in implementing the necessary changes within a short period. There are lessons that can be drawn from emergency online learning during the COVID-19 pandemic. First, in sudden shifts from face-to-face to online learning, trying to cover all the course content was not feasible, practical, or useful for student learning. In such situations, expectations must be lowered, and teaching materials must be reviewed and restructured to select those most relevant to student needs. Assessment tasks should also be carefully selected, and alternatives to exams should be adopted to ensure positive experiences with online assessment. Examinations should be limited, but when they are necessary, they should take the form of open-book exams, with adequate time allotted and standardised across schools, and with other appropriate measures taken.

As assessment has an impact on teaching and learning, the efficiency of traditional assessment methods has been tested against the ground realities of this pandemic. There were reports of panic among professors, which led some to over-assess their students. Departmental arrangements are necessary to coordinate between courses and avoid clashes between submission deadlines, exams, and quizzes. It is also essential to consider training staff in alternative assessment methods, which should be given more weight in grading and used as part of the regular teaching plan, not just for emergency learning.

5.2 Limitations

This study has some limitations. First, the data collected from unstructured sources reflect only the opinions of those who had access to social media and shared their experiences via trending hashtags. Second, this study reflects only one side of the story, narrated by students from one country.

5.3 Future directions

Professors and other stakeholder groups have their own narratives to tell, their own justifications for the decisions they made, and their own feelings about how regulations were implemented and which objectives were achieved. This underscores the importance of researching different stakeholder groups to shed light on how the various changes made during emergency online teaching and learning affected their experiences.