
Developing, Validating, and Implementing a Tool for Measuring the Readiness of Medical Teachers for Online Teaching Post-COVID-19: A Multicenter Study

Authors: Hosny S, Ghaly M, Hmoud AlSheikh M, Shehata MH, Salem AH, Atwa H

Received 24 April 2021

Accepted for publication 29 June 2021

Published 13 July 2021. Volume 2021:12, Pages 755–768

DOI https://doi.org/10.2147/AMEP.S317029

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 2

Editor who approved publication: Dr Md Anwarul Azim Majumder



Somaya Hosny,1 Mona Ghaly,1 Mona Hmoud AlSheikh,2 Mohamed Hany Shehata,3,4 Abdel Halim Salem,1,3 Hani Atwa1,3

1Faculty of Medicine, Suez Canal University, Ismailia, Egypt; 2College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Kingdom of Saudi Arabia; 3College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Kingdom of Bahrain; 4Faculty of Medicine, Helwan University, Cairo, Egypt

Correspondence: Hani Atwa
Faculty of Medicine, Suez Canal University, Ismailia, Egypt
Tel +20 1224576171
Email [email protected]

Background: While online education is by no means a new concept, it was recently thrust into the spotlight after school campuses all over the world were forced to close because of the COVID-19 pandemic. The sudden need to shift revealed emerging challenges to online teaching, both logistic and personal. One important challenge is the ability to assess the readiness of educators for online teaching, so that appropriate and specific feedback/training can be offered to those in need. This study aims at developing, validating, and implementing a tool to measure the teachers’ readiness for online teaching in three medical schools from three different countries.
Methods: This was a multi-center, cross-sectional study that involved developing a survey through review of the literature and previous studies, item development and revision, and pilot testing. The survey was then distributed electronically to a convenience sample of 217 teaching faculty members of different academic ranks from three medical schools in Egypt, Saudi Arabia, and Bahrain. Exploratory factor analysis and a reliability study were performed. Descriptive statistics were applied, and the statistical significance level was set at 0.05.
Results: Factor analysis produced the following five factors: “Online Teaching and Course Design Skills”, “Digital Communication”, “Basic Computer Skills”, “Advanced Computer Skills” and “Using Learning Management Systems”. The tool showed high reliability (alpha = 0.94). Survey results showed highest mean scores for Basic Computer Skills with lower scores for Online Teaching and Course Design Skills and Using Learning Management Systems. ANOVA revealed statistically significant differences between the three studied schools regarding Digital Communication (F=5.13; p=0.007) and Basic Computer Skills (F=4.47; p=0.012) factors.
Conclusion: The tool proved to be reliable and valid. Results indicated an overall acceptable readiness in the three involved schools, with a need for improvement in “Online Teaching and Course Design” and Using Learning Management Systems.

Keywords: teacher online readiness, online teaching readiness tools, faculty development

Background

No matter how well prepared, medical schools all over the world were recently thrown off balance when they were forced to shift to online teaching. With no end to the COVID-19 crisis in sight, schools have realized that they are unlikely to return to normal anytime soon, and that online teaching has moved from an adjunct learning method to a main modality for both teachers and learners. Teachers around the world have had to move their courses online in the face of several challenges, namely the need to (1) promptly shift to online teaching, with little to no preparation; (2) implement and maintain online teaching under the difficult pandemic conditions; and (3) sustain extended online teaching with little to no information regarding the expected duration of the shift.1

Online teaching is a prominent advancement in twenty-first-century education, in which online instructors play an essential role and must therefore be well trained and adaptable to technology.2 Under normal circumstances, it is recommended that academics be allocated an appropriate amount of time to acquire the new pedagogical and technical skills before actually operating within an online classroom. Several considerations are important in online teaching, including the design of online courses, the skills of online instruction and student engagement, and the appropriateness of online delivery to the respective discipline.3 Other considerations that might influence teachers’ readiness are years of teaching experience, self-efficacy with regard to communication and learning transfer, as well as the teacher’s self-directed learning skills.4

Moreover, it was reported by Eslaminejad et al that faculty should continuously be offered training to upgrade their information technology knowledge and skills over time. In addition, their study results indicated that pedagogical innovations are required to develop and implement an effective e-learning program.5

Considerable debate exists among researchers concerning the optimal set of competencies instructors require for online teaching. Some educationists argue that these competencies are not significantly different from those necessary for face-to-face teaching,6 and it is assumed that past teaching experience is sufficient to teach online.7 However, others disagree and support their argument with the fact that the toolkit needed by online teachers differs in three main areas: teaching time and space, learning management system techniques, and the skill of engaging students.8,9 More recently, Martin et al9 described four areas of competency for online teaching instructors: course design, course communication, time management, and technical skills.

Some investigators studied the perception of medical educators towards online teaching. Downing and Dyment10 examined educators’ perceptions of preparing teachers for a purely online environment and found that online teaching was felt to be time-consuming. Northcote et al11 revealed that faculty felt low self-efficacy in selecting technological resources and high self-efficacy in online course alignment (effectively aligning objectives to learning activities and assessments). For successful conduct of online learning, teachers need to put themselves in the position of self-learners when it comes to the required competencies. Thus, understanding teachers’ readiness as learners to engage in online learning not only enables instructional designers to deliver better online courses, but also enables educational institutions to better help teachers improve their online learning experiences.4

In the context of the COVID-19 crisis, multiple factors can impact faculty readiness and competencies regarding online teaching. It was claimed that the urgent need for readiness produced positive attitudes among teachers, including their willingness to revise their teaching for online delivery as well as to share control in their classrooms with students whose technological expertise exceeded their own.1 There is a need to move beyond an instrumental approach to online teaching and learning and define the roles and responsibilities of educators in online education and their impact on the process of learning to teach. This includes the ethical and political dimensions and the consideration of issues of power and control over teaching and learning.12

We believe that this study makes an important contribution toward unraveling the complexities of online teaching. The educational transformation into the online mode depends on the level of readiness in many areas that are explored in this study. Thus, the aim of this study is to develop, validate, and implement a tool for measuring the readiness of medical teachers for online teaching in three medical schools in three countries in the Middle East and North Africa (MENA) region. This would help the three involved medical schools tailor their faculty development programs according to the responses of their faculty members. Additionally, other medical schools worldwide can use our validated tool to check the online teaching readiness of their educators.

Subjects and Methods

Type of Study

A comparative, cross-sectional, survey-based study.

Study Setting

This is a multicenter study that included three medical schools: the Faculty of Medicine, Suez Canal University, Egypt (FOM-SCU); the College of Medicine, Imam Abdulrahman Bin Faisal University, Kingdom of Saudi Arabia (COM-IAU); and the College of Medicine and Medical Sciences, Arabian Gulf University, Kingdom of Bahrain (CMMS-AGU). Before the COVID-19 pandemic, the three schools relied only minimally on online teaching/learning, but when the pandemic struck there was a sudden, complete shift to the online mode. The curricula of the three schools are similar: all are innovative curricula that employ problem-based learning with both vertical and horizontal integration of basic and clinical sciences. Faculty characteristics and faculty workload are similar across the three schools. In addition, the Moodle learning management system is used in two of the schools, while the third school uses Blackboard®.

Sampling

Study Population

The study population included male and female teaching faculty members of different academic ranks (professors, associate professors, assistant professors, and lecturers) at the three study centers. Teaching assistants and lab technicians were not included in the study as they do not have online classes.

Sample Size

The study used a convenience sample of teaching faculty members including all academic ranks (professors, associate professors, assistant professors, and lecturers). The total number who responded to the questionnaire within the given timeframe (one month) was 217 from the three medical schools (around 24% of the teaching faculty members at the three schools collectively).

Instrument

The survey was developed by the researchers following the steps of 1) extensive review of the relevant literature and similar studies that included items addressing the readiness of teaching faculty members for online teaching, 2) item development and revision by all authors, 3) revision by five medical education experts from the three schools, and 4) pilot testing on a small number of respondents. The survey was written in English and included 30 items representing basic technology skills, communication skills, the skills of using learning management systems, online teaching skills, and course planning skills. It started as a 5-point Likert scale (Strongly Agree, Agree, Uncertain, Disagree, and Strongly Disagree); however, to collect clearer responses from the study participants, the 5-point scale was reduced to a 3-point scale, where “Strongly Agree” and “Agree” were combined into “Agree”, while “Strongly Disagree” and “Disagree” were combined into “Disagree”. This 3-point scale was used to collect data from the participants.
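The collapsing of five-point responses into the three-point scale described above can be sketched in a few lines of Python (an illustrative sketch only; the function and variable names are ours, not part of the study's materials):

```python
# Collapse 5-point Likert labels into the 3-point scale used for analysis:
# both "agree" variants become "Agree", both "disagree" variants become
# "Disagree", and "Uncertain" is kept unchanged.
COLLAPSE = {
    "Strongly Agree": "Agree",
    "Agree": "Agree",
    "Uncertain": "Uncertain",
    "Disagree": "Disagree",
    "Strongly Disagree": "Disagree",
}

def collapse_responses(responses):
    """Map a list of 5-point Likert labels to the 3-point scale."""
    return [COLLAPSE[r] for r in responses]
```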

To determine the suitability of the survey, validity and reliability studies were conducted. Two types of validity were established. The first was content validity, through revision by a group of medical education experts with regard to structure, the different dimensions, and the ability of the questions to explore the readiness of faculty members for online teaching. The second was construct validity, through Exploratory Factor Analysis (EFA). The survey was tested for reliability (internal consistency) by calculating Cronbach’s alpha.

Data Collection

The questionnaire was converted into an electronic format using SurveyMonkey® and distributed to the target population in each medical school through different communication platforms (faculty forums, WhatsApp® groups, official e-mails…). At the beginning of the online survey, the respondents were briefed about the aim of the study and were free to decline participation without any consequences. Each respondent could respond only once to the questionnaire. The questionnaire was open for data collection for a period of one month, after which no further responses were accepted.

Data Analysis

The statistical analysis was performed using the Statistical Package for Social Science (SPSS) for Windows, version 25. Data collected through the questionnaires were presented in the form of frequencies.

For comparing the mean scores of the three schools, means and standard deviations were calculated and analysis of variance (ANOVA) was used. A p-value ˂ 0.05 was considered statistically significant.
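A one-way ANOVA of the kind used here to compare the three schools' mean scores can be sketched in plain Python (for illustration only; the study's analysis was done in SPSS, and the toy data in the usage example are ours):

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of groups (each a list of
    scores). Returns (F, df_between, df_within)."""
    k = len(groups)                        # number of groups (here: schools)
    n = sum(len(g) for g in groups)        # total number of observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: weighted squared deviations of group means
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group's mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within
```

The resulting F is compared against the F distribution with (df_between, df_within) degrees of freedom to obtain the p-value.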

Internal consistency was analyzed using Cronbach’s alpha. Missing data were replaced with the mean of the respective variable.
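Cronbach's alpha, used above for internal consistency, follows a simple formula: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal Python sketch (illustrative only; the study computed alpha in SPSS):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for item_scores: one row per respondent,
    each row a list of k item scores."""
    k = len(item_scores[0])  # number of items

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in item_scores]) for i in range(k)]
    total_var = var([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

When all items move together perfectly, alpha reaches 1.0; values near 0.94, as reported here, indicate very high internal consistency.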

Testing the psychometric properties of the questionnaire was performed through EFA. To identify the different factors, EFA was performed using principal component analysis with Varimax rotation. The number of factors extracted and used was based on the Kaiser criterion, which considers factors with an eigenvalue greater than one as common factors,13 the Scree test criterion (the Cattell criterion) to identify the inflexion point indicated by the Scree plot,14 and the cumulative percentage of variance extracted (in humanities research, the explained variance is usually only 50–60%).15
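The retention rules above (Kaiser criterion plus cumulative variance explained) reduce to a small computation. The sketch below uses the eigenvalues reported later in the Results purely for illustration; with standardized items, each factor's share of the total variance is its eigenvalue divided by the number of items:

```python
def retained_factors(eigenvalues, n_items):
    """Keep factors with eigenvalue > 1 (Kaiser criterion) and report the
    cumulative percentage of total variance they explain."""
    retained = [e for e in eigenvalues if e > 1.0]
    cum_variance = 100.0 * sum(retained) / n_items
    return len(retained), cum_variance

# Eigenvalues of the five retained factors reported in this study (30 items):
n_factors, cum = retained_factors([10.69, 2.56, 1.70, 1.26, 1.23], n_items=30)
# n_factors is 5; cum is about 58.1% of total variance, matching the Results
```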

After applying the psychometric criteria, the retained factor solutions were then analyzed according to the following interpretability criteria:16

  • A given factor contains at least three variables with significant loadings, with a loading of 0.30 suggested as the cutoff.
  • Variables that load on the same factor have the same conceptual meaning.
  • A variable that loads on a different factor measures a different construct.
  • The rotated factor pattern shows a “simple structure”, meaning that:
    • Most variables load relatively high on only one factor and low on the other factors.
    • Most factors have relatively high factor loadings on some variables and low loadings on the remaining variables.

Ethical Approval

The study was approved by the Research and Ethics Committee (REC) of the College of Medicine and Medical Sciences, Arabian Gulf University, Kingdom of Bahrain (No. E004-PI-9/20). All survey participants provided informed consent.

Results

The results are divided into two parts:

Part I: Validity and Reliability of the Newly Developed Survey

Results of the Validity Study

Content Validity

According to the review done by five medical education experts from the three schools, modifications to some items were made and the survey was edited and made ready for administration to the study participants. Examples of the modifications made include adding examples of web browsers, learning management systems, and synchronous online teaching platforms.

Construct Validity: Exploratory Factor Analysis (EFA)

Checking the Suitability of Data for Factor Analysis

The 217 collected responses were adequate for factor analysis: the Kaiser-Meyer-Olkin Measure of Sampling Adequacy (KMO) was 0.91, and Bartlett’s Test of Sphericity was significant (χ2 (435, N=217)=3379.63, p<0.001), indicating the appropriateness of the data for factor analysis.
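As a quick consistency check, the degrees of freedom reported for Bartlett's test follow directly from the number of survey items (a small arithmetic sketch):

```python
# Bartlett's test of sphericity checks whether the item correlation matrix
# differs from an identity matrix; its degrees of freedom equal the number
# of unique item pairs, p * (p - 1) / 2.
p = 30                 # number of survey items
df = p * (p - 1) // 2  # 30 * 29 / 2 = 435, matching chi2(435, N=217)
```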

Extraction of Factors

Results of factor extraction revealed that the 30 items of the survey could be grouped under five factors with an eigenvalue >1.00. The five factors that emerged from factor analysis accounted for 58.1% of the total variance.

Rotation of Factors

Results of factor rotation showed that none of the 30 items of the survey were removed from the analysis. This was based on finding that a) all the factors had three or more items, b) no items had cross-loading between factors, and c) all items had a loading of >0.30 on the relevant factor.

After conducting the previous validity studies, the survey retained all 30 original items, distributed across five factors. The factors were named according to the strength of the loadings of the statements (items) on each factor and the idea behind each statement (Table 1), as follows:

  • Factor 1 explained 35.63% of the variance in responses, with an eigenvalue of 10.69. Twelve statements loaded on this factor, with values between 0.45 and 0.72. This factor was named “Online Teaching and Course Design Skills”. It addresses the skills of designing courses and educational materials for online teaching and learning, excelling in online teaching, and seeking development in such skills.
  • Factor 2 explained 8.52% of the variance in responses, with an eigenvalue of 2.56. Six statements loaded on this factor, with values between 0.43 and 0.68. This factor was named “Digital Communication”. It addresses confidence in communicating verbally and in writing and in giving feedback to learners.
  • Factor 3 explained 5.68% of the variance in responses, with an eigenvalue of 1.70. Five statements loaded on this factor, with values between 0.50 and 0.80. This factor was named “Basic Computer Skills”. It addresses the skills of file management and document creation using Microsoft Office applications, sending and receiving emails, surfing the internet for educational materials, and being familiar with a learning management system.
  • Factor 4 explained 4.19% of the variance in responses, with an eigenvalue of 1.26. Three statements loaded on this factor, with values between 0.59 and 0.76. This factor was named “Advanced Computer Skills”. It addresses the skills of file encryption and recording audio and video clips.
  • Factor 5 explained 4.08% of the variance in responses, with an eigenvalue of 1.23. Four statements loaded on this factor, with values between 0.61 and 0.69. This factor was named “Using Learning Management Systems”. It addresses comfort and confidence in using learning management systems in course development and management.
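Each factor's percentage of explained variance listed above is its eigenvalue divided by the total number of items; a quick check against the reported figures (the reported eigenvalues are themselves rounded, so the later factors differ from the text by about 0.01 to 0.02 percentage points):

```python
# Percent of total variance explained by a factor = eigenvalue / n_items * 100
eigenvalues = [10.69, 2.56, 1.70, 1.26, 1.23]  # factors 1-5 as reported
percents = [round(100 * ev / 30, 2) for ev in eigenvalues]
# Factor 1: 100 * 10.69 / 30 = 35.63, matching the text exactly;
# the five factors together sum to about 58.1% of the total variance.
```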

Table 1 Factor Loadings of the Items Under the Five Factors of the Survey Form (Using Principal Components Analysis)

Results of the Reliability Study

Test of reliability (internal consistency) revealed high reliability of this tool (Alpha = 0.94).

Part II: Survey Results

Two hundred and seventeen faculty members from the three schools responded to the survey. As shown in Table 2, the profile of the respondents revealed that the majority of them are in the age group of 46–55 years at the FOM-SCU and COM-IAU, while at the CMMS-AGU the majority are in the age group of 56–65 years. Most of the responses were from females at FOM-SCU and COM-IAU (63.3% and 69.8% respectively), while this was reversed at CMMS-AGU where most of the respondents were males (61.1%). The academic rank “full professor” was the most prevalent among FOM-SCU respondents (45.8%), while “assistant professor” was the most prevalent among CMMS-AGU and COM-IAU respondents (33.3% and 46.6%, respectively).

Table 2 Profile of the Respondents of the Three Participating Schools

Teacher Responses Under Each Factor

Factor 1: Online Teaching and Course Design Skills

Regarding Online Teaching and Course Design Skills (Table 3), the great majority of instructors in our sample expect online teaching to take more time and they are ready for that. They feel comfortable conducting interactive learning activities and creating online teaching materials. They agreed that they can create schedules for themselves and stick to them. Some differences were noticed between the three schools with respect to faculty perceptions of their readiness for online teaching.

Table 3 Instructors’ Perception of Their Readiness in Regard to “Online Teaching and Course Design Skills”

On the other hand, most instructors incorporate online learning activities that are connected to real-world applications during teaching and are keen to attend online faculty development activities as learners. They are familiar with online course planning, comfortable writing measurable learning outcomes, and understand copyright law and Fair Use guidelines when using copyrighted materials in education. Some differences were noticed between the three schools, which did not consistently favor any of them.

Two-thirds of instructors agreed on three items (enjoying online lecturing for most of the class period, knowing how to check for plagiarism in students’ assignments, and feeling comfortable designing online interactive learning activities). Some differences were noticed between the three schools, which did not consistently favor any of them.

For almost all items, the smallest percentages were for those who disagreed, while no more than one-fifth of participants were uncertain about any of the items.

Factor 2: Digital Communication

Regarding Digital Communication (Table 4), the great majority of instructors in the three schools agreed that they feel comfortable communicating through writing and speaking and that they are willing to provide timely and constructive feedback to their students. Some differences were noticed between the three schools, which did not consistently favor any of them.

Table 4 Instructors’ Perception of Their Readiness in Regard to Digital Communication

For the other three items (feeling comfortable using social media tools to communicate with students and colleagues, being ready to respond in a timely manner to communication requests from students and colleagues, and being available to students on a regular basis for questions and assistance), most of the instructors agreed.

Factor 3: Basic Computer Skills

Regarding Basic Computer Skills (Table 5), almost all the instructors agreed that they can use Microsoft Office tools such as Word and PowerPoint to create documents and presentations. They can perform file management on their computers, such as copying, moving, renaming, and deleting files or folders. They can also send and receive emails, including opening and sending email attachments. They can use internet browsers to locate resources for teaching; and they are familiar with at least one synchronous online teaching platform. Minimal statistical differences were noticed in the perception scores between the three schools’ respondents.

Table 5 Instructors’ Perception of Their Readiness in Regard to Basic Computer Skills

Factor 4: Advanced Computer Skills

Regarding “Advanced Computer Skills” (Table 6), the great majority of the instructors agreed that they can encrypt files on their computers, record audio/video, and add audio/video files to presentations. Some differences were noticed between the three schools, which were not consistently in favor of any of them.

Table 6 Instructors’ Perception of Their Readiness in Regard to Advanced Computer Skills

Factor 5: Using Learning Management Systems

Regarding Using Learning Management Systems (Table 7), the majority of the instructors agreed that they are comfortable using learning management system tools to develop online courses, using tools in the learning management system to facilitate student learning, and using the learning management system or other online assessment tools to evaluate student performance. Some differences were noticed between the three schools, which were not consistently in favor of any of them.

Table 7 Instructors’ Perception of Their Readiness in Regard to Using Learning Management Systems

Table 8 shows the factors’ mean scores for the three schools. The mean scores for Basic Computer Skills were the highest for the three schools, while those for Online Teaching and Course Design Skills and Using Learning Management Systems were the lowest for the three schools. Analysis of variance of the responses revealed that there are statistically significant differences regarding two of the factors, which are Digital Communication (F=5.13; p=0.007) and Basic Computer Skills (F=4.47; p=0.012).

Table 8 Analysis of Variance (ANOVA) of the Mean Scores of Responses in the Three Studied Medical Schools

Discussion

The value of measuring the readiness for online teaching emanates from the fact that online instructor readiness plays a key role in the success of e-learning,17–19 partly because perceived self-efficacy has a high impact on perceived ease of use.18

The tool designed in this study is a 30-item Likert scale, the Online Teaching Readiness Questionnaire (OTRQ), that measures teachers’ readiness for online teaching. To the best of our knowledge, this is the first comprehensive tool to measure the readiness of teachers for online instruction in the Middle East. The reliability of this tool was tested on more than 200 faculty and proved to be excellent, with a Cronbach’s alpha value of 0.94 (acceptable values range from 0.70 to 0.95).20 Previous studies have shown reliabilities ranging from 0.7018,19 to 0.86.21

The validity study through EFA yielded five components (factors) that give a holistic readiness estimate combining the three phases of pre-course, during the course, and after the course, and that stress the technical and pedagogical factors involved in such readiness. We used a 3-point Likert scale, which is similar to other studies that used 3-point,22 4-point,23–25 5-point,2,26–30 and 7-point18,31 Likert scales. The OTRQ contains 30 items, while other studies varied from 18 to 49 items.2,25–27,32–39 Our validity study yielded five factors, as previously described, while the number of factors in other studies varied from 3 factors33 to 14 factors.32 The number of participants in our study was 217, which was sufficient to validate the instrument. Other validation studies used varied numbers of participants, ranging from 20 participants40 to 369.34

The validation process used in this study included content validation through experts’ opinions and construct validation through EFA. All previous studies had to discard some items based on the criterion of low loading or cross loading with different factors, but in this study, no items were discarded as every item fulfilled the criterion of having a loading of >0.3 on a particular factor. Most studies depended only on content validation through expert revision, correlation, and literature review. Only three studies of online teaching readiness measurement tools employed factor analysis.27,37,41

Age was found to influence faculty perceptions of e-learning, with younger age associated with more positive e-learning readiness and less technophobia.1,2 The respondents belonged to three different countries in the MENA region and to long-standing, nationally accredited higher education institutions. Almost all (91%) had a rank of assistant professor or higher. As faculty rank increases, the perceived ease of use and readiness for online teaching scores decrease; however, this relationship was not statistically significant. This agrees with findings from a study by Martin et al,9 who found that full and associate professors scored lower than assistant professors and lecturers in both attitude and ability regarding online course design, online course communication, and technical competence.

The scores obtained from the three schools showed a high level of online teaching readiness among instructors. The readiness was greater in technical and attitudinal aspects than in pedagogical skills. It showed an appreciable level of overall teacher readiness for online instruction of 85.4%, which is in agreement with previous studies.2,26,42 Other studies have shown much lower levels of teacher readiness for online instruction of 20%,43 28%,2 34%,44 and 54%.45

In our study, the lowest score for all schools regarding Online Teaching and Course Design was for the statement “During teaching, I incorporate online learning activities that are connected to real-world applications. (i.e., using real clinical cases, reflecting on applying knowledge in life uses… etc.)”. This probably means that instructors lack experience in creating online instructional materials from real-life examples, or that they do not have sufficient time or facilities to do so. Training and assistance in e-learning are one solution to this problem, as suggested in previous studies.46 In addition, the type of learning management system platform influences instructors’ perception of readiness for online teaching.18 This is consistent with previous studies that emphasized that the main obstacle to online teaching was the lack of pedagogical or online course design skills.2,18,33,34

The gap between the positive perception of digital and computer skills and the lack of use of electronic learning management systems found in this study is similar to that in previous studies, which showed that lack of time, high workload, and lack of recognition of e-learning material preparation were the reasons behind this gap.25,26,45

Analyzing previous studies on instructor readiness for online teaching showed a lack of standardization among them, including inadequate methodology for exploring readiness scores for online teaching and a lack of proper statistical validation methodology. Readiness tools for online instruction have produced results and identified problem areas, but have not provided solutions to address deficiencies. Furthermore, these tools did not actually measure the knowledge, skills, and attitudes required by users in online teaching but instead broadly explored instructors’ perceptions of their own technical skills and behavioral categories to predict readiness for online teaching.19,46 In other words, there is a gap between the measurement of faculty perceptions and the actual use of e-learning.46 Moreover, the tools that assess readiness for online teaching consist of items and domains that are appropriate for use in developed countries and international institutions, but not necessarily appropriate for developing countries and local institutions.19 The current study helped address this gap by exploring the degree of readiness of medical teachers for online teaching through their knowledge, skills, and attitudes, and by evaluating how such behaviors impacted their level of use of the learning management system during delivery of an online course.

Among the three schools, the highest mean scores for Digital Communication and Basic Computer Skills can be explained by the fact that such skills are used frequently by faculty members, as they communicate all the time through different means of digital messaging and electronic emailing and all of them use computers to create and manage files like presentations and handouts. This in addition to the regular training provided by the three schools’ IT units in basic computer skills. The work of Eom et al47 on the role of information technology in the success of e-learning supports this view. The lowest mean scores for Online Teaching and Course Design Skills and Using Learning Management Systems can be explained by the fact that such skills are relatively new to them and were uncommon among faculty members in the three schools that depended mainly on face-to-face teaching before the COVID-19 pandemic. This can highlight these two areas as priorities for future faculty development programs. Moreover, the statistically significant difference between the three schools with regard to Digital Communication and Basic Computer Skills indicates the individual differences between people in using computers and digital communication applications and platforms.

The study has some possible limitations. First, the use of two different online learning management systems in the three schools may be a confounding factor, as differences in the user-friendliness of the two systems are expected. Second, the nature and efficiency of the technological support and faculty development provided to teaching faculty in the three schools are expected to differ.

Conclusion

Measuring readiness for online teaching is important, especially in a situation of physical distancing like the one we are facing because of COVID-19.

A highly reliable and valid tool was developed in this study, which can be implemented by other medical schools. Analysis of the results indicated an overall acceptable readiness for online teaching among teachers in the three participating medical schools. However, we recommend faculty development in Online Teaching and Course Design Skills and in Using Learning Management Systems, as these are important areas affecting the readiness of medical teachers for online teaching.

Acknowledgments

We would like to thank all the teaching faculty members who participated by completing the survey. We also thank Dr. Mohamed Elwazir for his assistance with data analysis.

Author Contributions

All authors made a significant contribution to the work reported, whether that is in the conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the article; gave final approval of the version to be published; have agreed on the journal to which the article has been submitted; and agreed to be accountable for all aspects of the work.

Funding

No funding was obtained for this study.

Disclosure

The authors declare that they have no conflicts of interest for this work.

References

1. Cutri RM, Mena J, Whiting EF. Faculty readiness for online crisis teaching: transitioning to online teaching during the COVID-19 pandemic. Eur J Teach Educ. 2020;43(4):523–541. doi:10.1080/02619768.2020.1815702

2. Gay GH. An assessment of online instructor e-learning readiness before, during, and after course delivery. J Comp High Educ. 2016;28(2):199–220. doi:10.1007/s12528-016-9115-z

3. Muthuprasad T, Aiswarya S, Aditya KS, Jha GK. Students’ perception and preference for online education in India during COVID-19 pandemic. Soc Sci Hum Open. 2021;3(1):100101.

4. Hung ML. Teacher readiness for online learning: scale development and teacher perceptions. Comp Educ. 2016;94:120–133. doi:10.1016/j.compedu.2015.11.012

5. Eslaminejad T, Masood M, Ngah NA. Assessment of instructors’ readiness for implementing e-learning in continuing medical education in Iran. Med Teach. 2010;32(10):e407–e412. doi:10.3109/0142159X.2010.496006

6. Bawane J, Spector JM. Prioritization of online instructor roles: implications for competency‐based teacher education programs. Distance Educ. 2009;30(3):383–397. doi:10.1080/01587910903236536

7. Wray M, Lowenthal PR, Bates B, Stevens E. Investigating perceptions of teaching online & f2f. Acad Exchange Quart. 2008;12(4):243–248.

8. Ko S, Rossen S. Teaching Online: A Practical Guide. Taylor & Francis; 2017.

9. Martin F, Budhrani K, Wang C. Examining faculty perception of their readiness to teach online. Online Learn. 2019;23(3):97–119. doi:10.24059/olj.v23i3.1555

10. Downing JJ, Dyment JE. Teacher educators’ readiness, preparation, and perceptions of preparing preservice teachers in a fully online environment: an exploratory study. Teach Educ. 2013;48(2):96–109. doi:10.1080/08878730.2012.760023

11. Northcote M, Gosselin KP, Reynaud D, Kilgour P, Anderson M. Navigating learning journeys of online teachers: threshold concepts and self-efficacy. Issues Educ Res. 2015;25(3):319–344.

12. Carrillo C, Flores MA. COVID-19 and teacher education: a literature review of online teaching and learning practices. Eur J Teach Educ. 2020;43(4):466–487. doi:10.1080/02619768.2020.1821184

13. Kaiser HF. The application of electronic computers to factor analysis. Educ Psychol Measure. 1960;20(1):141–151. doi:10.1177/001316446002000116

14. Cattell RB. The scree test for the number of factors. Multivariate Behav Res. 1966;1(2):245–276. doi:10.1207/s15327906mbr0102_10

15. Pett MA, Lackey NR, Sullivan JJ. Making Sense of Factor Analysis: The Use of Factor Analysis for Instrument Development in Health Care Research. Sage; 2003.

16. Lee N, Saunders J, Saunders J, Lee N. The evolution of “classical mythology” within marketing measure development. Eur J Market. 2005;39(3/4):365–385. doi:10.1108/03090560510581827

17. Hashim H, Tasir Z. E-learning readiness: a literature review. In: 2014 International Conference on Teaching and Learning in Computing and Engineering. IEEE; 2014:267–271.

18. Keramati A, Afshari-Mofrad M, Kamrani A. The role of readiness factors in E-learning outcomes: an empirical study. Comp Educ. 2011;57(3):1919–1929. doi:10.1016/j.compedu.2011.04.005

19. Motaghian H, Hassanzadeh A, Moghadam DK. Factors affecting university instructors’ adoption of web-based learning systems: case study of Iran. Comp Educ. 2013;61:158–167. doi:10.1016/j.compedu.2012.09.016

20. Tavakol M, Dennick R. Making sense of Cronbach’s alpha. Int J Med Educ. 2011;2:53–55. doi:10.5116/ijme.4dfb.8dfd

21. Wasilik O, Bolliger DU. Faculty satisfaction in the online environment: an institutional study. Internet Higher Educ. 2009;12(3–4):173–178. doi:10.1016/j.iheduc.2009.05.001

22. Pillay H, Irving K, Tones M. Validation of the diagnostic tool for assessing Tertiary students’ readiness for online learning. High Educ Res Dev. 2007;26(2):217–234. doi:10.1080/07294360701310821

23. Agboola AK. Assessing the awareness and perceptions of academic staff in using e-learning tools for instructional delivery in a post-secondary institution: a case study. Innov J. 2006;11(3):1–2.

24. Schreurs J, Ehlers UD, Sammour G. E-learning Readiness Analysis (ERA): an e-health case study of e-learning readiness. Int J Know Learn. 2008;4(5):496–508. doi:10.1504/IJKL.2008.022066

25. Saekow A, Samson D. E-learning readiness of Thailand’s universities comparing to the USA’s cases. Int J e-Educ e-Business e-Manage e-Learn. 2011;1(2):126.

26. Nunnally JC, Bernstein IH. Psychometric Theory. 3rd ed. New York: McGraw-Hill; 1994.

27. Schifter CC. Faculty participation in asynchronous learning networks: a case study of motivating and inhibiting factors. J Asynchron Learn Netw. 2000;4(1):15–22.

28. Panda S, Mishra S. E‐learning in a mega open university: faculty attitude, barriers and motivators. Educ Media Int. 2007;44(4):323–338. doi:10.1080/09523980701680854

29. Lee JA, Busch PE. Factors related to instructors’ willingness to participate in distance education. J Educ Res. 2005;99(2):109–115. doi:10.3200/JOER.99.2.109-115

30. Naidu S. Trends in faculty use and perceptions of e-learning. Asian J Distance Educ. 2004;2(2).

31. Parlakkiliç A. E-learning readiness in medicine: Turkish Family Medicine (FM) physicians case. Turkish Online J Educ Technol. 2015;14(2):59–62.

32. Mantilla G, Lewis KO. Determining faculty and student readiness for an online medical curriculum. Med Sci Educ. 2012;22(4):228–243. doi:10.1007/BF03341791

33. Darab B, Montazer GA. An eclectic model for assessing e-learning readiness in the Iranian universities. Comp Educ. 2011;56(3):900–910. doi:10.1016/j.compedu.2010.11.002

34. Holsapple CW, Lee-Post A. Defining, assessing, and promoting E-learning success: an information systems perspective. Decis Sci J Innov Educ. 2006;4(1):67–85. doi:10.1111/j.1540-4609.2006.00102.x

35. Jamlan M. Faculty opinions towards introducing e-learning at the University of Bahrain. Int Rev Res Open Distrib Learn. 2004;5(2). doi:10.19173/irrodl.v5i2.185

36. McGill T, Klobas J, Renzi S. LMS use and instructor performance: the role of task-technology fit. Int J E-Learn. 2011;10(1):43–62.

37. Liaw S, Huang H, Chen G. Surveying instructor and learner attitudes toward e-learning. Comp Educ. 2007;49(4):1066–1080. doi:10.1016/j.compedu.2006.01.001

38. Adeyinka T, Mutula S. A proposed model for evaluating the success of WebCT course content management system. Comp Human Behav. 2010;26(6):1795–1805. doi:10.1016/j.chb.2010.07.007

39. Lloyd SA, Byrne MM, McCoy TS. Faculty-perceived barriers of online education. J Online Learn Teach. 2012;8(1).

40. Larbi-Apau JA, Moseley JL. Computer attitude of teaching faculty: implications for technology-based performance in higher education. J Info Technol Educ Res. 2012;11(1):221–233.

41. Watkins R, Leigh D, Triner D. Assessing readiness for E‐learning. Perform Improv Quart. 2004;17(4):66–79. doi:10.1111/j.1937-8327.2004.tb00321.x

42. El Turk S, Cherney ID. Perceived online education barriers of administrators and faculty at a US university in Lebanon. Creighton J Interdisciplin Leadersh. 2016;2(1):15–31. doi:10.17062/cjil.v2i1.30

43. Mandernach BJ, Mason T, Forrest KD, Hackathorn J. Faculty views on the appropriateness of teaching undergraduate psychology courses online. Teach Psychol. 2012;39(3):203–208. doi:10.1177/0098628312450437

44. Emelyanova N, Voronina E. Introducing a learning management system at a Russian university: students’ and teachers’ perceptions. Int Rev Res Open Distance Learn. 2014;15(1):272–289. doi:10.19173/irrodl.v15i1.1701

45. Yengin I, Karahoca A, Karahoca D. E-learning success model for instructors’ satisfactions in perspective of interaction and usability outcomes. Procedia Comp Sci. 2011;3:1396–1403. doi:10.1016/j.procs.2011.01.021

46. Kim K, Kang Y, Kim G. The gap between medical faculty’s perceptions and use of e-learning resources. Med Educ Online. 2017;22(1):1338504. doi:10.1080/10872981.2017.1338504

47. Eom S, Ashill NJ, Arbaugh JB, Stapleton JL. The role of information technology in e-learning systems success. Hum Sys Manag. 2012;31:147–163.

Creative Commons License © 2021 The Author(s). This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution - Non Commercial (unported, v3.0) License. By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms.