Original Research Article

Factors and Recommendations to Support Students' Enjoyment of Online Learning With Fun: A Mixed Method Study During COVID-19

  • Rumpus Research Group, Faculty of Wellbeing, Education and Language Studies, The Open University, Milton Keynes, United Kingdom

Understanding the components that influence students' enjoyment of distance higher education is increasingly important to enhance academic performance and retention. Although there is a growing body of research about students' engagement with online learning, a research gap exists concerning whether fun affects students' enjoyment. A contributing factor to this situation is that the meaning of fun in learning is unclear, and its possible role is controversial. This research is original in examining students' views about fun and online learning, and the components and connections that influence them. This study investigated the beliefs and attitudes of a sample of 551 distance education students, including pre-service and in-service teachers, consultants and education professionals, using a mixed-method approach. Quantitative and qualitative data were generated through a self-reflective instrument during the COVID-19 pandemic. The findings revealed that 88.77% of participants valued fun in online learning, linking it to well-being, motivation and performance. However, 16.66% mentioned that fun within online learning could take the focus off their studies and result in distraction or loss of time. Principal component analysis revealed three groups of students, who found (1) fun relevant in socio-constructivist learning, (2) no fun in traditional transmissive learning, and (3) fun disturbing in constructivist learning. This study also provides key recommendations, extracted from participants' views and supported by consensual review, for course teams, teaching staff and students to enhance online learning experiences with enjoyment and fun.

Introduction

Online learning has been considered vital in the 21st century to provide flexible education for students as well as to address the gap between demand for higher education and supply. Governments have advocated increasing rates of completion of secondary and higher education in the face of rapid population growth. However, they face financial pressure to support these larger numbers directly through additional infrastructure, in addition to scholarships and student loans (Cooperman, 2014, p. 1).

In recent years, there has been increasing interest in distance online learning, not only to educate students who work but also those who live too remotely or cannot access traditional campus universities for other reasons. However, the literature shows that online distance education has higher dropout rates than traditional universities (Xavier and Meneses, 2020). Studies also suggest that students' satisfaction with their online learning and their own academic performance correlate significantly with their persistence toward completion (Gortan and Jereb, 2007; Higher Education Academy (HEA), 2015).

Understanding the components that influence students' enjoyment in distance higher education is fundamental to promote student retention and success (Higher Education Academy (HEA), 2015) during and after the COVID-19 pandemic. There is a growing body of research about students' engagement in virtual learning environments (Arnone et al., 2011). However, there are key issues that, whilst extensively researched in traditional teaching, remain relatively absent from research into distance education. For example, a long-established body of research demonstrates a link between students' epistemological beliefs and their study, engagement, and outcomes (Rodriguez and Cano, 2007; Richardson, 2013). The types of epistemological beliefs typically examined fall into two broad categories. The first is derived from Schommer's research (Schommer, 1990), in which she elicited dimensions that reflected students' differing beliefs. These included "simple knowledge" (knowledge as isolated facts vs. knowledge as integrated conceptions) and "innate ability" (the ability to learn is genetically determined vs. the ability to learn is enhanced through experience). The second category of research is more directly aligned with pedagogy and has positioned epistemological beliefs in relation to traditional or constructivist beliefs. Traditional views of learning see learning occurring via the non-problematic transfer of untransformed knowledge from expert to student (Chan and Elliott, 2004). This contrasts with constructivist beliefs, in which knowledge arises through reasoning, which is facilitated by teaching (Lee et al., 2013). This type of framing can be seen in large-scale international comparative research, such as the Organization for Economic Co-operation and Development's survey of teachers' epistemological beliefs across 23 countries (Organisation for Economic Co-operation and Development (OECD), 2010, 2013).
However, in relation to online and distance higher education, epistemological research is relatively absent ( Richardson, 2013 ; Knight et al., 2017 ). Given the impact of epistemological beliefs on students’ study experiences there is a need for greater epistemologically focused research in the context of online education.

Another underrepresented research area concerns fun in online learning, in particular because the meaning of fun is unclear and controversial. There is no consensus about the value of fun in learning or what a fun learning experience means in higher education (McManus and Furnham, 2010; Lesser et al., 2013; Tews et al., 2015; Whitton and Langan, 2018). Tews et al. (2015) argue that fun is a term used regularly in various contexts, including education, yet there is no clear agreement about its role in and relationship with students' learning experience. Congruently, McManus and Furnham (2010) highlight that fun has different meanings for different people and that the literature is limited about what generally comprises fun for learners. Similarly, Lesser et al. (2013) indicate that views about fun among educators are ambivalent, as fun is perceived as too difficult or time-consuming to implement and may distract students from serious learning. These three studies indicate that the evidence about fun and learning is too circumstantial and subjective for teaching staff to consider fun a compelling component for making their students' experience more impactful. Therefore, further studies that examine the practical meaning and educational value of fun in distance higher education with a systematic and rigorous methodological approach would be worthwhile.

To explore this challenge, this paper investigates students' reflective views about fun and online learning and whether fun and enjoyment are interconnected components that enhance enthusiasm to learn and excel in online distance education. This investigation considers a critical question framed by the authors from Whitton and Langan's (2018, p. 11) work: how can we explore the impact of fun in higher education in view of the complexity of factors involved? To explore this question, this work is based on the Responsible Research and Innovation (RRI) approach to understanding what, how and why fun might be a valuable key in education with and for distinctive representatives: learners, educators, researchers, consultants, and policy makers. "For pedagogic innovation to succeed, learners must personally perceive the benefits of learning activities" designed to be fun, and "these gains must be translated into outcomes that are viewed positively within the institution quality monitoring by teaching staff." Whitton and Langan (2018) also explain that there is a negative influence from the competitive job market, which values "serious" performance as the opposite of fun, potentially making course teams less likely to embed playful and fun approaches in the higher education curriculum.

The RRI approach implies that community members and researchers interact to better align both the process and outcomes of research with the values, needs and expectations of society (European Commission, 2013; von Schomberg, 2013). The purpose of RRI is to promote greater involvement of societal members with researchers in the research process, to improve knowledge, understanding and decision-making about both societal needs and scientific research, through eight principles: diversity and inclusion; transparency and openness; anticipation and reflexivity; adaptation and responsiveness (RRI-Tools, 2016; European Commission, 2020). These principles were used to adapt, implement and refine a self-reflective instrument about learning and fun. Accordingly, the section "Previous Studies about Fun and Learning" presents views of learning and fun from the literature; the section "Methodology" describes the self-reflective instrument and the integrated methodological approach; the section "Findings" presents the findings; and the section "Discussion and Final Remarks" closes with the discussion and final remarks.

Previous Studies about Fun and Learning

Studies that appear to research fun and learning typically focus on types of activity and the extent to which these are seen as enjoyable and indicated as being fun, rather than drilling down to examine or define fun. While fun is consistently recognized as an important part of the lived experience of children, youth and adults, relatively few studies seek a deeper understanding of what the construct of fun means (Kimiecik and Harris, 1996; Harmston, 2005; Garn and Cothran, 2006). This situation is in stark contrast to how fun is generally positioned with regard to the domain of learning and education.

There are different views in the literature about fun and learning, in terms of its meanings and effects. Negative perspectives describe fun as the opposite of meaningful "work" and consider it an unnecessary distraction from learning.

Fun is a term that has changed over time. In the 1900s, it came to indicate an absence of seriousness, work, and labor. "Fun can be seen both as a resistance to the rigid demarcation between work and leisure and also as a means of reproducing that dichotomy" (Blythe and Hassenzahl, 2018, p. 92). As it took on these meanings, fun became a loaded term that challenges the status quo (Beckman, 2014). It can be positioned as a challenge to the traditional split between fun and learning: welcomed by those who embrace social views of the learning process but seen as an unnecessary distraction by those who hold a traditional transmission view of how learning takes place.

The etymological meaning of fun (fonne and fon, from Germanic), which refers to "simple, foolish, silly, unwise" (Etymonline, 2020), still influences the meanings attributed by people and researchers today. The argument that fun can have a negative influence on learning was highlighted in newspaper reports of research by the Centre for Education Economics (CEE): "Making lessons fun does not help students to learn, a new report has found. The widely held belief that learners must be happy in order to do well is nothing more than a myth" (Turner, 2018). Likewise, Whitton and Langan note in their analysis of fun in the United Kingdom that many educators believe fun to be unsuitable in the "serious" business of higher education (Whitton and Langan, 2018, p. 3). They also highlight a need to research whether students believe that there is any place for fun in their university studies. So, for many, fun is seen as having little or no place within learning. Within the context of education, "fun" is often a derogatory term used to refer to a trivial experience (Glaveanu, 2011).

Some researchers have identified a more positive relationship between fun and learning for children and adults. An analysis of outcomes from the United Kingdom’s “Excellence and Enjoyment” teaching initiative concluded that “Learning which is enjoyable (fun) and self-motivating is more effective than sterile (boring) solely teacher-directed learning” ( Elton-Chalcraft and Mills, 2015 , p482; Tews et al., 2015 ). In the context of informal adult learning, fun has been linked to positive learning outcomes, including job performance and learner engagement ( Francis and Kentel, 2008 ; Fine and Corte, 2017 ; Tews et al., 2017 ). This raises the question of why this conflict and controversy might exist.

On this account, the positive effect is not due to fun being an integral part of the learning process, but rather to physiological effects such as reducing stress and improving alertness, which enhance "performance" (Bisson and Luckner, 1996).

Similarly, Whitton and Langan (2018) describe fun as a "fluid state" (Prouty, 2002) which makes learners feel good (Koster, 2005, p. 40) and engage with learning. This fluid state allows learners to take healthy risks beyond existing personal boundaries (Ungar, 2007), because learners are attracted to participate in learning activities that they enjoy, in which they can "fail forward" and feel safe. In addition, Feldberg (2011, p. 12) indicates that fun has a positive effect on the learning process by creating a state of "relaxed alertness" (Bisson and Luckner, 1996) which enables the suspension of one's social inhibitions and the reduction of stress. The author highlights that fun may contribute to the maintenance of cognitive functioning and emotional growth (Crosnoe et al., 2004, cited in Feldberg, 2011).

Dismore and Bailey's (2011, p. 499) study indicates positive feelings associated with enjoyment, engagement and optimal experience. The authors described fun and enjoyment as underpinned by the concept of "flow" (Csikszentmihalyi, 2015), which refers to "an optimum state of inner experience incorporating joy, creativity, total involvement and an exhilarating feeling of transcendence." This optimum state is a key component in leading students to enjoyable accomplishment and optimal learning when their perceived skill and challenge are balanced and suitable. Flow is an important concept for educators to be aware of: anxiety arises when the challenge becomes too high relative to students' skill, and boredom when it becomes too low; both reduce enjoyment and have a negative effect on learning. Fun learning with flow experiences is relevant for learners to grow through positive opportunities in which their skill meets their effort, producing intrinsic rewards (Dismore and Bailey, 2011; Chu et al., 2017; Whitton and Langan, 2018).

The literature about the meaning of fun in online learning is very limited. A set of studies about engaging e-learning games highlights that fun and challenge are essential for promoting students' enjoyment and making them want to learn (Fu et al., 2009). An engaging e-learning game facilitates students' flow experiences by increasing their attention, helping them achieve learning goals and fostering enjoyment of their learning experience (Virvou et al., 2005; De Freitas and Oliver, 2006).

This study focuses on fun and learning in the context of distance higher education, supported by RRI. To explore what fun is, its meaning and effects need to be understood together with learners. As a first step, there is a need to identify how learners conceive the relationship between fun and online learning, based on their own learning experience. A second step is to examine whether this relationship has any connection with their epistemic views.

The aim of this study is to address the following questions:

• What are the relationships between fun and online learning practices identified by students?

• What are the connections between students’ epistemic views about online learning and fun?

• What are the recommendations for students, teaching staff and course teams?

Methodology

This work is part of the research program OLAF (Online Learning and Fun) led by the Rumpus Research Group. The methodology adopts the established epistemological questionnaire approach and provides an opportunity to facilitate participants' epistemic reflectivity (Feucht et al., 2017). In this way the study is underpinned by the concept of reflective practitioners, by which participants "think in action" about principles and practices to share their reflective views (Schon, 2015).

This study is based on a mixed-method approach. Quantitative and qualitative data were generated through a self-reflective instrument (Feucht et al., 2017) comprising two parts, both developed in Qualtrics. The first part was a Likert-scale survey with 21 statements about learning and fun. The second part was an open question (see "Instruments").

The approach used for qualitative analysis was a systematic and novel multi-method procedure that combined word cloud visualization in Qualtrics (Figure 2), an automated thematic analysis map (Figure 3) and sentiment analysis (Figures 4–6) in NVivo 12. This integration of visualizations enabled us to identify seven themes for analyzing the value of fun, and 26 themes of relationships between fun and learning. The quantitative analysis was supported by principal component analysis (PCA; see "Relationships Between Fun and Learning Supported by Quantitative Analysis"). This approach enabled us to group our multi-method qualitative analysis, categorized by themes, into three groups (see the same section) as well as to present our findings (section "Findings") with global recommendations underpinned by students' needs, priorities and expectations, which were revealed in the qualitative data and grouped by the quantitative analysis.

This study acknowledges the eight principles (Box 1) of RRI (von Schomberg, 2013; RRI-Tools, 2016) in the context of open educational research (Okada and Sherborne, 2018), by which all participants reflect on practices and beliefs for better alignment between learners' needs and research-based recommendations. The instrument, with a special code to allow the withdrawal of participation without the collection of personal data, was approved by the Ethics Committee and the Student Research Project Panel of The Open University, United Kingdom.

Participants

The OU offers flexible undergraduate and postgraduate courses and qualifications supported by distance and open learning for 174,898 people from the United Kingdom, Europe and worldwide. Approximately 76% of directly registered students work full- or part-time during their studies; 23% of Open University United Kingdom undergraduates live in the 25% most deprived areas; 34% of new OU undergraduates are under 25; 14% have disabilities; and 32% have lower qualifications at entry.

This study focused on one of the largest introductory modules offered by the Wellbeing, Education and Language Studies (WELS) Faculty of The Open University. Currently this module has more than 4,300 students and is part of various qualifications. Participants were therefore students from all levels and qualification interests, with different occupations, including novices, undergraduates who had just completed secondary education, pre-service and in-service teachers, as well as professionals interested in Education, Psychology and Social Care.

A balanced and representative sample was constituted by a total of 625 students who participated in this study as volunteers; 551 completed a self-reflective questionnaire to reflect on fun and learning, and 206 provided their reflective views by answering an optional open question. The response rate (40%) for the open views about fun and learning was higher than expected.

In terms of students' previous study experience, 48.55% had completed pre-A levels or equivalent (secondary school), 26.81% had already finished other OU course modules (levels 1, 2, and 3) and 24.64% reported other experiences. In terms of the qualification pathway targeted by students, 28.80% were interested in childhood studies, 34.24% in psychology, 27.17% in primary education, 4.53% in an Open qualification, 3.44% in other qualifications such as Social Care, and 1.81% did not know.

This study focuses on a 9-month module with twenty-four weekly units and four assessment activities. The course integrates reading materials, online audio-visual materials, a YouTube channel ("The Student Hub Live") and a radio-style broadcast audio repository. Students also have access to a set of library resources, news and special "quick guides" that provide extra support for completing activities successfully. Students' interaction with peers and communication with tutors typically occur asynchronously in the online discussion forum and synchronously in online tutorials (in Adobe Connect) and face-to-face tutorials organized at specific times and locations. In addition, the course provides social media channels (Twitter and Facebook) for students' social engagement. The module presentation opens 3 weeks prior to the official start in order to give students time to engage smoothly with their initial activities, including a series of fun and friendly online workshops to promote interaction.

Recruitment

Student recruitment occurred at the middle of the online module. It was supported by the course chair and the module tutors through an invitation shared on the course news page and via a central email sent to all students. Recruitment and data generation occurred over 5 weeks (February–March 2020) and became more effective after the email invitation was sent to all students.

Instruments

The use of self-report questionnaires is well established as a methodology within research examining epistemological beliefs ( Feucht et al., 2017 ). The self-reflective instrument was underpinned by previous work led by the second author ( Sheehy et al., 2019b ) and adapted to the context of online learning and fun.

1. Statements 1–4 and 13–17 relate to models of learning (Social Constructivist and Banking) and are taken from Sheehy and Budiyanto's (2015) development of the Theoretical Orientation Scale (Hardman and Worthington, 2000).

2. Statements 5–8 and 10–12 relate to Constructivist and Traditional views of learning, from the OECD international survey (Organisation for Economic Co-operation and Development (OECD), 2010, 2013).

3. Statements 9 and 18–21 elicit beliefs about fun and happiness and emerged as stable items from Budiyanto et al.'s (2017) epistemological research.

The adapted questionnaire was implemented in Qualtrics with consent forms, study objectives and a novel embedded code to enable students' withdrawal. To our knowledge, this is the first study to provide anonymous withdrawal in Qualtrics. The instrument was then tested in two pre-pilots to check its reliability and the embedded code.
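The idea behind such an embedded withdrawal code can be sketched simply: each respondent receives a random token stored only alongside their answers, so a later deletion request quoting the token removes the record without any personal data ever being collected. The function names and in-memory store below are illustrative assumptions, not the study's actual Qualtrics implementation.

```python
import secrets

# Withdrawal code -> questionnaire answers; no personal data is stored.
responses = {}

def submit(answers):
    """Store answers under a fresh random code; the code is shown
    once to the respondent so they can withdraw anonymously later."""
    code = secrets.token_hex(4)
    responses[code] = answers
    return code

def withdraw(code):
    """Delete the record for this code; returns True if a record was removed."""
    return responses.pop(code, None) is not None
```

Because the code is the only link between a person and their record, withdrawal works without the researchers ever knowing who submitted what.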

In the first phase of implementation, the self-reflective instrument was used by online students to reflect on the topic "Fun and Learning" through a series of 21 statements, using a Likert scale to indicate their level of agreement.

In the second phase, students were invited to complete an optional open-ended question (What is your opinion about fun in online learning?) to provide their reflective views and freely express their feelings on this topic.

Preliminary outcomes of this study (Figure 1) were presented to all participants through an article published in OpenLearn (Okada, 2020) and in a journal paper (Okada and Sheehy, 2020, p. 608). The "Butterfly of Fun" framework, which includes four types of fun in online learning, was developed underpinned by Piaget and Inhelder (1969), Vygotsky et al. (1978), Csikszentmihalyi (2020), and Freire (1967, 1984, 1996, 2009) and supported by students' views. Optimal fun is the joy of being fully involved in learning, moving toward full capability and creativity. Individual fun is the happiness of fulfilling accomplishments, supported by clear goals and strategies. Collaborative fun is the happiness of making connections with others, creating social bonding and developing group identity. Emancipatory fun is the joy of being curious, able to search and discover whilst being critically aware (Okada and Sheehy, 2020).

Figure 1. Four levels of Online Learning and Fun (Source: Okada, 2020 ).

Relationships Between Fun and Online Learning Supported by Qualitative Analysis

This study started with a content analysis in NVivo 12, after importing from Qualtrics a CSV file with the 206 responses about students' views on fun and learning (qualitative data). The word cloud visualization in Qualtrics (Figure 2) indicated the most frequent words: fun (148), learning (123), enjoy/enjoyed/enjoyable/enjoyment (50), students (45), distance (40), tutorials (31), activity (29), and time (26).
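The frequency counts behind such a word cloud reduce to a simple tally over the free-text responses. A minimal sketch follows; the study's counts came from Qualtrics' built-in visualization, and the stop-word list, tokenizer and sample responses here are illustrative assumptions.

```python
import re
from collections import Counter

# Illustrative stop-word list; a real analysis would use a fuller one.
STOPWORDS = {"the", "a", "an", "and", "or", "but", "is", "are",
             "to", "of", "in", "it", "i", "my", "for"}

def word_frequencies(responses, top_n=8):
    """Count non-stop-words across a list of free-text responses."""
    counts = Counter()
    for text in responses:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in STOPWORDS:
                counts[word] += 1
    return counts.most_common(top_n)

sample = ["Fun makes online learning enjoyable",
          "Tutorials are fun when they take little time"]
print(word_frequencies(sample, top_n=3))
```

Grouping word forms (enjoy/enjoyed/enjoyable/enjoyment), as reported above, would additionally require stemming or a manual merge step.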

Figure 2. The word cloud visualization in Qualtrics about Online Learning and Fun.

The automated thematic analysis map (Figure 3), generated in NVivo 12 and represented in CmapTools, provided 89 codes grouped into seven themes: fun, learning, students, tutorials, material, online and activities, which enabled us to identify the connections between fun and learning presented below.

Figure 3. Thematic analysis map about Online Learning and Fun with codes generated by NVivo 12.

The NVivo 12 sentiment analysis tool (Figure 4) indicated a significant number of neutral and positive comments associated with narratives that included learning and fun. A small percentage of negative and mixed views emerged across all categories apart from the course module "material." The three largest clusters focused on fun, learning and activities; four medium clusters were online, tutorials, fun activities, and students; and two small clusters were material and group.

Figure 4. RRI sentiment analysis about Online Learning and Fun in NVivo 12.

NVivo 12 sentiment analysis was also used to obtain an overview of students' negative views (Figure 5) and positive opinions (Figure 6), which were highlighted in red and green by the authors to show the students' responses with significant narratives.

Figure 5. Sentiment analysis about students’ negative views related to Online Learning and Fun.

Figure 6. Sentiment analysis about students’ positive views related to Online Learning and Fun.

These visualizations were useful to identify two sets of themes and sub-themes (Box 3) related to the value of, and relationships between, learning and fun, as well as to review the automated sentiment codes manually, check nuances and recode them based on the meaning of the narratives.

A total of 206 students' testimonials were coded with these themes, and the frequency of codes was represented by percentages (Box 3). The first set of themes was used to code the value of fun for students: 43% of students indicated positive views about fun in learning, 24% neutral, and 23% mixed; only 10% indicated negative views. The second set of themes was used to explore the value of and relationships between fun and learning: approximately 18% of students indicated that fun is valuable, 12% that fun is important, 13% that fun is useful, 24% that fun is needed, 11% that fun is difficult, 12% that fun depends, and 10% that fun is unnecessary.

Relationships Between Fun and Learning Supported by Quantitative Analysis

Quantitative data analysis (Graph 1) revealed largely positive views about fun and learning. Most students agreed that fun (as enjoyment) had value in supporting learning. The majority agreed with the following statements: "To learn effectively, students must enjoy learning" (98%); "To learn effectively, students must be happy to learn" (91%); "Learning should involve fun" (88.77%). However, a small group of students (16.66%) believed that fun activities can get in the way of student learning.

Graph 1. Descriptive analysis about Online Learning and Fun in Qualtrics.

The questionnaire data for the 21 Likert-scale statements (1–5) were analyzed in SPSS 24. Cronbach's alpha of 0.717 indicated that the instrument was sufficiently reliable for both PCAs (Cohen et al., 2007; Tavakol and Dennick, 2011). The Kaiser-Meyer-Olkin score of 0.756 indicated sampling adequacy, and Bartlett's test of sphericity (Chi-square = 2329.046 with 210 degrees of freedom, p < 0.001) confirmed that the data were suitable for principal component analysis.
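The study computed these statistics in SPSS 24; for illustration, Cronbach's alpha can be derived directly from the item scores. A minimal pure-Python sketch, assuming all items are scored on the same Likert scale and passed as columns (one list of scores per statement):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items answered by the same n respondents.

    items: list of k columns, each a list of one statement's scores.
    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(totals))
    """
    k = len(items)
    n = len(items[0])

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))
```

Values around 0.7, like the 0.717 reported here, are conventionally treated as acceptable internal consistency for exploratory work.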

Table 2 illustrates the factor analysis with principal components, using Varimax rotation and Kaiser normalization, in which six groups emerged: (1) socio-constructivist perspective, (2) traditional perspective, (3) fun and learning perspective, (4) constructivist perspective, (5) banking perspective, and (6) emancipatory learning. Table 1, using the same method with an unrotated solution, indicated three relevant groups: (1) socio-constructivist learning with traditional teaching and fun; (2) banking model, transmissive learning and no fun; and (3) constructivist learning and disturbing fun. The unrotated solution was selected to examine students' views and beliefs in order to develop recommendations. Therefore, based on the testimonies of the students grouped with the unrotated PCA, twenty-one recommendations were listed and grouped for three audiences: students, teaching staff and the online course team. Three indexes were generated using the variables from the PCA to obtain an average for each group related to Fun, No Fun and Bad Fun:

Table 1. FA Varimax without rotation in SPSS.

Table 2. FA with Varimax rotation in SPSS.

• C1 Fun = (V19 + V09 + V03 + V18 + V02 + V05 + V04 + V01 + V08)/9;

• C2 No Fun = (V17 + V07 + V16 + V06 − V21)/5;

• C3 Bad Fun (hampers learning) = (V10 + V20 + V11)/3.

These indexes (scores between 3.5 and 5) allowed us to group participants' testimonies, select a variety of views and elaborate a representative list of recommendations to enhance students' enjoyment of online learning. NVivo 12 was used to carry out a thematic qualitative analysis with an interpretative approach to extract 21 recommendations supported by inductive mapping (Tables 3–5). A consensual review (Hill et al., 1997), through three systematic checks of the recommendations against the qualitative data, was carried out with two experts and a student: individually, in pairs and as a group. Five types of feedback enabled reviewers to suggest improvements: (1) reduce (too long, use short sentences), (2) specify (very broad, use specific words), (3) connect (unrelated, focus more on the data), (4) simplify (complicated, use familiar vocabulary), (5) clarify (confusing, revise the meaning). The results of the mixed-methods analysis are presented as follows.

Table 3. Recommendations about Online Learning and Fun for students supported by mixed methods.

Table 4. Recommendations about Online Learning and Fun for teaching staff supported by mixed methods.

Table 5. Recommendations about Online Learning and Fun for course teams supported by mixed methods.

In addition, the graphical comparison (Graph 2) between the recommendations and the full set of qualitative data, both auto-coded in NVivo 12 (Figure 3), ensured diversity with a variety of views and consistency with a proportional representation among qualitative themes and quantitative components.

Graph 2. Evidence-based recommendations about Online Learning and Fun supported by consensual review.

Discussion and Final Remarks

The value of students' enjoyment of online learning has become fundamental in today's world. The World Bank (2020) and UNESCO (2020) emphasized that more than 160 countries are facing a crisis in education due to the COVID-19 pandemic, with losses in learning and human capital; over the long term, the economic difficulties will increase inequalities. Various factors will affect educational systems, in particular low learning outcomes and high dropout rates in secondary school and higher education.

Students’ confidence and satisfaction with online learning are highly relevant in a world in which distance education has rapidly become a necessary practice in response to the global pandemic. This mixed-methods research revealed significant online students’ opinions about fun for enjoyable and meaningful learning. Fun is an important part of the lived experience; however, its meaning is underexplored in the literature.

This paper provided a methodology to examine fun in online learning supported by students’ epistemic beliefs, underpinned by RRI (Responsible Research and Innovation). A self-reflective instrument with valid and reliable measurement scales covering epistemic constructs of online learning and fun helped participants to reflect on how they believe learning occurs and its relationship with fun. An open database with three sets of coding schemes was generated and shared with all participants during the COVID-19 pandemic.
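
Scale reliability of this kind is conventionally checked with Cronbach’s alpha ( Tavakol and Dennick, 2011 ). A short sketch of that computation, using small invented Likert data rather than the study’s dataset:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) Likert matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented data: 6 respondents answering a 4-item scale fairly consistently,
# so alpha should come out high (close to 1).
scores = np.array([
    [5, 4, 5, 5],
    [2, 2, 1, 2],
    [4, 4, 4, 3],
    [5, 5, 4, 5],
    [1, 2, 2, 1],
    [3, 3, 4, 3],
])
print(round(cronbach_alpha(scores), 2))
```

Values above roughly 0.7 are usually read as acceptable internal consistency, though Tavakol and Dennick (2011) caution against treating any single cut-off as definitive.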

In this study, light is shed on the elements, meanings and relationships between fun and learning, considering the students’ “nuanced views” that integrate fun and learning in different ways. Our results provided evidence that a large majority of higher education students (88.77%) value fun because they believe it has positive social, cognitive and emotional effects on their distance online education. A small group (16.66%) highlighted that fun can impair learning.

This study confirmed that students should experience enjoyable learning, so that learning involves joy. Freire (1996) highlighted that the joy of the “serious act” of learning does not refer to the easy joy of being inactive by doing nothing. “Emancipatory fun” ( Okada and Sheehy, 2020 ), underpinned by Freire’s pedagogy of autonomy, is related to the hope and confidence that students can have fun by acting, reflecting and learning with enjoyment and consciousness. They can search, research and solve problems, identify and overcome obstacles, as well as transform and innovate their lives with knowledge, skills and resilience to shape a desirable future.

A key contribution of this study is that different epistemological beliefs are associated with different conceptualizations of the relationship between fun and learning ( Sheehy et al., 2019a ; Okada and Sheehy, 2020 ). Principal component analysis revealed three groups of students who found (1) fun relevant in socio-constructivist learning, (2) no fun in traditional transmissive learning, and (3) disturbing fun in constructivist learning. A set of 21 recommendations, underpinned by systematic mixed methods and consensual review, is provided for the higher education community, including course teams, teaching staff and students, to enhance online learning experiences with optimal fun, emancipatory fun, collaborative fun and individual fun. Creating opportunities for students to voice and reflect on their own views and values is fundamental to developing more effective online course designs aligned with their needs.
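
The component-extraction step behind such groupings can be sketched with a correlation-matrix eigendecomposition. The responses below are random synthetic Likert data standing in for the survey matrix, so the components carry no substantive meaning; the snippet only illustrates the procedure, not the study’s actual analysis pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic stand-in for the survey matrix (participants x Likert items);
# the real instrument covered transmissive, constructivist and
# socio-constructivist beliefs about learning and fun.
X = rng.integers(1, 6, size=(551, 12)).astype(float)

# PCA via eigendecomposition of the correlation matrix
# (items standardized first, so each contributes unit variance).
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
corr = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]   # largest-variance components first
loadings = eigvecs[:, order[:3]]    # retain three components
component_scores = Z @ loadings     # per-participant component scores

print(component_scores.shape)  # (551, 3)
```

In practice the retained components would be interpreted from the item loadings (which belief items load on which component), and participants grouped by their dominant component score.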

Congruent with the positive effects of optimal experience reported in studies of online environments (e.g., Esteban-Millat et al., 2014 ; Sánchez-Franco et al., 2014 ), this study confirmed that fun creates an opportunity and expectation for students to experience positive feelings in learning, such as good mood, enthusiasm, interest, satisfaction and enjoyment, all of which are relevant for “optimal” learning.

Researchers who see fun as having a close relationship with learning have proposed different types of fun. Lazzaro (2009) highlighted “easy fun” in activities such as games and role play that stimulate curiosity and exploration. Papert (2002) identified “hard fun” within goal-centered and challenging experiences, where the difficulty of the task is part of the fun. Tews et al. (2015, p. 17) examined fun in two contexts: fun in learning activities developed by students, and fun in teaching delivery by staff. The former was characterized as “hands-on” exercises and activities that promoted social engagement between students. The latter concerned instructor-focused teaching that included the use of humor, creative examples, and storytelling. Their findings indicated that fun delivery, but not fun activities, was positively associated with students’ motivation, interest and engagement.

Before examining activities and delivery, our study highlights the importance of investigating students’ epistemic beliefs and their connection with the essence of students’ views. There is therefore an opportunity for novel research to examine the factors and effects of fun within the student learning experience, including the influence of epistemic-guided learning and teaching design.

A series of studies with Indonesian teachers ( Sheehy et al., 2019a ) suggested that their beliefs about how learning occurs are influenced by their views about happiness and, by implication, fun in relation to learning. These teachers often commented on the relationship between happiness and learning, and many saw happiness as an essential feature of good classroom teaching. However, they described a relationship between happiness and learning that was different in nature to that found in Western educational research. There is a tendency for Western educators to see happiness as “a tool for facilitating effective education” ( Fox et al., 2013 , p. 1), and as something that is promoted alongside educational excellence. In contrast, many Indonesian teachers see learning not as separate from happiness but as part of it ( Budiyanto et al., 2017 ; Budiyanto and Sheehy, 2019 ).

Other research has implied that this belief in separation arises when people see teaching as a simple transfer of “untransformed knowledge” from expert to student, in a traditional model of learning ( OECD, 2009 ), also known as the “banking model of education” ( Freire, 2000 ). This separation may be reflected in the balancing act between happiness with fun and academic achievement described in the CEE report mentioned above. In contrast, those who believe that learning is a social constructivist process are more likely to see happiness with fun as important to the process of learning. The situation remains that we have an incomplete understanding of fun in the domain of learning ( Tews et al., 2017 ), and it remains to be clarified by empirical research ( Iten and Petko, 2016 ), in particular under the lens of epistemological beliefs ( Sheehy et al., 2019a ) and practical experiences.

Our study also complemented previous research about fun on a traditional university campus, in which students highlighted that fun in learning must integrate stimulating pedagogy, lecturer engagement, a safe learning space, shared experience, and a low-stress environment ( Whitton and Langan, 2018 ). Some key effects of fun, for example pleasant communication and the creation of a relaxed state that reduces stress ( Bisson and Luckner, 1996 ), are important factors to support learners during isolation. Fun as an inner joy of wellbeing and engagement is an important component to foster learning through the creation of new patterns that are interesting, surprising and meaningful ( Schmidhuber, 2010 ), involving students with formal education during the uncertain post-pandemic period.

As indicated by the research authors and collaborators, further studies based on the RRI approach are important to construct new questions and to explore the issues indicated by preliminary studies ( Okada and Sheehy, 2020 ). The effects of fun on online learning must also be examined considering age, gender, socio-cultural aspects, accessibility, digital skills, and geographical differences. Developing further recommendations at broader institutional, national and international levels about effective and engaging online learning is also important to empower individuals and society to face, innovate and reconstruct a sustainable and enjoyable world.

Data Availability Statement

The open database can be accessed, downloaded and reused: Okada and Sheehy (2020) OLAF PROJECT data set. Open Research Data Online. The Open University. https://doi.org/10.21954/ou.rd.12670949 (accessed November 2020). The open questionnaire can be accessed from the Supplementary Material (Qualtrics Survey OLAF project.pdf).

Ethics Statement

The studies involving human participants were reviewed and approved by The Open University Human Research and Ethics Committee (HREC). The participants provided their written informed consent to participate in this study.

Author Contributions

AO wrote the first draft of the abstract and prepared the manuscript. KS provided the instrument and feedback on the final version. AO was responsible for the survey implementation in Qualtrics, data generation, instrument testing, data analysis through mixed methods, and validation supported by collaborators with consensual review. Additionally, AO created the figures, graphs, and tables. Both authors contributed to manuscript revision, and read and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Funding

This study was funded by the Open University UK and is part of the international project OLAF – Online Learning and Fun. http://www.open.ac.uk/blogs/rumpus/index.php/projects/olaf/ .

Acknowledgments

We are grateful to our collaborators who supported the recruitment of participants, our expert colleagues Prof. Dr. Daniela Melaré Barros; Prof. Dr. Maria Elizabeth de Almeida; Dr. Victoria Cooper, and Miss Ana Beatriz Rocha who provided valuable feedback and our external reviewers for useful suggestions.

Arnone, M. P., Small, R. V., Chauncey, S. A., and McKenna, H. P. (2011). Curiosity, interest and engagement in technology-pervasive learning environments: a new research agenda. Educ. Technol. Res. Dev. 59, 181–198. doi: 10.1007/s11423-011-9190-9

Beckman, J. (2014). American Fun: Four Centuries of Joyous Revolt. New York, NY: Knopf Doubleday Publishing Group.

Bisson, C., and Luckner, J. (1996). Fun in learning: the pedagogical role of fun in adventure education. J. Exp. Educ. 19, 108–112. doi: 10.1177/105382599601900208

Blythe, M., and Hassenzahl, M. (2018). The semantics of fun: differentiating enjoyable experiences. Funology 2, 375–387. doi: 10.1007/978-3-319-68213-6_24

Budiyanto, and Sheehy, K. (2019). “Developing Signalong Indonesia: issues of politics, pedagogy and perceptions,” in Manual Sign Acquisition by Children with Developmental Disabilities , eds N. Grove and K. Launonen (Hauppauge, NY: Nova Science).

Budiyanto, Sheehy, K., Kaye, H., and Rofiah, K. (2017). Developing Signalong Indonesia: issues of happiness and pedagogy, training and stigmatisation. Int. J. Inclusive Educ. 22, 543–559. doi: 10.1080/13603116.2017.1390000

Chan, K. W., and Elliott, R. G. (2004). Relational analysis of personal epistemology and conceptions about teaching and learning. Teach. Teach. Edu. 20, 817–831. doi: 10.1016/j.tate.2004.09.002

Chu, S. L., Angello, G., Saenz, M., and Quek, F. (2017). Fun in Making: understanding the experience of fun and learning through curriculum-based Making in the elementary school classroom. Entertain. Comput. 18, 31–40. doi: 10.1016/j.entcom.2016.08.007

Cohen, L., Manion, L., and Morrison, K. (2007). Research Methods in Education , Sixth Edn. Abingdon: Routledge.

Cooperman, L. (2014). “Foreword,” in Open Educational Resources and Social Networks: Co-Learning and Professional Development , ed. A. Okada (London: Scholio Educational Research & Publishing).

Crosnoe, R., Johnson, M. K., and Elder, G. H. Jr. (2004). Intergenerational bonding in school: The behavioural and contextual correlates of student-teacher relationships. Sociol. Educ. 77, 60–81. doi: 10.1177/003804070407700103

Csikszentmihalyi, M. (2015). The Systems Model of Creativity: The Collected works of Mihaly Csikszentmihalyi. Springer.

Csikszentmihalyi, M. (2020). Finding Flow: The Psychology of Engagement with Everyday Life. Hachette.

De Freitas, S., and Oliver, M. (2006). How can exploratory learning with games and simulations within the curriculum be most effectively evaluated? Comput. Educ. 46, 249–264. doi: 10.1016/j.compedu.2005.11.007

Dismore, H., and Bailey, R. (2011). Fun and enjoyment in physical education: young people’s attitudes. Res. Papers Educ. 26, 499–516. doi: 10.1080/02671522.2010.484866

Elton-Chalcraft, S., and Mills, K. (2015). Measuring challenge, fun and sterility on a ‘phunometre’ scale: evaluating creative teaching and learning with children and their student teachers in the primary school. Education 3-13 43, 482–497. doi: 10.1080/03004279.2013.822904

Esteban-Millat, I., Martínez-López, F. J., Huertas-García, R., Meseguer, A., and Rodríguez-Ardura, I. (2014). Modelling students’ flow experiences in an online learning environment. Comput. Educ. 71, 111–123. doi: 10.1016/j.compedu.2013.09.012

Etymonline. Etymological Dictionary . Available online at: https://www.etymonline.com/word/fun (accessed May 12, 2020).

European Commission (2013). Options for Strengthening Responsible Research and Innovation-Report of the Expert Group on the State of Art in Europe on Responsible Research and Innovation. Luxembourg: European Commission.

European Commission (2020). Responsible Research and Innovation. Available online at: https://ec.europa.eu/programmes/horizon2020/en/h2020-section/responsible-research-innovation (accessed July 10, 2020).

Feldberg, H. R. (2011). S’more then Just Fun and Games: Teachers’ Perceptions on the Educational Value of Camp Programs for School Groups. Master’s thesis, University of Waterloo, Waterloo.

Feucht, F. C., Lunn Brownlee, J., and Schraw, G. (2017). Moving beyond reflection: reflexivity and epistemic cognition in teaching and teacher education. Educ. Psychol. 52, 234–241. doi: 10.1080/00461520.2017.1350180

Fine, G., and Corte, U. (2017). Group pleasures: collaborative commitments, shared narrative, and the sociology of fun. Sociol. Theory 35, 64–86. doi: 10.1177/0735275117692836

Fox, E., Jennifer, M., Proctor, C., and Ashley, M. (2013). “Happiness in the classroom,” in Oxford Handbook of Happiness , eds A. C. Ayers, I. Boniwell, and S. David (Oxford: Oxford University Press).

Francis, N., and Kentel, J. (2008). The fun factor: adolescents’ self−regulated leisure activity and the implications for practitioners and researchers. Leisure/Loisir 32, 65–90. doi: 10.1080/14927713.2008.9651400

Freire, P. (1967). Papel da educação na humanização. Série Artigos.

Freire, P. (1984). Ação cultural para a liberdade , 7 Edn. Rio de Janeiro: Paz e Terra.

Freire, P. (1985). Pedagogia do oprimido , 14 Edn. Rio de Janeiro: Paz e Terra.

Freire, P. (1996). Pedagogia da autonomia: saberes necessários à prática educativa , 9 Edn. São Paulo: Paz e Terra.

Freire, P. (2000). Pedagogy of freedom: Ethics, democracy, and civic courage. Lanham, MD: Rowman & Littlefield Publishers.

Freire, P. (2009). Pedagogia da esperança: um reencontro com a pedagogia do oprimido. 16. ed. Rio de Janeiro: Paz e Terra.

Fu, F. L., Su, R. C., and Yu, S. C. (2009). EGameFlow: a scale to measure learners’ enjoyment of e-learning games. Comput. Educ. 52, 101–112. doi: 10.1016/j.compedu.2008.07.004

Garn, A. C., and Cothran, D. J. (2006). The fun factor in physical education. J. Teach. Phys. Educ. 25, 281–297. doi: 10.1123/jtpe.25.3.281

Glaveanu, V. P. (2011). Children and Creativity: a Most (Un)Likely Pair? Think. Skills Creat. 6, 122–131. doi: 10.1016/j.tsc.2011.03.002

Gortan, A., and Jereb, E. (2007). The dropout rate from e-learning courses and the satisfaction of students with e-learning. Organizacija 40.

Hardman, M., and Worthington, J. (2000). Educational Psychologists’ orientation to inclusion and assumptions about children’s learning. Educ. Psychol. Pract. 16, 349–360. doi: 10.1080/02667360020006417

Harmston, G. (2005). Sources of Enjoyment in Physical Activity Among Children and Adolescents. Los Angeles: California State University.

Higher Education Academy (HEA) (2015). Framework for Student Access, RETENTION, Attainment and Progression in Higher Education. Available online at: https://www.heacademy.ac.uk/system/files/downloads/studentaccess-retentionattainment-progression-in-he.pdf (accessed October 2020).

Hill, C. E., Thompson, B. J., and Williams, E. N. (1997). A guide to conducting consensual qualitative research. Couns. Psychol. 25, 517–572. doi: 10.1177/0011000097254001

Iten, N., and Petko, D. (2016). Learning with serious games: is fun playing the game a predictor of learning success? Br. J. Educ. Technol. 47, 151–163. doi: 10.1111/bjet.12226

Kimiecik, J. C., and Harris, A. T. (1996). What is enjoyment? A conceptual/definitional analysis with implications for sport and exercise psychology. J. Sport Exerc. Psychol. 18, 247–263. doi: 10.1123/jsep.20.3.247

Knight, S., Rienties, B., Littleton, K., Mitsui, M., Tempelaar, D., and Shah, C. (2017). The relationship of (perceived) epistemic cognition to interaction with resources on the internet. Comput. Human Behav. 73, 507–518. doi: 10.1016/j.chb.2017.04.014

Koster, R. (2005). Theory of Fun for Game Design. Scottsdale, AZ: Paraglyph Press.

Lazzaro, N. (2009). “Why we play: affect and the fun of games,” in Human-computer interaction: Designing for diverse users and domains , eds A. Sears and J. A. Jacko (Boca Raton, FLA: CRC Press).

Lee, J., Zhang, Z., Song, H., and Huang, X. (2013). Effects of epistemological and pedagogical beliefs on the instructional practices of teachers: a Chinese Perspective. Aust. J. Teach. Educ. 38, 119–146.

Lesser, L. M., Wall, A., Carver, R., Pearl, D. K., Martin, N., Kuiper, S., et al. (2013). Using fun in the statistics classroom: an exploratory study of college instructors’ hesitations and motivations. J. Stat. Educ. 21, 1–33.

McManus, I. C., and Furnham, A. (2010). “Fun, fun, fun”: types of fun, attitudes to fun, and their relation to personality and biographical factors. Psychology 1:159. doi: 10.4236/psych.2010.13021

Okada, A. (2020). Distance education: do students believe it should be fun? Available online at: https://www.open.edu/openlearn/education-development/learning/distance-education-do-students-believe-it-should-be-fun (accessed April 23, 2020).

Okada, A., and Sheehy, K. (2020). The value of fun in online learning: a study supported by responsible research and innovation and open data. Revista e-Curriculum 18, 319–343.

Okada, A., and Sherborne, T. (2018). Equipping the next generation for responsible research and innovation with open educational resources, open courses, open communities and open schooling: an impact case study in Brazil. J. Interact. Media Educ. 1, 1–15.

Organisation for Economic Co-operation and Development (OECD) (2010). Talis Technical Report. Available online at: http://www.oecd.org/education/school/44978960.pdf (accessed October 2020).

Organisation for Economic Co-operation and Development (OECD) (2013). Teaching and Learning International Survey TALIS 2013 Conceptual Framework. Available online at: http://www.oecd.org/edu/school/talis-2013-results.htm (accessed June 25, 2014).

Papert, S. (2002). Hard Fun. Bangor Daily News. Bangor. Available online at: http://www.papert.org/articles/HardFun.html (accessed July 10, 2020).

Piaget, J., and Inhelder, B. (1969). The Psychology of the Child. Basic Books.

Prouty, D. (2002). Courage, compassion, creativity: project adventure at thirty. Zip Lines Voice Adventure Educ. 44, 6–12.

Richardson, J. T. E. (2013). Epistemological development in higher education. Educ. Res. Rev. 9, 191–206.

Rodriguez, L., and Cano, F. (2007). The learning approaches and epistemological beliefs of university students: a cross-sectional and longitudinal study. Stud. High. Educ. 32, 647–667. doi: 10.1080/03075070701573807

RRI-Tools (2016). Self-Reflection tool. Available online at: https://www.rri-tools.eu/self-reflection-tool (accessed July 10, 2020).

Sánchez-Franco, M. J., Peral-Peral, B., and Villarejo-Ramos, ÁF. (2014). Users’ intrinsic and extrinsic drivers to use a web-based educational environment. Comput. Educ. 74, 81–97. doi: 10.1016/j.compedu.2014.02.001

Schmidhuber, J. (2010). Formal theory of creativity, fun, and intrinsic motivation (1990–2010). IEEE Trans. Auton. Ment. Dev. 2, 230–247. doi: 10.1109/tamd.2010.2056368

Schommer, M. (1990). Effects of beliefs about the nature of knowledge on comprehension. J. Educ. Psychol. 82, 498–504. doi: 10.1037/0022-0663.82.3.498

Schon, D. (1987). Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions. San Francisco: Jossey-Bass.

Sheehy, K., and Budiyanto (2015). The pedagogic beliefs of Indonesian teachers in inclusive schools. Int. J. Disability Dev. Educ. 62, 469–485. doi: 10.1080/1034912X.2015.1061109

Sheehy, K., Budiyanto, Kaye, H., and Rofiah, K. (2019a). Indonesian teachers’ epistemological beliefs and inclusive education. J. Intellect. Disabil. 23, 39–56. doi: 10.1177/1744629517717613

Sheehy, K., Kasule, G. W., and Chamberlain, L. (2019b). Ugandan teachers epistemological beliefs and child-led research: implications for developing inclusive educational practice. Int. J. Disabil. Dev. Educ. (Early Access). doi: 10.1080/1034912X.2019.1699647

Tavakol, M., and Dennick, R. (2011). Making sense of Cronbach’s alpha. Int. J. Med. Educ. 2, 53–55. doi: 10.5116/ijme.4dfb.8dfd

Tews, M. J., Jackson, K., Ramsay, C., and Michel, J. W. (2015). Fun in the college classroom: examining its nature and relationship with student engagement. Coll. Teach. 63, 16–26. doi: 10.1080/87567555.2014.972318

Tews, M. J., Michel, J. W., and Noe, R. A. (2017). Does fun promote learning? The relationship between fun in the workplace and informal learning. J. Vocat. Behav. 98, 46–55. doi: 10.1016/j.jvb.2016.09.006

Turner, C. (2018). Making lessons fun does not help children to learn, new report finds. Available online at: https://www.telegraph.co.uk/education/2018/11/14/making-lessons-fun-does-not-help-children-learn-new-report-finds/ (accessed November 14, 2018).

UNESCO (2020). COVID-19 Educational Disruption and Response. Paris: UNESCO.

Ungar, M. (2007). Too Safe for their Own Good: How Risk and Responsibility Helpteens Thrive. Toronto, ON: McClelland & Stewart.

Virvou, M., Katsionis, G., and Manos, K. (2005). Combining software games with education: evaluation of its educational effectiveness. Educ. Technol. Soc. 8, 54–65.

von Schomberg, R. (2013). “A Vision of Responsible Research and Innovation,” in Responsible Innovation . Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society , eds R. Owen, J. Bessant, and M. Heintz (Hoboken, NJ: John Wiley & Sons), 51–74. doi: 10.1002/9781118551424.ch3

Vygotsky, L. S. (1978). Mind in Society: The Development of Higher Psychological Processes. Cambridge, MA: Harvard University Press.

Whitton, N., and Langan, M. (2018). Fun and games in higher education: an analysis of UK student perspectives. Teach. High. Educ. 24, 1000–1013. doi: 10.1080/13562517.2018.1541885

World Bank (2020). The COVID-19 Pandemic: Shocks to Education and Policy Responses. Washington, DC: World Bank. Available online at: https://openknowledge.worldbank.org/handle/10986/33696 (accessed October 2020).

Xavier, M., and Meneses, J. (2020). Dropout in Online Higher Education: A Scoping Review from 2014 to 2018. Barcelona: eLearn Center, Universitat Oberta de Catalunya.

Keywords : COVID-19, online learning, fun, higher education, academic performance, epistemic views, responsible research and innovation, recommendations

Citation: Okada A and Sheehy K (2020) Factors and Recommendations to Support Students’ Enjoyment of Online Learning With Fun: A Mixed Method Study During COVID-19. Front. Educ. 5:584351. doi: 10.3389/feduc.2020.584351

Received: 17 July 2020; Accepted: 13 October 2020; Published: 11 December 2020.

Copyright © 2020 Okada and Sheehy. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Alexandra Okada, [email protected]

This article is part of the Research Topic

COVID-19 and the Educational Response: New Educational and Social Realities

  • Research article
  • Open access
  • Published: 02 December 2020

Integrating students’ perspectives about online learning: a hierarchy of factors

  • Montgomery Van Wart 1 ,
  • Anna Ni 1 ,
  • Pamela Medina 1 ,
  • Jesus Canelon 1 ,
  • Melika Kordrostami 1 ,
  • Jing Zhang 1 &

International Journal of Educational Technology in Higher Education volume  17 , Article number:  53 ( 2020 ) Cite this article

149k Accesses

50 Citations

24 Altmetric

Metrics details

This article reports on a large-scale ( n  = 987), exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students’ perspective, and then determines their hierarchical significance. Seven factors--Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence--were identified as significant and reliable. Regression analysis indicates the minimal factors for enrollment in future classes—when students consider convenience and scheduling—were Basic Online Modality, Cognitive Presence, and Online Social Comfort. Students who accepted or embraced online courses on their own merits wanted a minimum of Basic Online Modality, Teaching Presence, Cognitive Presence, Online Social Comfort, and Social Presence. Students, who preferred face-to-face classes and demanded a comparable experience, valued Online Interactive Modality and Instructional Support more highly. Recommendations for online course design, policy, and future research are provided.

Introduction

While there are different perspectives of the learning process such as learning achievement and faculty perspectives, students’ perspectives are especially critical since they are ultimately the raison d’être of the educational endeavor (Chickering & Gamson, 1987 ). More pragmatically, students’ perspectives provide invaluable, first-hand insights into their experiences and expectations (Dawson et al., 2019 ). The student perspective is especially important when new teaching approaches are used and when new technologies are being introduced (Arthur, 2009 ; Crews & Butterfield, 2014 ; Van Wart, Ni, Ready, Shayo, & Court, 2020 ). With the renewed interest in “active” education in general (Arruabarrena, Sánchez, Blanco, et al., 2019 ; Kay, MacDonald, & DiGiuseppe, 2019 ; Nouri, 2016 ; Vlachopoulos & Makri, 2017 ) and the flipped classroom approach in particular (Flores, del-Arco, & Silva, 2016 ; Gong, Yang, & Cai, 2020 ; Lundin, et al., 2018 ; Maycock, 2019 ; McGivney-Burelle, 2013 ; O’Flaherty & Phillips, 2015 ; Tucker , 2012 ) along with extraordinary shifts in the technology, the student perspective on online education is profoundly important. What shapes students’ perceptions of quality integrate are their own sense of learning achievement, satisfaction with the support they receive, technical proficiency of the process, intellectual and emotional stimulation, comfort with the process, and sense of learning community. The factors that students perceive as quality online teaching, however, has not been as clear as it might be for at least two reasons.

First, it is important to note that the overall online learning experience for students is also composed of non-teaching factors which we briefly mention. Three such factors are (1) convenience, (2) learner characteristics and readiness, and (3) antecedent conditions that may foster teaching quality but are not directly responsible for it. (1) Convenience is an enormous non-quality factor for students (Artino, 2010 ) which has driven up online demand around the world (Fidalgo, Thormann, Kulyk, et al., 2020 ; Inside Higher Education and Gallup, 2019 ; Legon & Garrett, 2019 ; Ortagus, 2017 ). This is important since satisfaction with online classes is frequently somewhat lower than face-to-face classes (Macon, 2011 ). However, the literature generally supports the relative equivalence of face-to-face and online modes regarding learning achievement criteria (Bernard et al., 2004 ; Nguyen, 2015 ; Ni, 2013 ; Sitzmann, Kraiger, Stewart, & Wisher, 2006 ; see Xu & Jaggars, 2014 for an alternate perspective). These contrasts are exemplified in a recent study of business students, in which online students using a flipped classroom approach outperformed their face-to-face peers, but ironically rated instructor performance lower (Harjoto, 2017 ). (2) Learner characteristics also affect the experience related to self-regulation in an active learning model, comfort with technology, and age, among others,which affect both receptiveness and readiness of online instruction. (Alqurashi, 2016 ; Cohen & Baruth, 2017 ; Kintu, Zhu, & Kagambe, 2017 ; Kuo, Walker, Schroder, & Belland, 2013 ; Ventura & Moscoloni, 2015 ) (3) Finally, numerous antecedent factors may lead to improved instruction, but are not themselves directly perceived by students such as instructor training (Brinkley-Etzkorn, 2018 ), and the sources of faculty motivation (e.g., incentives, recognition, social influence, and voluntariness) (Wingo, Ivankova, & Moss, 2017 ). 
Important as these factors are, mixing them with the perceptions of quality tends to obfuscate the quality factors directly perceived by students.

Second, while student perceptions of quality are used in innumerable studies, our overall understanding still needs to integrate them more holistically. Many studies use student perceptions of quality and overall effectiveness of individual tools and strategies in online contexts such as mobile devices (Drew & Mann, 2018 ), small groups (Choi, Land, & Turgeon, 2005 ), journals (Nair, Tay, & Koh, 2013 ), simulations (Vlachopoulos & Makri, 2017 ), video (Lange & Costley, 2020 ), etc. Such studies, however, cannot provide the overall context and comparative importance. Some studies have examined the overall learning experience of students with exploratory lists, but have mixed non-quality factors with quality of teaching factors making it difficult to discern the instructor’s versus contextual roles in quality (e.g., Asoodar, Vaezi, & Izanloo, 2016 ; Bollinger & Martindale, 2004 ; Farrell & Brunton, 2020 ; Hong, 2002 ; Song, Singleton, Hill, & Koh, 2004 ; Sun, Tsai, Finger, Chen, & Yeh, 2008 ). The application of technology adoption studies also fall into this category by essentially aggregating all teaching quality in the single category of performance ( Al-Gahtani, 2016 ; Artino, 2010 ). Some studies have used high-level teaching-oriented models, primarily the Community of Inquiry model (le Roux & Nagel, 2018 ), but empirical support has been mixed (Arbaugh et al., 2008 ); and its elegance (i.e., relying on only three factors) has not provided much insight to practitioners (Anderson, 2016 ; Cleveland-Innes & Campbell, 2012 ).

Research questions

Although the number of empirical studies of student perceptions of quality factors has increased, integration of these studies and the concepts they explore remains fragmented and confusing. It is important to have an empirical view of what students value in a single comprehensive study and, also, to know whether there is a hierarchy of factors, ranging from students who are least to most critical of the online learning experience. This research study has two research questions.

The first research question is: What are the significant factors in creating a high-quality online learning experience from students' perspectives? This is important to know because it should have a significant effect on instructors' design of online classes. The goal of this research question is to identify a more articulated and empirically supported set of factors capturing the full range of student expectations.

The second research question is: Is there a priority or hierarchy of factors related to students’ perceptions of online teaching quality that relate to their decisions to enroll in online classes? For example, is it possible to distinguish which factors are critical for enrollment decisions when students are primarily motivated by convenience and scheduling flexibility (minimum threshold)? Do these factors differ from students with a genuine acceptance of the general quality of online courses (a moderate threshold)? What are the factors that are important for the students who are the most critical of online course delivery (highest threshold)?

This article next reviews the literature on online education quality, focusing on the student perspective, and reviews the seven factors derived from it. The research methods section discusses the study structure and methods. Demographic data related to the sample are presented next, followed by the results, discussion, and conclusion.

Literature review

Online education is much discussed (Prinsloo, 2016 ; Van Wart et al., 2019 ; Zawacki-Richter & Naidu, 2016 ), but its perception is substantially influenced by where you stand and what you value (Otter et al., 2013 ; Tanner, Noser, & Totaro, 2009 ). Accrediting bodies care about meeting technical standards, proof of effectiveness, and consistency (Grandzol & Grandzol, 2006 ). Institutions care about reputation, rigor, student satisfaction, and institutional efficiency (Jung, 2011 ). Faculty care about subject coverage, student participation, faculty satisfaction, and faculty workload (Horvitz, Beach, Anderson, & Xia, 2015 ; Mansbach & Austin, 2018 ). For their part, students care about learning achievement (Marks, Sibley, & Arbaugh, 2005 ; O’Neill & Sai, 2014 ; Shen, Cho, Tsai, & Marra, 2013 ), but also view online education as a function of their enjoyment of classes, instructor capability and responsiveness, and comfort in the learning environment (e.g., Asoodar et al., 2016 ; Sebastianelli, Swift, & Tamimi, 2015 ). It is this last perspective, of students, upon which we focus.

It is important to note that students do not sign up for online classes solely based on perceived quality. Perceptions of quality derive from notions of what online learning can offer at its best, relative to both learning achievement and satisfaction/enjoyment, and from perceptions about how likely classes are to live up to those expectations. Students also sign up because of convenience and flexibility, and because of personal notions of suitability about learning. Convenience and flexibility are enormous drivers of online registration (Lee, Stringer, & Du, 2017; Mann & Henneberry, 2012). Even when students say they prefer face-to-face classes to online, many enroll in online classes and re-enroll in the future if the experience meets minimum expectations. This study examines the threshold expectations of students when they are considering taking online classes.

When discussing students' perceptions of quality, there is little clarity about the actual range of concepts, because no integrated empirical studies exist comparing the major factors found throughout the literature. Rather, there are practitioner-generated lists of micro-competencies, such as the Quality Matters consortium for higher education (Quality Matters, 2018), or broad frameworks encompassing many aspects of quality beyond teaching (Open and Distant Learning Quality Council, 2012). While checklists are useful for practitioners and accreditation processes, they do not provide robust theoretical bases for scholarly development. Overarching frameworks are heuristically useful, but less so for pragmatic purposes or theory building. The most prominent theoretical framework used in the online literature is the Community of Inquiry (CoI) model (Arbaugh et al., 2008; Garrison, Anderson, & Archer, 2003), which divides instruction into teaching, cognitive, and social presence. As with many deductive theories, however, the supportive evidence is mixed (Rourke & Kanuka, 2009), especially regarding the importance of social presence (Annand, 2011; Armellini & De Stefani, 2016). Conceptually, the problem is not so much with the narrow articulation of cognitive or social presence: cognitive presence is how the instructor provides opportunities for students to interact with material in robust, thought-provoking ways, and social presence refers to building a community of learning that incorporates student-to-student interactions. Teaching presence, however, includes everything else the instructor does: structuring the course, providing lectures, explaining assignments, creating rehearsal opportunities, supplying tests, grading, answering questions, and so on. These challenges become even more prominent in the online context.
While the lecture as a single medium is paramount in face-to-face classes, it fades as the primary vehicle in online classes with increased use of detailed syllabi, electronic announcements, recorded and synchronous lectures, 24/7 communications related to student questions, etc. Amassing the pedagogical and technological elements related to teaching under a single concept provides little insight.

In addition to the CoI model, numerous concepts are suggested in single-factor empirical studies focusing on quality from a student's perspective, with overlapping conceptualizations and nonstandardized naming conventions. Seven distinct factors are derived here from the literature on student perceptions of online quality: Instructional Support, Teaching Presence, Basic Online Modality, Social Presence, Online Social Comfort, Cognitive Presence, and Interactive Online Modality.

Instructional support

Instructional Support refers to students' perceptions of the techniques used by the instructor for input, rehearsal, feedback, and evaluation. Specifically, this entails providing detailed instructions, designed use of multimedia, and a balance between repetitive class features (for ease of use) and varied techniques (to prevent boredom). Instructional Support is often included as an element of Teaching Presence, but is also labeled “structure” (Lee & Rha, 2009; So & Brush, 2008) and instructor facilitation (Eom, Wen, & Ashill, 2006). A prime example of the difference between face-to-face and online education is the extensive use of the “flipped classroom” (Maycock, 2019; Wang, Huang, & Schunn, 2019), in which students move to rehearsal activities faster and more frequently than in traditional classrooms, with less instructor lecture (Jung, 2011; Martin, Wang, & Sadaf, 2018). Instructional Support has been consistently supported as an element of student perceptions of quality (Espasa & Meneses, 2010).

Teaching presence

Teaching Presence refers to students' perceptions about the quality of communication in lectures, directions, and individual feedback, including encouragement (Jaggars & Xu, 2016; Marks et al., 2005). Specifically, instructor communication is clear, focused, and encouraging, and instructor feedback is customized and timely. If Instructional Support covers what an instructor plans before the course begins and then carries out, then Teaching Presence is what the instructor does while the class is conducted and in response to specific circumstances. For example, a course could be well designed but poorly delivered because the instructor is distracted, or a course could be poorly designed but an instructor might make up for the deficit by spending time and energy on elaborate communications and ad hoc teaching techniques. Teaching Presence is especially important for student satisfaction (Sebastianelli et al., 2015; Young, 2006) and is also referred to as instructor presence (Asoodar et al., 2016), learner-instructor interaction (Marks et al., 2005), and staff support (Jung, 2011). As with Instructional Support, it has been consistently supported as an element of student perceptions of quality.

Basic online modality

Basic Online Modality refers to the competent use of basic online class tools: online grading, navigation methods, the online grade book, and the announcements function. It is frequently clumped with instructional quality (Artino, 2010), service quality (Mohammadi, 2015), instructor expertise in e-teaching (Paechter, Maier, & Macher, 2010), and similar terms. As a narrowly defined concept, it is sometimes called technology (Asoodar et al., 2016; Bollinger & Martindale, 2004; Sun et al., 2008). The only empirical study that did not find Basic Online Modality (as technology) significant was Sun et al. (2008). Because Basic Online Modality is addressed with basic instructor training, some studies assert the importance of training (e.g., Asoodar et al., 2016).

Social presence

Social Presence refers to students’ perceptions of the quality of student-to-student interaction. Social Presence focuses on the quality of shared learning and collaboration among students, such as in threaded discussion responses (Garrison et al., 2003 ; Kehrwald, 2008 ). Much emphasized but challenged in the CoI literature (Rourke & Kanuka, 2009 ), it has mixed support in the online literature. While some studies found Social Presence or related concepts to be significant (e.g., Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Eom et al., 2006 ; Richardson, Maeda, Lv, & Caskurlu, 2017 ), others found Social Presence insignificant (Joo, Lim, & Kim, 2011 ; So & Brush, 2008 ; Sun et al., 2008 ).

Online social comfort

Online Social Comfort refers to the instructor’s ability to provide an environment in which anxiety is low, and students feel comfortable interacting even when expressing opposing viewpoints. While numerous studies have examined anxiety (e.g., Liaw & Huang, 2013 ; Otter et al., 2013 ; Sun et al., 2008 ), only one found anxiety insignificant (Asoodar et al., 2016 ); many others have not examined the concept.

Cognitive presence

Cognitive Presence refers to the engagement of students such that they perceive they are stimulated by the material and instructor to reflect deeply and critically, and to seek to understand different perspectives (Garrison et al., 2003). The instructor provides instructional materials and facilitates an environment that piques interest, is reflective, and enhances inclusiveness of perspectives (Durabi, Arrastia, Nelson, Cornille, & Liang, 2011). Cognitive Presence includes enhancing the applicability of material for students' potential or current careers. Cognitive Presence is supported as significant in many online studies (e.g., Artino, 2010; Asoodar et al., 2016; Joo et al., 2011; Marks et al., 2005; Sebastianelli et al., 2015; Sun et al., 2008). Further, while many instructors perceive that cognitive presence is diminished in online settings, neuroscientific studies indicate this need not be the case (Takamine, 2017). While numerous studies have not examined Cognitive Presence, this review found no studies in which it was insignificant for students.

Interactive online modality

Interactive Online Modality refers to the “high-end” usage of online functionality. That is, the instructor uses interactive online class tools—video lectures, videoconferencing, and small group discussions—well. It is often included in concepts such as instructional quality (Artino, 2010 ; Asoodar et al., 2016 ; Mohammadi, 2015 ; Otter et al., 2013 ; Paechter et al., 2010 ) or engagement (Clayton, Blumberg, & Anthony, 2018 ). While individual methods have been investigated (e.g. Durabi et al., 2011 ), high-end engagement methods have not.

Other independent variables affecting perceptions of quality include age, undergraduate versus graduate status, gender, ethnicity/race, discipline, educational motivation of students, and previous online experience. While age effects have been found to be small or insignificant, more notable effects have been reported at the level of study, with graduate students reporting higher “success” (Macon, 2011) and community college students having greater difficulty with online classes (Legon & Garrett, 2019; Xu & Jaggars, 2014). Ethnicity and race effects have also been small or insignificant. Some situational variations and student preferences can be captured by paying attention to disciplinary differences (Arbaugh, 2005; Macon, 2011). Motivation levels of students have been reported to be significant in completion and achievement, with better students doing as well across face-to-face and online modes, and weaker students having greater completion and achievement challenges (Clayton et al., 2018; Lu & Lemonde, 2013).

Research methods

To examine the various quality factors, we apply a critical success factor methodology, initially introduced in schools of business research in the 1970s. In 1981, Rockhart and Bullen codified an approach embodying principles of critical success factors (CSFs) as a way to identify the information needs of executives, detailing steps for the collection and analysis of data to create a set of organizational CSFs (Rockhart & Bullen, 1981). CSFs describe the underlying or guiding principles which must be incorporated to ensure success.

Utilizing this methodology, CSFs in the context of this paper define key areas of instruction and design essential for an online class to be successful from a student’s perspective. Instructors implicitly know and consider these areas when setting up an online class and designing and directing activities and tasks important to achieving learning goals. CSFs make explicit those things good instructors may intuitively know and (should) do to enhance student learning. When made explicit, CSFs not only confirm the knowledge of successful instructors, but tap their intuition to guide and direct the accomplishment of quality instruction for entire programs. In addition, CSFs are linked with goals and objectives, helping generate a small number of truly important matters an instructor should focus attention on to achieve different thresholds of online success.

After a comprehensive literature review, an instrument was created to measure students' perceptions about the importance of techniques and indicators leading to quality online classes. Items were designed to capture the major factors in the literature. The instrument was piloted during the 2017–18 academic year with a 397-student sample, facilitating an exploratory factor analysis that led to important preliminary findings (reference withheld for review). Based on the pilot, survey items were added and refined to include seven groups of quality teaching factors and two groups of items related to students' overall acceptance of online classes, as well as a variable on their future online class enrollment. Demographic information was gathered to determine the effects of age, year in program, major, distance from university, number of online classes taken, high school experience with online classes, and communication preferences on students' levels of acceptance of online classes.

This paper draws evidence from a sample of students enrolled in educational programs at Jack H. Brown College of Business and Public Administration (JHBC), California State University San Bernardino (CSUSB). The JHBC offers a wide range of online courses for undergraduate and graduate programs. To ensure comparable learning outcomes, online classes and face-to-face classes of a certain subject are similar in size—undergraduate classes are generally capped at 60 and graduate classes at 30, and often taught by the same instructors. Students sometimes have the option to choose between both face-to-face and online modes of learning.

A Qualtrics survey link was sent out by 11 instructors to students who were unlikely to be cross-enrolled in classes during the 2018–19 academic year. 1 Approximately 2500 students were contacted, with some instructors providing class time to complete the anonymous survey. All students, whether they had taken an online class or not, were encouraged to respond. Nine hundred eighty-seven students responded, representing a 40% response rate. Although drawn from a single business school, it is a broad sample representing students from several disciplines—management, accounting and finance, marketing, information decision sciences, and public administration, as well as both graduate and undergraduate programs of study.

The sample skews young, with 78% of students under 30. It includes almost no lower-division students (i.e., freshmen and sophomores), 73% upper-division students (i.e., juniors and seniors), and 24% graduate students (master's level). Only 17% reported having taken a hybrid or online class in high school. There was a wide range of exposure to university-level online courses, with 47% reporting having taken one to four classes, and 21% reporting no online class experience. Reflecting a Hispanic-serving institution, 54% of respondents self-identified as Latino, 18% White, and 13% Asian and Pacific Islander. The five largest majors were accounting & finance (25%), management (21%), master of public administration (16%), marketing (12%), and information decision sciences (10%). Seventy-four percent work full- or part-time. See Table  1 for demographic data.

Measures and procedure

To increase the reliability of evaluation scores, composite evaluation variables were formed after an exploratory factor analysis of individual evaluation items. A principal component method with Quartimin (oblique) rotation was applied to explore the factor structure of student perceptions of online teaching CSFs. Items with loading coefficients greater than .30 were retained, a commonly accepted threshold in factor analysis. A simple least-squares regression analysis was applied to test the significance levels of factors on students' impressions of online classes.
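The extraction step of such an analysis can be sketched in a few lines. The sketch below is purely illustrative and uses synthetic data, not the survey responses; it shows unrotated principal component loadings of an item correlation matrix with a |loading| ≥ .30 retention rule. The Quartimin oblique rotation applied in the study (available in dedicated factor-analysis packages) is omitted here.

```python
import numpy as np

# Synthetic illustration: 200 respondents, 6 items driven by 2 latent factors.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
items = np.hstack([latent[:, [0]] + 0.5 * rng.normal(size=(200, 3)),
                   latent[:, [1]] + 0.5 * rng.normal(size=(200, 3))])

R = np.corrcoef(items, rowvar=False)              # item correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)              # symmetric eigendecomposition
order = np.argsort(eigvals)[::-1]                 # largest eigenvalues first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = int(np.sum(eigvals > 1.0))                    # Kaiser criterion for factor count
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])  # unrotated component loadings
retained = np.abs(loadings).max(axis=1) >= 0.30   # keep items loading >= .30
explained = eigvals[:k].sum() / eigvals.sum()     # proportion of variance explained
```

In the study's reported results, the analogous quantity to `explained` was 67% across seven factors; here the synthetic two-factor structure is recovered directly from the eigenvalues of the correlation matrix.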

Exploratory factor constructs

Using a threshold loading of 0.3 for items, 37 items loaded on seven factors. All factors were logically consistent. The first factor, with eight items, was labeled Teaching Presence. Items included providing clear instructions, staying on task, clear deadlines, and customized feedback on strengths and weaknesses. Teaching Presence items all related to instructor involvement during the course as a director, monitor, and learning facilitator. The second factor, with seven items, aligned with Cognitive Presence. Items included stimulating curiosity, opportunities for reflection, helping students construct explanations posed in online courses, and the applicability of material. The third factor, with six items, aligned with Social Presence, defined as providing student-to-student learning opportunities. Items included getting to know course participants for a sense of belonging, forming impressions of other students, and interacting with others. The fourth factor, with six new items as well as two (“interaction with other students” and “a sense of community in the class”) shared with the third factor, was Instructional Support, which related to the instructor's role in providing students a cohesive learning experience. Items included providing sufficient rehearsal, structured feedback, techniques for communication, a navigation guide, a detailed syllabus, and coordinating student interaction and creating a sense of online community. This factor also included enthusiasm, which students generally interpreted as a robustly designed course rather than animation in a traditional lecture. The fifth factor was labeled Basic Online Modality and focused on the basic technological requirements for a functional online course. Three items included allowing students to make online submissions, use of online gradebooks, and online grading.
A fourth item was the use of online quizzes, viewed by students as mechanical practice opportunities rather than small tests, and a fifth was navigation, a key component of Basic Online Modality. The sixth factor, loading on four items, was labeled Online Social Comfort. Items here included comfort discussing ideas online, comfort disagreeing, developing a sense of collaboration via discussion, and considering online communication an excellent medium for social interaction. The final factor was called Interactive Online Modality because it included items for “richer” communications or interactions, whether one- or two-way. Items included videoconferencing, instructor-generated videos, and small group discussions. Taken together, these seven factors explained 67% of the variance, which is within the acceptable range in social science research for a robust model (Hair, Black, Babin, & Anderson, 2014). See Table  2 for the full list.

To test factor reliability, Cronbach's alpha was calculated for each variable. All produced values greater than 0.7, the standard threshold for reliability, except system trust, which was therefore dropped. To gauge students' sense of factor importance, item responses were mean-averaged within each factor. Factor means (lower means indicating higher importance to students) ranged from 1.5 to 2.6 on a 5-point scale. Basic Online Modality was most important, followed by Instructional Support and Teaching Presence. Students deemed Cognitive Presence, Online Social Comfort, and Interactive Online Modality less important. The least important for this sample was Social Presence. Table  3 arrays the critical success factor means, standard deviations, and Cronbach alphas.
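The reliability and importance statistics described above are straightforward to compute. A minimal sketch, using hypothetical 5-point Likert responses rather than the study data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses of five students to a 3-item factor (not study data)
scores = np.array([[1, 2, 1],
                   [2, 2, 3],
                   [4, 5, 4],
                   [3, 3, 3],
                   [5, 4, 5]], dtype=float)

alpha = cronbach_alpha(scores)   # > 0.7 indicates acceptable reliability
factor_mean = scores.mean()      # factor importance score, as in the study
```

A factor whose alpha falls below the 0.7 threshold, like the system trust items in the study, would be dropped from further analysis.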

To determine whether particular subgroups of respondents viewed factors differently, a series of ANOVAs were conducted using factor means as dependent variables. Six demographic variables were used as independent variables: graduate vs. undergraduate, age, work status, ethnicity, discipline, and past online experience. To determine strength of association of the independent variables to each of the seven CSFs, eta squared was calculated for each ANOVA. Eta squared indicates the proportion of variance in the dependent variable explained by the independent variable. Eta squared values greater than .01, .06, and .14 are conventionally interpreted as small, medium, and large effect sizes, respectively (Green & Salkind, 2003 ). Table  4 summarizes the eta squared values for the ANOVA tests with Eta squared values less than .01 omitted.
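Eta squared from a one-way ANOVA is simply the between-group sum of squares divided by the total sum of squares. A minimal sketch with hypothetical group data (the group labels and values are illustrative, not drawn from the study):

```python
import numpy as np

def eta_squared(groups: list) -> float:
    """Eta squared for a one-way ANOVA: SS_between / SS_total."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    return ss_between / ss_total

# Hypothetical factor-importance ratings for three work-status groups
full_time = np.array([1.4, 1.6, 1.5, 1.7])
part_time = np.array([1.8, 2.0, 1.9, 2.1])
non_working = np.array([1.9, 2.1, 2.0, 2.2])

# By the conventional cutoffs (.01 small, .06 medium, .14 large), this
# contrived example yields a large effect size.
eta2 = eta_squared([full_time, part_time, non_working])
```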

While no significant differences in factor means occurred among students in different disciplines in the College, all five other independent variables had some small effect on some or all CSFs. Graduate students tended to rate Interactive Online Modality, Instructional Support, Teaching Presence, and Cognitive Presence higher than undergraduates. Older students valued Interactive Online Modality more. Full-time working students rated all factors, except Online Social Comfort, slightly higher than part-timers and non-working students. Latino and White students rated Basic Online Modality and Instructional Support higher; Asian and Pacific Islander students rated Social Presence higher. Students who had taken more online classes rated all factors higher.

In addition to factor scores, two variables were constructed to identify the resultant impressions, labeled online experience. Both were logically consistent, with a Cronbach's α greater than 0.75. The first variable, with six items, labeled “online acceptance,” included items such as “I enjoy online learning,” “My overall impression of hybrid/online learning is very good,” and “the instructors of online/hybrid classes are generally responsive.” The second variable was labeled “face-to-face preference” and combined four items, including enjoying, learning, and communicating more in face-to-face classes, as well as perceiving greater fairness and equity there. In addition to these two constructed variables, a one-item variable was also used subsequently in the regression analysis: “online enrollment.” That question asked: if hybrid/online classes are well taught and available, how much of your entire course selection going forward would online education make up?

Regression results

As noted above, two constructed variables and one item were used as dependent variables for purposes of regression analysis. They were online acceptance, F2F preference, and the selection of online classes. In addition to seven quality-of-teaching factors identified by factor analysis, control variables included level of education (graduate versus undergraduate), age, ethnicity, work status, distance to university, and number of online/hybrid classes taken in the past. See Table  5 .
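The regression step amounts to ordinary least squares with factor scores and controls as predictors. The sketch below is a synthetic illustration of that setup; the predictor names and coefficient values are assumptions chosen for the demo, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300

# Hypothetical standardized factor scores and one control variable
teaching_presence = rng.normal(size=n)
cognitive_presence = rng.normal(size=n)
n_online_taken = rng.integers(0, 10, size=n).astype(float)  # control

# Simulated outcome: acceptance rises with each predictor plus small noise
acceptance = (0.5 * teaching_presence + 0.3 * cognitive_presence
              + 0.2 * n_online_taken + rng.normal(scale=0.1, size=n))

# Design matrix with an intercept column; OLS via least squares
X = np.column_stack([np.ones(n), teaching_presence,
                     cognitive_presence, n_online_taken])
coefs, *_ = np.linalg.lstsq(X, acceptance, rcond=None)
```

With enough respondents and modest noise, the estimated coefficients closely recover the simulated effects, which is the logic behind reading the sign and significance of each factor in Tables 6, 7, and 8.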

When eta squared values from the ANOVAs were examined for the control factors, only one approached a medium effect: graduate versus undergraduate status had a .05 effect (close to medium) on Interactive Online Modality, meaning graduate students were more sensitive to interactive modality than undergraduates. Multiple regression analyses of critical success factors and online impressions were conducted to compare the conditions under which factors were significant. The only consistently significant control factor was the number of online classes taken: the more classes students had taken online, the more inclined they were to take future classes. Level of program, age, ethnicity, and working status did not significantly affect students' choice or overall acceptance of online classes.

The least restrictive condition was online enrollment (Table  6 ). That is, students might not feel online courses were ideal, but because of convenience and scheduling might enroll in them if minimum threshold expectations were met. When considering online enrollment three factors were significant and positive (at the 0.1 level): Basic Online Modality, Cognitive Presence, and Online Social Comfort. These least-demanding students expected classes to have basic technological functionality, provide good opportunities for knowledge acquisition, and provide comfortable interaction in small groups. Students who demand good Instructional Support (e.g., rehearsal opportunities, standardized feedback, clear syllabus) are less likely to enroll.

Online acceptance was more restrictive (see Table  7 ). This variable captured the idea that students not only enrolled in online classes out of necessity, but with an appreciation of the positive attributes of online instruction, which balanced the negative aspects. When this standard was applied, students expected not only Basic Online Modality, Cognitive Presence, and Online Social Comfort, but expected their instructors to be highly engaged virtually as the course progressed (Teaching Presence), and to create strong student-to-student dynamics (Social Presence). Students who rated Instructional Support higher are less accepting of online classes.

Another restrictive condition was catering to the needs of students who preferred face-to-face classes (see Table  8 ). That is, they preferred face-to-face classes even when online classes were well taught. Unlike students more accepting of, or more likely to enroll in, online classes, this group rated Instructional Support as critical to enrolling, rather than as a negative factor when absent. Again different from the other two groups, these students demanded appropriate interactive mechanisms (Interactive Online Modality) to enable richer communication (e.g., videoconferencing). Student-to-student collaboration (Social Presence) was also significant. This group also rated Cognitive Presence and Online Social Comfort as significant, but only in their absence. That is, these students were most attached to direct interaction with the instructor and other students rather than to specific teaching methods. Interestingly, Basic Online Modality and Teaching Presence were not significant. Our interpretation is that this student group, the most critical of online classes because of the loss of physical interaction, is beyond being concerned with mechanical technical interaction and demands higher levels of interactivity and instructional sophistication.

Discussion and study limitations

Some past studies have used robust empirical methods to identify a single factor or a small number of factors related to quality from a student's perspective, but have not sought to be relatively comprehensive. Others have used a longer series of itemized factors, but with less robust methods, and have not tied those factors back to the literature. This study has used the literature to develop a relatively comprehensive list of items focused on quality teaching in a single rigorous protocol. While a beta test had identified five coherent factors, substantial changes to the current survey sharpened the focus on quality factors rather than antecedent factors, as well as better articulating the array of factors often lumped under the mantle of “teaching presence.” In addition, the study has examined these factors based on threshold expectations: from minimal, such as when flexibility is the driving consideration, to modest, such as when students want a “good” online class, to high, when students demand an interactive virtual experience equivalent to face-to-face.

Exploratory factor analysis identified seven factors that were reliable, coherent, and significant under different conditions. In students' overall sense of importance, they are, in order: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Interactive Online Modality, and Social Presence. Students are most concerned with the basics of a course first, that is, technological functionality and instructor competence. Next they want engagement and virtual comfort. Social Presence, while valued, is the least critical from this overall perspective.

The factor analysis is quite consistent with the range of factors identified in the literature, pointing to the fact that students can differentiate among aspects of what have been clumped together as larger concepts, such as teaching presence. Essentially, the instructor's role in quality can be divided into her/his command of basic online functionality, good design, and good presence during the class. The instructor's command of basic functionality is paramount. Because so much of an online class must be built in advance, the quality of the class design is rated more highly than the instructor's role in facilitating the class. Taken as a whole, the instructor's role in traditional teaching elements is primary, as we would expect it to be. Cognitive Presence, especially the pertinence of the instructional material and its applicability to student interests, has always been found significant when studied, and was highly rated here as well, loading as a single factor. Finally, the degree to which students feel comfortable with the online environment and enjoy the learner-to-learner aspect has received less support in empirical studies; it was found significant here, but was rated lowest among the quality factors by students.

Regression analysis paints a more nuanced picture, depending on student focus. It also helps explain some of the heterogeneity of previous studies, depending on what the dependent variables were. If convenience and scheduling are critical and students are less demanding, the minimum requirements are Basic Online Modality, Cognitive Presence, and Online Social Comfort. That is, students expect an instructor who knows how to use the online platform, delivers useful information, and provides a comfortable learning environment. However, while they do not expect a poorly designed course, they do not expect much in terms of quality teaching presence, learner-to-learner interaction, or interactive teaching.

When students are signing up for critical classes, or when they have both F2F and online options, they hold a higher standard. That is, they expect not only the factors required for noncritical classes but also good Teaching Presence and Social Presence. Students who simply need a class may be willing to teach themselves a bit more, but students who want a good class expect a highly present instructor in terms of responsiveness and immediacy. “Good” classes must not only create a comfortable atmosphere but, in social science classes at least, must also provide strong learner-to-learner interactions. At the time of the research, most students believed that you can have a good class without high interactivity via pre-recorded video and videoconference. That may or may not change over time as the various video media become easier to use, more reliable, and more commonplace.

The most demanding students are those who prefer F2F classes because of learning-style preferences, poor past experiences, or both. Such students seem to assume that a worthwhile online class has basic functionality and a strong instructor presence. They are also critical of any absence of Cognitive Presence and Online Social Comfort, and they want strong Instructional Support and Social Presence. But in addition, and uniquely, they expect Interactive Online Modality, which provides the greatest possible verisimilitude to the traditional classroom. More than the other two groups, these students crave human interaction in the learning process, both with the instructor and with other students.

These findings shed light on the possible ramifications of the COVID-19 aftermath. Many universities around the world jumped from relatively low levels of online instruction at the beginning of spring 2020 to nearly 100% by mandate by the end of the spring term. The question becomes: what will happen after the mandate is removed? Will demand return to pre-crisis levels, increase modestly, or skyrocket? Time will be the best judge, but the findings here suggest that the ability and interest of instructors and institutions to “rise to the occasion” with quality teaching will have as much effect on demand as students becoming more acclimated to online learning. If, in the rush to get classes online, many students experience shoddy basic functional competence, poor instructional design, sporadic teaching presence, and poorly implemented cognitive and social elements, they may be quite willing to return to the traditional classroom. If faculty, and the institutions supporting them, are able to increase the quality of classes despite time pressures, then most students may be interested in more hybrid and fully online classes. If instructors are able to introduce high-quality interactive teaching, nearly the entire student population will be interested in more online classes. Of course, students will have a variety of experiences, but this analysis suggests that the instructors, departments, and institutions that put greater effort into the temporary adjustment (and resist it less) will be substantially more likely to see increases in demand beyond the modest national trajectory of the last decade or so.

There are several study limitations. First, the study does not include a sample of non-respondents, who may have a somewhat different profile. Second, the study draws from a single college at a single university; the profile derived here may vary significantly by type of student. Third, some survey statements may have led respondents to rate quality based on their experience rather than to assess the general importance of online course elements; “I felt comfortable participating in the course discussions,” for example, could be revised to “comfort in participating in course discussions.” The authors judged differences among subgroups (e.g., among majors) to be small and statistically insignificant. However, it is possible that differences between, say, biology and marketing students would be significant, leading the factors to be ordered differently. Emphasis and ordering might also vary at a community college versus a research-oriented university (Gonzalez, 2009).

Availability of data and materials

We will make the data available.

Al-Gahtani, S. S. (2016). Empirical investigation of e-learning acceptance and assimilation: A structural equation model. Applied Computing and Informatics, 12, 27–50.


Alqurashi, E. (2016). Self-efficacy in online learning environments: A literature review. Contemporary Issues in Education Research, 9(1), 45–52.

Anderson, T. (2016). A fourth presence for the Community of Inquiry model? Retrieved from https://virtualcanuck.ca/2016/01/04/a-fourth-presence-for-the-community-of-inquiry-model/ .

Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning , 12 (5), 40.

Arbaugh, J. B. (2005). How much does “subject matter” matter? A study of disciplinary effects in on-line MBA courses. Academy of Management Learning & Education , 4 (1), 57–73.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education , 11 , 133–136.

Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology , 47 (6), 1202–1216.

Arruabarrena, R., Sánchez, A., Blanco, J. M., et al. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education , 16 , #10.

Arthur, L. (2009). From performativity to professionalism: Lecturers’ responses to student feedback. Teaching in Higher Education , 14 (4), 441–454.

Artino, A. R. (2010). Online or face-to-face learning? Exploring the personal factors that predict students’ choice of instructional format. Internet and Higher Education , 13 , 272–276.

Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior , 63 , 704–716.

Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research , 74 (3), 379–439.

Bollinger, D., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61–67.

Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on teaching effectiveness through a TPACK lens. The Internet and Higher Education , 38 , 28–35.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin , 3 , 7.

Choi, I., Land, S. M., & Turgeon, A. J. (2005). Scaffolding peer-questioning strategies to facilitate metacognition during online small group discussion. Instructional Science , 33 , 483–511.

Clayton, K. E., Blumberg, F. C., & Anthony, J. A. (2018). Linkages between course status, perceived course value, and students’ preferences for traditional versus non-traditional learning environments. Computers & Education , 125 , 175–181.

Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and Distributed Learning , 13 (4), 269–292.

Cohen, A., & Baruth, O. (2017). Personality, learning, and satisfaction in fully online academic courses. Computers in Human Behavior , 72 , 1–12.

Crews, T., & Butterfield, J. (2014). Data for flipped classroom design: Using student feedback to identify the best components from online and face-to-face classes. Higher Education Studies , 4 (3), 38–47.

Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education , 44 (1), 25–36.

Drew, C., & Mann, A. (2018). Unfitting, uncomfortable, unacademic: A sociological reading of an interactive mobile phone app in university lectures. International Journal of Educational Technology in Higher Education , 15 , #43.

Durabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning , 27 (3), 216–227.

Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education , 4 (2), 215–235.

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education , 59 (3), 277–292.

Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in Higher Education, 17, #25.

Fidalgo, P., Thormann, J., Kulyk, O., et al. (2020). Students’ perceptions on distance education: A multinational study. International Journal of Educational Technology in Higher Education, 17, #18.

Flores, Ò., del-Arco, I., & Silva, P. (2016). The flipped classroom model at the university: Analysis based on professors’ and students’ assessment in the educational field. International Journal of Educational Technology in Higher Education , 13 , #21.

Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education , 1 , 113–127.

Gong, D., Yang, H. H., & Cai, J. (2020). Exploring the key influencing factors on college students’ computational thinking skills through flipped-classroom instruction. International Journal of Educational Technology in Higher Education , 17 , #19.

Gonzalez, C. (2009). Conceptions of, and approaches to, teaching online: A study of lecturers teaching postgraduate distance courses. Higher Education , 57 (3), 299–314.

Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business Education. International Review of Research in Open and Distance Learning , 7 (1), 1–18.

Green, S. B., & Salkind, N. J. (2003). Using SPSS: Analyzing and understanding data (3rd ed.). Upper Saddle River: Prentice Hall.

Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis: Pearson new international edition . Essex: Pearson Education Limited.

Harjoto, M. A. (2017). Blended versus face-to-face: Evidence from a graduate corporate finance class. Journal of Education for Business , 92 (3), 129–137.

Hong, K.-S. (2002). Relationships between students’ instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education , 5 , 267–281.

Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovation Higher Education , 40 , 305–316.

Inside Higher Education and Gallup. (2019). The 2019 survey of faculty attitudes on technology. Author.

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers and Education , 95 , 270–284.

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictor in a structural model. Computers & Education , 57 (2), 1654–1664.

Jung, I. (2011). The dimensions of e-learning quality: From the learner’s perspective. Educational Technology Research and Development , 59 (4), 445–464.

Kay, R., MacDonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education , 31 , 449–471.

Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education , 29 (1), 89–106.

Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education , 14 , #7.

Kuo, Y.-C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2013). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Higher Education, 20, 35–50.

Lange, C., & Costley, J. (2020). Improving online video lectures: Learning challenges created by media. International Journal of Educational Technology in Higher Education , 17 , #16.

le Roux, I., & Nagel, L. (2018). Seeking the best blend for deep learning in a flipped classroom – Viewing student perceptions through the Community of Inquiry lens. International Journal of Educational Technology in Higher Education, 15, #16.

Lee, H.-J., & Rha, I. (2009). Influence of structure and interaction on student achievement and satisfaction in web-based distance learning. Educational Technology & Society , 12 (4), 372–382.

Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal , 9 (2), 97–102.

Legon, R., & Garrett, R. (2019). CHLOE 3: Behind the numbers . Published online by Quality Matters and Eduventures. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-3-Report-2019-Behind-the-Numbers.pdf

Liaw, S.-S., & Huang, H.-M. (2013). Perceived satisfaction, perceived usefulness and interactive learning environments as predictors of self-regulation in e-learning environments. Computers & Education , 60 (1), 14–24.

Lu, F., & Lemonde, M. (2013). A comparison of online versus face-to-face students teaching delivery in statistics instruction for undergraduate health science students. Advances in Health Science Education , 18 , 963–973.

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1).

Macon, D. K. (2011). Student satisfaction with online courses versus traditional courses: A meta-analysis. Dissertation: Northcentral University, CA.

Mann, J., & Henneberry, S. (2012). What characteristics of college students influence their decisions to select online courses? Online Journal of Distance Learning Administration , 15 (5), 1–14.

Mansbach, J., & Austin, A. E. (2018). Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education , 43 (4), 257–272.

Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education , 29 (4), 531–563.

Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education , 37 , 52–65.

Maycock, K. W. (2019). Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning , 35 , 121–126.

McGivney-Burelle, J. (2013). Flipping Calculus. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies, 23(5), 477–486.

Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior , 45 , 359–374.

Nair, S. S., Tay, L. Y., & Koh, J. H. L. (2013). Students’ motivation and teachers’ teaching practices towards the use of blogs for writing of online journals. Educational Media International , 50 (2), 108–119.

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching , 11 (2), 309–319.

Ni, A. Y. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education , 19 (2), 199–215.

Nouri, J. (2016). The flipped classroom: For active, effective and increased learning – Especially for low achievers. International Journal of Educational Technology in Higher Education , 13 , #33.

O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education , 68 (1), 1–14.

O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95.

Open & Distance Learning Quality Council (2012). ODLQC standards. England: Author. https://www.odlqc.org.uk/odlqc-standards .

Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education , 32 , 47–57.

Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education , 19 , 27–35.

Paechter, M., Maier, B., & Macher, D. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 , 292–329.

Prinsloo, P. (2016). (re)considering distance education: Exploring its relevance, sustainability and value contribution. Distance Education , 37 (2), 139–145.

Quality Matters (2018). Specific review standards from the QM Higher Education Rubric (6th ed.). MD: MarylandOnline.

Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior , 71 , 402–417.

Rockhart, J. F., & Bullen, C. V. (1981). A primer on critical success factors . Cambridge: Center for Information Systems Research, Massachusetts Institute of Technology.

Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A Review of the Literature. The Journal of Distance Education / Revue de l’éducation à distance, 23(1), 19–48. Athabasca University Press. Retrieved August 2, 2020 from https://www.learntechlib.org/p/105542/ .

Sebastianelli, R., Swift, C., & Tamimi, N. (2015). Factors affecting perceived learning, satisfaction, and quality in the online MBA: A structural equation modeling approach. Journal of Education for Business , 90 (6), 296–305.

Shen, D., Cho, M.-H., Tsai, C.-L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet and Higher Education , 19 , 10–17.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology , 59 (3), 623–664.

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education , 51 (1), 318–336.

Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education , 7 (1), 59–70.

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education , 50 (4), 1183–1202.

Takamine, K. (2017). Michelle D. Miller: Minds Online: Teaching Effectively with Technology. Higher Education, 73, 789–791.

Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education , 20 (1), 29.

Tucker, B. (2012). The flipped classroom. Education Next , 12 (1), 82–83.

Van Wart, M., Ni, A., Ready, D., Shayo, C., & Court, J. (2020). Factors leading to online learner satisfaction. Business Education Innovation Journal, 12(1), 15–24.

Van Wart, M., Ni, A., Rose, L., McWeeney, T., & Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific Journal of Business Research, 10(1), 1–22.

Ventura, A. C., & Moscoloni, N. (2015). Learning styles and disciplinary differences: A cross-sectional study of undergraduate students. International Journal of Learning and Teaching , 1 (2), 88–93.

Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education , 14 , #22.

Wang, Y., Huang, X., & Schunn, C. D. (2019). Redesigning flipped classrooms: A learning model and its effects on student perceptions. Higher Education , 78 , 711–728.

Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning , 21 (1), 15–35.

Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education , 85 (5), 633–659.

Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education , 20 (2), 65–77.

Zawacki-Richter, O., & Naidu, S. (2016). Mapping research trends from 35 years of publications in distance Education. Distance Education , 37 (3), 245–269.


Acknowledgements

No external funding.

Author information

Authors and affiliations

Development for the JHB College of Business and Public Administration, 5500 University Parkway, San Bernardino, California, 92407, USA

Montgomery Van Wart, Anna Ni, Pamela Medina, Jesus Canelon, Melika Kordrostami, Jing Zhang & Yu Liu


Contributions

Equal. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Montgomery Van Wart.

Ethics declarations

Competing interests.

We have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Van Wart, M., Ni, A., Medina, P. et al. Integrating students’ perspectives about online learning: a hierarchy of factors. Int J Educ Technol High Educ 17, 53 (2020). https://doi.org/10.1186/s41239-020-00229-8


Received: 29 April 2020

Accepted: 30 July 2020

Published: 02 December 2020

DOI: https://doi.org/10.1186/s41239-020-00229-8


  • Online education
  • Online teaching
  • Student perceptions
  • Online quality
  • Student presence


7 High-Impact, Evidence-Based Tips for Online Teaching

What do highly effective teachers do in online classrooms? We combed through dozens of studies to find the best research-backed ideas.


When online classes exploded in popularity a decade ago, the U.S. Department of Education embarked on an ambitious project: Researchers pored through more than a thousand studies to determine whether students in online classrooms do worse than, as well as, or better than those receiving face-to-face instruction. They discovered that on average, “students in online learning conditions performed modestly better than those receiving face-to-face instruction.”

But there was a significant caveat: It wasn’t the technology that mattered. In fact, many studies have found that technology actually hinders learning when deployed in a way that doesn’t take advantage of the medium. All too often, for example, teachers would take a face-to-face lesson and replicate it online, a costly though understandable approach that rarely led to improvements. The key question for the Department of Education researchers was whether an online activity served as “a replacement for face-to-face instruction or as an enhancement of the face-to-face learning experience.”

“This finding suggests that the positive effects associated with blended learning should not be attributed to the medium,” the researchers wrote. Online teaching required specialized knowledge, an understanding of the strategies that would allow teachers to adapt technology to suit their pedagogical needs—not the other way around.

Yet the large-scale disruption caused by the pandemic forced millions of teachers to quickly adapt to online teaching, often with little training and preparation. “I feel like a first-year teacher again, only worse,” Justin Lopez-Cardoze, a seventh-grade science teacher, told the Washington Post.

So how can teachers enhance the learning experience in online classrooms? We looked over all the research we've read about online learning to find seven high-impact, evidence-based strategies that every teacher should know.

1. Your Virtual Classroom Is a Real Learning Space—Keep It Organized

“Students value strong course organization,” explain Swapna Kumar and her colleagues in a 2019 study. They point out that teachers who are new to online instruction are often too focused on content—converting their lectures, presentations, and worksheets into digital format—leaving course design as a secondary consideration.

While “novice instructors have subject-matter expertise, it’s the design that falls short,” Kumar points out, explaining that novice teachers often “don’t know how to organize their materials or set up a design that makes sense” to students.

When students see a well-organized virtual classroom, they’re more engaged, more confident, and more autonomous, says Sarah Schroeder, an associate professor at the University of Cincinnati. And students who encounter messy online learning environments actually project that judgment onto the teacher; they conclude that the teacher is disorganized more generally.

Here are a few simple tips for organizing your virtual classroom:

  • Have a single, dedicated hub where students can go every day to find their assignments and other crucial announcements.
  • Create and articulate the simplest communications plan you can. For example, it may be that students can reach you via text during working hours, and via email after school.
  • Consider holding “learn your technology” days with your class to walk through common use cases, like submitting work or signing on to synchronous lessons.
  • Make an extra effort to be clear and concise in your directions, and consider making a short daily video summarizing the day’s objectives. When writing, avoid the dreaded “wall of text” and use numbered lists and short paragraphs with subheadings.
  • Get rid of visual clutter. This includes hard-to-read fonts and unnecessary decorations or images.

2. Chunk Your Lessons Into Smaller, Digestible Pieces

In a 2010 study, researchers examined how well high school students learned from an online science curriculum and concluded that on average, online materials “require high mental effort” to process. “Working memory capacity is limited, and a learner can only deal with a few concepts simultaneously,” the researchers explain.

What would normally be a 30-minute activity in a face-to-face classroom should be much shorter in the virtual one. Instead of recording an entire lecture, consider creating several smaller ones, each covering a single key idea. The ideal duration for an instructional video, according to a 2014 study, is about 6 minutes, and researchers recorded steep drop-offs in attention after 9 minutes.

In order to give students additional time to process the material, alternate high- and low-intensity activities, and incorporate brain breaks regularly throughout the school day.

3. The Best Online Teachers Solicit Lots of Feedback

When you’re standing face-to-face with your students, you can usually tell when a lesson’s working. If students are riveted, their eyes light up and their brains are in overdrive. But in a virtual classroom, much of that information is lost.

That’s why the authors of a 2019 study that sought to identify the methods of the best online teachers say that you should regularly “gather student feedback on various aspects of...online courses” in order to identify “what was working or not.”

Formative assessment focuses on how well students understand the material; according to the researchers, it’s crucial that you also gauge how well students can access your virtual materials. Most teachers and students are newbies in virtual classrooms, and serious communication and process-oriented issues can go undetected—and fester. Consider using student surveys administered via simple tools like Google Forms to ask questions such as: Are you having any technical problems? Are you able to quickly find and submit your work? Is this virtual classroom easy to navigate?

4. Annotate and Interject to Scaffold Learning

If you’re standing in your classroom and you want students to pay attention to something—perhaps a location on a map or information on a slide—you can use gestures to direct students’ attention. But that context can be hard to reproduce online.

To compensate, use simple annotations like arrows and text labels to provide “visual scaffolding and help direct the users' attention to those aspects that are important in learning materials and help guide learners' cognitive processes,” say the authors of a 2020 study. The researchers demonstrated that students who were shown maps with visual and text cues, like arrows and labels identifying key locations, scored 35 percent higher on a recall test than those exposed to maps with no cues.

Also, strategically interject questions into an instructional video at key points to check for understanding. Questions that prompt critical thinking, like “Can you think of any exceptions to this rule?” or that probe for comprehension, like “How do you determine momentum from measures of mass and velocity?” not only keep the lesson lively but promote deeper engagement with the material and allow you to assess learning, according to a 2018 study.

5. Frequent, Low-Stakes Quizzes Are Easy to Do, and Highly Effective

Low- and no-stakes practice tests enhance retention of the material—and students who struggle the most benefit the most from weekly practice quizzes, according to a recent meta-analysis . While online quizzes don’t provide a greater benefit than paper ones, they can be automatically graded, saving hours of work.

You can use popular tools like Kahoot and Quizlet to create online quizzes that are not only fun, but also help students re-process and retain the material better. If you want to boost engagement even further, you can create a Jeopardy! board to gamify your quizzes.

6. Fight the Isolation of Remote Learning by Connecting With Your Students

You’re not just physically separated from your students. As classrooms move online, the psychological and emotional distance also increases, eroding the critical social context that is fertile soil for learning, according to a 2016 study . You’ll need to make special efforts to create a sense of community in your virtual classroom.

“To offset the isolating effects of an online class, teachers can strive to communicate more regularly and more informally with students,” writes Jason Dockter, a professor of English at Lincoln Land Community College in the study. The goal isn’t just to address academic issues, but to demonstrate “that the teacher is personally interested and invested in each student.”

John Thomas, an elementary school teacher, uses daily morning meetings , which can be done both synchronously and asynchronously, to check in with his students. Using Seesaw, he records a greeting that students can respond to and builds in “interactive, engaging activities designed to help our students learn more about themselves and their classmates”—such as sharing a favorite book or the family pet.

Beyond morning meetings, you can adapt many face-to-face activities to work in virtual classrooms:

  • Use unstructured time to chat at the beginning of class.
  • Try Zoom's "waiting room" feature to welcome kids to class one by one.
  • Use breakout rooms to split students into small groups for show-and-tell, two truths and a lie, or other relationship-building exercises .
  • At the end of the day, ask students to reflect on their learning with discussion prompts or a closing activity like appreciation, apology, or aha!
  • Pose fun questions like “What’s your favorite movie?” in your all-class video tool, or on digital whiteboards like Jamboard or Padlet , and have students share out.

7. Take Care of Yourself

You’re not alone: teacher well-being has experienced a “steep decline” in recent months, with 71% of teachers reporting lower morale than before the pandemic. As the adage goes, “You can’t serve from an empty cup.” If we want our students to succeed, we need to ensure that our teachers are taken care of. Not only is teacher stress contagious , resulting in higher stress levels for students, but it also translates into poorer academic performance for those students.

“In order for any of us to provide that safe, stable, and nurturing environment for the children that we serve, we have to practice self-care so that we can be available,” said Dr. Nadine Burke Harris, a pediatrician and California’s first surgeon general, in a recent interview with Edutopia. “Please make sure to put your own oxygen mask on and practice real care for yourself so that you can be there for the next generation.”


  • Open access
  • Published: 09 January 2024

Online vs in-person learning in higher education: effects on student achievement and recommendations for leadership

  • Bandar N. Alarifi &
  • Steve Song

Humanities and Social Sciences Communications volume 11, Article number: 86 (2024)


  • Science, technology and society

This study is a comparative analysis of online distance learning and traditional in-person education at King Saud University in Saudi Arabia, with a focus on understanding how different educational modalities affect student achievement. The justification for this study lies in the rapid shift towards online learning, especially highlighted by the educational changes during the COVID-19 pandemic. By analyzing the final test scores of freshman students in five core courses over the 2020 (in-person) and 2021 (online) academic years, the research provides empirical insights into the efficacy of online versus traditional education. Initial observations suggested that students in online settings scored lower in most courses. However, after adjusting for variables like gender, class size, and admission scores using multiple linear regression, a more nuanced picture emerged. Three courses showed better performance in the 2021 online cohort, one favored the 2020 in-person group, and one was unaffected by the teaching format. The study emphasizes the crucial need for a nuanced, data-driven strategy in integrating online learning within higher education systems. It brings to light the fact that the success of educational methodologies is highly contingent on specific contextual factors. This finding advocates for educational administrators and policymakers to exercise careful and informed judgment when adopting online learning modalities. It encourages them to thoroughly evaluate how different subjects and instructional approaches might interact with online formats, considering the variable effects these might have on learning outcomes. This approach ensures that decisions about implementing online education are made with a comprehensive understanding of its diverse and context-specific impacts, aiming to optimize educational effectiveness and student success.


Introduction

The year 2020 marked an extraordinary period, characterized by the global disruption caused by the COVID-19 pandemic. Governments and institutions worldwide had to adapt to unforeseen challenges across various domains, including health, economy, and education. In response, many educational institutions quickly transitioned to distance teaching (also known as e-learning, online learning, or virtual classrooms) to ensure continued access to education for their students. However, despite this rapid and widespread shift to online learning, its effects on student achievement relative to traditional in-person instruction remain largely unexamined.

In research examining student outcomes in the context of online learning, the prevailing trend is the consistent observation that online learners often achieve less favorable results when compared to their peers in traditional classroom settings (e.g., Fischer et al., 2020 ; Bettinger et al., 2017 ; Edvardsson and Oskarsson, 2008 ). However, it is important to note that a significant portion of research on online learning has primarily focused on its potential impact (Kuhfeld et al., 2020 ; Azevedo et al., 2020 ; Di Pietro et al., 2020 ) or explored various perspectives (Aucejo et al., 2020 ; Radha et al., 2020 ) concerning distance education. These studies have often omitted a comprehensive and nuanced examination of its concrete academic consequences, particularly in terms of test scores and grades.

Given the dearth of research on the academic impact of online learning, especially in light of Covid-19 in the educational arena, the present study aims to address that gap by assessing the effectiveness of distance learning compared to in-person teaching in five required freshmen-level courses at King Saud University, Saudi Arabia. To accomplish this objective, the current study compared the final exam results of 8297 freshman students who were enrolled in the five courses in person in 2020 to their 8425 first-year counterparts who had taken the same courses at the same institution in 2021 but in an online format.

The final test results of the five courses (i.e., University Skills 101, Entrepreneurship 101, Computer Skills 101, Computer Skills 102, and Fitness and Health Culture 101) were examined, accounting for potential confounding factors such as gender, class size and admission scores, which have been cited in past research to be correlated with student achievement (e.g., Meinck and Brese, 2019; Jepsen, 2015). Additionally, as the preparatory year at King Saud University is divided into five tracks (health, nursing, science, business, and humanities), the study classified students based on their respective disciplines.

Motivation for the study

The rapid expansion of distance learning in higher education, particularly highlighted during the recent COVID-19 pandemic (Volk et al., 2020 ; Bettinger et al., 2017 ), underscores the need for alternative educational approaches during crises. Such disruptions can catalyze innovation and the adoption of distance learning as a contingency plan (Christensen et al., 2015 ). King Saud University, like many institutions worldwide, faced the challenge of transitioning abruptly to online learning in response to the pandemic.

E-learning has gained prominence in higher education due to technological advancements, offering institutions a competitive edge (Valverde-Berrocoso et al., 2020 ). Especially during conditions like the COVID-19 pandemic, electronic communication was utilized across the globe as a feasible means to overcome barriers and enhance interactions (Bozkurt, 2019 ).

Distance learning, characterized by flexibility, becomes crucial when traditional in-person classes are hindered by unforeseen circumstances such as those posed by COVID-19 (Arkorful and Abaidoo, 2015 ). Scholars argue that it allows students to learn at their own pace, often referred to as self-directed learning (Hiemstra, 1994 ) or self-education (Gadamer, 2001 ). Additional advantages include accessibility, cost-effectiveness, and flexibility (Sadeghi, 2019 ).

However, distance learning is not immune to its own set of challenges. Technical impediments, encompassing network issues, device limitations, and communication hiccups, represent formidable hurdles (Sadeghi, 2019 ). Furthermore, concerns about potential distractions in the online learning environment, fueled by the ubiquity of the internet and social media, have surfaced (Hall et al., 2020 ; Ravizza et al., 2017 ). The absence of traditional face-to-face interactions among students and between students and instructors is also viewed as a potential drawback (Sadeghi, 2019 ).

Given the evolving understanding of the pros and cons of distance learning, this study aims to contribute to the existing literature by assessing the effectiveness of distance learning, specifically in terms of student achievement, as compared to in-person classroom learning at King Saud University, one of Saudi Arabia’s largest higher education institutions.

Academic achievement: in-person vs online learning

The primary driving force behind the rapid integration of technology in education has been its emphasis on student performance (Lai and Bower, 2019 ). Over the past decade, numerous studies have undertaken comparisons of student academic achievement in online and in-person settings (e.g., Bettinger et al., 2017 ; Fischer et al., 2020 ; Iglesias-Pradas et al., 2021 ). This section offers a concise review of the disparities in academic achievement between college students engaged in in-person and online learning, as identified in existing research.

A number of studies point to the superiority of traditional in-person education over online learning in terms of academic outcomes. For example, Fischer et al. ( 2020 ) conducted a comprehensive study involving 72,000 university students across 433 subjects, revealing that online students tend to achieve slightly lower academic results than their in-class counterparts. Similarly, Bettinger et al. ( 2017 ) found that students at for-profit online universities generally underperformed when compared to their in-person peers. Supporting this trend, Figlio et al. ( 2013 ) indicated that in-person instruction consistently produced better results, particularly among specific subgroups like males, lower-performing students, and Hispanic learners. Additionally, Kaupp’s ( 2012 ) research in California community colleges demonstrated that online students faced lower completion and success rates compared to their traditional in-person counterparts (Fig. 1 ).

Figure 1

The figure compares student achievement on the final tests in the five courses by year, using independent-samples t-tests; the results show a statistically significant drop in test scores from 2020 (in person) to 2021 (online) for all courses except CT_101.

In contrast, other studies present evidence of online students outperforming their in-person peers. For example, Iglesias-Pradas et al. ( 2021 ) conducted a comparative analysis of 43 bachelor courses at Telecommunication Engineering College in Malaysia, revealing that online students achieved higher academic outcomes than their in-person counterparts. Similarly, during the COVID-19 pandemic, Gonzalez et al. ( 2020 ) found that students engaged in online learning performed better than those who had previously taken the same subjects in traditional in-class settings.

Expanding on this topic, several studies have reported mixed results when comparing the academic performance of online and in-person students, with various student and instructor factors emerging as influential variables. Chesser et al. ( 2020 ) noted that student traits such as conscientiousness, agreeableness, and extraversion play a substantial role in academic achievement, regardless of the learning environment—be it traditional in-person classrooms or online settings. Furthermore, Cacault et al. ( 2021 ) discovered that online students with higher academic proficiency tend to outperform those with lower academic capabilities, suggesting that differences in students’ academic abilities may impact their performance. In contrast, Bergstrand and Savage ( 2013 ) found that online classes received lower overall ratings and exhibited a less respectful learning environment when compared to in-person instruction. Nevertheless, they also observed that the teaching efficiency of both in-class and online courses varied significantly depending on the instructors’ backgrounds and approaches. These findings underscore the multifaceted nature of the online vs. in-person learning debate, highlighting the need for a nuanced understanding of the factors at play.

Theoretical framework

Constructivism is a well-established learning theory that places learners at the forefront of their educational experience, emphasizing their active role in constructing knowledge through interactions with their environment (Duffy and Jonassen, 2009 ). According to constructivist principles, learners build their understanding by assimilating new information into their existing cognitive frameworks (Vygotsky, 1978 ). This theory highlights the importance of context, active engagement, and the social nature of learning (Dewey, 1938 ). Constructivist approaches often involve hands-on activities, problem-solving tasks, and opportunities for collaborative exploration (Brooks and Brooks, 1999 ).

In the realm of education, subject-specific pedagogy emerges as a vital perspective that acknowledges the distinctive nature of different academic disciplines (Shulman, 1986 ). It suggests that teaching methods should be tailored to the specific characteristics of each subject, recognizing that subjects like mathematics, literature, or science require different approaches to facilitate effective learning (Shulman, 1987 ). Subject-specific pedagogy emphasizes that the methods of instruction should mirror the ways experts in a particular field think, reason, and engage with their subject matter (Cochran-Smith and Zeichner, 2005 ).

When applying these principles to the design of instruction for online and in-person learning environments, the significance of adapting methods becomes even more pronounced. Online learning often requires unique approaches due to its reliance on technology, asynchronous interactions, and potential for reduced social presence (Anderson, 2003 ). In-person learning, on the other hand, benefits from face-to-face interactions and immediate feedback (Allen and Seaman, 2016 ). Here, the interplay of constructivism and subject-specific pedagogy becomes evident.

Online learning. In an online environment, constructivist principles can be upheld by creating interactive online activities that promote exploration, reflection, and collaborative learning (Salmon, 2000 ). Discussion forums, virtual labs, and multimedia presentations can provide opportunities for students to actively engage with the subject matter (Harasim, 2017 ). By integrating subject-specific pedagogy, educators can design online content that mirrors the discipline’s methodologies while leveraging technology for authentic experiences (Koehler and Mishra, 2009 ). For instance, an online history course might incorporate virtual museum tours, primary source analysis, and collaborative timeline projects.

In-person learning. In a traditional brick-and-mortar classroom setting, constructivist methods can be implemented through group activities, problem-solving tasks, and in-depth discussions that encourage active participation (Jonassen et al., 2003 ). Subject-specific pedagogy complements this by shaping instructional methods to align with the inherent characteristics of the subject (Hattie, 2009). For instance, in a physics class, hands-on experiments and real-world applications can bring theoretical concepts to life (Hake, 1998 ).

In sum, the fusion of constructivism and subject-specific pedagogy offers a versatile approach to instructional design that adapts to different learning environments (Garrison, 2011 ). By incorporating the principles of both theories, educators can tailor their methods to suit the unique demands of online and in-person learning, ultimately providing students with engaging and effective learning experiences that align with the nature of the subject matter and the mode of instruction.

Course description

The Self-Development Skills Department at King Saud University (KSU) offers five mandatory freshman-level courses. These courses aim to foster advanced thinking skills and cultivate scientific research abilities in students. They do so by imparting essential skills, identifying higher-level thinking patterns, and facilitating hands-on experience in scientific research. The design of these classes is centered around aiding students’ smooth transition into university life. Brief descriptions of these courses are as follows:

University Skills 101 (CI 101) is a three-hour credit course designed to nurture essential academic, communication, and personal skills among all preparatory year students at King Saud University. The primary goal of this course is to equip students with the practical abilities they need to excel in their academic pursuits and navigate their university lives effectively. CI 101 comprises 12 sessions and is an integral part of the curriculum for all incoming freshmen, ensuring a standardized foundation for skill development.

Fitness and Health 101 (FAJB 101) is a one-hour credit course. FAJB 101 focuses on self-development skills related to health and physical fitness, covering personal health, nutrition, sports, preventive care, psychological well-being, reproductive health, and first aid. The course aims to motivate students’ learning through entertainment, sports activities, and physical exercises that help maintain their health. This course is required for all incoming freshmen students at King Saud University.

Entrepreneurship 101 (ENT 101) is a one-hour credit course. ENT 101 aims to develop students’ entrepreneurship skills. The course provides students with the knowledge and skills to generate and transform ideas and innovations into practical commercial projects in business settings. The entrepreneurship course consists of 14 sessions and is taught only to students in the business track.

Computer Skills 101 (CT 101) is a three-hour credit course. It provides students with basic computer skills, e.g., components, operating systems, applications, and communication backup. The course explores data visualization, an introductory level of modern programming with algorithms, and information security. CT 101 is taught in all tracks except the humanities track.

Computer Skills 102 (CT 102) is a three-hour credit course. It provides IT skills that enable students to use computers efficiently, develops students’ research and scientific skills, and builds their capability to design basic educational software. CT 102 focuses on application software such as Microsoft Office. This course is taught only to students in the humanities track.

Structure and activities

These courses ranged from one to three credit hours. A one-credit-hour course means that students take an hour of class each week during the academic semester; the same arrangement applies to two- and three-credit-hour courses. The types of activities in each course are shown in Table 1 .

At King Saud University, each semester spans 15 weeks in duration. The total number of semester hours allocated to each course serves as an indicator of its significance within the broader context of the academic program, including the diverse tracks available to students. Throughout the two years under study (i.e., 2020 and 2021), course placements (fall or spring), course content, and the organizational structure remained consistent and uniform.

Participants

The study’s data comes from test scores of a cohort of 16,722 first-year college students enrolled at King Saud University in Saudi Arabia over the span of two academic years: 2020 and 2021. Among these students, 8297 were engaged in traditional, in-person learning in 2020, while 8425 had transitioned to online instruction for the same courses in 2021 due to the Covid-19 pandemic. In 2020, the student population consisted of 51.5% females and 48.5% males. However, in 2021, there was a reversal in these proportions, with female students accounting for 48.5% and male students comprising 51.5% of the total participants.

Regarding student enrollment in the five courses, Table 2 provides a detailed breakdown by average class size, admission scores, and the number of students enrolled in the courses during the two years covered by this study. While the total number of students in each course remained relatively consistent across the two years, there were noticeable fluctuations in average class sizes. Specifically, four out of the five courses experienced substantial increases in class size, with some nearly doubling in size (e.g., ENT_101 and CT_102), while one course (CT_101) showed a reduction in its average class size.

In this study, it must be noted that while some students enrolled in up to three different courses within the same academic year, none repeated the same exam in both years. Specifically, students who failed to pass their courses in 2020 were required to complete them in summer sessions and were consequently not included in this study’s dataset. To ensure clarity and precision in our analysis, the research focused exclusively on student test scores to evaluate and compare the academic effectiveness of online and traditional in-person learning methods. This approach was chosen to provide a clear, direct comparison of the educational impacts associated with each teaching format.

Descriptive analyses of the final exam scores for the two years (2020 and 2021) were conducted. Additionally, student outcomes in the in-person classes of 2020 were compared with those of their online peers in 2021 using independent-samples t-tests. Subsequently, to address potential disparities between the two groups arising from variables such as gender, class size, and admission scores (which serve as an indicator of students’ academic aptitude and pre-enrollment knowledge), multiple regression analyses were conducted. In these multivariate analyses, outcomes of both in-person and online cohorts were assessed within their respective tracks. By carefully controlling for these variables linked to student performance, the study aimed to ensure a comprehensive and equitable evaluation.
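The two-step procedure described above can be sketched as follows. This is a generic illustration only: the data are synthetic stand-ins, and all variable names and effect sizes are hypothetical, not the actual King Saud University records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 400

# Synthetic stand-in for one course's records (all names hypothetical)
year_online = rng.integers(0, 2, n)            # 0 = 2020 in person, 1 = 2021 online
gender_male = rng.integers(0, 2, n)
class_size = rng.integers(20, 80, n).astype(float)
admission = rng.normal(80.0, 5.0, n)
score = (60 + 0.3 * admission - 0.05 * class_size
         + 1.5 * year_online + rng.normal(0.0, 3.0, n))

# Step 1: simple comparison of means via an independent-samples t-test
t, p = stats.ttest_ind(score[year_online == 1], score[year_online == 0])

# Step 2: multiple regression (OLS via least squares); the coefficient on the
# year dummy estimates the online-vs-in-person gap after controlling for
# gender, class size, and admission score
X = np.column_stack([np.ones(n), year_online, gender_male, class_size, admission])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
adjusted_gap = beta[1]   # points gained (or lost) by the online cohort
```

As in the study, the raw mean difference (step 1) and the covariate-adjusted difference (step 2) can disagree, which is why the regression results are the ones interpreted below.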

Study instrument

The study obtained students’ final exam scores for the years 2020 (in-person) and 2021 (online) from the school’s records office through their examination management system. In the preparatory year at King Saud University, final exams for all courses are developed by committees composed of faculty members from each department. To ensure valid comparisons, the final exam questions, crafted by departmental committees of professors, remained consistent and uniform for the two years under examination.

Table 3 provides a comprehensive assessment of the reliability of all five tests included in our analysis. The tests exhibit a strong degree of internal consistency, with Cronbach’s alpha coefficients ranging from 0.77 to 0.86, underscoring their reliability and suitability for the study’s objectives.
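For reference, Cronbach’s alpha can be computed directly from an item-score matrix. The sketch below uses a small toy matrix, not the actual KSU item-level data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy matrix: 6 students x 4 correlated test items (hypothetical scores)
scores = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
    [4, 4, 5, 5],
])
alpha = cronbach_alpha(scores)
```

Values in the 0.77 to 0.86 range reported in Table 3 are conventionally read as acceptable-to-good internal consistency.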

In terms of test validity, content validity was ensured through a thorough review by university subject matter experts, resulting in test items that align well with the content domain and learning objectives. Criterion-related validity was established by correlating students’ admissions test scores with their final required freshman test scores in the five subject areas, which showed a moderate, acceptable relationship (0.37 to 0.56). Finally, construct validity was established through reviews by experienced subject instructors, whose feedback led to improvements in test content and affirmed the effectiveness of the final tests in assessing students’ subject knowledge at the end of their coursework.
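The criterion-related validity check amounts to a Pearson correlation between admission scores and final course scores. A minimal sketch with simulated paired scores (the moderate correlation is built in by construction, so the numbers are illustrative only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical paired scores for one course
admission = rng.normal(80.0, 5.0, 200)
final = 0.5 * admission + rng.normal(0.0, 5.0, 200)  # constructed to correlate moderately

# Pearson r quantifies criterion-related validity against the external admissions test
r, p = stats.pearsonr(admission, final)
```

Correlations in the study's 0.37 to 0.56 band would be produced by this kind of moderate linear relationship between the external criterion and the course outcome.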

Collectively, these validity and reliability measures affirm the soundness and integrity of the final subject tests, establishing their suitability as effective assessment tools for evaluating students’ knowledge in their five mandatory freshman courses at King Saud University.

After obtaining research approval from the Research Committee at King Saud University, the coordinators of the five courses (CI_101, ENT_101, CT_101, CT_102, and FAJB_101) supplied the researchers with the final exam scores of all first-year preparatory year students at King Saud University for the initial semester of the academic years 2020 and 2021. The sample encompassed all students who had completed these five courses during both years, resulting in a total of 16,722 students forming the final group of participants.

Limitations

Several limitations warrant acknowledgment in this study. First, the research was conducted within a well-resourced major public university. As such, the experiences with online classes at other types of institutions (e.g., community colleges, private institutions) may vary significantly. Additionally, the limited data pertaining to in-class teaching practices and the diversity of learning activities across different courses represents a gap that could have provided valuable insights for a more thorough interpretation and explanation of the study’s findings.

Results

To compare student achievement in the final tests in the five courses by year, independent-samples t -tests were conducted. Table 4 shows a statistically significant drop in test scores from 2020 (in person) to 2021 (online) for all courses except CT_101. The biggest decline was in CT_102, at 3.58 points, and the smallest was in CI_101, at 0.18 points.

However, such simple comparison of means between the two years (via t -tests) by subjects does not account for the differences in gender composition, class size, and admission scores between the two academic years, all of which have been associated with student outcomes (e.g., Ho and Kelman, 2014 ; De Paola et al., 2013 ). To account for such potential confounding variables, multiple regressions were conducted to compare the 2 years’ results while controlling for these three factors associated with student achievement.

Table 5 presents the regression results, illustrating the variation in final exam scores between 2020 and 2021, while controlling for gender, class size, and admission scores. Importantly, these results diverge significantly from the outcomes obtained through independent-sample t -test analyses.

Taking into consideration the variables mentioned earlier, students in the 2021 online cohort demonstrated superior performance compared to their 2020 in-person counterparts in CI_101, FAJB_101, and CT_101, with score advantages of 0.89, 0.56, and 5.28 points, respectively. Conversely, in the case of ENT_101, online students in 2021 scored 0.69 points lower than their 2020 in-person counterparts. With CT_102, there were no statistically significant differences in final exam scores between the two cohorts of students.

Discussion

The study sought to assess the effectiveness of distance learning compared to in-person learning in the higher education setting in Saudi Arabia. We analyzed the final exam scores of 16,722 first-year college students at King Saud University in five required subjects (i.e., CI_101, ENT_101, CT_101, CT_102, and FAJB_101). The study initially performed a simple comparison of mean scores by track by year (via t -tests) and then a number of multiple regression analyses that controlled for class size, gender composition, and admission scores.

Overall, the study’s more in-depth findings using multiple regression painted a wholly different picture than the results obtained using t -tests. After controlling for class size, gender composition, and admissions scores, online students in 2021 performed better than their in-person instruction peers in 2020 in University Skills (CI_101), Fitness and Health (FAJB_101), and Computer Skills (CT_101), whereas in-person students outperformed their online peers in Entrepreneurship (ENT_101). There was no meaningful difference in outcomes for students in the Computer Skills (CT_102) course for the two years.

In light of these findings, it raises the question: why do we observe minimal differences (less than a one-point gain or loss) in student outcomes in courses like University Skills, Fitness and Health, Entrepreneurship, and Advanced Computer Skills based on the mode of instruction? Is it possible that when subjects are primarily at a basic or introductory level, as is the case with these courses, the mode of instruction may have a limited impact as long as the concepts are effectively communicated in a manner familiar and accessible to students?

In today’s digital age, one could argue that students in more developed countries, such as Saudi Arabia, generally possess the skills and capabilities to effectively engage with materials presented in both in-person and online formats. However, there is a notable exception in the Basic Computer Skills course, where the online cohort outperformed their in-person counterparts by more than 5 points. Insights from interviews with the instructors of this course suggest that this result may be attributed to the course’s basic and conceptual nature, coupled with the availability of instructional videos that students could revisit at their own pace.

Given that students enter this course with varying levels of computer skills, self-paced learning may have allowed them to cover course materials at their preferred speed, concentrating on less familiar topics while swiftly progressing through concepts they already understood. The advantages of such self-paced learning have been documented by scholars like Tullis and Benjamin ( 2011 ), who found that self-paced learners often outperform those who spend the same amount of time studying identical materials. This approach allows learners to allocate their time more effectively according to their individual learning pace, providing greater ownership and control over their learning experience. As such, in courses like introductory computer skills, it can be argued that becoming familiar with fundamental and conceptual topics may not require extensive in-class collaboration. Instead, it may be more about exposure to and digestion of materials in a format and at a pace tailored to students with diverse backgrounds, knowledge levels, and skill sets.

Further investigation is needed to understand more fully why some classes benefited from online instruction while others did not. It could be posited that some content areas are more conducive to an in-person (or online) format than others. Or the differing results of the two modes of learning may have been driven by students of varying academic ability and engagement, with low-achieving students being more vulnerable to the limitations of online learning (e.g., Kofoed et al., 2021). Whatever the reasons, the results of the current study would be enriched by a more in-depth analysis of the various factors associated with these different forms of learning. Moreover, although not clear cut, what the current study does provide is additional evidence against any dire consequences to student learning (at least in the higher education setting) from the sudden increase in online learning, while also showcasing possible benefits of its wider use.

Based on the findings of this study, we recommend that educational leaders adopt a measured approach to online learning—a stance that neither fully embraces nor outright denounces it. The impact on students’ experiences and engagement appears to vary depending on the subjects and methods of instruction, sometimes hindering, other times promoting effective learning, while some classes remain relatively unaffected.

Rather than taking a one-size-fits-all approach, educational leaders should be open to exploring the nuances behind these outcomes. This involves examining why certain courses thrived with online delivery, while others either experienced a decline in student achievement or remained largely unaffected. By exploring these differentiated outcomes associated with diverse instructional formats, leaders in higher education institutions and beyond can make informed decisions about resource allocation. For instance, resources could be channeled towards in-person learning for courses that benefit from it, while simultaneously expanding online access for courses that have demonstrated improved outcomes in a virtual format. This strategic approach not only optimizes resource allocation but could also open up additional revenue streams for the institution.

Considering the enduring presence of online learning, both before the pandemic and through its accelerated adoption due to COVID-19, there is an increasing need for institutions and scholars in higher education, as well as other fields, to prioritize the study of its effects and optimal utilization. This study, which compares student outcomes between two cohorts exposed to in-person and online instruction (before and during COVID-19) at the largest university in Saudi Arabia, represents a meaningful step in this direction.

Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Allen IE, Seaman J (2016) Online report card: Tracking online education in the United States . Babson Survey Group

Anderson T (2003) Getting the mix right again: an updated and theoretical rationale for interaction. Int Rev Res Open Distrib Learn 4(2). https://doi.org/10.19173/irrodl.v4i2.149

Arkorful V, Abaidoo N (2015) The role of e-learning, advantages and disadvantages of its adoption in higher education. Int J Instruct Technol Distance Learn 12(1):29–42


Aucejo EM, French J, Araya MP, Zafar B (2020) The impact of COVID-19 on student experiences and expectations: Evidence from a survey. Journal of Public Economics 191:104271. https://doi.org/10.1016/j.jpubeco.2020.104271


Azevedo JP, Hasan A, Goldemberg D, Iqbal SA, Geven K (2020) Simulating the potential impacts of COVID-19 school closures on schooling and learning outcomes: a set of global estimates. World Bank Policy Research Working Paper

Bergstrand K, Savage SV (2013) The chalkboard versus the avatar: Comparing the effectiveness of online and in-class courses. Teach Sociol 41(3):294–306. https://doi.org/10.1177/0092055X13479949


Bettinger EP, Fox L, Loeb S, Taylor ES (2017) Virtual classrooms: How online college courses affect student success. Am Econ Rev 107(9):2855–2875. https://doi.org/10.1257/aer.20151193

Bozkurt A (2019) From distance education to open and distance learning: a holistic evaluation of history, definitions, and theories. Handbook of research on learning in the age of transhumanism , 252–273. https://doi.org/10.4018/978-1-5225-8431-5.ch016

Brooks JG, Brooks MG (1999) In search of understanding: the case for constructivist classrooms . Association for Supervision and Curriculum Development

Cacault MP, Hildebrand C, Laurent-Lucchetti J, Pellizzari M (2021) Distance learning in higher education: evidence from a randomized experiment. J Eur Econ Assoc 19(4):2322–2372. https://doi.org/10.1093/jeea/jvaa060

Chesser S, Murrah W, Forbes SA (2020) Impact of personality on choice of instructional delivery and students’ performance. Am Distance Educ 34(3):211–223. https://doi.org/10.1080/08923647.2019.1705116

Christensen CM, Raynor M, McDonald R (2015) What is disruptive innovation? Harv Bus Rev 93(12):44–53

Cochran-Smith M, Zeichner KM (2005) Studying teacher education: the report of the AERA panel on research and teacher education. Choice Rev Online 43 (4). https://doi.org/10.5860/choice.43-2338

De Paola M, Ponzo M, Scoppa V (2013) Class size effects on student achievement: heterogeneity across abilities and fields. Educ Econ 21(2):135–153. https://doi.org/10.1080/09645292.2010.511811

Dewey J (1938) Experience and education. Simon & Schuster

Di Pietro G, Biagi F, Costa P, Karpinski Z, Mazza J (2020) The likely impact of COVID-19 on education: reflections based on the existing literature and recent international datasets. Publications Office of the European Union, Luxembourg

Duffy TM, Jonassen DH (2009) Constructivism and the technology of instruction: a conversation . Routledge, Taylor & Francis Group

Edvardsson IR, Oskarsson GK (2008) Distance education and academic achievement in business administration: the case of the University of Akureyri. Int Rev Res Open Distrib Learn 9(3). https://doi.org/10.19173/irrodl.v9i3.542

Figlio D, Rush M, Yin L (2013) Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. J Labor Econ 31(4):763–784. https://doi.org/10.3386/w16089

Fischer C, Xu D, Rodriguez F, Denaro K, Warschauer M (2020) Effects of course modality in summer session: enrollment patterns and student performance in face-to-face and online classes. Internet Higher Educ 45:100710. https://doi.org/10.1016/j.iheduc.2019.100710

Gadamer HG (2001) Education is self‐education. J Philos Educ 35(4):529–538

Garrison DR (2011) E-learning in the 21st century: a framework for research and practice . Routledge. https://doi.org/10.4324/9780203838761

Gonzalez T, de la Rubia MA, Hincz KP, Comas-Lopez M, Subirats L, Fort S, Sacha GM (2020) Influence of COVID-19 confinement on students’ performance in higher education. PLOS One 15(10). https://doi.org/10.1371/journal.pone.0239490

Hake RR (1998) Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66(1):64–74. https://doi.org/10.1119/1.18809


Hall ACG, Lineweaver TT, Hogan EE, O’Brien SW (2020) On or off task: the negative influence of laptops on neighboring students’ learning depends on how they are used. Comput Educ 153:1–8. https://doi.org/10.1016/j.compedu.2020.103901

Harasim L (2017) Learning theory and online technologies. Routledge. https://doi.org/10.4324/9780203846933

Hiemstra R (1994) Self-directed learning. In WJ Rothwell & KJ Sensenig (Eds), The sourcebook for self-directed learning (pp 9–20). HRD Press

Ho DE, Kelman MG (2014) Does class size affect the gender gap? A natural experiment in law. J Legal Stud 43(2):291–321

Iglesias-Pradas S, Hernández-García Á, Chaparro-Peláez J, Prieto JL (2021) Emergency remote teaching and students’ academic performance in higher education during the COVID-19 pandemic: a case study. Comput Hum Behav 119:106713. https://doi.org/10.1016/j.chb.2021.106713

Jepsen C (2015) Class size: does it matter for student achievement? IZA World of Labor . https://doi.org/10.15185/izawol.190

Jonassen DH, Howland J, Moore J, Marra RM (2003) Learning to solve problems with technology: a constructivist perspective (2nd ed). Prentice Hall, Columbus

Kaupp R (2012) Online penalty: the impact of online instruction on the Latino-White achievement gap. J Appli Res Community Coll 19(2):3–11. https://doi.org/10.46569/10211.3/99362

Koehler MJ, Mishra P (2009) What is technological pedagogical content knowledge? Contemp Issues Technol Teacher Educ 9(1):60–70

Kofoed M, Gebhart L, Gilmore D, Moschitto R (2021) Zooming to class?: Experimental evidence on college students’ online learning during COVID-19. SSRN Electron J. https://doi.org/10.2139/ssrn.3846700

Kuhfeld M, Soland J, Tarasawa B, Johnson A, Ruzek E, Liu J (2020) Projecting the potential impact of COVID-19 school closures on academic achievement. Educ Res 49(8):549–565. https://doi.org/10.3102/0013189x20965918

Lai JW, Bower M (2019) How is the use of technology in education evaluated? A systematic review. Comput Educ 133:27–42

Meinck S, Brese F (2019) Trends in gender gaps: using 20 years of evidence from TIMSS. Large-Scale Assess Educ 7 (1). https://doi.org/10.1186/s40536-019-0076-3

Radha R, Mahalakshmi K, Kumar VS, Saravanakumar AR (2020) E-Learning during lockdown of COVID-19 pandemic: a global perspective. Int J Control Autom 13(4):1088–1099

Ravizza SM, Uitvlugt MG, Fenn KM (2017) Logged in and zoned out: How laptop Internet use relates to classroom learning. Psychol Sci 28(2):171–180. https://doi.org/10.1177/095679761667731


Sadeghi M (2019) A shift from classroom to distance learning: advantages and limitations. Int J Res Engl Educ 4(1):80–88

Salmon G (2000) E-moderating: the key to teaching and learning online . Routledge. https://doi.org/10.4324/9780203816684

Shulman LS (1986) Those who understand: knowledge growth in teaching. Edu Res 15(2):4–14

Shulman LS (1987) Knowledge and teaching: foundations of the new reform. Harv Educ Rev 57(1):1–22

Tullis JG, Benjamin AS (2011) On the effectiveness of self-paced learning. J Mem Lang 64(2):109–118. https://doi.org/10.1016/j.jml.2010.11.002

Valverde-Berrocoso J, Garrido-Arroyo MDC, Burgos-Videla C, Morales-Cevallos MB (2020) Trends in educational research about e-learning: a systematic literature review (2009–2018). Sustainability 12(12):5153

Volk F, Floyd CG, Shaler L, Ferguson L, Gavulic AM (2020) Active duty military learners and distance education: factors of persistence and attrition. Am J Distance Educ 34(3):1–15. https://doi.org/10.1080/08923647.2019.1708842

Vygotsky LS (1978) Mind in society: the development of higher psychological processes. Harvard University Press


Author information

Authors and Affiliations

Department of Sports and Recreation Management, King Saud University, Riyadh, Saudi Arabia

Bandar N. Alarifi

Division of Research and Doctoral Studies, Concordia University Chicago, 7400 Augusta Street, River Forest, IL, 60305, USA


Contributions

Dr. Bandar Alarifi collected and organized data for the five courses and wrote the manuscript. Dr. Steve Song analyzed and interpreted the data regarding student achievement and revised the manuscript. These authors jointly supervised this work and approved the final manuscript.

Corresponding author

Correspondence to Bandar N. Alarifi .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This study was approved by the Research Ethics Committee at King Saud University on 25 March 2021 (No. 4/4/255639). This research does not involve the collection or analysis of data that could be used to identify participants (including email addresses or other contact details). All information is anonymized and the submission does not include images that may identify the person. The procedures used in this study adhere to the tenets of the Declaration of Helsinki.

Informed consent

This article does not contain any studies with human participants performed by any of the authors.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Alarifi, B.N., Song, S. Online vs in-person learning in higher education: effects on student achievement and recommendations for leadership. Humanit Soc Sci Commun 11 , 86 (2024). https://doi.org/10.1057/s41599-023-02590-1

Download citation

Received : 07 June 2023

Accepted : 21 December 2023

Published : 09 January 2024

DOI : https://doi.org/10.1057/s41599-023-02590-1



Recommending Online Course Resources Based on Knowledge Graph

  • Conference paper
  • First Online: 08 December 2022


  • Xin Chen   ORCID: orcid.org/0000-0003-4325-8629 11 ,
  • Yuhong Sun   ORCID: orcid.org/0000-0001-6276-8531 11 ,
  • Tong Zhou   ORCID: orcid.org/0000-0002-6374-8310 12 ,
  • Yan Wen   ORCID: orcid.org/0000-0002-9799-5387 11 ,
  • Feng Zhang   ORCID: orcid.org/0000-0003-2646-9854 11 &
  • Qingtian Zeng   ORCID: orcid.org/0000-0002-6421-8223 11  

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 13579))

Included in the following conference series:

  • International Conference on Web Information Systems and Applications


Nowadays, it is challenging for college students and lifelong learners to choose the courses they need amid the constant growth of massive online course resources, so recommendation systems are used to serve their personalized interests. In the course recommendation scenario, traditional collaborative filtering (CF) is not applicable because of the sparsity of user-item interactions and the cold-start problem. Building on MKR, we propose MKCR to improve online course resource recommendation when the interactions between students and courses are extremely sparse. MKCR is an end-to-end framework that uses a knowledge graph embedding task to assist the recommendation task. The experimental data come in part from the MOOC platform of Chinese universities. The results show that MKCR outperforms the other methods in our experiments.
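The MKR-style coupling described here (a recommendation task and a knowledge graph embedding task sharing item/entity embeddings) can be sketched as a toy. This is an illustrative assumption-laden sketch, not the MKCR implementation: all data, dimensions, and learning rates are invented, and the KG task is simplified to TransE-style updates that regularize the shared embedding table while a dot-product logistic recommender trains on sparse interactions.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 16                                  # embedding dimension
n_users, n_items, n_ents, n_rels = 20, 10, 10, 3

# Shared embeddings: items double as head entities in the knowledge graph,
# which is the coupling multi-task KG recommenders exploit.
U = rng.normal(0, 0.1, (n_users, d))    # user embeddings
E = rng.normal(0, 0.1, (n_ents, d))     # item/entity embeddings (shared)
R = rng.normal(0, 0.1, (n_rels, d))     # relation embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: sparse (user, item, label) interactions and (head, rel, tail) triples.
inter = [(u, u % n_items, 1) for u in range(n_users)] + \
        [(u, (u + 3) % n_items, 0) for u in range(n_users)]
triples = [(i, i % n_rels, (i + 1) % n_ents) for i in range(n_ents)]

lr = 0.5
for _ in range(200):
    # Recommendation task: SGD on a logistic loss over dot-product scores.
    for u, i, y in inter:
        p = sigmoid(U[u] @ E[i])
        g = p - y                        # d(loss)/d(score)
        U[u] -= lr * g * E[i]
        E[i] -= lr * g * U[u]
    # KG embedding task (TransE-style): pull head + relation toward tail.
    for h, r, t in triples:
        diff = E[h] + R[r] - E[t]
        E[h] -= lr * 0.1 * diff
        R[r] -= lr * 0.1 * diff
        E[t] += lr * 0.1 * diff

# After training, observed positives should score higher than negatives.
pos = np.mean([sigmoid(U[u] @ E[i]) for u, i, y in inter if y == 1])
neg = np.mean([sigmoid(U[u] @ E[i]) for u, i, y in inter if y == 0])
print(f"mean score  positives: {pos:.2f}  negatives: {neg:.2f}")
```

The design point is the shared table `E`: the KG task injects structural signal into item embeddings, which is what helps when user-item interactions alone are too sparse for CF.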


Wang, H., Zhang, F., Zhao, M., Li, W., Xie, X., Guo, M.: Multi-task feature learning for knowledge graph enhanced recommendation. In: Proceedings of the World Wide Web Conference, WWW 2019 , 2000–2010 (2019)


He, X., Liao, L., Zhang, H., Nie, L., Hu, X., Chua, T.S.: Neural collaborative filtering. In: International World Wide Web Conferences Steering Committee, pp. 173–182 (2017)

Liu, J., Fu, L., Wang, X., Tang, F., Chen, G.: Joint recommendations in multilayer mobile social networks. IEEE Trans. Mob. Comput. 19 (10), 2358–2373 (2020)


Wang, H., Zhang, F., Hou, M., Xie, X., Guo, M., Liu, Q.: SHINE: signed heterogeneous information network embedding for sentiment link prediction. In: Proceedings of the 11th ACM International Conference on Web Search and Data Mining, pp. 592–600 (2018)

Sun, Y., Yuan, N.J., Xie, X., McDonald, K., Zhang, R.: Collaborative intent prediction with real-time contextual data. In: ACM Transactions on Information Systems (2017)

Cheng, H.T., Koc, L., Harmsen, J., et al.: Wide & Deep learning for recommender systems. In: Proceedings of the 1st Workshop on Deep Learning for Recommender Systems. ACM (2016)

Wang, H., Zhang, F., Wang, J., Zhao, M., Li, W., Xie, X., Guo, M.: RippleNet: propagating user preferences on the knowledge graph for recommender systems. In: International Conference on Information and Knowledge Management Proceedings, pp. 417–426 (2018)


Wen, Y., Kang, S., Zeng, Q., Duan, H., Chen, X., Li, W.: Session-based recommendation with GNN and time-aware memory network. Mobile Information Systems, vol. 2022, Article ID 1879367, 12 pages (2022). https://doi.org/10.1155/2022/1879367

Li, J., Xu, Z., Tang, Y., Zhao, B., Tian, H.: Deep Hybrid Knowledge Graph Embedding for Top-N Recommendation. In: Wang, G., Lin, X., Hendler, J., Song, W., Xu, Z., Liu, G. (eds.) WISA 2020. LNCS, vol. 12432, pp. 59–70. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-60029-7_6



Acknowledgement

This work was supported in part by the Distinguished Teachers Training Plan Program of Shandong University of Science and Technology (MS20211105), in part by the Teaching Reform Research Project of the Teaching Steering Committee of Electronic Information Specialty in Higher Education and Universities of the Ministry of Education, in part by the Special Project of China Association of Higher Education, in part by the Education and Teaching Research Project of Shandong Province, in part by the Taishan Scholar Program of Shandong Province, in part by the University-Industry Collaborative Education Program (201902316015, 202102402001), and in part by the Open Fund of the National Virtual Simulation Experimental Teaching Center for Coal Mine Safety Mining (SDUST 2019).

Author information

Authors and Affiliations

College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao, 266590, People’s Republic of China

Xin Chen, Yuhong Sun, Yan Wen, Feng Zhang & Qingtian Zeng

College of Civil Engineering and Architecture, Shandong University of Science and Technology, Qingdao, 266590, People’s Republic of China


Corresponding author

Correspondence to Xin Chen .

Editor information

Editors and Affiliations

National University of Defense Technology, Changsha, China

Guangzhou University, Guangzhou, China

Tianjin University, Tianjin, China

Deakin University, Melbourne, VIC, Australia

Rights and permissions


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Chen, X., Sun, Y., Zhou, T., Wen, Y., Zhang, F., Zeng, Q. (2022). Recommending Online Course Resources Based on Knowledge Graph. In: Zhao, X., Yang, S., Wang, X., Li, J. (eds) Web Information Systems and Applications. WISA 2022. Lecture Notes in Computer Science, vol 13579. Springer, Cham. https://doi.org/10.1007/978-3-031-20309-1_51

Download citation

DOI : https://doi.org/10.1007/978-3-031-20309-1_51

Published : 08 December 2022

Publisher Name : Springer, Cham

Print ISBN : 978-3-031-20308-4

Online ISBN : 978-3-031-20309-1

eBook Packages : Computer Science Computer Science (R0)



Education ∪ Math ∪ Technology


March 11, 2020

Online Learning Recommendations

Given that many schools (and entire school districts) may be closed down during the coronavirus outbreak, I decided to write this post with recommendations for schools that may attempt to implement online learning during this time. I read through this review of the research on online learning, which contains these high-level recommendations. There are some caveats with this research, especially given that most research on online learning has been done with older students and that the sample sizes with K to 12 students are relatively small. That being said, some evidence for effectiveness is better than no evidence at all. A further caveat: these recommendations are based on effect sizes, which I have not included since they are notoriously unreliable to compare.

  • Instruction combining online and face-to-face elements had a larger advantage relative to purely face-to-face instruction than did purely online instruction. If your school is closed completely, then this may be impossible. That being said, online programs such as Zoom or Big Marker may allow for some “face to face” interaction to occur. These programs will also help with the next recommendation.
  • Effect sizes were larger for studies in which the online instruction was collaborative or instructor-directed than in those studies where online learners worked independently. This basically means that you should design activities that are either led by a teacher or activities that have students work together in small groups. Resources like Google Docs and Skype will be helpful for students working together but given the high possibility that some students will engage in off-task and/or anti-social behaviour (such as teasing or bullying), having some moderation and oversight of these online spaces will be helpful.
  • Elements such as video or online quizzes do not appear to influence the amount that students learn in online classes. Creating a bunch of video lessons of a talking head working through some math problems and then quizzing students on what they have learned afterwards is not supported by the existing evidence on online learning. Given that educator planning time is in short supply, it’s probably best to plan other types of activities.
  • Online learning can be enhanced by giving learners control of their interactions with media and prompting learner reflection. A good example of a program that allows for this is GeoGebra. See for example this set of construction puzzles that require students to think and make decisions while they work through the problems. Another example is the DreamBox Learning math program, which also requires students to actively engage with mathematics. Disclaimer: I work for DreamBox Learning as a mathematician and senior curriculum designer.
  • Providing guidance for learning for groups of students appears less successful than using such mechanisms with individual learners. This recommendation suggests that feedback and support for students should be individualized for online learning, rather than given to the entire group. This does not necessarily mean that one should avoid providing scaffolds (such as guiding questions) to the entire group or that teachers necessarily need to work with individual students, only that whatever guidance and feedback is provided, it should be directed where possible to individual students.
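The effect sizes the review reports (and the post deliberately omits) are typically standardized mean differences. As a minimal illustration, here is Cohen's d computed on invented scores for a hypothetical blended vs face-to-face comparison:

```python
import math

def cohens_d(a, b):
    """Standardized mean difference (Cohen's d) using the pooled SD."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

# Invented exam scores for two hypothetical groups.
blended = [78, 82, 75, 88, 90, 73, 85, 80]
f2f     = [72, 76, 70, 81, 79, 68, 74, 77]
print(f"d = {cohens_d(blended, f2f):.2f}")
```

By convention d near 0.2 is small, 0.5 medium, and 0.8 large, but comparing d values across studies with different designs and populations is exactly the unreliability the post mentions.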

Based on my experience as a parent to my sons, who have both engaged with online learning and are in elementary school and high school right now, I have some further recommendations.

  • Actively engage the learning guardians of students in the process of learning. My sons’ experiences have been far more productive when we have sat down with them while they work through the online course material. This does not mean that we do the work for our children, but rather that we are there to support, encourage, and nurture their development as learners. It will be helpful to offer explicit advice for how learning guardians can support their learners, especially given the range of knowledge and experience those learning guardians will bring to the task. You may even want to include videos of what class looks like and descriptions of instructional routines that learning guardians can use with their learners. Also, offer suggestions of activities that learning guardians can do with the children in their care that are not on a computer and do not require the learning guardians to be experts in any particular subject matter.
  • Skip watch-the-video-then-fill-in-the-blank-spaces activities. I can say from experience that these types of activities are ubiquitous in online learning and result in nearly no learning. I watched my son listen to a video in one tab while dutifully recording the answers in his worksheet in another. I quizzed him 5 minutes later and he could remember literally nothing at all from the worksheet or the video. Given that completing these particular courses was a requirement at his school, I taught him a much more productive learning strategy. First, attempt the worksheet and fill in every blank, even if one has to guess. Next, watch the entire video without writing or doing anything else. Now go back to the worksheet and change as many of the answers as one can without going back to the video. Rewatch or listen to the video with the worksheet and change answers as necessary. This is still a terrible experience but it at least has the possibility to result in some learning.
  • Provide devices for students to work on if at all possible, or at least ensure that any online learning activities can be completed with a smartphone. While access to computers and the Internet keeps increasing, there are still households that do not have access, and so providing equitable access to resources for all families is a key responsibility of schools, particularly when expecting students to engage in online learning.
  • Where possible, engage students in synchronous activities rather than asynchronous activities. One of the more successful online classes my son took was with the Art of Problem Solving . Each week my son met with the entire class in an online chat program where the teacher mostly posed questions and occasionally told the students information, while the students responded to the questions in the online platform. He also had a physical textbook, a bank of unlimited practice problems to work on, and challenging problems to complete each week. The chat program was nothing amazing, but it mostly kept my son engaged for the full 90-minute sessions.
  • Use simple assignments that do not require students to navigate complex instructions. Even with assignments with simple instructions, there is a lot of potential for student learning. Given that your students will be working remotely and with limited direct support, you don’t want students spending too much of their time figuring out what they are trying to accomplish.

What other recommendations for teachers and schools who may be suddenly engaged in online learning do you have? What question do you have that I have not yet answered?


Evan Weinberg says:

We are in week six of closure over in Vietnam, and I can say that I agree with your assertions here. The ones with the most lasting impact are #5 in your first list (individual feedback/learning is better than whole class) and #4 in your second (synchronous is better than asynchronous).

The unifying thread to both is that students seem to quickly miss the social interactions of being in a physical space together. Whether they are introverts or extroverts, all students have expressed how much they value opportunities to connect with each other and with me as their teacher.

It is true that our students interact online much more than we do. A lot has been made of online learning being a natural fit for this new generation because online channels come so naturally to them. This crisis has made clear that even with the reality of our students’ comfort online, in-person interactions are as important as ever.

March 11, 2020 — 11:47 am




Research and Application of Online Course Recommendation System Based on TF-IDF Algorithm
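The technique named in this title can be sketched briefly: represent each course description as a TF-IDF vector and recommend the courses most similar to a student's interest profile. Below is a minimal pure-Python illustration; the course data and function names are invented for this sketch, not taken from the paper:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute a TF-IDF vector (term -> weight dict) for each tokenized document."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))                      # document frequency per term
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}  # smoothed IDF
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: (tf[t] / len(doc)) * idf[t] for t in tf})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(profile_tokens, courses, top_k=2):
    """Rank course descriptions by TF-IDF cosine similarity to a student profile."""
    docs = [profile_tokens] + [c["tokens"] for c in courses]
    vecs = tfidf_vectors(docs)
    scored = sorted(
        ((cosine(vecs[0], v), c["title"]) for v, c in zip(vecs[1:], courses)),
        reverse=True,
    )
    return [title for _, title in scored[:top_k]]

courses = [
    {"title": "Linear Algebra", "tokens": "matrix vector linear algebra proofs".split()},
    {"title": "Web Development", "tokens": "html css javascript web design".split()},
    {"title": "Machine Learning", "tokens": "matrix vector model training data".split()},
]
print(recommend("matrix vector data".split(), courses))
```

A production system would add stemming, stop-word removal, and click or enrollment signals on top of the raw text similarity; the core ranking step is as above.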


Setting a new bar for online higher education

The education sector was among the hardest hit by the COVID-19 pandemic. Schools across the globe were forced to shutter their campuses in the spring of 2020 and rapidly shift to online instruction. For many higher education institutions, this meant delivering standard courses and the “traditional” classroom experience through videoconferencing and various connectivity tools.

The approach worked to support students through a period of acute crisis but stands in contrast to the offerings of online education pioneers. These institutions use AI and advanced analytics to provide personalized learning and on-demand student support, and to accommodate student preferences for varying digital formats.

Colleges and universities can take a cue from the early adopters of online education, those companies and institutions that have been refining their online teaching models for more than a decade, as well as the edtechs that have entered the sector more recently. The latter organizations use educational technology to deliver online education services.

To better understand what these institutions are doing well, we surveyed academic research as well as the reported practices of more than 30 institutions, including both regulated degree-granting universities and nonregulated lifelong education providers. We also conducted ethnographic market research, during which we followed the learning journeys of 29 students in the United States and in Brazil, two of the largest online higher education markets in the world, with more than 3.3 million (Integrated Postsecondary Education Data System, 2018, nces.ed.gov) and 2.3 million (School Census, Censo Escolar-INEP, 2019, ensobasico.inep.gov.br) online higher education students, respectively.

We found that, to engage most effectively with students, the leading online higher education institutions focus on eight dimensions of the learning experience. We have organized these into three overarching principles: create a seamless journey for students, adopt an engaging approach to teaching, and build a caring network (exhibit). In this article, we talk about these principles in the context of programs that are fully online, but they may be just as effective within hybrid programs in which students complete some courses online and some in person.

Create a seamless journey for students

The performance of the early adopters of online education points to the importance of a seamless journey for students: easily navigable learning platforms accessible from any device, and content that is engaging and, whenever possible, personalized. Some early adopters have even integrated their learning platforms with their institution’s other services and resources, such as libraries and financial-aid offices.

1. Build the education road map

In our conversations with students and experts, we learned that students in online programs—precisely because they are physically disconnected from traditional classroom settings—may need more direction, motivation, and discipline than students in in-person programs. The online higher education programs that we looked at help students build their own education road map using standardized tests, digital alerts, and time-management tools to regularly reinforce students’ progress and remind them of their goals.

Brazil’s Cogna Educação, for instance, encourages students to assess their baseline knowledge at the start of the course (Digital transformation: A new culture to shape our future, Kroton 2018 Sustainability Report, Kroton Educacional, cogna.com.br). Such up-front diagnostics could be helpful in highlighting knowledge gaps and pointing students to relevant tools and resources, and may be especially helpful to students who have had unequal educational opportunities. A web-based knowledge assessment allows Cogna students to confirm their mastery of certain parts of a course, which, according to our research, can potentially boost their confidence and allow them to move faster through the course material.

At the outset of a course, leaders in online higher education can help students clearly understand the format and content, how they will use what they learn, how much time and effort is required, and how prepared they are for its demands.

The University of Michigan’s online Atlas platform, for instance, gives students detailed information about courses and curricula, including profiles of past students, sample reports and evaluations, and grade distributions, so they can make informed decisions about their studies (Atlas, Center for Academic Innovation, University of Michigan, umich.edu). Another provider, Pluralsight, shares movie-trailer-style overviews of its course content and offers trial options so students can get a sense of what to expect before making financial commitments.

Meanwhile, some of the online doctoral students we interviewed have access to an interactive timeline and graduation calculator for each course, which help students understand each of the milestones and requirements for completing their dissertations. Breaking up the education process into manageable tasks this way can potentially ease anxiety, according to our interviews with education experts.

2. Enable seamless connections

Students may struggle to learn if they aren’t able to connect to learning platforms. Online higher education pioneers provide a single sign-on through which students can interact with professors and classmates and gain access to critical support services. Traditional institutions considering a similar model should remember that because high-speed and reliable internet are not always available, courses and program content should be structured so they can be accessed even in low-bandwidth situations or downloaded for offline use.

The technology is just one element of creating seamless connections. Since remote students may face a range of distractions, online course content may need to be even more engaging than in-person courses. Online higher education pioneers allow students to study at their own pace through a range of channels and media, anytime and anywhere, including during otherwise unproductive periods, such as while in the waiting room at the doctor’s office. Coursera, for example, invites students to log into a personalized home page where they can review the status of their coursework, complete unfinished lessons, and access recommended “next content to learn” units. Brazilian online university Ampli Pitagoras offers content optimized for mobile devices that allows students to listen to lessons, contact tutors for help, or do quizzes from wherever they happen to be.

Adopt an engaging approach to teaching

The pioneers in online higher education we researched pair the “right” course content with the “right” formats to capture students’ attention. They incorporate real-world applications into their lesson plans, use adaptive learning tools to personalize their courses, and offer easily accessible platforms for group learning.

3. Offer a range of learning formats

The online higher education programs we reviewed incorporate group activities and collaboration with classmates—important hallmarks of the higher education experience—into their mix of course formats, offering both live classes and self-guided, on-demand lessons.

The Georgia Institute of Technology, for example, augments live lessons from faculty members in its online graduate program in data analytics with a collaboration platform where students can interact outside of class, according to a student we interviewed. Instructors can provide immediate answers to students’ questions via the platform or endorse students’ responses to questions from their peers. Instructors at Zhejiang University in China use live videoconferencing and chat rooms to communicate with more than 300 participants, assign and collect homework, and set goals (Wu Zhaohui, “How a top Chinese university is responding to coronavirus,” World Economic Forum, March 16, 2020, weforum.org).

Personalization is another area in which online programs can consider upping the ante, even with large student groups. Institutions could offer customized ways of learning online, whether via digital textbook, podcast, or video, ensuring that these materials are high quality and that the cost of their production is spread across large student populations.

Some institutions have invested in bespoke tools to facilitate various learning modes. The University of Michigan’s Center for Academic Innovation embeds custom-designed software into its courses to enhance the experience for both students and professors (“Our mission & principles,” University of Michigan Center for Academic Innovation, ai.umich.edu). The school’s ECoach platform helps students in large classes navigate content when one-on-one interaction with instructors is difficult because of the sheer number of students; it also sends students reminders, motivational tips, performance reviews, and exam-preparation materials (University of Michigan, umich.edu). Meanwhile, Minerva University focuses on a real-time online-class model that supports higher student participation and feedback, and has built a platform with a “talk time” feature that lets instructors balance class participation and engage “back-row students” who may be inclined to participate less (Samad Twemlow-Carter, “Talk Time,” Minerva University, minervaproject.com).

4. Ensure captivating experiences

Delivering education on digital platforms opens the potential to turn curricula into engaging and interactive journeys, and online education leaders are investing in content whose quality is on a par with high-end entertainment. Strayer University, for example, has recruited Emmy Award–winning film producers and established an in-house production unit to create multimedia lessons. The university’s initial findings show that this investment is paying off in increased student engagement, with 85 percent of learners reporting that they watch lessons from beginning to end, and a 10 percent reduction in the student dropout rate (Increased student engagement and success through captivating content, Strayer Studios outcomes report, Strayer University, studios.strategiced.com).

Other educators are attracting students not only with high production values but also with influential personalities. Outlier provides courses in the form of high-quality videos that feature charismatic Ivy League professors and are shot in a format that reduces eye strain (Outlier online course registration for Calculus I, outlier.org). The course content follows a storyline, and each course is presented as a crucial piece of an overall learning journey.

5. Utilize adaptive learning tools

Online higher education pioneers deliver adaptive learning using AI and analytics to detect and address individual students’ needs and offer real-time feedback and support. They can also predict students’ requirements, based on individuals’ past searches and questions, and respond with relevant content. This should be done in accordance with the applicable personal data privacy regulations of the country where the institution operates.
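The core loop behind such adaptive systems can be sketched simply: maintain an estimated mastery level per topic, update it from each answer, and always present material from the least-mastered topic. The following toy illustration is our own sketch, with invented names and update rules, not any institution's actual system:

```python
class AdaptiveTutor:
    """Toy adaptive-learning loop: estimate per-topic mastery and
    always present an item from the least-mastered topic."""

    def __init__(self, topics, rate=0.3):
        self.mastery = {t: 0.5 for t in topics}  # prior: 50% mastery everywhere
        self.rate = rate                          # learning rate for the update

    def next_topic(self):
        # Address the largest knowledge gap first.
        return min(self.mastery, key=self.mastery.get)

    def record(self, topic, correct):
        # Exponential moving average toward 1 (correct) or 0 (incorrect).
        target = 1.0 if correct else 0.0
        self.mastery[topic] += self.rate * (target - self.mastery[topic])

tutor = AdaptiveTutor(["fractions", "decimals", "percentages"])
tutor.record("fractions", correct=True)
tutor.record("decimals", correct=False)
print(tutor.next_topic())  # the weakest topic is now "decimals"
```

Real systems replace the moving average with richer models (item response theory, Bayesian knowledge tracing), but the select-update cycle is the same.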

Cogna Educação, for example, developed a system that delivers real-time, personalized tutoring to more than 500,000 online students, paired with exercises customized to address specific knowledge gaps (Digital transformation, 2018). Minerva University used analytics to devise a highly personalized feedback model, which allows instructors to comment on students’ online learning assignments and provide access to test scores during one-on-one feedback sessions (“Maybe we need to rethink our assumptions about ‘online’ learning,” Minerva University, minervaproject.com). According to our research, instructors can also access recorded lessons during one-on-one sessions and provide feedback on student participation during class.

6. Include real-world application of skills

The online higher education pioneers use virtual reality (VR) laboratories, simulations, and games for students to practice skills in real-world scenarios within controlled virtual environments. This type of hands-on instruction, our research shows, has traditionally been a challenge for online institutions.

Arizona State University, for example, has partnered with several companies to develop a biology degree that can be obtained completely online. The program leverages VR technology that gives online students in its biological-sciences program access to a state-of-the-art lab. Students can zoom in on molecules and repeat experiments as many times as needed, all from the comfort of wherever they happen to be (“ASU online biology course is first to offer virtual-reality lab in Google partnership,” Arizona State University, August 23, 2018, news.asu.edu). Meanwhile, students at Universidad Peruana de Ciencias Aplicadas are using 3-D games to find innovative solutions to real-world problems, such as designing the post-COVID-19 campus experience (Cleofé Vergara, “Learn by playing with Minecraft Education,” Innovación Educativa, July 13, 2021, innovacioneducativa.upc.edu.pe).

Some institutions have expanded the real-world experience by introducing online internships. Columbia University’s Virtual Internship Program, for example, was developed in partnership with employers across the United States and offers skills workshops and resources, as well as one-on-one career counseling (Virtual Internship Program, Columbia University Center for Career Education, columbia.edu).

Build a caring network

Establishing interpersonal connections may be more difficult in online settings. Leading online education programs provide dedicated channels to help students with academic, personal, technological, administrative, and financial challenges and to provide a means for students to connect with each other for peer-to-peer support. Such programs are also using technologies to recognize signs of student distress and to extend just-in-time support.

7. Provide academic and nonacademic support

Online education pioneers combine automation and analytics with one-on-one personal interactions to give students the support they need.

Southern New Hampshire University (SNHU), for example, uses a system of alerts and communication nudges when its digital platform detects low student engagement, while AI-powered chatbots provide quick responses to common student requests and questions (“SNHU turns student data into student success,” Southern New Hampshire University, May 2019, d2l.com). Strayer University has a virtual assistant named Irving that is accessible from every page of the university’s online campus website and offers 24/7 administrative support to students, from recommending courses to making personalized graduation projections (“Meet Irving, the Strayer chatbot that saves students time,” Strayer University, October 31, 2019, strayer.edu).
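A low-engagement alert of the kind described above is, at its simplest, a rule over recent activity data. The sketch below is our own illustration with made-up thresholds and field names, not SNHU's actual system:

```python
from datetime import date

# Hypothetical thresholds; a real system tunes these with analytics.
MAX_IDLE_DAYS = 7
MIN_WEEKLY_LOGINS = 2

def engagement_alerts(students, today):
    """Flag students whose recent activity suggests disengagement."""
    alerts = []
    for s in students:
        idle = (today - s["last_login"]).days
        if idle > MAX_IDLE_DAYS:
            alerts.append((s["name"], f"inactive for {idle} days"))
        elif s["logins_this_week"] < MIN_WEEKLY_LOGINS:
            alerts.append((s["name"], "below weekly login target"))
    return alerts

today = date(2021, 3, 15)
students = [
    {"name": "Ana", "last_login": date(2021, 3, 14), "logins_this_week": 4},
    {"name": "Ben", "last_login": date(2021, 3, 2), "logins_this_week": 0},
    {"name": "Caro", "last_login": date(2021, 3, 13), "logins_this_week": 1},
]
print(engagement_alerts(students, today))
```

In production the flagged list would feed a nudge pipeline (email, coach outreach) rather than a print statement, and the rules would typically be replaced or augmented by a predictive model.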

Many of these pioneer institutions augment that digital assistance with human support. SNHU, for example, matches students in distress with personal coaches and tutors who can follow the students’ progress and provide regular check-ins; in this way, they help students navigate the program and cultivate a sense of belonging (Academic advising, Southern New Hampshire University, 2021, snhu.edu). Similarly, Arizona State University pairs students with “success coaches” who give personalized guidance and counseling (“Accessing your success coach,” Arizona State University, asu.edu).

8. Foster a strong community

The majority of students we interviewed value a strong sense of belonging to their academic community. Building a strong network of peers and professors, however, may be challenging in online settings.

To alleviate this challenge, leading online programs often combine virtual social events with optional in-person gatherings. Minerva University, for example, hosts exclusive online events that promote school rituals and traditions for online students, and encourages online students to visit its various locations for in-person gatherings where they can meet members of its diverse, dispersed student population (“Join your extended family,” Minerva University, minerva.edu). SNHU’s Connect social gateway gives online-activity access to more than 15,000 members and helps them interact within an exclusive university social network. Students can also join student organizations and affinity clubs virtually (SNHU Connect, Southern New Hampshire University, snhuconnect.com).

Getting started: Designing the online journey

Building a distinctive online student experience requires significant time, effort, and investment. Most institutions whose practices we reviewed in this article took several years to understand student needs and refine their approaches to online education.

For those institutions in the early stages of rethinking their online offerings, the following three steps may be useful. Each will typically involve various functions within the institution, including, but not necessarily limited to, academic management, IT, and marketing.

  • Diagnose the current student experience. The diagnosis could be performed through a combination of focus groups and quantitative surveys, for example. It’s important that participants represent various student segments, which are likely to have different expectations, including young-adult full-time undergraduate students, working-adult part-time undergraduate students, and graduate students. The eight key dimensions outlined above may be helpful for structuring groups and surveys, in addition to self-evaluation of institution performance and potential benchmarks.

  • Set a strategic vision for your online learning experience. The vision should be student-centric and link tightly to the institution’s overarching mission. Functional leaders could evaluate the costs and benefits of each part of the online experience to ensure that the costs are realistic. The online model may vary depending on each school’s market, target audience, and tuition price point. An institution with high tuition, for example, is more likely to afford and provide one-on-one live coaching and student support, while an institution with lower tuition may need to rely more on automated tools and asynchronous interactions with students.
  • Design the transformation journey. Institutions should expect a multiyear journey. Some may opt to outsource the program design and delivery to dedicated program-management companies. But in our experience, an increasing number of institutions are developing these capabilities internally, especially as online learning moves further into the mainstream and becomes a source of long-term strategic advantage.

We have found that leading organizations often begin with quick wins that significantly improve the student experience, such as stronger student support, integrated technology platforms, and structured course road maps. In parallel, they begin the incremental redesign of courses and delivery models, often focusing on key programs with the largest enrollments and tapping into advanced analytics for insights to refine these experiences.

Finally, institutions tackle key enabling factors, such as instructor onboarding and online-teaching training, robust technology infrastructure, and advanced-analytics programs that enable the institutions to understand which features of online education are performing well and generating exceptional learning experiences for their students.

The question is no longer whether the move to online will outlive the COVID-19 lockdowns but when online learning will become the dominant means for delivering higher education. As digital transformation accelerates across all industries, higher education institutions will need to consider how to develop their own online strategies.

Felipe Child is a partner in McKinsey’s Bogotá office, Marcus Frank is a senior practice expert in the São Paulo office, Mariana Lef is an associate in the Buenos Aires office, and Jimmy Sarakatsannis is a partner in the Washington, DC, office.

References to specific products, companies, or organizations are solely for information purposes and do not constitute any endorsement or recommendation.

This article was edited by Justine Jablonska, an editor in the New York office.


Springer Nature - PMC COVID-19 Collection

Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19

1 Chitkara College of Hospitality Management, Chitkara University, Chandigarh, Punjab, India

Varsha Singh

Arun Aggarwal

2 Chitkara Business School, Chitkara University, Chandigarh, Punjab, India

The aim of this study is to identify the factors affecting students’ satisfaction and performance with online classes during the COVID-19 pandemic and to establish the relationships between these variables. The study is quantitative in nature, and the data were collected through an online survey of 544 respondents studying business management (B.B.A. or M.B.A.) or hotel management courses at Indian universities. Structural equation modeling was used to test the proposed hypotheses. The results show that the four independent factors used in the study, viz. quality of instructor, course design, prompt feedback, and expectations of students, positively impact students’ satisfaction, and students’ satisfaction in turn positively impacts students’ performance. For educational management, these four factors are essential for achieving a high level of satisfaction and performance in online courses. This study was conducted during the COVID-19 epidemic to examine the effect of online teaching on students’ performance.

Introduction

Coronaviruses are a group of viruses that are the main cause of illnesses such as cough, cold, sneezing, fever, and some respiratory symptoms (WHO, 2019). COVID-19 is a contagious disease that spreads very fast among human beings. COVID-19 is a new strain that originated in Wuhan, China, in December 2019. Coronaviruses circulate in animals, but some of these viruses can be transmitted between animals and humans (Perlman & McIntosh, 2020). As of March 28, 2020, according to the MoHFW, a total of 909 confirmed COVID-19 cases (862 Indians and 47 foreign nationals) had been reported in India (Centers for Disease Control and Prevention, 2020). Officially, no vaccine or medicine had been approved to stop the spread of COVID-19 (Yu et al., 2020). The influence of the COVID-19 pandemic on the education system led to the widespread closure of schools and colleges worldwide. On March 24, India declared a country-wide lockdown of schools and colleges (NDTV, 2020) to prevent the transmission of the coronavirus among students (Bayham & Fenichel, 2020). School closures in response to the COVID-19 pandemic have shed light on several issues affecting access to education. Because COVID-19 is soaring, huge numbers of children, adolescents, and young adults cannot attend schools and colleges (UNESCO, 2020). Lah and Botelho (2012) contended that the effect of school closures on students’ performance is unclear.

Similarly, school closures may also affect students through the disruption of teacher and student networks, leading to poor performance. Bridge (2020) reported that schools and colleges are moving towards educational technologies for student learning to avoid strain during the pandemic. Hence, the present study’s objective is to develop and test a conceptual model of students’ satisfaction with online teaching during COVID-19, when both students and teachers have no option other than to use online platforms for uninterrupted learning and teaching.

UNESCO recommends distance learning programs and open educational applications during school closures caused by COVID-19, which schools and teachers can use to teach their pupils and limit the interruption of education. Therefore, many institutions have opted for online classes (Shehzadi et al., 2020).

The e-learning framework has been increasingly used as a versatile platform for learning and teaching processes (Salloum & Shaalan, 2018). E-learning is defined as a new paradigm of online learning based on information technology (Moore et al., 2011). In contrast to traditional learning, academics, educators, and other practitioners are eager to know how e-learning can produce better outcomes and academic achievements. The answer can be sought only by analyzing student satisfaction and performance.

Many comparative studies have been carried out to explore whether face-to-face or traditional teaching methods are more productive, or whether online or hybrid learning is better (Lockman & Schirmer, 2020; Pei & Wu, 2019; González-Gómez et al., 2016). The results of these studies show that students perform much better in online learning than in traditional learning. Henriksen et al. (2020) highlighted the problems faced by educators while shifting from offline to online modes of teaching. In the past, several research studies were carried out on online learning to explore student satisfaction, acceptance of e-learning, distance learning success factors, and learning efficiency (Sher, 2009; Lee, 2014; Yen et al., 2018). However, little literature is available on the factors that affect students’ satisfaction and performance in online classes during the COVID-19 pandemic (Rajabalee & Santally, 2020). In the present study, the authors propose that course design, quality of the instructor, prompt feedback, and students’ expectations are the four prominent determinants of students’ learning outcomes and satisfaction during online classes (Lee, 2014).

Course design refers to curriculum knowledge, program organization, instructional goals, and course structure (Wright, 2003). If well planned, course design increases pupils’ satisfaction with the system (Almaiah & Alyoussef, 2019). Mtebe and Raisamo (2014) proposed that effective course design helps improve performance through learners’ knowledge and skills (Khan & Yildiz, 2020; Mohammed et al., 2020). However, if a course is not designed effectively, it might lead to low usage of e-learning platforms by teachers and students (Almaiah & Almulhem, 2018). On the other hand, if a course is designed effectively, it leads to higher acceptance of the e-learning system by students, and their performance also increases (Mtebe & Raisamo, 2014). Hence, to prepare these courses for online learning, many instructors who are teaching blended courses for the first time are likely to require a complete overhaul of their courses (Bersin, 2004; Ho et al., 2006).

The second factor, instructor quality, plays an essential role in students’ satisfaction with online classes. Instructor quality refers to a professional who understands students’ educational needs, has unique teaching skills, and understands how to meet students’ learning needs (Luekens et al., 2004). Marsh (1987) developed five instruments for measuring instructor quality, the main one being the Students’ Evaluation of Educational Quality (SEEQ), which delineates the instructor’s quality. SEEQ is considered one of the most commonly used and unanimously embraced methods (Grammatikopoulos et al., 2014) and is a very useful method of student feedback for measuring instructor quality (Marsh, 1987).

The third factor that improves students’ satisfaction is prompt feedback (Kinicki et al., 2004). Feedback is defined as information given by lecturers and tutors about the performance of students; within this context, feedback is a “consequence of performance” (Hattie & Timperley, 2007, p. 81). In education, “prompt feedback can be described as knowing what you know and what you do not related to learning” (Simsek et al., 2017, p. 334). Christensen (2014) studied the link between feedback and performance and introduced the positivity ratio concept, a mechanism that plays an important role in determining performance through feedback. It has been found that prompt feedback helps develop a strong linkage between faculty and students, which ultimately leads to better learning outcomes (Simsek et al., 2017; Chang, 2011).

The fourth factor is students’ expectations. Appleton-Knapp and Krentler (2006) measured the impact of students’ expectations on their performance and pinpointed that student expectations are important. When students’ expectations are met, this leads to higher satisfaction (Bates & Kaye, 2014). These findings were backed by the earlier Student Satisfaction Index Model (Zhang et al., 2008). However, when students’ expectations are not fulfilled, this might lead to lower learning and satisfaction with the course. Student satisfaction is defined as students’ ability to compare the desired benefit with the observed effect of a particular product or service (Budur et al., 2019). Students with high grade expectations show higher satisfaction than those with lower grade expectations.

Scrutiny of the literature shows that although different researchers have examined the factors affecting student satisfaction, no study has examined the effect of course design, quality of the instructor, prompt feedback, and students’ expectations on students’ satisfaction with online classes during the COVID-19 pandemic. Therefore, this study explores the factors that affect students’ satisfaction and performance with online classes during the pandemic. The pandemic compelled educational institutions to move online, a mode with which neither teachers nor learners were acquainted, and students were not mentally prepared for such a shift. This research therefore examines what factors affect students and how students perceived these changes, as reflected in their satisfaction levels.

This paper is structured as follows: the second section describes the theoretical framework, the linkages among the research variables, and the research hypotheses framed accordingly. The third section presents the research methodology of the paper, following APA guidelines. The outcomes and corresponding results of the empirical analysis are then discussed. Lastly, the paper concludes with a discussion and proposes implications for future studies.

Theoretical framework

Achievement goal theory (AGT) is commonly used to understand students' performance. It was proposed by four scholars, Carole Ames, Carol Dweck, Martin Maehr, and John Nicholls, in the late 1970s (Elliot, 2005). Elliott and Dweck (1988, p. 11) state that "an achievement goal involves a program of cognitive processes that have cognitive, affective and behavioral consequence". The theory suggests that students' motivation and achievement-related behaviors can be understood from the purposes and reasons they adopt while engaged in learning activities (Dweck & Leggett, 1988; Ames, 1992; Urdan, 1997). Several studies distinguish four approaches to achieving a goal: mastery-approach, mastery-avoidance, performance-approach, and performance-avoidance (Pintrich, 1999; Elliot & McGregor, 2001; Schwinger & Stiensmeier-Pelster, 2011; Hansen & Ringdal, 2018; Mouratidis et al., 2018). The environment also affects the performance of students (Ames & Archer, 1988). Traditionally, classroom teaching has been an effective method of achieving goals (Ames & Archer, 1988; Ames, 1992; Clayton et al., 2010); in the modern era, however, internet-based teaching is also an effective way to deliver lectures, and web-based applications are becoming modern classrooms (Azlan et al., 2020). Hence, the following sections discuss the relationships between the independent and dependent variables (Fig. 1).

Fig. 1 Proposed Model

Hypotheses development

Quality of the instructor and satisfaction of the students

An instructor's quality, reflected in high enthusiasm for students' learning, has a positive impact on their satisfaction. Quality of the instructor is one of the most critical measures of student satisfaction, leading to the outcome of the education process (Munteanu et al., 2010; Arambewela & Hall, 2009; Ramsden, 1991). If the teacher delivers the course effectively and influences the students to do better in their studies, this leads to student satisfaction and enhances the learning process (Ladyshewsky, 2013). Furthermore, the instructor's understanding of learners' needs also ensures student satisfaction (Kauffman, 2015). Hence, the hypothesis that the quality of the instructor significantly affects the satisfaction of the students was included in this study.

  • H1: The quality of the instructor positively affects the satisfaction of the students.

Course design and satisfaction of students

The course’s technological design is highly persuading the students’ learning and satisfaction through their course expectations (Liaw, 2008 ; Lin et al., 2008 ). Active course design indicates the students’ effective outcomes compared to the traditional design (Black & Kassaye, 2014 ). Learning style is essential for effective course design (Wooldridge, 1995 ). While creating an online course design, it is essential to keep in mind that we generate an experience for students with different learning styles. Similarly, (Jenkins, 2015 ) highlighted that the course design attributes could be developed and employed to enhance student success. Hence the hypothesis that the course design significantly affects students’ satisfaction was included in this study.

  • H2: Course design positively affects the satisfaction of students.

Prompt feedback and satisfaction of students

The emphasis in this study is on understanding the influence of prompt feedback on satisfaction. Feedback provides information about students' effective performance (Chang, 2011; Grebennikov & Shah, 2013; Simsek et al., 2017). Prompt feedback enhances the student learning experience (Brownlee et al., 2009) and boosts satisfaction (O'Donovan, 2017). Prompt feedback is a self-evaluation tool for students (Rogers, 1992) by which they can improve their performance. Eraut (2006) highlighted the impact of feedback on future practice and student learning development. Good feedback practice is beneficial for student learning and helps teachers improve students' learning experience (Yorke, 2003). Hence, the hypothesis that prompt feedback significantly affects satisfaction was included in this study.

  • H3: Prompt feedback positively affects the satisfaction of the students.

Expectations and satisfaction of students

Expectation is a crucial factor that directly influences student satisfaction. Expectation Disconfirmation Theory (EDT) (Oliver, 1980) has been utilized to determine students' level of satisfaction based on their expectations (Schwarz & Zhu, 2015). Understanding students' expectations is a good way to improve their satisfaction (Brown et al., 2014), and recognizing those expectations makes it possible to raise satisfaction levels (ICSB, 2015). Finally, the positive approach used in many online learning classes has been shown to place high expectations on learners (Gold, 2011) and has led to successful outcomes. Hence, the hypothesis that students' expectations significantly affect satisfaction was included in this study.

  • H4: Expectations of the students positively affect their satisfaction.

Satisfaction and performance of the students

Zeithaml (1988) describes satisfaction as the outcome of the performance of an educational institute. According to Kotler and Clarke (1986), satisfaction is the desired outcome of an aim that earns an individual's admiration. Quality interactions between instructor and students lead to student satisfaction (Malik et al., 2010; Martínez-Argüelles et al., 2016). Teaching quality and course material enhance student satisfaction through successful outcomes (Sanderson, 1995). Satisfaction relates to student performance in terms of motivation, learning, assurance, and retention (Biner et al., 1996). Mensink and King (2020) described performance as the conclusion of student-teacher efforts that shows students' interest in their studies. The critical element in education is students' academic performance (Rono, 2013); it is considered the central pole around which the entire education system revolves. Narad and Abdullah (2016) concluded that students' academic performance determines academic institutions' success and failure.

Singh et al. (2016) asserted that student academic performance directly influences a country's socio-economic development. Farooq et al. (2011) highlight that students' academic performance is the primary concern of all faculties; moreover, it is the main foundation of knowledge gain and skill improvement. According to Narad and Abdullah (2016), regular evaluations or examinations over a specific period of time are essential for assessing students' academic performance and achieving better outcomes. Hence, the hypothesis that satisfaction significantly affects the performance of the students was included in this study.

  • H5: Students’ satisfaction positively affects the performance of the students.

Satisfaction as mediator

Sibanda et al. (2015) applied goal theory to examine the factors influencing students' academic performance, highlighting the significance students attach to their satisfaction and academic achievement. According to this theory, students perform well if they know about the factors that have an impact on their performance. Among the variables above, institutional factors that influence student satisfaction and, through it, performance include course design and quality of the instructor (DeBourgh, 2003; Lado et al., 2003), as well as prompt feedback and expectations (Fredericksen et al., 2000). Hence, the hypothesis that quality of the instructor, course design, prompt feedback, and students' expectations significantly affect students' performance through satisfaction was included in this study.

  • H6: Quality of the instructor, course design, prompt feedback, and students' expectations affect the students' performance through satisfaction.
  • H6a: Students’ satisfaction mediates the relationship between quality of the instructor and student’s performance.
  • H6b: Students’ satisfaction mediates the relationship between course design and student’s performance.
  • H6c: Students’ satisfaction mediates the relationship between prompt feedback and student’s performance.
  • H6d: Students' satisfaction mediates the relationship between students' expectations and student's performance.

Participants

In this cross-sectional study, data were collected from 544 respondents studying management (B.B.A. or M.B.A.) and hotel management courses, using a purposive sampling technique. Descriptive statistics show that 48.35% of the respondents were MBA or BBA students and the rest were hotel management students. Male students accounted for 71% of the sample and female students for 29%, so the share of male students was almost double that of females. The students' ages ranged from 18 to 35. The dominant group, 94% of the sample, was undergraduate students aged 18 to 22; the remaining 6% were postgraduate students.

The research instrument consists of two sections. The first section covers demographic variables such as discipline, gender, age group, and education level (undergraduate or postgraduate). The second section measures the six factors: instructor's quality, course design, prompt feedback, students' expectations, satisfaction, and performance. These attributes were taken from previous studies (Yin & Wang, 2015; Bangert, 2004; Chickering & Gamson, 1987; Wilson et al., 1997). "Instructor quality" was measured through the seven-item scale developed by Bangert (2004). The "course design" (six items) and "prompt feedback" (five items) scales were adapted from Bangert (2004). The "students' expectations" scale consists of five items, four adapted from Bangert (2004) and one taken from Wilson et al. (1997). Students' satisfaction was measured with six items taken from Bangert (2004), Wilson et al. (1997), and Yin and Wang (2015). "Students' performance" was measured through the six-item scale developed by Wilson et al. (1997). These variables were assessed on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Only students from India took part in the survey. A total of thirty-four questions were asked to check the effect of the first four variables on students' satisfaction and performance. For full details of the questionnaire, see Appendix Table 6.

The study used a descriptive research design. The factors "instructor quality, course design, prompt feedback and students' expectation" were the independent variables, students' satisfaction was the mediator, and students' performance was the dependent variable.

In this cross-sectional research, the respondents were selected through judgment sampling. They were informed about the objective of the study and the information-gathering process, were assured of the confidentiality of the data, and received no incentive for participating. The information used for this study was gathered through an online survey: the questionnaire was built in Google Forms and circulated by email. Students were also asked to name their college, and fifteen colleges across India took part. The data were collected during the COVID-19 pandemic, in the period of total lockdown in India. This was an apt time to collect data on the research topic because all colleges across India were holding online classes, so students had enough time to understand the instrument and respond to the questionnaire effectively. A total of 615 questionnaires were circulated, of which 574 were returned by the students. Thirty responses were excluded as unengaged, so 544 questionnaires were used in the present investigation. Both male and female students, different age groups, and both undergraduate and postgraduate students of management and hotel management were part of the sample.

Exploratory factor analysis (EFA)

To analyze the data, SPSS and AMOS software were used. First, to extract the distinct factors, an exploratory factor analysis (EFA) was performed using VARIMAX rotation on the sample of 544. The exploratory analysis rendered six distinct factors. Factor one was named quality of instructor, with items such as "The instructor communicated effectively", "The instructor was enthusiastic about online teaching", and "The instructor was concerned about student learning". Factor two was labeled course design, with items such as "The course was well organized", "The course was designed to allow assignments to be completed across different learning environments.", and "The instructor facilitated the course effectively". Factor three was labeled prompt feedback, with items such as "The instructor responded promptly to my questions about the use of Webinar" and "The instructor responded promptly to my questions about general course requirements". The fourth factor was students' expectations, with items such as "The instructor provided models that clearly communicated expectations for weekly group assignments" and "The instructor used good examples to explain statistical concepts". The fifth factor was students' satisfaction, with items such as "The online classes were valuable" and "Overall, I am satisfied with the quality of this course". The sixth factor was students' performance, with items such as "The online classes has sharpened my analytic skills" and "Online classes really tries to get the best out of all its students". These six factors explained 67.784% of the total variance. To validate the factors extracted through EFA, the researchers performed confirmatory factor analysis (CFA) through AMOS. Finally, structural equation modeling (SEM) was used to test the hypothesized relationships.
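The EFA itself was run in SPSS; as an illustration of what the VARIMAX rotation step does, the sketch below implements the classic VARIMAX criterion in NumPy and applies it to a small invented loading matrix (six items on two factors; these numbers are hypothetical, not the study's loadings).

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate a factor-loading matrix using the VARIMAX criterion."""
    p, k = loadings.shape
    R = np.eye(k)        # orthogonal rotation matrix, refined iteratively
    var = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # SVD step of the standard varimax algorithm
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0)))
        )
        R = u @ vt
        if s.sum() - var < tol:
            break
        var = s.sum()
    return loadings @ R, R

# toy unrotated loadings: 6 items on 2 latent factors (invented numbers)
A = np.array([[0.60, 0.60], [0.70, 0.50], [0.60, 0.55],
              [0.60, -0.50], [0.70, -0.55], [0.65, -0.60]])
rotated, R = varimax(A)
# after rotation, each item loads mainly on one factor ("simple structure"),
# which is what makes the six factors interpretable and nameable
```

Because the rotation matrix is orthogonal, each item's communality (sum of squared loadings) is unchanged; only the distribution of loading across factors shifts.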

Measurement model

Table 1 summarizes the findings of the EFA and CFA: the EFA rendered six distinct factors, and the CFA validated them. Table 2 shows that the proposed measurement model achieved good convergent validity (Aggarwal et al., 2018a, b). Results of the confirmatory factor analysis showed that the standardized factor loadings were statistically significant at the 0.05 level. Further, the measurement model showed acceptable fit indices: CMIN = 710.709; df = 480; CMIN/df = 1.481; p < 0.001; Incremental Fit Index (IFI) = 0.979; Tucker-Lewis Index (TLI) = 0.976; Goodness of Fit Index (GFI) = 0.928; Adjusted Goodness of Fit Index (AGFI) = 0.916; Comparative Fit Index (CFI) = 0.978; Root Mean Square Residual (RMR) = 0.042; Root Mean Squared Error of Approximation (RMSEA) = 0.030.
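Two of these indices are simple functions of the chi-square statistic, its degrees of freedom, and the sample size, so they can be recomputed directly from the reported values. The sketch below (plain Python, using only the figures above) reproduces CMIN/df and the chi-square-based RMSEA.

```python
import math

def cmin_df(chi2, df):
    # relative chi-square; values below 3 are commonly read as good fit
    return chi2 / df

def rmsea(chi2, df, n):
    # Root Mean Squared Error of Approximation from the chi-square statistic
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# values reported for the measurement model (n = 544 respondents)
chi2, df, n = 710.709, 480, 544
relative_chi2 = cmin_df(chi2, df)   # 1.481
approx_error = rmsea(chi2, df, n)   # ~0.030, matching the reported RMSEA
```

That the recomputed RMSEA agrees with the reported 0.030 is a useful sanity check on the measurement-model statistics.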

Factor Analysis

Author’s Compilation

Validity analysis of measurement model

Author’s compilation

AVE is the Average Variance Extracted, CR is Composite Reliability

The bold diagonal value represents the square root of AVE

According to the accepted criterion, the Average Variance Extracted (AVE) should be higher than the squared correlations between the latent variable and all other variables. Discriminant validity is confirmed (Table 2), as the square root of each AVE is greater than the inter-construct correlation coefficients (Hair et al., 2006). Additionally, discriminant validity exists when there is a low correlation between each variable's measurement indicators and all other variables except the one with which it is theoretically associated (Aggarwal et al., 2018a, b; Aggarwal et al., 2020). The results in Table 2 show that the measurement model achieved good discriminant validity.
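AVE, composite reliability (CR), and the Fornell-Larcker comparison described above are all short formulas over the standardized loadings. A minimal sketch, using hypothetical loadings for a single construct rather than the values in Table 2:

```python
import numpy as np

def ave(loadings):
    # Average Variance Extracted: mean of the squared standardized loadings
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam**2))

def composite_reliability(loadings):
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return float(num / (num + np.sum(1.0 - lam**2)))

# hypothetical standardized loadings for one construct (not Table 2's values)
lam = [0.78, 0.81, 0.74, 0.80]
construct_ave = ave(lam)            # ~0.61, above the common 0.50 cutoff
cr = composite_reliability(lam)     # ~0.86, above the common 0.70 cutoff

# Fornell-Larcker criterion: sqrt(AVE) must exceed the construct's
# correlation with every other construct
inter_construct_r = 0.55            # hypothetical inter-construct correlation
passes = np.sqrt(construct_ave) > inter_construct_r
```

This is exactly the check behind the bold diagonal in Table 2: each diagonal entry (the square root of AVE) must dominate the off-diagonal correlations in its row and column.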

Structural model

To test the proposed hypotheses, the researchers used structural equation modeling (SEM). This multivariate statistical technique combines factor analysis and multiple regression analysis and is used to analyze the structural relationships between measured variables and latent constructs.
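The study estimated the model in AMOS with latent constructs; as a rough illustration of the path structure only, the sketch below approximates the two structural equations (four predictors → satisfaction → performance) with ordinary least squares on simulated data. All variables and path weights here are hypothetical, chosen only to mirror the sign pattern of Table 4; OLS on observed scores is not a substitute for full SEM with latent variables.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 544

# hypothetical standardized predictors (simulated, not the study's data)
instructor = rng.normal(size=n)
design = rng.normal(size=n)
feedback = rng.normal(size=n)
expectations = rng.normal(size=n)

# two structural equations with assumed path weights
satisfaction = (0.70 * instructor + 0.06 * design + 0.07 * feedback
                + 0.15 * expectations + rng.normal(scale=0.5, size=n))
performance = 0.19 * satisfaction + rng.normal(scale=0.9, size=n)

def ols(predictors, y):
    # ordinary least squares with an intercept; returns slopes only
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

# estimated path coefficients recover the assumed weights (up to noise)
paths_to_satisfaction = ols([instructor, design, feedback, expectations],
                            satisfaction)
path_to_performance = ols([satisfaction], performance)[0]
```

The point of the sketch is the model's shape: satisfaction is regressed on the four exogenous factors, and performance on satisfaction, matching the paths tested as H1-H5.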

Table 3 presents the structural model's fit indices: with all variables put together, CMIN/DF is 2.479, and all model fit values are within the acceptable range, meaning the model attained a good fit. Other fit indices, such as GFI = 0.982 and AGFI = 0.956, are also supportive (Schumacker & Lomax, 1996; Marsh & Grayson, 1995; Kline, 2005).

Criterion for model fit

Hence, the model fitted the data successfully. All co-variances among the variables and regression weights were statistically significant ( p  < 0.001).

Table 4 presents the relationships between the exogenous, mediator, and endogenous variables: quality of instructor, prompt feedback, course design, students' expectations, students' satisfaction, and students' performance. The first four factors are positively related to satisfaction, which in turn positively affects students' performance. Results show that the instructor's quality has a positive relationship with students' satisfaction with online classes (SE = 0.706, t-value = 24.196; p < 0.05); hence, H1 was supported. The second factor, course design, has a positive relationship with students' satisfaction (SE = 0.064, t-value = 2.395; p < 0.05); hence, H2 was supported. The third factor, prompt feedback, also shows a positive relationship with students' satisfaction (SE = 0.067, t-value = 2.520; p < 0.05); hence, H3 was supported. The fourth factor, students' expectations, is likewise positively related to students' satisfaction with online classes (SE = 0.149, t-value = 5.127; p < 0.05); hence, H4 was supported. The SEM results show that, among quality of instructor, prompt feedback, course design, and students' expectations, the factor most strongly affecting students' satisfaction was instructor's quality (SE = 0.706), followed by students' expectations (SE = 0.149) and prompt feedback (SE = 0.067); course design (SE = 0.064) affected students' satisfaction least. Finally, Table 4 shows that students' satisfaction has a positive effect on students' performance (SE = 0.186, t-value = 2.800; p < 0.05); hence, H5 was supported.

Structural analysis

Table 5 shows that students' satisfaction partially mediates the positive relationship between the instructor's quality and student performance; hence, H6(a) was supported. The mediation analysis further showed that satisfaction partially mediates the positive relationship between course design and student performance; hence, H6(b) was supported. However, satisfaction fully mediates the positive relationship between prompt feedback and student performance; hence, H6(c) was supported. Finally, Table 5 shows that satisfaction partially mediates the positive relationship between students' expectations and student performance; hence, H6(d) was supported.

Mediation Analysis
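A standard way to test indirect effects of this kind is a percentile bootstrap of the a×b product (the effect of the predictor on the mediator times the effect of the mediator on the outcome, controlling for the predictor). The sketch below illustrates the idea on simulated data following the feedback → satisfaction → performance chain; it is not the study's data or its AMOS mediation procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 544  # same sample size as the study

# simulated data following the hypothesized chain (NOT the study's data)
feedback = rng.normal(size=n)
satisfaction = 0.5 * feedback + rng.normal(size=n)
performance = 0.6 * satisfaction + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b: slope of m on x, times slope of y on m controlling for x."""
    xc = x - x.mean()
    a = float(xc @ (m - m.mean()) / (xc @ xc))
    X = np.column_stack([np.ones_like(x), x, m])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a * float(beta[2])

# percentile bootstrap of the indirect effect (1000 resamples)
boot = np.array([
    indirect_effect(feedback[idx], satisfaction[idx], performance[idx])
    for idx in (rng.integers(0, n, n) for _ in range(1000))
])
lo, hi = np.percentile(boot, [2.5, 97.5])
# mediation is supported when the 95% CI excludes zero
```

Full versus partial mediation is then judged by whether the direct path from the predictor to the outcome remains significant once the mediator is in the model.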

In the present study, the authors evaluated the different factors directly linked with students' satisfaction and performance in online classes during COVID-19. Owing to the global pandemic, all colleges and universities were shifted to online mode by their respective governments. No one knew how long the pandemic would last, so teaching moved online, and even educators who were not tech-savvy updated themselves to battle the unexpected circumstances (Pillai et al., 2021). The present study's results will help educators increase students' satisfaction and performance in online classes, and the research assists educators in understanding the different factors required for online teaching.

Whereas past studies have examined the factors affecting student satisfaction in the conventional schooling framework, the present study was conducted during India's lockdown period to identify the prominent factors that drive students' satisfaction with online classes. The study also explored the direct linkage between students' satisfaction and their performance. The findings indicate that instructor quality is the most prominent factor affecting students' satisfaction during online classes. This means the instructor needs to be very efficient during lectures and to understand students' psychology in order to deliver the course content effectively. If the teacher delivers the course content properly, it improves students' satisfaction and performance. The teachers' perspective is critical because their enthusiasm leads to better quality in the online learning process.

The present study highlighted that the second most prominent factor affecting students' satisfaction during online classes is students' expectations. Students come to classes with certain expectations; if the instructor understands them and customizes the course design accordingly, the students can be expected to perform better in the examinations. The third factor affecting students' satisfaction is feedback. After delivering the course, instructors should gather appropriate feedback to plan future courses; it also helps in making future strategies (Tawafak et al., 2019). There must be a proper feedback system for improvement, because feedback is the real image of the course content. The last factor affecting students' satisfaction is course design. The course content needs to be designed effectively so that students can easily understand it; if the instructor plans the course so that students understand the content without problems, this leads to satisfaction, and students can perform better in exams. In some situations the course content is difficult to deliver online, such as practical components (e.g., recipes of dishes or demonstrations in the lab). In such situations, the instructor needs to be more creative in designing and delivering the content so that it positively affects students' overall satisfaction with online classes.

Overall, the students agreed that online teaching was valuable for them, even though the online mode of classes was a first experience during the COVID-19 pandemic (Agarwal & Kaushik, 2020; Rajabalee & Santally, 2020). Some previous studies suggest that technology-supported courses have a positive relationship with students' performance (Cho & Schelzer, 2000; Harasim, 2000; Sigala, 2002). Demographic characteristics also play a vital role in understanding online course performance. According to the APA Work Group of the Board of Educational Affairs (1997), learner-centered principles suggest that students must be willing to invest the time required to complete individual course assignments. Online instructors must be enthusiastic about developing genuine instructional resources that actively connect learners and encourage them toward proficient performance. Both teachers and students share responsibility for better performance: when learners face problems understanding concepts, they should seek solutions from the instructor (Bangert, 2004). Thus, we can conclude that "instructor quality, student's expectation, prompt feedback, and effective course design" significantly impact students' online learning process.

Implications of the study

The results of this study have numerous significant practical implications for educators, students, and researchers. The study also contributes to the literature by demonstrating that multiple factors are jointly responsible for student satisfaction and performance in the context of online classes during the COVID-19 pandemic. This study differs from previous studies (Baber, 2020; Ikhsan et al., 2019; Eom & Ashill, 2016), none of which examined the effect of students' satisfaction on their perceived academic performance. Previous empirical findings have highlighted the importance of examining the factors affecting student satisfaction (Maqableh & Jaradat, 2021; Yunusa & Umar, 2021), but none has examined the effect of course design, quality of instructor, prompt feedback, and students' expectations on students' satisfaction all together in online classes during the pandemic. The present study tries to fill this research gap.

The first essential contribution of this study is that the instructor's facilitating role and competence affect students' level of satisfaction (Gray & DiLoreto, 2016). Instructors who taught online courses during the pandemic carried an extra obligation: they had to adapt to a changing climate, polish their technical skills throughout the process, and foster new students' technical knowledge in this environment. The present study's findings indicate that instructor quality is a significant determinant of student satisfaction during online classes amid a pandemic. In higher education, the teacher's standard refers to the instructor's specific individual characteristics before entering the class (Darling-Hammond, 2010), including content knowledge, pedagogical knowledge, inclination, and experience. More significantly, deeper understanding can be imparted by those who have substantial technical expertise in the areas they teach (Martin, 2021). Secondly, the results contribute to the education profession by illustrating a realistic approach that can be used to recognize students' expectations effectively. The primary expectation of most students before joining a university is employment, and instructors have agreed that they should do more to fulfill students' employment expectations (Gorgodze et al., 2020). The instructor can use this to balance expectations and improve student satisfaction. The results can be used to continually improve and build courses, as well as to make policy decisions to improve education programs.
Thirdly, the results show how online course designers and instructors can structure online courses more efficiently, including design features that minimize negative and maximize positive emotion, contributing to greater student satisfaction (Martin et al., 2018). The findings suggest that course design has a substantial positive influence on student performance in online classes: the design needs to provide essential details such as course content, educational goals, course structure, and course outputs in a consistent manner, so that students find the e-learning system beneficial, use it, and in turn perform better (Almaiah & Alyoussef, 2019). Lastly, the results indicate that instructors should respond to questions promptly and provide timely feedback on assignments; such techniques help students in online courses by improving instructor participation, instructor interaction, understanding, and engagement (Martin et al., 2018). Feedback helps students focus on the aspects of performance that enhance their learning.

Limitations and future scope of the study

The data collected in this study were cross-sectional, which makes it difficult to establish causal relationships between the variables; future research could use a longitudinal design to address this limitation. Further, the data were collected from only one type of respondent, the students, so the results cannot be generalized to other samples; future research could also include the perspectives of teachers and policy makers for greater generalizability. The current research is limited to theory classes, so future work could examine students' performance in practical classes. The study covers Indian students only; data collected from several countries would allow better comparative results on students' perspectives. This study is limited to students' performance, so future work could examine teachers' performance under similar conditions. Finally, students may face issues such as limited internet access, disturbance due to weak signals, or home-environment problems such as interruptions by family members, which may harm performance; these points can be incorporated into future research.

Declarations

Not applicable.

The authors declare no conflict of interest, financial or otherwise.

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Ram Gopal, Email: [email protected] .

Varsha Singh, Email: [email protected] .

Arun Aggarwal, Email: [email protected] .




  14. Online Education and Its Effective Practice: A Research Review

    gued that effective online instruction is dependent upon 1) w ell-designed course content, motiva t-. ed interaction between the instructor and learners, we ll-prepared and fully-supported ...

  15. Research on Online Learner Modeling and Course Recommendation Based on

    Therefore, in order to improve the accuracy of course recommendation, it is necessary to build an accurate and complete learner model. In order to improve the application effect of recommendation, this paper focuses on the recommendation method of emotional factors to improve the recommendation efficiency of learning resources.

  16. Research on Online Learners' Course Recommendation System Based on

    The results show that the overall performance of online learner course recommendation system based on knowledge is excellent. References 1 Liu Z. B. , Song W. A. , Kong X. Y. , and Li H. Y. , " Research on learner modeling and learning resource recommendation in cloud environment ," E-Education Research , vol. 38 , no. 7 , pp. 58 - 63 ...

  17. Recommending Online Course Resources Based on Knowledge Graph

    The online course recommendation problem is to recommend courses (items) to students (users). In the course recommendation scenario, ... in part by the Education and Teaching Research Project of Shandong Province, in part by the Taishan Scholar Program of Shandong Province, in part by the University-Industry Collaborative Education Program ...

  18. Online Learning Recommendations

    This recommendation suggests that feedback and support for students should be individualized for online learning, rather than given to the entire group. This does not necessarily mean that one should avoid providing scaffolds (such as guiding questions) to the entire group or that teachers necessarily need to work with individual students, only ...

  19. Course Recommendations in Online Education Based on Collaborative

    In this paper, a personalized online education platform based on a collaborative filtering algorithm is designed by applying the recommendation algorithm in the recommendation system to the online education platform using a cross-platform compatible HTML5 and high-performance framework hybrid programming approach. The server-side development adopts a mature B/S architecture and the popular ...

  20. Full article: "I Don't Think the System Will Ever be the Same

    During the last twenty years, online course-taking expanded rapidly in postsecondary institutions, particularly in broad-access institutions: During the 2016-17 academic year, 39% of students at nonselective institutions were enrolled in online courses (Xu & Xu, Citation 2019).However, the COVID-19 pandemic spurred an explosive expansion of remote instruction on an emergency basis as ...

  21. Research on Online Course Recommendation Model Based on Improved

    Aiming at the problem of sparse data and poor recommendation effect in online course recommendation, this paper proposes an improved online course intelligent recommendation model based on user implicit behavior collaborative filtering. Through data analysis, the implicit behavior data such as user login details, learning details and course selection details are mined, and the online course ...

  22. Best Research Courses Online with Certificates [2024]

    When you take research courses on Coursera, you can learn about research methods and techniques or dig deeper into specific topics like clinical research, research for social work, or market research and consumer behavior. You can learn about how to write and publish a scientific paper, understand concepts of research design, explore data ...

  23. PDF STUDENT EXPERIENCES IN ONLINE COURSES A Qualitative Research Synthesis

    tion. Students who take online courses tend to be slightly older than those students taking all courses offline (Doyle, 2009). Several impor-tant studies have documented that these stu-dents have good learning outcomes in online courses. Such research most frequently com-pares online to offline courses in experimental

  24. Introduction to Good Clinical Practice Course by Novartis

    During this course, you will explore the essential elements of Good Clinical Practice and gain insights into its significance in the global clinical research arena. By the end of the course, you will have a solid understanding of the principles of GCP and its role in ensuring the integrity and reliability of clinical trial data.

  25. Hierarchical Reinforcement Learning Recommendation Method based on Deep

    With the rapid development of MOOC platforms, course recommendation has become a focal point in the research field of recommendations. Effectively modeling user preferences is a crucial task within this context. Despite the notable achievements of reinforcement learning methods in simulating user preferences, there are still existing issues with current approaches. This paper introduces a ...

  26. Research and Application of Online Course Recommendation System Based

    In recent years, with the rapid development of Internet technology, a large number of online learning resources have emerged. Especially affected by the COVID-19 epidemic, online learning has become a very effective learning means. However, a large number of learning platforms and massive online teaching resources have the following three problems: 1) The quality of these courses is uneven and ...

  27. Setting a new bar for online higher education

    The course content follows a storyline, and each course is presented as a crucial piece in an overall learning journey. 5. Utilize adaptive learning tools. Online higher education pioneers deliver adaptive learning using AI and analytics to detect and address individual students' needs and offer real-time feedback and support.

  28. Impact of online classes on the satisfaction and performance of

    Online classes has encouraged me to develop my own academic interests as far as possible: 3.17: 0.76: 0.723: ... The present study results will help the educators increase the student's satisfaction and performance in online classes. The current research assists educators in understanding the different factors that are required for online ...

  29. What Is a Data Scientist? Salary, Skills, and How to Become One

    Communicate recommendations to other teams and senior staff ... modeling, and forecasting, and potentially conduct your own research on a topic you care about. Several data science master's degrees are available online. 2. Sharpen relevant skills. If you feel like you can polish some of your hard data skills, think about taking an online ...

  30. CS50's Introduction to Artificial Intelligence with Python

    By enabling new technologies like self-driving cars and recommendation systems or improving old ones like medical diagnostics and search engines, the demand for expertise in AI and machine learning is growing rapidly. This course will enable you to take the first step toward solving important real-world problems and future-proofing your career.