International Review of Research in Open and Distributed Learning

Volume 23, Number 4

November - 2022

 

Development of the Online Course Overload Indicator and the Student Mental Fatigue Survey

 

Gail Alleyne Bayne and Fethi A. Inan
Texas Tech University

 

Abstract

The purpose of this study is to develop and examine the psychometric properties of the Online Course Overload Indicator (OCOI) and the Student Mental Fatigue Survey (SMFS). The OCOI was designed to measure students’ perceptions of cognitive overload in online courses. The SMFS was used to assess students’ perceptions of mental fatigue while taking online courses. An exploratory factor analysis was conducted on a sample of 378 undergraduate students from various institutions offering online courses across the United States. Results of factor and reliability analyses indicated that the instruments are valid and reliable measures of students’ perceived mental fatigue and overload from online course elements. The analysis supported a model in which students’ perceptions of overload in online courses consist of four constructs (information relevance, information overload, course design, and facilitation), along with a one-factor structure for the SMFS consisting of the student mental fatigue construct.

Keywords: mental fatigue, cognitive overload, online learning, online course design, student support, online course development

Development of the Online Course Overload Indicator and the Student Mental Fatigue Survey

Mental fatigue is not a novel concept; it has received attention through empirical studies in areas such as the military, aviation, driving, health care, and shift work (Ackerman, 2011). Numerous research studies have been conducted on the effects of mental fatigue on employees in work-related environments and clinical settings (e.g., Al Ma’mari et al., 2020; Sarkar & Parnin, 2017). Researchers have also reported on mental fatigue in children and adolescents (Mizuno et al., 2011; Palmer, 2013). Mental fatigue has thus typically been treated as a problem of clinical and workplace settings; however, limited published research exists on mental fatigue in online course design and its effects on students’ cognitive functioning.

With the growth in online courses and programs, students are spending more and more time doing coursework and learning online, and their cognitive functions may be overtaxed to the extent of experiencing mental fatigue. Researchers have found that mental fatigue can play a part in the disruption of cognitive functioning (Boksem & Tops, 2008) and information processing, causing reduced attention and lack of focus on the task (van der Linden, 2011). The effects of mental fatigue may be attributed to factors in the online environment that are beyond students’ control, such as information overload, poor course design, and a lack of instructor facilitation, to name a few. Such factors would need further examination to better understand their effects on students’ level of mental fatigue in the online environment.

Mental Fatigue

Mental fatigue, sometimes referred to as cognitive fatigue or brain fatigue, is one type of fatigue that is often researched from a performance and motivation perspective. Beiske and Svensson (2010) define mental fatigue as measuring “the subjective feeling of being mentally exhausted, encompassing items such as concentration, memory and speech” (p. 78). DeLuca (2005) describes mental fatigue as “a decrement in performance from excess mental effort” (p. 8). Hockey (2013) refers to it as “an unfocused mental state (distraction, frustration, discomfort)” (p. 1). Boksem et al. (2005) describe mental fatigue as “the effects that people may experience after or during prolonged periods of cognitive activity” (p. 107). The common element in these definitions points to the underlying cognitive processes that are affected by the phenomenon of mental fatigue.

Mental Fatigue and Cognitive Load

The theoretical framework that guides and supports this study is cognitive load theory (CLT) (Plass et al., 2010). CLT is based on the premise that working memory has limited capacity and learners can process only small amounts of information at any one time (Miller, 1994). Consequently, some instructional design practices or strategies can impose extra or unnecessary mental effort and may contribute to mental fatigue that constrains learning and performance (Clark & Mayer, 2016). Mental load is the load imposed by the task or sequence of information in the instruction (Sweller et al., 1998). Mental effort is the amount of capacity allocated to the demands imposed by the instruction (Sweller et al., 1998). Together, mental load and mental effort make up cognitive load (Ayres, 2006). Cognitive overload occurs when the mental load exceeds mental capacity (Clark & Mayer, 2016). Cognitive overload is not itself mental fatigue, but it can trigger it: when demands exceed mental capacity during or after prolonged, taxing cognitive activity, the brain becomes mentally exhausted. When students are fatigued, they become disengaged, frustrated, and stressed, and their capacity to learn is diminished.

Online Versus Traditional Classroom Environments

There have been many studies on mental fatigue in work-related and clinical settings; however, only a few studies have examined mental fatigue in educational settings (Csathó et al., 2012; Mizuno et al., 2011). The phenomenon has been largely overlooked in online environments, and there have been only a limited number of studies on information overload and fatigue in online environments (Lee et al., 2016; Tugtekin, 2022). However, researchers have found that there are challenges in online environments that might not exist in traditional classrooms, such as student perceptions of isolation; learner frustration, anxiety, and confusion; lack of community; lack of instructor engagement and immediate response; information overload; and challenges with technology, including access to a reliable Internet connection (Holmes & Reid, 2017). These challenges may affect the learner and the online learning experience and could be indicators that the online environment is causing undue mental fatigue.

Sources of Mental Fatigue

Cognitive overload caused by greater mental effort, task difficulty, and design of instruction (Plass et al., 2010) is a contributor to increased levels of mental fatigue (Balkin & Wesensten, 2011). Hence, factors that directly impact students, including course design, facilitation, information overload, and information relevance, are considered potential sources of mental fatigue. The following sections discuss each one of these sources as it relates to the online learning context.

Course Design

Course design is operationalized in this study as the organization, format, and structure of the online course, including the components that make up its structure (e.g., multimedia elements, visual design elements, organization). Clark and Mayer (2016) propose a set of multimedia principles that can be applied to the design of online courses to avoid overloading learners with extraneous content and to promote student learning. Additionally, a well-organized online platform and effectively designed learning materials can help students engage in active learning by decreasing cognitive load. The psychological rationale for effective course design is to help learners devote their cognitive capacity to the relevant instructional goals by reducing irrelevant processing of information, thereby minimizing cognitive load and mental fatigue.

Facilitation

Facilitation is operationalized in this study as the level of instructor presence, instructor immediacy, and feedback provided in an online course. Facilitation in the online environment is a fundamental element for student learning, satisfaction, and cognitive overload (Wanstreet, 2006). Some researchers have argued that instructor facilitation is important to “support and enhance social and cognitive presence for the purpose of realizing educational outcomes” (Garrison et al., 1999, p. 90). Researchers also note that various forms of instructor behaviors, such as frequently interacting with students, using informality and casualness, returning phone/e-mail messages, and being accessible to students, to name a few (see O’Sullivan et al., 2004, for more cues), can be incorporated via the course design and written interactions (Baker, 2010).

Information Overload

The meaning of information overload can differ depending on the research context. This study adopted the definition proposed by Lee et al. (2016): information overload occurs when individuals “are exposed to more information than they can accommodate in their capacity for information processing” (p. 53). Two main determinants of information overload are human information-processing capacity and the complexity of the information to be processed (Sweller, 2008). Plass et al. (2010) state that cognitive load arises from the design of the materials, the difficulty of the material to be learned, and the mental effort required to process the new information. Furthermore, the type of cognitive load (i.e., intrinsic or extraneous) can contribute to increased levels of mental fatigue, with more difficult tasks consuming more mental effort (Balkin & Wesensten, 2011). Similarly, information overload in the online learning environment can originate from various sources, such as complex course content, an excessive number of readings, numerous topics in one lesson, long videos, and too many resources in the course (Guo et al., 2014).

Information Relevance

Information relevance is an important aspect of any course. Roberson (2013) defines relevance as “the perception that something is interesting and worth knowing” (p. 18). Information relevance is operationalized in this study as the extent to which course content is helpful and relevant to a student’s learning and success in and outside of the online course (Lee et al., 2016). Information relevance can produce an increase in motivation (Keller, 1983) and a decrease in mental load (Roelle et al., 2015). Conversely, information that is irrelevant to current or future needs can conflict with personal motives, goals, and values and lead to greater fatigue due to a lack of motivation (Edwards & Cooper, 2013; Herlambang et al., 2019). Roelle et al. (2015) conclude that specific relevance instructions can lower the amount of extraneous cognitive load that students have to process; students who received specific relevance instructions had more working memory capacity available to execute cognitive processes.

Purpose of the Study

Although the importance of understanding fatigue in learning has been recognized in previous research (Palmer, 2013), no attempt to date has been made to create and validate instruments to measure the underlying concept of student mental fatigue in educational settings (Hafezi et al., 2010). Some previous studies have used self-developed items; for example, Csathó et al. (2012) used a non-standard, one-item statement focusing on student tiredness levels to measure undergraduate and postgraduate students’ levels of subjective fatigue before and after fatigue-inducing mental tasks. Others have attempted to use instruments designed for medical purposes. For example, Mizuno et al. (2011) examined cognitive predictors of fatigue in elementary and junior high school students by using the Chalder Fatigue Scale (Chalder et al., 1993), which is designed to measure the severity of chronic tiredness due to illnesses. Another instrument to measure chronic fatigue, the Checklist Individual Strength questionnaire (Vercoulen et al., 1994), has also been frequently used in studies (Bakker et al., 2009). Unfortunately, these existing instruments, usually designed for medical diagnosis purposes, do not specifically measure how fatigued students feel while doing coursework and do not apply to diverse student populations in online courses.

Furthermore, there is no systematic way, at the time of writing, to help instructors identify specific areas in the online environment that may be causing overload. The repercussions could include poorly designed online courses that lead to student cognitive overload, disengagement, and attrition. The challenge presented to online instructors is recognizing whether online instruction and/or the design of the online environment contributes to mental fatigue. As a result, instruments are needed in the field of education to gather information about sources of student mental fatigue in online courses. Understanding these constructs will enable better decision making that can lead to improved instructional design that minimizes cognitive overload and mental fatigue.

Therefore, the purpose of this study was to develop two instruments—the Online Course Overload Indicator (OCOI) and the Student Mental Fatigue Survey (SMFS). The SMFS examines and confirms the level of mental fatigue students are experiencing while doing coursework, and the OCOI helps to identify where the mental fatigue/overload is coming from within the online environment (i.e., online course design elements). The research question of the study is the following: What are the psychometric properties (i.e., factors to be retained, variance for each factor, reliability of subscales, and interpretation of factors) of the OCOI and SMFS?

Methodology

Participants

This study used a non-probability sample from a Qualtrics panel. The target population was undergraduate students who were at least 18 years of age and currently enrolled in a fully online course, defined by the Online Learning Consortium as a course in which all activities are done online without any required face-to-face components (Mayadas et al., 2015). The instruments were administered after the Thanksgiving break to ensure that students had had enough time to acclimate to the online course and environment. Participants were instructed to complete the survey with respect to any online course they were taking that semester. The panel sample was acquired from Qualtrics Panels, LLC, which collected data from participants enrolled in various distance education institutions/programs across the United States. The company was responsible for sending the survey out through its panel partners to participants, inviting them to complete the online survey in return for incentives, which the company provided. Nunnally’s (1978) widely cited recommendation is that the subject-to-item ratio should be at least 10:1 for exploratory factor analysis. Following this recommendation, the instruments were tested with a large sample to establish validity and reliability (DeVellis, 2003).

Data from 378 undergraduate students enrolled in online courses were used for the analyses in this study. The majority of students were female (82%; 18% male), which could reflect the trend of higher female enrollment in distance education (Guramatunhu, 2015). The students ranged in age from 18 to 50 years (M = 27.15, Mdn = 25.00, SD = 7.57). The ethnic composition of the sample was diverse. Students self-identified with the following ethnicities: White (65.87%), Black/African American (15.87%), Hispanic/Latino (11.11%), Asian (3.17%), multiple (1.85%), American Indian/Alaskan Native (1.32%), and other (0.79%). Most students indicated that this was not their first semester taking an online course (69.3%). In addition, the data revealed that a large portion of students (62.9%) were taking three to six credit hours of online courses. Regarding technical skills, the majority of the students indicated that they were very proficient computer users (70.9%).

Instrument Design and Development

This section addresses the design and development of the OCOI and the SMFS. This study’s inventory design and development process generally followed DeVellis’s (2003) eight-step scale development process. Under DeVellis’s (2003) guidelines, the eight-step process was organized into three distinct stages for this study: stage 1—identification of constructs and development of subscale items (steps 1-3); stage 2—expert review and validation (steps 4-6); and stage 3—factor analysis and scale optimization (steps 7-8).

Stage 1: Identification of Constructs and Development of Subscale Items

In stage 1, the initial identification of the constructs was based on themes that emerged from an extensive literature review and from a pilot study conducted at a four-year public university in the southwestern United States to gather data about the levels of subjective fatigue experienced by online students (Alleyne Bayne, 2016). The pilot study included 63 graduate and undergraduate students from different majors in fully online courses. Results revealed that 36.5% of students had severe levels of mental fatigue, 31.7% had elevated levels, and 31.7% had normal levels (Alleyne Bayne, 2016). Anecdotal evidence about the effort students exerted echoed the levels of frustration reported in previous surveys (Barnard & Paton, 2007; Lambert et al., 2009). Several themes emerged from the analysis, including information relevance, information overload, course design, instructional activities and materials, and student mental fatigue (Alleyne Bayne, 2016). The constructs and their definitions are listed in Table 1.

Table 1

Definition of Constructs

Construct | Definition
Information relevance | The extent to which course content is helpful and relevant to a student’s learning and success in and outside of the online course (Lee et al., 2016)
Information overload | Occurs when individuals “are exposed to more information than they can accommodate in their capacity for information processing” (Lee et al., 2016, p. 53)
Course design | The organization, format, and structure of the online course, including the components that make up the structure (e.g., multimedia elements, design elements)
Instructional activities and materials | The activities and materials that effectively communicate the content and/or instructor’s intent (e.g., instructions/guidelines, assignments, reading materials, multimedia elements) to learners to promote learning
Facilitation | The level of instructor presence, instructor immediacy, and feedback provided in an online course
Mental fatigue | A self-reported feeling of tiredness after a long duration of mental activity; encompasses feelings of anxiety, frustration, and stress

The next step was the exploration of relevant literature on existing validated instruments pertinent to the constructs that were investigated, including mental fatigue (Mota & Pimenta, 2006; Vercoulen et al., 1994), information relevance (Lee et al., 2016), information overload (Chen et al., 2011; Lee et al., 2016), student perceptions of connectedness (Bolliger & Inan, 2012), student perceptions and expectations of online learning (Harris et al., 2011), instructional activities and materials (Roach & Lemasters, 2006), facilitation (Bolliger & Inan, 2012; Harris et al., 2011), and course design (Harris et al., 2011). The creation of the initial item pool was guided by the construct definitions provided in Table 1, keywords that represent the constructs, and an existing set of items from other validated instruments.

Stage 2: Expert Review and Validation

In stage 2, the items were written to represent each construct of interest using the guidelines presented by Worthington and Whittaker (2006) for the generation of an item pool—that is, items should be “clear, concise, readable, distinct, and reflect the scale’s purpose” (p. 813). Decisions were made on the number of items for each scale and the type of response format. Hinkin et al. (1997) recommend four to six items for each construct. The Likert response format was used because it is reported to be the most widely used format in instrument development for measuring unobservable constructs such as attitudes or beliefs (DeVellis, 2003). A preliminary list of 38 items (30 for the OCOI and 8 for the SMFS) on a Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree) was developed.

The item inventory was sent to a panel of experts in distance education and instructional technology for face and content validity evaluation. The panel experts consisted of three faculty members who had terminal degrees and several years of experience in teaching online, designing online environments, developing instruments, and publishing research, and one instructional designer with more than 10 years of experience designing/developing hybrid and online courses. Panel experts were contacted individually via e-mail and were provided with instructions, operational definitions, and a scale and rating form for review and comments/suggestions for improvement of the instrument. Additionally, the experts were provided with working definitions of the constructs (see Table 1) to rate the items in each construct concerning their relevance to the definition (DeVellis, 2003). Changes to the instruments were made based on recommendations from panel experts. The OCOI included 30 items and five subscales as follows: information relevance, information overload, course design, instructional activities and materials, and facilitation. The SMFS comprised eight items on a single subscale.

Stage 3: Data Collection and Analysis

In stage 3, the items were administered, along with other scales, to the panel sample of 378 undergraduate students enrolled in online courses to validate the instruments. The main purpose of this stage was to establish the psychometric properties of the instruments. Preliminary analysis was conducted after the items were administered to the sample. The data were checked for normality and linearity (Wilcox, 2013), and the factorability of the correlation matrix was checked with Bartlett’s test of sphericity (Bartlett, 1951). Principal component analysis was used as a data reduction technique to reduce the observed variables to a smaller number of components (Worthington & Whittaker, 2006). An oblique (Promax) rotation was used because previous theoretical support suggests that the factors underlying the items are correlated (Field, 2009). The number of factors to retain was determined using several methods. Factor extraction and rotation were conducted to obtain the loadings for each factor and to improve the interpretability of the factors (Mertler & Vannatta, 2016).
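
The analyses in this stage were run in SPSS. For illustration only, a comparable workflow can be sketched in Python with the open-source factor_analyzer package; the file name and variable names below are hypothetical, and the snippet is an analogue of the reported procedure rather than the authors’ actual code.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

# Hypothetical file of Likert responses (1-5), one column per OCOI item.
ocoi_items = pd.read_csv("ocoi_responses.csv")

# Factorability checks: Bartlett's test of sphericity and the
# Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy.
chi_square, p_value = calculate_bartlett_sphericity(ocoi_items)
kmo_per_item, kmo_total = calculate_kmo(ocoi_items)
print(f"Bartlett chi2 = {chi_square:.2f}, p = {p_value:.4f}, KMO = {kmo_total:.2f}")

# Principal component extraction with an oblique (Promax) rotation,
# retaining four factors as later suggested by the PA and MAP results.
efa = FactorAnalyzer(n_factors=4, method="principal", rotation="promax")
efa.fit(ocoi_items)
pattern_matrix = pd.DataFrame(efa.loadings_, index=ocoi_items.columns)
print(pattern_matrix.round(3))
```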

Results

OCOI: Exploratory Factor Analysis Results

Five criteria were used to determine the appropriate number of components to retain: eigenvalues, the scree plot, total variance explained, the minimum average partial (MAP) test, and parallel analysis. Upon examination of the eigenvalues, five factors were above the value of 1 and together explained 54.73% of the total variance. Kaiser (1960) recommends retaining all factors with eigenvalues greater than 1. There is a consensus in the literature, however, that the eigenvalue rule is one of the least accurate methods for determining the number of factors to retain (Carpenter, 2018; Costello & Osborne, 2005). Previous research suggests that parallel analysis (PA) is more accurate than other methods for determining the number of factors to retain (Matsunaga, 2010). Velicer’s MAP test is also a validated procedure that was used to decide on the number of factors to retain (O’Connor, 2000). The Statistical Package for the Social Sciences (SPSS) with a supplementary program was used to compute PA and the MAP test for the OCOI. The results of the PA indicated a four-factor structure. Similarly, a four-component solution was suggested by both the original and revised MAP tests. The scree plot was slightly ambiguous but also supported the retention of four factors. Therefore, four factors were retained based on the results of the PA, the MAP test, and the scree plot (see Figure 1).

Figure 1

Parallel Analysis Scree Plot Results
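
Parallel analysis compares the eigenvalues of the observed correlation matrix against eigenvalues obtained from random data of the same dimensions; components are retained only while the observed eigenvalue exceeds the random benchmark. The paper used an SPSS program (O’Connor, 2000) for this step; the following is a minimal NumPy sketch of the same idea, assuming `data` is an array of shape (n_respondents, n_items) holding the item responses.

```python
import numpy as np

def parallel_analysis(data, n_iterations=1000, percentile=95, seed=0):
    rng = np.random.default_rng(seed)
    n, k = data.shape
    # Eigenvalues of the observed correlation matrix, largest first.
    observed = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    # Eigenvalues obtained from random data of the same shape.
    random_eigs = np.empty((n_iterations, k))
    for i in range(n_iterations):
        noise = rng.standard_normal((n, k))
        random_eigs[i] = np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False))[::-1]
    threshold = np.percentile(random_eigs, percentile, axis=0)
    # Retain leading components whose observed eigenvalue exceeds the
    # corresponding percentile of the random-data eigenvalues.
    n_retain = 0
    for obs, thr in zip(observed, threshold):
        if obs > thr:
            n_retain += 1
        else:
            break
    return n_retain, observed, threshold
```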

The four-factor structure yielded a total explained variance of 50.99%. Each retained item loaded distinctly on one of the four factors. Cross-loadings and multiple factor loadings have been identified as evidence of complex items reflecting the influence of more than one factor (Worthington & Whittaker, 2006). Examination of the pattern matrix revealed that the items from one subscale cross-loaded or loaded on multiple factors. Therefore, the items from the instructional activities and materials subscale were eliminated due to low factor loadings and/or loadings on multiple subscales. In Table 2, the retained items are ordered and grouped by the size of their loadings.

Table 2

Summary of Exploratory Factor Analysis Results for the OCOI (n = 378)

Subscale and item | Factor loading
Information relevance
Information will be useful for my current or future job | .794
Information will prepare me with practical knowledge and skills | .767
Information is related to real-life situations | .766
Information contributes to my success in this course | .757
Information is related to my interest(s) outside of the online course | .663
Information presented is what I need to know to be successful in the course | .601
Facilitation
The instructor is actively involved in the online environment | .869
The instructor communicates clearly in writing throughout the course | .848
The feedback provided by the instructor is constructive | .828
The instructor is responsive to my questions | .784
The instructor provides timely feedback | .776
The instructor encourages learner participation in course activities/tasks | .732
Course design
The course tasks/assignments are easy to locate | .769
The course is easy to navigate | .759
The design (e.g., organization, presentation, i.e., look and feel) is consistent throughout the course | .756
The course materials are easy to find | .756
The design elements (e.g., colors, fonts, buttons, use of space) are visually pleasing | .701
The course was logically organized | .660
Information overload
The number of readings is overwhelming | .763
There is an excessive amount of information to process in the course | .728
The course content is complex | .695
There are too many resources in the course | .674
The videos are too long | .621
The number of threads/replies posted in the online discussions is overwhelming | .585
Initial eigenvalue (all items), components 1-4: 8.90, 2.58, 2.07, 1.74
Initial % of variance explained (all items), components 1-4: 29.67, 8.59, 6.91, 5.81

Note. Factor loadings under .50 were removed for the retained constructs.
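
The retention rule reflected in Table 2 and its note can be expressed programmatically. In the sketch below, the .50 primary-loading cutoff follows the table note, while the .20 cross-loading gap is an illustrative assumption rather than a value reported in the paper; `pattern_matrix` is assumed to be an items-by-components pandas DataFrame such as the one produced in the earlier sketch.

```python
import pandas as pd

def screen_items(pattern_matrix: pd.DataFrame,
                 primary_cutoff: float = 0.50,
                 cross_loading_gap: float = 0.20):
    abs_loadings = pattern_matrix.abs()
    primary = abs_loadings.max(axis=1)
    # Second-highest absolute loading per item, used to flag complex items.
    secondary = abs_loadings.apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
    keep = (primary >= primary_cutoff) & (primary - secondary >= cross_loading_gap)
    return pattern_matrix[keep], pattern_matrix[~keep]

# Usage with the pattern matrix from the earlier sketch:
# retained, eliminated = screen_items(pattern_matrix)
```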

The SMFS: Exploratory Factor Analysis Results

The SMFS was examined with principal components analysis. Factor analysis typically requires two steps: (a) factor extraction and (b) factor rotation. After extraction, however, the component loadings of the individual items revealed a single underlying dimension, so no factor rotation was needed. Eigenvalues and the scree plot were examined for factor retention. Only one eigenvalue was above 1, and that factor explained 62.21% of the total variance; the scree plot concurred with the eigenvalues. Therefore, one factor was retained. Only items with a factor loading of at least .50 were interpreted, based on Comrey and Lee’s (1992) factor loading guidelines. All item loadings were above .50, and there were no cross-loading items. In Table 3, the items are ordered by the size of their loadings.

Table 3

Summary of Exploratory Factor Analysis Results for the Student Mental Fatigue Survey (SMFS) (n = 378)

Item | Factor loading
Mental fatigue
I feel stressed when doing coursework | .863
I feel overwhelmed when doing coursework | .856
I feel frustrated when doing coursework | .817
It is difficult to focus when doing coursework | .813
I feel anxiety when doing coursework | .788
I feel confused when doing coursework | .752
I feel tired when doing coursework | .748
It is difficult to relax immediately after doing coursework | .652
Initial eigenvalue: 4.98
Initial % of variance explained: 62.21

Note. All factor loadings were greater than .50 for the retained construct.
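
Because the SMFS was expected to be unidimensional, the check reduces to extracting unrotated principal components and confirming that only the first eigenvalue exceeds 1. A minimal sketch, assuming the eight SMFS items are read from a hypothetical CSV file:

```python
import numpy as np
import pandas as pd

smfs_items = pd.read_csv("smfs_responses.csv")  # hypothetical file, 8 items

corr = smfs_items.corr().to_numpy()
values, vectors = np.linalg.eigh(corr)  # eigenvalues in ascending order
eigenvalues = values[::-1]
print(eigenvalues.round(2))             # only the first should exceed 1
print(f"Variance explained by component 1: {eigenvalues[0] / len(eigenvalues):.2%}")

# Loadings on the first component: the eigenvector scaled by the square
# root of its eigenvalue (signs are arbitrary, so absolute values are shown).
loadings = np.abs(vectors[:, -1]) * np.sqrt(values[-1])
print(pd.Series(loadings, index=smfs_items.columns).round(3))
```

With the reported eigenvalue of 4.98 across eight items, the first component accounts for 4.98 / 8 ≈ 62.2% of the total variance, matching Table 3.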

Reliability Analysis

Internal consistency and reliability analyses were conducted to determine the extent to which items correlated with each other and the degree to which items consistently measured the same construct as the other items within their scale (Slavin, 2007). To determine the reliability of the instruments, Cronbach’s alpha coefficients were calculated. The final model for the OCOI retained 24 items and four subscales. The overall reliability of the OCOI was 0.89. The information relevance, facilitation, course design, and information overload subscales all had moderate to high reliabilities, with Cronbach’s alpha ranging from 0.77 to 0.90. The final SMFS included eight items. Its overall reliability was high (0.91). Further analysis indicated that this alpha would not increase with the deletion of any item. Table 4 includes the number of items, Cronbach’s alpha coefficients, means, and standard deviations for the OCOI and SMFS.

Table 4

Online Course Overload Indicator (OCOI) and Student Mental Fatigue Survey (SMFS) Reliability Summary Statistics

Subscale | No. of items | Cronbach’s α | M (a) | SD
OCOI
Information relevance | 6 | 0.81 | 3.97 | 0.63
Course design | 6 | 0.83 | 3.87 | 0.62
Facilitation | 6 | 0.90 | 3.76 | 0.79
Information overload (b) | 6 | 0.77 | 2.88 | 0.78
SMFS
Mental fatigue (b) | 8 | 0.91 | 2.79 | 0.92

Notes. (a) Mean scores were calculated by adding up the scores of all items loaded on the scale and dividing the total by the number of items on the scale. (b) Negatively phrased items were reverse coded before running the calculations.
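
The statistics in Table 4 can be reproduced from raw responses with a few lines of code. The sketch below computes Cronbach’s alpha from the standard variance formula and applies the two table notes: reverse coding of negatively phrased items (note b) and per-student subscale means (note a). Variable names are hypothetical, and the reverse-coding rule (6 minus the response) assumes the instruments’ 5-point Likert scale.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def subscale_summary(items: pd.DataFrame, negatively_phrased=()):
    scored = items.copy()
    for col in negatively_phrased:
        scored[col] = 6 - scored[col]  # reverse code on a 5-point scale (note b)
    mean_scores = scored.sum(axis=1) / scored.shape[1]  # per-student mean (note a)
    return cronbach_alpha(scored), mean_scores.mean(), mean_scores.std(ddof=1)

# Example: per note b, all SMFS items are treated as negatively phrased.
# alpha, m, sd = subscale_summary(smfs_items, negatively_phrased=smfs_items.columns)
```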

Discussion

The evidence from previous studies suggests that understanding the impact of course design elements on students’ level of mental fatigue is important because mental fatigue may affect learning and performance (Ackerman et al., 2010; Jensen et al., 2013), cognitive flexibility (Plukaard et al., 2015), and exploration of complex tasks (Sarkar & Parnin, 2017). However, there is a lack of research on the effects of course design elements on mental fatigue in online environments. This may be because mental fatigue is a difficult construct to measure and because, at the time of writing, no instruments had been identified that measure student mental fatigue in online environments. Therefore, the OCOI and the SMFS were developed to fill this gap in the literature from an online learning perspective.

Feedback from students regarding course design and implementation elements is important in helping instructors provide quality instruction to learners. A valid and reliable evaluation tool such as the OCOI would therefore help instructors identify specific areas for improvement within the online course (e.g., content, design, and environment). In addition, the SMFS could help instructors understand the mental constraints of their learners in terms of whether they feel overwhelmed, confused, anxious, or frustrated when doing coursework. Additionally, online courses usually attract students from diverse backgrounds (e.g., working professionals needing flexible schedules, nontraditional students). Considering these learners’ mental and physical workloads, improvements in the learning experience could make a difference in their learning outcomes (e.g., content retention and course performance). The four constructs identified on the OCOI capture the student perspective regarding overload indicators in the online environment, and they have been frequently mentioned by online students as sources of frustration (Alleyne Bayne, 2016; Barnard & Paton, 2007; Lambert et al., 2009).

Future Research and Limitations

This study has explored the design of two new instruments regarding online course overload indicators and their effects on students’ mental fatigue. Considering the novelty of the subject studied, several areas can be further explored. Future studies using alternative statistical procedures (e.g., confirmatory factor analysis) could validate the developed instrument. A follow-up with confirmatory factor analysis on a new sample could be useful, as researchers recommend repeating instrument validation with a new data set for optimizing the scale length (Field, 2009; Worthington & Whittaker, 2006). These instruments can also be used to investigate whether course design elements predict mental fatigue in online courses. Additionally, studies can examine the relationship between academic performance and perceived student mental fatigue in an online environment, as well as whether improvements in course design decrease perceived student mental fatigue.

In this study, one limitation was that the researchers were not directly involved in the online course design, which would have given the study a point of reference for the quality of the online course environments. In future studies, researchers may consider directly reviewing and evaluating various online course design elements. The research should be expanded into the actual online course environment with instructor input and expert assessment of the course components along with the student-level data to correlate the constructs and their impact on learning outcomes. Additionally, future studies could involve multiple data collection points to explore the effects of online course design elements on students’ mental fatigue in an online learning environment. Such longitudinal studies would allow researchers to monitor changes in student mental fatigue over time.

Conclusion

Several existing instruments are used for fatigue research in medicine. However, these instruments are not targeted toward the online environment—specifically, the design of online courses. Therefore, an important outcome of this study was developing tools that educators can use to assess whether design elements in online instruction contribute to mental fatigue. Two instruments were created that can assist instructors teaching online to assess student perceptions of cognitive overload and mental fatigue when doing coursework. Specifically, the Online Course Overload Indicator (OCOI) was designed to measure students’ perceptions of cognitive overload in online courses. The Student Mental Fatigue Survey (SMFS) was designed to measure students’ perceptions of mental fatigue while taking online courses. The findings from this study may be helpful for instructors seeking to optimize their online course design to promote better learning experiences for online learners. For instructors who are new to online teaching, as well as for seasoned classroom instructors who have never taught online, the OCOI could be used to gauge where in the online course or environment students are being overloaded so that adjustments can be made as needed. Hence, the knowledge gained from this study could enable practitioners in education to use these tools to facilitate online learning, thereby improving online course content in meaningful and relevant ways that promote student learning and satisfaction.

References

Ackerman, P. L. (2011). Cognitive fatigue: Multidisciplinary perspectives on current research and future applications. American Psychological Association. https://doi.org/10.1037/12343-000

Ackerman, P. L., Kanfer, R., Shapiro, S. W., Newton, S., & Beier, M. E. (2010). Cognitive fatigue during testing: An examination of trait, time-on-task, and strategy influences. Human Performance, 23(5), 381-402. https://doi.org/10.1080/08959285.2010.517720

Al Ma’mari, Q., Sharour, L. A., & Al Omari, O. (2020). Fatigue, burnout, work environment, workload and perceived patient safety culture among critical care nurses. British Journal of Nursing, 29(1), 28-34. https://doi.org/10.12968/BJON.2020.29.1.28

Alleyne Bayne, G. (2016, March 25). An exploratory study of university students’ mental fatigue in online courses [Poster presentation]. Texas Tech University 15th Annual Graduate Student Research Poster Competition, Lubbock, TX.

Ayres, P. (2006). Using subjective measures to detect variations of intrinsic cognitive load within problems. Learning and Instruction, 16(5), 389-400. https://doi.org/10.1016/j.learninstruc.2006.09.001

Baker, C. (2010). The impact of instructor immediacy and presence for online student affective learning, cognition, and motivation. Journal of Educators Online, 7(1), 1-30. https://doi.org/10.9743/JEO.2010.1.2

Bakker, R. J., van de Putte, E. M., Kuis, W., & Sinnema, G. (2009). Risk factors for persistent fatigue with significant school absence in children and adolescents. Pediatrics, 124(1), e89-e95. https://doi.org/10.1542/peds.2008-1260

Balkin, T. J., & Wesensten, N. J. (2011). Differentiation of sleepiness and mental fatigue effects. In P. L. Ackerman (Ed.), Cognitive fatigue: Multidisciplinary perspectives on current research and future applications (pp. 47-66). American Psychological Association. https://doi.org/10.1037/12343-002

Barnard, L., & Paton, V. O. (2007, November). Distance learning survey of Texas Tech University’s fall 2006 distance and off-campus students. https://www.depts.ttu.edu/opa/docs/Barnard_Paton_2007_Distance_Learning_Survey_TTU_F2006.pdf

Bartlett, M. S. (1951). The effect of standardization on a χ2 approximation in factor analysis. Biometrika, 38(3/4), 337-344. https://doi.org/10.2307/2332580

Beiske, A. G., & Svensson, E. (2010). Fatigue in Parkinson’s disease: A short update. Acta Neurologica Scandinavica, 122(s190), 78-81. https://doi.org/10.1111/j.1600-0404.2010.01381.x

Boksem, M. A. S., Meijman, T. F., & Lorist, M. M. (2005). Effects of mental fatigue on attention: An ERP study. Cognitive Brain Research, 25(1), 107-116. https://doi.org/10.1016/j.cogbrainres.2005.04.011

Boksem, M. A. S., & Tops, M. (2008). Mental fatigue: Costs and benefits. Brain Research Reviews, 59(1), 125-139. https://doi.org/10.1016/j.brainresrev.2008.07.001

Bolliger, D. U., & Inan, F. A. (2012). Development and validation of the Online Student Connectedness Survey (OSCS). The International Review of Research in Open and Distributed Learning, 13(3), 41-65. https://doi.org/10.19173/irrodl.v13i3.1171

Carpenter, S. (2018). Ten steps in scale development and reporting: A guide for researchers. Communication Methods and Measures, 12(1), 25-44. https://doi.org/10.1080/19312458.2017.1396583

Chalder, T., Berelowitz, G., Pawlikowska, T., Watts, L., Wessely, S., Wright, D., & Wallace, E. P. (1993). Development of a fatigue scale. Journal of Psychosomatic Research, 37(2), 147-153. https://doi.org/10.1016/0022-3999(93)90081-P

Chen, C. Y., Pedersen, S., & Murphy, K. L. (2011). Learners’ perceived information overload in online learning via computer-mediated communication. Research in Learning Technology, 19(2), 101-116. https://doi.org/10.1080/21567069.2011.586678

Clark, R., & Mayer, R. E. (2016). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning (4th ed.). John Wiley & Sons. https://doi.org/10.1002/9781119239086

Clark, R., Nguyen, F., & Sweller, J. (2006). Efficiency in learning: Evidence-based guidelines to manage cognitive load. John Wiley & Sons.

Comrey, A. L., & Lee, H. (1992). A first course in factor analysis (2nd ed.). Lawrence Erlbaum Associates. https://doi.org/10.4324/9781315827506

Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment Research & Evaluation, 10(7), 1-9. https://doi.org/10.7275/jyj1-4868

Csathó, Á., van der Linden, D., Hernádi, I., Buzás, P., & Kalmár, G. (2012). Effects of mental fatigue on the capacity limits of visual attention. Journal of Cognitive Psychology, 24(5), 511-524. https://doi.org/10.1080/20445911.2012.658039

DeLuca, J. (Ed.). (2005). Fatigue as a window to the brain. MIT Press. https://doi.org/10.7551/mitpress/2967.001.0001

DeVellis, R. F. (2003). Scale development: Theory and applications (2nd ed.). Sage Publications.

Edwards, J. R., & Cooper, C. L. (2013). The person-environment fit approach to stress: Recurring problems and some suggested solutions. In C. L. Cooper (Ed.), From stress to wellbeing: Vol. 1. Theory and research on occupational stress and wellbeing (pp. 91-108). Palgrave Macmillan. https://doi.org/10.1057/9781137310651_5

Field, A. (2009). Discovering statistics using SPSS (3rd ed.). Sage Publications. https://uk.sagepub.com/en-gb/eur/discovering-statistics-using-ibm-spss-statistics/book257672

Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. https://doi.org/10.1016/S1096-7516(00)00016-6

Guo, P. J., Kim, J., & Rubin, R. (2014, March). How video production affects student engagement: An empirical study of MOOC videos. L@S ’14: Proceedings of the First ACM Conference on Learning @ Scale Conference (pp. 41-50). Association for Computing Machinery. https://doi.org/10.1145/2556325.2566239

Guramatunhu, P. (2015). The gender shift in enrollment patterns in higher education: A case study of a school administration program. Advancing Women in Leadership, 35, 120-133. https://doi.org/10.21423/awlj-v35.a137

Hafezi, S., Zare, H., Najafi Mehri, S., & Mahmoodi, H. (2010). The Multidimensional Fatigue Inventory validation and fatigue assessment in Iranian distance education students. Proceedings of the 4th International Conference on Distance Learning and Education (ICDLE) (pp. 195-198). Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/ICDLE.2010.5606006

Harris, S. M., Larrier, Y. I., & Castano-Bishop, M. (2011). Development of the Student Expectations of Online Learning Survey (SEOLS): A pilot study. Online Journal of Distance Learning Administration, 14(4). https://eric.ed.gov/?id=EJ960578

Herlambang, M. B., Taatgen, N. A., & Cnossen, F. (2019). The role of motivation as a factor in mental fatigue. Human Factors, 61(7), 1171-1185. https://doi.org/10.1177/0018720819828569

Hinkin, T. R., Tracey, J. B., & Enz, C. A. (1997). Scale construction: Developing reliable and valid measurement instruments. Journal of Hospitality and Tourism Research, 21(1), 100-120. https://doi.org/10.1177/109634809702100108

Hockey, R. (2013). The psychology of fatigue: Work, effort and control. Cambridge University Press. https://doi.org/10.1017/CBO9781139015394

Holmes, C. M., & Reid, C. (2017). A comparison study of on-campus and online learning outcomes for a research methods course. The Journal of Counselor Preparation and Supervision, 9(2), 1-24. https://doi.org/10.7729/92.1182

Jensen, J. L., Berry, D. A., & Kummer, T. A. (2013). Investigating the effects of exam length on performance and cognitive fatigue. PLoS One, 8(8), e70270. https://doi.org/10.1371/journal.pone.0070270

Kaiser, H. F. (1960). The application of electronic computers to factor analysis. Educational and Psychological Measurement, 20(1), 141-151. https://doi.org/10.1177/001316446002000116

Keller, J. (1983). Motivational design of instruction. In C. M. Reigeluth (Ed.), Instructional design theories and models: An overview of their current status (pp. 383-434). Lawrence Erlbaum. https://doi.org/10.4324/9780203824283

Lambert, M., Sattler, S., & Paton, V. O. (2009, June). Distance learning survey of Texas Tech University’s fall 2008 distance and off-campus students. https://www.depts.ttu.edu/opa/docs/Lambert_Sattler_Paton_2009_Distance_Learning_Survey_TTU_F2008.pdf

Lee, A., Son, S. M., & Kim, K. K. (2016). Information and communication technology overload and social networking service fatigue: A stress perspective. Computers in Human Behavior, 55, 51-61. https://doi.org/10.1016/j.chb.2015.08.011

Matsunaga, M. (2010). How to factor-analyze your data right: Do’s, don’ts, and how-to’s. International Journal of Psychological Research, 3(1), 97-110. https://doi.org/10.21500/20112084.854

Mayadas, F., Miller, G. E., & Sener, J. (2015, July). Updated e-learning definitions. Online Learning Consortium. http://onlinelearningconsortium.org/updated-e-learning-definitions/

Mertler, C. A., & Vannatta, R. A. (2016). Advanced and multivariate statistical methods: Practical application and interpretation (6th ed.). Routledge. https://doi.org/10.4324/9781315266978

Miller, G. (1994). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 101(2), 343-352. https://doi.org/10.1037/0033-295x.101.2.343

Mizuno, K., Tanaka, M., Fukuda, S., Yamano, E., Shigihara, Y., Imai-Matsumura, K., & Watanabe, Y. (2011). Low visual information-processing speed and attention are predictors of fatigue in elementary and junior high school students. Behavioral and Brain Functions, 7(1), 1-7. https://doi.org/10.1186/1744-9081-7-20

Mota, D. D. C. F., & Pimenta, C. A. M. (2006). Self-report instruments for fatigue assessment: A systematic review. Research and Theory for Nursing Practice: An International Journal, 20(1), 49-78. https://doi.org/10.1891/rtnp.20.1.49

Nunnally, J. C. (1978). Psychometric theory (2nd ed.). McGraw-Hill.

O’Connor, B. P. (2000). SPSS and SAS programs for determining the number of components using parallel analysis and Velicer’s MAP test. Behavior Research Methods, Instruments, & Computers, 32(3), 396-402. https://doi.org/10.3758/BF03200807

O’Sullivan, P. B., Hunt, S. K., & Lippert, L. R. (2004). Mediated immediacy: A language of affiliation in a technological age. Journal of Language and Social Psychology, 23(4), 464-490. https://doi.org/10.1177/0261927X04269588

Palmer, L. K. (2013). The relationship between stress, fatigue, and cognitive functioning. College Student Journal, 47(2), 312-325. https://www.ingentaconnect.com/content/prin/csj/2013/00000047/00000002/art00007

Plass, J. L., Moreno, R., & Brunken, R. (Eds.). (2010). Cognitive load theory. Cambridge University Press. https://doi.org/10.1017/CBO9780511844744

Plukaard, S., Huizinga, M., Krabbendam, L., & Jolles, J. (2015). Cognitive flexibility in healthy students is affected by fatigue: An experimental study. Learning and Individual Differences, 38, 18-25. https://doi.org/10.1016/j.lindif.2015.01.003

Roach, V., & Lemasters, L. (2006). Satisfaction with online learning: A comparative descriptive study. Journal of Interactive Online Learning, 5(3), 317-332. https://files.eric.ed.gov/fulltext/ED447310.pdf

Roberson, R. (2013, September). Helping students find relevance. Psychology Teacher Network, 23(2), 18-20.

Roelle, J., Lehmkuhl, N., Beyer, M., & Berthold, K. (2015). The role of specificity, targeted learning activities, and prior knowledge for the effects of relevance instructions. Journal of Educational Psychology, 107(3), 705-723. https://doi.org/10.1037/edu0000010

Sarkar, S., & Parnin, C. (2017). Characterizing and predicting mental fatigue during programming tasks. Proceedings: 2017 IEEE/ACM 2nd International Workshop on Emotion Awareness in Software Engineering (pp. 32-37). Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/SEmotion.2017.2

Slavin, R. E. (2007). Educational research in an age of accountability. Allyn & Bacon.

Sweller, J. (2008). Human cognitive architecture. In J. M. Spector, M. D. Merrill, J. Van Merrienboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 369-381). Lawrence Erlbaum Associates. https://doi.org/10.4324/9780203880869

Sweller, J., Van Merrienboer, J. J., & Paas, F. G. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251-296. https://doi.org/10.1023/A:1022193728205

Tugtekin, U. (2022). Development and validation of an instrument for online learning fatigue in higher education. In G. Durak & S. Çankaya (Eds.), Handbook of research on managing and designing online courses in synchronous and asynchronous environments (pp. 566-586). IGI Global. https://doi.org/10.4018/978-1-7998-8701-0.ch028

van der Linden, D. (2011). The urge to stop: The cognitive and biological nature of acute mental fatigue. In P. L. Ackerman (Ed.), Cognitive fatigue: Multidisciplinary perspectives on current research and future applications (pp. 149-164). American Psychological Association. https://doi.org/10.1037/12343-007

Vercoulen, J. H. M. M., Swanink, C. M., Fennis, J. F., Galama, J. M., van der Meer, J. W., & Bleijenberg, G. (1994). Dimensional assessment of chronic fatigue syndrome. Journal of Psychosomatic Research, 38(5), 383-392. https://doi.org/10.1016/0022-3999(94)90099-x

Wanstreet, C. E. (2006). Interaction in online learning environments: A review of the literature. Quarterly Review of Distance Education, 7(4), 399-411. https://www.learntechlib.org/p/106711/

Wilcox, R. R. (2013). Introduction to robust estimation and hypothesis testing (3rd ed.). Elsevier. https://doi.org/10.1016/C2010-0-67044-1

Worthington, R., & Whittaker, T. (2006). Scale development research: A content analysis and recommendations for best practices. The Counseling Psychologist, 34(6), 806-838. https://doi.org/10.1177/0011000006288127

 


Development of the Online Course Overload Indicator and the Student Mental Fatigue Survey by Gail Alleyne Bayne and Fethi A. Inan is licensed under a Creative Commons Attribution 4.0 International License.