International Review of Research in Open and Distributed Learning

Development of the Online Course Overload Indicator and the Student Mental Fatigue Survey



Abstract
The purpose of this study is to develop and examine the psychometric properties of the Online Course Overload Indicator (OCOI) and the Student Mental Fatigue Survey (SMFS). The OCOI was designed to measure students' perceptions of cognitive overload in online courses. The SMFS was used to assess students' perceptions of mental fatigue while taking online courses. An exploratory factor analysis was conducted on a sample of 378 undergraduate students from various institutions offering online courses across the United States. Results of factor and reliability analyses confirmed that the instruments are valid and reliable measures of students' perceived mental fatigue and overload from online course elements. The analysis supported a model in which students' perceptions of overload in online courses consist of four constructs (information relevance, information overload, course design, and facilitation), in addition to the one-factor structure of the SMFS, which consists of the student mental fatigue construct.
Mental fatigue is not a novel concept; it has received attention through empirical studies in various areas, such as the military, aviation, driving, and health care, as well as in research on shift workers (Ackerman, 2011). Numerous studies have been conducted on the effects of mental fatigue on employees in work-related environments and clinical settings (e.g., Al Ma'mari et al., 2020; Sarkar & Parnin, 2017). Researchers have also reported on mental fatigue in children and adolescents (Mizuno et al., 2011; Palmer, 2013). It has typically been treated as a problem of clinical and workplace settings. However, limited published research exists on mental fatigue in online course design and its effects on students' cognitive functioning.
With the growth in online courses and programs, students are spending more and more time doing coursework and learning online, and their cognitive functions may be overtaxed to the extent of experiencing mental fatigue. Researchers have found that mental fatigue can play a part in the disruption of cognitive functioning (Boksem & Tops, 2008) and information processing, causing reduced attention and lack of focus on the task (van der Linden, 2011). The effects of mental fatigue may be attributed to factors in the online environment that are beyond students' control, such as information overload, poor course design, and a lack of instructor facilitation, to name a few. Such factors would need further examination to better understand their effects on students' level of mental fatigue in the online environment.

Mental Fatigue
Mental fatigue, sometimes referred to as cognitive fatigue or brain fatigue, is one type of fatigue that is often researched from a performance and motivation perspective. Beiske and Svensson (2010) define mental fatigue as measuring "the subjective feeling of being mentally exhausted, encompassing items such as concentration, memory and speech" (p. 78). DeLuca (2005) describes mental fatigue as "a decrement in performance from excess mental effort" (p. 8). Hockey (2013) refers to it as "an unfocused mental state (distraction, frustration, discomfort)" (p. 1). Boksem et al. (2005) describe mental fatigue as "the effects that people may experience after or during prolonged periods of cognitive activity" (p. 107). The common element in these definitions points to the underlying cognitive processes that are affected by the phenomenon of mental fatigue.

Mental Fatigue and Cognitive Load
The theoretical framework that guides and supports this study is cognitive load theory (CLT) (Plass et al., 2010). CLT is based on the premise that working memory has limited capacity and learners can only process small amounts of information at any one time (Miller, 1994). Consequently, some instructional design practices or strategies can impose extra or unnecessary mental effort and may contribute to mental fatigue that can constrain learning and performance (Clark & Mayer, 2016). Mental load is the load imposed by the task or sequence of information in the instruction (Sweller et al., 1998). Mental effort is the amount of capacity allocated to the demands imposed by the instruction (Sweller et al., 1998). Together, mental load and mental effort make up cognitive load (Ayres, 2006). Cognitive overload occurs when the mental load exceeds mental capacity (Clark & Mayer, 2016). Cognitive overload is not itself mental fatigue, but it is a trigger that can cause mental fatigue: once mental capacity is exceeded, the brain becomes mentally exhausted. Individuals typically experience such overload during and after prolonged periods of taxing cognitive activity. When students are fatigued, they become disengaged, frustrated, and stressed, and their learning ability and/or capacity is diminished.

Online Versus Traditional Classroom Environments
There have been many studies on mental fatigue in work-related and clinical settings; however, only a few studies have examined mental fatigue in educational settings (Csathó et al., 2012; Mizuno et al., 2011). This phenomenon has been largely overlooked in online environments, and there have been only a limited number of studies on information overload and fatigue in online environments (Lee et al., 2016; Tugtekin, 2022). However, researchers have found that there are challenges in online environments that might not exist in traditional classrooms, such as student perceptions of isolation; learner frustration, anxiety, and confusion; lack of community; lack of instructor engagement and immediate response; information overload; and challenges with technology, including access to a reliable Internet connection (Holmes & Reid, 2017). These challenges may affect the learner and the online learning experience and could be indicators that the online environment is causing undue mental fatigue.

Sources of Mental Fatigue
Cognitive overload caused by greater mental effort, task difficulty, and design of instruction (Plass et al., 2010) is a contributor to increased levels of mental fatigue (Balkin & Wesensten, 2011). Hence, factors that directly impact students, including course design, facilitation, information overload, and information relevance, are considered potential sources of mental fatigue. The following sections discuss each one of these sources as it relates to the online learning context.

Course Design
Course design is operationalized in this study as the organization, format, and structure of the online course, including the components that make up its structure (e.g., multimedia elements, visual design elements, organization, etc.). Clark and Mayer (2016) propose a set of multimedia principles that can be used in the design of online courses to avoid overloading learners with extraneous content and to promote student learning. Additionally, the organization of the online platform and effective design of learning materials for online courses can help students engage in active learning by decreasing cognitive load. The psychological rationale for effective course design is to help learners use their cognitive capacity to focus on the relevant instructional goals by reducing irrelevant processing of information, thereby minimizing cognitive load and mental fatigue.

Facilitation
Facilitation is operationalized in this study as the level of instructor presence, instructor immediacy, and feedback provided in an online course. Facilitation in the online environment is a fundamental element for student learning, satisfaction, and cognitive overload (Wanstreet, 2006). Some researchers have argued that instructor facilitation is important to "support and enhance social and cognitive presence for the purpose of realizing educational outcomes" (Garrison et al., 1999, p. 90). Researchers also note that various forms of instructor behaviors, such as frequently interacting with students, using informality and casualness, returning phone/e-mail messages, and being accessible to students, to name a few (see O'Sullivan et al., 2004, for more cues), can be incorporated via the course design and written interactions (Baker, 2010).

Information Overload
The meaning of information overload can be different depending on the research context. This study adopted the definition proposed by Lee et al. (2016): information overload occurs when individuals "are exposed to more information than they can accommodate in their capacity for information processing" (p. 53). Two main determinants of information overload are human processing capacity and complexity (Sweller, 2008). Plass et al. (2010) declare that the source of cognitive load comes from the design of the materials, the difficulty of the material to be learned, and the mental effort required to process the new information. Furthermore, the type of cognitive load (i.e., intrinsic or extraneous) can contribute to increased levels of mental fatigue, with more difficult tasks consuming more mental effort (Balkin & Wesensten, 2011). Similarly, information overload in the online learning environment can originate from various sources, such as complex course content, an excessive number of readings, numerous topics in one lesson, long videos, and too many resources in the course, to list a few (Guo et al., 2014).

Information Relevance
Information relevance is an important aspect of any course. Roberson (2013) defines relevance as "the perception that something is interesting and worth knowing" (p. 18). Information relevance is operationalized in this study as the extent to which course content is helpful and relevant to a student's learning and success in and outside of the online course (Lee et al., 2016). Information relevance can produce an increase in motivation (Keller, 1983) and a decrease in mental load (Roelle et al., 2015). The irrelevance of information for current or future needs can affect personal motives, goals, and values and lead to greater fatigue due to a lack of motivation (Edwards & Cooper, 2013; Herlambang et al., 2019). Based on research findings, Roelle et al. (2015) conclude that specific relevance instructions could lower the amount of extraneous cognitive load that students have to process, finding that students who received specific relevance instruction had more working memory capacity to execute cognitive processes.

Purpose of the Study
Although the importance of understanding fatigue in learning has been recognized in previous research (Palmer, 2013), no attempt to date has been made to create and validate instruments to measure the underlying concept of student mental fatigue in educational settings (Hafezi et al., 2010). Some previous studies have used self-developed items; for example, Csathó et al. (2012) used a non-standard, one-item statement focusing on student tiredness levels to measure undergraduate and postgraduate students' levels of subjective fatigue before and after fatigue-inducing mental tasks. Others have attempted to use instruments designed for medical purposes. For example, Mizuno et al. (2011) examined cognitive predictors of fatigue in elementary and junior high school students by using the Chalder Fatigue Scale (Chalder et al., 1993), which is designed to measure the severity of chronic tiredness due to illnesses. Another instrument to measure chronic fatigue, the Checklist Individual Strength questionnaire (Vercoulen et al., 1994), has also been frequently used in studies (Bakker et al., 2009). Unfortunately, these existing instruments, usually designed for medical diagnosis purposes, do not specifically measure how fatigued students feel while doing coursework and do not apply to diverse student populations in online courses.
Furthermore, there is no systematic way, at the time of writing, to help instructors identify specific areas in the online environment that may be causing overload. The repercussions could include poorly designed online courses that lead to student cognitive overload, disengagement, and attrition. The challenge presented to online instructors is recognizing whether online instruction and/or the design of the online environment contributes to mental fatigue. As a result, instruments are needed in the field of education to gather information about sources of student mental fatigue in online courses. Understanding these constructs will enable better decision making that can lead to improved instructional design that minimizes cognitive overload and mental fatigue.
Therefore, the purpose of this study was to develop two instruments: the Online Course Overload Indicator (OCOI) and the Student Mental Fatigue Survey (SMFS). The SMFS examines and confirms the level of mental fatigue students are experiencing while doing coursework, and the OCOI helps to identify where the mental fatigue/overload is coming from within the online environment (i.e., online course design elements). The research question of the study is the following: What are the psychometric properties (i.e., factors to be retained, variance for each factor, reliability of subscales, and interpretation of factors) of the OCOI and SMFS?

Methodology

Participants
This study used a non-probability sample from a Qualtrics panel. The target population was undergraduate students who were at least 18 years of age and currently enrolled in a fully online course, defined by the Online Learning Consortium as a course in which all activities are done online without any required face-to-face components (Mayadas et al., 2015). The instruments were administered to students after the Thanksgiving break to ensure that students had enough time to acclimate to the online course and environment. Participants were instructed to complete the survey with respect to any online course they were taking that semester. The panel sample was acquired from Qualtrics Panels, LLC. The company collected data from participants enrolled in various distance education institutions/programs across the United States. The company was responsible for sending the survey out through its panel partners to participants, inviting them to complete the online survey in return for incentives, which the company provided. Nunnally's (1978) widely cited recommendation is that the subject-to-item ratio should be at least 10:1 for exploratory factor analysis. Therefore, the instruments were tested with a large sample to establish validity and reliability (DeVellis, 2003).
Data from 378 undergraduate students enrolled in online courses were used for the analyses in this study. The majority of students were female (82%; 18% male), which could be due to the trend of higher female enrollment in distance education (Guramatunhu, 2015). The students ranged in age from 18 to 50 years (M = 27.15, Mdn = 25.00, SD = 7.57). The ethnic composition of the sample was diverse. Students self-identified with the following ethnicities: White (65.87%), Black/African American (15.87%), Hispanic/Latino (11.11%), Asian (3.17%), multiple (1.85%), American Indian/Alaskan Native (1.32%), and other (0.79%). Most students indicated that this was not their first semester taking an online course (69.3%). In addition, the data revealed that a large portion of students (62.9%) were taking three to six credit hours of online courses. Regarding technical skills, the majority of the students indicated that they were very proficient computer users (70.9%).

Instrument Design and Development
This section addresses the design and development of the OCOI and the SMFS. This study's inventory design and development process generally followed DeVellis's (2003) eight-step scale development process. Under DeVellis's (2003) guidelines, the eight-step process was organized into three distinct stages for this study: stage 1, identification of constructs and development of subscale items (steps 1-3); stage 2, expert review and validation (steps 4-6); and stage 3, factor analysis and scale optimization (steps 7-8).

Stage 1: Identification of Constructs and Development of Subscale Items
In stage 1, the initial identification of the constructs was based on themes that emerged from an extensive literature review and additional findings in a pilot study, conducted at a four-year public university in the southwestern region of the United States to gather data about the levels of subjective fatigue experienced by online students (Alleyne Bayne, 2016). The pilot study included 63 graduate and undergraduate students from different majors in fully online courses. Results revealed that 36.5% of students had severe levels of mental fatigue, 31.7% had elevated levels, and 31.7% had normal levels (Alleyne Bayne, 2016). Anecdotal evidence from students about the effort they exerted testified to their levels of frustration, which were similar to student comments from previous surveys (Barnard & Paton, 2007; Lambert et al., 2009). Several themes emerged from the analysis, including information relevance, information overload, course design, instructional activities and materials, and student mental fatigue (Alleyne Bayne, 2016). The constructs and their definitions are listed in Table 1.

Table 1

Constructs and Definitions

Information relevance: The extent to which course content is helpful and relevant to a student's learning and success in and outside of the online course (Lee et al., 2016).
Information overload: Occurs when individuals "are exposed to more information than they can accommodate in their capacity for information processing" (Lee et al., 2016, p. 53).
Course design: The organization, format, and structure of the online course, including the components that make up the structure (e.g., multimedia elements, design elements, etc.).
Instructional activities and materials: The activities and materials that effectively communicate the content and/or instructor's intent (e.g., instructions/guidelines, assignments, reading materials, multimedia elements) to learners to promote learning.
Facilitation: The level of instructor presence, instructor immediacy, and feedback provided in an online course.
Mental fatigue: A self-reported feeling of tiredness after a long duration of mental activity; encompasses feelings of anxiety, frustration, and stress.

The next step was the exploration of relevant literature on existing validated instruments pertinent to the constructs under investigation, including mental fatigue (Mota & Pimenta, 2006; Vercoulen et al., 1994), information relevance (Lee et al., 2016), information overload (Chen et al., 2011; Lee et al., 2016), student perceptions of connectedness (Bolliger & Inan, 2012), student perceptions and expectations of online learning (Harris et al., 2011), instructional activities and materials (Roach & Lemasters, 2006), facilitation (Bolliger & Inan, 2012; Harris et al., 2011), and course design (Harris et al., 2011). The creation of the initial item pool was guided by the construct definitions provided in Table 1, keywords representing the constructs, and an existing set of items from other validated instruments.

Stage 2: Expert Review and Validation
In stage 2, the items were written to represent each construct of interest using the guidelines presented by Worthington and Whittaker (2006) for the generation of an item pool; that is, items should be "clear, concise, readable, distinct, and reflect the scale's purpose" (p. 813). Decisions were made on the number of items for each scale and the type of response format. Hinkin et al. (1997) recommend four to six items for each construct. The Likert response format was used because it is reported as the most widely used format in instrument development for measuring unobservable constructs such as attitudes or beliefs (DeVellis, 2003). A preliminary list of 38 items (30 for the OCOI and 8 for the SMFS) on a Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree) was developed.
The item inventory was sent to a panel of experts in distance education and instructional technology for face and content validity evaluation. The panel consisted of three faculty members who had terminal degrees and several years of experience in teaching online, designing online environments, developing instruments, and publishing research, and one instructional designer with more than 10 years of experience designing/developing hybrid and online courses. Panel experts were contacted individually via e-mail and were provided with instructions, operational definitions, and a scale and rating form for review and comments/suggestions for improvement of the instrument. Additionally, the experts were provided with working definitions of the constructs (see Table 1) to rate the items in each construct concerning their relevance to the definition (DeVellis, 2003). Changes to the instruments were made based on recommendations from panel experts. The OCOI included 30 items and five subscales as follows: information relevance, information overload, course design, instructional activities and materials, and facilitation. The SMFS comprised eight items on a single subscale.

Stage 3: Data Collection and Analysis
In stage 3, the items were administered, along with other scales, to the panel sample of 378 undergraduate students enrolled in online courses to validate the instruments. The main purpose of this stage was to establish the psychometric properties of the instruments. Preliminary analysis was conducted after the items were administered to the sample. The assumptions of normality and linearity were checked (Wilcox, 2013), and sample size adequacy was also verified (Bartlett, 1951). Principal component analysis was used as a data reduction technique to reduce the observed variables into a smaller number of components (Worthington & Whittaker, 2006). An oblique (Promax) rotation was used because previous theoretical support suggests that the factors underlying the items are correlated (Field, 2009). The number of factors to be retained was determined using several methods. Factor extraction and rotation were conducted to obtain the loadings for each factor and to improve the interpretation of the factors (Mertler & Vannatta, 2016).
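For readers unfamiliar with the mechanics of component extraction, the core computation can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the SPSS analysis the study used: the 378 x 38 matrix of synthetic Likert responses stands in for the real survey data, and rotation (e.g., Promax) would follow the unrotated extraction shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical response matrix: 378 respondents x 38 Likert items (1-5).
X = rng.integers(1, 6, size=(378, 38)).astype(float)

# Standardize items, then compute the inter-item correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
R = np.corrcoef(Z, rowvar=False)

# Eigendecomposition: eigenvalues order components by explained variance.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Unrotated component loadings; an oblique rotation would be applied next.
loadings = eigvecs * np.sqrt(eigvals)

print(eigvals[:5])                    # leading eigenvalues
print(eigvals / eigvals.sum() * 100)  # percentage of variance per component
```

Because the trace of a correlation matrix equals the number of items, the eigenvalues sum to 38, which is why an eigenvalue can be read as "items' worth of variance" explained by a component.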

OCOI: Exploratory Factor Analysis Results
Five criteria were used to determine the appropriate number of components to retain: eigenvalues, the scree plot, total variance explained, the minimum average partial (MAP) test, and parallel analysis. Upon examination of the eigenvalues, five factors were above the value of 1 and explained 54.73% of the total variance. Kaiser (1960) recommends retaining all factors with eigenvalues greater than 1. There is a consensus in the literature, however, that using eigenvalues is one of the least accurate ways to determine the number of factors to retain (Carpenter, 2018; Costello & Osborne, 2005). Previous research suggests that parallel analysis (PA) is more accurate than other methods for determining the number of factors to retain (Matsunaga, 2010). Velicer's MAP test is also a validated procedure that was used to decide on the number of factors to retain (O'Connor, 2000). The Statistical Package for the Social Sciences (SPSS) with a subprogram was used to compute PA and MAP for the OCOI. The results from the PA indicated a four-factor structure. Similarly, a four-component solution was suggested by the original and revised MAP tests. The scree plot was slightly ambiguous but also supported the retention of four factors. Therefore, four factors were retained based on the results of the PA, MAP test, and scree plot (see Figure 1).

Figure 1. Parallel Analysis Scree Plot Results
The four-factor structure yielded a total explained variance of 50.99%. Each retained item loaded distinctively on one of the four factors. Cross-loadings and multiple factor loadings have been identified as evidence of complex items reflecting the influence of more than one factor (Worthington & Whittaker, 2006). Examination of the pattern matrix revealed items from one scale with cross-loadings or multiple factor loadings. Therefore, the items from the instructional activities and materials subscale were eliminated due to low factor loadings and/or loading on multiple subscales. In Table 2, the retained items are ordered and grouped by the size of their loadings.

Table 2

OCOI Retained Items and Factor Loadings

Information relevance
- Information presented is what I need to know to be successful in the course (.601)

Facilitation
- The instructor is actively involved in the online environment (.869)
- The instructor communicates clearly in writing throughout the course (.848)
- The feedback provided by the instructor is constructive (.828)
- The instructor is responsive to my questions (.784)
- The instructor provides timely feedback (.776)
- The instructor encourages learner participation in course activities/tasks (.732)

Course design
- The course tasks/assignments are easy to locate (.769)
- The course is easy to navigate (.759)
- The design (e.g., organization, presentation, look and feel) is consistent throughout the course (.756)
- The course materials are easy to find (.756)
- The design elements (e.g., colors, fonts, buttons, use of space) are visually pleasing (.701)
- The course was logically organized (.660)

Information overload
- The number of readings is overwhelming (.763)
- There is an excessive amount of information to process in the course (.728)
- The course content is complex (.695)
- There are too many resources in the course (.674)
- The videos are too long (.621)
- The number of threads/replies posted in the online discussions is overwhelming (.585)

Initial eigenvalue (all items): 8.90, 2.58, 2.07, 1.74
Initial % of variance explained (all items): 29.67, 8.59, 6.91, 5.81
Note. Factor loadings under .50 were removed for the retained constructs.

The SMFS: Exploratory Factor Analysis Results
The SMFS was examined with principal component analysis. Typically, factor analysis requires two steps: (a) factor extraction and (b) factor rotation. However, after factor extraction, an examination of the scale revealed a single underlying dimension: the component loadings of the individual items indicated a single construct. Therefore, no factor rotation was used. Eigenvalues and the scree plot were examined for factor retention. On inspection of the eigenvalues, only one factor was above the value of 1, and it explained 62.21% of the total variance. The scree plot suggested retaining one factor, concurring with the eigenvalues. Therefore, one factor was retained. Only items with a factor loading of at least .50 were interpreted, based on Comrey and Lee's (1992) factor loading guidelines. All item loadings were above .50, and there were no cross-loading items. In Table 3, the items are ordered and grouped by the size of their loadings.

Note. All factor loadings were greater than .50 for the retained construct.

Reliability Analysis
Internal consistency and reliability analyses were conducted to determine the extent to which items correlated with each other and the degree to which items consistently measured the same construct as other items within that scale (Slavin, 2007). To determine the instruments' reliability, Cronbach's alpha coefficient was calculated. The final model for the OCOI retained 24 items and four subscales. The overall reliability of the OCOI was 0.89. The information relevance, facilitation, course design, and information overload subscales all had moderate to high reliabilities, with Cronbach's alpha ranging from 0.77 to 0.90. The final SMFS included eight items. The overall reliability was high (0.91). Further analysis indicated that this alpha would not increase with the deletion of any item(s). Table 4 includes the number of items, Cronbach's alpha coefficients, means, and standard deviations for the OCOI and SMFS.
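Cronbach's alpha follows directly from item and total-score variances: alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale). The sketch below illustrates the computation on synthetic data standing in for SMFS responses; it is not the authors' analysis, and the 8-item scale and noise level are assumptions chosen to produce a high-reliability example.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 8-item scale (like the SMFS): each item is a shared
# latent factor plus item-specific noise, for 378 respondents.
rng = np.random.default_rng(2)
latent = rng.normal(size=(378, 1))
responses = latent + rng.normal(scale=0.6, size=(378, 8))

alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Because alpha rises with both the number of items and the average inter-item correlation, an 8-item scale with strongly correlated items (as here) yields an alpha above 0.9, comparable in magnitude to the SMFS result reported above.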

Discussion
The evidence from previous studies suggests that understanding the impact of course design elements on students' level of mental fatigue is important because it may affect learning and performance (Ackerman et al., 2010; Jensen et al., 2013), cognitive flexibility (Plukaard et al., 2015), and exploration of complex tasks (Sarkar & Parnin, 2017). However, there is a lack of information in the research on the effects of course design elements and mental fatigue in online environments. This may be because mental fatigue is a difficult construct to measure and because, at the time of writing, no instruments had been identified that measure student mental fatigue in online environments. Therefore, the OCOI and the SMFS were developed to fill this gap in the literature from an online learning perspective.
The feedback from students regarding course design and implementation elements is an important aspect in helping instructors provide quality instruction to learners. Therefore, a valid and reliable evaluation tool (such as the OCOI) would help instructors identify specific areas for improvement within the online course (e.g., content, design, and environment). In addition, the SMFS could help instructors understand the mental constraints of their learners in terms of whether they feel overwhelmed, confused, anxious, or frustrated when doing coursework. Additionally, online courses usually attract students from diverse backgrounds (e.g., working professionals needing flexible schedules, nontraditional students, etc.). Considering these learners' mental and physical workloads, improvements in the learning experience could make a difference in their learning outcomes (e.g., content retention and course performance). The four constructs identified on the OCOI gather the student perspective regarding overload indicators in the online environment, and they have been frequently mentioned by online students as sources of their frustrations (Alleyne Bayne, 2016; Barnard & Paton, 2007; Lambert et al., 2009).

Future Research and Limitations
This study has explored the design of two new instruments regarding online course overload indicators and their effects on students' mental fatigue. Considering the novelty of the subject studied, several areas can be further explored. Future studies using alternative statistical procedures (e.g., confirmatory factor analysis) could validate the developed instrument. A follow-up with confirmatory factor analysis on a new sample could be useful, as researchers recommend repeating instrument validation with a new data set for optimizing the scale length (Field, 2009;Worthington & Whittaker, 2006). These instruments can also be used to investigate whether course design elements predict mental fatigue in online courses. Additionally, studies can examine the relationship between academic performance and perceived student mental fatigue in an online environment, as well as whether improvements in course design decrease perceived student mental fatigue.
In this study, one limitation was that the researchers were not directly involved in the online course design, which would have given the study a point of reference for the quality of the online course environments. In future studies, researchers may consider directly reviewing and evaluating various online course design elements. The research should be expanded into the actual online course environment with instructor input and expert assessment of the course components along with the student-level data to correlate the constructs and their impact on learning outcomes. Additionally, future studies could involve multiple data collection points to explore the effects of online course design elements on students' mental fatigue in an online learning environment. Such longitudinal studies would allow researchers to monitor changes in student mental fatigue over time.

Conclusion
Several existing instruments are used for fatigue research in medicine. However, these instruments are not targeted toward the online environment, specifically the design of online courses. Therefore, an important outcome of this study was identifying tools that educators can use to assess whether design elements in online instruction contribute to mental fatigue. Two instruments were created that can assist instructors teaching online courses in assessing student perceptions of cognitive overload and mental fatigue when doing coursework. Specifically, the Online Course Overload Indicator (OCOI) was designed to measure students' perceptions of cognitive overload in online courses. The Student Mental Fatigue Survey (SMFS) was designed to measure students' perceptions of mental fatigue while taking online courses. The findings from this research study may be helpful for instructors seeking to optimize their online course design to promote better learning experiences for online learners. For instructors who are new to online teaching and thus unprepared for this environment, as well as seasoned classroom instructors who have never taught online, the OCOI could be used to better gauge where in the online course/environment students are being overloaded, and adjustments to the course can be made as needed. Hence, the knowledge gained from this research study could enable practitioners in education to use these tools to facilitate online learning, thereby improving online course content in meaningful and relevant ways to promote student learning and satisfaction.