November – 2014

A Comparison of Learner Intent and Behaviour in Live and Archived MOOCs


Jennifer Campbell, Alison Gibbs, Hedieh Najafi, and Cody Severinski
University of Toronto, Canada

Abstract

The advent of massive open online courses (MOOCs) has created opportunities for learning that are clearly in high demand, but the direction in which MOOCs should evolve to best meet the interests and needs of learners is less apparent. Motivated by our interest in whether there are potential and purpose for archived MOOCs to be used as learning resources beyond and between instructor-led live-sessions, we examined the use of a statistics MOOC and a computer science MOOC, both of which were made available as archived-courses after a live-session and for which enrolment continued to grow while archived. Using data collected from surveys of learner demographics and intent, the course database of major learner activity, and the detailed clickstream of all learner actions, we compared the demographics, intent, and behaviour of live- and archived-learners. We found that archived-learners were interested in the live-MOOC and that their patterns of use of course materials, such as the number and sequence of videos they watched, the number of assessments they completed, their demonstration of self-regulatory behaviour, and their rate of participation in the discussion forums, were similar to the live-learners. In addition, we found evidence of learners drawing on an archived-MOOC for use as reference material. Anticipated areas of impact of this work include implications for the future development of MOOCs as resources for self-study and professional development, and in support of learner success in other courses.

Keywords: MOOCs; massive open online courses; archived MOOCs; online education; self-directed learning; self-regulation; remediation

Introduction

Massive open online courses (MOOCs) can provide flexible enrolment options for learners beyond the registered cohort, as these courses are often left accessible for self-study after the end of a given session. In this archived-mode, learners can continue to access the course materials, although deadlines and the opportunity to earn a credential have passed. In Coursera, for example, about 78% of courses that have finished at least one session on the platform have a session in archived-mode (C. Gao, personal communication, November 26, 2013). In this study, we compare the use of two MOOCs as in-session, instructor-led (“live”) courses with their use as post-session, self-directed (“archived”) learning resources.

Figure 1

The two MOOCs studied here are “Statistics: Making Sense of Data” (STATS) and “Learn to Program: The Fundamentals” (LTP1), offered on Coursera with live sessions in April-May 2013 and September-November 2012, respectively, and available afterwards as archived-MOOCs. When each of these MOOCs ended, many learners persisted and new registrants continued to join the archived courses. Figure 1 shows the number of learners who registered before, during, and after the live sessions of the two courses. We also look at a follow-up course to LTP1, “Learn to Program: Crafting Quality Code” (LTP2), to understand how students transition among related live- and archived-MOOCs.

Purpose of the Study

The purpose of this study is to compare the characteristics of learners and their interactions with two MOOCs during live-sessions and afterwards when the MOOCs are archived. Live- and archived-MOOCs are distinguished by the presence or absence of instructional support, cohort presence, deadlines, and the potential for formal acknowledgement of completion. To this end, we investigated live- and archived-learners’ demographics, intent, and the relationship between intended and actual behaviour, including the amount and nature of interaction with the course materials. Our goal in this observational study was to examine learner characteristics and behaviour given the MOOC formats that were available to the learners, allowing us to better understand both who chooses to use the course materials in each format and how they engage with the materials.

Research on MOOC learners has recognized the effect of learner intent on the amount and pattern of MOOC component use (Kizilcec, Piech, & Schneider, 2013). Studying patterns of behaviour of learners who have no chance of earning a Statement of Accomplishment (SoA) or some other formal acknowledgement of completion extends the current state of MOOC research. It also distinguishes this study from the dominant discourse of MOOC research, which is focused on low completion rates and the behaviour and characteristics of learners who do and do not earn an SoA (e.g., Seaton, Bergner, Chuang, Mitros, & Pritchard, 2014).

The purpose of this work is not to critique MOOCs, nor to address their known shortcomings such as high attrition rates (Adamopoulos, 2013; Catropa, 2013). We also acknowledge the challenges involved in assessing the learning achievement of informal learners (Levenberg & Caspi, 2010), let alone archived-learners. Rather, we aim to understand the similarities and differences between the learning paths of live-learners and archived-learners.

Definition of Terms

Live-MOOC: In-session, instructor-led course with the possibility of obtaining an SoA. The instructional team regularly sends reminders and encouragement through email and announcements to learners. Materials are released at regular intervals and learners are guided through the course at the pace at which the materials are released. An instructional team and cohort provide learning support through the discussion forums. Coursework has deadlines and some coursework includes peer assessment.

Archived-MOOC: Post live-session, self-directed course with minimal or no instructional support or cohort presence, no deadlines, no peer-assessment, and no opportunity to earn an SoA. All materials are available on registration, giving learners more flexibility relative to live-MOOCs in the pace and order in which they access the course materials.

Live-learners: Learners who register in a live-MOOC in time to earn an SoA.

Archived-learners: Learners who are active in an archived-MOOC.

For the purposes of this study, learners are characterized by whether they are accessing the particular live- and archived-MOOCs under consideration, in the form in which these particular MOOCs were available at the time, without consideration of the status of the course when the learners enrolled or of how they might be using other MOOCs in which they are enrolled. In addition, in this study we do not consider other MOOC formats with flexible start dates and pacing, such as MOOCs designed for self-study.

Theoretical Framework

Expanding opportunities for on-demand learning, in light of the intense interest in lifelong learning, is a significant promise of MOOCs (Garrison, 2011). Learners can actively choose among available MOOCs to address their professional and learning needs or to pursue personal interests, notwithstanding temporal, geographical, or institutional barriers (Adamopoulos, 2013). Such on-demand learning (Dobrovolny, 2006; Rhode, 2009) is often informal, self-paced, and self-directed online learning. We first examine characteristics of MOOC learning in the light of these learning modalities. Then, we examine factors known to affect the quality of learning in this context.

Alongside institution-affiliated online learning, there exist other modalities of online learning that provide learners with greater freedom of choice in duration, content, and modes of assessment (Levenberg & Caspi, 2010). For example, in a self-paced learning mode, students can choose the start date of their course and complete it according to their own schedule (Anderson, Poellhuber, & McKerlich, 2010; Horton, 2006). Self-paced courses, however, are usually bound by a deadline to finish all course activities and involve some level of real-time or asynchronous instructional support (Gerlich, Mills, & Sollosy, 2009). Learner choice also extends to the amount of course material and the number of activities that a learner covers.

Similar to other informal and self-paced learning, motivation to pursue learning through MOOCs ranges from career advancement to personal interest (Iiyoshi & Kumar, 2008; Sheu, Lee, Bonk, & Kou, 2013). Live-MOOCs provide an opportunity for informal online learning that can be instructor-led and to some degree self-paced, as learners must meet deadlines if they wish to earn an SoA and cannot explore course content that has not yet been released. Live-learners can also engage in peer interaction, with a defined cohort of students working through the material at the same pace. Archived-MOOCs, on the other hand, fall at the extreme end of the self-paced learning continuum (Lowenthal, Wilson, & Parrish, 2009), as archived-learners have no deadlines to meet and little chance of interacting with course instructors. While interaction with peers is still possible, there is no large cohort studying the same material at the same time. Learners can select the material to cover and take as long as they need.

Self-regulation.

Informal learning, such as learning in MOOCs, demands a high level of self-directedness because learners are in charge of their own progress. Self-regulation (Pintrich, 2000; Zimmerman, 2008) describes how learners manage their learning by actively setting goals, planning to achieve those goals, identifying and using resources, monitoring their progress, and taking self-corrective measures. We use self-regulation to explain the qualities that learners need to develop in order to engage with and persist in the informal, non-credit, yet structured learning environments of MOOCs. Dobrovolny (2006) studied self-paced corporate learners’ use of self-regulatory processes using verbal and visual think-aloud strategies; the participants referred back to course material to resolve misunderstandings and confusion, demonstrating self-assessment and reflective strategies. In MOOCs, following the self-regulatory processes of each of thousands of learners is not feasible, and it is impossible to provide scaffolding to meet each learner’s unique needs. However, clickstream data may provide evidence of learners’ self-corrective behaviours, such as accessing relevant resources or posting in discussion forums between repeated attempts at formative assessments.

Learning goals.

Time and effort invested in self-paced courses can be affected by learners’ goals and desired achievement levels (del Valle & Duffy, 2009; Ely, Sitzmann, & Falkiewicz, 2009). Within informal learning environments, learners’ goals are typically to satisfy their personal interest or to further develop their competencies (Sheu, Lee, Bonk, & Kou, 2013). Time and effort invested in a MOOC may vary from completing all assignments and following the cohort, to selecting relevant topics and studying them at the learner’s desired pace (Kizilcec, Piech, & Schneider, 2013). Existing research in self-paced and MOOC learning has mostly focused on learners who complete all course requirements (DeBoer, Stump, Seaton, & Breslow, 2013), excluding the majority of MOOC learners.

Peer collaboration and interaction.

While social presence and its necessity for knowledge co-construction are favoured in online learning community frameworks, such as the community of inquiry (Garrison, Anderson, & Archer, 2000), such interaction among learners in self-paced contexts may be of less importance to their learning outcomes. In an exploratory study guided by Anderson’s (2003) interaction equivalency theorem, Rhode (2009) interviewed ten learners who completed an online self-paced professional development course; participants perceived learner-learner interactions to be challenging and less important than other forms of interaction. Similar findings were reported from a voluntary hybrid professional development course for new faculty, in which learners rarely posted to course discussion forums or replied to their peers’ postings (Schwier, Morrison, Daniel, & Koroluk, 2009). More than 70% of self-paced distance learners in the Anderson, Poellhuber, and McKerlich (2010) study preferred working independently to working in groups. Conversely, learners who used fewer resources and invested less time than their peers in a self-paced online professional development course preferred cohort-based learning (del Valle & Duffy, 2009). Although learners value the peer-assessment components of MOOCs (Adamopoulos, 2013), high levels of sustained peer interaction may not be viable given the sheer number of registrants and their varied start times. The existing body of research suggests, however, that motivated learners persist in the learning environment even with minimal peer interaction. The results of our investigation of the learning strategies of learners in archived-MOOCs, who have little chance for peer interaction, may inform future understanding of learner-content and learner-instructor interaction.

Course content.

Learners’ depth of learning is necessarily affected by their persistence with the course. Using text- and opinion-mining methods, Adamopoulos (2013) analyzed 1,163 reviews submitted by 842 learners who had taken at least one MOOC in various disciplines to investigate factors associated with learner retention. While learners’ satisfaction with the course material was positively associated with completion of the course, courses with a higher workload and longer duration carried a greater risk of learner attrition. The importance of the quality of course material to perceived learning, specifically over peer interaction, has also been observed (Rhode, 2009; Schwier, Morrison, Daniel, & Koroluk, 2009). In a journalism MOOC, 50% and 40% of learners rated course readings and videos, respectively, as the most helpful learning resources, with only 6% of learners identifying discussion forums as a useful learning resource (Liu, Kang, Cao, Lim, Ko, & Weiss, 2013).

Role of instructor.

In addition to content, the presence of instructional support may influence self-paced, self-directed learners’ experiences, but instructional support may be deemed less essential than high-quality content (Rhode, 2009; Schwier, Morrison, Daniel, & Koroluk, 2009). Contradictory evidence has been reported in MOOC settings, where instructors were found to be the most important factor in learner retention (Adamopoulos, 2013), although that study does not detail which aspects of the instructors’ role foster retention. Reliance on and interest in instructor interaction may also depend on learners’ goals and motivation (del Valle & Duffy, 2009). Given the large enrolments in MOOCs, high volumes of instructor interaction and feedback may not be feasible, nor as essential as in formal online courses (Hosler & Arend, 2012; Skramstad, Schlosser, & Orellana, 2012; Sheridan & Kelly, 2010). Archived-MOOCs offering high-quality content can therefore be of value to self-paced, self-directed learners, since these learners are less reliant on instructors.

Method

Context of the Study

The learners studied in this research were enrolled in courses that teach practical, skills-based subjects, and for both courses the only prerequisite was high-school-level mathematics. Moreover, the concepts and skills covered are useful for a variety of professions and fields of study.

We briefly describe the two courses, offered on the Coursera platform, that were used as cases for this study.

A sequel to LTP1, “Learn to Program: Crafting Quality Code” (LTP2), was offered in March-April 2013. Approximately 54,000 learners had enrolled in the course by the end of its 5-week live-session.

We study learners in the live- and archived-sessions of STATS. Because of the timing of the conception of this work, we study learners in two sessions of LTP1: archived-learners in the first-session and live-learners in the second-session. We also investigate the activity of LTP2 live-learners who accessed the archive of LTP1, to further understand how learners make use of archived-MOOCs.

Data Sources

The data corpus was collected through the Coursera platform and included the following.

Live-survey: A pre-course survey of live-learners included closed-ended questions on demographics, reasons for enrolment, intended time investment, the number of videos and assessments they intended to complete, previous knowledge, and MOOC experience.

Archived-survey: Archived-learners were asked to complete a survey similar to the live-survey with additional questions including why they took the course in its archived-mode and whether they would retake the course in live-mode.

Coursera database: The database contains records of videos accessed, assessments submitted, and posts to the discussion forums.

Clickstream: The clickstream includes a log of all user activity.

All data were anonymized at the institutional level before distribution to the researchers.

Analysis

Using the survey data, we computed descriptive statistics to better understand the characteristics of live- and archived-learners. Data from the clickstream were used to identify learners’ patterns of use of the course components. The database of each course provided additional evidence to corroborate and complement this analysis. Anonymized user identifiers provided a map across all data sources, allowing us to connect user behaviour characterized in the clickstream and database with measures of intent captured in the surveys.
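As a minimal illustration of this linkage, the following Python/pandas sketch joins behavioural summaries derived from the clickstream and course database to survey responses through the anonymized identifier. The file and column names (user_id, event_type, intent, and so on) are our assumptions for exposition, not the actual Coursera export schema.

import pandas as pd

# Load the three anonymized data sources (illustrative file and column names).
survey = pd.read_csv("archived_survey.csv")                 # user_id, intent, demographics
database = pd.read_csv("course_database.csv")               # user_id, quizzes, posts, ...
clickstream = pd.read_json("clickstream.json", lines=True)  # user_id, event_type, timestamp

# Summarize one behaviour of interest from the clickstream: videos viewed per learner.
video_views = (clickstream[clickstream["event_type"] == "video_view"]
               .groupby("user_id")
               .size()
               .rename("n_video_views"))

# Join behaviour to intent and demographics through the shared anonymized identifier.
learners = (survey.set_index("user_id")
                  .join(database.set_index("user_id"), how="left")
                  .join(video_views, how="left"))

print(learners.groupby("intent")["n_video_views"].describe())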

Results

We report on learner demographics, intent, and behaviour. Unless otherwise stated, the data presented include only those live- and archived-learners who completed the live- and archived-surveys, respectively. For STATS, 17,541 learners completed the live-survey and 1,923 completed the archived-survey; for LTP1, 28,585 learners completed the live-survey and 2,137 completed the archived-survey. We acknowledge that this population may be different from the population of all learners. However, it is known that many MOOC registrants do not actively participate (e.g., Balakrishnan & Coetzee, 2013) and by restricting ourselves to survey respondents, we are considering a population that is more engaged. Also, we are able to use demographics and learner intent to contextualize behaviour.

Learner Demographics

First, we highlight similarities and differences between demographics of live- and archived-learners, namely age, language proficiency, highest level of education, and reasons for enrolling in the STATS and LTP1 MOOCs. Such descriptive findings further contextualize the results of our clickstream data analysis, as we explain in the following sections.

In both STATS and LTP1, live-learners were younger than archived-learners. In STATS, 85.7% of live-learners and 78.9% of archived-learners were 45 years old or younger; in LTP1, 89.8% of live-learners and 80.0% of archived-learners were 45 years old or younger.

A greater proportion of live-learners identified English as their first language. For STATS, 36.2% of live-learners and 27.0% of archived-learners identified English as their first language. These percentages were 42.8% and 31.8% for LTP1.

Archived-learners in both courses had higher education levels than live-learners with many educated at an undergraduate or postgraduate level. For STATS, 87.9% of live-learners and 92.5% of archived-learners indicated they had completed at least an undergraduate degree, whereas for LTP1, 65.4% of live-learners and 74.0% of archived-learners had at least an undergraduate degree.

Since the experience of archived-learners is of central focus to this research, we examined the reasons why learners chose to enroll in the archived-courses. Figure 2 shows the percentage of survey respondents who selected each possible response as a reason for enrolling in the archived-MOOC. Learners could choose as many responses as applied to them. The top responses for learners in both MOOCs were that they enrolled in the live offering but were not able to complete the course (43.4% for STATS and 41.1% for LTP1), and that they arrived too late for the live offering (30.8% for STATS and 40.9% for LTP1). As these responses indicate, archived-learners were interested in the live-course and most (69.5% of LTP1 and 53.9% of STATS archived-learners) indicated that they would retake it live if it were re-offered.

Figure 2

Learner Intent

Time learners planned to spend.

For STATS, live-learners planned to devote more hours per week to the course than archived-learners (median of 5 hours for the live-learners and 2 hours for the archived-learners). Live- and archived-learners in LTP1 intended to devote a similar number of hours per week to the course (median of 5 hours for both the live- and archived-learners).

Work learners planned to complete.

Learners were asked how much work they planned to do for the course. We have classified their response choices into the following categories.

All required: The learner indicated that he or she planned to complete all requirements, including watching all videos and completing all assessments.

Most: The learner indicated that he or she planned to watch most videos and complete some assessments.

Not sure: The learner indicated that he or she was unsure.

Some: The learner indicated that he or she planned to watch some videos, perhaps on targeted topics, but was unlikely to complete assessments.

Table 1 shows the percentage of learners in each category. Perhaps motivated by the opportunity to earn an SoA, more live-learners intended to complete all requirements. For STATS, the relatively low percentage of archived-learners who planned to complete all requirements may be a reflection of the fact that the peer-assessments could not be completed in archived-mode.

Table 1

Learner Behaviour

Overview.

Table 2 provides summary statistics about the behaviour of learners in the live- and archived-MOOCs, as characterized by how many videos they accessed, assessments they attempted, and the time between first and last video access.

The average number of required videos accessed is similar for live- and archived-learners in STATS, but LTP1 live-learners watched, on average, about two more videos than archived-learners. While live-learners completed slightly more assessments than archived-learners, the difference is small. The optional videos in STATS are tutorials on the R statistical software and programming language; archived-learners watched, on average, approximately one more of these videos than live-learners.

We defined access time as the number of days between the first and last access of any video. In this analysis we consider only archived-learners who had been enrolled for at least the length of the live-MOOC. For both STATS and LTP1, over 50% of learners accessed videos for at most 10 days in both the live- and archived-courses. However, among learners in the top quartile of video access times, more archived-learners than live-learners had long access times. Archived-learners in the 90th percentile accessed videos over a period of 150 days or more, while for live-learners the 90th percentile was approximately 50 days. Thus, even the live-learners with the longest access times tended not to access the MOOCs after their formal conclusion, while some archived-learners accessed the learning resources over long periods of time. Note that, because the last day of access was necessarily constrained by the date on which the clickstream was extracted, some larger access times are censored, possibly underestimating the 90th percentile for archived-learners.
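For concreteness, the following sketch (Python with pandas, under assumed file and column names) shows how per-learner access times and their upper percentiles could be computed from such an event log.

import pandas as pd

# Illustrative event log of video accesses: one row per video view (user_id, timestamp).
events = pd.read_csv("video_events.csv", parse_dates=["timestamp"])

# Access time per learner: days between first and last video access.
span = events.groupby("user_id")["timestamp"].agg(first="min", last="max")
span["access_days"] = (span["last"] - span["first"]).dt.days

# Median, upper quartile, and 90th percentile of access time; the upper tail is
# right-censored by the date on which the clickstream was extracted.
print(span["access_days"].quantile([0.50, 0.75, 0.90]))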

Table 2

Sequencing of videos watched.

Archived-learners who enroll after the live-MOOC has ended have immediate access to all lecture videos. As a result, they have much greater opportunity than live-learners to explore the course in a non-sequential fashion, perhaps picking and choosing topics that are of interest to them.

To investigate this behaviour, we model a learner’s transition from video to video using a first-order Markov Chain. The statistics of interest are captured in a video-by-video matrix; rows are the last video watched, columns are the next video watched, and the entry is the estimated probability that a learner will make this transition. Since we are interested in transitions between videos rather than rewatches of the same video, we exclude self-transitions.
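A sketch of this estimation in Python is given below. The event-log column names and the assumption that videos are indexed from 0 to n_videos - 1 are ours for illustration, not a description of the Coursera data format.

import numpy as np
import pandas as pd

def transition_matrix(events: pd.DataFrame, n_videos: int) -> np.ndarray:
    """Estimate a first-order Markov transition matrix over lecture videos.

    `events` is assumed to have columns user_id, timestamp, and video_index
    (0 .. n_videos - 1); self-transitions (rewatches) are excluded.
    """
    counts = np.zeros((n_videos, n_videos))
    ordered = events.sort_values("timestamp")
    for _, seq in ordered.groupby("user_id"):
        videos = seq["video_index"].to_numpy()
        for prev, nxt in zip(videos[:-1], videos[1:]):
            if prev != nxt:                 # ignore repeated views of the same video
                counts[prev, nxt] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Row-normalize to estimated probabilities; rows with no transitions stay zero.
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

In the returned matrix, entry (i, j) estimates the probability that the next (different) video watched after video i is video j; a strong band immediately to the right of the diagonal corresponds to sequential viewing, as in Figures 3 to 6.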

Figures 3, 4, 5, and 6 give visual representations of these transition matrices. Hotter colours indicate larger probabilities, corresponding to more common video-to-video transitions. A hot spot immediately to the right of the diagonal indicates the common transition of watching the next video in the intended order.

In STATS, it was very common for both live- and archived-learners to follow the intended sequence, as indicated by the strong pattern running from upper-left to lower-right, one cell to the right of the diagonal. Figures 3 and 4 illustrate the transition matrices for learners whose intent was categorised as all required and most. The strong sequential pattern in video transitions exists regardless of intent.

For LTP1 learners who indicated they intended to do all of the required work for the course, both live- and archived-learners also tended to watch the videos in sequence (Figure 5). However, for learners who intended to do less than all of the work, video transitions were more sequential in the live-course than in the archived-course. Figure 6 illustrates the matrix for learners whose intent was categorised as most; a similar pattern was observed for learners who intended to do some or who indicated that they were not sure.

Figure 3

Figure 4

Figure 5

Figure 6

Quantity of assessments completed.

Archived-learners cannot earn a Statement of Accomplishment, so we cannot use earning an SoA as a metric of course completion. Quiz completion is one possible alternative measure. For STATS, Figure 7 shows the proportion of learners for each possible number of quizzes completed, broken down by intent. Both live- and archived-learners most commonly completed either zero or all seven quizzes. The largest proportion of learners completing all seven quizzes was observed among archived-learners who intended to complete all required work (35.7%).

Figure 7

Overall, the pattern was similar for both quizzes and assignments in LTP1, as shown in Figures 8 and 9. However, the proportion of learners who completed all seven quizzes is not as prominent in the live-course, perhaps indicative of the fact that an SoA could be earned without completing all quizzes.

Figure 8

Figure 9

Discussion forum interactions.

We also investigated discussion forum activity for live- and archived-learners. A greater proportion of live-learners (44.6% for STATS; 41.8% for LTP1) viewed threads than archived-learners (31.3% for STATS; 37.3% for LTP1). Of those who viewed threads, the mean number of views was 10.9 for STATS live-learners and 14.5 for STATS archived-learners, whereas the mean for both live- and archived-learners in LTP1 was 19.6.

As is typical of MOOCs, the proportion of learners who post or comment on the discussion forums is low for both live- and archived-learners (DeBoer, Ho, Stump, & Breslow, 2014). In both STATS and LTP1, more live-learners (12.1% for STATS and 14.5% for LTP1) posted than archived-learners (9.3% for STATS and 13.2% for LTP1). It is notable that archived-learners posted to the forums at all, even though the courses were no longer active. The mean number of posts by those archived-learners who did post was 5.1 for STATS and 3.9 for LTP1.

Activity patterns: activity between reattempts of an assessment.

In both STATS and LTP1, learners were allowed multiple attempts at the quizzes and the maximum grade achieved counted towards their course assessment. As an investigation of the relative evidence for self-regulatory activity in the live- versus archived-courses, we examined the frequency of use of MOOC materials, in particular lecture videos and forums, between repeated attempts at machine-graded quizzes.
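A minimal sketch of how such intervals could be identified from an event log is shown below; the event types and column names (quiz_submit, video_view, forum_view, and so on) are illustrative assumptions rather than the actual log schema.

import pandas as pd

# Illustrative combined event log: user_id, quiz_id, event_type, timestamp.
events = pd.read_csv("events.csv", parse_dates=["timestamp"]).sort_values("timestamp")
submissions = events[events["event_type"] == "quiz_submit"]

records = []
for (user, quiz), attempts in submissions.groupby(["user_id", "quiz_id"]):
    times = attempts["timestamp"].tolist()
    # Examine each interval between consecutive attempts at the same quiz.
    for start, end in zip(times[:-1], times[1:]):
        between = events[(events["user_id"] == user)
                         & (events["timestamp"] > start)
                         & (events["timestamp"] < end)]
        records.append({"quiz_id": quiz,
                        "video_between": (between["event_type"] == "video_view").any(),
                        "forum_between": (between["event_type"] == "forum_view").any()})

reattempts = pd.DataFrame(records)
# Share of reattempt intervals containing remedial video or forum activity, per quiz.
print(reattempts.groupby("quiz_id")[["video_between", "forum_between"]].mean())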

In Figure 10, we see that a greater percentage of archived-learners in both courses accessed lecture videos between quiz reattempts, although this behaviour is less evident for the later quizzes. As can be seen in Figure 11, archived-learners use the forums as a resource for self-regulated learning at least as much as learners in the live-course. Although not shown here, no distinguishing patterns were observed among learners’ varying levels of intent.

Figure 10

Figure 11

Activity patterns: LTP2 learners active in LTP1.

LTP2 was a sequel to LTP1, with the content from LTP1 a presumed prerequisite. During the period when LTP2 was live, LTP1 was available as an archived course, and thus was accessible reference material for learners enrolled in the live offering of LTP2. Of the 16,875 active live-learners in LTP2, 2,192 (13.0%) were active in the archived LTP1.

Here we report on the activity in the archived-LTP1 of those 2,192 LTP2 live-learners. We are only reporting on activity of these learners while LTP2 was live, so the access time available is less than that for the general LTP1 archived-learner population. It appears that watching videos was the primary reason for using the LTP1 archived course, with less interest among these learners in accessing assignments and discussion forums.

Almost all of these learners visited LTP1 to view videos: 93.1% accessed at least one LTP1 video, with an average of 9.4 videos accessed, and only 3.7% accessed all of the LTP1 videos. Relative to the general population of LTP1 archived-learners (see Table 2), the LTP2 live-learners accessed an average of 4.1 fewer videos.

Fewer of the LTP2 live-learners who concurrently accessed LTP1 completed LTP1 assessments: 73.7% did not submit any quizzes and 92.5% did not submit any assignments. Only 0.8% completed both all seven quizzes and all three assignments.

The LTP1 forums were not a popular resource for the LTP2 live-learners, with over 75% of them viewing either no threads or a single thread once.

Discussion

MOOCs commonly have defined start and end dates for a cohort but remain open after the end date with learners continuing to enroll. Can these archived courses meet learners’ needs? Our goal in this research was to examine learner characteristics and behaviour in live- and archived-MOOCs. We found more similarities than differences, with indications that archived-learners interact with the course in much the same way as live-learners. These similarities are consistent with the top reasons why learners used the archived course materials, which were because they arrived too late in the live-course or they were unable to complete the course during the live-session. Since this is an observational study, we cannot attribute differences between learners and their behaviours to the differences between live- and archived-MOOCs.

Previous research on self-directed learning in MOOCs (Kizilcec, Piech, & Schneider, 2013) and in other settings (del Valle & Duffy, 2009) has stressed the connection between learners’ intent and their level of engagement with learning and assessment resources. Here we took a step towards understanding this relationship by including in our analysis learners who would receive no external acknowledgement of their learning effort, such as an SoA. Although fewer archived-learners than live-learners indicated that they planned to complete all required work, the proportion was still substantial, and the mean numbers of videos accessed and assessments attempted were similar for live- and archived-learners. In both the live- and archived-groups, learners who intended to complete all of the required work tended to complete either none or all of the assessments. This may indicate that learners decide very early whether the course meets their needs and, having made that decision, act accordingly.

A common topic in the MOOC literature is retention (Breslow et al., 2013; Chen et al., 2012). The existence of instructional support and a peer cohort created a social structure that we thought might positively affect retention in the live-MOOCs. However, archived-learners achieved similar progress, watching a comparable number of videos and displaying a similar pattern in the particular assessments that were attempted. In the archived-MOOC, the flexible pace may have been a contributing factor to this retention, as also seen by Gooding et al. (2013). Yet there is tension between providing the flexibility of archived-MOOCs and the strong social support structure of live-MOOCs, illustrating the potential for continued improvement of MOOC formats.

An important component of live-MOOCs is the online discussion forum. In the archived-MOOCs, despite the absence of instructional presence and the reduced cohort presence on the forums, there was still extensive discussion forum use. Bruff et al. (2013) found that an on-campus cohort of learners using an archived-MOOC viewed forum posts, but few reported posting to the forums themselves; instead, they chose to ask questions locally. Our archived-learners, in the absence of a recognizable cohort, both viewed forum posts and posted to the forums.

Although the questions and answers in the archived forums were not recent, they remained a valuable resource for archived-learners. These forums became another medium for content delivery, rather than an opportunity for social interaction. Existing research (Anderson, Poellhuber, & McKerlich, 2010; Rhode, 2009; Schwier et al., 2009) has shown that learners value course content over peer and instructor interaction. For our archived-learners, less peer interaction and a minimal chance of instructor interaction did not generally deter them from covering their intended content. Their reasons for, and the extent of, their desired interaction with instructors and peers, however, remain open questions beyond the scope of this study. An investigation into these questions would inform the potential development of a new modality for self-directed, on-demand learning that combines the self-paced structure of an archived-MOOC with the desired instructional support structure.

Live-learners had the opportunity to earn an SoA and all learners had the opportunity to re-take quizzes to demonstrate mastery. In both groups of learners, indications of self-regulatory behaviour were observed in remedial action taken between repeated attempts at quizzes. Archived-learners used the lecture videos and the discussion forums as resources for self-regulated learning at least as much as learners in the live-course, even though there was no external reward for improved results.

With all course materials immediately available on registration, archived-learners have the opportunity to view the videos in the order of their choosing, rather than being limited by the release of materials at regular intervals. We had hypothesized that archived-learners might be more likely to explore the course in a non-sequential fashion, picking and choosing the topics of most interest to them. However, for archived-learners who intended to complete all required work, the sequence of videos accessed closely matched the sequence that the instructor intended. These learners thus treated the archived-MOOC as a traditional course, rather than as a learning resource to be accessed as needed. However, the use of archived-LTP1 by LTP2 live-learners illustrates that there is also potential for archived-MOOCs to be used as reference material.

Archived-learners have more flexibility, not only in terms of access to content but in the pace at which they complete the course. Additional exploration of the pace at which archived-learners access videos and complete assessments may have valuable implications for course design.

In this study, we investigated two MOOCs, both of which had live-sessions followed by a period during which the MOOCs remained available as archives of the live-sessions. Archived-learners interacted with the courses in much the same way as live-learners, and they succeeded at a rate similar to live-learners, with minimal guidance and no obvious cohort. There is potential for MOOCs to be beneficial as self-study courses, and for the development of new modalities that combine the most valued aspects of live- and archived-MOOCs to best meet learner needs and interests.

Acknowledgements

The authors gratefully acknowledge the assistance of Open UToronto, in particular Stian Håklev, Laurie Harrison, and William Heikoop. We would also like to thank the anonymous reviewers whose feedback greatly improved the paper. This project is part of the MOOC Research Initiative, funded by the Bill & Melinda Gates Foundation.

References

Adamopoulos, P. (2013). What makes a great MOOC? An interdisciplinary analysis of student retention in online courses. In Proceedings of the 34th International Conference on Information Systems, ICIS (Vol. 2013).

Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. The International Review Of Research In Open And Distance Learning, 4(2). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/149/230

Anderson, T., Poellhuber, B., & McKerlich, R. (2010). Self paced learners meet social software: An exploration of learners’ attitudes, expectations and experience. Online Journal of Distance Learning Administration, 13(3).

Balakrishnan, G., & Coetzee, D. (2013). Predicting student retention in massive open online courses using hidden Markov models. Technical Report No. UCB/EECS-2013-109. Retrieved from http://www.eecs.berkeley.edu/Pubs/TechRpts/2013/EECS-2013-109.html

Breslow, L., Pritchard, D. E., DeBoer, J., Stump, G. S., Ho, A. D., & Seaton, D. T. (2013). Studying learning in the worldwide classroom: Research into edX’s first MOOC. Research & Practice in Assessment, 8, 13-25. Retrieved from http://www.rpajournal.com/dev/wp-content/uploads/2013/05/SF2.pdf

Bruff, D. O., Fisher, D. H., McEwen, K. E., & Smith, B. E. (2013). Wrapping a MOOC: Student perceptions of an experiment in blended learning. MERLOT Journal of Online Learning and Teaching, 9(2), 187-199. Retrieved from http://jolt.merlot.org/vol9no2/bruff_0613.htm

Catropa, D. (2013, February 24). Big (MOOC) data. Inside Higher Education. Retrieved from http://www.insidehighered.com/blogs/stratedgy/big-mooc-data

Chen, Z., Cheng, J., Chia, D., & Koh, P. W. (2012). Engagement, interaction, and retention in online classes. CS224W Final Report. Retrieved from http://www.stanford.edu/class/cs224w/upload/cs224w-003-final.pdf

DeBoer, J., Ho, A. D., Stump, G. S., & Breslow, L. (2014). Changing “course”: Reconceptualizing educational variables for massive open online courses. Educational Researcher, 43(2), 74-84.

DeBoer, J., Stump, G. S., Seaton, D., & Breslow, L. (2013). Diversity in MOOC students’ backgrounds and behaviors in relationship to performance in 6.002x. In Proceedings of the Sixth Learning International Networks Consortium Conference. Retrieved from https://tll.mit.edu/sites/default/files/library/LINC ‘13.pdf

del Valle, R., & Duffy, T. (2009). Online learning: Learner characteristics and their approaches to managing learning. Instructional Science, 37(2), 129-149.

Dobrovolny, J. (2006). How adults learn from self‐paced, technology‐based corporate training: New focus for learners, new focus for designers. Distance Education, 27(2), 155-170.

Ely, K., Sitzmann, T., & Falkiewicz, C. (2009). The influence of goal orientation dimensions on time to train in a self-paced training environment. Learning and Individual Differences, 19(1), 146-150.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.

Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and practice. Taylor & Francis.

Gerlich, R. N., Mills, L. H., & Sollosy, M. (2009). An evaluation of predictors of achievement on selected outcomes in a self-paced online course. Research in Higher Education Journal, 4(1), 1-14. Retrieved from http://aabri.com/manuscripts/09199.pdf

Gooding, I., Klaas, B., Yager, J. D., & Kanchanaraksa, S. (2013). Massive open online courses in public health. Frontiers in Public Health, 1, 59. doi:10.3389/fpubh.2013.00059

Horton, W. (2006). e-Learning by design. San Francisco: Pfeiffer.

Hosler, K., & Arend, B. (2012). The importance of course design, feedback, and facilitation: Student perceptions of the relationship between teaching presence and cognitive presence. Educational Media International, 49(3), 217-229.

Iiyoshi, T., & Kumar, M. (Eds.). (2008). Opening up education: The collective advancement of education through open technology, open content, and open knowledge. Cambridge, MA: MIT Press.

Kizilcec, R., Piech, C., & Schneider, E. (2013). Deconstructing disengagement: Analyzing learner subpopulations in massive open online courses. In Proceedings of the Third International Conference on Learning Analytics and Knowledge. Leuven, Belgium. Retrieved from http://www.stanford.edu/~cpiech/bio/papers/deconstructingDisengagement.pdf

Levenberg, A., & Caspi, A. (2010). Comparing perceived formal and informal learning in face-to-face versus online environments. Interdisciplinary Journal of E-Learning and Learning Objects, 6(1), 323-333.

Liu, M., Kang, J., Cao, M., Lim, M., Ko, Y., & Weiss, A. S. (2013). Understanding MOOCs as an emerging online learning tool: Perspectives from the students. In T. Bastiaens & G. Marks (Eds.), Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2013 (pp. 2008-2015). Chesapeake, VA: AACE.

Lowenthal, P. R., Wilson, B., & Parrish, P. (2009). Context matters: A description and typology of the online learning landscape. In 32nd Annual proceedings: Selected research and development papers presented at the annual convention of the Association for Educational Communications and Technology. Washington D.C.

Pintrich, P. (2000). The role of goal orientation in self–regulated learning. In M. Boekaerts, P. Pintrich, & M. Zeidner (Eds.), Handbook of self–regulation (pp. 451–502). San Diego: Academic Press.

Rhode, J. F. (2009). Interaction equivalency in self-paced online learning environments: An exploration of learner preferences. International Review of Research in Open & Distance Learning, 10(1). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/172/684

Schwier, R., Morrison, D., Daniel, B., & Koroluk, J. (2009, October). Participant engagement in a non-formal, self-directed and blended learning environment. In World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education, 2009(1), 1948-1956. Chesapeake, VA: AACE.

Seaton, D. T., Bergner, Y., Chuang, I., Mitros, P., & Pritchard, D. E. (2014). Who does what in a massive open online course? Communications of the ACM, 57(4), 58-65.

Sheu, F. R., Lee, M. M., Bonk, C. J., & Kou, X. (2013, June). A mixed methods look at self-directed online learning: MOOCs, open education, and beyond. Paper presented at the 25th Annual Ethnographic & Qualitative Research Conference. Cedarville, OH.

Skramstad, E., Schlosser, C., & Orellana, A. (2012). Teaching presence and communication timeliness in asynchronous online courses. Quarterly Review of Distance Education, 13(3), 183-188.

Sheridan, K., & Kelly, M. A. (2010). The indicators of instructor presence that are important to students in online courses. MERLOT Journal of Online Learning and Teaching, 6(4), 767-779.

Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal, 45(1), 166-183.

© Campbell, Gibbs, Najafi, Severinski