September - 2015

Effectiveness of Integrating MOOCs in Traditional Classrooms for Undergraduate Students

Maria Joseph Israel
University of San Francisco, USA

Volume 16, Number 5

Abstract

The idea of a Massive Open Online Course (MOOC) has attracted a lot of media attention in the last couple of years. MOOCs have been used mostly as stand-alone, online courses without credits. However, some researchers, teachers, colleges, and universities have attempted to utilize MOOCs in a blended format in traditional classroom settings. This paper reviews some recent experiments in the context of current MOOC trends by examining the methodologies utilized in blended MOOCs in a face-to-face environment. It further discusses the preliminary findings on the effectiveness of learning outcomes and the impact on students and instructors in the blended MOOC format. The review of blended MOOCs in classrooms helps to form the emerging consensus on integrating MOOCs in conventional classroom settings, while highlighting potential opportunities and challenges one might face when implementing MOOCs in similar or entirely different contexts.

Keywords: Online learning, MOOC, Higher education, Hybrid learning.

Introduction

Some consider the Massive Open Online Course (MOOC) a disruptive innovation that could bring reform to higher education (Billington & Fronmueller, 2013; Dyer, 2014). Koller (2012) observed that many MOOCs are used as stand-alone, online courses without formal credit, open to anyone or to students registered in the academic institutions that produce them. She also noted that MOOCs present new opportunities for supporting face-to-face classes (Koller, 2012). However, there is very little empirical data on the implementation of MOOCs in contexts other than those in which they were designed and developed (Griffiths et al., 2014). Of late, a growing number of researchers, teachers, colleges, and universities have begun to report integrating MOOCs in traditional classroom settings to support face-to-face learning experiences in a blended format (Bruff, Fisher, McEwen, & Smith, 2013; Caulfield, Collier, & Halawa, 2013; Firmin, Schiorring, Whitmer, Willett, Collins, & Sujitparapitaya, 2014; Griffiths, Chingos, Mulhern, & Spies, 2014; Holotescu, Grosseck, Cretu, & Naaji, 2014).

This paper reviews the results of a handful of recent research papers on incorporating MOOCs in conventional classroom settings for undergraduate students and examines the methodologies utilized in implementing MOOCs in a blended model. It attempts to address how MOOCs have been integrated into traditional classrooms, how effective these integrations were for student learning outcomes, and what impact they had on students and instructors.

The purpose of this review is to provide clarity on the emerging consensus on integrating MOOCs in a blended format and to assess its opportunities and challenges. Understanding its intricacies can promote further research, assist in improving the design of future MOOCs, and inform useful strategies for similar implementations by others.

This paper first introduces the emerging MOOC trends, second presents a brief review of recent experiments on blended MOOCs in traditional classrooms, third critiques the reviewed articles, fourth highlights potential opportunities and challenges, and finally offers recommendations for future implementations.

Emerging MOOC Trends

MOOCs are online courses open to all who have access to an internet connection and are self-motivated to learn anywhere and anytime in the world (Jordan, 2014; Liyanagunawardena, Adams, & Williams, 2013). Liyanagunawardena et al. (2013) further noted that the word ‘open’ in MOOCs implies that people do not need any specific academic qualification, pay no fees, and are under no obligation to complete the course. Though large numbers of people across the globe enroll in MOOCs, the current trend shows that the typical completion rate is less than 10% of total enrollment (Jordan, 2014; Pappano, 2012; Yuan & Powell, 2013). However, Stephen Downes opined that “different people have different objectives for MOOCs, and what we find in informal learning generally is that people are successful through informal learning, insofar as it enables them to do what it is that they wanted to do” (as cited in Buck, 2013, para. 6).

Based on the analysis of 11,000 participants in Duke University’s first MOOC, researchers observed multiple motivations for participating in MOOCs, including gaining an understanding of the subject matter, exploring online education, experiencing online social interactions, and fun and enjoyment, with no particular expectations for completion or achievement (Belanger & Thornton, 2013). Koller, Ng, Do, and Chen (2013) expressed a similar view on the motivation and intention of registrants in Coursera MOOCs. The analysis of the first 17 courses jointly launched by HarvardX and MITx on the edX platform from Fall 2012 to Summer 2013 revealed that the percentage of registrants who ceased activity was highest in the first week, at 50%, and fell to 16% in the second week. The report further highlighted that 4% of registrants explored half of the online content, 55.8% explored about a quarter of the online content, and 34.7% never engaged with the online content (Ho et al., 2014).

Some of the factors which influence learners’ participation in MOOCs can be understood from the perspective of Bouchard’s (2009) four dimensions affecting effective self-directed learning behavior. They are: (i) conative, referring to all possible reasons a person can have for learning, such as drive, motivation, initiative, and confidence; (ii) algorithmic, referring to pedagogical issues such as sequencing, pacing, goal setting in learning, and evaluation of progress; (iii) semiotics of learning, referring to various modes of content delivery in e-learning such as hypertext, audio, video, and 2D and 3D images; and (iv) economy, referring to the availability of courses for credit or non-credit, informally in chat groups, in any language, from any country, and from numerous sources, affecting learning in terms of future employment and cost factors. Due to limited space, an analysis of MOOC integration in traditional classrooms based on Bouchard’s four dimensions of learner autonomy in self-directed online learning is beyond the scope of this paper.

Integration of MOOCs in Traditional Classrooms

MOOCs offer opportunities to wrap on-campus courses around existing MOOCs (Koller, 2012). When MOOCs are offered in hybrid formats, they can improve student outcomes and reduce costs (Griffiths et al., 2014). Bruff et al. (2013), Caulfield et al. (2013), Firmin et al. (2014), Griffiths et al. (2014), and Holotescu et al. (2014) took steps to integrate MOOCs in traditional classroom settings to enhance learning experiences. This approach has been termed the “distributed flip” (Caulfield et al., 2013) or the blended/hybrid model (Bruff et al., 2013; Griffiths et al., 2014; Holotescu et al., 2014), in which teachers integrate online content and activities with face-to-face instruction to optimize the learning process. Garrison and Vaughan (2008) described the basic principle of blended learning as, “face-to-face oral communication and online written communication are optimally integrated such that the strengths of each are blended into a unique learning experience congruent with the context and intended educational purpose” (p. 5). The proportion of face-to-face and online learning activities may vary considerably. In fact, the Babson Survey Research Group, which surveyed chief academic officers at 2,800 colleges and universities in the U.S., found that a typical blended course has 30% to 79% of its content delivered online using online discussions, video lectures, quizzes, and assignments (Allen & Seaman, 2013). The key assumptions in designing blended learning are thoughtfully integrating face-to-face and online learning, fundamentally rethinking the course design, and restructuring class hours for effective student engagement (Garrison & Vaughan, 2008). In this paper, the terms ‘distributed flip’, ‘blended MOOCs’, and ‘hybrid MOOCs’ are used interchangeably.

Though MOOCs are generally used as stand-alone online courses, a handful of MOOCs have been used in other formats to support face-to-face learning environments. This paper reviews MOOCs that were incorporated into traditional classroom settings by Bruff et al. (2013), Caulfield et al. (2013), Firmin et al. (2014), Griffiths et al. (2014), and Holotescu et al. (2014). These models vary in student population size from a mere 10 to several hundred students, in the number of courses adopted from a single course to a maximum of 17 courses, in the duration of the experiment from a single semester to multiple semesters spread over two academic years, and in the adoption method from a supplementary text to fully integrated courses in traditional classrooms. The models can be roughly divided into two categories: Single MOOC Adoption (models 1, 2, and 3) and Multiple MOOCs Adoption (models 4 and 5). Each of these categories can be further divided into models adopting ‘live’ or archived MOOCs as replacements for traditional on-campus courses (models 1, 2, and a MOOC course under model 5) and models adopting MOOCs as supplementary texts (models 3, 4, and 5). Despite these differences, all the models presented in this paper attempt to transform the structure and approach of teaching and learning by blending MOOCs into existing traditional classrooms. These models can inform instructors on how to optimize student engagement and achieve better learning outcomes. Each of these models is discussed below.

Integration Model 1

Caulfield et al. (2013) reported the work of Patti Ordonez-Rozo, who integrated Stanford’s Introduction to Databases MOOC into her conventional classroom in Spring 2012 for a group of 26 students at the University of Puerto Rico, Rio Piedras, Puerto Rico. She asked students to enroll in the MOOC, follow the online materials, and complete all assignments. The MOOC was synchronized with the on-campus computer science class at the university. In the class time freed up by the MOOC, the instructor focused on in-class activities, projects, and assessments using the sequenced content in the MOOC.

The researchers analyzed backend data on students’ use of videos, participation in discussion forums, and completion of assignments and quizzes. They reported that the instructor appreciated the affordances of the public course materials, because appropriate video lectures, quizzes, and assignments were readily available in the MOOC. The instructor could focus more on designing class time for discussion, feedback, and class projects. However, it was observed that students did not engage actively in the discussion forums; what little participation there was tended to be motivated by getting a wrong answer on a question. The researchers also observed that 62% of students used the discussion forums for one session or less, while a quarter of the students did not visit the forums at all.

On the other hand, students extensively used interactive elements like the video lectures and quizzes presented in Stanford’s Introduction to Databases MOOC. Students viewed the 54 videos a total of 3,445 times, with a median of 120 views per student. A third of the students viewed all the videos in their entirety, and three-quarters of the class viewed more than half of the length of the videos. The report further noted that student time with study materials increased in the blended MOOC compared to the traditional classroom setting, in which typical compliance with textbook reading may be 20-30% of students on any day in class (Hobson, 2004). The researchers examined this blended MOOC more from the perspective of students’ engagement with the study materials provided in the MOOC than from the perspective of student outcomes.

Integration Model 2

In the next study, Bruff et al. (2013) integrated Stanford University’s Machine Learning MOOC at Vanderbilt University during the Fall 2012 semester. The MOOC was integrated in its entirety, as it coincided with the on-campus schedule. A group of 10 students participated in this ‘wrapped’ course. Students were asked to enroll in the MOOC and were required to participate in all of its activities: watching video lectures, taking part in discussion forums, and completing quizzes and programming assignments. They submitted screenshots of their work to the on-campus instructor, which contributed to their grades in the Vanderbilt course.

The MOOC was supplemented with additional reading assignments on topics which were not covered in the MOOC. After the 10-week machine learning MOOC ended, students used the final four weeks for their individual project work. Though students were satisfied (overall rating 4.17 on a 5-point scale, 1 = very poor and 5 = excellent), they had concerns regarding elements of the MOOC integration in class. Students found the machine learning MOOC suitable for self-paced learning, citing its flexibility, customizability, and accessibility. Similar to the previous study by Caulfield et al. (2013), students did not engage actively in the online discussion forums provided by the MOOC, but rather preferred to interact with the local learning community provided in the on-campus component of the course. They preferred to interact face-to-face in the classroom rather than online, though students admitted that the discussion forum was useful for getting help from other online students. The other reason cited was lack of time. Students also mentioned the misalignment of MOOC content with the face-to-face class and identified the role of the on-campus instructor as a facilitator. The authors recommended more complex forms of blended learning in which course materials can be drawn from multiple MOOCs and other online sources. It is important to note that this blended MOOC makes it difficult to identify an emerging trend in blended MOOCs because it had just 10 students and provided no data on learning outcomes. However, the authors highlighted the difficulty in aligning Stanford’s Machine Learning MOOC with the on-campus semester schedule and the need to supplement it with additional reading materials to meet the university’s course requirements.

Integration Model 3

Similar to the previous blended MOOC, Holotescu et al. (2014) conducted a blended MOOC for a group of 70 students in web programming at the University of Politehnica Timisoara, Romania, using an educational microblogging platform named Cirip, which functioned as a social mobile learning management system (LMS) for the course. Students could sign up for a MOOC of their choice matching the content of an on-campus web programming course and had to participate in at least 10% of its activities. The study stated that overall satisfaction with the blended course was positive, though exact data were not provided. The instructor could expose students to types of learning materials not provided by her on the university campus. The researchers identified that 24% of students completed the entire MOOC, and 66% completed half of the MOOC’s materials and assignments. It is assumed the remaining students met the basic requirement of 10% participation in the MOOC. However, students expressed disappointment that they did not receive direct feedback from the online instructors of the MOOCs they enrolled in. They also recommended that videos should contain a summary of the content for easy search and navigation in online MOOCs. This implementation differs from the previous two experiments of Caulfield et al. (2013) and Bruff et al. (2013): Holotescu et al. (2014) permitted students to complete at least ten percent of either archived or ‘live’ MOOCs, which contributed towards their participation grades. The authors of this blended MOOC experiment aimed at encouraging students to use MOOCs as additional learning resources and thus introduced students to different learning materials related to web programming.

Integration Model 4

In another blended learning project, in the Spring 2013 semester, San Jose State University (SJSU) offered three college preparatory MOOCs on the Udacity platform (Firmin et al., 2014). The courses included a remedial algebra survey course (MATH 6L), an introduction to college-level algebra (MATH 8), and an introduction to college-level statistics (STAT 95). Over 15,000 students registered for these courses. Retention, pass rates, and online support were tested using an augmented online learning environment (AOLE) for a group of 213 students, including 98 matriculated students aged 18 to 24 and 115 non-matriculated students aged 15 to 86. AOLE enrollment was limited to 100 students per course, with a breakdown of 50 SJSU students and 50 non-SJSU students per class, and roughly one-half matriculated. The strongest indicator of success was student effort, measured in terms of problem sets submitted and time spent viewing video lectures.

The researchers found that matriculated students performed better than non-matriculated students: MATH 6L, matriculated 29.8% and non-matriculated 17.6%; MATH 8, matriculated 50.0% and non-matriculated 11.9%; STAT 95, matriculated 54.3% and non-matriculated 48.7%. Another measure of effort was the amount of time students spent viewing video lectures, which had a strong relationship with pass rates, especially for the STAT 95 course. STAT 95 students completed half of their problem sets on average, while MATH 6L and MATH 8 students completed less than 25% and 23%, respectively. For STAT 95 students, the probability of passing crossed 50% at 223.45 hours of video time watched.
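The 50% crossing point reported above is the kind of threshold produced by a model relating pass/fail outcomes to hours of video watched. Firmin et al. (2014) do not describe their estimation method in the passage reviewed here, so the following Python sketch is purely illustrative: it uses simulated data and an assumed logistic relationship to show how such a crossover point can be recovered from a fitted intercept and slope as -b0/b1.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Illustrative only: simulate hours watched and pass/fail outcomes under an
    # assumed logistic relationship (the 223.45-hour figure is taken from the text).
    rng = np.random.default_rng(0)
    hours = rng.uniform(0, 400, size=500)              # hours of video watched
    p_pass = 1 / (1 + np.exp(-(hours - 223.45) / 40))  # assumed true pass probability
    passed = rng.random(500) < p_pass                  # simulated pass/fail labels

    # Fit a logistic regression of pass/fail on hours watched.
    model = LogisticRegression().fit(hours.reshape(-1, 1), passed)
    b0, b1 = model.intercept_[0], model.coef_[0][0]

    # The predicted probability of passing equals 0.5 where b0 + b1 * hours = 0.
    crossover = -b0 / b1
    print(f"Estimated 50% crossover: {crossover:.1f} hours of video watched")

Under these assumptions, the printed crossover is close to the assumed 223.45 hours, illustrating how a single reported threshold summarizes an underlying dose-response relationship between effort and passing.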

There were no statistically significant relationships between students’ demographic characteristics, use of online support, and positive outcomes. The demographic characteristics examined included ethnicity, gender, and income. However, the researchers cautioned that this should not be interpreted to mean that online support cannot increase student engagement and success. They noted factors such as students’ limited online experience, lack of awareness of the availability of online support, and difficulties interacting with some aspects of the online platform as hindering students’ academic performance. The research concluded that the low pass rates in all courses were due to targeting academically at-risk students. Though this experiment was similar to the other studies, it mostly targeted college students requiring remedial classes in select courses.

Integration Model 5

Similar to the previously mentioned experiments, Griffiths et al. (2014) conducted an extensive two-year experiment from 2012 to 2013 on seven university campuses in the University System of Maryland (USM). The experiment integrated MOOCs and other online technologies in a traditional campus environment. While four campuses used MOOCs from Coursera, three used courses from the Open Learning Initiative (OLI) at Carnegie Mellon University. Three courses were from OLI and 14 were MOOCs, of which seven were developed by USM faculty participants. A total of 17 courses were embedded in a variety of hybrid formats across the campuses; however, one of them was delivered entirely online, with students required to enroll and complete all the online assignments. Seven side-by-side comparison tests were conducted to evaluate the outcomes of students in hybrid sections against those in traditionally taught courses. The subjects were computer science, biology, communications, statistics, and pre-calculus. The study also collected feedback from both faculty and students involved in the research.

There were 1,598 students (820 in the control group and 778 in the experiment group) with diverse ethnic backgrounds and income levels, and an average age of 20. The average section size was 76 for the control group and 77 for the experiment group. Weekly face-to-face minutes were 126 for the control group and 72 for the experiment group.

The report further stated that student outcomes, in terms of pass rates, scores on common tests, and grades, were roughly the same or slightly better in hybrid sections (cumulative GPA of 2.85 for the statistics course) than in traditional face-to-face sections (cumulative GPA of 2.82 for the statistics course). The pass rate was 83% for the control group and 87% for the experiment group, and the final score was 70% for the control group and 73% for the experiment group. Results were similar for the other subjects, with differences statistically indistinguishable from zero. This was also true for students from low-income families, under-represented minorities, first-generation college students, and those with weaker academic preparation. The study also found no consistent evidence of negative effects of the hybrid format for any of the subgroups, which is consistent with the results of the blended MOOC experiment at San Jose State University mentioned above. Despite these positive results, students expressed lower satisfaction with their experience due to less time for face-to-face interaction with instructors. The researchers affirmed that online courses alone may not address higher education challenges, as students place a high value on personal interaction with faculty. This is consistent with the current objections and worries expressed by many in higher education (Dolan, 2013; Dyer, 2014). The Griffiths et al. (2014) study provides rich data on blending MOOCs across multiple campuses.

Analysis: Effectiveness of Integrating MOOCs in Traditional Classrooms

Some of the major findings of the reviewed studies on the effectiveness of integrating MOOCs in traditional classrooms are: modest positive impacts on learning outcomes, no significant evidence of negative effects for any subgroups of students, and lower levels of student satisfaction in blended MOOCs in classrooms.

Modest Positive Impacts on Student Learning Outcomes

Most of the research studies reviewed in this paper claimed that the impact of incorporating MOOCs in traditional classroom settings was almost equal to or slightly better than face-to-face teaching environments (Bruff et al., 2013; Caulfield et al., 2013; Holotescu et al., 2014; Griffiths et al., 2014), consistent with other large-scale studies combining online and face-to-face courses (Means, Toyama, Murphy, Bakia, & Jones, 2010). The only exception among the reviewed studies is the one at San Jose State University, which had poor student outcomes (Firmin et al., 2014). Griffiths et al. (2014) and Firmin et al. (2014) measured student learning outcomes in terms of students’ pass rates, scores on common tests, and grades.

In the Griffiths et al. (2014) study, the reported cumulative GPA for the statistics course was 2.82 for the control group and 2.85 for the experiment group. The pass rate was 87% for the experiment group and 83% for the control group, and the final score was 73% for the experiment group and 70% for the control group. The researchers affirmed that the results were similar for the other subjects as well. In the study conducted by Firmin et al. (2014), the overall course pass rate was 33.3%, though there were slight variations across the courses. The researchers observed that students lacked adequate preparation for the mathematics and statistics courses and made minimal use of the available support services, which affected their academic performance (Firmin et al., 2014). The other three research studies do not indicate exact results on learning outcomes, apart from a brief acknowledgement of positive student outcomes.

There are other benefits resulting from the use of MOOCs, as mentioned by Griffiths et al. (2014), including that students gained stronger critical thinking in terms of the ability to distinguish between opinions and arguments, and improved their skills in critiquing with analytical comments. Students’ motivation and perseverance play an important role in completing both in-class and online activities; surprisingly, none of the reviewed studies dealt with this aspect in their analysis of students’ learning outcomes.

No Significant Evidence of Negative Effects for Any Subgroups of Students

The other key finding from the reviewed studies was that there was no significant evidence of negative effects for any subgroups (Firmin et al., 2014; Griffiths et al., 2014), contrary to the general assumption that academically at-risk students fare worse in entirely online learning (Xu & Jaggars, 2013). This is evident from the pass rates reported in the Griffiths et al. (2014) study for different subgroups based on race and ethnicity, gender, and parents’ income, given in Table 1 below:

Table 1: Pass Rates for Different Subgroups of Students (Griffiths et al., 2014)

Categories Control Group Experiment Group
White 50% 51%
Black 31% 34%
Hispanic 4% 4%
Asian 5% 4%
Others 9% 7%
Female 61% 60%
Less than $50,000 15% 17%
$50,000 - $100,000 20% 21%
More than $100,000 28% 29%

The researchers also stated:

Students from low-income families, under-represented minorities, first generation college students or those with weaker academic preparation fared well or slightly better in hybrid sections. Perhaps most significantly, we do not find any evidence that poorly prepared students, as identified by below-average SAT scores, are harmed by the hybrid format. At a minimum, the nearly complete absence of negative effects is a robust finding. Thus, worrying that disadvantaged students are most likely to be harmed by technology-enhanced education is almost not borne out by our data. (Griffiths et al., 2014, p. 35)

Similarly, Firmin et al. (2014) noted, “the lack of significant relationships between student demographic characteristics and success suggests that early interventions designed to help students engage and stay on track can increase persistence and success for all students” (p. 195). This particular point was not dealt with in the other three reviewed empirical studies of Bruff et al. (2013), Caulfield et al. (2013), and Holotescu et al. (2014).

On the issue of the reliability of these results, it is important to note that the Griffiths et al. (2014) study did not randomly assign instructors and students to the different sections, potentially leading to selection bias. It also noted that students’ decisions to opt for the hybrid or face-to-face format may have been based on scheduling rather than the format of the course. The same holds true for the other reviewed experiments of Bruff et al. (2013), Caulfield et al. (2013), and Holotescu et al. (2014), except for the experiment at San Jose State University, where students had the choice of either the hybrid or face-to-face course format (Firmin et al., 2014).

Lower Levels of Student Satisfaction in Blended MOOCs in Classrooms

All the reviewed research studies uniformly affirmed the key finding of lower levels of student satisfaction in hybrid learning, along with the related issues of limited student participation in the MOOCs’ global community discussion forums and of MOOCs being used as open educational resources rather than as massive open online courses (Bruff et al., 2013; Caulfield et al., 2013; Firmin et al., 2014; Griffiths et al., 2014; Holotescu et al., 2014).

In all the reviewed studies, students expressed lower satisfaction with the online part of their learning experience due to less time for face-to-face personal interaction with instructors. For example, based on the survey and focus group discussions in the Firmin et al. (2014) study, 80% of matriculated and non-matriculated students expressed the desire for “more help with course content including more face-to-face opportunities with faculty and other students” (p. 193). This is also related to the lower participation in online discussion forums, where students felt isolated because their questions were not answered by the lead instructors of the MOOCs. Students found it easier to interact with classmates and instructors face-to-face in classrooms, where they could clarify their doubts and get their questions answered, than on the MOOCs’ online discussion forums.

However, students extensively utilized interactive materials like video lectures and quizzes provided in MOOCs. For example, in the Caulfield et al. (2013) study, students viewed the MOOC’s 54 videos a total of 3,445 times, with a median of 120 views per student; a third of the students viewed the majority of the videos in their entirety, and three-quarters viewed more than half of the length of the videos. Similarly, in Firmin et al. (2014), the amount of time students spent viewing videos was considered one of the key predictors of students’ success, though the exact amount of time spent viewing the MOOC videos was not reported.

Researchers in most of the reviewed studies also reported that MOOCs were being used as open educational resources rather than as massive open online courses. This is well captured in the statement made by Caulfield et al. (2013): “For all intents and purposes, the blended classroom students were using the MOOC as something more akin to Open Educational Resources or courseware” (as cited in Collier & Caulfield, 2013, para. 7). Deviating from this approach, Caulfield et al. (2013), Bruff et al. (2013), and one of the courses in the Griffiths et al. (2014) study had students register in MOOCs and participate in all activities in their entirety. However, none of them used the MOOCs’ assessments for grading purposes, except the Holotescu et al. (2014) study, which assigned a certain weight towards participation grades for completing a minimum of 10% of a MOOC’s activities. The reason for this may be the belief that MOOCs did not adequately test the particular skills and knowledge required by local programs on college and university campuses. Griffiths et al. (2014) measured students’ outcomes using on-campus instructors’ assessment instruments rather than nationally recognized exams; however, the research teams in the Griffiths et al. (2014) study ensured consistency and objectivity in the assessments.

Opportunities, Challenges, and Recommendations

Opportunities

The research reports reviewed in this paper indicate substantial promise for using MOOCs and interactive online technologies in traditional college settings in two ways. First, MOOCs can be used as learning resources, coupling online and in-class components. Second, this new teaching environment provides two facilitators, the in-class instructor and the MOOC’s online instructor, offering two different points of view on the course content. This exposes students to different ways of teaching content and enriches class discussions and projects. It can also enable instructors to redesign classes without creating online content from scratch, or even to replace textbooks with more engaging content from MOOCs.

Challenges

Fitting existing MOOCs, which are not designed for embedding into traditional classrooms, while ensuring student engagement, satisfaction, and effective learning can be a huge challenge. There are varying prerequisites and emphases in local face-to-face classes and MOOCs. Integrating MOOCs into traditional classroom settings is largely influenced by the two elements of coupling and cohesion, as explained by Bruff et al. (2013): “Coupling refers to the kinds and extent of dependency between online and in-class components of a hybrid course whereas cohesion refers to the relatedness of the course content overall” (p. 195). It is difficult to find MOOCs which sit well with the aspects of coupling and cohesion (Bruff et al., 2013). Moreover, it requires a huge amount of motivation and time commitment from in-class instructors to redesign MOOCs for effective use in a hybrid format, taking a median of 148 hours and a mean of 175 hours, as indicated by Griffiths et al. (2014). Instructors may have to be incentivized by freeing up other parts of their workload.

While using MOOCs in conventional classrooms, one needs to understand the intellectual property rights of MOOC content. Though MOOCs are open and free for students who enroll in the public offering, MOOC providers restrict their use in other environments. For example, Coursera’s terms of use state (as cited in the Griffiths et al., 2014 study):

You may not take any Online Course offered by Coursera or use any Statement of Accomplishment as part of any tuition-based or for-credit certification or program for any college, university, or other academic institution without the express written permission from Coursera. … You may download material from the sites only for your own personal, non-commercial use. You may not otherwise copy, reproduce, retransmit, distribute, publish, commercially exploit or otherwise transfer any material, nor may you modify or create derivative works of the material. (Griffiths et al., 2014, p. 19)

Though Coursera readily consented to the use of its MOOCs for the Griffiths et al. (2014) project, one must realize that there are no standardized policies to enable large-scale usage of MOOCs in hybrid formats. Another challenge to consider is how to assess students’ distributed activities in different MOOCs and integrate them within on-campus assessment and evaluation policies.

Technology integration can also be a herculean task. Most of the reviewed research studies affirmed that technology integration was a major concern, as the MOOCs could not be embedded into local learning management systems (LMS); it was difficult to transfer students’ grades into the local LMS and to monitor the progress of individual students (Bruff et al., 2013; Firmin et al., 2014; Griffiths et al., 2014).

Recommendations

Based on the research studies reviewed in this paper, especially the extensive experiments and rich data provided by Griffiths et al. (2014) and Firmin et al. (2014), the following two recommendations are important for MOOC providers and for institutions adopting MOOCs for in-class courses.

To enable effective use of MOOCs in traditional classrooms, as suggested by Caulfield et al. (2013) and Griffiths et al. (2014), MOOC providers should make their courseware more modular and must consider the intellectual property and licensing implications of making their content available in different contexts. They must also make tools and content easier to implement and repurpose, and provide assurance that online content will remain available for future use.

They must also ensure that discussion forums become meaningful both for the blended classes and for the larger global community in MOOCs by experimenting with three design models suggested by Caulfield et al. (2013): (i) designing MOOCs as connectivist MOOCs, which focus on community participants’ lives and work together with course content rather than strictly on course content (Milligan, Littlejohn, & Margaryan, 2013); (ii) running loosely-coupled cross-institutional courses in which related courses run simultaneously at multiple institutions and are connected by an online community of students and faculty; and (iii) forming networks with communities or organizations that provide students opportunities to engage in real, authentic collaborative work and projects.

Institutions adopting MOOCs should have overarching strategic frameworks for course redesign and implementation in order to have significant impacts on enhancing student outcomes and reducing costs (Griffiths et al., 2014; Caulfield et al., 2013). They must provide leadership, infrastructure, support, and incentives to help faculty engage with MOOCs and other online learning technologies. They must also explore opportunities for blended MOOC research on how factors such as early support, a high degree of structured content and assignments, and the use of learning analytics can guide early interventions to improve student engagement, persistence, and outcomes (Firmin et al., 2014).

Conclusion

All the reviewed research studies in this paper highlight some of the emerging models, such as: synchronizing an entire MOOC with an in-class course, as done by Patti Ordonez-Rozo at the University of Puerto Rico, Rio Piedras, using Stanford’s Introduction to Databases MOOC (Caulfield et al., 2013); using select modules of MOOCs while supplementing them with additional reading materials, as implemented by the Bruff et al. (2013) research team; adopting MOOCs without the assessments provided by the MOOCs, as conducted in the University System of Maryland by the Griffiths et al. (2014) research team; integrating an augmented online learning environment for courses offered at San Jose State University by the Firmin et al. (2014) research team; allowing students to take any archived or ‘live’ MOOC related to the subject taught in the traditional classroom, as done by the Holotescu et al. (2014) research team; or combining multiple MOOCs run at different universities or on MOOC providers’ platforms, as suggested by Bruff et al. (2013) and Caulfield et al. (2013).

The preliminary findings in the reviewed blended MOOCs include: students in blended MOOCs in traditional classrooms performed almost as well as or slightly better than students in face-to-face-only class environments; there was no significant evidence of negative effects for any subgroups in the hybrid model; student satisfaction was lower; and participation in the discussion forums provided by MOOCs was limited. MOOCs, in general, have the potential to offer excellent resource materials in the form of video lectures, quizzes, and assignments, though there are challenges in synchronizing them with in-class traditional courses and integrating them with on-campus LMSs and policies.

Though the reviewed research studies are few due to the lack of experiments in integrating MOOCs in traditional classrooms, in the words of Griffiths et al., they “can add further weight to an emerging consensus that online technology can be used to deliver hybrid courses with reduced class time without compromising student outcomes” (Griffiths et al., 2014, p. 15). They provide models for replicating similar adaptations of MOOCs in different contexts. However, further research of this nature must be conducted at a larger scale to gather more data, form a consensus on the success of embedding MOOCs in undergraduate classrooms, and apply the approach in contexts where students have no access to quality higher education.

References

Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States (Babson Survey Research Group Report). Retrieved from http://files.eric.ed.gov/fulltext/ED541571.pdf

Belanger, V., & Thornton, J. (2013). Bioelectricity: A quantitative approach - Duke University’s first MOOC (Report). Retrieved from http://dukespace.lib.duke.edu/dspace/bitstream/handle/10161/6216/Duke_Bioelectricity_MOOC_Fall2012.pdf

Billington, P. J., & Fronmueller, M. P. (2013). MOOCs and the future of higher education. Journal of Higher Education Theory and Practice, 13 (3), 36-42. Retrieved from http://www.na-businesspress.com/JHETP/BillingtonPJ_Web13_3__4_.pdf

Bouchard, P. (2009). Pedagogy without a teacher: What are the limits? International Journal of Self-Directed Learning, 6 (2), 13-22. Retrieved from http://sdlglobal.com/IJSDL/IJSDL6.2-2009.pdf#page=18

Bruff, D. O., Fisher, D. H., McEwen, K. E., & Smith, B. E. (2013). Wrapping a MOOC: Student perceptions of an experiment in blended learning. Journal of Online Learning and Teaching, 9 (2), 187-199. Retrieved from https://my.vanderbilt.edu/douglasfisher/files/2013/06/JOLTPaperFinal6-9-2013.pdf

Buck, T. E. (2013). The massive effects of MOOCs in higher education. Retrieved from http://www.edtechmagazine.com/higher/article/2013/01/massive-affect-moocs-higher-education

Caulfield, M., Collier, A., & Halawa, S. (2013, October 7). Rethinking online community in MOOCs used for blended learning [Web log post]. Retrieved from http://www.educause.edu/ero/article/rethinking-online-community-moocs-used-blended-learning

Collier, A., & Caulfield, M. (2013, May 2). What happens in distributed flips? Investigating students’ interactions with MOOC videos and forums [Web log post]. Retrieved from http://redpincushion.me/2013/05/02/what-happens-in-distributed-flips/

Dolan, V. L. B. (2013). Massive online obsessive compulsion: What are they saying out there about the latest phenomenon in higher education? The International Review of Research in Open and Distance Learning, 15 (2), 268-281. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1553/2893

Dyer, R. A. D. (2014). Exploring the relevancy of massive open online courses (MOOCs): A Caribbean university approach. Information Resources Management Journal, 27 (2), 61-77. doi: 10.4018/irmj.201404015

Firmin, R., Schiorring, E., Whitmer, J., Willett, T., Collins, E. D., & Sujitparapitaya, S. (2014). Case study: Using MOOCs for conventional college coursework. Distance Education, 35 (2), 178-201. doi: 10.1080/01587919.2014.917707

Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education: Framework, principles, and guidelines. New York: Jossey-Bass.

Griffiths, R., Chingos, M., Mulhern, C., & Spies, R. (2014). Interactive online learning on campus: Testing MOOCs and other platforms in hybrid formats in the University System of Maryland (ITHAKA S+R Report). Retrieved from http://www.sr.ithaka.org/sites/default/files/reports/S-R_Interactive_Online_Learning_Campus_20140716.pdf

Ho, A. D., Reich, J., Nesterko, S., Seaton, D. T., Mullaney, T., Waldo, J., & Chuang, I. (2014). HarvardX and MITx: The first year of open online courses (HarvardX and MITx Working Paper No. 1). Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2381263

Hobson, E. H. (2004). Getting students to read: Fourteen tips (IDEA Paper No.40). Retrieved from http://ideaedu.org/sites/default/files/Idea_Paper_40.pdf

Holotescu, C., Grosseck, G., Cretu, V., & Naaji, A. (2014). Integrating MOOCs in blended courses. Proceedings of the International Scientific Conference of eLearning and Software for Education, Bucharest, 243-250. doi: 10.12753/2066-026X-14-034

Jordan, K. (2014). Initial trends in enrolment and completion of massive open online courses. The International Review of Research in Open and Distance Learning, 15 (1), 133–160. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1651

Koller, D. (2012, November 7). How online courses can form a basis for on-campus teaching. [Web log post]. Retrieved from http://www.forbes.com/sites/coursera/2012/11/07/how-online-courses-can-form-a-basis-for-on-campus-teaching/

Koller, D., Ng, A., Do, C., & Chen, Z. (2013, June 3). Retention and intention in Massive Open Online Courses: In depth [Web log post]. Retrieved from http://www.educause.edu/ero/article/retention-and-intention-massive-open-online-courses-depth-0

Liyanagunawardena, T. R., Adams, A. A., & Williams, S. A. (2013). MOOCs: A systematic study of the published literature 2008-2012. The International Review of Research in Open and Distance Learning, 14 (3), 202–227. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1455

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. Retrieved from https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf

Milligan, C., Littlejohn, A., & Margaryan, A. (2013). Patterns of engagement in connectivist MOOCs. MERLOT Journal of Online Learning and Teaching, 9 (2), 149-159. Retrieved from http://jolt.merlot.org/vol9no2/milligan_0613.pdf

Pappano, L. (2012, November 2). The year of the MOOC. The New York Times. Retrieved from http://www.nytimes.com/2012/11/04/education/edlife/massive-open-online-courses-are-multiplying-at-a-rapid-pace.html?pagewanted=all&_r=0

Xu, D., & Jaggars, S. S. (2013). Adaptability to online learning: Differences across types of students and academic subject areas (CCRC Working Paper No. 54). Retrieved from http://ccrc.tc.columbia.edu/media/k2/attachments/adaptability-to-online-learning.pdf

Yuan, L., & Powell, S. (2013). MOOCs and open education: Implications for higher education (CETIS White Paper). Retrieved from http://www.smarthighered.com/wp-content/uploads/2013/03/MOOCs-and-Open-Education.pdf



© Israel