Fiona M. Hollands1 and Devayani Tirthali2
1Teachers College, Columbia University, USA, 2Brown University, USA
Given the ongoing alarm regarding uncontrollable costs of higher education, it would be reasonable to expect not only concern about the impact of MOOCs on educational outcomes, but also systematic efforts to document the resources expended on their development and delivery. However, there is little publicly available information on MOOC costs that is based on rigorous analysis. In this article, we first address what institutional resources are required for the development and delivery of MOOCs, based on interviews conducted with 83 administrators, faculty members, researchers, and other actors in the MOOCspace. Subsequently, we use the ingredients method to present cost analyses of MOOC production and delivery at four institutions. We find costs ranging from $38,980 to $325,330 per MOOC, and costs per completer of $74-$272, substantially lower than the costs per completer of regular online courses, by virtue of their scalability. Based on this metric, MOOCs appear more cost-effective than online courses, but we recommend judging MOOCs by their impact on learning and caution that they may be cost-effective only for the most self-motivated learners. By demonstrating the methods of cost analysis as applied to MOOCs, we hope that future assessments of the value of MOOCs will combine both cost information and effectiveness data to yield cost-effectiveness ratios that can be compared with the cost-effectiveness of alternative modes of education delivery. Such information will help decision-makers in higher education make rational decisions regarding the most productive use of limited educational resources, to the benefit of both learners and taxpayers.
Keywords: Online learning; higher education
At least since the 1990s, concerns have arisen over the increasing costs and decreasing productivity of higher education, with technology-based reforms being promoted as a solution for institutions of higher education (IHEs) struggling to educate larger numbers of students with a wider range of incoming preparation and learning styles (e.g., Twigg, 1992; Rumble, 1997; Bowen, 2012, 2013; Barber, Donnelly, & Rizvi, 2013). Established IHEs have generally been slow to take advantage of technology to improve productivity in the delivery of education (Miller, 2010), for reasons that are more often “psychological, political, and cultural” than “conceptual, technical, or economic” (Dede, Ed., 2013, p. 52). However, a few pioneering institutions and numerous newcomers have gained traction swiftly by offering online or blended learning opportunities to both typical college-aged students and older, non-traditional learners. As early as the 1970s, the Open University in the United Kingdom was able to offer distance education courses at a large enough scale to render institutional costs per student below the costs of similar courses at traditional campuses (Laidlaw & Layard, 1974).
There is, however, limited evidence regarding the costs of technology-mediated distance instruction and mixed evidence as to whether it lowers the overall costs of education or increases them. Lack (2013) observes that inattention to costs is pervasive in postsecondary education, and highlights one of the few exceptions in the field of postsecondary online learning, the National Center for Academic Transformation (NCAT), which, according to its website, helps institutions use “information technology to re-design learning environments to produce better learning outcomes for students at a reduced cost to the institution.” Miller (2010) reports cost savings of 13%-77% across fifty instances of NCAT-supported course re-designs. Costs per student averaged $196 across the fifty original, traditional versions of the courses, while the versions re-designed with technology components averaged 39% less, at $119 per student. In one example, costs per student for a fine arts course offered by Florida Gulf Coast University dropped from $132 to $70 after it was transformed from an on-campus course into a fully online course. It is not clear, however, what method was used to establish costs or which personnel and other resources were included in the cost calculations. Twigg (2003) acknowledges that the NCAT estimates do not include the costs of course development and of transition from the traditional to the re-designed version, but she also argues that they do not reflect savings that can be achieved by increasing retention, reducing space utilization, or eliminating similar courses.
Cota, Jayaram, and Laboissière (2011) assert that the most productive colleges in the United States (U.S.), as defined by cost per degree (institution’s total annual costs divided by the number of degrees awarded), achieve their efficiencies through five strategies, one of which is keeping costs under control by re-designing instruction, often using technology to deliver some or all content and instruction at a distance. On the other hand, Means, Bakia, and Murphy (2014) assert that online learning incurs greater investment costs than conventional instruction for program design, curriculum development, and development or selection of digital resources. Given the high fixed costs of development of online instruction, and of technology-mediated distance education more generally, many experts argue that scale is essential to reducing costs per student (e.g., Boeke, Ed., 2001; Jones, 2004). Massive open online courses (MOOCs) would appear to offer the ideal opportunity to take advantage of scale given their potentially enormous enrollments.
Online enrollment in the U.S. has grown at a rate between 6.1% and 36.5% in each year since 2002 (Allen & Seaman, 2013, 2014), and over the past two years MOOCs have begun to play a noticeable role in this growth. In 2013, 5% of 2,831 IHEs responding to Allen and Seaman’s (2014) annual survey about online learning were offering a MOOC, 9% were planning to do so, and 53% were undecided as to whether to engage in this innovation. While it is clear that MOOCs have “… nudged almost every university toward developing an Internet strategy” (Lewin, 2013), there is little evidence that MOOCs have, as yet, contributed to lowering the costs of higher education.
Given the continuing alarm regarding uncontrollable costs of higher education (e.g., Bowen, 2013; Kelly & Carey, Eds., 2013), it would be reasonable to expect not only concern about the impact of MOOCs on educational outcomes, but also systematic efforts to document the resources expended on their development and delivery. However, beyond the approximate estimates offered by Boddy et al. (2013), there is little publicly available information on MOOC costs that is based on rigorous analysis. Ithaka S+R (2014) documents hours spent by personnel in developing and delivering hybrid courses at the University System of Maryland, some of which integrated MOOCs or MOOC components, but does not translate these into costs.
Moreover, it appears that lowering costs is not the highest priority for MOOC initiatives: among the 140 or so IHEs offering MOOCs in Allen and Seaman’s (2014) sample, fewer than ten indicated that exploring cost reductions was an objective for their MOOC initiatives. Hollands and Tirthali (2014) found that, of 29 institutions offering MOOCs, improving economics was a goal for only 38%. A recent poll by the Alliance for Higher Education and Democracy (AHEAD) at the University of Pennsylvania found that, among the approximately 44 respondents at institutions offering a MOOC, only 19% strongly agreed that MOOCs may be an effective mechanism for reducing costs of higher education (AHEAD, 2014). Goals that were as or more important than reducing costs to the IHEs in these studies included: increasing access to education, raising institutional visibility or building brand, increasing student recruitment, and improving or innovating pedagogy.
Ruth (2013) explores the question of whether MOOCs can be used to help reduce college tuition and concludes that MOOCs may only contribute to lowering costs of higher education if combined with a reduction in labor costs, as experienced in successful implementations of NCAT’s course re-design model. Hoxby (2014) assesses the economic value of MOOCs and questions the assumption that cost reductions, via economies of scale, will be realized through MOOCs because she expects that the most popular MOOC instructors will eventually need to be paid high salaries. It is perplexing that MOOCs have taken hold without much evidence as to whether they are effective in improving participant skills and knowledge, and without a firmer idea of their economic value, resource requirements, and costs. As Means et al. (2014) observe, “Both irrational exuberance and deep-seated fear concerning online learning are running high” (p. 42). If decision-makers are to make rational decisions about engaging in MOOC production, it is critical to know whether MOOCs are both effective and cost-effective in delivering quality education or related outcomes.
In this article, our objectives are to address what institutional resources are required for the development and delivery of MOOCs, what the associated costs per MOOC are and, where the data are available, what the cost per MOOC completer is. We compare these findings with the costs of other online and distance learning to assess whether MOOCs can deliver education more inexpensively at scale than alternative options. We hope that by demonstrating the methods of cost analysis as applied to MOOCs, future assessments of the value of MOOCs and other distance learning courses will combine both cost information and effectiveness data to yield cost-effectiveness ratios that can be compared with the cost-effectiveness of alternative modes of education delivery. Such information will help decision-makers in higher education make rational decisions regarding the most productive use of limited educational resources, to the benefit of both learners and taxpayers.
To elicit information regarding the resources required to develop and deliver MOOCs, we conducted a qualitative study (see Merriam, 2009) similar to that employed by Bacow, Bowen, Guthrie, Lack, and Long (2012) in their investigation of barriers to online learning in higher education. We interviewed 83 individuals across 62 public and private organizations including IHEs, research organizations, online learning platform providers, other for-profit education companies, and several additional stakeholders in the online learning space. Table 1 indicates the distribution of interviewees across institutional type. Thirty of our interviewees were administrators at IHEs, 22 were faculty members, 16 were executives at other institutions, 13 were researchers, one was an educational technologist, and one was a program officer at a foundation.
Interviewees were identified by reviewing the academic and journalistic literature on MOOCs, the names of presenters and panelists at conferences on MOOCs or online learning in higher education, and the MOOC activities of institutions on the Internet. Many of our interviewees suggested other people for us to interview, either at their own institutions or elsewhere. We contacted by e-mail individuals who appeared to be knowledgeable about MOOCs or online learning based on their role in deciding whether and how their institutions engage with MOOCs, their experience teaching or planning MOOCs, or their relevant research and publications.
We contacted 100 individuals on a rolling basis at 66 different institutions, 39 of which were IHEs. Most interviewees were based in the U.S., two were in China, two in the United Kingdom, and several were in Canada. Interviews were conducted between June 2013 and February 2014 and follow-ups by e-mail with interviewees to obtain updates and to verify information continued until May 2014. Almost half of the interviews were conducted face-to-face with the remainder conducted by telephone or Skype. Interviews averaged 75 minutes in length and followed a semi-structured interview protocol (see Merriam, 2009). Most interviewees agreed to be recorded, and the digital audio-files were subsequently transcribed. All interview notes and transcriptions were coded (LeCompte & Schensul, 1999) in NVivo software using themes initially derived from the interview protocol and iteratively refined as more granular topics were identified.
Cost analyses were conducted using the ingredients method (Levin & McEwan, 2001) to estimate the costs of MOOC production and delivery at four of the institutions where we were able to obtain adequate data on resource use. We estimated costs for one connectivist MOOC (cMOOC) and seven xMOOCs. We focused on estimating personnel costs and assumed these would represent 75% of total costs, based on Levin and McEwan’s assertion that personnel costs typically account for 70-80% of the total costs of educational interventions (see p. 53). We did not estimate costs individually for facilities, other equipment, and overhead but assumed they amount to the remaining 25% of total costs. To estimate personnel costs, we asked our interviewees detailed questions regarding the role, qualifications, and hours spent by each person involved in MOOC development and production. In two cases, detailed records of time spent were collected by the institutions as part of their regular project management process. In one case, the MOOC instructor logged time spent on the MOOC on a daily basis and we obtained other personnel hours by interviewing the relevant individuals shortly after the conclusion of the MOOC. In the case of the cMOOC, we obtained retrospective estimates of hours spent from the two instructors involved. We expect the greatest accuracy when time spent is logged on a regular basis.
In order to assign costs to personnel time, we used national average U.S. salaries for individuals in each relevant job category, as opposed to using actual salary levels of personnel at each specific institution, except in one case where some of the personnel costs were given to us directly. This approach not only respects the privacy of the individuals involved, but, more practically, allows for a comparison of costs across a number of institutions without introducing local pricing influences. National average prices and benefits rates were obtained from the CBCSE Database of Educational Resource Prices, which relies on multiple national surveys such as the U.S. Department of Labor’s National Compensation Survey. Cost calculations were executed using the CBCSE Cost Tool Kit, an Excel-based application designed for the purpose of estimating costs of educational programs.
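To illustrate the arithmetic behind this approach, the sketch below combines personnel hours and salary rates into personnel and total cost estimates under the 75% personnel-share assumption described above. The salary figures, benefits rates, and annual-hours conversion are hypothetical placeholders, not values drawn from the CBCSE database.

```python
# Minimal sketch of the ingredients-method cost arithmetic described above.
# All salary, benefits, and annual-hours figures are hypothetical placeholders.

ANNUAL_WORK_HOURS = 1600  # assumed hours per year for converting salaries to hourly rates
PERSONNEL_SHARE = 0.75    # personnel assumed to represent 75% of total costs (Levin & McEwan, 2001)

def hourly_cost(annual_salary, benefits_rate):
    """Salary plus benefits, expressed per hour worked."""
    return annual_salary * (1 + benefits_rate) / ANNUAL_WORK_HOURS

def mooc_costs(ingredients):
    """Each ingredient is a (hours, annual_salary, benefits_rate) tuple for one person."""
    personnel = sum(hours * hourly_cost(salary, benefits)
                    for hours, salary, benefits in ingredients)
    total = personnel / PERSONNEL_SHARE  # facilities, equipment, and overhead fill the remaining 25%
    return personnel, total

# Hypothetical example: one faculty member (400 hours) and one TA (240 hours).
personnel, total = mooc_costs([(400, 90000, 0.30), (240, 30000, 0.25)])
print(f"Personnel: ${personnel:,.0f}; Total: ${total:,.0f}")
```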
We first review the resources required to produce and deliver MOOCs based on information provided by our interviewees. Subsequently, we present our estimates of the costs of MOOCs from the perspective of the producer (i.e., the college, university, or museum, as opposed to the platform provider or participant). We note that for MOOCs that are delivered via third-party platforms, there are often significant, additional costs to the platform provider which may be passed on to the MOOC producers through a direct charge for the platform services or a revenue-sharing agreement (see Young, 2012; Kolowich, 2013).
The major cost drivers we identified in MOOC production and delivery were: the number of faculty members, administrators, and instructional support personnel participating in the process; the quality of videography; the nature of the delivery platform; programming for special features such as computer code auto-graders, virtual labs, simulations, or gamification; analysis of platform data; and technical support for participants. MOOC production teams that were described to us seldom included fewer than five professionals and, in at least one instance, over 30 people were involved.
All interviewees who had been involved in the development of a MOOC reported that the effort was two to three times greater than that required to create a traditional course. These reports comport with written accounts such as Cima’s (2013). Instructors typically spent several hundred hours over several months preparing and re-purposing course materials, and practicing lecture delivery prior to video-taping; several days on actual shoots; and one to two days reviewing the finished video. According to several faculty members, creating one hour of MOOC video-lecture required three to ten hours of preparation, the lower end of the range applying where materials were re-purposed from existing lectures. To create ten minutes of voice-over-PowerPoint video required six to eight hours, according to an interviewee at a private university.
Development of MOOCs was deemed to be more time-consuming than that of traditional online courses due to MOOC-specific components such as high quality video, quizzes to substitute for instructor-graded assignments, and peer-to-peer learning technologies. Several interviewees noted that the level of “polish” required for content and delivery was far greater than for traditional on-campus or online courses because of the more public nature of the MOOC. A number of interviewees likened the effort involved in creating a MOOC to writing a textbook as a team. At some institutions, faculty members were granted a course release and/or paid stipends ranging from $3,000-$15,000 for developing and delivering a MOOC, but the opportunity costs of the instructor’s time are likely to be higher in many instances. We frequently heard estimates on the order of 400 hours of faculty member time per MOOC developed, the equivalent of 26% of an academic year.
In addition to the direct costs of producing and delivering MOOCs, many of our interviewees provided insights into a plethora of additional considerations for institutions engaging with MOOCs. For example, MOOCs can only attract massive audiences if they are sufficiently marketed. While platform providers such as Coursera, edX, and Academic Partnerships fulfill these marketing and communications functions for their partner institutions, institutions using more “do-it-yourself” platforms must find suitable advertising channels. Computing and Internet services for on-campus students participating in MOOCs may need to be expanded or upgraded, for example, by adding help desk support and retrofitting buildings with enough bandwidth for many students to stream or download video simultaneously. Institutional websites and learning management systems need to provide an access point to relevant MOOCs. Cheal (2012) documents many of these issues as encountered by San José State University’s MOOC initiatives.
A variety of administrative offices are likely to be involved in activities such as obtaining copyright permissions and establishing contracts between the institution and online platform provider, and between the institution and its faculty members to address intellectual property rights, revenue sharing, faculty compensation and workload issues. Compliance with disability regulations in MOOCs must be regularly audited and enforced, and accommodations made, for example, extra time on quizzes and exams for students with learning disabilities. For institutions providing credit for MOOCs, the student admissions, registration, billing, authentication, and crediting systems need to be aligned with platform enrollment procedures. If prerequisites are required for credit-earning participation in a course, a system must be developed to handle large numbers of students.
Based on the cost analyses we conducted of MOOC production and delivery, we estimated personnel costs ranging between $29,000 and $244,000 per MOOC, depending on the number of people involved in the process, the amount of time dedicated, and the quality of video production. The costs of the platform, captioning, content hosting, and analysis of user data to populate the data dashboard were assumed by Coursera for all xMOOCs we analyzed. We estimate total costs per MOOC, including facilities, equipment, and overhead, of $38,980 to $325,330 (see Table 2). In two cases where course completion data were available, we present a cost per completer. Details of each institution’s MOOC(s) and our related cost analysis are presented below.
Connectivism and Connected Knowledge (CCK08), the first course to be dubbed a “MOOC,” was developed and delivered in 2008 by George Siemens and Stephen Downes. The 12-week course was offered at the University of Manitoba to 25 enrolled students for a fee and for credit, and also as a free, non-credit-bearing course to 2,300 other participants (Downes, 2008). The course has been re-run three times since.
Siemens estimated the time burden for CCK08 development and delivery as follows: 100-150 hours on course design and development over a two-month period; 70 hours per week on course delivery for the first two to three weeks (interacting with students and posting on discussion forums or writing blog posts to summarize discussion and activities), tapering down to 30 hours per week by the twelfth week. At the lower end of Siemens’ estimates, the total number of hours amounts to 715; at the high end, to 770. We estimate costs at each end of the range.
Downes estimated his total time commitment for CCK08 at 88-108 hours: 20-40 hours in programming time to make adjustments to the gRSShopper course aggregation software that he had developed over many years; 20 hours setting up the course website; and four hours per week during course delivery to maintain the site and prepare audio archives. No technology support personnel, learning designers, or teaching assistants (TAs) were utilized in the development and delivery of CCK08.
Using U.S. national average salary and benefits rates for public postsecondary faculty members and public sector research scientists, the costs of personnel time to replicate CCK08 range from $49,400 to $53,800, and we estimate total costs of between $65,800 and $71,790, as shown in Table 3.
Re-runs of CCK08 required less design and development time. Additionally, with better course management software, delivery time for the 2012 offering fell to 30-40 hours per week for the first two to three weeks. Some repeat students self-selected as TAs and reduced the instructors’ time burden by helping manage the forums, responding to inquiries, and providing guidance to new students. Set-up time for the course website dropped from 20 hours to four hours. For Siemens, we estimate the total time commitment for a CCK08 re-run at 284 hours: 20 hours to “refresh” the course design and resources before a new launch; 28 hours per week in delivery for the first three weeks; and 20 hours per week in delivery for the remaining nine weeks. For Downes, we estimate the total time commitment for a CCK08 re-run at 72 hours: four hours for website set-up; 20 hours to adjust gRSShopper to accommodate new tools; and four hours per week to maintain the course site. The time committed by the self-selected TAs could vary widely; we use an estimate of 350 hours in total, under the assumption that the TAs collectively replace the reduced hours in Siemens’ delivery time. Total estimated costs for the re-run are $40,740, 38% lower than the low estimate for the first run.
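As a check on the figures above, the re-run hour totals and the relative cost saving can be reproduced directly from the stated weekly breakdown; the computation below simply restates that arithmetic.

```python
# Reproducing the CCK08 re-run hour totals and relative saving from the figures above.
siemens_rerun = 20 + 28 * 3 + 20 * 9  # refresh + first three weeks + remaining nine weeks = 284 hours
downes_rerun = 4 + 20 + 4 * 12        # site set-up + gRSShopper adjustments + weekly maintenance = 72 hours
ta_hours = 350                        # assumed collective hours for self-selected TAs

saving = 1 - 40740 / 65800            # re-run total versus the low estimate for the first run
print(siemens_rerun, downes_rerun, f"{saving:.0%}")  # 284 72 38%
```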
Before its recent entry to the MOOCspace, this university, which requested partial anonymity, had already established an infrastructure for the development of online courses. In 2013, a small number of faculty members were invited to develop and deliver five- to eight-week MOOCs, primarily to showcase the university and engage new audiences. Each faculty member was assigned a design and support team of five to six people to help in the design and production of the MOOC, including a project manager, instructional designers, instructional technologists, and a liaison to the video production team. Additional personnel supervised the design and support teams, and provided programming capacity, overall project management, evaluation, and administrative services.
As a routine part of the project management function at this university, detailed time logs are kept by each design team member so that costs for these personnel can be tracked accurately. We used the cost estimates provided by the university for these personnel in our analysis because we did not obtain enough detail regarding these personnel ingredients (e.g., specific role, level of experience, highest degree of education) to allow us to assign prices ourselves. Faculty member and TA time were not logged but we obtained estimates either during or after MOOC production and assigned relevant costs ourselves, using national average salary and benefits rates for postsecondary public institutions. For the first three MOOCs created and delivered, the hours spent per MOOC by various personnel were as follows: 200-500 hours for the MOOC design team, 700-900 hours for the video production team, 150-155 hours for technical support, 90-220 hours for the faculty member, and 650 hours for a TA in one MOOC. Total personnel hours were 1,140 for the least time-intensive MOOC and 2,245 for the most demanding MOOC. The resulting cost estimates are shown in Table 4.
The faculty time burden was relatively low because the dedicated design and support team took on much of the task of course design and development. Design team time varied depending on the complexity of the learning activities. We estimate the total costs per MOOC at $203,770-$325,330. Salary levels at this geographical location may be lower than national averages, so costs for the non-teaching personnel could be higher on a national average basis, on the order of a few thousand dollars.
Between September and December 2013, the American Museum of Natural History (AMNH) delivered three four-week MOOCs targeted at science educators. Planning efforts began in Spring 2013 and involved a team of museum professionals who had significant previous experience in developing and delivering online education. The core MOOC production team comprised a project director, a project manager, an in-house video producer, an educational technologist, and a senior administrator who also served as one of the MOOC instructors.
While the museum had previously developed many digital resources, including science-content videos and educational essays on science topics, MOOCs presented a new challenge: developing lecture-based videos with “talking heads” or voice-over PowerPoint presentations, multiple-choice quizzes, peer-graded assessments, and pre- and post-course surveys. The personnel effort associated with the production and delivery of the three MOOCs is summarized in Table 5, based on time use as logged by the AMNH project manager. The project manager and project director spent the equivalent of 25 and 11 entire workweeks respectively on the project, while the instructors spent, on average, about six workweeks each shooting videos and developing, adapting, or reviewing course content. The core team met once or twice per week for one to two hours to plan, design, execute, and review the MOOC production and delivery. A TA managed the discussion forums, processed survey responses, and reviewed the platform data.
Using national average salaries and benefits rates, matched wherever possible to similar positions at postsecondary institutions to allow comparability with the other MOOC costs we present, we estimate the personnel costs to develop the three MOOCs created by AMNH at $78,470 per MOOC and total costs at $104,620 per MOOC. Of the 39,685 participants who initially enrolled in the three MOOCs, 1,155 completed and passed all course requirements. Costs per completer for the MOOCs amount to $272.
Big Data in Education was an eight-week MOOC delivered on the Coursera platform in late 2013. Ryan Baker, a faculty member at Teachers College, Columbia University, developed the course by adapting a 16-week on-campus version usually taught to classes ranging in size from eight to fifteen students. Planning and preparation for the course began in mid-March 2013. Big Data in Education was free, open to any participant, and non-credit-bearing. There were 48,058 registrants and 526 of them completed the last assignment. Baker kept track of time and tasks related to the MOOC in an Excel spreadsheet from June (when our study began) to the end of December 2013. Hours spent on activities prior to June were estimated. Total time logged plus time estimated was 176 hours, with the heaviest burden falling during the first three months of planning and preparation of materials, the month prior to launch, and the first few weeks of course delivery. Time spent on various tasks included: creating course materials such as slides, assignments, and quizzes (58 hours); set-up and video-recording using ScreenFlow software (46 hours yielding 6 1/2 hours of finished video used in the MOOC); planning, bureaucracy, and coordination with Coursera, the TA, and the course production team (37 hours); participating in the forums and responding to participant e-mails (26 hours); “debugging” slides, assignments, and quiz questions during the course (7 hours); and open office hours (3 hours).
In addition to Baker, several other personnel worked on the MOOC. A TA spent approximately 15 hours per week over 16 weeks for a total of 240 hours. Tasks included coordinating among the faculty member, the video team, and Coursera’s course coordinator; checking that uploaded videos were working; posting assignments and “inline” quiz questions (which are embedded in the videos); and participating in the discussion forum. Seven individuals from the Educational Data Mining Laboratory at Teachers College read and participated in the discussion forums; we estimate two hours per person per week over the eight-week period, for a total of 112 hours. A senior administrator coordinated the production activities for one hour per week over eight weeks. Two in-house video specialists spent 32 hours editing the video, linking files, requesting captioning, and uploading video. A senior educational technologist served as the day-to-day project manager for MOOC production and delivery for a total of 75 hours, which included monitoring the online discussion forum for technical questions.
We estimated personnel costs of $29,240 (see Table 6) to replicate the development and delivery of Big Data in Education, using national average salaries and benefits rates for postsecondary personnel at private universities, and total costs of $38,980. With 526 students completing Big Data in Education, estimated costs per completer are $74.
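The figures for this course follow the same chain used throughout: personnel costs are grossed up by the assumed 25% non-personnel share and the result is divided by the number of completers. A brief check of that arithmetic:

```python
# Cost chain for Big Data in Education, using the figures reported above.
personnel = 29240
total = personnel / 0.75           # ≈ $38,987, reported as $38,980 after rounding
cost_per_completer = total / 526   # 526 participants completed the final assignment
print(round(total), round(cost_per_completer))  # 38987 74
```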
Overall, we found that the costs of developing and delivering MOOCs at the four institutions varied widely, ranging from $38,980 to $325,330 per MOOC. Based on our limited sample of eight MOOCs, the key variables in determining costs do not appear to include course length or whether the course is designed as a cMOOC or as an xMOOC. Costs depend heavily on the number of people involved in the MOOC production process and on the extent to which it is executed “in-house” as opposed to by external professionals. Additionally, platform programming costs can be high where extensive auto-grading or peer-grading functionality is needed to accommodate the huge enrollments, or where simulated lab experiences are provided. Course design and delivery have shifted from a solo endeavor to a team effort, often including administrators in offices of digital technology, instructional designers, instructional technologists, videographers, and project managers. While involvement of multiple professionals is typical of what Bates (2005) describes as the “project management” model for web-based course development, the higher visibility of MOOCs, and the objective of building or enhancing brand, appears to have led institutions to dedicate more resources to the planning and production of MOOCs than to regular online courses, often including senior-level administrators and external video producers who provide very high production values. Faculty members are generally undercompensated for the opportunity costs of their time to develop MOOC content.
We did not find pre-existing estimates of MOOC production and delivery costs derived from records of personnel effort with which to compare our findings. The E-Learning Working Group at the University of Ottawa estimated the costs of developing a Coursera MOOC at C$110,000 and the costs of delivery at C$29,000, for a total of C$139,000 (Boddy et al., 2013). The U.S. dollar equivalent of $127,500 falls within the range of our own estimates. To provide another point of comparison for our results, we replicated the projected costs for Georgia Institute of Technology’s Online M.S. in Computer Science program (see GTRC/Udacity, 2013), added a conservative estimate of costs for the head TAs/course developers, which appear to have been omitted, and calculated an average cost per course of $226,000-$284,000, including both new courses and re-runs. While at the high end of our range of cost estimates, these courses provide significantly more student support and ongoing instructor involvement.
Limited publicly available information exists on the institutional costs of contemporary postsecondary online courses against which we can compare the costs of MOOCs. Bates provides a useful benchmark, estimating costs of $35,000-$50,000 to develop a regular three-credit online course delivered on a learning management system. He notes that, within the context of a program, these development costs constitute less than 20% of the total once costs of delivery, including student support and assessment, are included (A. Bates, personal communications, April 29, 2014, May 15, 2014; Bates & Sangra, 2011). Conversely, we estimated that for Big Data in Education the delivery costs constituted only 20%-30% of the total cost, with production costs accounting for the majority. Using Bates’ guideline that development represents less than 20% of the total, total costs per regular online course for both development and delivery would amount to at least $175,000-$250,000, at the higher end of the range we found for total MOOC costs.
Ithaka S+R (2014) attempted to estimate costs of hybrid courses developed and delivered at the University System of Maryland. The report indicates that 12 faculty members spent between 40 and 506 hours to plan their hybrid courses, some of which incorporated MOOCs or MOOC components, plus another four hours per week on delivery. If we assume 16-week courses and national average salary and benefits rates for average faculty at public universities, the faculty costs amount to between $6,500 and $36,000. These numbers fall within the range of our estimates of faculty costs for MOOC production and delivery.
One metric for assessing the cost-effectiveness of MOOCs relative to regular online courses is the institutional cost per student completing the course. In our study we were able to estimate this metric in the cases where completion data were available. Cost per completer for Big Data in Education was $74 and the average cost per completer across the three AMNH MOOCs was $272. By comparison, if we use Bates’ cost estimates for regular online courses and spread the total course costs over a typical online class size of 30 students, assuming a completion rate of 82% for online courses (based on Xu & Jaggars, 2011), the cost per completer would be much higher, at $7,000-$10,000. In practice, cost per completer would be lower if the course is offered multiple times, but this is true both for regular courses and for MOOCs. At a total cost of $175,000, the number of students completing a regular online course would need to reach over 2,300 for it to be as cost-effective for completion as Big Data in Education.
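The comparison above follows from Bates’ benchmark together with the assumed class size and completion rate; the break-even enrollment against Big Data in Education is derived from the same figures.

```python
# Cost per completer for a regular online course using Bates' benchmark figures,
# and the break-even number of completers against Big Data in Education ($74 per completer).
completers_per_offering = 30 * 0.82       # class of 30 with an 82% completion rate (Xu & Jaggars, 2011)
low = 175000 / completers_per_offering    # ≈ $7,100 per completer
high = 250000 / completers_per_offering   # ≈ $10,200 per completer
break_even = 175000 / 74                  # ≈ 2,365 completers needed to match the MOOC
print(round(low), round(high), round(break_even))  # 7114 10163 2365
```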
It therefore appears that while MOOC production is often more costly than the development of regular online courses, the ability to scale MOOCs and the absence of associated student supports result in a dramatically lower cost per completer. Considering that MOOCs can help achieve other objectives not generally addressed by regular online courses, including branding, global reach, and large-scale research, MOOCs would appear to be a wise use of resources, provided that the costs can be recovered through tuition or other fees.
However, it is arguable that course completion per se is not a satisfactory measure of effectiveness and that MOOCs should be judged on the quality and quantity of learning that takes place. To date, almost no peer-reviewed studies have been published comparing the pedagogical effectiveness of MOOCs with that of alternative delivery modes. One exception is Colvin et al. (2014), who rigorously document absolute and relative learning in a physics MOOC using pre- and post-testing and item response theory, and compare the results with on-campus instruction. Colvin et al. find that participants in the MOOC showed learning gains slightly higher than for students in a traditional on-campus course, but lower than for students in courses that rely on interactive engagement pedagogy. As no cost estimates are available in this study, it is not possible to assess the cost-effectiveness of the MOOC, except to note that, given apparently similar learning gains, even if the MOOC is more expensive to produce than the on-campus course, its ability to serve many more students will likely render it more cost-effective. One important caveat is that, with few instructor-student interactions and student supports, MOOCs are likely completed only by self-sufficient, motivated students. It is possible that MOOCs are cost-effective for this subset of learners, but not for less motivated learners.
We found that the costs of re-running Connectivism and Connected Knowledge were around 38% lower than the costs of the initial offering. Given the intense level of instructor involvement in cMOOCs, this is unlikely to be a useful predictor for xMOOC re-runs where instructor involvement may be minimal or absent. One interviewee at a community college expected that the re-run costs for the college’s xMOOC would be small, perhaps less than $1,000, compared with her estimate of $75,000 for the initial offering. Such assumptions should be rigorously tested through careful cost analyses and we recommend that, going forward, MOOC producers attempt to document these re-run costs to help assess the sustainability of MOOC production.
Given the highly labor-intensive nature of the process, we do not expect the costs of new MOOC production to fall significantly over time. While it appears that revenue streams for MOOCs are slowly building, we expect that unless MOOC producers can offer credentials of economic value that attract fee-paying participants, or can use MOOCs to replace traditional offerings more efficiently, most likely by reducing expensive personnel, they will not be able to afford ongoing participation in the current MOOC experimentation. Free, non-credit-bearing MOOCs are likely to remain available only from the wealthiest institutions, which can subsidize the costs from other sources of funds.
Several questions remain to be explored with respect to MOOC costs and cost-effectiveness and whether they can eventually contribute to reducing the costs of higher education. Cost analyses of MOOC re-runs would help ascertain whether costs of re-offering a MOOC diminish substantially as compared with the initial offering. We recommend that future analyses of MOOC costs aim to estimate actual costs of materials, equipment, facilities, and overhead as opposed to simply assuming, as we did, that these items account for 25% of total costs. Jones (2004), Bates (2005), and Rumble (1997), while acknowledging the difficulty of estimating overhead costs for technology-mediated distance instruction, offer valuable guidelines for this endeavor. The feasibility of sharing courses across multiple campuses must be explored, as should the question of whether, over the longer term, variable costs of MOOCs can be contained by automating functions and substituting instructional support provided by expensive faculty members with less costly TAs, part-time instructors, or peer-to-peer learning and assessment.
Studies of MOOC effectiveness with respect to educational outcomes should be combined with cost analyses to help determine whether spending more on MOOC production and delivery leads to better learning outcomes. For example, does higher quality video production lead to higher rates of course completion or greater acquisition and retention of knowledge? Does replacing tenured faculty members with non-tenured instructors or TAs affect student performance and learning in MOOCs? While it is difficult to set up true experiments in higher education (Bowen, Chingos, Lack, & Nygren, 2012), it may be possible to address some of these questions by conducting side-by-side comparisons similar to those Ithaka S+R (2014) executed at the University System of Maryland.
To answer the question of whether MOOCs are a cost-effective means to deliver education, we must be able to compare the costs of MOOCs to the costs of alternative delivery mechanisms, as well as the effectiveness of each alternative with respect to a common outcome of interest, such as increasing participants’ level of knowledge or skill in a specific subject area. Generating cost-effectiveness ratios for a number of educational alternatives including MOOCs would allow decision-makers to choose which programs represent the best investments of resources. Longitudinal studies tracking post-MOOC outcomes such as sequences of courses taken, professional certifications obtained, or job opportunities received would help assess the longer term economic value of participating in these courses and allow for cost-benefit analyses to estimate the overall returns to society of investing in MOOC creation.
AHEAD. (2014, April). What’s AHEAD key trends in education poll #1: Massive open online courses (MOOCs). Alliance for Higher Education and Democracy, University of Pennsylvania. Retrieved from http://www.gse.upenn.edu/pdf/ahead/whats_ahead/01_moocs.pdf
Allen, E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group Report. Retrieved from http://sloanconsortium.org/publications/survey/changing_course_2012
Allen, E., & Seaman, J. (2014). Grade change: Tracking online education in the United States. Babson Survey Research Group Report. Retrieved from http://sloanconsortium.org/publications/survey/grade-change-2013
Bacow, L. S., Bowen, W. G., Guthrie, K. M., Lack, K. A., & Long, M. P. (2012). Barriers to adoption of online learning systems in U.S. higher education. Ithaka S+R. Retrieved from http://www.sr.ithaka.org/research-publications/barriers-adoption-online-learning-systems-us-higher-education
Barber, M., Donnelly, K., & Rizvi, S. (2013). An avalanche is coming. Institute for Public Policy Research, London, UK. Retrieved from http://www.ippr.org/publication/55/10432/an-avalanche-is-coming-higher-education-and-the-revolution-ahead
Bates, A. W. (2005). Technology, e-learning and distance education (2nd ed.). London and New York: Routledge.
Bates, A., & Sangra, A. (2011). Managing technology in higher education: Strategies for transforming teaching and learning. San Francisco, CA: Jossey-Bass.
Boddy, C., Detellier, C., Duarte, S., Dulpaa, E., Erdmer, A., Levasseur, D., McKay, M., & Ufholz, L. (2013). Report of the e-learning working group. University of Ottawa, Canada. Retrieved from http://www.uottawa.ca/vr-etudes-academic/en/documents/e-learning-working-group-report.pdf
Boeke, M. F. (Ed.). (2001). Technology costing methodology casebook 2001. Western Cooperative for Educational Telecommunications. Retrieved from http://www.cs.trinity.edu/rjensen/EdTech/Miscellaneous/TCM_Casebook_Final.pdf
Bowen, W. G. (2012). The ‘Cost Disease’ in higher education: Is technology the answer? The Tanner Lectures at Stanford University. Retrieved from http://www.ithaka.org/sites/default/files/files/ITHAKA-TheCostDiseaseinHigherEducation.pdf
Bowen, W. G. (2013). Higher education in the digital age. Princeton, NJ: Princeton University Press.
Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2012). Interactive learning online at public universities: Evidence from randomized trials. Ithaka S+R. Retrieved from http://webcache.googleusercontent.com/search?q=cache:http://www.sr.ithaka.org/sites/default/files/reports/sr-ithaka-interactive-learning-online-at-public-universities.pdf
Cheal, C. (2012, August 14). Creating MOOCs for college credit (Research Bulletin). Louisville, CO: EDUCAUSE Center for Applied Research. Retrieved from http://www.educause.edu/ecar
Cima, M. J. (2013). My experience teaching 3.091x. MIT Faculty Newsletter, 26(1), 15-17. Retrieved from http://web.mit.edu/fnl/volume/261/cima.html
Colvin, K. F., Champaign, J., Liu, A., Fredericks, C., Zhou, Q., & Pritchard, D. E. (2014). Learning in an introductory physics MOOC: All cohorts learn equally, including an on-campus class. The International Review of Research in Open and Distance Learning, 15(4).
Cota, A., Jayaram, K., & Laboissière, M. C. A. (2011). Boosting productivity in U.S. higher education. McKinsey & Company. Retrieved from http://www.mckinsey.com/insights/social_sector/boosting_productivity_in_us_higher_education
Dede, C. (Ed.). (2013). Connecting the dots: new technology-based models for postsecondary learning. EDUCAUSE Review, September/October 2013.
Downes, S. (2008). Places to go: Connectivism & connective knowledge. Innovate, 5(1). Retrieved from http://www.innovateonline.info/index.php?view=article&id=668
GTRC/Udacity (2013). Amendment to the online courses hosting agreement between Georgia Tech Research Corporation and Udacity, Inc. Retrieved from http://s3.documentcloud.org/documents/703593/udacity-gtrc-amendment-5-13-2013.pdf
Hollands, F. M., & Tirthali, D. (2014). MOOCs: Expectations and reality. Full report. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University, NY. Retrieved from http://cbcse.org/wordpress/wp-content/uploads/2014/05/MOOCs_Expectations_and_Reality.pdf
Hoxby, C. M. (2014). The economics of online postsecondary education: MOOCs, nonselective education, and highly selective education. NBER Working Paper 19816. Retrieved from http://www.nber.org/papers/w19816
Ithaka S+R. (2013). Interim report: A collaborative effort to test MOOCs and other online learning platforms on campuses of the University System of Maryland. Retrieved from http://www.sr.ithaka.org/sites/default/files/reports/S-R_Moocs_InterimReport_20131024.pdf
Ithaka S+R. (2014). Interactive online learning on campus: Testing MOOCs and other platforms in hybrid formats in the University System of Maryland. Retrieved from http://www.sr.ithaka.org/research-publications/interactive-online-learning-on-campus
Jones, D. (2004). Technology costing methodology handbook – version 2.0. Western Cooperative for Educational Telecommunications. Retrieved from http://wcet.wiche.edu/wcet/docs/tcm/TCM_Handbook.pdf
Kelly, A. P., & Carey, K. (Eds.). (2013). Stretching the higher education dollar: How innovation can improve access, equity, and affordability. Cambridge, MA: Harvard Education Press.
Kolowich, S. (2013, February 21). How edX plans to earn, and share, revenue from its free online courses. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/How-EdX-Plans-to-Earn-and/137433/
Lack, K. (2013). Current status of research on online learning in postsecondary education. Ithaka S+R. Retrieved from http://www.sr.ithaka.org/sites/default/files/reports/ithaka-sr-online-learning-postsecondary-education-may2012.pdf
Laidlaw, B., & Layard, R. (1974). Traditional versus open university teaching methods: A cost comparison. Higher Education, 3(4), 439-468.
LeCompte, M. D., & Schensul, J. J. (1999). Analyzing and interpreting qualitative data. Walnut Creek, CA: Altamira Press.
Levin, H. M., & McEwan, P. J. (2001). Cost-effectiveness analysis: Methods and applications (2nd ed.). Thousand Oaks, CA: Sage Publications.
Lewin, T. (2013, December 10). After setbacks, online courses are rethought. New York Times. Retrieved from http://www.nytimes.com/2013/12/11/us/after-setbacks-online-courses-are-rethought.html?_r=0
Means, B., Bakia, M., & Murphy, R. (2014). Learning online: What research tells us about whether, when and how. New York, NY: Routledge.
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San Francisco, CA: Jossey-Bass.
Miller, B. (2010). The course of innovation: Using technology to transform higher education. Education Sector Report. Retrieved from http://www.educationsector.org/usr_doc/NCAT-Report_RELEASE.pdf
Rumble, G. (1997). The costs and economies of open and distance learning. Open and distance learning series. London: Routledge.
Ruth, S. (2013). Can MOOCs help reduce college tuition? George Mason University School of Public Policy. Research Paper No. 2014-06. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2367425
Twigg, C. (1992). Improving productivity in higher education - the need for a paradigm shift. CAUSE/EFFECT 15(2). Retrieved from http://net.educause.edu/ir/library/text/cem9227.txt
Twigg, C. (2003, October). The KISS approach to costing. The Learning MarketSpace. Retrieved from http://www.thencat.org/Newsletters/Oct03.htm.
Xu, D., & Jaggars, S. S. (2011). Online and hybrid course enrollment and performance in Washington State community and technical colleges. CCRC Working Paper No. 31. Retrieved from http://ccrc.tc.columbia.edu/media/k2/attachments/online-hybrid-performance-washington.pdf
Young, J. (2012, July 19). Inside the Coursera contract: How an upstart company might profit from free courses. Chronicle of Higher Education. Retrieved from https://chronicle.com/article/How-an-Upstart-Company-Might/133065/