November – 2009

The Technological Dimension of a Massive Open Online Course:  The Case of the CCK08 Course Tools

Antonio Fini
University of Florence, Italy


In 2008, a new term emerged in the already crowded e-learning landscape: MOOC, or massive open online course. Lifelong learners can now use various tools to build and manage their own learning networks, and MOOCs may provide opportunities to test such networks. This paper focuses on the technological aspects of one MOOC, the Connectivism and Connective Knowledge (CCK08) course, in order to investigate lifelong learners’ attitudes towards learning network technologies. The research framework is represented by three perspectives: (a) lifelong learning in relation to open education, with a focus on the effective use of learning tools; (b) the more recent personal knowledge management (PKM) skills approach; and (c) the usability of web-based learning tools.

Findings from a survey of CCK08 participants show that the course attracted adult, informal learners, who were not concerned about course completion. Time constraints, language barriers, and ICT skills affected the participants’ choice of tools; for example, learners favoured the passive, time-saving mailing list over interactive, time-consuming discussion forums and blogs. Some recommendations for future MOOCs include highlighting the purpose of the tools (e.g., skill-building) and stating clearly that learners can choose their preferred tools. Further research on sustainability and facilitator workload should be conducted to determine the cost and effectiveness of MOOCs. Investigation is also necessary to understand MOOC participant profiles as they relate to course outcomes and retention and whether terms such as course and attrition are appropriate in this context.

Connectivism and Connective Knowledge Course

Multi-tool learning environments are gaining momentum. From the criticism of VLEs and other institution-managed platforms (Wilson, 2005) through the debate about personal learning environments (PLEs) (Attwell, 2007) to the concept of “loosely coupled teaching” (Leslie, 2007), there has been a shift from centralized, specialized, institutionally owned systems (VLE, LMS) towards distributed, general-purpose, user-centered, and user-owned systems, such as social software tools. In the context of informal education, the integration of multiple and heterogeneous environments and tools may represent the starting point of a learner’s knowledge construction quest. Many people are using blogs, wikis, social networks, messaging systems, etc. The underlying idea is that people are comfortable with tools they consider to be their own, and they may wish to continue to use them when engaged in learning activities. Therefore it becomes important to understand the extent to which multi-tool environments are effective in supporting education and learning and to derive some guidelines on their integration in order to optimize their effectiveness for learning.

This paper focuses on the course Connectivism and Connective Knowledge (CCK08), facilitated by George Siemens and Stephen Downes in the fall of 2008 (Siemens & Downes, 2008). The CCK08 was an online course offered both formally through the University of Manitoba and also informally with enrollment open to anybody in the world (at no cost). As the terms course and informal might seem to be in conflict, we should clarify the meaning of formal and informal in this context. Formal refers to participants earning credit from the University of Manitoba and for that reason having to complete the course and obtain positive grading of assignments. Informal refers to participants attending the course and undertaking the activities at their own pace without receiving any type of academic certification or grading from the facilitators. Hybrid ways of attending were also possible in the course: According to Siemens (2009), one student enrolled in the course but was evaluated by her own institution.

The CCK08 course was also characterized by a variety of technological tools available to the students. Some were selected and proposed by the facilitators, and others were suggested by the participants. Even though the course assignments required only the use of a personal blog and a tool to build concept maps, during the course more than 12 different tools and technological environments were used, from LMSs (Moodle) to 3D environments (Second Life).

This paper reports the results of a survey conducted among the CCK08 attendants. The aim was to analyze learners’ views about the multi-tool environments adopted in the course and to give some suggestions for setting up multi-tool course environments.


The CCK08 course can be situated in the framework of open and distance learning (ODL) initiatives that are offered by institutions and single teachers around the world. The background is the open education movement, whose different aspects have been well illustrated by Iiyoshi and Kumar (2008). In their book, three main themes are covered: open technology, mainly related to open source software for education; open content, with reflections on some open educational resources (OER) initiatives; and open knowledge, with practical suggestions for sharing educational practices and considerations about institutional points of view on openness in education.

Furthermore, after years of online learning practiced mostly by replicating traditional activities and methods online (but with more flexibility due to fewer time and space constraints), recent research trends are shaping a specific online learning theory. For example, Anderson (2008) proposes a model in which the “affordances of the net” play a key role and in which it is possible to include the many different forms of teaching and learning supported by present web technologies, with an outlook towards future semantic web capabilities.

The CCK08 is not the first open online course (OOC). For example, in 2007/08 the Social Media Open Education course by Alec Couros and the Introduction to Open Education course by David Wiley had international relevance (Fini et al., 2008). Open online courses may be considered to be a special type of OER, which solves the problem of the lack of interaction that is typical of most OER initiatives. While OERs are merely content, OOCs are live courses, which include direct participation of teachers and rich and valuable interaction among participants. As Siemens (2009) says, they are examples of shifting from a content-centered model towards “socialization as information objects.” An OOC can be attended potentially by a large number of students, all over the world, provided that there are sustainability plans for the instructors. For example, it is unreasonable to think that a teacher is able to evaluate hundreds or thousands of blog posts weekly. Thus the real potential of an OOC is to be found in the emergence of learning networks among participants in a many-to-many relationship, rather than the traditional one-to-many model of interactions between a teacher and his or her students. Even this type of relationship can assume new forms, with significant changes in the role of instructors. The CCK08 is a particular case of this phenomenon. According to Downes (2009), 2200 people signed up, and hundreds of people from around the world participated in the CCK08 course, each with different behaviors, outcomes, and levels of involvement.  These figures inspired the massive open online course (MOOC) definition (Siemens, 2008).

In the literature, there are studies (for example, Liu et al., 2009) and guidelines for the effective use of VLEs (Weller, 2007) as well as comprehensive reviews of technology uses for education (for example, Mason & Rennie, 2007). The UK JISC published a study on the effective use of social software to support student learning and engagement (JISC, 2009) in which many benefits and challenges related to institutions, educators, and students were noted. In particular, Clarebout and Elen (2006) carried out a meta-analysis of research about the use of different tools in computer-based learning. Some studies in their review showed that students had difficulty making choices about which tools to use. Furthermore, they found several authors stressing the importance of metacognitive skills in making adequate decisions.

More recently, the idea of a “networked lifelong learner” has been offered by studies on personal knowledge management (PKM). Recent literature connects learning-to-learn competencies with technologies under the umbrella of this term (Cigognini et al., 2009) and identifies the related required skills as a multi-faceted set of abilities (Martin, 2006), including digital literacy, information literacy, and the ability to effectively use social software to build one’s own learning environment (Pettenati et al., 2009). Social and relational aspects of the knowing knowledge attitude in the connectivist framework (Siemens, 2004, 2006) inevitably highlight that mastering technology is but one among many complex skills.

When dealing with technologies, it is important to refer also to usability and, more generally, to human-computer interaction. Usability is a well-known term related to quality. According to Shackel (1991), a usable system can “be used by humans easily and effectively.” Nielsen (1999) provides a popular standard for the usability of websites. When applying usability concepts to web learning tools, it is crucial that the tools be particularly easy to use in order to avoid construct-irrelevant cognitive load, which can distract the learner by requiring his/her attention for the mere use of the tool. This is the remarkable concept (although not limited to learning tools) of “zero learning time” (Nielsen, 2000). General usability rules are still valid for websites, even if not always followed, as Nielsen (2007) reports; for example, some sites use excessive graphics. Specific studies related to the usability of virtual learning environments (Ardito et al., 2006) and usability issues in learning-oriented applications (Rigutti et al., 2008) are also available.


The survey consisted of three sections (the full version is available on the Internet):

1) Personal information: gender, age, nationality, mother tongue, level of proficiency in English, level of technological skills, profession;
2) General information about the CCK08 course: motivation, attendance type, completion, opinions about certification;
2a) General opinions about the toolset used in the course (most and least useful tools, level of comfort with the global toolset);
3) Detailed questions on each tool used in the course (frequency of use during the course, relevance, and comments). In some cases, questions were added.

The survey included a number of open-ended questions in order to gather detailed comments and opinions.

In this paper, a tag cloud visualization is used to represent the variety of answers received. Tag clouds were generated with an online tool. Unless specified otherwise, tag clouds include all the words used except common words (articles, prepositions, numbers, etc.). The majority of the open-ended answers have also been categorized for better statistical processing.
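The word counting behind such a tag cloud can be sketched in a few lines. This is a minimal illustration of the method described above, not the actual tool used; the stop-word list here is hypothetical, since the tool’s list is not documented.

```python
import re
from collections import Counter

# Illustrative stop-word list; the actual tool's list is not documented.
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "in", "for", "on", "it"}


def word_frequencies(answers):
    """Count word occurrences across open-ended answers, skipping
    stop words; numbers are excluded by the letters-only pattern.
    The resulting counts are what a tag cloud renders by size."""
    counts = Counter()
    for text in answers:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in STOP_WORDS:
                counts[word] += 1
    return counts
```

For example, `word_frequencies(["The Daily was useful", "Daily useful filter"])` counts *daily* and *useful* twice each while dropping the stop word *the*.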

The survey was managed online through a website based on LimeSurvey, an open source survey application. An invitation was emailed to 415 people. As a public and complete list of participants was not available, it was not possible to select the sample randomly. Thus, email addresses were collected from the CCK08 user profiles, which were accessible to people enrolled in the CCK08 Moodle environment, and also from users’ blogs. Furthermore, an announcement of the survey was published on Facebook, LinkedIn, and the author’s own blog. The online survey was active from December 1, 2008 to January 5, 2009.

Results and Discussion

Section 1: Personal Information

Eighty-three people completed the survey (49 males, 34 females). The overall age range of respondents was 28 to 69 years old (M = 48 yrs, SD = 9.75, N = 83). This is important evidence indicating that the course generated a special interest among adult learners.

Tables 1 and 2 illustrate the profile of the respondents, according to nationality and mother tongue.

Table 1

Table 2

As the course was taught in English, it was helpful to know the English proficiency level of the participants. As shown in Figure 1, more than 80% had a good level of English proficiency (mother tongue and advanced).

Figure 1

The situation regarding technological skills was very similar (N = 83): 24 were experts/ICT professionals, 41 were power users, 18 were normal users, and no one self-identified as a beginner.

OOCs claim to be open; nonetheless, there are at least two barriers to access. Participants are required to have some basic competencies, specifically ICT skills and a good level of English proficiency. While it is obvious that learners willing to attend an online course should have adequate ICT skills, language is also an issue because OOCs have generally been offered in English. It is worth noting that CCK08 had multiple language translations (made by local groups), but these translations were limited mainly to course content. Nonetheless, interactions took place in English, so it is reasonable to consider English proficiency a necessary skill for effective participation. Moreover, if an institution offers a MOOC, it is likely to be in English, which is presently the international lingua franca.

Figure 2 shows the tag cloud of the jobs of the respondents (N = 82). The educational professions comprised the vast majority; for example, the word teacher occurs 24 times (Table 3 shows the participants’ occupations).

Figure 2

Table 3

Section 2: The CCK08 Course

Items in section 2 concern the respondents’ attitudes towards the CCK08 course, from motivation to opinions about the course outcomes and the tools used.

Figure 3

Seventy out of 83 respondents declared that they attended the course informally; 12 attended for credit at the University of Manitoba; and one attended formally for credit at another institution. The vast majority of respondents were informal learners, attending the course for professional and/or personal development. This fact fits with the age profile of the respondents (mostly adults).

As regards completion (N = 80), only 15 respondents had completed the entire course, while 44 indicated a completion percentage lower than 50%; specifically, 20 wrote all of the required blog posts, 17 made the concept maps and wrote the required papers, and 16 completed the final project (N = 83).

Table 4 reveals that formal students committed to completing assignments more than informal students did. This is better highlighted by Figure 4, which shows the incidence of completion. Only four out of 70 informal students completed the course, compared to 11 out of 13 formal students.

Table 4

Figure 4

Although formal attendance seemed to be the main driver for completing assignments and the course, the main reason for not completing the course was a lack of time, as shown in Figure 5. Since this reason was reported mostly by informal students (it was selected by only one formal student who did not complete the course; the others reported that they obtained an extension), it seems that informal learning experiences such as the CCK08 course compete with other activities for personal time allotment. Learners, in the absence of a stronger motivation, attend only partially. A significant comment (under Other) was “Recession, took extra job”: S/he had little or no time for studying. Literature on engagement in distance higher education is mainly related to formal courses (for example, Angelino & Natvig, 2009). Nash (2005) quotes several studies in which authors emphasize the impact of factors external to the academic environment to explain distance learners’ drop-out rates. Ostman and Wagner (as cited in Nash, 2005) reported as early as 1987 a “lack of time” as the reason most commonly quoted by distance learners for course noncompletion. In the CCK08 case, the impact of technology seems to have been low since time still appears as the most important factor. Other authors, such as Miller, Rainer, and Corley (as cited in Nash, 2005), have analyzed the relationship between time and computer literacy in student attrition. However, in the case of voluntary students, e.g., the CCK08 informal participants, terms such as attrition and drop-out may be inappropriate because students attend the course without expecting a certificate, and they do so based on personal motivation only (see Figure 3). So it is reasonable to assume that one might be interested not in the course as a whole but, for instance, only in the first part (for example, to gain an overview of connectivism) or in a single topic.
Informal learning does not imply structures like courses, thus participants might feel free to learn only what they really need to at that moment. Nor can we consider participants who did not complete the course to be drop-outs. Because they have different aims and motivations, perhaps they do not consider course incompletion to be a defeat. Further investigation would be necessary to better understand if the very notion of a course really fits open course initiatives.

Figure 5

Furthermore, a “hand-made” or “hacked” certificate issued by the instructor (not by the institution) (Young, 2008) only partially affects the motivation to finish the course: Thirty-nine out of 83 respondents said that this would not be a sufficient reason for completing. This fact could be interpreted as a persistent dichotomy in the way people view education:  strictly formal on the one side or completely informal on the other side. Maybe the time is not yet ripe for “contaminations” like “edupunk” initiatives or, as noted above, maybe structured courses do not always need to be completed, as in the case of informal learners.

Finally,  a question about participation in communities not promoted by the instructors was asked in order to investigate the level of the participants’ self-organization skills. Perhaps the question was not clearly formulated since many people answered “I read and comment on blogs,” “I posted in the Moodle forum,” and so on. However, some answers pointed out the emergence of national communities (mainly Spanish and Italian), which were a disadvantage at times (“yes I browsed through the Spanish connectivitas and the Italian group. Got an overload of information”).

Section 2A: Opinions about the Course Toolset

In section 2a, opinions and views about the course toolset as a whole were investigated. Figure 6 shows the results from the item asking for the overall opinion on the toolset (N = 83, more than one answer possible).

Figure 6

A commonly used adjective was complete (34), but confusing and too rich (31 in total) were also used (only two people chose both indicators). This dichotomy is reinforced by the responses to the item about the overall informational architecture of the course, shown in Figure 7 (N = 83, more than one response possible).

Figure 7

Fifty-five respondents indicated that the information architecture of the course was clear, intuitive, and friendly (only one respondent chose all three indicators), but the number of respondents who indicated confusing and overwhelming was 37 (only two people chose both indicators).

Table 5 shows the results concerning the most useful tool (N = 73). The preferences converge on the most paradigmatic tools: the Daily (the “good old” mailing list) is at  the top, followed by Moodle (a “traditional” LMS), and the wiki and Elluminate (a web conferencing system). It is quite surprising that a Web 1.0 tool, a mailing list, is the most preferred instrument. Some excerpts from the comments on the Daily can help us to better understand this preference: “gives a broad vision of the course,” “was a useful filter,” “gave me guidance,” “a good starting point to locate relevant information and viewpoints,” “concise and practical,” “useful for rapid update.” This tool seems to have high relevance for adult learners who are worried about time and information-filtering. In a MOOC, it is nearly impossible to read all of the posts and the contributions, so many participants choose to delegate effective filtering to the instructor. The Daily provided the “highlights” of the course by email, without any further intervention. In this case, the instructor performed the role of an information broker, not producing content but collecting, selecting, and proposing relevant (in his/her opinion) resources.

Table 5

Figure 8 shows the tag cloud of comments on the Daily. There is also some criticism of this tool; for example, some people commented,  “it’s the instructor’s list of the course,” “the prejudices of the editor(s) were sometimes evident,” “steering only to certain sources,” and “it led to a kind of ‘gold star’ mentality.”

Figure 8

Table 6 illustrates opinions about the least useful tool (N = 70). In this case, there is a wider range of responses, although the majority of respondents converge on the first three tools. It is odd to find Moodle (the second most useful tool) at the top and also that 12 respondents did not have a clear opinion. In the case of Moodle, most of the respondents added that the discussion forum was not useful for them.

It is interesting to note that the Daily, the other popular tool, is placed at the bottom. Additionally, there is clearly both pro-Moodle and anti-Moodle opinion among the respondents. Pageflakes, which is tied at the top, is highly criticized because of its low usability and unclear interface. Some comments about it include the following: “an unnecessary mess,” “too much information, not relevant,” “didn’t really understand, I looked at the page but it felt like a disorganised mess,” “very confusing, there were a lot of duplicated and irrelevant flakes.”

These comments confirm the need for increased attention to usability since users do not want to deal with confusing interfaces. Interface problems cannot be solved merely by selecting well-designed tools and services because the concept of usability is only partially related to the user interface. The case of Pageflakes is instructive. Despite being a service based on a rich and engaging graphical interface, its specific use in the course (attempting to aggregate a high number of RSS feeds in one page) resulted in a “disorganised mess.” Thus it is necessary to evaluate the tools from the overall usability point of view, particularly in situations, such as MOOCs, where the “number of participants” variable can impact usability.

Ustream might be viewed as a redundant copy of Elluminate, which is a possible explanation for its presence among the least useful tools.

Table 6

Overall, respondents said that they were comfortable with the toolset. Figure 9 shows the distribution of the responses (N = 81), with a rating scale of 1-5 (from low comfort to high).

Figure 9

One item explicitly asked if the toolset may have influenced the outcomes of the course. The answers (N = 68) have been grouped in two main categories (Yes and No). The Yes group has been further divided into negative and positive influence.

Table 7 shows the results. Twenty-one out of 68 respondents answered that there was some sort of influence, and 11 said that it was negative. Some excerpts from the comments may help us to understand the reason for this: “the huge variety of tools has made it very difficult to actually follow the discussion, as it was just all over the place,” “tools were too many and some were completely unknown to me,” “I was very interested in this initiative. VERY early I felt lost about the overall technical environment,” “felt like a techno-idiot in this course.”

Among the No or positive influence comments were the following:  “I’ve become familiar with many tools I was not before. I have profited using them;”  “I think the toolset was great because I had choices. I could use the tools I was comfortable with and disregard the others;” “No, lack of time was the main issue not the toolset.”

Table 7

Finally, 56 out of 72 answered that they would use one or more of the tools in their future work or that they were already using them. Blogs, wikis, and Moodle were indicated as the preferred tools for future use.

Section 3: Specific Tools

This section contains a number of items, grouped by each tool used during the course. Tools include Moodle, blogs, Facebook, LinkedIn, Twine, Twitter, Ning, Elluminate, Ustream, Pageflakes, the Daily, Second Life, RSS, concept maps, social bookmarking, and Flickr (see the glossary of tools at the end of the paper).

Each group includes similar questions, in particular a rating-scale question about the importance of the tool within the course (1: low – 5: high). The results have been grouped by adding levels 1 and 2 (low group), level 3 (neutral group), and levels 4 and 5 (high group). Missing answers have been marked as “N/A.”
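The grouping described above can be sketched as follows; this is a minimal illustration of the collapsing procedure, with hypothetical data (`None` standing in for a missing answer).

```python
from collections import Counter


def group_ratings(ratings):
    """Collapse 1-5 importance ratings into the paper's three bands:
    low (1-2), neutral (3), high (4-5); None marks a missing answer."""
    buckets = Counter()
    for r in ratings:
        if r is None:
            buckets["N/A"] += 1
        elif r <= 2:
            buckets["low"] += 1
        elif r == 3:
            buckets["neutral"] += 1
        else:
            buckets["high"] += 1
    return dict(buckets)
```

For instance, `group_ratings([1, 2, 3, 4, 5, None])` yields `{"low": 2, "neutral": 1, "high": 2, "N/A": 1}`.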

Figure 10 shows the frequency of low scores, while Figure 11 shows the frequency of high scores and Figure 12 shows neutral ranking plus N/A.

Figure 10

Figure 11

Figure 12

According to these results, it is possible to place the tools in three categories:

  1. Definitely useful, relevant, significant: The Daily is the only tool that seems to obtain large agreement on its usefulness (66 high, 8 low, 9 neutral or N/A).
  2. Definitely not useful, not relevant, not significant: This category includes Facebook, LinkedIn, Twine, Ning, Pageflakes, Second Life, and Flickr. It seems that the majority of social networks and tools were considered less useful for this course. This is somewhat strange since the course subject was very close to the basic ideas underpinning Web 2.0 and social networks (Figure 13).
  3. Controversial tools: This category includes Moodle, blogs, Twitter, Elluminate, Ustream, RSS, maps, and social bookmarking sites. In this case, some people said that these tools were useful, but roughly the same number said that they were not useful or that they were indifferent to them. As shown in Figure 14, it is also possible to distinguish between controversial tools with a majority of high scores (Moodle, blogs, Elluminate), tools with a majority of low scores (Twitter, maps, and social bookmarking), and a few disputed tools (Ustream and RSS), which have very small differences among scores.

These highly controversial results show that participants have very different opinions about the tools, probably due to their various learning styles, personal objectives, time availability, etc. This is an important finding that seems to conform to the facilitators’ goal of distributing learning across multiple platforms through the use of “a wide array of technologies – under the control of course participants” (Siemens, 2009). This also aligns, for example, with the fundamental idea of a PLE, in which each student is free to organize his/her own set of tools, rather than having to adapt to the institution’s VLE. However, some respondents reported that they felt overwhelmed by the tools. This could be caused by their lack of skills related to the organization of an effective multi-tool personal knowledge environment (PKM skills, according to Pettenati et al., 2009).

Moreover, since the majority of informal participants do not care about course completion, it is reasonable that each participant might consider using only a subset of tools. For example, those interested in a single topic of the course could join the related live session, or if they have more time they could take part in a discussion in a blog post or in Moodle’s web forum. In this respect, the richness of tools seems to have a positive role, enabling a personal choice of the “best tools for me.”

Finally, it can be observed that, particularly at the beginning of the course, it was not clear to the participants that they did not have to use every service/environment. This perceived lack of choice might have induced the sense of overload and confusion that clearly emerged from the answers of some respondents.

A first suggestion for future MOOCs is to highlight the purpose of the wide range of tools and clearly state that it is not mandatory to use all of them. Alternatively, a clearer description of the pedagogical aim of each tool could be provided in order to avoid confusing and overwhelming students.

Since time seems to have been the most important variable, these informal adult students were mainly interested in learning as much as possible with the least possible effort, regardless of the variety of tools and the chance to build networks. They seemed to have a low interest in an “I want to organize my personal view on content and relationships using the tools” approach (which is more related to PKM skills) because it was perceived as time-consuming. For this kind of learner, the importance of “mediated/filtered views” of course content (the Daily) was very high, even though there was some criticism of the “only one voice” results of this method. For future editions of the course, perhaps a multi-filtered approach based on a wider, collaborative board of “Daily-makers” could help to better respond to the expectation of plurality. It may also be a good idea to make the underlying instructional design principles related to the plurality of technical environments more explicit.

Figure 13

Figure 14

Personal blogs played an important part in the course as participants were requested to post their messages in the blog as an assignment. Some interesting elements emerged from the specific questions (N = 83): Thirty-four learners used a generic personal blog, while 39 used a blog specifically dedicated to the course, and 36 started a new blog for the course.

Commenting and cross-blogging was mostly reported as medium-low intensity (61 out of 83 said level 1-3 on a scale of 5), and only 15 reported high activity (level 4-5). Indeed these are time-consuming activities, and time issues seem to have been overwhelming for most participants.

In relation to the real use of the tools, Moodle was the most used (only 2 out of 83 people said “never used it”), followed by the Daily (only 3 people said “not subscribed”) and blogs (13 people said “never used it”). Less used tools were Flickr (58 “never used”), Pageflakes (54), and Twitter (45).

The Daily was the only tool constantly used throughout the course (77 out of 83), while Moodle (32), Pageflakes (20), and blogs (18) were used only in the first phase. For example, many users registered on the Moodle site in the beginning and then stopped using it. Pageflakes could have been abandoned because of its low usability. Some participants started to blog but eventually discontinued. The Daily, perhaps due to its passive, time-efficient nature, continued to be used.

Results from the questions about RSS are surprising (N = 83): Twenty-five people did not use RSS at all, and only 21 people used the provided OPML file. This fits with existing evidence that the use of RSS technology is not popular. According to Rubel (2008), who reports research by Forrester, RSS adoption among general Internet users in the USA is 11%.

Social networks were not used by the majority of participants, as shown in Figure 15. Among them, only Facebook seems to have been popular.

Figure 15

Some specific evidence emerged from the items related to web conferencing tools. In the case of Elluminate, 44 out of 79 stated that they attended fewer than three sessions, while 10 out of 79 attended more than eight sessions. Most people affirmed that they had no time to attend (51), but time zone differences (34) and language (10) were also reported as issues to be considered (N = 79, more than one answer possible). Ustream had roughly the same responses, with some differences; for example, time zone and language were more troublesome in Elluminate, while technical issues affected Ustream sessions. It is worth noting that 10 out of 15 people who declared a lower level of English proficiency (14 intermediate and 1 basic) also said they had language difficulties in Elluminate. Web conferencing seems to be a further barrier for people who are not confident in their language abilities.

Time zone differences are a relevant issue, so the use of synchronous tools should be carefully planned in a MOOC.
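As a rough illustration of the scheduling problem, the sketch below shows how a single weekly session fixed in one time zone lands at very different local hours for a distributed cohort. The date, time, and zones are invented for the example and are not the actual CCK08 schedule.

```python
# Hedged sketch: session time and participant zones are arbitrary
# examples, not data from the course.
from datetime import datetime
from zoneinfo import ZoneInfo

# A hypothetical 18:00 session fixed in a North American time zone.
session = datetime(2008, 10, 1, 18, 0, tzinfo=ZoneInfo("America/Winnipeg"))

# Render the same instant in a few participant time zones.
for tz in ("Europe/Rome", "Asia/Tokyo", "Australia/Sydney"):
    local = session.astimezone(ZoneInfo(tz))
    print(tz, local.strftime("%a %H:%M"))
```

An evening session for the facilitator falls in the middle of the night or the next working morning for participants elsewhere, which helps explain why attendance at synchronous sessions was low.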

Figure 16

Figures 10, 11, and 12 (reinforced by Figure 15) show that pre-structured social networks were considered largely unhelpful. The social network in CCK08 seems to have been built on Moodle, the blogosphere, and the web conferencing environments.


To calculate some correlations, a few variables have been identified:

Demographics: GENDER; AGE, coded 1-4 according to the quartile distribution; ADVENG (advanced English), a dichotomous variable grouping intermediate and basic (0) versus advanced and mother tongue (1); ADVICT (advanced ICT skills), a dichotomous variable grouping users (0) versus power users and professionals (1); FORMAL (0 for informal, 1 for formal attendance); MOODLEY (1 for people who said Moodle was the most useful tool, 0 otherwise); MOODLEN (1 for people who said Moodle was the least useful tool, 0 otherwise); TOOLNEG (1 for those who said the toolset had a negative influence on their outcomes, 0 otherwise); ELLUMENG (1 for people who reported language difficulties in web conferencing sessions, 0 otherwise).

Since the data are non-parametric, the Kruskal-Wallis ANOVA was used. The chi-square analysis shows only one highly significant difference: ELLUMENG vs. ADVENG has a chi-square of 29.4473 (p < 0.001), confirming that strong language issues influence participation in web conferences. Age, gender, formal/informal attendance, and ICT level do not affect the preference for any of the tools.
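The 2x2 chi-square computation behind such a result can be sketched in a few lines. The cell counts below are invented for illustration only (the paper reports the statistic and significance level, not the underlying table); the formula is the standard one for a 2x2 contingency table.

```python
# Chi-square statistic for a 2x2 contingency table, of the kind used
# for ELLUMENG vs. ADVENG. The example counts are hypothetical: the
# raw survey cross-tabulation is not published in the paper.

def chi_square_2x2(a, b, c, d):
    """Table layout:
                   ADVENG=0  ADVENG=1
    ELLUMENG=1        a         b
    ELLUMENG=0        c         d
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical table: language difficulties concentrated almost
# entirely among the non-advanced English speakers.
print(round(chi_square_2x2(10, 1, 5, 63), 2))
```

A large statistic with 1 degree of freedom (the critical value at p = 0.001 is about 10.83) indicates that reported language difficulties and English proficiency are not independent, which is the pattern the study found.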


Although this study has some limitations, in particular the relatively low number of survey respondents, some significant findings have emerged, as well as some controversial issues that deserve further analysis in future research.

First, participants in CCK08 have varying opinions about the tools. As noted above, there may be several reasons for this: people participated in the course in a number of ways, according to their learning styles, personal objectives, and time availability. The choice of the right tool is probably related to the specific user’s needs, purposes, and self-organization skills. Regarding participants’ awareness of the effectiveness of different learning tools, the findings show that users are able to make selective choices, abandoning, for example, tools with low usability (e.g., Pageflakes) or not using the most popular social networks if they are not considered relevant to the course.

The use of generic social networks external to the course (even though it is possible to create ad hoc communities, for example in Facebook or Ning) was perceived as unnecessary because the actual social network was built around a small group of “major tools” (Moodle, Elluminate, blogs). This point is worth further reflection and research because it seems to partly disagree with Anderson (2005), who describes the role of learning communities using educational social software as a key mitigating factor against the isolation of self-paced learners. The national communities (particularly the Spanish one) were an exception and may have played a special support role, even though respondents did not view language as a real barrier (with the exception of the web conferencing tools).

Some respondents showed a high level of reflection about the organization of their learning (and knowledge) technological environments, both for themselves and for their students (since most respondents were teachers).  This is denoted by comments such as, “The integration is what is interesting, not the tools themselves. The course encourages integration of available tools. But... would be hard for many teachers to implement,” “The combination of Moodle, a wiki and personal blogs is a powerful learning tool and I think I will use it with 16-19 year-old students,” and “I use them for personal learning network and with students to help them build personal learning networks.”

Overall, despite the abundance of tools that were proposed by the instructors as a metaphor for the course itself and as opportunities for building networks (Siemens, 2009), it seems that a more traditional approach is preferred.  In general, people seem to be torn between the time-saving advantage offered by the “Daily” solution and the multi-faceted, time-consuming alternative represented by direct access to unfiltered information. The “massive” character of the CCK08 course makes these alternatives difficult to balance.

Further research might also overcome the limitations of this study to better investigate the profile of MOOC participants, especially as it relates to course outcomes and retention. Moreover, issues related to sustainability and the workload of instructors should be studied in depth in order to better understand the cost and effectiveness of these initiatives.

Glossary of Tools

The Daily: a mailing list managed by Stephen Downes, one of the facilitators, who sent subscribers a daily message summarizing the key topics of the ongoing conversation, such as the most interesting posts, usually with comments.

Moodle: an open source course management system, generally used by institutions for managing online courses. In CCK08 it was used mainly for discussions in web forums.

Elluminate: a web conferencing system. CCK08 included weekly web conference sessions via Elluminate, usually managed as informal conversations. Sometimes guest speakers were invited.

Ustream: a video streaming system. In CCK08 it hosted a weekly discussion based on the activities of the week.

Pageflakes and Netvibes: services that allow aggregation of RSS feeds in a single page.

OPML and RSS: OPML is an XML file format used to easily export/import lists of RSS feeds in aggregators. The instructors provided an OPML file including a large number of participants’ blog feeds. RSS is the standard format used for syndication of blogs and other websites.
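For readers unfamiliar with the format, the following sketch shows a minimal OPML feed list and how an aggregator might extract the feed URLs from it. The document content and URL are invented placeholders, not taken from the actual CCK08 file.

```python
import xml.etree.ElementTree as ET

# Minimal, invented OPML document: each <outline> entry points to one
# RSS feed via its xmlUrl attribute.
opml = """<opml version="2.0">
  <head><title>Participant blogs (example)</title></head>
  <body>
    <outline type="rss" text="Example blog"
             xmlUrl="http://example.com/feed.rss"/>
  </body>
</opml>"""

# An aggregator imports the list by collecting the xmlUrl attributes.
feeds = [o.get("xmlUrl") for o in ET.fromstring(opml).iter("outline")]
print(feeds)
```

Importing one such file is how a participant could subscribe to dozens of course blogs at once, rather than adding each feed by hand.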

Facebook: a popular social network service. It included a specific CCK08 group.

Linkedin: a social network service oriented to business contacts.

Twitter: a micro-blogging service based on short messages. Users publish their own “status” and read those of others.

Ning: a service that allows users to create their own personalized social networks. A CCK08 Ning network was available.

Second Life: a 3D virtual world in which users act as avatars in an immersive environment and can create their own artefacts. Avatars can interact via text chat and audio.

Twine: a semantic web service for collecting and connecting content by topic.

Flickr: a popular photo sharing service.

Social bookmarking: a generic term for services that allow users to store and share bookmarks on the Web.

Conceptual maps: web tools for collaboratively editing conceptual and mental maps.

References
Anderson, T. (2005). Distance learning – social software’s killer app? Retrieved from

Anderson, T. (2008). Towards a theory of online learning. In  T. Anderson (Ed.), Theory and practice of online learning. Athabasca, AB: Athabasca University Press. Retrieved from

Angelino, L., & Natvig, D. (2009). A conceptual model for engagement of the online learner. Journal of Educators Online, 6(1). Retrieved from

Ardito, C., Costabile, M., Marsico, M., Lanzilotti, R., Levialdi, S., Roselli, T., & Rossano, V. (2006). An approach to usability evaluation of e-learning applications. Universal Access in the Information Society, 4(3), 270-283.

Attwell, G. (2007). Personal learning environments - the future of eLearning? eLearningPapers, 2(1). Retrieved from

Cigognini, M.E., Pettenati, M.C., & Edirisingha, P. (2009). Personal knowledge management skills in web 2.0-based learning. In M.J.W. Lee & C. McLoughlin (Eds.), Web 2.0-based e-learning: Applying social informatics for tertiary teaching. Hershey, PA: IGI Global.

Clarebout, G., & Elen, J. (2006). Tool use in computer-based learning environments: Towards a research framework. Computers in Human Behavior, 22(3), 389-411.

Downes, S. (2008). Introducing edupunk. Retrieved from

Downes, S. (2009, April 25). New technology supporting informal learning [Web log post]. Retrieved from

Fini, A., Formiconi, A., Giorni, A., Pirruccello, N. S., Spadavecchia, E., & Zibordi, E. (2008). IntroOpenEd 2007: An experience on open education by a virtual community of teachers. Journal of e-Learning and Knowledge Society, 4(1), 231-239. Retrieved from

Iiyoshi, T., & Kumar, M.S.V. (Eds.) (2008). Opening up education: The collective advancement of education through open technology, open content, and open knowledge. Cambridge, MA: The MIT Press.

JISC (2009). Study of the effective use of social software to support student learning and engagement. Retrieved from

Leslie, S. (2007, October 29). Your favourite “loosely coupled teaching” example? [Web log post]. Retrieved from

Liu, N., Zhong, Y., & Lim, J. (2009). An empirical investigation on the effectiveness of virtual learning environment in supporting collaborative learning: A system design perspective. In Human interface and the management of information: Information and interaction. Berlin/Heidelberg: Springer.

Martin, A. (2006). Literacies for the digital age: Preview of Part 1. In A. Martin & D. Madigan (Eds.), Digital literacies for learning, (pp. 3-25). London, UK: Facet Publishing.

Mason, R., & Rennie, F. (2007). Using Web 2.0 for learning in the community. The Internet and Higher Education, 10(3), 196-203.

Nash, R. (2005). Course completion rates among distance learners: Identifying possible methods to improve retention. Online Journal of Distance Learning Administration, 8(4). Retrieved from

Nielsen, J. (1999). Designing web usability. Indianapolis, Indiana: New Riders.

Nielsen, J. (2000, July 23). End of web design. Retrieved from

Nielsen, J. (2007). Web 2.0 can be dangerous. Retrieved from

Pettenati, M.C., Cigognini, M.E., Mangione, G.R., & Guerin, E. (2009). Personal knowledge management skills for lifelong-learners 2.0. In Social software and developing community ontology. Hershey, PA: IGI Global. Retrieved from

Rigutti S., Paoletti S., & Morandini A. (2008). Lifelong learning and e-learning 2.0: The contribution of usability studies. Journal of e-Learning and Knowledge Society, 4(1), 221-229. Retrieved from

Rubel, S. (2008). RSS adoption at 11% and it may be peaking, Forrester says. Retrieved from

Shackel, B. (1991). Usability-context, framework, definition, design and evaluation. In B. Shackel & S. Richardson (Eds.), Human factors for informatics usability (pp.  21-38). Cambridge: Cambridge University Press.

Siemens, G. (2004). Connectivism: A learning theory for a digital age. Retrieved from

Siemens, G. (2006). Knowing knowledge. Retrieved from

Siemens, G. (2008). MOOC or mega-connectivism course. Retrieved from

Siemens, G., & Downes, S. (2008). Connectivism & connective knowledge. Retrieved from

Siemens, G. (2009). Socialization as information objects. Retrieved from

Weller, M. (2007). Virtual learning environments: Using, choosing and developing your VLE. Oxford, UK: Routledge.

Wilson, S. (2005, January 25). Future VLE - The visual version [Web log post]. Retrieved from

Young, J. (2008). When professors print their own diplomas, who needs universities? The Chronicle of Higher Education. Retrieved from