September – 2014

Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

Shane Dawson¹ and George Siemens²
¹University of South Australia, Australia; ²University of Texas-Arlington, USA

Abstract

The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have generated interest among researchers and academics in revising educational practice to move beyond traditional ‘literacy’ skills towards an enhanced set of “multiliteracies” or “new media literacies”. Measuring the literacy of a population, in the light of its linkage to individual and community wealth and wellbeing, is essential to determining the impact of compulsory education. The opportunity now is to develop tools to assess individual and societal attainment of these new literacies. Drawing on the work of Jenkins and colleagues (2006) and notions of a participatory culture, this paper proposes a conceptual framework for how learning analytics can assist in measuring individual achievement of multiliteracies and how this evaluative process can be scaled to provide an institutional perspective of the educational progress in fostering these fundamental skills.

Keywords: Learning analytics; multiliteracies

Introduction

The development of a literate population is one of the most recognizable goals of public education. Central to this goal has been defining what counts as literacy for contemporary society. In the more traditional sense, literacy was, and continues to be, deeply enmeshed with the written word (Kress, 2003). As Jim Dator (2005) argues, “seldom has a technology been the subject of more worship than the word is in literate cultures” (p. 202). This emphasis on the written word stems from the perceived relationship between basic literacy skills and future economic prospects. Even in the industrial age, the cultivation of a literate population (reading, writing and basic arithmetic) was viewed as an essential ingredient for achieving democracy, economic growth, and social stability (Kalman, 2008). In essence, an individual’s future social and economic prosperity was related to their capacity to read and write (Leadbeater & Wong, 2009). It is little wonder then that literacies have continued to be so intimately linked to economic benefits, such as higher socio-economic status, increased job opportunities, and increased wealth within the community (UNESCO Education Sector, 2004).

Basic literacies are as relevant in today’s information age as they were previously, especially given that the Internet and mobile devices remain heavily text-based technologies (Greenhow & Robelia, 2009; Warschauer, 2007). However, the pervasiveness and pace of change associated with new forms of media, and the contexts in which they are applied, place additional expectations on what it now means to be “literate” (Anstey & Bull, 2006; Huijser, 2006). That is, individuals are now expected to have at their disposal a diverse set of skills and cultural competencies necessary to navigate the various forms of digital communication and participation in a global society. While not understating the continued importance of reading, writing, and numeracy skills, an emphasis placed solely on those ‘basics’ ignores the broader changes that have occurred in the cultural, social, and economic landscape and the speed of access to technologies and information now prevalent in society (Kalantzis, Cope, & Harvey, 2003). A society increasingly reliant on technology for information access and communication, and shaped by the globalization of information, requires its citizens to draw effectively on a greatly expanded skill set that encompasses cultural and new media competencies as well as a recognition of the various contexts in which they are applied (Coiro, Knobel, Lankshear, & Leu, 2014).

If an expanded form of literacy, multiliteracies, is the key to individual and community wealth and wellbeing within a society, then it is crucial to establish measures of how well students in the education system are mastering these fundamentals. Not only is there a need to assess individual students on their progress and mastery of multiliteracy skills, but it is also important to measure the systemic progress and attainment of multiliteracies across the education system and society as a whole. Measuring the literacy of a population, given its linkage to individual and community wealth and wellbeing, is thus a fundamental gauge of the success of formal education.

This paper begins with an examination of multiliteracies in order to establish a defined set of skills that can be considered essential literacies for today’s society. Following this review, the potential assessment and learning activity artifacts that might be generated in the process of learning multiliteracies are discussed. The adoption of online and blended modes of learning continues unabated at a global level, and as such there are now unprecedented levels of trace data that can be harnessed to inform teaching and learning practice. Given the potential artifacts that learners generate in these online learning environments, this paper proposes a conceptual framework for using learning analytics to measure the development of multiliteracies across a group of learners. The paper outlines a framework for moving analytics from what has previously been described as the “low hanging fruit”1 up the ladder to richer and more complex multi-dimensional analytics in education. The conceptual framework emphasizes the identification of key multiliteracies based on a review of frameworks created by media scholars, and then suggests possible data sources and analytics strategies that provide a means of measuring and determining the impact of those literacies. Validation of the learning analytics/multiliteracy framework is beyond the scope of this paper, but it is expected that detailing the relationship between literacies, artifacts, and analytics techniques will lead to empirical work to test, validate, and revise the frameworks offered by media scholars.

Multiliteracies

Recognition of the dramatically changing nature of what it means to be literate in the so-called ‘information age’ has seen the rise of discussions within educational research around the importance of students developing “multiliteracy” skills (I. Brown, Lockyer, & Caputi, 2010; Cope & Kalantzis, 2000a; Haythornthwaite & Andrews, 2011). As first described by the New London Group (1996), the term multiliteracies extends the scope of traditional literacy to include the diversity of media and modes of communication that are now available to learners and the varying contexts in which they are utilized. In discussing the genesis of the term multiliteracies, Gee (2009) noted that literacy “needed to be viewed as embedded in multiple socially and culturally constructed practices, not seen as a uniform set of mental abilities or processes” (p. 196). In essence, the New London Group (NLG) challenged the existing singular view of the term literacy recognizing the multiplicity of communications available to learners both as producers and consumers and the increasing cultural and linguistic diversity that is prevalent today (Cope & Kalantzis, 2000b). Simply put, new technologies are changing the way we communicate and interact. As such, the term multiliteracies is often used interchangeably with new literacies, digital literacies, or media literacies.

Since establishing “a pedagogy of multiliteracies” (Cope & Kalantzis, 2000b; New London Group, 1996), much conceptual and theoretical work has been undertaken in the literacy field (e.g., Gee, 2007; Kress, 2003; Lankshear & Knobel, 2003; Muspratt, Luke, & Freebody, 1998). The goal of this paper is to bring a new perspective of how to assess and evaluate the development of new literacies and the pedagogical activities that underpin such characteristics and skills, rather than to establish an authoritative definition or review of literacies.

A Participatory Culture

While further theorizing of the changing nature of and intersection between literacy, learning, and digitality needs to continue, there is acceptance that multiliteracies involve an increasing set of social skills that draw upon an ever expanding set of technologies, media, and discourses (Gee, 2007; Unsworth, 2001). In this context, viewing literacy through a sociocultural lens encompasses the fundamentals of literacy (e.g., reading, writing and meaning making) within embedded social practice (Lankshear & Knobel, 2007). Henry Jenkins (2006) emphasizes this point in noting that “the new literacies almost all involve social skills developed through collaboration and networking” (p. 4).

The complexity of the information, media, and technology environment that learners draw on in their day to day academic and social activities, as well as the types of skills necessary to be productive community members, is captured in Jenkins et al.’s (2006) notion of a participatory culture. Jenkins and colleagues describe a participatory culture as one with “relatively low barriers to artistic expression and civic engagement, strong support for creating and sharing creations, and some type of informal mentorship whereby experienced participants pass along knowledge to novices” (p. xi). Educational engagement in this form of practice is strongly aligned with more Vygotskian-influenced approaches to learning such as social learning (J. S. Brown & Adler, 2008), game-based learning (Gee, 2007), self-directed learning (Garrison, 1997), and communities of practice (Lave & Wenger, 1991). The US Department of Education report, Transforming American Education: Learning Powered by Technology (2010), echoes this sentiment in noting that contemporary education practice must embrace technology mediated modes of learning for establishing access to diverse resources and connections to the broader learning community – a community that extends beyond the classroom and beyond national borders. Advocacy for these learning orientations is grounded in the view that knowledge is constructed through an individual’s interactions with the broader social group. To be productive participants in these social learning approaches requires not only the basic literacies but also what Jenkins describes as the “new media literacies” (Jenkins et al., 2006) or “cultural competencies and social skills” (p. xiii) necessary to be productive participatory citizens.

The competencies and skills necessary for a participatory culture include networking, collaboration, creativity, citizenship, and communication within a multiplicity of modes and mediums. Table 1 outlines Jenkins et al.’s (2006) new media literacies stressing the transition of literacy practice from individual to community. Simply put, for 21st century education these skills and attributes are considered to be the ‘new basics’. As such, the plurality of these new literacies also calls for new and diverse forms of assessment to be developed and implemented within the formal education context.

Table 1. Jenkins et al.’s (2006) new media literacies

Assessment of Literacies

The availability of an ever expanding wealth of information online, alongside the growth in the adoption of web-enabled mobile platforms, impacts how, when, and where we learn, who we learn with, and how we evaluate the legitimacy of information (Haythornthwaite & Andrews, 2011). Anywhere, anytime learning now occurs, with learners engaging with mentors, peers, and experts with minimal regard to time or place. These learner-empowered collaborations complement recent research related to teaching and learning in which social learning is viewed as the primary framework for effective pedagogical practice (J. S. Brown & Adler, 2008; Siemens, 2005). While embracing the pedagogical benefits that technology brings to the education context, there is an associated necessity for developing “new and better ways to measure” learning (U.S. Department of Education Office of Educational Technology, 2010, p. xi).

Government calls for education accountability through standardized testing have in part reinforced the concept of a ‘back to basics’ literacy approach. Standardized testing using more traditional assessment methods drastically limits the evaluation of the social and cultural skills and competencies associated with participatory skills. This point is emphasized by Mary Kalantzis and colleagues (2003) in noting that “accountability and commensurability has focused global attention on producing education outcomes which are simple to interpret, tangible and transparent, and easily comparable” (p. 15). There is a need to examine alternate assessment approaches that can balance calls for accountability and standardization against the imperative to measure the full spectrum of multiliteracies. More recently, Literat (2014) noted that the few studies that have attempted to measure literacy have generally focused on an individual’s understanding of text and/or audio. Literat argued that alternate assessment methods are required to evaluate a broader suite of literacies, and addressed this methodological problem by demonstrating the capacity of psychometric methods to provide indicators of new media literacy skills. Literat’s (2014) new media literacy questionnaire begins to illustrate the capacity for alternate approaches to identify potential lead indicators of literacy development and comprehension. As detailed in Table 1, many of these skills are not readily assessed through standardized testing. Literacies such as “play” or “negotiation” are socially embedded and enabled. As a result, effective assessment requires a more nuanced and diverse approach than is possible through basic knowledge testing.

Alongside the curricular adoption of the skills and competencies associated with multiliteracies, current assessment practices must extend beyond the measurement of outputs to provide greater insight into learning processes (Edwards, 1997). To facilitate this transition, David Boud (2000) maintained that assessment practices cannot remain the “exclusive domain of assessors” (p. 151) as responsibility also resides with the learner. It is only through an individual’s capacity to evaluate, interpret, and make decisions about their own learning progress that education will foster the skills necessary for capable lifelong learners (Boud, 2000). Boud’s notion of “sustainable assessment” and learner empowerment is in keeping with calls for the development of new assessment practices that address literacies such as creativity, networking, citizenship, and collaboration (Kalantzis et al., 2003). Thus, for education, contemporary assessment practice needs to reflect these community-centric learning models and include measures of learning progression at both the community and individual levels. This is exemplified in Csikszentmihalyi’s (2006) call for creativity to be assessed at the level of the community, not as an end product derived from a sole individual. However, pressures of accountability, equity, standards, and quality assurance result in a focus on assessment of the individual at the expense of the community. It is at this intersection of education accountability and the need for collaborative, real-time assessment measures that analytics can play an important role. The following section provides a brief overview of learning analytics before discussing the alignment of Jenkins et al.’s (2006) new media literacies with learning analytics indicators. This example is used to illustrate how analytics can provide insight into individual and community based learning for the 21st century.

What are Learning Analytics?

The field of education has entered the era of “big data”. The McKinsey Global Institute report (2011) defines “big data” as a “dataset whose size is beyond the ability of typical database software tools to capture, store, manage and analyze” (p. 1). As noted in the report, while the definition is subjective, the ‘big data’ concept relates to the flood of data that is generated and captured as users interact with the myriad of IT systems that support daily activities, from iTunes, Twitter, and YouTube through to ecommerce and public services. In an education context, student information systems and learner interactions with various technologies such as learning management systems (LMS) and social media leave a trail of digital breadcrumbs that are accessible for data mining and analysis (Buckingham Shum & Ferguson, 2012; Duval, 2011; Fournier, Kop, & Sitlia, 2011; Macfadyen & Dawson, 2010). The extraction and analysis of the data derived from these technologies has captured the attention of politicians, education leaders, researchers, and day to day education practitioners. This is partly due to the expectation that these relatively new forms of, and processes for, analytics can address some of the more pressing concerns confronting the education sector, including increasing completion rates, addressing basic curriculum standards, reducing student questionnaire overload, and improving accountability and the measurement of teaching effectiveness and quality. Although to date the field of learning analytics has largely focused on learner progression and developing lead indicators of student attrition (Campbell, De Blois, & Oblinger, 2007; Fritz, 2011), there is much potential (and some early research) in using student interaction data to establish indicators of more complex concepts such as knowledge construction (Pozzi, Manca, Persico, & Sarti, 2007), sense of community (Dawson, 2006; Gasevic, Adesope, Joksimovic, & Kovanovic, submitted), creativity (Dawson, 2010; Dawson, McWilliam, & Tan, 2011), and self-regulated learning (Biswas, Jeong, Kinnebrew, Sulcer, & Roscoe, 2010; Winne & Hadwin, 2013).

Learning analytics adopts many of the methods and approaches used by “big data” practitioners, but is specifically defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and the environments in which it occurs” (Siemens & Long, 2011). Learning analytics uses the data associated with a learner’s interactions with content, other learners, and the educational institution to make decisions and evaluations about teaching practices, personalized content, and needed interventions for learner success. The field draws on and integrates research and methodology related to data mining, social network analysis, data visualization, machine learning, learning sciences, psychology, semantics, artificial intelligence, e-learning, and educational theory and practice. Learning analytics focuses on the interpretation of educational data from a learner and teacher orientation, placing as much emphasis on understanding the pedagogical context from which the data are derived as on developing statistically robust interpretive and predictive models.

The interest in learning analytics is fuelled by increasing calls for accountability, the quest for evidence, and the demonstration of what counts as learning and teaching quality, alongside decreasing fiscal resources. Similarly, the accessibility of student learning data from various information systems has also contributed to the use of learning analytics as a tool for measuring impact and informing the strategic decision making process within education organizations (Macfadyen & Dawson, 2012). Although the concept of using these forms of analytics to inform education practice has been well received, the majority of educational institutions seldom make optimal use of their available data and analytical resources (Norris, Baer, Leonard, Pugliese, & Lefrere, 2008). Furthermore, as a result of the relative infancy of the field of learning analytics, the level and sophistication of the data analysis performed has to date been limited, with a predominance of studies and reports undertaking simple univariate or bivariate analyses (Dawson, Gašević, Siemens, & Joksimovic, 2014). For instance, low-level analyses such as the reporting of student login times, number of posted messages in a discussion forum, or total time online are common analytics approaches. Bivariate analyses have tended to relate readily accessible learning management system (LMS) data (e.g., number of messages posted, frequency of logins, time online) to academic performance as measured by grades.
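
As a concrete illustration of this kind of bivariate analysis, the following sketch (Python, with pandas and SciPy assumed available) correlates simple LMS activity counts with final grades; the file and column names are hypothetical placeholders for an institution-specific export.

```python
# Minimal sketch of the bivariate analyses described above: correlating
# readily available LMS activity counts with final grades.
# The CSV file and column names are hypothetical.
import pandas as pd
from scipy import stats

lms = pd.read_csv("lms_activity.csv")  # assumed columns: student_id, logins, posts, time_online_mins, final_grade

for measure in ["logins", "posts", "time_online_mins"]:
    r, p = stats.pearsonr(lms[measure], lms["final_grade"])
    print(f"{measure}: r = {r:.2f}, p = {p:.3f}")
```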

These early studies have served well to raise the profile of learning analytics and to generate interest in the field as a means to improve student retention and success, including the development of early indicators of academic performance through LMS usage trends. The Purdue University Course Signals2 project is an example of such an early warning system. The Signals software combines a student’s past academic performance and demographics with LMS activity and engagement data to provide a statistical assessment of an individual learner’s probability of success (Arnold, 2010). Watson and Gemin (2008) defined “at-risk” students as those with a high probability of withdrawing from a course as a result of academic failure, dropping out, or expulsion for behavioral reasons. In the instance of Signals, success is measured by an individual’s overall assessment score. The intent of the software is to provide early warning indicators to learners and instructors regarding an individual’s risk of course failure or attrition.
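
A hedged sketch of how an early warning model of this general kind might be assembled is given below; it is not the Signals algorithm itself, but a logistic regression over hypothetical prior-performance, demographic, and LMS-engagement features, with scikit-learn assumed available.

```python
# Illustrative early warning model (not Purdue's Course Signals algorithm):
# estimate each student's probability of course failure from prior
# performance, demographics, and LMS engagement. Data are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("student_records.csv")  # assumed columns listed below
features = ["prior_gpa", "credit_load", "logins_per_week", "forum_posts", "assignments_submitted"]
X, y = df[features], df["failed_course"]  # 1 = failed or withdrew, 0 = passed

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Flag learners whose predicted risk of failure exceeds a chosen threshold.
risk = model.predict_proba(X_test)[:, 1]
flagged = (risk > 0.7).sum()
print(f"Hold-out accuracy: {model.score(X_test, y_test):.2f}; students flagged: {flagged}")
```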

The implementation of practices that focus on improving student retention has a direct and easily measurable return on investment and aligns well with numerous federal government initiatives internationally. For instance, the Australian Bradley Report (Bradley, Noonan, Nugent, & Scales, 2008) emphasized the imperative for increasing graduation rates; Singapore’s “Thinking Schools, Learning Nation” (Ministry of Education, 1998) called for evidence based policy making; and the US “Building a Grad Nation” (Balfanz, Bridgeland, Moore, & Fox, 2010) outlined a strategy for addressing the nation’s increasing level of attrition. These types of reports add further weight to the importance of establishing well-grounded processes and practices for informed decision making, quality assurance, and accountability. However, stated targets for reducing attrition alone, while admirable, take a simplistic view of the learning process and do little to measure the learning of skills and attributes.

Researchers such as Macfadyen and Dawson (2010) have investigated LMS data to predict student academic performance. While these authors used relatively simple metrics such as student grades as an indicator of success, the study incorporated a more sophisticated modeling process to analyze 15 LMS variables and determine and evaluate a best-fit predictive model. The authors note that social learning (such as in a discussion forum) and the number of formative assessment tasks completed accounted for greater than 30% of the variation in student grades. Macfadyen and Dawson’s study moved beyond the incorporation of simple engagement measures, such as the number of discussion messages posted, towards more complex analytic measures such as social network analysis (SNA). The inclusion of social network methods for determining community participation aligns with the move for new media literacies to be evaluated at the level of community (Jackson, 2006). Combining SNA with more automated qualitative analytics through machine learning and computational linguistics can further enrich the prospects for establishing a meaningful model for evaluating new literacies. For example, while SNA provides an indication of the strength and diversity of relationships an individual actor establishes in a network, it offers minimal insight into the purpose and value of those relationships. In this context, the inclusion of automated content analysis provides added insight that can determine the extent to which an individual demonstrates good participatory practice. More recently, learning analytics research has started to transition to more sophisticated methods targeting discourse, language, and affective learner attributes3.
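
The style of multivariate modelling described by Macfadyen and Dawson can be approximated, under stated assumptions, with an ordinary least squares regression over several LMS variables that reports the variance explained. The sketch below uses statsmodels and invented variable names; it is not a reproduction of their 15-variable model.

```python
# Sketch: regressing final grade on several LMS variables and reporting the
# variance explained (R^2). Variable names are hypothetical and do not
# reproduce the model reported by Macfadyen and Dawson (2010).
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("lms_variables.csv")
predictors = ["forum_posts", "forum_replies", "quizzes_completed",
              "assignments_submitted", "content_pages_viewed"]

X = sm.add_constant(df[predictors])
ols = sm.OLS(df["final_grade"], X).fit()

print(ols.summary())                          # coefficients and significance
print(f"Variance explained: {ols.rsquared:.1%}")
```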

Literacies and Learning Analytics

The inclusion of methods such as SNA and epistemic network analysis (Shaffer et al., 2009), and the detection of affective learner attributes (Baker, D’Mello, Rodrigo, & Graesser, 2010), merged with computational linguistics, moves the focus of learning analytics from the measurement of an end product towards an evaluation of the process of learning. It is unlikely such a model can be developed through a reliance on extracting student assessment data and LMS activity alone. Grades and LMS activity do not sufficiently represent the diversity of social and cultural interactions students frequently engage in, as learning is not the sole domain of formal institutions. This calls for an examination of how student engagement across multiple educational and social systems can be captured and incorporated, including more qualitative artifacts such as student discussion postings, essays, blog posts, and YouTube, Facebook, or Twitter feeds. The inclusion of social network methodologies provides a rich stream of data and an important analytical view for determining the types of relationships and extent of participation in defined communities. In this context, analytics can begin to capture the necessary insights that relate the individual to the community – or, in Jenkins’ terms, the individual’s active engagement in a participatory culture. The following section outlines possible metrics for evaluating multiliteracies. This is not to suggest that existing forms of student assessment should be replaced; rather, the section outlines additional indicators that can complement the existing suite of assessment practices in documenting an individual’s progression and, in turn, the broader demonstration of new media literacies.
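
One hedged way to operationalise this multi-source view is to consolidate each learner's activity traces from several platforms into a single record before any analysis is run. The sketch below merges hypothetical LMS, blog, and Twitter exports with pandas; the file and column names are placeholders.

```python
# Sketch: consolidating a learner's activity traces from several systems
# (LMS, blog, Twitter) into one record per student. File and column names
# are hypothetical placeholders for institution-specific exports.
import pandas as pd

lms = pd.read_csv("lms_events.csv")        # student_id, event_type, timestamp
blog = pd.read_csv("blog_posts.csv")       # student_id, post_id, word_count
tweets = pd.read_csv("course_tweets.csv")  # student_id, tweet_id, text

profile = (
    lms.groupby("student_id").size().rename("lms_events").to_frame()
    .join(blog.groupby("student_id").size().rename("blog_posts"), how="outer")
    .join(tweets.groupby("student_id").size().rename("tweets"), how="outer")
    .fillna(0)
)
print(profile.head())
```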

Using Analytics to Assess Multiliteracies

As with any kind of evaluative measurement, defining the success indicators for an outcome or process guides the selection of tools for assessment. For the purposes of this paper, Jenkins et al.’s proposed new media literacies and definitions (Table 1) are used as the basis for the learning analytics framework. Jenkins et al.’s multiliteracies have been clustered in order to refine the types of analytical data that can provide sufficient lead indicators of competency. For example, play, performance, and simulation are closely linked in terms of their affinity with problem solving processes, experimentation, and risk taking. This may be realized through activities such as gaming or role play. However, simulation also corresponds to aspects of appropriation, whereby these skills involve a form of creation or co-creation. Items such as collective intelligence, judgment, and negotiation relate to accessing, sharing, and evaluating information and resources within and across networks. The skill of distributed cognition lies at the intersection between accessing and sharing information and navigation and multi-tasking. Figure 1 presents a visualization of the associated grouping of Jenkins et al.’s (2006) classification of media literacies. The four clusters described above can be measured, monitored, and reported through a diversity of analytics and modified for the specific pedagogical context.

Figure 1. Grouping of Jenkins et al.’s (2006) new media literacies into four clusters
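
The grouping shown in Figure 1 can be expressed as a simple lookup from each cluster to its literacies and to candidate data sources; the sketch below restates the clustering described above, and the data-source suggestions are illustrative rather than prescriptive.

```python
# Sketch: the four clusters proposed in this paper, each pairing Jenkins et
# al.'s (2006) literacies with illustrative (not prescriptive) data sources.
LITERACY_CLUSTERS = {
    "experimentation": {
        "literacies": ["play", "performance", "simulation"],
        "candidate_data": ["game/simulation event logs", "virtual world interactions"],
    },
    "products_creation": {
        "literacies": ["appropriation", "simulation"],
        "candidate_data": ["artifact versions", "remix and co-creation histories"],
    },
    "network_agility_citizenship": {
        "literacies": ["collective intelligence", "judgment", "negotiation", "distributed cognition"],
        "candidate_data": ["social network structure", "automated content analysis of exchanges"],
    },
    "task_effectiveness_efficiency": {
        "literacies": ["multi-tasking", "networking", "transmedia navigation"],
        "candidate_data": ["audit trails", "search queries", "tool-switching patterns"],
    },
}
```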

Experimentation: Play, Performance, and Simulation

Experimentation can be determined through the diversity of user interactions with particular technologies. This may include online games, role playing, or the use of assigned simulations. There is a trend for educators to adopt “serious games” and simulations to support the achievement of stated learning objectives (Moreno-Ger, Burgos, Martínez-Ortiz, Sierra, & Fernández-Manjón, 2008). In this context, the use of virtual worlds provides an opportunity for learner experimentation in an immersive environment (De Freitas, Rebolledo-Mendez, Liarokapis, Magoulas, & Poulovassilis, 2010). User interactions mined from these forms of game based engagement can provide insight into the learner’s competency for play and performance. For example, the degree to which the user engages with the resources, models, pathways, and specific goal oriented outcomes provides an indication of the user’s efficacy for engaging in play activity as well as adopting alternate personas for the purposes of discovery, reflection, and perspective.
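
One hedged way to quantify this kind of engagement is to measure how evenly a learner's logged actions are spread across the interaction types a game or simulation offers. The sketch below computes a normalised Shannon entropy over a hypothetical event log; the action labels are invented.

```python
# Sketch: scoring the diversity of a learner's interactions in a game or
# simulation as normalised Shannon entropy over logged action types.
# The event log format and action labels are hypothetical.
import math
from collections import Counter

def interaction_diversity(actions):
    """Return a 0..1 score; higher means actions are spread across more types."""
    counts = Counter(actions)
    total = sum(counts.values())
    entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
    max_entropy = math.log2(len(counts)) if len(counts) > 1 else 1.0
    return entropy / max_entropy

log = ["explore", "build", "test_hypothesis", "explore", "role_switch", "build"]
print(f"Interaction diversity: {interaction_diversity(log):.2f}")
```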

Products/Creation: Appropriation and Simulation

Products/creation can be observed directly through the generation of specific artifacts such as multimedia4. As such, these products, their associated methods of creation, and the level of co-creation and engagement can be measured. In Axel Bruns’ (2008) terms, the concept of produsers defines a shift away from a model of production to a more collaborative and user-led model of creation. The concept reinforces the notion that any digital product can be remixed and repurposed, and as such is in a continual state of flux and evolution. The feedback loop from producer to consumer back to producer can be collaborative and completed within an extremely short time frame. Wikipedia illustrates this dynamic and evolving collaborative system. Essentially, the flow and utilization of products as they evolve in both social and cultural importance can act as an indicator of appropriation and simulation.
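
As a hedged illustration of tracking how a co-created artifact evolves, the sketch below walks a hypothetical revision log, in which each version records its parent version and contributor, and reports the remix depth and the number of distinct contributors.

```python
# Sketch: tracing the evolution of a co-created artifact from a hypothetical
# revision log where each version records its parent version and contributor.
revisions = {
    # version_id: (parent_version_id, contributor)
    "v1": (None, "alice"),
    "v2": ("v1", "bob"),
    "v3": ("v2", "alice"),
    "v4": ("v2", "carol"),   # a remix branching from v2
}

def remix_depth(version, revs):
    """Number of ancestor versions behind a given version."""
    depth, parent = 0, revs[version][0]
    while parent is not None:
        depth += 1
        parent = revs[parent][0]
    return depth

contributors = {who for _, who in revisions.values()}
print(f"Distinct contributors: {len(contributors)}")
print(f"Remix depth of v4: {remix_depth('v4', revisions)}")
```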

Network Agility and Citizenship: Collective Intelligence, Judgment, Negotiation, Distributed Cognition

The skills grouped into network agility/citizenship represent an individual’s level of competence and capacity for building relationships, participating in networks, and contributing to a community of learners. This cluster relates to a learner’s role, position, and contributions to the learning network. This can be readily measured via social network analysis (SNA). The integration of SNA not only provides insight into the strength and diversity of relationships formed but also the types of information or resources shared within the social system (Haythornthwaite, 2002).
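
A minimal sketch of this style of measurement follows, assuming NetworkX and a hypothetical edge list of forum reply interactions: the learner network is built from reply pairs, and each participant's degree and betweenness centrality are reported as rough indicators of the breadth of their ties and their brokerage between peers.

```python
# Sketch: building a learner interaction network from forum reply pairs
# (hypothetical data) and computing simple SNA indicators of participation.
import networkx as nx

replies = [  # (who_replied, replied_to) pairs extracted from a forum
    ("amy", "ben"), ("ben", "amy"), ("amy", "cho"),
    ("cho", "dan"), ("dan", "amy"), ("ben", "cho"),
]
G = nx.DiGraph()
G.add_edges_from(replies)

degree = dict(G.degree())                    # breadth of direct ties
betweenness = nx.betweenness_centrality(G)   # brokerage between other learners

for learner in G.nodes():
    print(f"{learner}: degree={degree[learner]}, betweenness={betweenness[learner]:.2f}")
```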

In examining the impact of network position on the flow of good ideas, Ronald Burt (2004) illustrated the value of SNA in providing insights into network position and access to information and resources. In this example, Burt noted that actors bridging two or more previously disparate network clusters demonstrate greater agility and enterprise than peers positioned within an insular network cluster. According to Burt (2004), these individuals “are able to see early, see more broadly, and translate information across groups” (p. 354). Burt sees this ‘translating’ function as value-adding creativity. This is not just because of the extent to which ‘brokers’ are able to move knowledge around in value-adding ways, but also because of their capacity to build, sustain, and expand upon their networks within and outside the existing environment. The skills these “border crossers” (McWilliam & Dawson, 2008) exhibit in order to establish these diverse networks reflect a high level of competence with digital literacies and also demonstrate good participatory practice. Hence, social network analysis can be used to assess, and provide early indicators of, a participatory culture.
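
NetworkX includes implementations of Burt's structural hole measures, so one hedged way to flag potential brokers in a learner network is to compute each actor's constraint, where lower values suggest a bridging position between otherwise disconnected clusters. The graph below is illustrative only.

```python
# Sketch: using Burt's constraint measure (available in NetworkX) to flag
# learners who bridge otherwise disconnected clusters. Lower constraint
# suggests a brokerage position; the graph is illustrative only.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("amy", "ben"), ("ben", "cho"), ("cho", "amy"),   # cluster 1
    ("dan", "eve"), ("eve", "fay"), ("fay", "dan"),   # cluster 2
    ("amy", "dan"),                                   # amy bridges the two clusters
])

constraint = nx.constraint(G)
brokers = sorted(constraint, key=constraint.get)[:2]  # least constrained actors
for learner in brokers:
    print(f"{learner}: constraint = {constraint[learner]:.2f}")
```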

Interactions in networks are often more complex than the exchange of information between two or more individuals. Multidimensional networks (Contractor, Monge, & Leonardi, 2011) reflect the more complex interactions that are formed between different technologies and people. As the use of social technologies continues to penetrate the education sector, there is an associated increase in the opportunity to extract and visualize the relationship data. When SNA or multidimensional network activity is aligned with new media literacies, an individual’s position in the network and the diversity of relationships formed serve as a measure of their capacity to form relationships and to provide and share resources and information (collective intelligence, distributed cognition), as well as to negotiate, adapt, and respect social and cultural community norms (judgment, negotiation). However, while SNA provides a robust approach for ascertaining an individual’s network agility, it yields limited information regarding the quality of the relationships and the resources accessed and shared. It is only through an examination of the quality of these exchanges that we can begin to evaluate the proficiency of an individual’s participatory practices.

The use of content analysis alongside SNA affords rich insights into the quantitative as well as qualitative aspects of a social system. Content analysis is a commonly adopted approach for determining the perceived quality of the knowledge construction process occurring within a learning network (De Laat, 2002; De Laat, Lally, Lipponen, & Simons, 2006). However, the mapping of the captured exchanges to a pre-defined coding scheme has to date been a largely manual and time consuming process. Hence, the level of integration between SNA and content analysis methods has thus far been minimal at best. The adoption of machine learning techniques and tools such as TagHelper (Rose et al., 2007) and Cohere (De Liddo, Buckingham Shum, Quinto, Bachler, & Cannavacciuolo, 2011) demonstrates the potential for automating the coding process and therefore for measuring an individual’s network agility and citizenship.
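
In the spirit of tools such as TagHelper, the sketch below trains a simple text classifier (scikit-learn, TF-IDF features) on a small set of hand-coded forum posts and applies it to a new post; the coding scheme, labels, and example posts are invented for illustration and do not reproduce any published coding instrument.

```python
# Sketch: automating a content analysis coding scheme with a simple text
# classifier, in the spirit of tools such as TagHelper. The coding labels
# and example posts are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "I think we should compare our two solutions and merge the best parts",
    "Here is the reference I found, it explains the concept well",
    "I disagree, the evidence points the other way, let's test it",
    "Can someone share their notes from the last session?",
]
codes = ["negotiation", "sharing", "negotiation", "sharing"]  # hand-coded labels

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(posts, codes)

print(classifier.predict(["Let's weigh up both proposals before deciding"]))
```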

Task Effectiveness and Efficiency: Multi-Tasking, Networking, Trans-Media Navigation

Measures of task effectiveness and efficiency relate specifically to the choices and methods used for achieving goals and outcomes, as well as overall comprehension. For instance, transmedia navigation requires a level of proficiency for interpreting and understanding different representations of information or social and cultural icons and artifacts across and within multiple domains. Networking refers to an individual’s competency with various tools and methods for searching, synthesizing, and disseminating information. These characteristics imply a high level of competence and efficiency with accessing tools and resources. As such, tool selection, search data and techniques, and levels of engagement provide the analytics data needed to evaluate the characteristics underpinning this cluster. Kennedy and Judd (Kennedy & Judd, 2004; Judd & Kennedy, 2011) examined digital audit trails of students’ activity within various technologies in order to identify specific patterns of behavior. The authors concluded that the audit trails provided significant interpretive power regarding an individual’s learning behavior, search process, and tool selection. By further incorporating a level of semantic analysis it is possible to also establish a user’s search term technique based on specific requests and tasks. These forms of analysis provide a measure of a user’s competency with information evaluation and use in different settings and across various media.
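
A hedged illustration of this audit trail style of analysis: from a hypothetical, time-ordered clickstream, count how often a learner switches between tools and how long each episode of focused activity lasts, as a simple proxy for multi-tasking behaviour.

```python
# Sketch: analysing a hypothetical time-ordered audit trail to count tool
# switches and measure episodes of focused activity (a crude proxy for
# multi-tasking and tool selection behaviour).
from itertools import groupby

trail = ["lms", "lms", "search", "search", "wiki", "lms", "search", "search"]

episodes = [(tool, len(list(events))) for tool, events in groupby(trail)]
switches = len(episodes) - 1

print(f"Tool switches: {switches}")
for tool, length in episodes:
    print(f"  {tool}: {length} consecutive event(s)")
```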

Table 2

The media literacies described are still reliant upon an individual student demonstrating a certain level of competency in the more traditional literacies (Jenkins et al., 2006). A student who has limited reading and writing proficiency will continue to struggle with new media literacies. As depicted in Figure 1, the traditional literacies are the foundation for all other literacies. These basics can be evaluated through automated content analysis. Textual passages can be extracted from student activity in blogs, wikis, traditional writing assignments, discussion forums, and even Twitter posts. However, while these data can be mined and analysed, an understanding of the learning design is essential for establishing meaningful indicators and assessment of an individual’s proficiency within one or more literacies (Lockyer, Heathcote, & Dawson, 2013).
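
One hedged, automatable indicator for these foundational literacies is a readability score computed over text extracted from a learner's posts or assignments. The sketch below implements the standard Flesch Reading Ease formula with a crude syllable heuristic, so its output should be treated as an approximation rather than a validated assessment.

```python
# Sketch: a rough automated readability indicator (Flesch Reading Ease) over
# text extracted from a learner's posts or assignments. The syllable count
# is a crude heuristic, so scores are approximate.
import re

def count_syllables(word):
    """Crude syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

sample = "The learner posted a short reflection. It summarised two readings clearly."
print(f"Flesch Reading Ease: {flesch_reading_ease(sample):.1f}")
```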

Learning analytics and educational data mining frameworks have been created to describe the range of activity, in terms of techniques and applications, undertaken by researchers working with learning-related data (Baker & Yacef, 2009; Bienkowski, Feng, & Means, 2012; Siemens, 2013). These frameworks reflect a sequential maturing and refinement of analytics work, whereby the more recent work undertaken by Siemens incorporates and builds upon the earlier studies. Siemens’ framework comprises two related components: analytics techniques and applications. Analytic techniques include modeling, relationship mining, and knowledge domain mapping. Analytic applications involve prediction, personalization and adaptive learning, and structured mapping. Mapping the four proposed multiliteracy clusters shown in Figure 1 (experimentation, products/creation, network agility and citizenship, and task effectiveness and efficiency) to an existing framework of learning analytics techniques and applications (Siemens, 2013) provides the analytics opportunities described in Table 3.

Table 3. Mapping of the four multiliteracy clusters to learning analytics techniques and applications (Siemens, 2013)

Conclusion

With growing interest in data and analytics in the education sector, it is important for researchers, educators, learners, and administrators to have tools and techniques that go beyond surface-level analytics. The complexity of a social process in learning cannot be adequately assessed through basic metrics such as logins, time online, and clicks. Multiliteracies draw attention to the skills and attributes learners require to navigate the increasingly complex technical, social, cultural, and economic worlds. However, traditional models of standardized assessment do little to either promote or effectively measure these multiliteracies (I. Brown et al., 2010; Kalantzis et al., 2003).

This paper builds upon established theoretical models to demonstrate the role learning analytics can play in providing educators and students with real-time feedback and analytics for evaluating literacies. The significance of the model thus resides in its capacity to provide deep and nuanced insight into the learning activities of students and to bridge the boundary between multiliteracies and learning analytics. In this context, alternate and diverse assessment techniques and instruments are necessary to better align with and reflect the technical and information complexity and the multimodal learning that form the core of 21st century education.

References

Anstey, M., & Bull, G. (2006). Teaching and learning multiliteracies: Changing times, changing literacies. Australia: Curriculum Press.

Arnold, K.E. (2010). Signals: Applying academic analytics. EDUCAUSE Quarterly Magazine, 33(1). Retrieved from http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/SignalsApplyingAcademicAnalyti/199385

Baker, R.S.J.d, D’Mello, S.K., Rodrigo, M.M.T, & Graesser, A.C. (2010). Better to be frustrated than bored: The incidence, persistence, and impact of learners’ cognitive–affective states during interactions with three different computer-based learning environments. International Journal of Human-Computer Studies, 68(4), 223-241.

Baker, R.S.J.d, & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining, 1(1), 3-17.

Balfanz, R., Bridgeland, J.M., Moore, L.A., & Fox, J.H. (2010). Building a grad nation: Progress and challenge in ending the high school dropout epidemic. Retrieved from http://www.americaspromise.org/Our-Work/Grad-Nation/Building-a-Grad-Nation.aspx.

Bienkowski, M., Feng, M., & Means, B. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. US Department of Education, Office of Educational Technology, 1-57.

Biswas, G., Jeong, H., Kinnebrew, J., Sulcer, B., & Roscoe, R. (2010). Measuring self-regulated learning skills through social interactions in a teachable agent environment. Research and Practice in Technology-Enhanced Learning (RPTEL), 5(2), 123-152.

Boud, D. (2000). Sustainable assessment: Rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151-167.

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008). Review of Australian higher education: Final report. Canberra: Department of Education, Employment and Workplace Relations.

Brown, I., Lockyer, L., & Caputi, P. (2010). Multiliteracies and assessment practice. In D. R. Cole & D. R. Pullen (Eds.), Multiliteracies in motion: Current theory and practice (pp. 191-206). New York: Routledge.

Brown, J.S., & Adler, A.P. (2008). Minds on fire: Open education, the long tail, and learning 2.0. EDUCAUSE Review, 43(1), 16-32.

Bruns, A. (2008). Blogs, Wikipedia, Second life, and beyond: From production to produsage. New York: Peter Lang Publisher.

Buckingham Shum, S., & Ferguson, R. (2012). Social learning analytics. Educational Technology & Society, 15(3), 3-26.

Burt, R. (2004). Structural holes and good ideas. The American Journal of Sociology, 110(2), 349-399.

Campbell, J., De Blois, P.B., & Oblinger, D. (2007). Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4), 42-57.

Coiro, J., Knobel, M., Lankshear, C., & Leu, D. J. (Eds.). (2014). Handbook of research on new literacies. New York: Routledge.

Contractor, N.S., Monge, P.R., & Leonardi, P.M. (2011). Multidimensional networks and the dynamics of sociomateriality: Bringing technology inside the network. International Journal of Communication, 5, 682-720.

Cope, B., & Kalantzis, M. (2000a). Multicultural Education - an equity framework: South Australian Department of Education Curriculum Standards and Accountability Framework. Adelaide: South Australia Department of Education.

Cope, B., & Kalantzis, M. (Eds.). (2000b). Multiliteracies: Learning and the design of social futures. London: Routledge.

Csikszentmihalyi, M. (2006). Foreword: Developing creativity. In N. Jackson, M. Oliver, M. Shaw & J. Wisdom (Eds.), Developing creativity in higher education: An imaginative curriculum (pp. xviii-xx). London: Routledge.

Dator, J. (2005). Universities without “quality” and quality without “universities”. On the Horizon, 13(4), 199-215.

Dawson, S. (2006). Relationship between student communication interaction and sense of community in higher education. Internet and Higher Education, 9(3), 153-162.

Dawson, S. (2010). ‘Seeing’ the learning community: An exploration of the development of a resource for monitoring online student networking. British Journal of Educational Technology, 41(5), 736–752.

Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014). Current state and future trends: A citation network analysis of the learning analytics field. Paper presented at the Fourth International Conference on Learning Analytics And Knowledge (LAK ‘14), Indianapolis, USA.

Dawson, S., McWilliam, E., & Tan, J. (2011). Measuring creative potential: Using social network analysis to monitor and develop learners’ creative capacity. Australasian Journal of Educational Technology, 27(6), 924-942.

De Freitas, S., Rebolledo-Mendez, G., Liarokapis, F., Magoulas, G., & Poulovassilis, A. (2010). Learning as immersive experiences: Using the four-dimensional framework for designing and evaluating immersive learning experiences in a virtual world. British Journal of Educational Technology, 41(1), 69-85.

De Laat, M. (2002). Network and content analysis in an online community discourse. Paper presented at the Computer-Supported Collaborative Learning, Boulder, Colorado.

De Laat, M., Lally, V., Lipponen, L., & Simons, R. J. (2006). Analysing student engagement with learning and tutoring activities in networked learning communities: A multi-method approach. International Journal of Web Based Communities, 2(4), 394-412.

De Liddo, A., Buckingham Shum, S., Quinto, I., Bachler, M., & Cannavacciuolo, L. (2011). Discourse-centric learning analytics. Paper presented at the Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, Canada.

Duval, E. (2011). Attention please! Learning analytics for visualization and recommendation. Paper presented at the Proceedings of 1st International Conference on Learning Analytics and Knowledge, Banff, Canada.

Edwards, R. (1997). Changing places? Flexibility, lifelong learning and a learning society. London: Routledge.

Fournier, H., Kop, R., & Sitlia, H. (2011). The value of learning analytics to networked learning on a personal learning environment. Paper presented at the 1st International Conference on Learning Analytics and Knowledge Banff, Alberta, Canada.

Fritz, J. (2011). Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers. The Internet and Higher Education, 14(2), 89-97.

Garrison, D.R. (1997). Self-directed learning: Toward a comprehensive model. Adult Education Quarterly, 48(1), 18-33.

Gasevic, D., Adesope, O., Joksimovic, S., & Kovanovic, V. (submitted). Externally-facilitated regulation scaffolding and role assignment to develop cognitive presence in asynchronous online discussions. The Internet and Higher Education.

Gee, J.P. (2007). Good video games + good learning: Collected essays on video games, learning and literacy. New York: Peter Lang.

Gee, J.P. (2009). Reflections on reading Cope and Kalantzis’ “Multiliteracies: New literacies, new learning”. Pedagogies: An International Journal, 4, 196-204.

Greenhow, C., & Robelia, B. (2009). Old communication, new literacies: Social network sites as social learning spaces. Journal of Computer-Mediated Communication, 14(4), 1130-1161.

Haythornthwaite, C. (2002). Building social networks via computer networks: Creating and sustaining distributed learning communities. In K. A. Renninger & W. Shumar (Eds.), Building virtual communities: Learning and change in cyberspace (pp. 159-190). New York: Cambridge University Press.

Haythornthwaite, C., & Andrews, R. (2011). E-Learning theory and practice. London, UK: Sage Publications.

Huijser, H. (2006). Refocusing multiliteracies for the net generation. International Journal of Pedagogies & Learning, 2(1), 22-34.

Jackson, N. (2006). Imagining a different world. In N. Jackson, M. Oliver, M. Shaw & J. Wisdom (Eds.), Developing creativity in higher education: An imaginative curriculum. London: Routledge.

Jenkins, H., Clinton, K., Purushotma, R., Robinson, A. J., & Weigel, M. (2006). Confronting the challenges of participatory culture: Media education for the 21st century. Chicago, IL: MacArthur Foundation.

Judd, T., & Kennedy, G. (2011). Measurement and evidence of computer-based task switching and multitasking by ‘Net Generation’ students. Computers & Education, 56(3), 625-631. doi: 10.1016/j.compedu.2010.10.004

Kalantzis, M., Cope, B., & Harvey, A. (2003). Assessing multiliteracies and the new basics. Assessment in Education: Principles, Policy & Practice, 10(1), 15-26.

Kalman, J. (2008). Beyond definition: Central concepts for understanding literacy. International Review of Education, 54(5/6), 532-538.

Kennedy, G., & Judd, T. (2004). Making sense of audit trail data. Australian Journal of Educational Technology, 20(1), 18-32.

Kress, G. (2003). Literacy in the new media age. London: Routledge.

Lankshear, C., & Knobel, M. (2003). New literacies: Changing knowledge in the classroom. Buckingham, UK: Open University Press.

Lankshear, C., & Knobel, M. (2007). A new literacies sampler. In M. Knobel & C. Lankshear (Eds.), Sampling “the new” in new literacies (Vol. New literacies and digital epistemologies, pp. 1-24). New York: Peter Lang.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.

Leadbeater, C., & Wong, A. (2009). Learning from the extremes: Strategies for radical social innovation. Cisco. Retrieved from http://www.cisco.com/web/about/citizenship/socio-economic/docs/LearningfromExtremes_WhitePaper.pdf

Literat, I. (2014). Measuring New Media Literacies: Towards the development of a comprehensive assessment tool. Journal of Media Literacy Education, 6(1), 15-27.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439-1459.

Macfadyen, L., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588-599.

Macfadyen, L., & Dawson, S. (2012). Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan. Educational Technology & Society, 15(3), 149-163.

McKinsey Global Institute. (2011). Big data: The next frontier for innovation, competition, and productivity. Retrieved from http://www.mckinsey.com/Insights/MGI/Research/Technology_and_Innovation/Big_data_The_next_frontier_for_innovation.

McWilliam, E., & Dawson, S. (2008). Pedagogical practice after the Information Age. In S. Inayatullah, I. Milojevic & S. Bussey (Eds.), Alternative educational futures: Pedagogies for emergent worlds (pp. 130-144). The Netherlands: Sense Publishers.

Ministry of Education. (1998). Learning to think, thinking to learn: Towards thinking schools, learning nation. Singapore: Ministry of Education.

Moreno-Ger, P., Burgos, D., Martínez-Ortiz, I., Sierra, J. L., & Fernández-Manjón, B. (2008). Educational game design for online education. Computers in Human Behavior, 24(6), 2530-2540.

Muspratt, A., Luke, A., & Freebody, P. (Eds.). (1998). Constructing critical literacies: Teaching and learning textual practice. Cresskill, NJ: Hampton.

New London Group. (1996). A pedagogy of multiliteracies: Designing social futures. Harvard Educational Review, 66, 60-92.

Norris, D., Baer, L., Leonard, J., Pugliese, L., & Lefrere, P. (2008). Action analytics: Measuring and improving performance that matters in higher education. EDUCAUSE Review, 43(1), 42-67.

Pardo, A., & Kloos, C.D. (2011). Stepping out of the box: Towards analytics outside the learning management system. Paper presented at the Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, Canada.

Pozzi, F., Manca, S., Persico, D., & Sarti, L. (2007). A general framework for tracking and analysing learning processes in computer-supported collaborative learning environments. Innovations in Education and Teaching International, 44(2), 169-179.

Rose, C., Wang, Y., Cui, Y., Arguella, J., Stegmann, K., Weinberger, A., & Fischer, F. (2007). Analyzing collaborative learning processes automatically: Exploiting the advances of computational linguistics in computer-supported collaborative learning. International Journal of Computer-Supported Collaborative Learning, 3(3), 237-272.

Shaffer, D. W., Hatfield, D., Svarovsky, G. N., Nash, P., Nulty, A., Bagley, E., . . . Mislevy, R. (2009). Epistemic network analysis: A prototype of 21st-century assessment of learning. International Journal of Learning and Media, 1(2), 33-53.

Siemens, G. (2005). Connectivism: A learning theory for the digital age. International Journal of Instructional Technology and Distance Learning, 2(1), 3-10.

Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380-1400.

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 30-40.

U.S. Department of Education Office of Educational Technology. (2010). Transforming American education: Learning powered by technology. Washington, D.C.

UNESCO Education Sector. (2004). The plurality of literacy and its implications for policies and programs: Position paper. Paris: UNESCO. Retrieved from http://unesdoc.unesco.org/images/0013/001362/136246e.pdf.

Unsworth, L. (2001). Teaching multiliteracies across the curriculum: Changing contexts of text and image in classroom practice. Buckingham: Open University Press.

Warschauer, M. (2007). The paradoxical future of digital learning. Learning Inquiry, 1(1), 41-49. doi: 10.1007/s11519-007-0001-5

Watson, J., & Gemin, B. (2008). Using online learning for at-risk students and credit recovery. Washington, DC: International Council for K-12 Online Learning.

Winne, P. H., & Hadwin, A. F. (2013). nStudy: Tracing and supporting self-regulated learning in the Internet. In R. Azevedo & V. Aleven (Eds.), International handbook of metacognition and learning technologies (pp. 293-308). Amsterdam: Springer.


1 Learning and Knowledge Analytics (2011): Knewton – the future of education? http://www.learninganalytics.net/?p=126

2 Purdue University Course Signals: http://www.itap.purdue.edu/learning/tools/signals/

3 http://machineanalytics.org/schedule/

4 See for example http://ds106.us/ as an artifact-creating online community