Troy University, USA
B. Jean Mandernach
Park University, USA
In contrast with traditional academic disciplines, online educators do not have a generally accepted list of scholarly journals, owing in part to the multidisciplinary nature of the field, the relative infancy of online learning, and the view of online pedagogy as an instructional modality rather than a discrete academic discipline. The purpose of this study is to determine a comprehensive listing and relative value ranking of scholarly journals whose content informs online educators and motivates scholarship. After defining the scope of investigation to target peer-reviewed, scholarly journals with an explicit focus on computer-mediated learning (e.g., virtual, electronic, distance, distributive, mobile, and blended learning), we identified 46 scholarly journals that advance the knowledge base in computer-mediated learning. Popularity, importance, prestige, and overall rankings for each journal are presented. The results inform online educators about the range of scholarly journals available and provide insight into the relative value of journals devoted to computer-mediated learning.
Keywords: journal ranking; computer-mediated learning; online educators; e-learning; distance learning; distributed learning; virtual learning; mobile learning; open learning; blended learning.
The exponential growth of online learning has sparked widespread scholarly interest in issues related to online teaching and learning. This has produced a host of scholarly journals exploring issues such as best practices in online teaching, learning theories relevant to online education, and the interaction between technology and education. Nonetheless, little is known empirically about the quality of these publications or their relative worth to online teachers, learners, and scholars. Unlike established academic disciplines, online education has neither a definitive list of such journals nor a value ranking of them.
Identifying journals relevant to educators involved in online learning together with value rankings of the journals critically advances the knowledge base for online educators by offering an efficient way to identify journals for their particular instructional and/or scholarly activities. The purposes of this study are outlined as follows: (a) define the scope of computer-mediated learning; (b) identify scholarly journals devoted to computer-mediated learning; and (c) rank the journals independently and collectively based upon popularity, importance, and prestige.
Bibliometric assessment of scholarly journals seems a never-ending search to definitively rank journals. According to Mason et al. (1997) and Kim (1991), there are two general approaches to journal ranking. The first is based on citations, which have been found to return inconsistent results (Baumgartner & Pieters, 2003; Jobber & Simpson, 1988). For example, Hascall, Bollen, and Hanson (2007) report that the Journal of Biological Chemistry was ranked first by PageRank, developed by Brin and Page (1998), and 180th by the impact factor, which is based on the Institute for Scientific Information's (ISI) citation database. Also, review articles (versus original research) inflate a journal's impact factor (Bollen et al., 2006), an issue that challenges the validity of the ranking. The second approach to journal ranking examines perceptions of style or content (e.g., Mylonopoulos & Theoharakis, 2001; Nisonger, 1999; Hult, Neese, & Bashaw, 1997; Luke & Doke, 1987). This form of journal ranking may be inaccurate due to the limited geographic scope of survey respondents, personal publication history, exposure, and other non-relevant perceptual biases.
To date, most ranking schemas align strictly with either a citation or perception approach. Alternative ranking options suggest utility in combining citation and perception data into an integrated system. For example, Rice and Stankus (1983) suggest a combination to include impact factor, manuscript acceptance rate, journal sponsorship, mission, and audience. Parnell (1997) limits journal quality to expert opinion surveys, citation counts, and a combination of both variables. Other studies have considered complex variables, such as accessibility (Polonsky et al., 1999), ethnocentricity (Czinkota, 2000), international involvement of editorial boards and article content (Rosenstreich & Wooliscroft, 2006), global dispersion of authorship (Polonsky et al., 2006), and inadequacy of ranking schemas for open access publications (Elbeck & Mandernach, 2008).
Online education reflects a highly diversified growth industry that represents all academic disciplines and uses a relatively recent set of technologies, which makes classification of the discipline a challenge. There are ten general categories relevant to online education, which are listed below:
The various terms represent nuanced perspectives of the multi-disciplinary nature and relatively recent evolution of online learning as a discipline. To help consolidate the various interdependent terms, we offer the term computer-mediated learning (CML), which we define in the following way: Computer-mediated learning occurs when an individual interactively learns (formally or informally, synchronously or asynchronously) about material via computer means where the learning materials and pedagogy are developed to take advantage of the available technologies.
The fundamental goal of CML is to remove the barriers of time and place in the facilitation of learning. The interactive learning relationship empowers students with control over (a) when and what they view, hear, or read, (b) the pace of their learning, and (c) requests for additional information from other student(s) or instructor(s) via the same or other media. Further, the medium to learn is any technology-based conduit connecting instructors and/or educational materials with students, which may (a) change in nature over time (e.g., from personal computer to podcast), (b) include non-electronic interventions (blended learning), and (c) facilitate instructor-student interaction both in real time (synchronously) and in different times (asynchronously). The extent of formality ranges from formal learning in a class setting to informal learning (browsing, surfing), such as lifetime learning, which may be incidental to, or a complementary feature of, formal learning.
As indicated by the definition of CML, a wide array of instructional strategies, pedagogies, and approaches is relevant to this study. A comprehensive list of potentially relevant computer-mediated learning journals was compiled using the following sources:
This search generated 154 publications. To be eligible for inclusion, a publication had to be a scholarly journal. A review of definitions of scholarly journal from Wikipedia (http://en.wikipedia.org/wiki/Scholarly_journal), the Cal Poly Pomona Library (2009), and the Cornell University Library (2009) yielded the following definition: to be considered a scholarly journal, a publication must be peer-reviewed, cite all source material, return manuscript acceptance rates below 100%, and solicit original work; manuscripts may include original research articles, literature reviews, and book reviews. We further refined the definition by limiting journal selection to peer-reviewed journals that contain original articles.
As such, the following types of publications were not included in the final list of journals: (a) practitioner non-scholarly magazines (e.g., eLearn Magazine); (b) review only journals as they are limited to previously published material (e.g., eLearning Reviews); (c) newsletters (e.g., e-Learning Newsletter, Online Classroom); (d) blogs (e.g., The e-Learning Review); (e) journals no longer in publication (e.g., International Journal of Educational Technology); (f) publications limited in scope (e.g., Teaching with Technology Today is limited to University of Georgia system faculty and students); and (g) peer-reviewed conference proceedings (e.g., Annual Instructional Technology Conference at Middle Tennessee State University).
Based on the preceding exclusions, a content evaluation of each of the remaining 73 publications was conducted by examining a sample of articles (two from 2008 and two from 2007) and journal descriptions to screen for peer-reviewed journals containing original research articles. This process resulted in a list of 46 scholarly journals, of which 26 (56%) are journals from outside the US, reflecting the discipline’s world-wide scope in terms of readership and scholarship.
To refine the journal collection of 46 publications, we developed four ranking schemas to help online educators make informed judgments about each journal. The schemas reflect journal popularity, importance, prestige, and an overall ranking.
A useful metric to assess the popularity of a journal’s website is links from other websites. This metric is known as in-links, which are links from other websites to at least one page inside a journal’s website. Work comparing citations with URLs started with Larson (1996), followed by studies dismissing (Harter & Ford, 2000; Meric et al., 2002) and supporting (Vaughan & Hysen, 2002; An & Qiu, 2003) a relationship between impact factor and a site’s in-link count. A reasonable position implied by Vaughan and Thelwall (2003) is that in-link counts measure impact beyond scholarly impact to a wider audience of students, practitioners, and other interested parties. In-link counts are provided by various commercial search engines. As of December 2008, the most popular search engines and their shares of searches (R&R Web design, 2008) are Google (78.99%), Yahoo (11.46%) and MSN (3.15%). In-link counts from Google, AlltheWeb, AltaVista and MSN for the home page URL of each CML journal were supplied by the online links counter service, CheckSEO (2009).
Google’s PageRank algorithm is used to define a web page’s relevance or importance (Rogers, 2002) such that a link to a page represents support for that page. Chen, Xie, Maslov, and Redner (2007) suggest that the PageRank result favors more important links and devalues unimportant links. The online service Top25Web (2009) generated the PageRank values, which range from 0 to 10. According to Rogers (2002, p. 2), the PageRank scale is logarithmic (0 = 0 to 10; 1 = 10 to 100; etc.). It is therefore prudent to consider each journal’s PageRank score as representing a cohort of similarly scored journals.
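The cohort interpretation of the logarithmic scale can be sketched in code. The mapping below assumes Rogers's (2002) reading that each displayed PageRank integer spans one order of magnitude of underlying link-based score; it is an approximation for illustration, not Google's published algorithm.

```python
def pagerank_cohort(pagerank: int) -> tuple[int, int]:
    """Hypothetical range of underlying scores represented by a displayed
    PageRank value, assuming each integer step spans one order of
    magnitude (0 = 0 to 10; 1 = 10 to 100; and so on)."""
    low = 0 if pagerank == 0 else 10 ** pagerank
    high = 10 ** (pagerank + 1)
    return (low, high)

# Two journals both displaying a PageRank of 7 belong to one cohort,
# even if their underlying scores differ by nearly a factor of ten.
print(pagerank_cohort(7))  # (10000000, 100000000)
```

This is why the text treats journals with equal PageRank scores as a single cohort rather than attempting a finer ordering among them.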
The final construct we measured is prestige based on the perceptions of peers. The resulting experience survey (also known as a key informant survey) taps the knowledge and experience of scholarly CML journal editors familiar with the relative prestige among scholarly CML journals. Using e-mail, each of the 46 CML journal editors was invited to answer the question, “which are the top 5 scholarly journals in the field of online education?”
The final ranking schema uses indexes for each of the popularity, importance, and prestige ranking to combine them into a single world-wide ranking of all 46 journals.
The number of links from other sites to each journal's home page is not a stable metric (Jacso, 2005; McCown & Nelson, 2007). Experimentation with in-links data returned valuation fluctuations of up to 10% over a 48-hour period, together with remarkable in-link differences for journals with more than one home page URL. For these reasons, data collection was conducted on January 28, 2009, in a constrained time period from 14:00 to 15:00 EST. When a journal had both a home page provided by the publisher and one provided by the journal's sponsor, the URL with the higher in-link count was selected.
For each search engine, the total number of in-link counts for each of the 46 journals was tabulated, and each journal's percentage share of in-link counts by search engine was calculated. Given that Yahoo owns AlltheWeb and AltaVista (Search Engine Watch.com, 2007), the percentage shares for these engines were averaged to represent each journal's relative share of the Yahoo in-link counts. To arrive at a realistic overall multi-search-engine in-link ranking, each search engine's percentage of global searches (R&R Web design, 2008) was used to weight its counts, resulting in weights of 25.0 for a Google in-link score, 3.6 for a Yahoo in-link score, and 1.0 for an MSN in-link score. The results in Table 1 show each journal's website's relative popularity, as measured by links from both scholarly and non-scholarly websites.
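The weighting procedure can be sketched as follows. The search-engine weights (25.0, 3.6, 1.0) come from the text; the journal names and raw in-link counts are hypothetical placeholders.

```python
# Weights derived from each engine's share of global searches (text).
WEIGHTS = {"google": 25.0, "yahoo": 3.6, "msn": 1.0}

# Raw in-link counts per journal per search engine (hypothetical).
counts = {
    "Journal A": {"google": 1200, "yahoo": 900, "msn": 300},
    "Journal B": {"google": 800, "yahoo": 400, "msn": 100},
}

def weighted_share(counts, weights):
    # 1) Each journal's percentage share of in-link counts per engine.
    totals = {e: sum(j[e] for j in counts.values()) for e in weights}
    shares = {
        name: {e: 100.0 * c[e] / totals[e] for e in weights}
        for name, c in counts.items()
    }
    # 2) Combine per-engine shares using the global-search weights.
    wsum = sum(weights.values())
    return {
        name: sum(weights[e] * s[e] for e in weights) / wsum
        for name, s in shares.items()
    }

for name, share in sorted(weighted_share(counts, WEIGHTS).items(),
                          key=lambda kv: -kv[1]):
    print(f"{name}: {share:.2f}%")
```

Because the per-engine shares each sum to 100%, the weighted shares also sum to 100%, so the resulting figures are directly comparable across journals.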
To test the similarity of the search engine in-link counts, Table 2 presents a correlation matrix showing largely uncorrelated relationships among the various search engine in-link counts, with the exception of a strong relationship between Google and AlltheWeb (r = .678, p < .01). Clearly, each search engine computes in-link counts differently with respect to the magnitude of the counts and the source of the incoming links.
PageRank scores in Table 3 refine the in-link count ranks shown in Table 1 by emphasizing the quality of the link sources, analogous to an emphasis on links from scholarly websites (PageRank) versus all websites (in-link count). Unlike the reported volatility of in-links data used to rank journals by popularity, PageRank values seemed to be stable, as observed over a five-day period. Data collection took place on January 28, 2009.
Of the 46 journal editors invited, 31 responded (a 67% response rate; 17 from the United States, two each from Australia, Canada, and the United Kingdom, and one each from Austria, Germany, Lithuania, New Zealand, Spain, Turkey, and the United Arab Emirates) to the question, “which are the top 5 scholarly journals in the field of online education?” Eight editors offered no opinion, leaving a usable response rate of 50% from 23 editors. The request was unguided, given that the editors did not know which journals made up the list for this study. Not all editors submitted five journal titles; their total of 92 prestige votes makes up the list of journals ranked by prestige, as shown in Table 4.
Table 5 combines the popularity, importance, and prestige rankings as indexes to produce a final overall journal ranking. Each index was computed as the journal’s score divided by the highest score for that particular ranking. For example, the prestige index for Distance Education = 7.61/10.87 = 0.70.
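The indexing rule can be sketched as a small function. Only the Distance Education prestige score (7.61) and the top score (10.87) come from the text's worked example; the other journal name is a placeholder.

```python
def index_scores(scores: dict[str, float]) -> dict[str, float]:
    """Index each journal's score against the highest score in that
    ranking: index = score / max(score)."""
    top = max(scores.values())
    return {name: s / top for name, s in scores.items()}

# Hypothetical prestige scores; 7.61 and the maximum 10.87 follow the
# worked example in the text.
prestige = {"Top-Ranked Journal": 10.87, "Distance Education": 7.61}
idx = index_scores(prestige)
print(f"{idx['Distance Education']:.2f}")  # 0.70
```

Dividing by the maximum normalizes each ranking to a 0-to-1 scale, which lets the three otherwise incommensurable measures (in-link shares, PageRank values, and vote counts) be combined into one overall index.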
The correlation matrix which tests the relationship between the various ranking indexes is presented in Table 6, showing significant correlations between the overall index and the other three, though stronger for importance (r = .719, p < .01) and prestige (r = .674, p < .01) indexes. There are no significant correlations between the three ranking indexes of popularity, importance, and prestige.
Results in Table 1 rank eLearning Papers as the most popular journal, with a 17.81% weighted share of in-link counts. The 90th percentile contains five journals, each with over a 4.85% share of all CML journal links. A journal’s in-link count share is analogous to market share, which invites the application of a concentration ratio (analogous to the four-firm concentration ratio, C4); the combined in-link count share of the top five journals is 43.59%. According to Wikipedia (http://en.wikipedia.org/wiki/Concentration_ratio), a ratio of this size describes monopolistic competition (Grewal & Levy, 2010): a relatively large number of journals, many readers, and competition based on subtle product differentiation among the journals. The competitive response for journal publishers and editors is to focus on content quality and widespread availability (a benefit of open access journals) to strengthen the journal’s popularity.
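The concentration-ratio calculation can be sketched as follows. Only the top share (17.81%) comes from Table 1; the remaining shares are hypothetical, chosen so that the top-five sum approximates the 43.59% reported in the text.

```python
def concentration_ratio(shares: list[float], n: int) -> float:
    """Combined percentage share of the top-n items, i.e., the n-firm
    concentration ratio (C4 when n = 4)."""
    return sum(sorted(shares, reverse=True)[:n])

# Hypothetical per-journal in-link shares; only 17.81% is from Table 1.
shares = [17.81, 7.5, 6.6, 6.1, 5.58, 4.2, 3.1, 2.9]
print(f"top-5 concentration = {concentration_ratio(shares, 5):.2f}%")
```

Sorting before summing means the function works on the full share list without requiring the top journals to be identified in advance.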
The six top-ranked journals in Table 3 each show a PageRank score of 7; compared with other commonly accepted quality journals, this metric indicates relative importance. For comparative purposes, the PageRank scores for Econometrica (http://www.econometricsociety.org/) and Harvard Law Review (http://www.harvardlawreview.org/) are also 7. Across the popularity and importance rankings, two journals share the upper ranks in both categories: the International Review of Research in Open and Distance Learning and the Online Journal of Distance Learning Administration. Clearly, these two CML journals are very well known by virtue of the number and quality of links from other websites.
As with any set of computations, a caveat is in order. The two rankings of journal popularity and importance may be influenced positively by the age of the journal’s website and the relative amount of online content (Vaughan & Thelwall, 2003). With this consideration, one is cautioned to monitor these rankings over time.
As shown in Table 4, there are five journals in the 90th percentile, each with over 6% of the total number of editor votes: three from the US and one each from Canada and the UK. The journal ranked first for prestige is the International Review of Research in Open and Distance Learning.
The top five out of 46 scholarly CML journals are the International Review of Research in Open and Distance Learning, Journal of Asynchronous Learning Networks, eLearning Papers, Innovate: Journal of Online Education, and American Journal of Distance Education. Based upon popularity, importance, and perceptions of prestige, these journals represent the gold standard of quality and utility for online educators.
Table 6 reports no significant correlations between the three ranking indexes of popularity, importance, and prestige, suggesting that each index presents a unique perspective, which contributes to the robustness of the overall ranking index and allows for a valuable overall ranking of journals.
A challenge we faced when creating a list of CML scholarly journals centered on finding enough information sources, because each new source offered diminishing returns. The ranking of journals by prestige was based on a single-item perception from editors in the field. Whilst single-item perceptual rankings do capture complex individual knowledge, we would be more comfortable with multiple variables to create multi-dimensional constructs for each ranking schema and so clearly delineate between popularity, importance, prestige, and other dimensions of journal criteria, as others have suggested (Baumgartner & Pieters, 2003; Shugan, 2003).
In addition, any study on journal ranking is fraught with opinions about face validity; specifically, one can challenge the operational definitions of the constructs under investigation. This study will undoubtedly generate varying perspectives concerning the integrity of the selected journals; maintaining the list is, in itself, a never-ending task given the growth and change computer-mediated learning is experiencing. The challenge is evident in the demise of over 5% of the journals listed in various scholarly and professional websites. For example, several international journals (e.g., the Indian Journal of Open Learning, the Brazilian Review of Open and Distance Learning, and the Malaysian Journal of Distance Education) have not published an issue in over 12 months.
The results of this study provide insight into current and emerging leaders in scholarly publishing for the field of computer-mediated learning. This information is essential to guide online educators seeking quality information concerning pedagogy, best practices, and scholarly developments in the field. Whether one is investigating CML journals for the purposes of advancing their teaching and course design or as an outlet to disseminate their own scholarly investigations, the results of this study serve as a heuristic for effective, efficient journal selection.
We hope this study will encourage like-minded scholars to design and publish rigorous studies addressing journal selection and ranking, so as to arrive at a generally accepted list of CML journals and, from this, to develop a citation database to complement in-link counts, PageRank, expert opinions, and other dimensions offering alternative approaches to ranking the journals, consistent with positions held by Bollen et al. (2006). A generally accepted list and ranking of scholarly CML journals will evolve over time as convergent validity is established from a critical mass of such studies. Naturally, the relative youth and remarkable growth of CML as a discipline makes any decision about absolute journal ranking somewhat tentative. That is, definitively suggesting one journal is ‘better’ than another is premature: a universally applicable journal ranking may not be possible (Polonsky & Whitelaw, 2005), and a review of 16 different ranking studies shows rank consistency limited to the top three to six journals, with widely divergent results for the remaining set of journals (Hawes & Keillor, 2002).
This study offers a platform from which to start a formal conversation about scholarly publishing opportunities for CML educators. It also illustrates the importance of CML journals as a scholarly outlet to communicate knowledge and to offer manuscript submission options. The apparent independence among the various journal rankings (popularity, importance, and prestige) highlights the contribution of complementary measures to ranking journals. Advancing the field of computer-mediated learning by offering a list and ranking of its scholarly journals guides educators toward an informed selection of the journals they will read and, as importantly, offers the ever-growing number of scholars a cohort of journals to consider when submitting high-quality manuscripts. Over the medium term, studies examining various aspects of journal ranking will generate a definitive list and ranking of journals that will doubtless help faculty and administrators judge the relative value of publications for promotion and tenure purposes.
The authors are grateful for the valuable guidance from two anonymous reviewers.
An, L., & Qiu, J. (2003). Research on the relationships between Chinese journal impact factors and web impact factors and external web link counts. Journal of the China Society for Scientific and Technical Information, 22(4), 398-402.
Baumgartner, H., & Pieters, R. (2003). The structural influence of marketing journals: A citation analysis of the discipline and its sub-areas over time. Journal of Marketing, 67(2), 123-39.
Bollen, J., Rodriguez, M.A., & Van de Sompel, H. (2006). Journal status. Scientometrics, 69(3), 669-687.
Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine. Computer Networks and ISDN Systems, 30, 107-17.
Cabell, D.W.E. (2006). Cabell's directory of publishing opportunities in marketing. Beaumont, TX: Cabell Publishing.
Cal Poly Pomona Library. (2009). Scholarly journals. Retrieved April 9, 2009, from http://www.csupomona.edu/~library/tutorials/scholarly_journals.html.
Check SEO. (2009). Link popularity. Check Search Engine Optimization. Retrieved April 9, 2009, from http://www.check-seo.com/LinkPopularity.php.
Chen, P., Xie, H., Maslov, S., & Redner, S. (2007). Finding scientific gems with Google. Journal of Informetrics, 1(1), January, 8-15.
Cornell University Library. (2009). Distinguishing scholarly journals from other periodicals. Retrieved April 9, 2009, from http://www.library.cornell.edu/olinuris/ref/research/skill20.html.
Czinkota, M.R. (2000). International information cross-fertilization in marketing. European Journal of Marketing, 34(11/12), 1305-14.
Elbeck, M., & Mandernach, B. J. (2008). Expanding the value of scholarly, open access e-journals. Library and Information Science Research, 30(4), 237-241.
Grewal, D., & Levy, M. (2010). Marketing (2nd ed.). New York, NY: McGraw-Hill Irwin.
Harter, S., & Ford, C. (2000). Web-based analysis of e-journal impact: Approaches, problems and issues. Journal of the American Society for Information Science, 51(13) 1159-76.
Hascall, V.C., Bollen, J., & Hanson, R. (2007). Impact factor page rankled. ASBMB Today, July, 16-19.
Hawes, J.M., & Keillor, B. (2002). Assessing marketing journals: A mission-based approach. Journal of the Academy of Business Education, 3(2) 70-86.
Hult, G.T.M., Neese, W.T., & Bashaw, R.E. (1997). Faculty perceptions of marketing journals. Journal of Marketing Education, 19(1), 37-52.
Jacso, P. (2005). Visualizing overlap and rank differences among web-wide search engines: Some free tools and services. Online Information Review, 29(5), 554-560. Retrieved April 9, 2009, from http://www.jacso.info/PDFs/jacso-visualizing-overlap.pdf.
Jobber, D., & Simpson, P. (1988). A citation analysis of selected marketing journals, International Journal of Research in Marketing, 5(2), 137-42.
Kim, M.T. (1991). Ranking journals in library and information science: A comparison of perceptual and citation-based measures. College and Research Libraries, 52(1), 24-37.
Larson, R.R. (1996). Bibliometrics of the World Wide Web: An exploratory analysis of the intellectual structure of cyberspace. ASIS, 96. Retrieved January 23, 2009, from http://sherlock.berkeley.edu/asis96/asis96.html.
Luke, R.H., & Doke, E.R. (1987). Marketing journal hierarchies: Faculty perceptions, 1986-1987. Journal of the Academy of Marketing Science, 15(2), 74-8.
Mason, P.M., Steagall, J.W., & Fabritius, M.M. (1997). Economics journal rankings by the type of school: Perceptions versus citations. Quarterly Journal of Business & Economics, 36(1), 60-79.
McCown, F., & Nelson, M.L. (2007). Agreeing to disagree: Search engines and their public interfaces. Joint Conference on Digital Libraries (JCDL) '07, June 18–23, Vancouver, British Columbia, Canada. Retrieved April 9, 2009, from http://delivery.acm.org/10.1145/1260000/1255237/p309-mccown.pdf?key1=1255237&key2=1477939321&coll=GUIDE&dl=GUIDE&CFID=29810093&CFTOKEN=13911580
Meric, F., Bernstam, E.V., Mirza, N.Q., Hunt, K.K., Ames, F.C., Ross, M.I., et al. (2002). Breast cancer on the World Wide Web: Cross sectional survey of quality of information and popularity of websites. British Medical Journal, 324, March, 577-81.
Mylonopoulos, N.A., & Theoharakis, V. (2001). Global perceptions of IS journals. Communications of the ACM, 44(9), 29-37.
Nisonger, T.E. (1999). JASIS and library and information science journal rankings: A review and analysis of the last half century, Journal of the American Society for Information Science, 50(11), 1004-1020.
Parnell, J.A. (1997). Assessing management journal quality: A methodological critique and empirical analysis. The Mid-Atlantic Journal of Business, 33, March, 69-83.
Polonsky, M.J., Jones, G., & Kearsley, M.J. (1999). Accessibility: An alternative method of ranking marketing journals? Journal of Marketing Education, 21(3), 181-93.
Polonsky, M.J., & Whitelaw, P. (2005). What are we measuring when we evaluate journals? Journal of Marketing Education, 27(2) 189-201.
Polonsky, M.J., Garma, R., & Mittelstaedt, J.D. (2006). An examination of the globalization of authorship in publishing in 20 leading marketing journals. European Business Review, 18(6), 437-56.
Rice, B.A., & Stankus, T. (1983). Publication quality indicators for tenure or promotion decisions: What can the librarian ethically report? College and Research Libraries, 44, March, 173-8.
R&R Web design. (2008). Top global search engines December 2008 – market share. Retrieved April 11, 2009, from http://r-rwebdesign.com/blog/?p=353.
Rogers, I. (2002). The Google Pagerank algorithm and how it works. Retrieved on January 27, 2009 from http://www.ianrogers.net/google-page-rank/ .
Rosenstreich, D. & Wooliscroft, B. (2006). How international are the top academic journals? The case of marketing. European Business Review, 18(6), 422-36.
Search Engine Watch.com (2007). Major search engines and directories. Retrieved January 29, 2009, from http://searchenginewatch.com/2156221 .
Shugan, S.M. (2003). Journal rankings: Save the outlets for your research. Marketing Science, 22(4), 437-41.
Top25Web. (2009). Google PageRank report. Retrieved April 9, 2009, from http://www.top25web.com/pagerank.php.
Vaughan, L., & Hysen, K. (2002). Relationship between links to journal web sites and impact factors. Aslib Proceedings, 54(6), 356-61.
Vaughan, L., & Thelwall, M. (2003). Scholarly use of the web: What are the key inducers of links to journal web sites? Journal of the American Society for Information Science, 54(1), 29-38.