Jered Borup
George Mason University, USA
Richard E. West, Rebecca A. Thomas, and Charles R. Graham
Brigham Young University, USA
This mixed method research examined instructors’ use of video feedback and its impact on instructor social presence in 12 blended sections of three preservice educational technology courses. An independent samples t-test found no significant difference in perceptions of instructor social presence between students who received video feedback (M = 5.77, SD = 0.85) and those who received text feedback (M = 5.62, SD = 0.75); t(178) = 1.23, p = 0.22. However, the analysis of 22 student and nine instructor interviews found that participants generally viewed video feedback as more effective at establishing instructor social presence because instructors could better speak with emotion, talk in a conversational manner, and create a sense of closeness with students. Students also explained that the blended learning format lessened the impact of video feedback on instructor social presence, which may help to explain why statistical differences were not found.
Keywords: Blended learning; online learning; social presence; feedback; community; higher education; computer mediated communication
Johnson (2008) stated that there are two sides of teaching—the academic and the human—adding that the human side of teaching can be especially difficult for teachers. This is particularly true in online and blended learning environments, where face-to-face instruction is reduced or eliminated, leaving many students feeling isolated (Palloff & Pratt, 2007). Research has found that this sense of isolation can negatively impact academic learning outcomes (Dziuban, Hartman, & Moskal, 2004; Hara & Kling, 1999; Song, Singleton, Hill, & Koh, 2004). However, Moore (1980) explained that this psychological distance can be reduced when interaction is increased. For instance, Boling, Hough, Krinsky, Saleem, and Stevens (2012) found that instructor feedback in an online course was “an important component for building a strong student-instructor connection” (p. 121). Evans (2013) added that feedback has a social dimension even when it focuses largely on the course content. However, instructor feedback is largely text based and lacks many of the non-verbal communication cues that make it easier to form a connection with students.
Some instructors have begun to incorporate asynchronous video feedback into their online and blended courses as a way to enhance the social dimension that was described by Evans (2013). However, research examining asynchronous video feedback’s impact on instructor social presence is limited and has focused largely on student perspectives. Price, Handley, Millar, and O’Donovan (2010) explained that “The learner may be in the best position to judge the effectiveness of feedback but, on the other hand, may not always recognize the benefits it provides” (p. 277). Combining student and instructor perspectives may provide a clearer understanding of how instructors use feedback to establish their social presence. As a result, this research combines student and instructor perceptions to examine the impact of asynchronous video feedback on instructor social presence in blended courses that provide the majority of instruction online. More specifically, this research addressed the following questions:
1. Do students who received video feedback report higher levels of instructor social presence as compared to students who received text feedback?
2. What are students’ and instructors’ perceptions of how video feedback influences instructor social presence?
We will first discuss the concept of social presence and its relationship to modes of communication. We will then review the literature examining the use of video feedback to establish instructor social presence.
Historically, social presence has been closely related to the mode of communication used. Short, Williams, and Christie (1976) originally defined social presence as “the degree of salience of the other person” in mediated communication (p. 65). They also emphasized that social presence was an attribute of the mode of communication—the more communication cues a tool could transmit, the more social presence it contained. Short et al.’s original definition of social presence is similar to the concept of media richness (Daft & Lengel, 1986). Daft and Lengel (1986) defined media richness as a communication medium’s capacity to process rich information and explained that face-to-face communication had the highest richness while “impersonal written documents” and “numeric documents” had the lowest (p. 560).
Short et al. (1976) also believed that social presence was closely related to the concept of immediacy. Wiener and Mehrabian (1968) defined immediacy as the level of psychological distance that exists within communication. The words that are used, as well as the visual and auditory cues, during communication can affect the level of immediacy. Short et al. (1976) reasoned that when using the same communication tool it was possible for immediacy (a product of behavior) to vary while social presence (an attribute of tool) stayed constant.
The distinction between immediacy and social presence has since become less clear. Unlike Short et al. (1976), Gunawardena (1995) contended that social presence was in part a product of behavior and thus could “be cultivated” by participants (p. 162). Gunawardena’s view that social presence can be cultivated has become widely accepted within the research community and is the view we adopted for this research. Garrison, Anderson, and Archer (2000) built on this newer definition when they created the community of inquiry (CoI) framework to examine text-based learning interactions in an online learning environment. Similar to Gunawardena (1995), Garrison et al. (2000) defined social presence as the degree to which participants are able to project their full personalities socially and emotionally, and argued that communication behaviors—not media—are the most important factors when measuring social presence. The CoI framework also posited that social presence can be a prerequisite to cognitive presence, or the extent to which students are able to construct knowledge from their interactions with others in the course. Garrison, Anderson, and Archer (2010) would later explain:
An important contribution of our work was describing social presence from a multi-dimensional perspective that had overlap with the other presences. Building on the affective expression dimension we added “open communication” as a category within social presence to reflect the purposeful nature of the community, and “group cohesion” to reflect the collaborative nature of the community and its activities. (p. 7)
These three subcategories (i.e., affective expression, open communication, and group cohesion) were confirmed when Garrison and his colleagues (Rourke, Anderson, Garrison, & Archer, 2001) performed content analyses on online text discussion boards. In doing so, they also concluded that social presence could be established using only written communication. However, Garrison et al. (2000) acknowledged that “the lack of visual cues [in text] may present particular challenges to establishing social presence” (p. 95).
Within the CoI framework, social presence focused largely on student social presence; however, Anderson, Rourke, Garrison, and Archer (2001) acknowledged that a teacher’s responsibility to facilitate discourse within the course “overlaps with many of the behaviors identified in [the CoI’s] larger model of ‘social presence’” (p. 7). The examination of instructor social presence is especially important in light of Swan and Shih’s (2005) finding that instructors’ social presence had a larger impact on student outcomes than did students’ social presence.
Although research has shown that social presence can be established using text (Gunawardena, 1995; Rourke et al., 2001), it is more easily established when nonverbal and vocal cues are present (Tu & McIsaac, 2002). Some instructors have attempted to use audio communication to more effectively establish social presence in their courses. Overall, students have reported that audio feedback is more personal and humanizing than text (Cuthrell, Fogarty, & Anderson, 2009; Kim, 2004; Olesova & Richardson, 2011; Wood, Moskovitz, & Valiga, 2011) and helps them to feel connected to their instructors (Gould & Day, 2012; Ice et al., 2007; Oomen-Early, Bold, Wiginton, Gallien, & Anderson, 2008). For instance, Ice et al.’s (2007) analysis of student surveys and interviews found that the vocal cues in audio feedback helped students to know that their instructor cared about their learning, resulting in a sense of belonging and involvement. Similarly, a student in Oomen-Early et al.’s (2008) research believed that audio had a softening effect when receiving corrective feedback. Research has also indicated that audio feedback can help to motivate students to complete assignments (Kirschner, Brink, & Meester, 1991; Pearce & Ackley, 1995; Wood et al., 2011).
Although audio contains more communication cues than text, both lack visual cues such as facial expressions and body language that “provide context for verbal interactions” (Wolsey, 2008, p. 311). Cuthrell et al.’s (2009) examination of audio feedback described one student who found that audio feedback was difficult to understand because it lacked the instructor’s facial expressions. As a result, some researchers have sought to establish their social presence via asynchronous video communication. Overall, student response has been positive: students have reported that the richness of video helps them to perceive their instructor as real (Borup, Graham, & Velasquez, 2011; Borup, West, & Graham, 2012; Mathieson, 2012) and caring (Griffiths & Graham, 2009b; Moore & Filling, 2012). Additionally, students reported that the visual cues in video made their instructor seem more conversational (Silva, 2012), friendly (Griffiths & Graham, 2009b), and personal (Griffiths & Graham, 2009a; Mathieson, 2012; Parton, Crain-Dorough, & Hancock, 2010). For example, Borup et al.’s (2012) analysis of student interviews found that students’ ability to hear and see their instructor made him seem more real. Additionally, blended learning students in Parton et al.’s (2010) research stated that the personal nature of video feedback helped them feel connected to their instructor, with one student stating that, “through this video feedback I felt more connected to my professor, that she knew me personally, and that my responses to assignments were important to her” (Analysis section, para. 7). Although research largely lacks instructor perceptions, Griffiths and Graham (2009a) found that video feedback helped instructors to know their students on a personal level, and the instructor in Parton et al.’s (2010) study thought asynchronous video feedback increased students’ involvement.
These benefits are not without some costs. Although the use of asynchronous video communication affords time to reflect, it is not easily edited and may reduce some of the rigorous thinking that occurs when using text (Garrison et al., 2000). Furthermore, technological problems are more likely to arise when using video (Barrow, 2012; Thompson & Lee, 2012). Barrow (2012) added that providing video feedback can be less convenient because it requires the instructor to be “in an indoor quiet setting with minimal external audio and visual distractions” (p. 170).
In summary, both students and instructors have perceived audio and video feedback as effective in enhancing social presence. However, the few studies that have specifically examined video feedback have tended to qualitatively examine only student—not instructor—perceptions. Although insightful, these initial research studies should be interpreted cautiously—especially by those wishing to apply their findings in varying populations and contexts. More research is needed that uses a variety of methodologies to examine student and instructor perceptions in a variety of contexts, such as blended learning environments.
We addressed the research questions using a complementary mixed-methods design to “measure overlapping but also different facets” of instructors’ use of video feedback to establish social presence (Greene, Caracelli, & Graham, 1989, p. 285). More specifically we used quantitative data to address the first research question (Do students who received video feedback report higher levels of instructor social presence as compared to students who received text feedback?) and qualitative data to address the second research question (What are students’ and instructors’ perceptions of how video feedback influences instructor social presence?).
Research was conducted in 12 sections of three one-credit technology integration courses required for secondary and elementary education majors at a large university in the Midwestern United States. In total, the courses enrolled 229 students (211 female and 18 male) and were taught by 10 instructors (8 male and 2 female). Tech4SecEd enrolled 71 secondary education majors in six sections. The remaining two courses were required for all elementary education majors: Tech4ECE enrolled 72 students in three sections that focused on integrating technology into grades K-2, and Tech4ElEd enrolled 86 students in three sections that specialized in technology integration for grades 3-6. All sections used Canvas as the learning management system (LMS) because it allowed instructors to easily provide students with text or video feedback. Students could also reply directly to the feedback they received within the LMS.
Instruction was presented primarily online in all sections. However, students were provided with optional face-to-face labs each week that they could attend to receive personalized face-to-face help when needed. All sections also met face-to-face on the first day of the semester, and all but two sections met face-to-face 4-5 times during the 14-week semester to introduce students to new projects, provide direct instruction, and allow for group discussions and presentations.
Throughout the course, each student received personalized feedback on the three major course assignments, which were similar across sections. All three assignments used a mastery-based approach that allowed students to resubmit work after they received feedback from their instructor. Students created an online portfolio or blog for the first assignment and an instructional video for the second. The third assignment required students to create a presentation. Tech4ElEd and Tech4ECE students created a presentation regarding their experience designing and implementing a technology-enhanced lesson during a four-week practicum experience. Tech4SecEd did not have a concurrent practicum, so its students selected, learned, and presented on technologies that they could use in their specific content areas.
The control group contained 99 students in five sections who received text feedback on the first two assignments and the experimental group contained 130 students in seven sections who received video feedback on the same assignments. For the third assignment, instructors switched the mode of communication to ensure that each student received both video and text feedback from their instructor. Researchers also met with the instructors at the start of the semester to establish guidelines regarding the timing and content of the feedback they would provide to students.
Data were collected using two student surveys and interviews with students and instructors. The first survey was administered mid-course after students had received feedback on the first two assignments. The eight survey items measured student perceptions of instructor social presence and were obtained from the larger Social Ability Instrument (Yang, Tsai, Kim, Cho, & Laffey, 2006), which used a seven-point response scale (1 = strongly disagree and 7 = strongly agree) (see Appendix). Yang et al.’s (2006) instrument was selected for two reasons. First, it was validated in a setting similar to the one examined in this research; and second, it specifically measured the social presence of instructors, whereas other instruments have focused on measuring the social presence of students (Arbaugh et al., 2008; Kim, 2011; Swan et al., 2008). One instructor neglected to provide students video feedback for the third assignment, and her course section was excluded from data collection beyond the mid-course survey.
At the end of the course, researchers administered the following open-ended survey item to students: “In this course, you have received feedback from your instructor via both video and text. Which type of feedback did you prefer, and why?” Using the responses to this item, 22 students were sampled in an attempt to represent opinions and demographics of the student population. These students, along with nine instructors, participated in a 45-60 minute interview that discussed various aspects of social presence in video feedback. Interview transcriptions were then sent to all participants to check for accuracy and provide clarification when needed.
Students’ quantitative responses to the mid-course survey were aggregated and compared using an independent samples t-test. Researchers qualitatively analyzed student and instructor interviews using elements of constant comparison coding methods (Glaser, 1965). As stated previously, we adopted Garrison et al.’s (2000) definition of social presence for this research, and the analysis was guided by their three subcategories of social presence (i.e., emotional expression, open communication, and group cohesion). Two researchers double coded six interviews (three instructor and three student), meeting after each interview to compare codes and identify themes. Any disagreements were discussed until the researchers reached agreement. Researchers coded the subsequent interviews separately and met following every three interviews to discuss emerging themes. Additionally, all four members of the research team met twice during coding to discuss the primary coding patterns and themes.
One-hundred and ninety students completed the instructor social presence scale for a total response rate of 83.0%. However, 10 students reported that they did not view any of their feedback comments and were excluded from the analysis resulting in a useable response rate of 78.6% (105 students who received video feedback and 75 who received text feedback). An independent samples t-test was conducted and found no significant difference in perceptions of instructor social presence between students who had received video feedback (M = 5.77, SD = 0.85) and those who received text feedback (M = 5.62, SD = 0.75); t(178) = 1.23, p = 0.22.
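For readers who wish to check the arithmetic, the reported statistic can be re-derived from the summary values above. The following is a minimal sketch in Python (assuming SciPy is available); it is an illustration, not the authors' original analysis script, and the effect-size computation at the end is added here purely for context.

```python
# Minimal sketch (requires SciPy): re-derive the reported independent
# samples t-test from the published summary statistics. Because the means
# and SDs are rounded, the result matches the article only approximately.
from math import sqrt
from scipy import stats

m1, s1, n1 = 5.77, 0.85, 105  # video feedback group
m2, s2, n2 = 5.62, 0.75, 75   # text feedback group

# Pooled-variance t-test with df = n1 + n2 - 2 = 178
t, p = stats.ttest_ind_from_stats(m1, s1, n1, m2, s2, n2, equal_var=True)
print(f"t({n1 + n2 - 2}) = {t:.2f}, p = {p:.2f}")  # ~ t(178) = 1.22, p = 0.22
# (the article reports t = 1.23, presumably computed from unrounded values)

# Cohen's d from the pooled standard deviation: ~ 0.19, a small effect
sp = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
print(f"d = {(m1 - m2) / sp:.2f}")
```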
Guided by the three subcategories of social presence identified within the community of inquiry framework (i.e., emotional expression, open communication, and group cohesion), we organized student and instructor perceptions into three categories: (1) emotional expression, (2) closeness, and (3) natural and open communication.
All of the instructors agreed that the richness of video allowed them to more easily convey their emotions in feedback because, as Jake explained, “you can look in the camera, you can smile, you can laugh, [and] you can talk with your hands if you want to.” Jake further stated, “I prefer video because I want [my students] to see me and understand who I am and that I really care for them, and I think that comes across much easier in video.” Instructors also believed that the visual and vocal cues in video allowed students to recognize the authenticity of their emotions. Kurt said that with video feedback “It’s easier to detect whether the teacher is really caring about you.” Robert added that his tone of voice in video feedback was especially important when providing corrective feedback because students could better “feel the empathy [he] had for them.” The added richness of video also helped instructors to feel more comfortable using humor because it decreased the likelihood that students would misinterpret their comments. For instance, Robert explained that “the problem with text is that sarcasm can come across as biting.”
Instructors found that they could include emotional expressions in text by using emoticons and expressive punctuation. However, the effectiveness of these strategies appeared to be limited. John found that it was difficult to convey emotions in text “even if you try and punctuate [text feedback] differently or use little emoticons,” whereas video feedback “just naturally made everything more human.” David added that an instructor’s ability to include emotional expressions in text depended largely on their writing ability: “How well can you convey emotion through text? I can’t do it very well. Some people are really good at it. I’m not.” Gwen summarized that in text, instructors had to “find a translation” for emotional expressions whereas “with video, it’s straight.”
Although instructors found that video was more effective than text at conveying positive emotions, they identified two primary limitations. First, although the richness of video appeared to make it easier to express emotion, it did not ensure that instructors would do so. For instance, Jake had to remind himself “to smile and speak with inflections, because [he] can be kind of monotoned” in video, and Kurt had to “make a conscious effort to look into the camera” and smile. Second, instructors found that video conveyed their actual emotions even when they did not want them shared. Chris explained, “I think when we are talking face to face, I can tell if your praise is for real or if it is just something that you are saying just to say.” David added that in video, “I share information that I don’t want to share through my face.” David also shared an experience where video conveyed the negative emotions that he was feeling and possibly hurt his relationship with one student:
I kind of went off on [a student] and that came across [in my video comment]. . . . So I went back and deleted the video and made a new one where I was less animated. . . . I don’t know if he ever saw [the first video post] and ironically, he’s one of the kids who just stopped [working].
Matt added that in some cases he could appear more excited in text than he could in video because he “can use an exclamation mark in text, when [he] wouldn’t come across as using an exclamation mark in person.”
Although students did not typically reply to their feedback, instructors felt that their video feedback had a conversational feel. Robert explained, “I try to talk as if they were really there. I’m not just sitting there, I’m actually looking at them, and saying ‘Hey, how are you doing?’ I’m trying to be as conversational as I can.” Bill believed that when instructors provided feedback in this manner, students would feel as if they “could jump in at any moment and say something, and it seems like we are actually having that conversation even though we’re not.” Although Gwen tried to be conversational in text feedback, she found that there “always is a barrier” and in video it was easier to be “more conversational, more natural, more normal to what you would do in a face-to-face situation than if you typed it.”
However, instructors liked the ability to edit their text comments and wished that they could do the same with video. Chris explained that he wanted his video feedback to be “fluent” but found that in video he would sometimes be “rambling a lot or just drawing a blank.” Initially, instructors would commonly rerecord their videos when they contained several imperfections, but they became more comfortable with small mistakes and “gave up on trying to [be] perfect.” David found that these imperfections were more acceptable in video because “when people talk, we naturally slip over our words, we backtrack, we change our sentences” but “when we write, it has to be a lot cleaner.” In fact, instructors felt that these imperfections, as John said, helped their feedback feel “more human.”
Instructors believed that their students could potentially feel “like just a number” or “a sheep in a herd.” As a result instructors used their feedback to recognize students’ individuality and develop a sense of closeness with them. Some strategies, such as addressing students by name and acknowledging aspects of students’ personalities or lives that they self-disclosed in their projects, were independent of the mode of communication. However, instructors found that the richness of video made these strategies more effective. David explained the following:
I think two things go into play in order for [a connection] to happen. One is . . . making comments about, “Hey, I noticed on your website that you are from Colorado. That’s cool, I didn’t know that. What part are you from?” You could do that on text, sure, but the other part is that aspect where the instructor is speaking to [the student] specifically and [they] can see that . . . because they are hearing my voice . . . but also just because they are constantly seeing me.
The visual cues in video appeared especially helpful in forming connections because they added a “human touch” that let students know that they “were talking with a real person” and not receiving a “computer generated response.” Chris explained, “I think it feels more intimate on video than it would on text just because my face is there—I feel more present.” Instructors also provided a “human touch” to their feedback by recording it at home, where students could see them in more personal surroundings. For instance, when providing feedback from home, Jake liked to have a bookshelf behind him with “family photos and little things that are important to [him].” Robert also explained that on a couple of occasions he had his “toddler sitting on [his] lap” and remembered students telling him, “It’s really fun to see you at home. You’re like a regular person!” Robert summarized that he could create a connection with students through text “to some degree but not as well as with video.”
Instructors found some limitations and costs to providing students with video feedback. David acknowledged that while video “provides more than text” it is “still not as good as face to face.” Instructors also found that it was less convenient to provide video feedback. Gwen remembered, “I had to do video [recordings] at home because that’s where I had my camera and my microphone.” Robert, a father of young children, liked to provide feedback at night but found that he needed to “talk more quietly because [his family was] asleep.” Other instructors also commented that providing video feedback was inconvenient because they felt uncomfortable providing it in public places such as a computer lab.
Students found it easier to perceive their instructors’ emotions in video than in text feedback because, as Margie explained, “you have a lot more ways to [do it]. . . You can do it through your face, your voice, [and] your body language.” For instance, Angie, one of Robert’s students, shared the following: “When Robert talked in the video . . . he’s just more likely to bring out his personality as if he was sitting just right there.” Similarly, Tiffany stated that video feedback made her instructor “seem more like a person” because she could see his “personality and mannerisms.” Additionally, Nancy found that the content in her text and video feedback was similar but her ability to see and hear her instructor in video feedback helped her to “feel more like it’s coming from a person.”
Students also found that instructors’ emotional expressions in video had more authenticity than text. For instance, Hanna found that it was easier to know that her instructor “cared about us” because in text feedback “you can kind of fake it.” Kelly attributed some of the added authenticity to the visual cues in video because “the face is credible.” Natalie also explained that the authenticity of video feedback helped to increase her confidence in her abilities because she could better see that the instructor “felt like [she] was competent in what [she] was doing.” The ability of students to see their instructors’ demeanor and hear their tone of voice was also helpful in avoiding misconceptions. For instance, Natalie found that humor was better understood in video:
I made a video about how to write a thesis statement. The thesis statement I wrote for an example was, “Is Spiderman better than Batman?” In the video he said something about “I’m not sure this thesis statement is going to work because Spiderman is clearly better than Batman!” . . . He could have [said something similar in text] but it might’ve been harder for me to tell whether he was really joking or not.
Neal also preferred receiving corrective feedback via video because he “could see his [instructor’s] demeanor and could see that he wasn’t upset.”
Although students could also perceive instructor emotions in text, their ability to do so appeared limited. Dallin explained that while she recognized some emotional expressions in text feedback, she could not “see as much” because it felt as if her instructor was “behind a wall,” and Natalie found her text feedback to be “cold” because “it’s hard to see the emotion in it.” Similarly, Kelly stated that the visual nature of video was especially helpful for emotional expression: “In text, you can hear their voice because that’s how they normally talk or write, but you can’t feel any emotion because you don’t have a face to connect with it.”
Students found video feedback more conversational and interactive than text. Rebecca explained, “it was like he was having a conversation with me even though I wasn’t responding. He was talking to me as if I was right there in front of him.” Similarly, Lisa found that her video feedback made her feel like she was “just talking to him in class” and that her instructor “always felt very approachable.” Edith also found that her video feedback comments “were very fluid and natural” because her instructor was not “constrained by that medium.” Rebecca wondered if her video feedback felt more conversational because “maybe [her instructor] communicates better verbally,” indicating that instructors’ ability to express emotion in feedback is somewhat dependent on their communication skills.
Although students found that video feedback was “more like a real face-to-face interaction,” some students indicated that they were more likely to actually respond to text feedback. Sadie said that when she received video feedback she “felt like [she] had to respond in video” even though her instructor did not have that requirement. Similarly, Natalie found that “text was easier for [her] to respond to” because responding to a video using text “felt a little unbalanced.” Nancy rationalized, “If I’m going to reply in text, we might as well just talk in text.”
Some students felt “more of an emotional connection” to their instructor as a result of their video feedback because it contained their instructors’ “voice, tone, and . . . facial expressions.” Sabrina shared the following: “Just talking with someone face to face always makes you feel more connected to them . . . which I think is especially good for a blended course where you’re not with them all the time.”
Students also believed that video feedback impacted their sense of closeness with their instructor because it made their instructor seem more “real.” Rebecca explained that in video feedback “you are actually seeing [the instructor] and you see that he is just a person too.” Video seemed to give students “a peek” into their instructor’s environment. For instance, Jaime stated “I can see him in his office . . . or his home. It was like ‘Welcome to Bill’s life!’” Kara also shared the following: “In the background there were toys on the ground and some of his home stuff and he was kind of just chilling in his chair. I was like, ‘He has a life. Weird!’” The knowledge that Hanna’s “professor [was] a person” lessened her frustration in the course:
Throughout my college experience, there have been classes with professors that students tend to not like. . . I think the video [feedback comments] are helpful in dealing with that because then [the instructor] is a person—not just words. . . It makes it harder to be frustrated at the teacher because the teacher is a person and you are actually interacting with them.
Margie added that the closeness she felt to her instructor made her feel “more accountable” and “included.”
Video feedback also felt more personalized because students knew that their instructor was communicating with them directly. Sadie explained that her instructor “couldn’t make the same video for everyone” but could have reused the same text feedback comments by simply changing the student name. Some students also felt that hearing the instructor say their name helped them to feel closer to their instructor than reading their name in text did. Margie believed that with text the instructor could “just look at your name and type it in,” but hearing her instructor say her name helped her to feel like he knew who she was. Caroline explained that people tended to mispronounce her name; she felt a connection with her instructor because he pronounced it correctly, and she was appreciative that “he remembered.”
Samuel felt like he actually received more “one-on-one and face-to-face” feedback than he would have received in a “regular classroom”:
I felt like he was just speaking to me and how I could do better. I liked that. Usually in other [face-to-face] classes that I’ve had, when you get an assignment back you’d have to approach the professor and go and say, “Well, what did I do? What can I do better?” With him, he would look at your assignment and immediately do a video feedback just for you, so you would always get it rather than having to go and ask for it. I liked that.
Interestingly, Caroline also found that she received more of her instructor’s attention than in a face-to-face class; however, unlike Samuel, she found the attention somewhat unwanted because she was not “an attention seeker usually so direct eye contact and direct interaction sometimes makes [her] uncomfortable” and “was a little bit weird.” Similarly, Sabrina found that video feedback was more personal, which was “partly why [she] didn’t like it as much” because she “didn’t want to be too personal with the teacher.”
Although students found that video was more effective than text at helping them feel a connection with their instructor, blended students found that the impact was limited because they had already developed a sense of closeness with their instructor in face-to-face class sessions. Nancy explained,
Since I had talked to him one-on-one before, in person, I don’t feel like I needed the videos to help me feel like I knew him. . . but I could see how it would help in an all online class.
Natalie believed that video feedback would have been helpful in “forming a student-teacher relationship” even in a fully face-to-face course but agreed that video feedback would be even more important in a fully online course because students “wouldn’t have had any other way of seeing the professor.” Jaime summarized that “face-to-face would be the best [but] video is pretty good.”
Anderson (2009) explained that “Distance education has always been to a great degree determined by the technologies of the day. . . As these technologies have developed, distance education has evolved in parallel to support new forms of interaction” (p. 112). For instance, the advent of the Internet has enabled instructors to provide students with more feedback than was previously possible (Garrison, 2009). However, previous constraints in online communication tools resulted in feedback that was largely text based (Parsad & Lewis, 2009). Although text can be an effective mode of communication, it lacks the visual and vocal cues that make it easier to establish social presence (Garrison et al., 2000). As a result, we used a mixed method research design to examine the impact of video feedback on instructors’ ability to establish their social presence in blended courses.
Our quantitative analysis of student responses to items measuring instructor social presence found no significant difference in perceptions of instructor social presence between students who received video feedback (M = 5.77, SD = 0.85) and those who received text (M = 5.62, SD = 0.75); t(178) = 1.23, p = 0.22. However, our qualitative analysis of instructor and student interviews indicated that participants perceived video feedback to be more effective than text in establishing instructor social presence. Students commented that the richness of video helped them to view their “professor [as] a person” because they could see their instructor’s face and surroundings. One student explained that in text feedback his instructor was “just words on a page” but in video feedback he could “hear emotion” that allowed him to develop “more of a relationship” with his instructor. This supports previous research that has found audio communication effective in helping instructors convey emotions (Hew & Cheung, 2013).
Additionally, our qualitative analysis found that the visual cues in video feedback allowed for emotional expressions that are not possible in text or audio feedback. One student explained that in video feedback there were “a lot more ways to [express emotions].” One instructor added that text feedback could contain some emotional expressions, but it required instructors to “find a translation” such as emoticons. Hailey, Grant-Davie, and Hult (2001) summarized the limitations of text for expressing and perceiving emotions: “In the strictly written medium . . . tone is hard to read and create, and the invisible, inaudible teacher has a less moderating presence in the class” (p. 390). There was also some evidence that video feedback actually made “it harder to be frustrated at the teacher because the teacher is a person.” This finding may help to explain Griffiths and Graham’s (2009b) case study, which found that an instructor who provided video feedback received higher than average student ratings. It is also important to note, however, that the visual nature of video feedback could prove distracting.
The visual and vocal cues in video also appeared to help students avoid misunderstandings. This supports Daft and Lengel’s (1986) claim that increasing media richness allows information to be conveyed with less uncertainty. Wolsey (2008) added that visual communication cues can provide important context. Instructors stated that they were more likely to use humor in video feedback because students were less likely to misinterpret their comments. Similarly, Swan’s (2002) analysis of text discussion board posts found that humor was seldom used “because many forms of humor are easily misinterpreted” (p. 39). The difficulty of recognizing humor in text was also highlighted by Rourke et al. (2001), who were only able to obtain an inter-rater reliability of 0.25 when coding for humor in text discussion boards.
Students also found that video feedback was more conversational than text. Similarly, Thompson and Lee’s (2012) qualitative examination of video feedback concluded:
While [asynchronous video feedback] does not allow students to ask questions as they would in a face-to-face, phone, or video conference, hearing the voice of the teacher going through the paper does give students the sense that they can ask more questions because it establishes a personal connection and rapport, creating a sense of availability. (para. 37)
One teacher in our research explained that instructors’ ability to make text feedback conversational was largely dependent on their writing abilities. However, even skilled writers may find it difficult to match the conversational tone that instructors can convey in video. For instance, Silva (2012) found that an instructor with high writing abilities could not match the conversational tone found in her video comments when she provided text feedback. Ultimately, the establishment of social presence in video feedback is dependent on instructors’ communication behaviors (Garrison et al., 2000). One student explained that video was simply “an extension” of his instructor’s personality: “because he is a nice person, his [video] feedback was nice.” If instructors do not speak with emotion or in a conversational manner in a face-to-face context, they are unlikely to do so in video feedback. Instructors also found that it was difficult for them to hide their disappointment or frustration when providing video feedback. One instructor wondered if providing video feedback when frustrated actually harmed his relationship with one student who stopped working in the course. Similarly, one student in Rodway-Dyer, Knight, and Dunne’s (2011) research felt like she was “being told off” (p. 220) when she received corrective audio feedback. Instructors should be aware of this finding and consider providing text feedback when frustrated. Borup, West, and Graham (2013) found that students’ ability to establish their social presence via video was somewhat dependent on their personal characteristics. It is also likely that some personal characteristics can make it more difficult for instructors to establish their social presence via video feedback.
Although students agreed that video feedback was more effective at establishing instructors’ social presence than text, they also believed that their face-to-face class sessions lessened video’s impact on their instructor’s social presence. This may help to explain why no statistical difference was found in perceived instructor social presence between those students who received text and those who received video. One student explained, “Since I had talked to him one-on-one before in person, I didn’t feel like I needed videos to help me feel like I knew him.” Similarly, Martin and Mottet (2011) found that instructors’ immediacy behaviors in face-to-face classes were more important than those found in online feedback. Khurana and Boling’s (2010) research on multimedia in an online graduate course also found that audio feedback enabled students to better perceive their instructor’s tone of voice in subsequent text feedback comments. Shea, Li, and Pickett (2006) also hypothesized that online students close to campus may feel less of a need to develop an online learning community as compared to students living far from campus who may not have the same social opportunities.
These research findings may prove insightful to those in a variety of contexts. Merriam (1998) explained that “Insights gleaned from case studies can directly influence policy, practice, and future research” (p. 19). However, these findings need to be understood within our research context and should not be generalized to other settings. As a result, we have attempted to follow Wolcott’s (1994) warning to avoid “the temptation to read too far beyond the case itself in speculating about its meaning or implications” (p. 37). As stated earlier, participants were enrolled in blended course sections that met face-to-face on the first day of the semester, which may have limited the impact that video feedback had on instructors’ social presence. As a result, future research should examine this phenomenon in fully online courses with off-campus students. In addition, this research only examined instructor feedback on student projects; additional research could build on studies such as Borup, Graham, and Velasquez’s (2011) case studies that sought to understand how instructors establish a persona through their other instructional responsibilities, including facilitating online discussions and class announcements. Students were also enrolled in one-credit courses, which may have diminished the value that students placed on instructor social presence. Future research could similarly examine this phenomenon with varying instructor and student populations. For instance, although Borup et al. (2013) found that a non-native speaker had difficulties establishing her social presence in video, research may find that non-native speakers or students who have difficulty hearing benefit from the non-verbal cues that are present in video feedback. Researchers may also seek to understand video feedback—including peer feedback—in non-educational settings, such as virtual collaborative teams.
Garrison and Arbaugh (2007) explained that more research is needed “to understand exactly how social presence patterns develop” (p. 160). This is especially true with video feedback, and research may find that it is more important in establishing social presence at the start of a semester and less important once instructors have established their voice. This type of research could provide instructors with helpful heuristics on when video feedback is most effective and could save instructors valuable time, considering that providing video feedback appeared to be more time consuming and less convenient than providing text feedback. Because video contains audio, future research should also seek to compare audio and video feedback in an attempt to isolate the impact that the visual cues in video have on instructor social presence. Although beyond the scope of this research, future research may also attempt to examine how video feedback impacts the quality of students’ work and cognitive presence.
Social presence is also to a large degree determined by how participants perceive the person with whom they are communicating. As a result, this research relied on participant-reported data. In contrast, Rourke et al. (2001) measured student social presence by analyzing discussion board comments. Future research examining the impact of video feedback on instructor social presence may be strengthened by combining these two approaches. This type of triangulation can be time intensive and difficult but would improve the validity of the research findings and provide additional insights (Mathison, 1988).
Although innovations in communication technology have enabled instructors to more easily provide students with video feedback, research examining its use is currently lacking. This mixed method research examined the impact of asynchronous video feedback on instructor social presence in blended courses that provided the majority of instruction online. An independent samples t-test found no significant difference in perceptions of instructor social presence between students who received video feedback (M = 5.77, SD = 0.85) and those who received text (M = 5.62, SD = 0.75); t(178) = 1.23, p = 0.22. However, qualitative analysis of nine instructor and 22 student interviews found that video enabled instructors to better establish their online social presence because they could more easily speak with emotion and communicate in a conversational manner. Students also found that the ability to see and hear their instructor helped them to create a sense of closeness with their instructor. Although video feedback appeared to be more effective than text in establishing instructor social presence, students believed that the need for video feedback to establish social presence was lower in blended courses where students and instructors interact face-to-face. This may help to explain why no statistical difference was found in student perceptions of instructor social presence. Future research can test this hypothesis by examining the effects of video feedback on instructor social presence in fully online courses.
Anderson, T. (2009). A rose by any other name: Still distance education—a response to D. R. Garrison: Implications of online and blended learning for the conceptual development and practice of distance education. Journal of Distance Education, 23(3), 111–116.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1–17.
Arbaugh, J., Cleveland-Innes, M., Diaz, S., Garrison, D., Ice, P., Richardson, J., & Swan, K. (2008). Developing a community of inquiry instrument: Testing a measure of the community of inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3-4), 133–136. doi:10.1016/j.iheduc.2008.06.003
Barrow, T. H. (2012). Social presence in the asynchronous online classroom: The asynchronous online video conversation (Unpublished doctoral dissertation). Texas Tech University, Lubbock, TX.
Boling, E. C., Hough, M., Krinsky, H., Saleem, H., & Stevens, M. (2012). Cutting the distance in distance education: Perspectives on what promotes positive, online learning experiences. The Internet and Higher Education, 15(2), 118–126.
Borup, J., Graham, C. R., & Velasquez, A. (2011). The use of asynchronous video communication to improve instructor immediacy and social presence in a blended learning environment. In A. Kitchenham (Ed.), Blended learning across disciplines: Models for implementation (pp. 38–57). Hershey, PA: IGI Global.
Borup, J., West, R. E., & Graham, C. R. (2012). Improving online social presence through asynchronous video. The Internet and Higher Education, 15(3), 195–203. doi:10.1016/j.iheduc.2011.11.001
Borup, J., West, R. E., & Graham, C. R. (2013). The influence of asynchronous video communication on learner social presence: A narrative analysis of four cases. Distance Education, 34(1), 48–63.
Cuthrell, K., Fogarty, E. A., & Anderson, P. J. (2009). “Is this thing on?” University student preferences regarding audio feedback. Society for Information Technology & Teacher Education International Conference (pp. 32–35).
Daft, R. L., & Lengel, R. H. (1986). Organizational information requirements, media richness and structural design. Management Science, 32(5), 554–571.
Dziuban, C. D., Hartman, J. L., & Moskal, P. D. (2004). Blended learning. Boulder, CO: Educause Center for Applied Research. Retrieved from http://net.educause.edu/ir/library/pdf/ERB0407.pdf
Evans, C. (2013). Making sense of assessment feedback in higher education. Review of Educational Research, 83(1), 70–120.
Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13(1-2), 5–9. doi:10.1016/j.iheduc.2009.10.003
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87–105.
Garrison, D. R., & Arbaugh, J. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157–172. doi:10.1016/j.iheduc.2007.04.001
Garrison, D. R. (2009). Implications of online learning for the conceptual development and practice of distance education. Journal of Distance Education, 23(2), 93–104.
Glaser, B. G. (1965). The constant comparative method of qualitative analysis. Social Problems, 12(4), 436–445. Retrieved from http://www.jstor.org/stable/798843
Gould, J., & Day, P. (2012). Hearing you loud and clear: Student perspectives of audio feedback in higher education. Assessment & Evaluation in Higher Education, 1–13.
Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255–274.
Griffiths, M. E., & Graham, C. R. (2009a). The potential of asynchronous video in online education. Distance Learning, 6(2), 13–23.
Griffiths, M. E., & Graham, C. R. (2009b). Using asynchronous video in online classes: Results from a pilot study. International Journal of Instructional Technology and Distance Learning, 6(3), 65–76.
Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. International Journal of Educational Telecommunications, 1(2/3), 147–166.
Hailey, D. E., Grant-Davie, K., & Hult, C. A. (2001). Online education horror stories worthy of Halloween: A short list of problems and solutions in online instruction. Computers and Composition, 18, 387–397.
Hara, N., & Kling, R. (1999). Students’ frustrations with a web-based distance education course. First Monday, 4(12).
Hew, K. F., & Cheung, W. S. (2013). Audio-based versus text-based asynchronous online discussion: Two case studies. Instructional Science, 41(2), 365–380.
Ice, P., Curtis, R., Phillips, P., & Wells, J. (2007). Using asynchronous audio feedback to enhance teaching presence and students’ sense of community. Journal of Asynchronous Learning Networks, 11(2), 3–25.
Johnson, L. P. (2008). The caring teacher: Tips to motivate student learning. Lanham, Maryland: Rowman & Littlefield Education.
Khurana, C., & Boling, E. (2010). “Setting the Climate”: The role of instructional design and multimedia to enhance social presence. World Conference on Educational Media, Hypermedia, and Telecommunications (pp. 1813–1818).
Kim, J. (2011). Developing an instrument to measure social presence in distance higher education. British Journal of Educational Technology, 42(5), 763–777. doi:10.1111/j.1467-8535.2010.01107.x
Kim, L. (2004). Online technologies for teaching writing: Students react to teacher response in voice and written modalities. Research in the Teaching of English, 38(3), 304–337.
Kirschner, P. A., Brink, H., & Meester, M. (1991). Audiotape feedback for essays in distance education. Innovative Higher Education, 15(2), 185–195.
Martin, L., & Mottet, T. P. (2011). The effect of instructor nonverbal immediacy behaviors and feedback sensitivity on Hispanic students’ affective learning outcomes in ninth-grade writing conferences. Communication Education, 60(1), 1–19.
Mathieson, K. (2012). Exploring student perceptions of audiovisual feedback via screencasting in online courses. American Journal of Distance Education, 26(3), 143–156.
Mathison, S. (1988). Why triangulate? Educational Researcher, 17(2), 13–17.
Merriam, S. B. (1998). Qualitative research and case study applications in education: Revised and expanded from case study research in education. San Francisco, CA: Jossey-Bass.
Moore, M. G. (1980). Independent study. In R. D. Boyd & J. Apps (Eds.), Redefining the discipline of adult education (pp. 16–31). San Francisco, CA: Jossey Bass. Retrieved from http://www.ajde.com/Documents/independent_study.pdf
Moore, N. S., & Filling, M. L. (2012). iFeedback: Using video technology for improving student writing. Journal of College Literacy & Learning, 38, 3–14.
Olesova, L. A., & Richardson, J. C. (2011). Using asynchronous instructional audio feedback in online environments: A mixed methods study. Journal of Online Learning and Teaching, 7(1), 30–42.
Oomen-Early, J., Bold, M., Wiginton, K. L., Gallien, T. L., & Anderson, N. (2008). Using asynchronous audio communication (AAC) in the online classroom: A comparative study. Journal of Online Learning and Teaching, 4(3), 267–276.
Palloff, R. M., & Pratt, K. (2007). Building online learning communities: Effective strategies for the virtual classroom (2nd ed.). San Francisco, CA: Jossey-Bass.
Parsad, B., & Lewis, L. (2009). Distance education at degree-granting postsecondary institutions: 2006–07 (NCES 2009-044). Washington, DC: National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubs2009/2009044.pdf
Parton, B. S., Crain-Dorough, M., & Hancock, R. (2010). Using flip camcorders to create video feedback: Is it realistic for professors and beneficial to students? International Journal of Instructional Technology and Distance Learning, 7(1).
Pearce, C. G., & Ackley, R. J. (1995). Audiotaped feedback in business writing: An exploratory study. Business Communication Quarterly, 58(1), 31–35.
Price, M., Handley, K., Millar, J., & O’Donovan, B. (2010). Feedback: All that effort, but what is the effect? Assessment & Evaluation in Higher Education, 35(3), 277–289.
Rodway-Dyer, S., Knight, J., & Dunne, E. (2011). A case study on audio feedback with geography undergraduates. Journal of Geography in Higher Education, 35(2), 217–231.
Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2001). Assessing social presence in asynchronous text-based computer conferencing. Journal of Distance Education, 14(2), 51–70.
Shea, P., Li, C. S., & Pickett, A. (2006). A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses. The Internet and Higher Education, 9(3), 175–190. doi:10.1016/j.iheduc.2006.06.005
Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. New York, NY: John Wiley & Sons.
Silva, M. L. (2012). Camtasia in the classroom: Student attitudes and preferences for video commentary or Microsoft Word comments during the revision process. Computers and Composition, 29(1), 1–22. doi:10.1016/j.compcom.2011.12.001
Song, L., Singleton, E., Hill, J., & Koh, M. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education, 7(1), 59–70.
Swan, K. (2002). Building learning communities in online courses: The importance of interaction. Education, Communication & Information, 2(1), 23–49.
Swan, K. P., Richardson, J. C., Ice, P., Garrison, D. R., Cleveland-innes, M., & Arbaugh, J. B. (2008). Validating a measurement tool of presence in online communities of inquiry. e-mentor, 2(24), 1–12.
Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks, 9(3), 115–136.
Thompson, R., & Lee, M. J. (2012). Talking with students through screencasting: Experimentations with video feedback to improve student learning. The Journal of Interactive Technology & Pedagogy, 1.
Tu, C., & McIsaac, M. (2002). The relationship of social presence and interaction in online classes. American Journal of Distance Education, 16(3), 131–150.
Wiener, M., & Mehrabian, A. (1968). Language within language: Immediacy, a channel in verbal communication. New York, NY: Appleton-Century-Crofts.
Wolcott, H. F. (1994). Transformative qualitative data: Description, analysis, and interpretation. Thousand Oaks, CA: Sage Publications.
Wolsey, T. D. (2008). Efficacy of instructor feedback on written work in an online program. International Journal on E-Learning, 7(2), 311–329.
Wood, K. A., Moskovitz, C., & Valiga, T. M. (2011). Audio feedback for student writing in online nursing courses: Exploring student and instructor reactions. The Journal of Nursing Education, 50(9), 540–543.
Yang, C., Tsai, I., Kim, B., Cho, M., & Laffey, J. (2006). Exploring the relationships between students’ academic motivation and social ability in online learning environments. The Internet and Higher Education, 9(4), 277–286.
The following survey items were obtained from the larger Social Ability Instrument (Yang et al., 2006) and were used to measure student perceptions of instructor social presence. Students responded to each item using a seven-point response scale (1=strongly disagree and 7=strongly agree).
1. My interactions with the instructor are sociable and friendly
2. I feel comfortable expressing my feelings to the instructor
3. My online interactions with the instructor seem personal
4. The actions of the instructor in the course are easily visible in our online system
5. In my interactions with the instructor I am able to be myself and show what kind of student I really am
6. I trust the instructor in the course to help me if I need it
7. When I log on I am usually interested in seeing what the instructor is doing or had done
8. I feel connected to the instructor in this course
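To illustrate how responses to the eight items above can be aggregated into a single instructor social presence score (the article reports group means of 5.77 and 5.62 on this 1-7 scale), the following is a minimal sketch using hypothetical responses. It assumes the common approach of averaging the item responses; it is not taken from the authors' analysis materials.

```python
# Minimal sketch (hypothetical data): score the eight-item instructor
# social presence scale as the mean of a student's 1-7 responses.
def social_presence_score(responses: list[int]) -> float:
    """Return the mean of eight Likert responses on a 1-7 scale."""
    assert len(responses) == 8 and all(1 <= r <= 7 for r in responses)
    return sum(responses) / len(responses)

# Example: one hypothetical student's responses to items 1-8
print(social_presence_score([6, 5, 6, 7, 5, 6, 6, 6]))  # 5.875
```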