Does increasing the amount of time students spend viewing faculty research videos and playing course-specific games positively impact course grades?
Yes, it does, in both online and face-to-face courses and regardless of student GPAs.
These questions were the basis for a paper on the effects of faculty-research videos and games in an upper-division empirical-methods course. We expected that a media-rich environment would increase student engagement and that this increased student engagement would enhance performance and increase student satisfaction with the course.
Working with ASU Online and the College of Liberal Arts and Sciences, Political Science faculty created 10- to 20-minute videos in which they presented a research project; they explained what they found, how they found it, and why they asked the question.
The videos are part of a repository and, although the impetus for the project was the growing demand for research methods online, the videos have been used in online, face-to-face and blended courses.
We assumed that the video examples of real research projects, tied to theory, would add depth and breadth to the course. In turn, this would:
- make research methods more interesting;
- increase student engagement;
- improve the quality and quantity of student interactions with faculty;
- expose online students to a wider range of faculty than their coursework typically might offer, thus providing them with some of the benefits accrued by traditional students;
- and raise course grades.
These expectations are consistent with the media-integration model, which dominates the theoretical literature and research on technology inclusion. Earlier research found that streaming video and audio explanations are better than textbooks at explaining complex concepts.
We found that online students spent 21 percent of their online time viewing the faculty-research videos. The correlation between video-viewing time and course grade (49 percent) was essentially the same as the correlation between course grade and GPA (50 percent). We expected the relationship between GPA and course grade to be strong but were surprised to find that the relationship between video viewing and course grade was equally strong. The correlation between viewing the research videos and GPA was 24 percent. When we controlled for GPA, the partial correlation between faculty-video viewing and course grade was 73 percent, considerably higher than the zero-order correlation between course grade and video viewing. GPA and faculty-research videos had independent effects on course grade; they did not substitute for one another. Moreover, within “GPA groups,” video viewing had a strong, positive influence on course grades.
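For readers unfamiliar with the statistic, a first-order partial correlation can be computed directly from the three pairwise (zero-order) correlations. The sketch below shows the standard formula; the numbers in the example are illustrative, not the study’s data.

```python
import math

def partial_corr(r_xy, r_xz, r_yz):
    """Partial correlation between x and y, controlling for z.

    Computed from the three zero-order correlations via the
    standard first-order partial-correlation formula.
    """
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Illustrative values only (not the study's data):
# x = video-viewing time, y = course grade, z = GPA
print(round(partial_corr(0.6, 0.3, 0.3), 3))  # → 0.56
```

The formula removes the portion of the x–y relationship that is shared with the control variable z, which is how “controlling for GPA” isolates the independent contribution of video viewing.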
Anecdotal evidence is consistent with these findings. Students wrote that “the videos were fantastic and…fun to watch,” that “as an online student, the videos are essential to making me feel a connection to the faculty,” and that “(the videos) were indeed informative.”
Another wrote that the videos “helped make the abstract concepts a bit more concrete and easier to grasp. I think the videos were…a good addition to the course.”
Another student identified a benefit beyond the scope of the project when s/he wrote, “I quite enjoyed the videos and found them useful not only as a tool to explain course concepts and research examples, but also – and probably more importantly – as an opportunity to sort of ‘sample’ other professors. The videos provided a kind of mini course from a few professors, which helped in selecting classes for the following semesters.”
Our project had one more element. The spring 2013 online section added games associated with each week’s course material. These were drill-and-practice games emphasizing memory, repetition, and retention; Quizlet, the tool we used to create them, offered students variations on the drill-and-practice theme.
The data regarding the impact of these tools are unequivocal: students who used the drill-and-practice online vocabulary games earned higher quiz and course grades than students who opted not to use them.
Additionally, the inclusion of “mini-games” or “learning objects” in the course, while not measured, appeared anecdotally to increase student satisfaction.
On an anonymous course evaluation, one student wrote, “this instructor went out of her way to make it easy for distance-learning students. She…tied the different elements of the course together… (making) learning much easier (textbook, homework, quizzes, discussions and games).” Another wrote, “The many sources of media…brought diversity to the course material.”
Our project was a success. Many faculty who made research videos reported that students referred to the videos in conversations. This was an unexpected consequence but one that was entirely consistent with the project’s goal of making students more comfortable with research.
The videos helped students become reviewers and evaluators of research instead of passive consumers; the games made them comfortable with the language. Course material often seen as dry became engaging, and as students came to appreciate it, teaching became easier and more rewarding.
By Marilyn Dantico, PhD, Gina Woodall, PhD, and Tahnja Wilson, MIM/MBA