Engaging and Measuring Student Knowledge With Video Content
Finding ways to engage students and measure their understanding while they watch video content
These days, students engage with more video content as part of their courses. It is important for instructors to know not only whether students are watching that content, but also how to deliver it in ways that engage students and how to measure how well students are learning from it. This document provides recommended solutions and approaches you can use as you deliver video content to students.
Length of Video Content
The first major barrier to getting students to watch video content, and to learn from it, is video length. Several studies have examined the ideal length of a video, drawing on factors such as cognitive load, attention span, learning outcomes, and student preferences. The research generally supports the finding of Danforth, Cullen, and Ma that students prefer video content limited to 4-6 minutes (Danforth, Cullen, & Ma, 2012). Guo, Kim, and Rubin found that video length is the most significant predictor of student engagement: engagement was highest between 1-3 minutes, the median engagement time was around 6 minutes, and engagement dropped off after 6 minutes. For instructor-present video, they found that engagement dropped off after 6-9 minutes, compared to 3-6 minutes for narrated presentations (Guo, Kim, & Rubin, 2014). Pi and Hong found that students who viewed video content of 4-6 minutes had the best learning scores after watching; they also found that mental fatigue begins at 10 minutes and becomes serious after 22 minutes (Pi & Hong, 2016). Guo, Kim, and Rubin also found that students watching an instructor-present video, in which the instructor was seated at a desk with a closer focus on their face, engaged twice as long (6-12 minutes) as they did with lecture capture video (3-6 minutes), with engagement in both formats deteriorating after 12 minutes (Guo, Kim, & Rubin, 2014).
Regardless of how engaging and compelling the content of a 50-minute video lecture is, you can assume that students will struggle to consume it, whatever their interest or desire to succeed in your course. The best approach is to break these videos into smaller 8-10 minute segments. Not only will this increase the likelihood that students watch them to completion, it will also focus learning on one concept at a time.
Instructors can view analytics on video content placed in Kaltura. This will provide a list of students who have viewed the video and their completion rate for the video. Be aware, however, that this data is not presented in real-time and may take a day to update. Analytics may also not be collected from students who are running privacy or ad-blocking software. It is not recommended that instructors use this information to assign points or prove whether students did or did not watch content.
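If you do export Kaltura viewing data for your own reference, a short script can summarize it. The sketch below assumes a hypothetical CSV export with `user_id`, `video_id`, and `completion_rate` columns; Kaltura's actual export format may differ, so adjust the field names to match your file.

```python
import csv
import io

# Hypothetical per-student analytics export; real Kaltura column names may differ.
SAMPLE_EXPORT = """\
user_id,video_id,completion_rate
alice,v101,100
bob,v101,45
carol,v101,80
"""

def summarize_completion(csv_text, threshold=90):
    """Return (average completion %, users below the completion threshold)."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    rates = [float(r["completion_rate"]) for r in rows]
    average = sum(rates) / len(rates)
    below = [r["user_id"] for r in rows if float(r["completion_rate"]) < threshold]
    return average, below

avg, below = summarize_completion(SAMPLE_EXPORT)
print(f"average completion: {avg:.0f}%")  # average completion: 75%
print(f"below 90%: {below}")              # below 90%: ['bob', 'carol']
```

As noted above, treat these numbers as a rough signal of engagement, not as evidence for grading.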
Measuring Student Knowledge Upon Viewing Video Content
While analytics can tell you how many students completed the viewing of a video, they cannot tell you how successfully the content facilitated learning, what content students understood, and what content students continue to struggle with. The following approaches can be used to develop a learning pathway that introduces content and provides opportunities for students to apply and measure their understanding of it, while at the same time providing you with valuable information you can use to facilitate their learning.
The first option is to place pauses in the video for students to stop and reflect on the material before moving on to the next topic. This can easily be done by adding a Kaltura Interactive Video Quiz to your lecture video. This feature should be used only for self-assessment of content, not as a formal measure of learning or to assign points for watching the video to completion. Interactive Video Quiz supports the following question types:
- Multiple Choice: Questions with only one correct answer. Questions have a 180-character limit; answers have a 140-character limit.
- True / False: Questions with a choice of True or False. Questions have a 450-character limit.
- Reflection Point: A question with no answer. The video pauses so you can point out specific items in the video and guide the viewer's attention. Reflection Points are ungraded and are not part of the quiz score. Questions have a 500-character limit.
- Open Question: The student types a free-text answer. Open Questions are ungraded and are not part of the quiz score. Questions have a 200-character limit; answers have a 270-character limit.
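If you draft quiz questions in advance (for example, in a spreadsheet), a small check against the character limits listed above can catch text that Kaltura would truncate or reject. This is a minimal sketch using only the limits stated in this document; the type names are illustrative, not Kaltura identifiers.

```python
# Character limits for Kaltura Interactive Video Quiz, as listed above.
# (question_limit, answer_limit); None means the type has no answer text limit.
LIMITS = {
    "multiple_choice": (180, 140),
    "true_false": (450, None),
    "reflection_point": (500, None),
    "open_question": (200, 270),
}

def check_question(qtype, question, answer=""):
    """Return a list of limit violations (an empty list means the text fits)."""
    q_limit, a_limit = LIMITS[qtype]
    problems = []
    if len(question) > q_limit:
        problems.append(
            f"question is {len(question) - q_limit} characters over the {q_limit}-character limit"
        )
    if a_limit is not None and len(answer) > a_limit:
        problems.append(
            f"answer is {len(answer) - a_limit} characters over the {a_limit}-character limit"
        )
    return problems

print(check_question("true_false", "Engagement drops after 6 minutes."))  # []
print(check_question("multiple_choice", "x" * 200, "ok"))  # one violation
```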
As the instructor, you can view this feedback to see how well students understand the content and what questions they may still have.
Consider creating a practice quiz that students complete after the video to measure their understanding of the content presented. Students do not receive a grade for practice quizzes, even though the quiz results display the number of points earned; the instructor can, however, see who has completed the quiz. Communicate to students that the quiz is how you will check their understanding of the content presented in the video. This is a more reliable way of confirming that students have watched the video than the available analytics. It also gives students an opportunity to immediately apply the content and quickly identify gaps and misconceptions in their understanding. To create a practice quiz, follow these steps:
- Select Quizzes from the course navigation.
- Select the + Quiz button.
- Under the Details tab, give your quiz a name, and provide instructions in the text box.
- Select Practice Quiz from the Quiz Type menu.
- Click on the Questions tab.
- Add the questions to your quiz. Select from the following question types: Multiple Choice, True/False, Fill In the Blank, Fill In Multiple Blanks, Multiple Answers, Multiple Dropdowns, Matching, Numerical Answer, Formula Question, Essay Question, File Upload Question, and Text (no question).
- When done, select Save & Publish.
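If you create many practice quizzes, the steps above can also be scripted against the Canvas REST API, which exposes quiz creation at `POST /api/v1/courses/:course_id/quizzes`. The sketch below only builds the request parameters; the domain, course ID, and token are placeholders you would replace with your institution's values.

```python
import json

# Placeholder values -- substitute your institution's Canvas domain,
# a valid API token, and the real course ID.
CANVAS_DOMAIN = "https://canvas.example.edu"
COURSE_ID = 1234

def practice_quiz_payload(title, description):
    """Build form parameters for POST /api/v1/courses/:course_id/quizzes."""
    return {
        "quiz[title]": title,
        "quiz[description]": description,
        # "practice_quiz" is one of the quiz_type values the Quizzes API accepts
        "quiz[quiz_type]": "practice_quiz",
        "quiz[published]": True,
    }

payload = practice_quiz_payload(
    "Video 3 check-in",
    "Practice questions on the concepts covered in the week 3 video.",
)
print(json.dumps(payload, indent=2))

# To actually create the quiz (requires the `requests` package and an API token):
# import requests
# requests.post(
#     f"{CANVAS_DOMAIN}/api/v1/courses/{COURSE_ID}/quizzes",
#     headers={"Authorization": "Bearer <API_TOKEN>"},
#     data=payload,
# )
```

Questions themselves would be added through a separate Quiz Questions endpoint or, as described above, through the Canvas interface.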
Instructors can view the results of practice quizzes in Canvas to get a picture of student comprehension of the content covered in the video.
For creating narrated presentations, the recommended tools are Microsoft PowerPoint for recording the narrated presentation, Kaltura for storing the content, and Canvas for delivering it to students. Kaltura Machine Captioning is also available to make recorded content accessible. Guides are available for creating narrated presentations on Mac and Windows.
Other In-context Video Solutions
Tools like Camtasia and Captivate make it easy to embed questions in a video or narrated presentation. Using a technology called SCORM, instructors can upload these modules into a Canvas course and present the content to students. While this approach is pedagogically sound, be aware that Canvas currently has issues processing the results of SCORM content, and instructors should not implement SCORM content designed to record results to the Grades tool within Canvas.
- Danforth, S., Cullen, R., and Ma, Y.J. "Evaluating Format Preferences and Effectiveness of Video Podcasts Related to Nutrition Education and Recipe Demonstrations." Journal of Nutrition and Dietetics (2012), 112, A19.
- Guo, Philip, Juho Kim, and Rob Rubin. "How Video Production Affects Student Engagement: An Empirical Study of MOOC Videos." Proceedings of the First ACM Conference on Learning@Scale (2014), pp. 41-50.
- Kizilcec, René, Kathryn Papadopoulos, and Lalida Sritanyaratana. "Showing Face in Video Instruction: Effects on Information Retention, Visual Attention, and Affect." Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (2014), pp. 2095-2102.
- Pi, Zhongling, and Jianzhong Hong. "Learning Process and Learning Outcomes of Video Podcasts Including the Instructor and PPT Slides: A Chinese Case." Innovations in Education and Teaching International (2016), 53(2), pp. 135-144.
- Sweat, Anthony, and Kenneth Alford. "Getting Started with Blended Learning Videos." Faculty Focus, June 2019.
- Wang, Jiahui, Pavlo Antonenko, and Kara Dawson. "Does Visual Attention to the Instructor in Online Video Affect Learning and Learning Perceptions? An Eye-Tracking Analysis." Computers and Education (2020), 146, 103779.
- Wilson, Kristen, et al. "Instructor Presence Effect: Liking Does Not Always Lead to Learning." Computers and Education (2018), 122, pp. 205-220.