Since the outbreak of COVID-19, online education has become common practice for educational practitioners, and teachers keep exploring effective instructional designs for their online courses. We therefore see value in sharing our experience of designing and running a Massive Open Online Course (MOOC). In 2014, Hanze University of Applied Sciences, the Netherlands, began designing a MOOC on introductory economics for learners who wanted to apply the knowledge directly in practice. It was published on canvas.net in 2015 and ran multiple times through 2018. In 2017, the project received funding from the Dutch Ministry of Education, Culture and Science (OCW) to improve the course design and build a community of online learners. Because of the transformation of canvas.net, we moved the whole MOOC to our university’s website as a self-paced open online course. Below are our recommendations for online course designers, based on data from our MOOC learners and our experience of designing the course.
- Online instructional videos:
a) Length of videos: Previous research has shown that shorter videos are much more effective than long ones (Guo, et al., 2014). Ozan and Ozarslan (2016) analyzed 18,144 video-watching events from 2,927 learners and found that 58% of learners preferred to watch short videos (<10 min). From our MOOC, we have learned that there is no fixed rule for video length; it depends on the content and the way the knowledge is presented. Our course covers introductory economics: some parts are quite theoretical, while others involve examples of applying knowledge in a business context. For the theoretical parts, we found three-minute videos worked best, so most of our theoretical instruction is chopped into several two- or three-minute videos. Some relatively long videos (>7 min), in which we invited real business people to share their experiences, still attracted high click rates.
b) Presentation of videos: Some theoretical content is difficult to divide into multiple short videos. We used to have a 4.33-minute video about the labor market, after which around a third of learners stopped clicking on the remaining videos, perhaps because the topic was too theoretical. In preparation for publishing the MOOC a second time, we revised the slides and inserted animations and illustrations to match the oral presentation; see the example in Figure 1. After reviewing the other videos, we animated almost all theoretical content. The drop-out rate decreased significantly in the second round of the MOOC.
c) Instructor’s display in the video: According to Guo, et al. (2014), showing the instructor on screen at opportune moments can engage learners well. To avoid distracting learners, the instructor and guest lecturers of our MOOC appeared on only three occasions:
- At the beginning of one topic to introduce the learning outcomes and show expectations.
- In the videos to introduce a case study of applying the knowledge. Figure 2 is an example of one case study with our guest lecturer.
- At the end of each topic to wrap up the content and mention that he looked forward to receiving the assignments.
- Communication with learners:
a) Discussion Forum: In an online-only learning setting, the discussion forum is an essential tool for facilitating learners’ communication (Boroujeni, Hecking, Hoppe, and Dillenbourg, 2017). Recognizing both the importance of the forum and the multicultural background of our learners, we recruited five to seven second-year students from our school as teaching assistants. Their task was to respond to all messages posted on the Discussion Forum within 24 hours. Questions beyond their knowledge level were forwarded to the instructor, who then responded on the forum within 48 hours.
Furthermore, earlier studies on MOOC design suggested that instructors insert probing questions into the forum at critical moments to elicit in-depth discussion among students (Yang, et al., 2015). Mr. Jager posted two open questions in the Discussion Forum in the second and third weeks, respectively, and these questions consistently drew rich responses.
b) Regular email communication: Before each topic started and after it ended, we sent emails to the registered learners. Our analysis of drop-out data showed that, for this four-week MOOC, Week 3 was the crucial point: learners who did not watch the opening video of the third week tended to quit the course. Therefore, in our fifth round of the MOOC, we sent one more email by the end of the second week, summarizing the learning outcomes of the first two weeks and encouraging learners to keep going.
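The "watched the Week 3 opening video" signal described above can be turned into a simple reminder list. The sketch below is purely illustrative: the record fields and email addresses are hypothetical, not the actual export format of our course platform.

```python
# Hypothetical learner activity records; the field names are illustrative,
# not the actual export format of the course platform.
learners = [
    {"email": "a@example.com", "watched_week3_opening": True},
    {"email": "b@example.com", "watched_week3_opening": False},
    {"email": "c@example.com", "watched_week3_opening": False},
]

def needs_reminder(records):
    """Return the learners who have not yet opened the Week 3 introduction video."""
    return [r["email"] for r in records if not r["watched_week3_opening"]]

# Learners on this list would receive the extra end-of-Week-2 email.
reminder_list = needs_reminder(learners)
```

The same filter could of course be run on any other engagement signal (quiz attempts, forum posts) once drop-out analysis shows which one predicts quitting.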
- Assessment:
a) Weekly Quiz: Weekly quizzes measure learners’ knowledge gain and can be marked automatically and promptly. In our MOOC, each weekly quiz consisted of 15 multiple-choice questions that closely matched the week’s learning topic. We set no time limit, and learners could retake each quiz an unlimited number of times; over 95% of learners accessed the quizzes at least twice. We also treated the weekly quizzes as an indicator of continued study: learners who failed the Week 2 quiz received a personalized email encouraging them to try again. Unfortunately, we did not collect data to measure whether such communication improved the return rate.
b) Peer Assessment: Both weekly quizzes and peer assessments are significantly related to students’ final learning achievement (Admiraal, et al., 2015). However, we encountered difficulties incorporating peer assessment into our MOOC. In the first round, we asked learners to undergo peer assessment for a report assignment that accounted for 75% of the final grade. Learners were required to submit their assignments by the end of Week 3, and all registered learners were randomly paired for peer assessment. A well-structured rubric instructed learners on how to assess a peer’s assignment. We soon realized this was a mistake, because more than 80% of the registered learners never submitted their assignments. In the second round of the MOOC, we randomly paired only those learners who had submitted assignments. Still, problems remained: quite a few learners could not provide valid or accessible feedback, and the peer assessment was not successful. We had to recruit senior students from the second and fourth years of our bachelor’s program to assess these assignments. In the third round of the MOOC, we assigned the role of peer assessor to these senior students.
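The second-round fix, pairing only learners who actually submitted, can be sketched as follows. This is a minimal illustration, not our platform’s actual matching logic: a shuffled ring (learner i reviews learner i+1) guarantees that every submitter both gives and receives exactly one review, with no self-reviews and no leftover learner when the number of submitters is odd.

```python
import random

def assign_reviewers(submitters, seed=None):
    """Assign each submitter exactly one peer to review by walking a
    shuffled ring: the learner at position i reviews the learner at
    position i + 1 (wrapping around at the end). Everyone gives and
    receives exactly one review, and nobody reviews themselves,
    provided there are at least two submitters."""
    rng = random.Random(seed)  # fixed seed makes the pairing reproducible
    order = list(submitters)
    rng.shuffle(order)
    return {order[i]: order[(i + 1) % len(order)] for i in range(len(order))}
```

Compared with simple random pairing, the ring also sidesteps the odd-count case, which would otherwise leave one submitter without a partner.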
Admiraal, W., Huisman, B., & Pilli, O. (2015). Assessment in massive open online courses. The Electronic Journal of e-Learning, 13(4), 207-216.
Boroujeni, M. S., Hecking, T., Hoppe, H. U., & Dillenbourg, P. (2017). Dynamics of MOOC discussion forums. In Proceedings of the Seventh International Learning Analytics and Knowledge Conference (LAK ’17) (pp. 128-137). DOI: 10.1145/3027385.3027391
Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the First ACM Conference on Learning @ Scale (pp. 41-50). New York, NY: Association for Computing Machinery.
Ozan, O., & Ozarslan, Y. (2016). Video lecture watching behaviors of learners in online courses. Educational Media International, 53(1), 27-41. DOI: 10.1080/09523987.2016.1189255
Yang, D., Wen, M., Howley, I., Kraut, R., & Rose, C. (2015). Exploring the effect of confusion in discussion forums of massive open online courses. In Proceedings of the Second ACM Conference on Learning @ Scale (pp. 121-130).
About Ning Ding
PhD, University of Applied Sciences Groningen, the Netherlands (Coordinator of Cloud 6: Learning in Digital Era: Technology Enhanced Learning)