Estimation of Test Scores based on Questionnaire and Video Viewing Behavior in the Programming MOOC Course

  • Masako Furukawa, National Institute of Informatics
  • Hiroshi Itsumura, University of Tsukuba
  • Kazutsuna Yamaji, National Institute of Informatics
Keywords: MOOC, Learning Analytics, Test Score, Video Viewing Behavior, Multiple Regression Analysis


Massive Open Online Courses (MOOCs) offer various types of learners the opportunity to attend university-level lectures. However, because many learners drop out during the learning process, the average MOOC completion rate is as low as about 10%. To improve this, MOOC providers must identify learners' characteristics at an early stage and provide appropriate support to each learner. This paper investigates the relationship between learners' characteristics and test scores in a programming MOOC course in order to recognize different types of learners. Video viewing behavior and questionnaire information collected at the beginning of the course, i.e., age, programming skill, and keywords in free-text responses, are analyzed to characterize learners. The results show that repeated viewing behavior is associated with higher scores, whereas joining the course late is associated with lower scores. Adding the questionnaire information improves the accuracy of pass/fail estimation before the third week. Multiple regression analysis also identified a characteristic cluster of learners who could be rescued by offering appropriate support.
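The estimation approach described above can be illustrated with a minimal sketch: fit a multiple regression of test scores on behavioral and questionnaire features, then threshold the predicted score for pass/fail classification. The feature names (`repeat_ratio`, `join_week`, `skill`), the synthetic data, and the 60-point pass threshold are illustrative assumptions, not the paper's actual variables or data.

```python
# Hedged sketch of score estimation by multiple regression.
# All feature names, coefficients, and the pass threshold are
# hypothetical; they only mirror the reported trends (more
# repetition -> higher score, later join -> lower score).
import numpy as np

rng = np.random.default_rng(0)
n = 200
repeat_ratio = rng.uniform(0, 3, n)   # how often videos are re-watched
join_week = rng.integers(1, 5, n)     # week in which the learner joined
skill = rng.integers(0, 4, n)         # self-reported programming skill

# Synthetic scores consistent with the paper's qualitative findings.
score = 50 + 10 * repeat_ratio - 5 * join_week + 4 * skill \
        + rng.normal(0, 5, n)

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), repeat_ratio, join_week, skill])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
pred = X @ coef

# Pass/fail estimation with an illustrative 60-point threshold.
accuracy = np.mean((pred >= 60) == (score >= 60))
print(coef.round(1), round(accuracy, 2))
```

On this synthetic data the fitted coefficients recover the assumed signs (positive for repeated viewing, negative for late joining), which is the pattern the paper reports for real learners.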



