Developing an AR Lecture Recording System with Direct Manipulation of Virtual Slides by Physical Objects
Abstract
The increasing demand for lecture videos calls for support in producing them. Some lecturers still give conventional, physical-style presentations in the real world. However, videos that record slides physically displayed by a projector are blurrier than videos that record slides shown on a display monitor. We aim to develop a lecture recording system with a direct manipulation interface between virtual slides in augmented reality (AR) space and users in the real world. This research has three subgoals: sharp slide images in lecture videos, direct manipulation of virtual slides, and hand-writable virtual slides using ordinary physical objects such as whiteboard markers. First, we implemented an AR lecture recording system with an AR slide function, a direct manipulation interface for AR slides, and a handwriting function using whiteboard markers. Then, we conducted experiments on its performance and usability. As a result, our recording system runs at 30 fps or more. In the task-based user experiments, our system shortened task completion time by 17.85 seconds, and in the questionnaire, it improved the seven-point Likert scale score by 2.26 points. We conclude that our system is practical enough as a lecture recording system with a direct manipulation interface on AR slides. This paper describes the implementation and its evaluations.