Procedure Generation for Algorithm Learning System using Comment Synthesis and LSTM

  • Akiyoshi Takahashi, Okayama University of Science
  • Hiromitsu Shiina, Okayama University of Science
  • Ryunosuke Ito, Advanced Information Design
  • Nobuyuki Kobayashi, Sanyo Gakuen University
Keywords: Programming learning, Comment Generation, Summarization, Neural Machine Translation, Encoder–Decoder Translation Model


We have constructed a learning system that organizes procedures for learning program creation. However, manually creating such procedures for learning systems takes a significant amount of time. In this study, in addition to automatically generating program procedures using natural language processing, we generate new program content and procedures by learning program code and its comments with a long short-term memory (LSTM) deep learning model.
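The paper does not include its implementation, but the encoder–decoder translation model named in the keywords can be sketched as follows: an encoder LSTM reads a sequence of program-code token embeddings, and a decoder LSTM emits comment tokens from the resulting state. All names, sizes, and the greedy decoding loop below are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; W, U, b stack the four gate parameters."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])            # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    o = sigmoid(z[2 * H:3 * H])   # output gate
    g = np.tanh(z[3 * H:])        # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def init_lstm(in_dim, hid):
    """Small random weights for one LSTM layer (toy initialization)."""
    return (rng.normal(0, 0.1, (4 * hid, in_dim)),
            rng.normal(0, 0.1, (4 * hid, hid)),
            np.zeros(4 * hid))

EMB, HID, VOCAB = 16, 32, 50                 # toy sizes (assumptions)
embed = rng.normal(0, 0.1, (VOCAB, EMB))     # shared token embeddings
enc_W, enc_U, enc_b = init_lstm(EMB, HID)    # encoder over code tokens
dec_W, dec_U, dec_b = init_lstm(EMB, HID)    # decoder over comment tokens
proj = rng.normal(0, 0.1, (VOCAB, HID))      # hidden state -> vocab logits

def generate_comment(code_tokens, max_len=8, bos=0):
    # Encoder: fold the code token sequence into a final (h, c) state.
    h, c = np.zeros(HID), np.zeros(HID)
    for t in code_tokens:
        h, c = lstm_step(embed[t], h, c, enc_W, enc_U, enc_b)
    # Decoder: greedily emit comment tokens starting from the encoder state.
    out, tok = [], bos
    for _ in range(max_len):
        h, c = lstm_step(embed[tok], h, c, dec_W, dec_U, dec_b)
        tok = int(np.argmax(proj @ h))
        out.append(tok)
    return out

comment = generate_comment([3, 7, 7, 21, 4])
```

With trained weights (e.g. by cross-entropy over aligned code/comment pairs), the same forward pass would produce comment text; here the untrained weights only demonstrate the data flow of the sequence-to-sequence architecture.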

