Two Approaches to Supporting Improvisational Ensemble for Music Beginners based on Body Motion Tracking

  • Shugo Ichinose, Nagoya Institute of Technology
  • Souta Mizuno, Nagoya Institute of Technology
  • Shun Shiramatsu, Nagoya Institute of Technology
  • Tetsuro Kitahara, Nihon University
Keywords: Body motion, improvisational ensemble, pitch contour, motion sensor, smartphone sensor

Abstract

Melody recognition consists of three cognitive elements: pitch contour, rhythm, and tonality. Pitch contour and rhythm can be represented relatively easily by body motion, whereas tonality is difficult for music beginners to understand and to represent. In this paper, we present two approaches to supporting improvisational ensembles for music beginners on the basis of body motion tracking: one uses a 3D motion capture camera, and the other uses the sensors built into a smartphone. With either system, users can participate in an improvisational ensemble by making hand movements that express pitch contour and rhythm, without having to consider tonality, because the generated pitch is automatically adjusted to one that is consonant with the chords of the background tune. To deal with the delay and recognition errors that arise when gestures are tracked with a 3D motion capture camera, we improved the gesture recognition method; the experimental results show that the improved method reduces the delay compared with the conventional one. We also implemented motion tracking based on smartphone sensors, and the experimental results reveal the difficulties of this approach. Finally, we discuss perspectives on the social reuse of improvisational melody data shared as open data.
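
The pitch-adjustment idea can be illustrated with a small sketch. The snippet below is a minimal Python illustration, not code from the systems described here; the function names, the MIDI pitch range, and the hand-height normalization are assumptions made for the example. It maps a tracked hand height to a raw pitch and then snaps it to the nearest tone of the current backing chord, so the player controls only contour and rhythm while consonance with the chord is handled automatically.

    # Minimal illustrative sketch (not the authors' implementation):
    # map a normalized hand height to a raw MIDI pitch, then snap it to the
    # nearest tone of the current backing chord so the output is always
    # consonant with that chord.

    def height_to_pitch(hand_height, low=48, high=84):
        """Map a normalized hand height in [0, 1] to a raw MIDI pitch (C3-C6)."""
        hand_height = min(max(hand_height, 0.0), 1.0)
        return low + hand_height * (high - low)

    def snap_to_chord(raw_pitch, chord_pitch_classes):
        """Return the MIDI note nearest to raw_pitch whose pitch class is in the chord."""
        candidates = [n for n in range(128) if n % 12 in chord_pitch_classes]
        return min(candidates, key=lambda n: abs(n - raw_pitch))

    # Example: the backing tune is currently on a C major chord (pitch classes C, E, G).
    c_major = {0, 4, 7}
    raw = height_to_pitch(0.62)            # hand held a little above the middle of its range
    print(snap_to_chord(raw, c_major))     # -> 72 (C5), the chord tone nearest to the raw pitch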


Published
2019-05-31