Interactive Dance Guidance Using Example Motions
Keywords: Dance motion, dance training, example motion, interactive guidance, online lesson.
Background/Objectives: Dance movements are difficult to learn without taking an actual class. In this paper, an interactive dance-guidance system is proposed that teaches dance motions using examples.
Methods/Statistical analysis: In the proposed system, a set of example motions is captured from experts through a marker-free motion-capture method that uses multiple Kinect cameras. The captured motions are calibrated and optimally reconstructed into a motion database. For efficient exchange of motion data between a student and an instructor, a posture-based motion search and multi-mode views are provided for online lessons.
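The paper does not include code; as a rough illustration of what a posture-based motion search could look like, the sketch below compares a query posture (a flattened vector of joint positions) against every frame of every motion in a small database and returns the closest match. The function name, database layout, and distance metric are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def search_posture(db, query):
    """Find the motion and frame whose posture is closest to the query.

    db: dict mapping motion name -> array of shape (frames, features),
        where each row is a flattened joint-position vector.
    query: 1-D array of the same feature length.
    Returns (motion_name, frame_index) of the best match.
    """
    best_name, best_frame, best_dist = None, -1, np.inf
    for name, frames in db.items():
        # Euclidean distance between the query posture and every frame
        dists = np.linalg.norm(frames - query, axis=1)
        i = int(dists.argmin())
        if dists[i] < best_dist:
            best_name, best_frame, best_dist = name, i, float(dists[i])
    return best_name, best_frame
```

A student's browsing request would then map a captured posture to the nearest stored frame, from which playback of the matching example motion can start.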
Findings: To capture accurate example motions, the proposed system solves the joint-occlusion problem by using multiple Kinect cameras. An iterative closest point (ICP) method unifies the data from the multiple cameras into a single coordinate system, generating an output motion in real time. Compared with a commercial system, our system can capture various dance motions with an average accuracy of over 85%, as shown in the experimental results. Using touch-screen devices, a student can browse the database for a desired motion to start a dance practice and send his or her own motion to an instructor for feedback. By conducting online dance lessons in ballet, K-pop, and traditional Korean dance, our experiments show that the participating students improved their dance skills over the given period.
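To make the registration step concrete, here is a minimal sketch of point-to-point ICP: it alternates nearest-neighbour correspondence search with a least-squares rigid transform (Kabsch/SVD) until one point cloud is aligned to the other, which is how data from a second Kinect can be brought into the reference camera's coordinate system. This is a generic textbook ICP, not the authors' real-time implementation, and all names are illustrative.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iters=30):
    """Iteratively align point cloud src to dst; returns the aligned copy."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences for clarity
        dists = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[dists.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur
```

In a multi-camera setup this alignment would typically be computed once during calibration on overlapping joint or depth points, so that per-frame fusion at runtime is just an application of the fixed transform.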
Improvements/Applications: Our system is applicable to any student who wants to learn dance motions without taking an actual class and to receive online feedback from a distant instructor.