TY - GEN
T1 - An Intuitive Human-Robot Interaction Method for Robotic Dance Choreography
AU - Wang, Mi-Chi
AU - Shen, Yang-Ting
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023
Y1 - 2023
N2 - Recently, with the rapid development at the intersection of technology and art, artists have presented various experimental works that combine the two, creating new modes of interaction between works and audiences. Performance art fields such as dance, in particular, have begun to experiment with robots. Controlling a traditional industrial robot arm requires professional programming skills and a great deal of time to script a series of dance movements. Moreover, these methods mainly control the position of the arm's end effector rather than its posture, a control logic that differs from how dance movements are conceived. To allow dancers without a technical background to choreograph a robot arm, this study develops an intuitive choreography method. It integrates skeleton recognition technology, using a camera to recognize the 3D coordinates of the human skeleton in real time and converting the angles between skeletal segments into the angles of the robot arm's six axes. Through the control logic of forward kinematics, control commands are sent to the robot arm in real time, allowing the dancer to intuitively control the arm's posture through their own body movements.
AB - Recently, with the rapid development at the intersection of technology and art, artists have presented various experimental works that combine the two, creating new modes of interaction between works and audiences. Performance art fields such as dance, in particular, have begun to experiment with robots. Controlling a traditional industrial robot arm requires professional programming skills and a great deal of time to script a series of dance movements. Moreover, these methods mainly control the position of the arm's end effector rather than its posture, a control logic that differs from how dance movements are conceived. To allow dancers without a technical background to choreograph a robot arm, this study develops an intuitive choreography method. It integrates skeleton recognition technology, using a camera to recognize the 3D coordinates of the human skeleton in real time and converting the angles between skeletal segments into the angles of the robot arm's six axes. Through the control logic of forward kinematics, control commands are sent to the robot arm in real time, allowing the dancer to intuitively control the arm's posture through their own body movements.
UR - http://www.scopus.com/inward/record.url?scp=85172993523&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85172993523&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-35602-5_18
DO - 10.1007/978-3-031-35602-5_18
M3 - Conference contribution
AN - SCOPUS:85172993523
SN - 9783031356018
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 248
EP - 257
BT - Human-Computer Interaction - Thematic Area, HCI 2023, Held as Part of the 25th HCI International Conference, HCII 2023, Proceedings
A2 - Kurosu, Masaaki
A2 - Hashizume, Ayako
PB - Springer Science and Business Media Deutschland GmbH
T2 - Thematic Area on Human Computer Interaction, HCI 2023, held as part of the 25th International Conference on Human-Computer Interaction, HCII 2023
Y2 - 23 July 2023 through 28 July 2023
ER -
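
Note: as a purely illustrative sketch of the angle-mapping idea summarized in the abstract (not the authors' implementation), the snippet below computes the angle at a human joint from three 3D skeleton keypoints and clamps it into a robot-axis command. The keypoint values, the 90-degree offset, and the joint limits are assumptions for illustration; streaming the command to a six-axis arm would go through the specific robot's SDK, which is not shown.

import numpy as np

def bone_angle(a, b, c):
    # Angle (radians) at joint b formed by 3D keypoints a-b-c.
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def to_axis_command(angle_rad, lo_deg=-90.0, hi_deg=90.0):
    # Map a human joint angle to a robot-axis angle in degrees, clamped to
    # hypothetical joint limits. The offset makes a straight limb map to 0 deg.
    deg = np.degrees(angle_rad) - 90.0
    return float(np.clip(deg, lo_deg, hi_deg))

# Hypothetical 3D skeleton keypoints (e.g. from a camera-based pose estimator).
shoulder, elbow, wrist = (0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.3, 0.3, 0.0)
axis3_deg = to_axis_command(bone_angle(shoulder, elbow, wrist))
print(f"axis 3 command: {axis3_deg:.1f} deg")  # would be sent to the arm via its vendor SDK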