TY - GEN
T1 - Interactive human action search using body language
AU - Lin, Yan Ching
AU - Chen, Hong Ming
AU - Hsieh, Yung Huan
AU - Hu, Min Chun
AU - Cheng, Wen Huang
PY - 2012/5/29
Y1 - 2012/5/29
N2 - Searching for human actions in a large video collection is a frequent demand in our daily lives. However, it is often not well supported by current multimedia technologies. For example, with traditional text-based search methods, it is not straightforward to give proper keywords as query input if users are uncertain about the textual or verbal descriptions of the actions they have in mind. According to sociological findings, the use of body language is arguably a more natural and direct way for people to express their conscious or subconscious thoughts in a nonverbal manner. Therefore, in this paper, we propose an interactive system for human action search in videos, characterized by enabling the user to issue a search query for an interesting human action by directly performing it. In contrast to machine-learning-based recognition systems, we address the problem of human action search with the approximate string matching (ASM) technique. As long as a user's actions can be matched with any sequence in the video database, they are said to be meaningful actions. The experiments demonstrate the effectiveness of our system in supporting the user's search task.
AB - Searching for human actions in a large video collection is a frequent demand in our daily lives. However, it is often not well supported by current multimedia technologies. For example, with traditional text-based search methods, it is not straightforward to give proper keywords as query input if users are uncertain about the textual or verbal descriptions of the actions they have in mind. According to sociological findings, the use of body language is arguably a more natural and direct way for people to express their conscious or subconscious thoughts in a nonverbal manner. Therefore, in this paper, we propose an interactive system for human action search in videos, characterized by enabling the user to issue a search query for an interesting human action by directly performing it. In contrast to machine-learning-based recognition systems, we address the problem of human action search with the approximate string matching (ASM) technique. As long as a user's actions can be matched with any sequence in the video database, they are said to be meaningful actions. The experiments demonstrate the effectiveness of our system in supporting the user's search task.
UR - http://www.scopus.com/inward/record.url?scp=84861436626&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84861436626&partnerID=8YFLogxK
U2 - 10.1109/WOCC.2012.6198141
DO - 10.1109/WOCC.2012.6198141
M3 - Conference contribution
AN - SCOPUS:84861436626
SN - 9781467309394
T3 - 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012
SP - 30
EP - 31
BT - 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012
T2 - 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012
Y2 - 19 April 2012 through 21 April 2012
ER -