Interactive human action search using body language

Yan Ching Lin, Hong Ming Chen, Yung Huan Hsieh, Min-Chun Hu, Wen Huang Cheng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Searching for human actions in a large video collection is a frequent demand in our daily lives, yet it is often not well supported by current multimedia technologies. For example, with traditional text-based search methods, it is not straightforward to formulate proper keywords as the query if users are uncertain about the textual or verbal descriptions of the actions they have in mind. According to sociological findings, body language is arguably a more natural and direct way for people to express their conscious or subconscious thoughts nonverbally. Therefore, in this paper, we propose an interactive system for human action search in videos, characterized by enabling the user to issue a search query for an action of interest by directly performing it. In contrast to machine-learning-based recognition systems, we address the problem of human action search with the approximate string matching (ASM) technique. As long as a user's actions can be matched with any sequence in the video database, they are said to be meaningful actions. The experiments demonstrate the effectiveness of our system in supporting the user's search task.
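The abstract names approximate string matching (ASM) as the retrieval mechanism. As a rough illustration only — the pose codebook, the sequences, and the distance threshold below are invented for the sketch, not taken from the paper — videos quantized into strings of pose codewords can be compared against a performed query by edit distance:

```python
# Sketch of ASM-style action search: each video is assumed to be
# pre-quantized into a string of pose codewords, and the user's
# performed query is quantized the same way.

def edit_distance(query, target):
    """Classic Levenshtein distance between two symbol sequences."""
    m, n = len(query), len(target)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if query[i - 1] == target[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[m][n]

def best_match(query, database, max_dist):
    """Return the name of the closest database sequence if it lies
    within max_dist edits of the query; otherwise None (the query is
    then not a 'meaningful action' in the abstract's sense)."""
    scored = [(edit_distance(query, seq), name)
              for name, seq in database.items()]
    dist, name = min(scored)
    return name if dist <= max_dist else None

# Hypothetical codeword strings for three indexed action clips.
database = {
    "wave":  "ABABAB",
    "squat": "CDDC",
    "jump":  "EFFE",
}
print(best_match("ABBBAB", database, max_dist=2))  # → wave
```

A practical system would match the query against substrings of long videos (initializing the first row of `dp` to zero) rather than whole sequences, but the tolerance to insertions, deletions, and substitutions shown here is what distinguishes ASM from exact matching.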

Original language: English
Title of host publication: 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012
Pages: 30-31
Number of pages: 2
DOI: 10.1109/WOCC.2012.6198141
Publication status: Published - 2012 May 29
Event: 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012 - Kaohsiung, Taiwan
Duration: 2012 Apr 19 – 2012 Apr 21

Publication series

Name: 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012

Other

Other: 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012
Country: Taiwan
City: Kaohsiung
Period: 12-04-19 – 12-04-21


All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications

Cite this

Lin, Y. C., Chen, H. M., Hsieh, Y. H., Hu, M-C., & Cheng, W. H. (2012). Interactive human action search using body language. In 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012 (pp. 30-31). [6198141] (2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012). https://doi.org/10.1109/WOCC.2012.6198141
@inproceedings{f45e173355914342a6f715852e768069,
title = "Interactive human action search using body language",
abstract = "Searching for human actions in a large video collection is a frequent demand in our daily lives, yet it is often not well supported by current multimedia technologies. For example, with traditional text-based search methods, it is not straightforward to formulate proper keywords as the query if users are uncertain about the textual or verbal descriptions of the actions they have in mind. According to sociological findings, body language is arguably a more natural and direct way for people to express their conscious or subconscious thoughts nonverbally. Therefore, in this paper, we propose an interactive system for human action search in videos, characterized by enabling the user to issue a search query for an action of interest by directly performing it. In contrast to machine-learning-based recognition systems, we address the problem of human action search with the approximate string matching (ASM) technique. As long as a user's actions can be matched with any sequence in the video database, they are said to be meaningful actions. The experiments demonstrate the effectiveness of our system in supporting the user's search task.",
author = "Lin, {Yan Ching} and Chen, {Hong Ming} and Hsieh, {Yung Huan} and Min-Chun Hu and Cheng, {Wen Huang}",
year = "2012",
month = "5",
day = "29",
doi = "10.1109/WOCC.2012.6198141",
language = "English",
isbn = "9781467309394",
series = "2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012",
pages = "30--31",
booktitle = "2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012",

}


TY - GEN
T1 - Interactive human action search using body language
AU - Lin, Yan Ching
AU - Chen, Hong Ming
AU - Hsieh, Yung Huan
AU - Hu, Min-Chun
AU - Cheng, Wen Huang
PY - 2012/5/29
Y1 - 2012/5/29
N2 - Searching for human actions in a large video collection is a frequent demand in our daily lives, yet it is often not well supported by current multimedia technologies. For example, with traditional text-based search methods, it is not straightforward to formulate proper keywords as the query if users are uncertain about the textual or verbal descriptions of the actions they have in mind. According to sociological findings, body language is arguably a more natural and direct way for people to express their conscious or subconscious thoughts nonverbally. Therefore, in this paper, we propose an interactive system for human action search in videos, characterized by enabling the user to issue a search query for an action of interest by directly performing it. In contrast to machine-learning-based recognition systems, we address the problem of human action search with the approximate string matching (ASM) technique. As long as a user's actions can be matched with any sequence in the video database, they are said to be meaningful actions. The experiments demonstrate the effectiveness of our system in supporting the user's search task.
UR - http://www.scopus.com/inward/record.url?scp=84861436626&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84861436626&partnerID=8YFLogxK
U2 - 10.1109/WOCC.2012.6198141
DO - 10.1109/WOCC.2012.6198141
M3 - Conference contribution
SN - 9781467309394
T3 - 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012
SP - 30
EP - 31
BT - 2012 21st Annual Wireless and Optical Communications Conference, WOCC 2012
ER -
