Human action recognition based on graph-embedded spatio-temporal subspace

Chien Chung Tseng, Ju Chin Chen, Ching Hsien Fang, Jenn Jier James Lien

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)


Human action recognition is an important issue in the pattern recognition field, with applications ranging from remote surveillance to the indexing of commercial video content. However, human actions are characterized by non-linear dynamics and are therefore not easily learned and recognized. Accordingly, this study proposes a silhouette-based human action recognition system in which a three-step procedure is used to construct an efficient discriminant spatio-temporal subspace for k-NN classification purposes. In the first step, an Adaptive Locality Preserving Projection (ALPP) method is proposed to obtain a low-dimensional spatial subspace in which the linearity in the local data structure is preserved. To resolve the problem of overlaps in the spatial subspace resulting from the ambiguity of the human body shape among different action classes, temporal data are extracted using a Non-base Central-Difference Action Vector (NCDAV) method. Finally, the Large Margin Nearest Neighbor (LMNN) metric learning method is applied to construct an efficient spatio-temporal subspace for classification purposes. The experimental results show that the proposed system accurately recognizes a variety of human actions in real time and outperforms most existing methods. In addition, a robustness test with noisy data indicates that our system is remarkably robust toward noise in the input images.
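The three-step pipeline above can be sketched in simplified form. The snippet below implements standard (non-adaptive) Locality Preserving Projection followed by k-NN classification in the projected subspace; it is a minimal stand-in for the paper's method — the ALPP adaptivity, the NCDAV temporal features, and the LMNN metric-learning step are omitted, and every function name and parameter here is illustrative, not taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh


def lpp_fit(X, n_components=2, k=5, t=10.0):
    """Standard LPP: find a projection that keeps nearby samples nearby.

    X: (n_samples, n_features). Returns a (n_features, n_components) matrix.
    """
    n, d = X.shape
    # Pairwise squared Euclidean distances.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-NN adjacency graph with heat-kernel weights.
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(sq[i])[1:k + 1]   # skip self (index 0)
        W[i, idx] = np.exp(-sq[i, idx] / t)
    W = np.maximum(W, W.T)                  # symmetrize
    D = np.diag(W.sum(axis=1))
    L = D - W                               # graph Laplacian
    # Generalized eigenproblem  X^T L X a = lambda X^T D X a;
    # the smallest eigenvalues give the locality-preserving directions.
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(d)      # small ridge keeps B positive definite
    _, vecs = eigh(A, B)                    # eigenvalues in ascending order
    return vecs[:, :n_components]


def knn_predict(P, X_train, y_train, X_test, k=3):
    """Majority-vote k-NN in the projected subspace."""
    Ztr, Zte = X_train @ P, X_test @ P
    dist = ((Zte[:, None, :] - Ztr[None, :, :]) ** 2).sum(-1)
    preds = []
    for row in dist:
        nn = np.argsort(row)[:k]
        labels, counts = np.unique(y_train[nn], return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)


# Toy demo: two well-separated synthetic "action" classes in 6-D feature space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.5, (30, 6)), rng.normal(3.0, 0.5, (30, 6))])
y = np.array([0] * 30 + [1] * 30)
P = lpp_fit(X, n_components=2)
preds = knn_predict(P, X, y, X)
```

In the real system the input vectors would be silhouette descriptors per frame, the neighborhood graph would be built adaptively (ALPP), and LMNN would replace plain Euclidean distance before the k-NN vote.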

Original language: English
Pages (from-to): 3611-3624
Number of pages: 14
Journal: Pattern Recognition
Issue number: 10
Publication status: Published - Oct 2012

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

