Personal gesture-driven virtual walk-through systems

Ling Erl Cheng, Hung Ming Wang, Jun Ren Ding, Ji Kun Lin, Zhi Wei Zhang, Jar Ferr Yang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

In this paper, we propose the use of body gestures, instead of a mouse and keyboard, as an interface for browsing exhibited multimedia video contents. For real applications, we have defined several control modes, based on natural human movements, to construct the interactivity commands. These commands, each triggered by a body gesture, can be used to control a virtual reality system and enhance users' exhibition experiences. In our demo system, the body gestures offer a more natural and intuitive interface with which users can easily walk through a virtual environment created from specific oil paintings rendered by graphics engines. The proposed system helps viewers obtain more information about those paintings and thus enhances their exploration experience when touring multimedia contents.
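As a rough illustration of the gesture-to-command mapping the abstract describes, a minimal sketch is given below. The gesture features, thresholds, and command set are hypothetical assumptions for illustration only; they are not the control modes defined in the paper.

```python
# Hypothetical sketch: mapping detected body gestures to walk-through
# navigation commands. Gesture features, thresholds, and the command set
# are assumptions for illustration, not the authors' actual control modes.
from dataclasses import dataclass
from enum import Enum, auto


class Command(Enum):
    MOVE_FORWARD = auto()
    MOVE_BACKWARD = auto()
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    STOP = auto()


@dataclass
class Gesture:
    """Simplified gesture descriptor, e.g. from a body-tracking front end."""
    lean: float       # forward/backward lean of the torso, in degrees
    arm_angle: float  # lateral angle of the raised arm, in degrees


def gesture_to_command(g: Gesture) -> Command:
    """Map a natural body movement to a navigation command."""
    if g.lean > 10.0:
        return Command.MOVE_FORWARD
    if g.lean < -10.0:
        return Command.MOVE_BACKWARD
    if g.arm_angle > 30.0:
        return Command.TURN_RIGHT
    if g.arm_angle < -30.0:
        return Command.TURN_LEFT
    return Command.STOP


if __name__ == "__main__":
    # Leaning forward translates into walking forward in the virtual scene.
    print(gesture_to_command(Gesture(lean=15.0, arm_angle=0.0)))
```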

Original language: English
Title of host publication: TENCON 2007 - 2007 IEEE Region 10 Conference
DOIs
Publication status: Published - 2007
Event: IEEE Region 10 Conference, TENCON 2007 - Taipei, Taiwan
Duration: 2007 Oct 30 - 2007 Nov 2

Publication series

Name: IEEE Region 10 Annual International Conference, Proceedings/TENCON

Other

Other: IEEE Region 10 Conference, TENCON 2007
Country/Territory: Taiwan
City: Taipei
Period: 07-10-30 - 07-11-02

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Electrical and Electronic Engineering

