POOT: An efficient object tracking strategy based on short-term optimistic predictions for face-structured sensor networks

Jenq-Muh Hsu, Chao-Chun Chen, Chia-Chi Li

Research output: Contribution to journal › Article › peer-review

19 Citations (Scopus)

Abstract

Advances in wireless sensor networks have enabled a great number of applications in areas such as biology, military operations, and environmental surveillance. Among these applications, object tracking systems are particularly useful and have been widely studied in recent years. In the design of a sensor network system, energy consumption is a critical consideration. In this paper, we propose a short-term Prediction-based Optimistic Object Tracking strategy (POOT) to reduce energy consumption and prolong the lifetime of sensor nodes while sacrificing only minimal tracking precision. Furthermore, we present two schemes, a Time-efficient Object Recovery Scheme (TORS) and a Communication-efficient Object Recovery Scheme (CORS), to improve object recovery. We also derive cost models for POOT. Experiments show that the proposed prediction-based optimistic tracking scheme reduces energy consumption by up to 23% compared with the related scheme, DOT. Meanwhile, the tracking accuracy of POOT remains above 97.5%, which shows that the optimistic design does not noticeably degrade tracking accuracy. Hence, POOT effectively conserves energy while achieving the objective of tracking moving objects.
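
The abstract only summarizes the approach at a high level; the sketch below is a rough, hypothetical illustration (in Python) of the general idea of prediction-based optimistic tracking with a recovery fallback: most nodes stay idle while a short-term prediction holds, and a more expensive recovery step is paid only on a miss. It is not the POOT, TORS, or CORS algorithms from the paper; the grid "face" model, the naive last-move predictor, and all function names are assumptions made purely for illustration.

    """Hypothetical sketch, NOT the paper's algorithm: count messaging cost of
    optimistic short-term prediction versus recovery broadcasts on a grid of
    sensor 'faces'. All modelling choices below are illustrative assumptions."""

    import random

    GRID = 10  # assumed 10x10 grid of sensor "faces"

    def neighbors(face):
        """4-connected neighboring faces of a grid cell (assumed topology)."""
        x, y = face
        cand = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        return [(a, b) for a, b in cand if 0 <= a < GRID and 0 <= b < GRID]

    def predict(history):
        """Naive short-term prediction: assume the object repeats its last move."""
        if len(history) < 2:
            return None
        (x0, y0), (x1, y1) = history[-2], history[-1]
        guess = (x1 + (x1 - x0), y1 + (y1 - y0))
        return guess if 0 <= guess[0] < GRID and 0 <= guess[1] < GRID else None

    def track(path):
        """One report per correct prediction (optimistic phase); a neighborhood
        broadcast (the 'recovery') whenever the prediction misses."""
        reports, recoveries = 0, 0
        history = []
        for face in path:
            if predict(history) == face:
                reports += 1                             # prediction held
            else:
                recoveries += 1 + len(neighbors(face))   # recovery broadcast
            history.append(face)
        return reports, recoveries

    if __name__ == "__main__":
        random.seed(1)
        pos, path = (5, 5), []          # random walk as a stand-in for the object
        for _ in range(200):
            pos = random.choice(neighbors(pos))
            path.append(pos)
        print(track(path))

The point of the sketch is only the trade-off the abstract claims: when short-term predictions usually hold, the cheap optimistic path dominates and recovery costs stay small, which is how energy can be saved without a large loss in tracking accuracy.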

Original language: English
Pages (from-to): 391-406
Number of pages: 16
Journal: Computers and Mathematics with Applications
Volume: 63
Issue number: 2
DOIs
Publication status: Published - Jan 2012

All Science Journal Classification (ASJC) codes

  • Modelling and Simulation
  • Computational Theory and Mathematics
  • Computational Mathematics
