Motion-Aware Temporal Coherence for Video Resizing

Yu-Shuen Wang, Tong-Yee Lee, Hongbo Fu, Olga Sorkine, Hans-Peter Seidel

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

Temporal coherence is crucial in content-aware video retargeting. To date, this problem has been addressed by constraining temporally adjacent pixels to be transformed coherently. However, due to the motion-oblivious nature of this simple constraint, the retargeted videos often exhibit flickering or waving artifacts, especially when significant camera or object motions are involved. Since the feature correspondence across frames varies spatially with both camera and object motion, motion-aware treatment of features is required for video resizing. This motivated us to align consecutive frames by estimating interframe camera motion and to constrain relative positions in the aligned frames. To preserve object motion, we detect distinct moving areas of objects across multiple frames and constrain each of them to be resized consistently. We build a complete video resizing framework by incorporating our motion-aware constraints with an adaptation of the scale-and-stretch optimization recently proposed by Wang and colleagues. Our streaming implementation of the framework allows efficient resizing of long video sequences with low memory cost. Experiments demonstrate that our method produces spatiotemporally coherent retargeting results even for challenging examples with complex camera and object motion, which are difficult to handle with previous techniques.
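To make the alignment idea concrete, the sketch below is a minimal, illustrative example (not the authors' implementation) of estimating interframe camera motion as a homography from feature matches between two consecutive frames, using OpenCV; the frame variables and parameter values are assumptions for the sake of the example.

```python
import cv2
import numpy as np

def estimate_interframe_motion(prev_gray, curr_gray):
    """Estimate camera motion between consecutive grayscale frames as a
    homography, using ORB features and RANSAC (illustrative sketch)."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return np.eye(3)  # no features found: fall back to identity (static camera)

    # Brute-force Hamming matching with cross-checking for binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < 4:
        return np.eye(3)  # too few matches to fit a homography

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC discards matches on independently moving objects, so the
    # resulting homography mostly reflects the camera motion.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
    return H if H is not None else np.eye(3)

# Usage (assumed variables): warp the previous frame into the current frame's
# coordinate system so corresponding features occupy aligned positions, which
# is the kind of alignment the motion-aware constraints operate on.
# H = estimate_interframe_motion(prev_gray, curr_gray)
# prev_aligned = cv2.warpPerspective(prev_frame, H, (width, height))
```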

Original language: English
Pages (from-to): 1-10
Number of pages: 10
Journal: ACM Transactions on Graphics
Volume: 28
Issue number: 5
DOIs
Publication status: Published - 2009 Dec 1

All Science Journal Classification (ASJC) codes

  • Computer Graphics and Computer-Aided Design
