Rapid image completion system using multiresolution patch-based directional and nondirectional approaches

Chih Wei Fang, Jenn Jier James Lien

Research output: Contribution to journal › Article › peer-review

43 Citations (Scopus)


This study presents a rapid image completion system comprising a training (or analysis) process and an image completion (or synthesis) process. The proposed system adopts a multiresolution approach, which not only improves the convergence rate of the synthesis process but also provides the ability to handle large replaced regions. In the training process, a down-sampling approach is applied to create a patch-based texture eigenspace from multiresolution background region information. In the image completion process, an up-sampling approach is applied to synthesize the replaced foreground regions. To ensure continuity of the geometric texture structure between the original background scene regions and the replaced foreground regions, directional and nondirectional image completion approaches are developed to reconstruct the global geometric structure in the lower-resolution images and to enhance the local detailed features of the replaced foreground regions in the higher-resolution images, respectively. Moreover, both the synthesis priority order of the individual patches and the choice of completion scheme (i.e., directional or nondirectional) are determined by a Hessian matrix decision value (HMDV) parameter. Finally, a texture refinement process is performed to improve the quality of the synthesized result.
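The abstract's directional-versus-nondirectional choice hinges on measuring how strongly oriented a patch's local structure is. The paper's exact HMDV formula is not reproduced here; as an illustrative stand-in, the sketch below scores orientation with the gradient structure tensor (a common proxy for Hessian-based directionality measures) and classifies a patch accordingly. The function names and the threshold are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def directionality_score(patch):
    """Coherence of the patch's gradient structure tensor, in [0, 1].

    This is an illustrative proxy for the paper's Hessian matrix decision
    value (HMDV): a score near 1 indicates one dominant orientation
    (e.g., an edge), while a score near 0 indicates isotropic texture
    or a flat region.
    """
    # First-order finite differences along rows (Iy) and columns (Ix).
    Iy, Ix = np.gradient(patch.astype(float))
    # 2x2 structure tensor averaged over the patch.
    T = np.array([[(Ix * Ix).mean(), (Ix * Iy).mean()],
                  [(Ix * Iy).mean(), (Iy * Iy).mean()]])
    l1, l2 = np.sort(np.linalg.eigvalsh(T))  # l1 <= l2
    # Eigenvalue coherence: (l2 - l1) / (l2 + l1).
    return (l2 - l1) / (l2 + l1 + 1e-12)

def choose_scheme(patch, threshold=0.5):
    """Pick a completion scheme for one patch (threshold is illustrative)."""
    return "directional" if directionality_score(patch) > threshold else "nondirectional"
```

Under this sketch, a patch containing a sharp vertical step edge yields a coherence near 1 and would be routed to the directional scheme, while a flat or isotropic patch yields a low score and falls back to nondirectional synthesis.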

Original language: English
Pages (from-to): 2769-2779
Number of pages: 11
Journal: IEEE Transactions on Image Processing
Issue number: 12
Publication status: Published - 2009 Dec

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Graphics and Computer-Aided Design


