High relief from brush painting

Yunfei Fu, Hongchuan Yu, Chih-Kuo Yeh, Jianjun Zhang, Tong-Yee Lee

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Relief is an art form partway between 3D sculpture and 2D painting. We present a novel approach for generating a texture-mapped high-relief model from a single brush painting. Our aim is to extract the brushstrokes from a painting and generate the corresponding individual relief proxies, rather than recovering an exact depth map from the painting, which is a tricky computer vision problem requiring assumptions that are rarely satisfied. The relief proxies of the brushstrokes are then combined to form a 2.5D high-relief model. To extract brushstrokes from 2D paintings, we apply layer decomposition and stroke segmentation with imposed boundary constraints. The segmented brushstrokes preserve the style of the input painting. Through inflation and a displacement map for each brushstroke, the resulting high-relief model of the painting preserves the features of the brushstrokes. We demonstrate that our approach is able to produce convincing high-reliefs from a variety of paintings (with humans, animals, flowers, etc.). As a secondary application, we show how our brushstroke extraction algorithm can be used for image editing. Note that our brushstroke extraction algorithm is specifically geared towards paintings in which each brushstroke is drawn very purposefully, such as Chinese paintings, Rosemaling paintings, etc.
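To make the inflation-plus-displacement idea concrete, the sketch below shows one simple way a segmented brushstroke mask could be turned into a relief proxy and composited with others into a 2.5D height field. This is a minimal illustration under assumed choices (a distance-transform "dome" for inflation, a luminance-driven displacement, max compositing), not the authors' implementation; the function names and parameters such as inflate_scale and displace_scale are hypothetical.

```python
# Minimal sketch, NOT the paper's method: inflate a segmented brushstroke mask
# into a rounded height field, add a luminance-based displacement, and composite
# several per-stroke proxies into one 2.5D high-relief height map.
import numpy as np
from scipy.ndimage import distance_transform_edt, gaussian_filter

def stroke_relief_proxy(mask, luminance, inflate_scale=1.0, displace_scale=0.2):
    """Height field for one brushstroke.

    mask      : bool array, True inside the segmented stroke
    luminance : float array in [0, 1], painting intensity (drives displacement)
    """
    # Inflation: distance to the stroke boundary gives a rounded cross-section.
    dist = distance_transform_edt(mask)
    if dist.max() > 0:
        inflation = inflate_scale * np.sqrt(dist / dist.max())  # dome-like profile
    else:
        inflation = np.zeros_like(dist)

    # Displacement: darker pigment is pushed slightly forward, smoothed to reduce noise.
    displacement = displace_scale * gaussian_filter(1.0 - luminance, sigma=2.0)

    return np.where(mask, inflation + displacement, 0.0)

def composite_high_relief(proxies):
    """Combine per-stroke proxies into one 2.5D height map (max compositing)."""
    return np.maximum.reduce(proxies)

# Toy usage: two overlapping elliptical "strokes" on a 128x128 canvas.
yy, xx = np.mgrid[0:128, 0:128]
mask_a = ((xx - 50) / 40.0) ** 2 + ((yy - 64) / 15.0) ** 2 < 1.0
mask_b = ((xx - 80) / 35.0) ** 2 + ((yy - 64) / 12.0) ** 2 < 1.0
lum = np.clip(0.5 + 0.5 * np.sin(xx / 10.0), 0.0, 1.0)

relief = composite_high_relief([
    stroke_relief_proxy(mask_a, lum),
    stroke_relief_proxy(mask_b, lum),
])
print(relief.shape, relief.max())
```

Max compositing is used here only because it keeps overlapping strokes from summing into implausible heights; the paper's actual combination of relief proxies may differ.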

Original language: English
Article number: 8419282
Pages (from-to): 2763-2776
Number of pages: 14
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 25
Issue number: 9
DOIs
Publication status: Published - 1 Sep 2019

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design

