Shadow detection and information recovery in aerial images

Chih Wei Chang, Jaan-Rong Tsay

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

The intensity value of each image pixel can be considered the product of an illumination function and a ground-reflection function. Light cast on terrain objects such as buildings and trees causes shadows and occlusion, and it is hard to recover the original textures within shadow areas or to eliminate the shadows. Much detailed information is hidden or lost because of these problems. Moreover, false color tone, shape distortion, and failure of image matching within shadow areas also degrade image recognition. However, brightness and color information are still available to support shadow compensation. It is necessary to reduce or compensate for the effects caused by shadow and occlusion and to recover the information we need. Methods for shadow detection and compensation have been studied for many years, yet it remains difficult to obtain a satisfactory outcome. Therefore, it is important to find a useful and practical method to improve the compensation result. Many color models can be used to compensate shadows, such as HSI, HSV, HCV, and YIQ. Our experiments show the effectiveness of the proposed model and reveal details covered by shadows.
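The abstract does not spell out the authors' algorithm, but the general approach it describes — detecting shadows from the intensity channel of a color model such as HSI, then compensating the masked pixels with brightness information from lit areas — can be sketched as follows. This is an illustrative example, not the paper's method; the threshold rule and gain-based compensation are assumptions chosen for simplicity.

```python
import numpy as np

def hsi_intensity(rgb):
    """Intensity channel of the HSI model: I = (R + G + B) / 3."""
    return rgb.astype(float).mean(axis=-1)

def detect_shadow_mask(rgb, k=1.0):
    """Flag pixels whose intensity falls more than k standard deviations
    below the image mean as shadow candidates (a common simple rule)."""
    i = hsi_intensity(rgb)
    return i < i.mean() - k * i.std()

def compensate_shadow(rgb, mask):
    """Gain compensation: scale shadow pixels so their mean intensity
    matches the mean intensity of the non-shadow (lit) pixels."""
    out = rgb.astype(float)
    i = hsi_intensity(rgb)
    gain = i[~mask].mean() / max(i[mask].mean(), 1e-6)
    out[mask] *= gain
    return np.clip(out, 0, 255).astype(np.uint8)
```

A real aerial image would of course need a more robust threshold (e.g. histogram-based) and smoother blending at shadow borders, but the detect-then-compensate structure is the same.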

Original language: English
Title of host publication: 31st Asian Conference on Remote Sensing 2010, ACRS 2010
Pages: 392-397
Number of pages: 6
Publication status: Published - 2010 Dec 1
Event: 31st Asian Conference on Remote Sensing 2010, ACRS 2010 - Hanoi, Viet Nam
Duration: 2010 Nov 1 - 2010 Nov 5

Publication series

Name: 31st Asian Conference on Remote Sensing 2010, ACRS 2010
Volume: 1

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
