A real-time video noise reduction algorithm in the dusk environment

Thou Ho Chen, Chao Yu Chen, Shi Feng Huang, Chin Hsing Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper is dedicated to achieving real-time video noise reduction in the dusk environment. To cope with the inherent problem of blurred moving objects, the basic idea behind the proposed method is to apply different filtering strategies to the background and the foreground (i.e., moving objects). First, the background and foreground are separated by a change detection technique; then a temporal averaging filter and a spatial filter are applied to recover the background and the foreground, respectively. In this way, detail preservation is achieved alongside noise reduction in a real-time process. Experimental results show that the proposed method yields better visual quality and provides a compression ratio above 90%, at least 15% higher than other filters.
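The abstract outlines the pipeline but not its internals, so the following is only a minimal sketch of the general idea under assumed details: a per-pixel change detection threshold (`thresh`), a recursive temporal average for background pixels (`alpha`), and a 3×3 mean filter standing in for the paper's unspecified spatial filter. All function names and parameter values here are hypothetical, not taken from the paper.

```python
import numpy as np

def spatial_mean_filter(frame, k=3):
    """k x k mean filter via padded shifts (a simple stand-in for the
    paper's unspecified spatial filter applied to moving objects)."""
    pad = k // 2
    padded = np.pad(frame.astype(np.float64), pad, mode="edge")
    acc = np.zeros(frame.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            acc += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    return acc / (k * k)

def denoise_frame(frame, background, alpha=0.1, thresh=25):
    """One step of the hypothesized pipeline:
    change detection -> temporal averaging (background) / spatial filter (foreground)."""
    frame = frame.astype(np.float64)
    # Change detection: pixels far from the background estimate are foreground.
    moving = np.abs(frame - background) > thresh
    # Recursive temporal averaging updates only static (background) pixels,
    # which avoids ghosting/blur from averaging over a moving object.
    new_bg = np.where(moving, background,
                      (1.0 - alpha) * background + alpha * frame)
    # Spatial filtering handles the moving-object pixels instead.
    smoothed = spatial_mean_filter(frame)
    denoised = np.where(moving, smoothed, new_bg)
    return denoised, new_bg
```

Applying `denoise_frame` sequentially over a video stream keeps the per-frame cost to a threshold test plus two cheap filters, which is consistent with the real-time goal stated in the abstract.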

Original language: English
Title of host publication: Proceedings - 3rd International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IIHMSP 2007
Pages: 531-534
Number of pages: 4
DOIs
Publication status: Published - 2007 Dec 1
Event: 3rd International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IIHMSP 2007 - Kaohsiung, Taiwan
Duration: 2007 Nov 26 - 2007 Nov 28

Publication series

Name: Proceedings - 3rd International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IIHMSP 2007
Volume: 2

Other

Other: 3rd International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IIHMSP 2007
Country: Taiwan
City: Kaohsiung
Period: 07-11-26 - 07-11-28

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Signal Processing
  • Information Systems and Management

