A near-duplicate video retrieval method based on Zernike moments

Tang You Chang, Shen Chuan Tai, Guo Shiang Lin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

In this paper, a near-duplicate video retrieval method based on invariant features is proposed. After shot change detection, Zernike moments are extracted from each key-frame of a video as invariant features. Key-frame similarity is obtained by computing the difference between the Zernike moments of the query and test key-frames. For near-duplicate video retrieval, each key-frame is treated as an individual sensor, so that evaluating all key-frames amounts to combining multiple sensors. The per-key-frame results are fused to improve retrieval performance. Experimental results show that the proposed method not only finds relevant videos effectively but also resists common modifications such as re-scaling and logo insertion.
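The pipeline outlined in the abstract (key-frame extraction after shot change detection, Zernike moment descriptors, per-key-frame matching, and fusion across key-frames) can be prototyped in a few lines. The sketch below is illustrative only: the use of the mahotas library for Zernike moments, the Euclidean distance, the best-match-then-mean fusion rule, and the helper names (zernike_descriptor, video_score, retrieve) are assumptions for illustration, not the paper's exact design.

# Minimal sketch of the retrieval pipeline described above, assuming
# grayscale key-frames are already available from shot change detection.
# Assumptions (not from the paper): mahotas for Zernike moments,
# Euclidean distance between descriptors, best-match + mean fusion.
import numpy as np
import mahotas


def zernike_descriptor(gray_frame, radius=64, degree=8):
    # Rotation-invariant Zernike moment magnitudes of one key-frame.
    return mahotas.features.zernike_moments(gray_frame, radius, degree=degree)


def keyframe_similarity(desc_q, desc_t):
    # Map the Zernike-moment difference to a similarity in (0, 1].
    return 1.0 / (1.0 + np.linalg.norm(desc_q - desc_t))


def video_score(query_keyframes, test_keyframes):
    # Treat each query key-frame as a "sensor": match it against its best
    # test key-frame, then fuse the per-sensor scores (mean fusion here).
    q_desc = [zernike_descriptor(f) for f in query_keyframes]
    t_desc = [zernike_descriptor(f) for f in test_keyframes]
    per_sensor = [max(keyframe_similarity(q, t) for t in t_desc) for q in q_desc]
    return float(np.mean(per_sensor))


def retrieve(query_keyframes, database):
    # Rank candidate videos (id -> list of grayscale key-frames) by fused score.
    scores = {vid: video_score(query_keyframes, kfs) for vid, kfs in database.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

Mean fusion over best-matching key-frame pairs is only one simple choice; the record does not state the paper's exact fusion rule, so any production use would need to follow the published method.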

Original language: English
Title of host publication: 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 860-864
Number of pages: 5
ISBN (Electronic): 9789881476807
DOIs: https://doi.org/10.1109/APSIPA.2015.7415393
Publication status: Published - 2016 Feb 19
Event: 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015 - Hong Kong, Hong Kong
Duration: 2015 Dec 16 to 2015 Dec 19

Publication series

Name: 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015

Other

Other: 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015
Country: Hong Kong
City: Hong Kong
Period: 15-12-16 to 15-12-19

Fingerprint

  • Zernike Moments
  • Video Retrieval
  • Sensors
  • Sensor
  • Invariant
  • Change Detection
  • Rescaling
  • Resist
  • Insertion
  • Query
  • Computing
  • Experimental Results

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Modelling and Simulation
  • Signal Processing

Cite this

Chang, T. Y., Tai, S. C., & Lin, G. S. (2016). A near-duplicate video retrieval method based on Zernike moments. In 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015 (pp. 860-864). [7415393] (2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/APSIPA.2015.7415393