A near-duplicate video retrieval method based on Zernike moments

Tang You Chang, Shen Chuan Tai, Guo Shiang Lin

Research output: Conference contribution

4 Citations (Scopus)

Abstract

In this paper, a near-duplicate video retrieval method based on invariant features is proposed. After shot change detection, Zernike moments are extracted from each key-frame of a video as invariant features. The key-frame similarity is obtained by computing the difference between the Zernike moments of key-frames from the query and test videos. To achieve near-duplicate video retrieval, each key-frame is treated as an individual sensor, so that evaluating all key-frames amounts to combining multiple sensors. The per-key-frame results are fused to improve the performance of near-duplicate video retrieval. Experimental results show that the proposed method can not only find relevant videos effectively but also resist possible modifications such as re-scaling and logo insertion.
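
To make the described pipeline concrete, the following is a minimal Python sketch, not the authors' implementation: Zernike moment magnitudes are computed over the unit disk of each grayscale key-frame, key-frame similarity is derived from the difference (Euclidean distance) between moment vectors, and the per-key-frame scores are fused into one video-level score. The moment order (8), the distance-to-similarity mapping, and the averaging fusion rule are illustrative assumptions; shot change detection and key-frame selection are assumed to have been done beforehand.

from math import factorial

import numpy as np


def zernike_moments(img, max_order=8):
    """Return |Z_nm| for all valid (n, m) with n <= max_order.

    Z_nm = (n + 1) / pi * sum_{x,y} f(x, y) * conj(V_nm(rho, theta)),
    evaluated over the unit disk inscribed in the image; the magnitudes
    are rotation invariant.
    """
    h, w = img.shape
    y, x = np.mgrid[:h, :w]
    # Map pixel coordinates onto the unit disk centred on the image.
    xn = (2.0 * x - (w - 1)) / (w - 1)
    yn = (2.0 * y - (h - 1)) / (h - 1)
    rho = np.hypot(xn, yn)
    theta = np.arctan2(yn, xn)
    mask = rho <= 1.0
    f = img.astype(np.float64) * mask

    feats = []
    for n in range(max_order + 1):
        for m in range(0, n + 1):
            if (n - m) % 2:          # n - |m| must be even
                continue
            # Radial polynomial R_nm(rho)
            R = np.zeros_like(rho)
            for s in range((n - m) // 2 + 1):
                c = ((-1) ** s * factorial(n - s) /
                     (factorial(s) * factorial((n + m) // 2 - s)
                      * factorial((n - m) // 2 - s)))
                R += c * rho ** (n - 2 * s)
            V = R * np.exp(1j * m * theta) * mask
            Z = (n + 1) / np.pi * np.sum(f * np.conj(V))
            feats.append(abs(Z))
    return np.asarray(feats)


def keyframe_similarity(q_frame, t_frame, max_order=8):
    """Similarity of two key-frames from the difference of their Zernike moments."""
    d = np.linalg.norm(zernike_moments(q_frame, max_order)
                       - zernike_moments(t_frame, max_order))
    return 1.0 / (1.0 + d)          # map distance to a (0, 1] score


def video_similarity(query_frames, test_frames, max_order=8):
    """Treat each query key-frame as one 'sensor' and fuse their outputs.

    Each query key-frame reports its best match among the test key-frames;
    the per-sensor scores are fused here by simple averaging (this fusion
    rule is an assumption, not taken from the paper).
    """
    scores = [max(keyframe_similarity(q, t, max_order) for t in test_frames)
              for q in query_frames]
    return float(np.mean(scores))

Using only the magnitudes |Z_nm| keeps the features rotation invariant, and normalizing each frame onto the unit disk before computing the moments is what gives robustness to modifications such as re-scaling.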

Original language: English
Title of host publication: 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 860-864
Number of pages: 5
ISBN (Electronic): 9789881476807
DOIs
Publication status: Published - 2016 Feb 19
Event: 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015 - Hong Kong, Hong Kong
Duration: 2015 Dec 16 - 2015 Dec 19

Publication series

Name: 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015

Other

Other: 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2015
Country/Territory: Hong Kong
City: Hong Kong
Period: 2015 Dec 16 - 2015 Dec 19

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Modelling and Simulation
  • Signal Processing
