A normalized SURF for multispectral image matching and band co-registration

J. P. Jhan, Jiann-Yeou Rau

Research output: Conference article

Abstract

Since the raw images of a multi-lens multispectral (MS) camera have significant misregistration errors, image registration is necessary for band co-registration. Image matching is an essential step of image registration: it obtains conjugate features in the overlapping areas and uses them to estimate the coefficients of a transformation model that corrects the geometric errors. However, because of the non-linear intensity differences between spectral responses, feature-based image matching (such as SURF) obtains only a few conjugate features on cross-band MS images. Unlike SURF, which extracts local extrema in a multi-scale space and applies a fixed threshold to accept a feature, the proposed normalized SURF (N-SURF) extracts features at a single scale, computes the cumulative distribution function (CDF) of the feature responses, and selects consistent features from the CDF. In this study, two datasets acquired with a Tetracam MiniMCA-12 and a Micasense RedEdge Altum are used to evaluate the matching performance of N-SURF. Results show that N-SURF extracts approximately 2–3 times as many features, matches more points, and is more efficient than the original SURF. With successful MS image matching, the conjugate points can then be used to compute the coefficients of a geometric transformation model. Three transformation models are compared for MS band co-registration, i.e. affine, projective, and extended projective. Results show that the extended projective model is better than the others, as it can compensate for differences in lens distortion and viewpoint, and achieves a co-registration accuracy of 0.3–0.6 pixels.
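
The abstract describes the N-SURF pipeline only at a high level. The sketch below illustrates the general idea with OpenCV and NumPy: single-scale SURF detection, a CDF-based cut on feature responses so each band contributes a consistent feature set, descriptor matching, and a projective (homography) fit standing in for the extended projective model of the paper. The quantile, ratio-test value, and function names are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of the N-SURF idea (not the authors' code).
# Requires opencv-contrib-python (SURF lives in cv2.xfeatures2d) and numpy.
import cv2
import numpy as np

def detect_nsurf(img, keep_quantile=0.5):
    """Single-scale SURF with a CDF-based cut on feature responses.

    Instead of a fixed Hessian threshold, keep the strongest fraction of
    features per image so every band yields a comparable feature set.
    """
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=0,
                                       nOctaves=1, nOctaveLayers=1)
    keypoints, descriptors = surf.detectAndCompute(img, None)
    responses = np.array([kp.response for kp in keypoints])
    # CDF-based selection: keep features above the chosen response quantile.
    cutoff = np.quantile(responses, keep_quantile)
    keep = responses >= cutoff
    keypoints = [kp for kp, k in zip(keypoints, keep) if k]
    descriptors = descriptors[keep]
    return keypoints, descriptors

def coregister_band(ref_img, band_img):
    """Match two MS bands and fit a projective model (homography).

    The paper's extended projective model adds terms for lens-distortion
    differences; a plain homography is used here as a stand-in.
    """
    kp1, des1 = detect_nsurf(ref_img)
    kp2, des2 = detect_nsurf(band_img)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des2, des1, k=2)
    good = [m for m, n in matches if m.distance < 0.8 * n.distance]  # ratio test
    src = np.float32([kp2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return cv2.warpPerspective(band_img, H,
                               (ref_img.shape[1], ref_img.shape[0]))
```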

Original language: English
Pages (from - to): 393-399
Number of pages: 7
Journal: International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives
Volume: 42
Issue number: 2/W13
DOIs: 10.5194/isprs-archives-XLII-2-W13-393-2019
Publication status: Published - 4 June 2019
Event: 4th ISPRS Geospatial Week 2019 - Enschede, Netherlands
Duration: 10 June 2019 - 14 June 2019

Fingerprint

Image matching
Multispectral image
Image registration
Distribution functions
Lenses
Pixels
Cameras

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Geography, Planning and Development

Cite this

@article{3a197855c7ad4c08b1774a1773b094fe,
title = "A normalized surf for multispectral image matching and band co-registration",
abstract = "Due to the raw images of multi-lens multispectral (MS) camera has significant misregistration errors, performing image registration for band co-registration is necessary. Image matching is an essential step for image registration, which obtains conjugate features on the overlapped areas, and use them to estimate the coefficients of a transformation model for correcting the geometrical errors. However, due to the none-linear intensity of spectral response, performing feature-based image matching (such as SURF) can only obtain only a few conjugate features on cross-band MS images. Different to SURF that extracts local extremum in a multi-scale space and utilizes a threshold to determine a feature, we proposed a normalized SURF (N-SURF) that extracts features on single scale, calculates the cumulative distribution function (CDF) of features, and obtains consistent features from the CDF. In this study, two datasets acquired from Tetracam MiniMCA-12 and Micasense RedEdge Altum are used for evaluating the matching performance of N-SURF. Results show that N-SURF can extract approximately 2–3 times number of features, match more points, and have more efficient than original SURF. On the other hand, with the successful of MS image matching, we can therefor use the conjugates to compute the coefficients of a geometric transformation model. In this study, three transformation models are used to compare the difference on MS band co-registration, i.e. affine, projective, and extended projective. Results show that extended projective model is better than the others as it can compensate the difference of lens distortion and viewpoint, and has co-registration accuracy of 0.3–0.6 pixels.",
author = "Jhan, {J. P.} and Jiann-Yeou Rau",
year = "2019",
month = "6",
day = "4",
doi = "10.5194/isprs-archives-XLII-2-W13-393-2019",
language = "English",
volume = "42",
pages = "393--399",
journal = "International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives",
issn = "1682-1750",
number = "2/W13",

}
