TY - GEN
T1 - Motion-Aware Iterative Closest Point Estimation for Fast Visual Odometry
AU - Lin, Ting Yu
AU - Chen, Chun Wei
AU - Wang, Jonas
AU - Shieh, Ming Der
PY - 2017/1/30
Y1 - 2017/1/30
N2 - The iterative closest point (ICP) algorithm is a common localization method that estimates camera poses by aligning two depth frames. Since the input depth map is easily distorted when the camera undergoes large motion, ICP may produce incorrect pose estimates and apparent drift in ICP-based applications. To alleviate this problem, instead of using a time-consuming graph-based optimization approach for post-processing, this work refines poses when noisy depth maps are detected and presents a hybrid decision mechanism that detects noisy depth maps based on the characteristics of ICP. When a noisy depth map is detected, the camera pose of the next frame is estimated with reference to the last frame instead of the current frame; by doing so, the errors produced in the current frame are prevented from propagating to the next frame, thus reducing drift. Experimental results show that the relative pose error is reduced to 58% on average when large motion occurs.
AB - The iterative closest point (ICP) algorithm is a common localization method that estimates camera poses by aligning two depth frames. Since the input depth map is easily distorted when the camera undergoes large motion, ICP may produce incorrect pose estimates and apparent drift in ICP-based applications. To alleviate this problem, instead of using a time-consuming graph-based optimization approach for post-processing, this work refines poses when noisy depth maps are detected and presents a hybrid decision mechanism that detects noisy depth maps based on the characteristics of ICP. When a noisy depth map is detected, the camera pose of the next frame is estimated with reference to the last frame instead of the current frame; by doing so, the errors produced in the current frame are prevented from propagating to the next frame, thus reducing drift. Experimental results show that the relative pose error is reduced to 58% on average when large motion occurs.
UR - http://www.scopus.com/inward/record.url?scp=85015244554&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85015244554&partnerID=8YFLogxK
U2 - 10.1109/ISMAR-Adjunct.2016.0091
DO - 10.1109/ISMAR-Adjunct.2016.0091
M3 - Conference contribution
AN - SCOPUS:85015244554
T3 - Adjunct Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2016
SP - 268
EP - 269
BT - Adjunct Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2016
A2 - Veas, Eduardo
A2 - Grasset, Raphael
A2 - Langlotz, Tobias
A2 - Martin, Alejandro
A2 - Martinez-Carranza, Jose
A2 - Sugimoto, Maki
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 15th Adjunct IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2016
Y2 - 18 September 2016 through 23 September 2016
ER -