TY - JOUR
T1 - The performance analysis of INS/GNSS/V-SLAM integration scheme using smartphone sensors for land vehicle navigation applications in GNSS-challenging environments
AU - Chiang, Kai Wei
AU - Le, Dinh Thuan
AU - Duong, Thanh Trung
AU - Sun, Rui
N1 - Funding Information:
This research was funded by the Ministry of Science and Technology, grant number 107-2221-E-006-125-MY3. The authors thank the Ministry of Science and Technology (MOST) for its financial support. We also thank the editor and anonymous reviewers for their constructive comments on this paper.
Publisher Copyright:
© 2020 by the authors.
PY - 2020/6/1
Y1 - 2020/6/1
N2 - Modern smartphones contain embedded global navigation satellite systems (GNSSs), inertial measurement units (IMUs), cameras, and other sensors which are capable of providing user position, velocity, and attitude. However, it is difficult to utilize the actual navigation performance capabilities of smartphones due to the low-cost and disparate sensors, the software technologies adopted by manufacturers, and the significant influence of environmental conditions. In this study, we proposed a scheme that integrated sensor data from smartphone IMUs, GNSS chipsets, and cameras using an extended Kalman filter (EKF) to enhance the navigation performance. The visual data from the camera was preprocessed using oriented FAST (features from accelerated segment test) and rotated BRIEF (binary robust independent elementary features) simultaneous localization and mapping (ORB-SLAM), rescaled by applying GNSS measurements, and converted to velocity data before being used to update the integration filter. To verify the performance of the integrated system, field test data was collected in a downtown area of Tainan City, Taiwan. Experimental results indicated that the visual data contributed significantly to improving navigation accuracy, demonstrating improvements of 43.0% and 51.3% in position and velocity, respectively. It was verified that the proposed integrated system, which used data from smartphone sensors, was effective in increasing navigation accuracy in GNSS-challenging environments.
AB - Modern smartphones contain embedded global navigation satellite systems (GNSSs), inertial measurement units (IMUs), cameras, and other sensors which are capable of providing user position, velocity, and attitude. However, it is difficult to utilize the actual navigation performance capabilities of smartphones due to the low-cost and disparate sensors, the software technologies adopted by manufacturers, and the significant influence of environmental conditions. In this study, we proposed a scheme that integrated sensor data from smartphone IMUs, GNSS chipsets, and cameras using an extended Kalman filter (EKF) to enhance the navigation performance. The visual data from the camera was preprocessed using oriented FAST (features from accelerated segment test) and rotated BRIEF (binary robust independent elementary features) simultaneous localization and mapping (ORB-SLAM), rescaled by applying GNSS measurements, and converted to velocity data before being used to update the integration filter. To verify the performance of the integrated system, field test data was collected in a downtown area of Tainan City, Taiwan. Experimental results indicated that the visual data contributed significantly to improving navigation accuracy, demonstrating improvements of 43.0% and 51.3% in position and velocity, respectively. It was verified that the proposed integrated system, which used data from smartphone sensors, was effective in increasing navigation accuracy in GNSS-challenging environments.
UR - http://www.scopus.com/inward/record.url?scp=85086427911&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85086427911&partnerID=8YFLogxK
U2 - 10.3390/rs12111732
DO - 10.3390/rs12111732
M3 - Article
AN - SCOPUS:85086427911
VL - 12
JO - Remote Sensing
JF - Remote Sensing
SN - 2072-4292
IS - 11
M1 - 1732
ER -