Modern smartphones contain embedded global navigation satellite system (GNSS) receivers, inertial measurement units (IMUs), cameras, and other sensors capable of providing user position, velocity, and attitude. However, it is difficult to exploit the full navigation performance of smartphones because of their low-cost, disparate sensors, the software technologies adopted by manufacturers, and the significant influence of environmental conditions. In this study, we proposed a scheme that integrated sensor data from smartphone IMUs, GNSS chipsets, and cameras using an extended Kalman filter (EKF) to enhance navigation performance. The visual data from the camera were preprocessed using oriented FAST (features from accelerated segment test) and rotated BRIEF (binary robust independent elementary features) simultaneous localization and mapping (ORB-SLAM), rescaled using GNSS measurements, and converted to velocity data before being used to update the integration filter. To verify the performance of the integrated system, field test data were collected in a downtown area of Tainan City, Taiwan. The experimental results indicated that the visual data contributed significantly to navigation accuracy, yielding improvements of 43.0% and 51.3% in position and velocity, respectively. It was verified that the proposed integrated system, which used data from smartphone sensors, effectively increased navigation accuracy in GNSS-challenged environments.
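The loosely coupled fusion described above can be sketched as a standard EKF that predicts a position-velocity state and then applies separate measurement updates for GNSS position fixes and the rescaled, ORB-SLAM-derived velocity. This is a minimal illustrative sketch, not the paper's implementation: the constant-velocity model, the matrix names (`F`, `H_pos`, `H_vel`), and all noise values are assumptions for demonstration.

```python
import numpy as np

dt = 0.1                          # filter time step (assumed)
I3 = np.eye(3)

# Constant-velocity state x = [p; v]: p' = p + v*dt, v' = v (assumed model)
F = np.block([[I3, dt * I3],
              [np.zeros((3, 3)), I3]])
Q = 0.01 * np.eye(6)              # process noise (illustrative value)

def predict(x, P):
    """Time update: propagate state and covariance."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, H, R):
    """Standard Kalman measurement update for observation z = Hx + noise."""
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P

# GNSS observes position; the rescaled visual (ORB-SLAM) output observes velocity.
H_pos = np.hstack([I3, np.zeros((3, 3))])
H_vel = np.hstack([np.zeros((3, 3)), I3])
R_pos = 2.0 * I3                  # GNSS position noise, m^2 (assumed)
R_vel = 0.05 * I3                 # visual velocity noise, (m/s)^2 (assumed)

x = np.zeros(6)                   # initial [position; velocity]
P = np.eye(6)

# One filter cycle: predict, then fuse a GNSS fix and a visual velocity.
x, P = predict(x, P)
x, P = update(x, P, np.array([1.0, 0.5, 0.0]), H_pos, R_pos)   # GNSS position
x, P = update(x, P, np.array([0.8, 0.4, 0.0]), H_vel, R_vel)   # visual velocity
```

In this configuration the camera-derived velocity constrains drift between (or in place of) GNSS fixes, which is what makes it useful in GNSS-challenged urban environments.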