TY - JOUR
T1 - An Efficient Lane Following Navigation Strategy With Fusion Attention for Autonomous Drones in Urban Areas
AU - Lee, Chao Yang
AU - Khanum, Abida
AU - Wang, Neng Chung
AU - Karuparthi, Bala Syam Kumar
AU - Yang, Chu Sing
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023/3/1
Y1 - 2023/3/1
N2 - Drones are a vital part of our daily lives because of their flexible flying nature and low operation and maintenance costs. Navigation is the most important aspect of the autonomous drone era, and recent studies have tackled the highly challenging task of obtaining the most feasible and safest path. In this work, we focus on safe and reliable unmanned aerial vehicle navigation in densely populated city environments to efficiently follow a lane. We present a deep learning method through which a drone can autonomously navigate by following a lane in a dynamic urban environment. The drone perceives its environment using two cameras, facing front and down. A convolutional neural network (CNN) captures high-level feature representations of the drone's raw visual inputs through attention and residual mechanisms. Concurrent CNN pipelines for the three raw visual inputs output high-level features, which are then fused and passed into a self-attention mechanism that encapsulates both dynamic and static information to predict yaw and linear velocity. These predictions are modulated into final control commands to obtain smooth and continuous inputs that are delivered to the drone. The experimental evaluation results show that the proposed method can learn the drone's navigation strategy for a lane-following guidance system. The trained navigation policy performs well in simulations across different scenarios. In conclusion, our proposed approach outperforms both existing and current state-of-the-art ImageNet models.
AB - Drones are a vital part of our daily lives because of their flexible flying nature and low operation and maintenance costs. Navigation is the most important aspect of the autonomous drone era, and recent studies have tackled the highly challenging task of obtaining the most feasible and safest path. In this work, we focus on safe and reliable unmanned aerial vehicle navigation in densely populated city environments to efficiently follow a lane. We present a deep learning method through which a drone can autonomously navigate by following a lane in a dynamic urban environment. The drone perceives its environment using two cameras, facing front and down. A convolutional neural network (CNN) captures high-level feature representations of the drone's raw visual inputs through attention and residual mechanisms. Concurrent CNN pipelines for the three raw visual inputs output high-level features, which are then fused and passed into a self-attention mechanism that encapsulates both dynamic and static information to predict yaw and linear velocity. These predictions are modulated into final control commands to obtain smooth and continuous inputs that are delivered to the drone. The experimental evaluation results show that the proposed method can learn the drone's navigation strategy for a lane-following guidance system. The trained navigation policy performs well in simulations across different scenarios. In conclusion, our proposed approach outperforms both existing and current state-of-the-art ImageNet models.
UR - http://www.scopus.com/inward/record.url?scp=85174798281&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85174798281&partnerID=8YFLogxK
U2 - 10.1109/TVT.2023.3322808
DO - 10.1109/TVT.2023.3322808
M3 - Article
AN - SCOPUS:85174798281
SN - 0018-9545
VL - 73
SP - 3094
EP - 3105
JO - IEEE Transactions on Vehicular Technology
JF - IEEE Transactions on Vehicular Technology
IS - 3
ER -