Path planning for unmanned surface vessels with multisensor fusion design

Yung Hsiang Chen, Yung Yue Chen, Joe Ming Yang

Research output: Contribution to journal › Article

Abstract

To achieve a successful sea voyage, an effective path planning algorithm plays an important role in avoiding vessel collisions in an autonomous surface vessel design. This algorithm depends on real-time global images of the explored environment. Currently, the satellite images used for path planning tend to have coarse resolution, which increases the risk of vessel collision. To solve this problem, an alternative solution is proposed: integrating the aerial images captured by the image sensor of a flying quadrotor with global positioning system (GPS), electrical compass, and height sensor data to provide real-time images in the global coordinate frame instead of satellite images. Aerial images naturally possess higher resolution than satellite images and allow real-time updates of geographic information. Images captured by the image sensor installed on the quadrotor are essentially represented in local coordinates; hence, in this investigation, a coordinate transformation for converting the local coordinates of the aerial images into global coordinates is developed and named the distance-resolution transformation. With the fusion of these sensors and designs, the global coordinates of the high-resolution image can be calculated directly, and real-time path planning using this aerial image can be applied to plan a collision-free path to navigate the unmanned surface vessel in any unknown ocean environment.
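The local-to-global conversion described above can be illustrated with a short sketch. The function below is a hypothetical simplification, not the paper's distance-resolution transformation itself: it assumes a nadir-pointing pinhole camera on the quadrotor, flat ground, and a small-area equirectangular approximation for converting metric offsets to latitude/longitude. It maps a pixel in the aerial image to a global coordinate using the quadrotor's GPS position, compass heading, and height-sensor altitude.

```python
import math

def pixel_to_global(lat_deg, lon_deg, alt_m, heading_deg,
                    px, py, img_w, img_h, fov_deg):
    """Map an image pixel to a global (lat, lon) coordinate.

    Hypothetical model: nadir-pointing pinhole camera, flat ground,
    square pixels; heading is measured clockwise from north.
    """
    # Ground footprint width covered by the image (pinhole model)
    ground_w = 2.0 * alt_m * math.tan(math.radians(fov_deg) / 2.0)
    metres_per_px = ground_w / img_w  # ground sampling distance

    # Pixel offset from the image centre, in metres
    # (x points right of the heading, y points along the heading)
    dx = (px - img_w / 2.0) * metres_per_px
    dy = (img_h / 2.0 - py) * metres_per_px

    # Rotate the camera-frame offset into north/east components
    h = math.radians(heading_deg)
    east = dx * math.cos(h) + dy * math.sin(h)
    north = dy * math.cos(h) - dx * math.sin(h)

    # Metric offset -> degrees (valid for small areas)
    dlat = north / 111_320.0
    dlon = east / (111_320.0 * math.cos(math.radians(lat_deg)))
    return lat_deg + dlat, lon_deg + dlon
```

For example, the centre pixel of the image maps back to the quadrotor's own GPS position, while off-centre pixels shift north/east according to altitude, field of view, and heading.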

Original language: English
Pages (from-to): 889-903
Number of pages: 15
Journal: Sensors and Materials
Volume: 31
Issue number: 3
DOIs
Publication status: Published - 2019 Jan 1

All Science Journal Classification (ASJC) codes

  • Instrumentation
  • Materials Science (all)
