Experimental Study of Material Classification and Recognition by a LiDAR Scanning

Chien Ru Yu, Chao Chung Peng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

In this short note, a method is presented to classify metal, wood, and glass objects using light detection and ranging (LiDAR) measurements. It is found that these objects can be easily distinguished if the incident angle, formed by the laser beam and the normal vector of the reflecting surface, is near zero. In practical applications, however, it is quite difficult to guarantee the zero-incident-angle condition. Therefore, a set of data from arbitrary views was collected and used for training and classification. Experiments show that the incident angle significantly affects the reflection intensity measured by the LiDAR and therefore degrades object recognition performance.
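The abstract does not include an implementation. The sketch below only illustrates, under stated assumptions, the kind of pipeline it describes: computing the incident angle between the laser beam and the surface normal, and training a classifier on intensity-based features gathered from arbitrary views. The file names lidar_features.npy and material_labels.npy and the choice of a scikit-learn random forest are illustrative assumptions, not the authors' method.

```python
# Minimal sketch (assumptions noted in comments); not the authors' implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def incident_angle(beam_dir, surface_normal):
    """Angle (radians) between the laser beam and the surface normal."""
    b = beam_dir / np.linalg.norm(beam_dir)
    n = surface_normal / np.linalg.norm(surface_normal)
    return np.arccos(np.clip(abs(np.dot(b, n)), 0.0, 1.0))


# Hypothetical feature/label files: each feature row might hold
# (reflection intensity, range, incident angle); labels 0 = metal, 1 = wood, 2 = glass.
X = np.load("lidar_features.npy")   # shape (N, 3), assumed
y = np.load("material_labels.npy")  # shape (N,), assumed

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```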

Original language: English
Title of host publication: 2020 IEEE International Conference on Consumer Electronics - Taiwan, ICCE-Taiwan 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728173993
DOIs
Publication status: Published - 2020 Sept 28
Event: 7th IEEE International Conference on Consumer Electronics - Taiwan, ICCE-Taiwan 2020 - Taoyuan, Taiwan
Duration: 2020 Sept 28 - 2020 Sept 30

Publication series

Name: 2020 IEEE International Conference on Consumer Electronics - Taiwan, ICCE-Taiwan 2020

Conference

Conference: 7th IEEE International Conference on Consumer Electronics - Taiwan, ICCE-Taiwan 2020
Country/Territory: Taiwan
City: Taoyuan
Period: 20-09-28 - 20-09-30

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Artificial Intelligence
  • Computer Science Applications
  • Signal Processing
  • Electrical and Electronic Engineering
  • Instrumentation
