Classification of Underwater Signals Using Wavelet Transforms and Neural Networks

Chin Hsing Chen, Jiann Der Lee, Ming Chi Lin

Research output: Contribution to journal › Article

32 Citations (Scopus)


Neural network classifiers have been widely used in classification due to their adaptive and parallel processing ability. This paper concerns the classification of underwater passive sonar signals radiated by ships using neural networks. The classification process can be divided into two stages: signal preprocessing and feature extraction, followed by the recognition process. In the preprocessing and feature extraction stage, the wavelet transform (WT) is used to extract tonal features from the average power spectral density (APSD) of the input data. In the classification stage, two kinds of neural network classifiers are used to evaluate the classification results: the hyperplane-based classifier, the Multilayer Perceptron (MLP), and the kernel-based classifier, the Adaptive Kernel Classifier (AKC). The experimental results obtained from MLPs with different configurations and algorithms show that the bipolar continuous function admits a wider range and a higher value of the learning rate than the unipolar continuous function. In addition, the AKC with a fixed radius (modified AKC) sometimes gives better performance than the AKC, but the former takes more training time in selecting the width of the receptive field. More importantly, networks trained with tonal features extracted by the WT achieve 96% or 94% correct classification rates, whereas training with the original APSDs achieves only an 80% correct classification rate.
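The two-stage pipeline described above (APSD computation, then wavelet-based tonal feature extraction) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper does not specify here which wavelet or decomposition depth is used, so this sketch assumes a Haar wavelet and three decomposition levels, and the function names (`average_psd`, `tonal_features`) and parameters (`frame_len`, `levels`) are hypothetical.

```python
import numpy as np

def average_psd(frames):
    # Average power spectral density: mean of per-frame periodograms.
    spectra = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    return spectra.mean(axis=0)

def haar_dwt(x):
    # One level of the Haar wavelet transform:
    # returns (approximation, detail) coefficients.
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def tonal_features(signal, frame_len=128, levels=3):
    # Stage 1: split the time series into frames and average their
    # power spectra (APSD), smoothing out broadband noise.
    n = len(signal) // frame_len
    frames = signal[: n * frame_len].reshape(n, frame_len)
    apsd = average_psd(frames)

    # Truncate to a power-of-two length so the dyadic DWT applies cleanly
    # (an assumption of this sketch, not stated in the abstract).
    m = 2 ** int(np.log2(len(apsd)))
    coeffs = apsd[:m]

    # Stage 2: repeated wavelet decomposition; the coarse approximation
    # retains the tonal (narrowband) structure of the spectrum.
    for _ in range(levels):
        coeffs, _detail = haar_dwt(coeffs)
    return coeffs
```

The resulting low-dimensional feature vector would then be fed to a classifier such as an MLP or AKC; reducing the APSD to a handful of wavelet approximation coefficients is what lets the networks train on compact tonal features rather than the raw spectrum.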

Original language: English
Pages (from-to): 47-60
Number of pages: 14
Journal: Mathematical and Computer Modelling
Issue number: 2
Publication status: Published - 1998 Jan

All Science Journal Classification (ASJC) codes

  • Modelling and Simulation
  • Computer Science Applications

