Abstract
Radial basis function networks (RBFNs) have recently attracted interest because of their advantages over multilayer perceptrons: they are universal approximators yet achieve faster convergence, since only one layer of weights needs to be trained. The least squares method is the most commonly used for estimating the synaptic weights and provides optimal results if the underlying error distribution is Gaussian. However, the generalization performance of such networks deteriorates under realistic noise whose distribution is unknown or non-Gaussian; in particular, it degrades severely when outliers are present. In this paper we propose a positive-breakdown learning algorithm for RBFNs that applies the breakdown-point approach from robust regression, so that no assumptions about, or estimation of, the error distribution are needed. The loss of efficiency in the presence of Gaussian noise and the problem of local minima that afflicts most robust estimators have also been taken into account. The resulting network is shown to be highly robust and stable against a high fraction of outliers as well as small perturbations, demonstrating its superiority in controlling both the bias and the variance of the estimator.
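The sketch below illustrates the general idea of a high-breakdown fit for RBFN output weights, using a least trimmed squares-style refitting loop in place of ordinary least squares. It is a minimal illustration only; the Gaussian basis parameters, trimming fraction, and helper names are assumptions for demonstration and are not the specific algorithm proposed in the paper.

```python
# Illustrative sketch: high-breakdown (least trimmed squares-style) weight
# estimation for an RBF network. Centers, widths, and trim_frac are assumed
# values for the example, not the paper's actual algorithm.
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian radial basis design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2*width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def lts_weights(Phi, y, trim_frac=0.3, n_iter=20):
    """Least trimmed squares-style fit: repeatedly refit ordinary least squares
    on the subset of samples with the smallest squared residuals."""
    n = len(y)
    h = int(np.ceil((1.0 - trim_frac) * n))      # size of the retained subset
    w = np.linalg.lstsq(Phi, y, rcond=None)[0]   # start from the plain LS solution
    for _ in range(n_iter):
        r2 = (y - Phi @ w) ** 2
        keep = np.argsort(r2)[:h]                # keep the h best-fitting samples
        w_new = np.linalg.lstsq(Phi[keep], y[keep], rcond=None)[0]
        if np.allclose(w_new, w):
            break
        w = w_new
    return w

# Usage: noisy sine with a fraction of gross outliers injected.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
y[:40] += rng.uniform(5, 10, size=40)            # 20% gross outliers
centers = np.linspace(-3, 3, 10).reshape(-1, 1)
Phi = rbf_design(X, centers, width=0.8)
w_robust = lts_weights(Phi, y)                   # weights resistant to the outliers
```

Because the refit only ever uses the best-fitting subset of samples, a bounded fraction of arbitrarily bad points cannot dominate the weight estimate, which is the positive-breakdown property the abstract refers to.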
| Original language | English |
| --- | --- |
| Pages (from-to) | 98-105 |
| Number of pages | 8 |
| Journal | Proceedings of the International Conference on Tools with Artificial Intelligence |
| Publication status | Published - 1995 |
| Event | Proceedings of the 1995 IEEE 7th International Conference on Tools with Artificial Intelligence, Herndon, VA, USA, 5-8 November 1995 |
All Science Journal Classification (ASJC) codes
- Software