TY - JOUR

T1 - Towards positive-breakdown radial basis function networks

AU - Li, Sheng Tun

AU - Leiss, Ernst L.

PY - 1995

Y1 - 1995

N2 - Radial basis function networks (RBFNs) have recently attracted interest because of their advantages over multilayer perceptrons: they are universal approximators but achieve faster convergence, since only one layer of weights is required. The least squares method is the most widely used for estimating the synaptic weights and provides optimal results if the underlying error distribution is Gaussian. However, the generalization performance of the networks deteriorates under realistic noise whose distribution is unknown or non-Gaussian; in particular, it becomes very poor if outliers are present. In this paper we propose a positive-breakdown learning algorithm for RBFNs by applying the breakdown point approach from robust regression, so that any assumptions about, or estimation of, the error distribution are avoided. The loss of efficiency in the presence of Gaussian noise and the problem of local minima that affects most robust estimators have also been taken into account. The resulting network is shown to be highly robust and stable against a high fraction of outliers as well as small perturbations, demonstrating its superiority in controlling the bias and variance of the estimators.

UR - http://www.scopus.com/inward/record.url?scp=0029480199&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0029480199&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:0029480199

SN - 1063-6730

SP - 98

EP - 105

JO - Proceedings of the International Conference on Tools with Artificial Intelligence

JF - Proceedings of the International Conference on Tools with Artificial Intelligence

T2 - Proceedings of the 1995 IEEE 7th International Conference on Tools with Artificial Intelligence

Y2 - 5 November 1995 through 8 November 1995

ER -