TY - JOUR
T1 - Robust radial basis function neural networks
AU - Lee, Chien-Cheng
AU - Chung, Pau-Choo
AU - Tsai, Jea-Rong
AU - Chang, Chein-I
N1 - Funding Information:
Manuscript received July 5, 1997; revised September 18, 1999. This work was supported by the National Science Council, Taiwan, R.O.C., under Grants NSC 84-2213-E-006-025 and NSC 84-2213-E-006-086. This paper was recommended by Associate Editor P. Borne. J.-R. Tsai, P.-C. Chung, and C.-C. Lee are with the Department of Electrical Engineering, National Cheng Kung University, Tainan, Taiwan, 70101 R.O.C. C.-I. Chang is with the Department of Electrical Engineering, University of Maryland—Baltimore County, Baltimore, MD 21228-5398 USA. Publisher Item Identifier S 1083-4419(99)09703-4.
PY - 1999
Y1 - 1999
N2 - Function approximation arises in many applications. The radial basis function (RBF) network is one approach that has shown great promise for this sort of problem because of its fast learning capacity. A traditional RBF network takes Gaussian functions as its basis functions and adopts the least-squares criterion as the objective function. However, it still suffers from two major problems. First, it is difficult to use Gaussian functions to approximate constant values: if a function has nearly constant values in some intervals, the RBF network is inefficient in approximating these values. Second, when the training patterns incur a large error, the network will interpolate these training patterns incorrectly. To cope with these problems, this paper proposes an RBF network based on sequences of sigmoidal functions and a robust objective function. The former replaces the Gaussian functions as the basis functions of the network so that constant-valued functions can be approximated accurately, while the latter restrains the influence of large errors. Compared with traditional RBF networks, the proposed network demonstrates the following advantages: 1) better capability of approximating the underlying functions; 2) faster learning speed; 3) smaller network size; 4) high robustness to outliers.
UR - http://www.scopus.com/inward/record.url?scp=0033280252&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0033280252&partnerID=8YFLogxK
U2 - 10.1109/3477.809023
DO - 10.1109/3477.809023
M3 - Article
C2 - 18252348
AN - SCOPUS:0033280252
VL - 29
SP - 674
EP - 685
JO - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
JF - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
SN - 1083-4419
IS - 6
ER -