Local radial basis function network regressor with feature importance optimization

Yu Ann Chen, Pau-Choo Chung

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Recent big data analysis usually involves datasets with features collected from various sources, where each feature may have different importance and the training data may not be uniformly sampled. To improve prediction quality on real-world learning problems, we propose a local radial basis function network (RBFN) that handles both nonuniform sampling density and heterogeneous features. Nonuniform sampling is resolved by estimating the local sampling density and adjusting the width of the Gaussian kernels accordingly, and heterogeneous features are handled by scaling each dimension of the feature space asymmetrically. To make the learner aware of inter-feature relationships, we propose a feature importance optimization technique based on the L-BFGS-B algorithm, using the leave-one-out cross-validation mean squared error as the objective function. Leave-one-out cross-validation is traditionally very time consuming, but the optimization is made practical by the fast cross-validation capability of the local RBFN. Our experiments show that when both nonuniform sampling density and inter-feature relationships are properly handled, a simple RBFN can outperform more complex kernel-based learning models such as support vector regression in both mean squared error and training speed.
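The optimization loop the abstract describes — per-feature scaling of the input space, tuned with L-BFGS-B against a leave-one-out cross-validation MSE objective — can be sketched as follows. This is a simplified illustration, not the authors' implementation: a Nadaraya-Watson-style Gaussian-kernel regressor stands in for the paper's local RBFN, the fast LOO computation is replaced by a direct one, and all names (`loocv_mse`, `w_opt`, the synthetic data) are our own.

```python
import numpy as np
from scipy.optimize import minimize

def loocv_mse(log_w, X, y):
    """Leave-one-out MSE of a Gaussian-kernel regressor whose
    feature dimensions are scaled by exp(log_w) (a stand-in for
    the paper's feature-importance weights)."""
    w = np.exp(log_w)                        # positive per-feature scales
    Xs = X * w                               # asymmetric feature scaling
    d2 = ((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * d2)                    # Gaussian kernel matrix
    np.fill_diagonal(K, 0.0)                 # leave-one-out: drop self-weight
    pred = K @ y / np.maximum(K.sum(axis=1), 1e-12)
    return np.mean((pred - y) ** 2)

# Synthetic data: only feature 0 carries signal, features 1-2 are noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = np.sin(3.0 * X[:, 0])

# Optimize the log-scales with L-BFGS-B, LOOCV MSE as the objective.
res = minimize(loocv_mse, x0=np.zeros(3), args=(X, y), method="L-BFGS-B")
w_opt = np.exp(res.x)
```

Optimizing in log-space keeps the scales positive without explicit bound constraints; on data like the above, the optimizer should drive the scales of the noise features down relative to the informative one.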

Original language: English
Title of host publication: 2015 International Joint Conference on Neural Networks, IJCNN 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781479919604
DOI: 10.1109/IJCNN.2015.7280345
Publication status: Published - 2015 Sep 28
Event: International Joint Conference on Neural Networks, IJCNN 2015 - Killarney, Ireland
Duration: 2015 Jul 12 - 2015 Jul 17

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2015-September

Other

Other: International Joint Conference on Neural Networks, IJCNN 2015
Country: Ireland
City: Killarney
Period: 2015-07-12 - 2015-07-17

Fingerprint

  • Radial basis function networks
  • Sampling
  • Experiments

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence

Cite this

Chen, Y. A., & Chung, P-C. (2015). Local radial basis function network regressor with feature importance optimization. In 2015 International Joint Conference on Neural Networks, IJCNN 2015 [7280345] (Proceedings of the International Joint Conference on Neural Networks; Vol. 2015-September). Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.1109/IJCNN.2015.7280345


