TY - GEN
T1 - Improving ESVM with Generalized Cross-Validation
AU - Feng, Tianshu
AU - Zhuang, Fuzhen
AU - He, Qing
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/9/28
Y1 - 2015/9/28
N2 - ELM works for 'generalized' single-hidden layer feedforward networks (SLFNs), but the hidden layer (also called the feature mapping) in ELM need not be tuned. The Extreme Support Vector Machine (ESVM), which combines the Support Vector Machine (SVM) with Extreme Learning Machine (ELM) kernels, can achieve better prediction capability. ESVM usually has relatively good predictive capability, and its training time is shorter than that of SVM most of the time. However, estimating the regularization parameter of ESVM is very time-consuming. Moreover, the effects of the variance of the hidden layer weights and the number of hidden neurons on ESVM are still unclear. Generalized Cross-Validation (GCV) has been widely used in statistics because it can efficiently estimate the ridge parameter without estimating the variance of the errors. In this work, we study a connection between ESVM and GCV. Specifically, we treat the computation of the separating plane in ESVM as a ridge regression problem and propose to use GCV to estimate the regularization parameter of ESVM. Experimental results show that GCV can significantly improve the efficiency of ESVM without loss of accuracy. Moreover, the regularization parameter estimated by GCV helps to analyze how the variance of the hidden layer weights and the number of hidden neurons affect the performance of ESVM.
AB - ELM works for 'generalized' single-hidden layer feedforward networks (SLFNs), but the hidden layer (also called the feature mapping) in ELM need not be tuned. The Extreme Support Vector Machine (ESVM), which combines the Support Vector Machine (SVM) with Extreme Learning Machine (ELM) kernels, can achieve better prediction capability. ESVM usually has relatively good predictive capability, and its training time is shorter than that of SVM most of the time. However, estimating the regularization parameter of ESVM is very time-consuming. Moreover, the effects of the variance of the hidden layer weights and the number of hidden neurons on ESVM are still unclear. Generalized Cross-Validation (GCV) has been widely used in statistics because it can efficiently estimate the ridge parameter without estimating the variance of the errors. In this work, we study a connection between ESVM and GCV. Specifically, we treat the computation of the separating plane in ESVM as a ridge regression problem and propose to use GCV to estimate the regularization parameter of ESVM. Experimental results show that GCV can significantly improve the efficiency of ESVM without loss of accuracy. Moreover, the regularization parameter estimated by GCV helps to analyze how the variance of the hidden layer weights and the number of hidden neurons affect the performance of ESVM.
KW - Accuracy
KW - Estimation
KW - Lead
KW - Least squares approximations
KW - Neurons
KW - Standards
UR - https://www.scopus.com/pages/publications/84951205739
U2 - 10.1109/IJCNN.2015.7280322
DO - 10.1109/IJCNN.2015.7280322
M3 - Conference contribution
AN - SCOPUS:84951205739
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2015 International Joint Conference on Neural Networks, IJCNN 2015
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - International Joint Conference on Neural Networks, IJCNN 2015
Y2 - 12 July 2015 through 17 July 2015
ER -