
Machine Learning Techniques for SVM Classification and Regression Loss Function

M. Premalatha, C. Vijaya Lakshmi

Abstract


Support vector machines (SVMs) are a family of supervised learning methods that can be applied to classification or regression; SVM regression, for instance, is used to predict various physical, chemical, or biological properties. A presentation of the linear SVM, followed by its extension to the nonlinear SVM and to SVM regression, provides the basic mathematical details. SVMs were originally developed to solve classification problems, but they have since been extended to the domain of regression. This paper presents the idea that SVM regression is based on the computation of a linear regression function in a high-dimensional feature space into which the input data are mapped via a nonlinear function. Support vector regression (SVR) attempts to minimize a bound on the generalization error, so as to achieve good generalization performance, rather than minimizing only the observed training error. Statistical learning theory has provided a very effective framework for classification and regression tasks, and SVMs are derived directly from it: they work by solving a constrained quadratic problem in which the convex objective function to be minimized combines a loss function with a regularization term (the norm of the weights).
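As a minimal illustrative sketch (not the paper's implementation), the convex objective described above, an L2 regularizer on the weights plus an epsilon-insensitive loss, can be minimized for a linear SVR by subgradient descent with NumPy. The data, learning rate, and parameter values below are arbitrary choices for the example:

```python
# Linear SVM regression trained by subgradient descent on
#   (1/2)||w||^2 + C * mean_i max(0, |y_i - (w.x_i + b)| - eps),
# i.e. a norm-of-the-weights regularizer plus the eps-insensitive loss.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 2.0 * X.ravel() + 0.5 + 0.05 * rng.standard_normal(200)  # noisy line

w = np.zeros(X.shape[1])
b = 0.0
C, eps, lr = 10.0, 0.05, 0.01

for _ in range(2000):
    r = y - (X @ w + b)  # residuals
    # subgradient of the eps-insensitive loss w.r.t. the prediction:
    # -1 above the tube, +1 below it, 0 inside it
    g = np.where(r > eps, -1.0, np.where(r < -eps, 1.0, 0.0))
    w -= lr * (w + C * (X.T @ g) / len(y))  # regularizer + loss term
    b -= lr * C * g.mean()                  # bias is not regularized

print(np.round(w, 2), round(b, 2))  # should approach the true (2.0, 0.5)
```

In a full SVR the same objective is solved as a constrained quadratic program in its dual form, which is what admits the nonlinear (kernel) extension mentioned in the abstract; the subgradient version here only shows the primal loss-plus-regularizer structure.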


Keywords


Basic Machine Learning, Loss Function, Non Linear SVM, SVM Pattern Classification, SVM Regression



