Milan Hladík's Publications:

Least squares approach to K-SVCR multi-class classification with its applications

Hossein Moosaei and Milan Hladík. Least squares approach to K-SVCR multi-class classification with its applications. Ann. Math. Artif. Intell., 90:873–892, 2022.


Abstract

The support vector classification-regression machine for K-class classification (K-SVCR) is a novel multi-class classification method based on the 1-versus-1-versus-rest structure. In this paper, we propose a least squares version of K-SVCR, named LSK-SVCR. Like the K-SVCR algorithm, this method evaluates all the training data in a 1-versus-1-versus-rest structure, so the algorithm generates ternary outputs {−1, 0, +1}. In LSK-SVCR, the solution of the primal problem is computed by solving a single system of linear equations instead of the dual problem, which in K-SVCR is a convex quadratic program. Experimental results on several benchmark, MC-NDC, and handwritten digit recognition data sets show that LSK-SVCR not only achieves better classification accuracy than the K-SVCR and Twin-KSVC algorithms but also has remarkably higher learning speed.
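The key computational point of the abstract is that a regularized least squares formulation admits a closed-form solution via one linear system, whereas the dual of K-SVCR requires quadratic programming. The sketch below is a deliberately simplified illustration of that idea, not the authors' exact LSK-SVCR formulation: the regularization parameter `C`, the threshold `eps`, and the linear (kernel-free) model are all illustrative assumptions.

```python
import numpy as np

def fit_ls_classifier(X, y, C=1.0):
    """Fit a regularized linear least squares model in closed form.

    Instead of solving a quadratic program, the weight vector comes
    from a single linear system: (X^T X + I/C) w = X^T y.
    (Simplified illustration; not the exact LSK-SVCR primal.)
    """
    n_features = X.shape[1]
    A = X.T @ X + np.eye(n_features) / C  # regularized normal equations
    return np.linalg.solve(A, X.T @ y)

def predict_ternary(X, w, eps=0.3):
    """Map real-valued scores to the ternary labels {-1, 0, +1}.

    Scores inside the band [-eps, +eps] are assigned the "rest"
    label 0, mirroring the 1-versus-1-versus-rest output structure.
    """
    scores = X @ w
    return np.where(scores > eps, 1, np.where(scores < -eps, -1, 0))
```

Because the fit reduces to one call to `np.linalg.solve`, training cost is dominated by a single matrix factorization, which is the source of the learning-speed advantage the abstract reports over QP-based training.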

BibTeX

@article{MooHla2022a,
 author = "Hossein Moosaei and Milan Hlad\'{\i}k",
 title = "Least squares approach to {K-SVCR} multi-class classification with its applications",
 journal = "Ann. Math. Artif. Intell.",
 fjournal = "Annals of Mathematics and Artificial Intelligence",
 volume = "90",
 pages = "873--892",
 year = "2022",
 doi = "10.1007/s10472-021-09747-1",
 issn = "1573-7470",
 url = "https://link.springer.com/article/10.1007/s10472-021-09747-1",
 bib2html_dl_html = "https://doi.org/10.1007/s10472-021-09747-1",
 bib2html_dl_pdf = "https://rdcu.be/cT6ar",
 abstract = "The support vector classification-regression machine for K-class classification (K-SVCR) is a novel multi-class classification method based on the 1-versus-1-versus-rest structure. In this paper, we propose a least squares version of K-SVCR, named LSK-SVCR. Like the K-SVCR algorithm, this method evaluates all the training data in a 1-versus-1-versus-rest structure, so the algorithm generates ternary outputs {−1, 0, +1}. In LSK-SVCR, the solution of the primal problem is computed by solving a single system of linear equations instead of the dual problem, which in K-SVCR is a convex quadratic program. Experimental results on several benchmark, MC-NDC, and handwritten digit recognition data sets show that LSK-SVCR not only achieves better classification accuracy than the K-SVCR and Twin-KSVC algorithms but also has remarkably higher learning speed.",
 keywords = "Support vector machine; Twin-KSVC; K-SVCR; Multi-class classification; Least squares",
}

Generated by bib2html.pl (written by Patrick Riley) on Mon Apr 15, 2024 08:26:42