Milan Hladík's Publications:

Optimization techniques for Twin Support Vector Machines in primal space

Hossein Moosaei, Fatemeh Bazikar, and Milan Hladík. Optimization techniques for Twin Support Vector Machines in primal space. In Boris Goldengorin, editor, Theory, Algorithms, and Experiments in Applied Optimization: In Honor of the 70th Birthday of Panos Pardalos, volume 226 of SOIA, pp. 241–259, Springer, Cham, 2025.

Abstract

In this chapter, we examine the twin support vector machine (TWSVM) for binary data classification, a model originally introduced by Jayadeva et al. TWSVM builds on the generalized eigenvalue proximal support vector machine (GEPSVM) developed by Mangasarian et al. Both TWSVM and GEPSVM aim to separate data classes using two non-parallel hyperplanes, with each hyperplane positioned closer to one class while remaining farther from the other. However, their approaches differ significantly: GEPSVM relies on eigenvectors derived from generalized eigenvalue problems, while TWSVM adheres more closely to the traditional support vector machine (SVM) framework by solving two smaller quadratic programming problems (QPPs) instead of one large one, resulting in improved computational efficiency. Shao et al. later introduced an enhanced version of TWSVM, called twin bounded support vector machines (TBSVM). The primary advantage of TBSVM over TWSVM is its inclusion of structural risk minimization by adding a regularization term. Rooted in statistical learning theory, this modification enhances classification performance. Another TWSVM variant, the least squares twin support vector machine (LS-TSVM), was proposed by Kumar and Gopal. This model addresses some TWSVM limitations by solving two sets of linear equations to directly obtain two non-parallel planes. However, LS-TSVM applies the principle of empirical risk minimization rather than structural risk minimization, focusing solely on minimizing training error, which can increase susceptibility to overfitting. To address this, Xu et al. proposed an improved LS-TSVM version, enhancing classifier accuracy. In this chapter, we will explore various optimization techniques for twin support vector machines in the primal space.
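The LS-TSVM construction described above, solving two systems of linear equations to obtain the two non-parallel planes, can be sketched in a few lines. The following is a minimal illustration of the linear least squares twin SVM closed-form solution (following the Kumar and Gopal formulation mentioned in the abstract), not the chapter's implementation; the function names, the penalty parameters `c1`/`c2`, and the small ridge term `reg` added for numerical invertibility are assumptions of this sketch.

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0, reg=1e-6):
    """Fit a linear LS-TSVM: two non-parallel hyperplanes, each close to
    one class and (in a least-squares sense) at unit distance from the other.

    A: samples of class +1 (m1 x n); B: samples of class -1 (m2 x n).
    Returns (w1, b1) and (w2, b2) for the two hyperplanes.
    """
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    H = np.hstack([A, e1])           # augmented class +1 matrix [A e]
    G = np.hstack([B, e2])           # augmented class -1 matrix [B e]
    I = reg * np.eye(H.shape[1])     # small ridge term for invertibility

    # Plane 1 (close to class +1): z1 = -((1/c1) H'H + G'G)^{-1} G' e2
    z1 = -np.linalg.solve(H.T @ H / c1 + G.T @ G + I, G.T @ e2)
    # Plane 2 (close to class -1): z2 = ((1/c2) G'G + H'H)^{-1} H' e1
    z2 = np.linalg.solve(G.T @ G / c2 + H.T @ H + I, H.T @ e1)
    return (z1[:-1].ravel(), z1[-1, 0]), (z2[:-1].ravel(), z2[-1, 0])

def lstsvm_predict(X, plane1, plane2):
    """Assign each row of X to the class whose hyperplane is nearer."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

# Illustrative usage on two synthetic Gaussian clusters.
rng = np.random.default_rng(0)
A = rng.normal([2.0, 2.0], 0.3, (50, 2))
B = rng.normal([-2.0, -2.0], 0.3, (50, 2))
p1, p2 = lstsvm_fit(A, B)
pred = lstsvm_predict(np.vstack([A, B]), p1, p2)
```

Because only linear solves are involved (no quadratic programming), this variant is cheap to train; as the abstract notes, it minimizes empirical rather than structural risk, which is why later refinements such as the improvement by Xu et al. were proposed.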

BibTeX

@inCollection{MooBaz2025a,
 author = "Hossein Moosaei and Fatemeh Bazikar and Milan Hlad\'{\i}k",
 title = "Optimization techniques for Twin Support Vector Machines in primal space",
 editor = "Goldengorin, Boris",
 feditor = "Sergeyev, Y. D. and Kvasov, D. E. and Astorino, A.",
 sbooktitle = "Theory, Algorithms, and Experiments in Applied Optimization",
 booktitle = "Theory, Algorithms, and Experiments in Applied Optimization: In Honor of the 70th Birthday of Panos Pardalos",
 publisher = "Springer",
 address = "Cham",
 fseries = "Springer Optimization and Its Applications",
 series = "SOIA",
 volume = "226",
 pages = "241--259",
 year = "2025",
 doi = "10.1007/978-3-031-91357-0_12",
 isbn = "978-3-031-91356-3",
 url = "https://link.springer.com/chapter/10.1007/978-3-031-91357-0_12",
 bib2html_dl_html = "https://doi.org/10.1007/978-3-031-91357-0_12",
 bib2html_dl_pdf = "https://rdcu.be/eOnLO",
 abstract = "In this chapter, we examine the twin support vector machine (TWSVM) for binary data classification, a model originally introduced by Jayadeva et al. TWSVM builds on the generalized eigenvalue proximal support vector machine (GEPSVM) developed by Mangasarian et al. Both TWSVM and GEPSVM aim to separate data classes using two non-parallel hyperplanes, with each hyperplane positioned closer to one class while remaining farther from the other. However, their approaches differ significantly: GEPSVM relies on eigenvectors derived from generalized eigenvalue problems, while TWSVM adheres more closely to the traditional support vector machine (SVM) framework by solving two smaller quadratic programming problems (QPPs) instead of one large one, resulting in improved computational efficiency. Shao et al. later introduced an enhanced version of TWSVM, called twin bounded support vector machines (TBSVM). The primary advantage of TBSVM over TWSVM is its inclusion of structural risk minimization by adding a regularization term. Rooted in statistical learning theory, this modification enhances classification performance. Another TWSVM variant, the least squares twin support vector machine (LS-TSVM), was proposed by Kumar and Gopal. This model addresses some TWSVM limitations by solving two sets of linear equations to directly obtain two non-parallel planes. However, LS-TSVM applies the principle of empirical risk minimization rather than structural risk minimization, focusing solely on minimizing training error, which can increase susceptibility to overfitting. To address this, Xu et al. proposed an improved LS-TSVM version, enhancing classifier accuracy. In this chapter, we will explore various optimization techniques for twin support vector machines in the primal space.",
 keywords = "Computer Science; Computer Vision; Linear Algebra; Machine Learning; Mathematics and Computing",
}

Generated by bib2html.pl (written by Patrick Riley) on Mon Dec 22, 2025 15:50:44