Milan Hladík's Publications:

Universum parametric-margin $\nu$-support vector machine for classification using the difference of convex functions algorithm

Hossein Moosaei, Fatemeh Bazikar, Saeed Ketabchi, and Milan Hladík. Universum parametric-margin $\nu$-support vector machine for classification using the difference of convex functions algorithm. Appl. Intell., 52(3):2634–2654, 2022.

Abstract

Universum data that do not belong to any class of a classification problem can be exploited to utilize prior knowledge to improve generalization performance. In this paper, we design a novel parametric $\nu$-support vector machine with universum data ($\mathfrak{U}$Par-$\nu$-SVM). Unlabeled samples can be integrated into supervised learning by means of $\mathfrak{U}$Par-$\nu$-SVM. We propose a fast method to solve the suggested problem of $\mathfrak{U}$Par-$\nu$-SVM. The primal problem of $\mathfrak{U}$Par-$\nu$-SVM, which is a nonconvex optimization problem, is transformed into an unconstrained optimization problem so that the objective function can be treated as a difference of two convex functions (DC). To solve this unconstrained problem, a boosted difference of convex functions algorithm (BDCA) based on a generalized Newton method is suggested (named DC-$\mathfrak{U}$Par-$\nu$-SVM). We examined our approach on UCI benchmark data sets, NDC data sets, a handwritten digit recognition data set, and a landmine detection data set. The experimental results confirmed the effectiveness and superiority of the proposed method for solving classification problems in comparison with other methods.
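
The boosted DCA (BDCA) mentioned in the abstract augments each DCA iteration with a line search along the direction produced by the convex subproblem. The sketch below is only a generic illustration of that scheme for minimizing a DC objective $f(w) = g(w) - h(w)$, assuming access to a convex-subproblem solver; the names solve_convex_subproblem and grad_h and the step-size parameters are placeholders, and the code does not reproduce the paper's DC-$\mathfrak{U}$Par-$\nu$-SVM formulation or its generalized Newton solver.

    import numpy as np

    # Illustrative BDCA loop for minimizing f(w) = g(w) - h(w), with g and h convex.
    # solve_convex_subproblem(s) is assumed to return argmin_y { g(y) - <s, y> };
    # grad_h(w) returns a (sub)gradient of h at w. All names here are placeholders.
    def bdca(w0, grad_h, solve_convex_subproblem, f, max_iter=100, tol=1e-6,
             alpha=0.05, beta=0.5, lam0=1.0):
        w = w0
        for _ in range(max_iter):
            # Plain DCA step: linearize h at w and solve the convex subproblem.
            y = solve_convex_subproblem(grad_h(w))
            d = y - w                      # boosting direction
            if np.linalg.norm(d) < tol:
                break
            # Backtracking line search along d (the "boosting" step of BDCA):
            # accept y + lam*d if it achieves sufficient decrease over f(y).
            lam = lam0
            while lam > 1e-12 and f(y + lam * d) > f(y) - alpha * lam**2 * np.dot(d, d):
                lam *= beta
            w = y + lam * d if lam > 1e-12 else y
        return w

The point of the line search is that the DCA iterate y already decreases f, and moving further along d = y - w can yield additional descent per iteration, which is the acceleration effect the abstract attributes to BDCA.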

BibTeX

@article{MooBaz2022a,
 author = "Hossein Moosaei and Fatemeh Bazikar and Saeed Ketabchi and Milan Hlad\'{\i}k",
 title = "Universum parametric-margin $\nu$-support vector machine for classification using the difference of convex functions algorithm",
 journal = "Appl. Intell.",
 fjournal = "Applied Intelligence",
 volume = "52",
 number = "3",
 pages = "2634--2654",
 year = "2022",
 doi = "10.1007/s10489-021-02402-6",
 issn = "1573-7497",
 url = "https://link.springer.com/article/10.1007/s10489-021-02402-6",
 bib2html_dl_html = "https://doi.org/10.1007/s10489-021-02402-6",
 bib2html_dl_pdf = "https://rdcu.be/cHAms",
 abstract = "Universum data that do not belong to any class of a classification problem can be exploited to utilize prior knowledge to improve generalization performance. In this paper, we design a novel parametric $\nu$-support vector machine with universum data ($\mathfrak{U}$Par-$\nu$-SVM). Unlabeled samples can be integrated into supervised learning by means of $\mathfrak{U}$Par-$\nu$-SVM. We propose a fast method to solve the suggested problem of $\mathfrak{U}$Par-$\nu$-SVM. The primal problem of $\mathfrak{U}$Par-$\nu$-SVM, which is a nonconvex optimization problem, is transformed into an unconstrained optimization problem so that the objective function can be treated as a difference of two convex functions (DC). To solve this unconstrained problem, a boosted difference of convex functions algorithm (BDCA) based on a generalized Newton method is suggested (named DC-$\mathfrak{U}$Par-$\nu$-SVM). We examined our approach on UCI benchmark data sets, NDC data sets, a handwritten digit recognition data set, and a landmine detection data set. The experimental results confirmed the effectiveness and superiority of the proposed method for solving classification problems in comparison with other methods.",
 keywords = "Universum; Par-$\nu$-support vector machine; Nonconvex optimization; DC programming; DCA; BDCA; Modified Newton method",
}

Generated by bib2html.pl (written by Patrick Riley) on Mon Apr 15, 2024 08:26:42