A Dual-Based Pruning Method for the Least-Squares Support Vector Machine
Date: 2023

Abstract
The least-squares support vector machine (LS-SVM) is generally parameterized by a large number of support vectors, which slows down classification. This paper proposes to search for and prune two types of support vectors. The first type comprises potential outliers, each of which is misclassified by the model trained on the remaining samples. The second type is the sample whose removal causes the least perturbation to the dual objective function. Without explicitly re-running the training procedure, the LS-SVM model obtained by omitting a training sample is derived analytically from the LS-SVM trained on the whole training set. This derivation reduces the computational cost of pruning a sample and constitutes the major technical contribution of the paper. Experimental results on six UCI datasets show that, compared with classical pruning methods, the proposed algorithm enhances the sparsity of the LS-SVM significantly while maintaining satisfactory generalization performance.
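To make the leave-one-out idea concrete, the sketch below illustrates (it is not the paper's implementation) how quantities about "the model trained without sample i" can be read off a single LS-SVM fit. It assumes an RBF kernel and uses hypothetical helper names (`rbf_kernel`, `train_ls_svm`, `loo_outlier_mask`); the residual formula alpha_i / (A^{-1})_{ii} is the standard exact leave-one-out result for LS-SVM systems, used here only to illustrate flagging outlier-like candidates without retraining. The paper's exact criteria, including the dual-objective perturbation score, are not reproduced here.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Z (illustrative choice)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_ls_svm(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual linear system
        [[K + I/gamma, 1], [1^T, 0]] [alpha; b] = [y; 0],
    giving the decision function f(x) = sum_i alpha_i k(x, x_i) + b."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = K + np.eye(n) / gamma
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    rhs = np.append(y.astype(float), 0.0)
    A_inv = np.linalg.inv(A)          # kept explicitly: its diagonal yields leave-one-out residuals
    sol = A_inv @ rhs
    return sol[:n], sol[n], A_inv, K  # alpha, b, A^{-1}, kernel matrix

def loo_outlier_mask(y, alpha, A_inv):
    """Exact leave-one-out residuals without retraining: r_i = alpha_i / (A^{-1})_{ii}.
    Sample i is flagged when the model fitted without it misclassifies it."""
    r = alpha / np.diag(A_inv)[:len(y)]
    f_loo = y - r                      # leave-one-out decision value at x_i
    return y * f_loo <= 0.0            # True -> misclassified when left out

# Toy usage: flag LOO-misclassified samples as pruning candidates.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=40))
alpha, b, A_inv, K = train_ls_svm(X, y)
outliers = loo_outlier_mask(y, alpha, A_inv)
```

For comparison, classical LS-SVM pruning simply removes the samples with the smallest |alpha_i|; the sketch above only shows how leave-one-out information can be obtained from the full solution in closed form.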