Tuning support vector machines for minimax and Neyman-Pearson classification

Date
2008-08-19
Abstract

This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive SVM. In practice, however, because these criteria require especially accurate error estimation, standard techniques for tuning SVM parameters, such as cross-validation, can lead to poor classifier performance. To address this issue, we first prove that the usual cost-sensitive SVM, here called the 2C-SVM, is equivalent to another formulation called the 2ν-SVM. We then exploit a characterization of the 2ν-SVM parameter space to develop a simple yet powerful approach to error estimation based on smoothing. In an extensive experimental study we demonstrate that smoothing significantly improves the accuracy of cross-validation error estimates, leading to dramatic performance gains. Furthermore, we propose coordinate descent strategies that offer significant gains in computational efficiency, with little to no loss in performance.
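
The following is a minimal sketch, not the authors' implementation: it uses scikit-learn's SVC with per-class weights to play the role of the two costs in the 2C-SVM, and it smooths cross-validated error estimates over an illustrative (C, weight) grid rather than the paper's 2ν parameterization. The function names, grid ranges, smoothing window, and target false-alarm rate alpha are all assumptions made for illustration.

```python
# Sketch of Neyman-Pearson parameter selection with a cost-sensitive SVM.
# Assumptions: scikit-learn, labels y in {0, 1}, RBF kernel, and a simple
# 3x3 box filter as the smoothing step; the paper smooths cross-validation
# estimates in 2nu-SVM coordinates instead.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold

def cv_error_rates(X, y, C, w_pos, n_splits=5):
    """Cross-validated false-alarm and miss rates for one (C, w_pos) setting."""
    fa, miss = [], []
    for train, test in StratifiedKFold(n_splits=n_splits).split(X, y):
        # class_weight scales C per class, giving the two costs of a 2C-SVM
        clf = SVC(kernel="rbf", C=C, class_weight={0: 1.0, 1: w_pos})
        clf.fit(X[train], y[train])
        pred = clf.predict(X[test])
        neg, pos = y[test] == 0, y[test] == 1
        fa.append(np.mean(pred[neg] == 1))    # P(declare 1 | class 0)
        miss.append(np.mean(pred[pos] == 0))  # P(declare 0 | class 1)
    return np.mean(fa), np.mean(miss)

def np_select(X, y, alpha=0.1):
    """Pick the grid point with the smallest smoothed miss rate whose
    smoothed false-alarm estimate stays at or below alpha."""
    Cs = np.logspace(-2, 3, 10)   # illustrative grid
    ws = np.logspace(-2, 2, 10)   # relative cost of class-1 errors
    FA = np.zeros((len(Cs), len(ws)))
    MISS = np.zeros_like(FA)
    for i, C in enumerate(Cs):
        for j, w in enumerate(ws):
            FA[i, j], MISS[i, j] = cv_error_rates(X, y, C, w)

    def smooth(E):
        # Average each entry with its grid neighbors (3x3 box filter)
        P = np.pad(E, 1, mode="edge")
        return sum(P[a:a + E.shape[0], b:b + E.shape[1]]
                   for a in range(3) for b in range(3)) / 9.0

    FA_s, MISS_s = smooth(FA), smooth(MISS)
    feasible = FA_s <= alpha
    i, j = np.unravel_index(
        np.argmin(np.where(feasible, MISS_s, np.inf)), FA_s.shape)
    return Cs[i], ws[j]
```

A call such as np_select(X, y, alpha=0.05) then returns an illustrative (C, weight) pair; the smoothing step is what makes the raw, noisy cross-validation surfaces usable for choosing it, which is the effect the paper quantifies.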

Type
Report
Keywords
error estimation, minimax classification, support vector machines, Neyman-Pearson classification, parameter selection
Citation

C. D. Scott, R. G. Baraniuk, and M. A. Davenport, "Tuning support vector machines for minimax and Neyman-Pearson classification," Rice University ECE Technical Report, no. TREE 0804, 2008.
