DIMACS TR: 2009-18
Experimental Study of Support Vector Machines Based on Linear and Quadratic Optimization Criteria
Authors: Alexey Nefedov, Jiankuan Ye, Casimir Kulikowski, Ilya Muchnik and Kenton Morgan
We present results from a comparative empirical study on the performance of two methods for constructing support vector machines (SVMs). The first method is the conventional one based on the quadratic programming approach, which builds the optimal separating hyperplane maximizing the margin between the two classes (optimal SVM). The second method is based on the linear programming approach suggested by Vapnik, which builds a separating hyperplane with the minimum number of support vectors (heuristic SVM). Using synthetic data from two classes, we compare the classification performance of these SVMs, with an in-depth geometrical comparison of their separating hyperplanes and support vectors. We show that both classifiers achieve practically identical classification accuracy and generalization performance. However, the heuristic SVM has many fewer support vectors than the optimal SVM. In addition, in contrast to the optimal SVM, its support vectors lie on the outermost borders of their classes, at the maximum distance from the opposite class. In future work, we will seek a theoretical explanation for these geometrical patterns of the heuristic SVM, and will also compare the two classifiers on real benchmark data.
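The linear programming formulation mentioned above can be illustrated concretely. The following is a minimal sketch (not the authors' code) of Vapnik's LP approach: the hyperplane is expanded over the training points as w = Σ_j α_j y_j x_j, the objective Σ_j α_j is minimized subject to the margin constraints y_i(Σ_j α_j y_j K(x_i, x_j) + b) ≥ 1, and the L1-type objective drives most α_j to zero, yielding few support vectors. The synthetic two-cluster data, tolerance, and use of `scipy.optimize.linprog` are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
# Synthetic, linearly separable two-class data (illustrative stand-in
# for the study's synthetic data): two Gaussian clusters in the plane.
X = np.vstack([rng.normal([-2.0, 0.0], 0.5, (20, 2)),
               rng.normal([+2.0, 0.0], 0.5, (20, 2))])
y = np.array([-1] * 20 + [+1] * 20)
n = len(y)

K = X @ X.T  # linear kernel Gram matrix

# LP variables: alpha_1..alpha_n >= 0, plus the free bias b.
# Objective: minimize sum(alpha)  (encourages sparsity -> few support vectors).
c = np.concatenate([np.ones(n), [0.0]])

# Margin constraints y_i (sum_j alpha_j y_j K_ij + b) >= 1,
# written in linprog's A_ub x <= b_ub form.
A_ub = np.hstack([-(y[:, None] * K * y[None, :]),
                  -y[:, None].astype(float)])
b_ub = -np.ones(n)
bounds = [(0, None)] * n + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
alpha, b = res.x[:n], res.x[n]

sv = np.flatnonzero(alpha > 1e-8)        # support vectors: nonzero alphas
pred = np.sign(K @ (alpha * y) + b)      # decision rule on the training set
print("support vectors:", len(sv), "training accuracy:", (pred == y).mean())
```

On separable data the margin constraints guarantee perfect training accuracy, and the sparsity of the LP solution means only a handful of the 40 points end up as support vectors; these tend to be the points on the outer borders of each cluster, matching the geometrical pattern reported above.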
Paper Available at: