Pub. online: 1 Jan 2007 · Type: Research Article · Open Access
Volume 18, Issue 1 (2007), pp. 137–157
In this paper we propose a modified support vector machine framework, called Oblique Support Vector Machines (OSVMs), to improve classification capability. The principle of OSVMs is to add an orthogonal vector to the weight vector in order to rotate the support hyperplanes. In this way, not only is the regularized risk function revised, but the constraint functions are also modified. Under this modification, the separating hyperplane and the margin of separation are constructed more precisely. Moreover, to handle large-scale data problems, an iterative learning algorithm is proposed; it admits three training schemes: pattern-mode learning, semi-batch-mode learning, and batch-mode learning. In addition, a smoothing technique is adopted to convert the constrained nonlinear programming problem into an unconstrained optimization problem. Finally, experimental results and comparisons demonstrate that the performance of OSVMs is better than that of SVMs and SSVMs.
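The abstract does not spell out the OSVM formulation itself, but the smoothing step it mentions follows the standard SSVM idea: replace the non-differentiable plus function max(x, 0) in the hinge loss with the smooth approximation p(x, α) = x + (1/α) log(1 + e^{-αx}), so the constrained problem becomes an unconstrained, differentiable one. The following is a minimal sketch of that smoothing applied to a plain linear SVM under gradient descent; all function names, parameter values, and the toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    """Smooth approximation of max(x, 0):
    p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x))."""
    # np.logaddexp(0, -alpha*x) computes log(1 + exp(-alpha*x)) stably.
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def smooth_svm_objective(w, b, X, y, C=1.0, alpha=5.0):
    """Unconstrained smoothed objective (SSVM-style, illustrative):
    0.5*||w||^2 + C * sum p(1 - y_i (x_i.w + b), alpha)^2."""
    margins = 1.0 - y * (X @ w + b)
    return 0.5 * (w @ w) + C * np.sum(smooth_plus(margins, alpha) ** 2)

def train_smooth_svm(X, y, C=1.0, alpha=5.0, lr=0.01, iters=2000):
    """Minimize the smoothed objective with plain gradient descent."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(iters):
        margins = 1.0 - y * (X @ w + b)
        p = smooth_plus(margins, alpha)
        # d/dx p(x, alpha) = sigmoid(alpha * x)
        dp = 1.0 / (1.0 + np.exp(-alpha * margins))
        # Chain rule through margin_i = 1 - y_i (x_i.w + b).
        coef = 2.0 * C * p * dp * (-y)
        w -= lr * (w + X.T @ coef)
        b -= lr * np.sum(coef)
    return w, b

# Toy linearly separable data (labels in {-1, +1}), for illustration only.
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -2.0], [-1.0, -2.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = train_smooth_svm(X, y)
pred = np.sign(X @ w + b)
```

Because p(x, α) ≥ max(x, 0) and converges to it as α grows, the smoothed problem can be solved with any unconstrained optimizer (here gradient descent for brevity) while approximating the original constrained solution.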