Pub. online: 1 Jan 2009 · Type: Research Article · Open Access
Volume 20, Issue 2 (2009), pp. 173–186
In this paper, we consider the problem of semi-supervised binary classification by Support Vector Machines (SVM). This problem is explored as an unconstrained and non-smooth optimization task when part of the available data is unlabelled. We apply non-smooth optimization techniques to classification, in which the objective function is non-convex and non-differentiable, and therefore difficult to minimize. We explore and compare the properties of Simulated Annealing and of Simultaneous Perturbation Stochastic Approximation (SPSA) algorithms (SPSA with the Lipschitz Perturbation Operator, SPSA with the Uniform Perturbation Operator, Standard Finite Difference Approximation) for semi-supervised SVM classification. Numerical results are given, obtained by running the proposed methods on several standard test problems drawn from the binary classification literature. The performance of the classifiers was evaluated by analyzing Receiver Operating Characteristics (ROC).
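The SPSA scheme mentioned above estimates a full gradient from only two function evaluations per iteration, regardless of dimension, which is what makes it attractive for non-smooth objectives. A minimal sketch of standard SPSA with Rademacher (±1) perturbations is given below; the function name, gain constants, and test objective are illustrative choices, not taken from the paper, and the paper's Lipschitz and Uniform perturbation variants would replace the perturbation distribution.

```python
import numpy as np

def spsa_minimize(f, theta0, n_iter=2000, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    """Minimize f by Simultaneous Perturbation Stochastic Approximation.

    At each step, both gain sequences decay polynomially and a single
    random +/-1 direction perturbs all coordinates at once, so the
    gradient estimate costs two evaluations of f in any dimension.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(n_iter):
        a_k = a / (k + 1) ** alpha          # step-size gain
        c_k = c / (k + 1) ** gamma          # perturbation gain
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher direction
        # Simultaneous-perturbation gradient estimate (note: 1/delta == delta here)
        ghat = (f(theta + c_k * delta) - f(theta - c_k * delta)) / (2.0 * c_k) * delta
        theta = theta - a_k * ghat
    return theta

# Illustrative run on a smooth test objective (not one of the paper's problems)
result = spsa_minimize(lambda x: np.sum(x ** 2), theta0=[2.0, -3.0])
```

The decay exponents 0.602 and 0.101 are the gains commonly recommended for finite-sample use of SPSA; for a non-smooth SVM objective, the only change to the recursion itself is the choice of perturbation operator.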
Pub. online: 1 Jan 1995 · Type: Research Article · Open Access
Volume 6, Issue 1 (1995), pp. 93–117
This work is our first attempt at establishing the connections between evolutionary computation algorithms and stochastic approximation procedures. By treating evolutionary algorithms as recursive stochastic procedures, we study both constant-gain and decreasing step size algorithms. We formulate the problem in a rather general form and supply sufficient conditions for convergence (both with probability one and in the weak sense). Among other things, our approach reveals the natural connection between the discrete iterations and the continuous dynamics (ordinary differential equations and/or stochastic differential equations). We hope that this attempt will open up a new horizon for further research and lead to an in-depth understanding of the underlying algorithms.
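The recursive-stochastic-procedure viewpoint can be illustrated with the textbook Robbins–Monro recursion x_{k+1} = x_k + a_k (h(x_k) + ξ_k): its mean trajectory follows the ODE dx/dt = h(x), and with decreasing steps satisfying Σ a_k = ∞, Σ a_k² < ∞ the iterates converge to the ODE's equilibrium. The sketch below uses a drift h(x) = −x of my own choosing (not from the paper) so that the equilibrium is 0; a constant gain would instead keep the iterate fluctuating around the equilibrium.

```python
import numpy as np

def sa_recursion(x0, n_iter, step, seed=0):
    """Run the stochastic approximation recursion
        x_{k+1} = x_k + a_k * (h(x_k) + xi_k)
    with drift h(x) = -x, whose mean ODE dx/dt = -x has equilibrium 0.
    `step(k)` returns the gain a_k; xi_k is i.i.d. Gaussian noise.
    """
    rng = np.random.default_rng(seed)
    x = float(x0)
    for k in range(n_iter):
        a_k = step(k)
        x += a_k * (-x + rng.normal())  # noisy evaluation of the drift
    return x

# Decreasing gains a_k = 1/(k+1): sum diverges, sum of squares converges,
# so the iterate tracks the ODE and the noise averages out.
x_dec = sa_recursion(5.0, 100_000, lambda k: 1.0 / (k + 1))
```

With a constant gain (e.g. `lambda k: 0.1`) the same recursion reaches a neighbourhood of the equilibrium quickly but keeps a persistent noise floor, which is the constant-gain regime the abstract contrasts with decreasing step sizes.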