Journal: Informatica
Volume 11, Issue 4 (2000), pp. 381–396
Abstract
Estimating the generalization performance of a classifier is one of the most important problems in pattern classification and neural network training theory. In this paper we estimate the generalization error (mean expected probability of misclassification) for the randomized linear zero empirical error (RLZEE) classifier considered by Raudys, Dičiūnas and Basalykas. Instead of the “non-explicit” asymptotics of the generalization error of the RLZEE classifier for centered multivariate spherically Gaussian classes proposed by Basalykas et al. (1996), we obtain an “explicit” and simpler asymptotics. We also present numerical simulations that illustrate our theoretical results and compare them with each other and with previously obtained results.
Journal: Informatica
Volume 2, Issue 3 (1991), pp. 434–454
Abstract
The smoothing constant λ is the most important characteristic of the nonparametric Parzen window classifier (PWC). The PWC tends to a one-nearest-neighbour classifier as λ tends to zero and to a parametric linear Euclidean distance classifier as λ tends to infinity. The asymptotic probability of misclassification of the PWC decreases with decreasing λ. The sensitivity of the PWC to the finiteness of the training data depends on the true (intrinsic) dimensionality of the data, and it increases as λ decreases. It is proposed to determine the optimal value of the smoothing constant from a smoothed empirical graph of the dependence of the expected probability of misclassification on λ. The graph can be estimated by means of the leave-one-out or hold-out methods simultaneously for a number of values of λ chosen from the interval (0.001–1000) on a logarithmic scale.
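The selection procedure described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes a Gaussian kernel for the Parzen window, synthetic two-class Gaussian data, and a simple moving-average smoother for the empirical error graph.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class Gaussian data (illustrative only, not the paper's data).
n, d = 40, 2
X = np.vstack([rng.normal(0.0, 1.0, (n, d)),
               rng.normal(2.0, 1.0, (n, d))])
y = np.array([0] * n + [1] * n)

def parzen_loo_error(X, y, lam):
    """Leave-one-out misclassification rate of a Parzen window
    classifier with a Gaussian kernel of width lam (an assumed kernel)."""
    # Pairwise squared distances between all training points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * lam ** 2))
    np.fill_diagonal(K, 0.0)  # leave each point out of its own estimate
    errors = 0
    for i in range(len(y)):
        score0 = K[i, y == 0].sum()   # kernel mass from class 0
        score1 = K[i, y == 1].sum()   # kernel mass from class 1
        errors += int((score1 > score0) != (y[i] == 1))
    return errors / len(y)

# Logarithmic grid over the interval (0.001, 1000), as the abstract suggests.
lams = np.logspace(-3, 3, 25)
errs = np.array([parzen_loo_error(X, y, lam) for lam in lams])

# Smooth the empirical error graph before picking the optimum.
smoothed = np.convolve(errs, np.ones(3) / 3.0, mode="same")
lam_opt = lams[int(np.argmin(smoothed))]
print(f"optimal lambda ~ {lam_opt:.3g}")
```

For well-separated classes the leave-one-out error typically rises toward both ends of the grid (very small λ behaves like nearest-neighbour on sparse data; very large λ flattens the kernel), so the smoothed graph has an interior minimum from which λ is read off.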