The expected probability of misclassification of linear zero empirical error classifier
Volume 7, Issue 2 (1996), pp. 137–154
Pub. online: 1 January 1996
Type: Research Article
Abstract
There exist two principally different approaches to designing a classification rule. In the classical (parametric) approach, one parametrizes the conditional density functions of the pattern classes. In the second (nonparametric) approach, one parametrizes the type of the discriminant function and minimizes an empirical classification error to find the unknown coefficients of the discriminant function. A number of asymptotic expansions exist for the expected probability of misclassification of parametric classifiers; for nonparametric classifiers, only error bounds have been available so far. In this paper an exact analytical expression for the expected error EPN of the nonparametric linear zero empirical error classifier is derived for the case when the distributions of the pattern classes are spherically Gaussian. An asymptotic expansion of EPN is obtained for the case when both the number of learning patterns N and their dimensionality p increase infinitely. Tables of exact and approximate expected errors as functions of N, the dimensionality p, and the distance δ between the pattern classes are presented and compared with the expected error of Fisher's linear classifier; they indicate that the minimum empirical error classifier can be used even in cases where the dimensionality exceeds the number of learning examples.
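The setting the abstract describes can be illustrated with a small numerical sketch. The code below is a hypothetical illustration, not the paper's derivation: it draws two spherically Gaussian classes in p dimensions whose means are separated by a Mahalanobis distance δ, searches for a linear rule with zero empirical (training) error via a simple perceptron-style update, fits Fisher's linear classifier on the same learning patterns, and evaluates the true (generalization) probability of misclassification of each rule analytically, since for a linear rule w, b and spherical Gaussian classes that probability is a value of the standard normal CDF Φ. All parameter values (p, N, δ) and the perceptron iteration cap are arbitrary choices for the example.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Two spherically Gaussian classes in p dimensions, means +/- (delta/2) e1,
# so the distance between the class means is delta (identity covariance).
p, N, delta = 10, 20, 3.0          # illustrative values; p is comparable to N
mu = np.zeros(p)
mu[0] = delta / 2.0
X1 = rng.standard_normal((N, p)) + mu   # learning patterns of class +1
X2 = rng.standard_normal((N, p)) - mu   # learning patterns of class -1
X = np.vstack([X1, X2])
y = np.hstack([np.ones(N), -np.ones(N)])

# Perceptron-style search for a linear discriminant with zero empirical error
# (the learning set is almost surely linearly separable when p is close to N).
w, b = np.zeros(p), 0.0
for _ in range(10000):
    misclassified = y * (X @ w + b) <= 0
    if not misclassified.any():
        break                       # zero empirical classification error reached
    i = np.flatnonzero(misclassified)[0]
    w += y[i] * X[i]
    b += y[i]

# Fisher's linear classifier from the same learning patterns (pooled covariance).
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
S = np.cov(np.vstack([X1 - m1, X2 - m2]).T) + 1e-6 * np.eye(p)
w_f = np.linalg.solve(S, m1 - m2)
b_f = -w_f @ (m1 + m2) / 2.0

# True misclassification probability of a linear rule (w, b) under the model:
# for class +1 it is Phi(-(w.mu + b)/||w||), for class -1 it is
# Phi(-(w.mu - b)/||w||); the classes are equiprobable, so average the two.
def Phi(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def true_error(w, b):
    s = np.linalg.norm(w)
    return 0.5 * (Phi(-(w @ mu + b) / s) + Phi(-(w @ mu - b) / s))

print("zero empirical error rule:", true_error(w, b))
print("Fisher's linear classifier:", true_error(w_f, b_f))
print("Bayes error Phi(-delta/2):", Phi(-delta / 2.0))
```

Both rules have true error above the Bayes limit Φ(−δ/2); averaging such runs over many learning sets is the Monte Carlo counterpart of the expected error EPN that the paper computes analytically.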