Pub. online: 1 Jan 2017 · Type: Research Article · Open Access
Journal:Informatica
Volume 28, Issue 4 (2017), pp. 651–664
Abstract
In this paper, a method for training the Support Vector Regression (SVR) machine on complex-valued data is presented, which takes into account the information of the real and imaginary parts simultaneously. Compared to existing methods, it not only accounts for the geometric information of the complex-valued data, but can also be trained with the same amount of computation as the original real-valued SVR. The accuracy of the proposed method is analysed through simulation experiments, and it is applied successfully to anti-interference for satellite navigation, demonstrating its effectiveness in practice.
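The abstract does not spell out the training scheme, so the sketch below is an assumption rather than the paper's actual method: a common baseline for complex-valued learning embeds each complex sample into a real vector of twice the dimension, so that standard real-valued SVR machinery applies while the pairwise Euclidean geometry of the complex data is preserved. The helper name `complex_to_real` is hypothetical.

```python
import numpy as np

def complex_to_real(Z):
    """Embed a complex feature matrix of shape (n, d) into R^(2d)
    by stacking the real and imaginary parts column-wise."""
    return np.hstack([Z.real, Z.imag])

# Two toy complex samples with d = 2 features each.
Z = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 2j]])
X = complex_to_real(Z)  # shape (2, 4), real-valued

# The embedding preserves squared Euclidean distances, so a
# real-valued SVR trained on X sees the same geometry as the
# complex samples in Z:
d_complex = np.sum(np.abs(Z[0] - Z[1]) ** 2)
d_real = np.sum((X[0] - X[1]) ** 2)
assert np.isclose(d_complex, d_real)
```

Because distances are preserved, any kernel that depends only on Euclidean distance (e.g. an RBF kernel) gives identical Gram matrices on `Z` and `X`, which is one way to make a real-valued solver see the complex geometry.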
Journal:Informatica
Volume 2, Issue 3 (1991), pp. 434–454
Abstract
The smoothing constant λ is the most important characteristic of the nonparametric Parzen window classifier (PWC). The PWC tends to a one-nearest-neighbour classifier as λ tends to zero, and to a parametric linear Euclidean-distance classifier as λ tends to infinity. The asymptotic probability of misclassification of the PWC decreases as λ decreases. The sensitivity of the PWC to the finiteness of the training data depends on the true (intrinsic) dimensionality of the data, and it increases as λ decreases. It is proposed to determine an optimal value of the smoothing constant from a smoothed empirical graph of the dependence of the expected probability of misclassification on λ. The graph can be estimated by means of the leave-one-out or hold-out methods, simultaneously for a number of values of λ chosen on a logarithmic scale from the interval (0.001–1000).
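The selection procedure the abstract describes can be sketched in a few lines of numpy. The abstract does not fix the window shape, so the Gaussian kernel and the toy two-class data below are illustrative assumptions; only the leave-one-out error over a logarithmic λ grid follows the abstract.

```python
import numpy as np

def pwc_loo_error(X, y, lam):
    """Leave-one-out misclassification rate of a Parzen window
    classifier with a Gaussian window of width lam (assumed kernel)."""
    # Pairwise squared Euclidean distances between all samples.
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    K = np.exp(-D / (2.0 * lam ** 2))
    np.fill_diagonal(K, 0.0)  # leave-one-out: a sample never votes for itself
    classes = np.unique(y)
    errors = 0
    for i in range(len(X)):
        # Classify sample i by the summed kernel weight of each class.
        scores = [K[i, y == c].sum() for c in classes]
        pred = classes[int(np.argmax(scores))]
        errors += int(pred != y[i])
    return errors / len(X)

# Logarithmic grid over (0.001, 1000), as the abstract suggests.
lams = np.logspace(-3, 3, 13)

# Toy two-class Gaussian data (illustrative, not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (30, 2)),
               rng.normal(3.0, 1.0, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

errs = [pwc_loo_error(X, y, lam) for lam in lams]
best = lams[int(np.argmin(errs))]
```

In practice the empirical curve `errs` would be smoothed before picking the minimum, as the abstract recommends; this sketch just takes the raw argmin for brevity.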