Journal: Informatica
Volume 6, Issue 1 (1995), pp. 93–117
Abstract
This work is our first attempt at establishing connections between evolutionary computation algorithms and stochastic approximation procedures. By treating evolutionary algorithms as recursive stochastic procedures, we study both constant gain and decreasing step size algorithms. We formulate the problem in a rather general form and supply sufficient conditions for convergence (both with probability one and in the weak sense). Among other things, our approach reveals the natural connection between the discrete iterations and the continuous dynamics (ordinary differential equations and/or stochastic differential equations). We hope that this attempt will open up a new horizon for further research and lead to an in-depth understanding of the underlying algorithms.
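As a point of reference (in generic notation, not necessarily that of the paper), recursions of the two kinds mentioned above have the standard stochastic approximation form
\[
X_{n+1} = X_n + a_n\bigl[\bar g(X_n) + \xi_n\bigr], \qquad a_n > 0,\ \sum_n a_n = \infty,\ \sum_n a_n^2 < \infty,
\]
for decreasing step sizes, and
\[
X^{\varepsilon}_{n+1} = X^{\varepsilon}_n + \varepsilon\bigl[\bar g(X^{\varepsilon}_n) + \xi_n\bigr], \qquad \varepsilon > 0 \text{ fixed},
\]
for a constant gain, where \(\bar g\) denotes the mean dynamics and \(\xi_n\) the random perturbation. Under suitable conditions the piecewise constant interpolations of the iterates track the ordinary differential equation \(\dot x = \bar g(x)\), with the fluctuations about its trajectories described by a stochastic differential equation; this is the discrete-to-continuous connection referred to in the abstract.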
Journal: Informatica
Volume 3, Issue 1 (1992), pp. 98–118
Abstract
We consider a class of identification algorithms for distributed parameter systems. Utilizing stochastic optimization techniques, sequences of estimators are constructed by minimizing appropriate functionals. The main effort is to develop weak and strong invariance principles for the underlying algorithms. By means of weak convergence methods, a functional central limit theorem is established. Using the Skorohod imbedding, a strong invariance principle is obtained. These invariance principles provide very precise rates of convergence for the parameter estimates, yielding important information for experimental design.
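In generic notation (again not that of the paper itself), results of this type concern the normalized estimation error. Writing \(\hat\theta_n\) for the estimator after \(n\) observations and \(\theta^*\) for the true parameter, a functional central limit theorem (weak invariance principle) asserts that \(\sqrt{n}(\hat\theta_n - \theta^*)\) is asymptotically normal with some covariance \(\Sigma\), and that a suitably interpolated error process converges weakly to a Brownian motion with covariance \(\Sigma\). A strong invariance principle obtained via the Skorohod imbedding typically states that, on an enlarged probability space, there is a Brownian motion \(W\) with covariance \(\Sigma\) such that
\[
n(\hat\theta_n - \theta^*) = W(n) + o\bigl(\sqrt{n \log\log n}\bigr) \quad \text{a.s.},
\]
which by the law of the iterated logarithm yields the pathwise rate
\[
\limsup_{n\to\infty} \frac{\sqrt{n}\,\lVert\hat\theta_n - \theta^*\rVert}{\sqrt{2\log\log n}} < \infty \quad \text{a.s.}
\]
This is the sense in which such invariance principles give very precise rates of convergence.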