Journal: Informatica
Volume 27, Issue 2 (2016), pp. 323–334
Abstract
This paper reviews the interplay between global optimization and probability models, concentrating on a class of deterministic optimization algorithms that are motivated by probability models for the objective function. Some complexity results are described for the univariate and multivariate cases.
Journal: Informatica
Volume 22, Issue 4 (2011), pp. 471–488
Abstract
We describe an adaptive algorithm for approximating the global minimum of a continuous univariate function. The convergence rate of the error is studied for the case of a random objective function distributed according to the Wiener measure.
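The adaptive idea can be illustrated with a small sketch (the function name p_algorithm, the midpoint sampling rule, and the improvement criterion below are our illustrative assumptions, not necessarily the paper's exact algorithm). Under a Wiener-process model, the conditional mean on a subinterval is the linear interpolant of the endpoint values and the conditional variance vanishes at the endpoints; the sketch scores each interval's midpoint by a standardized-improvement criterion and refines the most promising interval:

```python
import math

def p_algorithm(f, a, b, n_iter=30, eps=1e-2, sigma=1.0):
    """Adaptive univariate search under a Wiener-process model (illustrative sketch)."""
    xs = [a, b]
    ys = [f(a), f(b)]
    for _ in range(n_iter):
        y_star = min(ys) - eps          # improvement threshold
        best = None
        for i in range(len(xs) - 1):
            l, r = xs[i], xs[i + 1]
            yl, yr = ys[i], ys[i + 1]
            x = 0.5 * (l + r)           # simplification: score interval midpoints only
            # Conditional Wiener model on [l, r]: linear-interpolation mean,
            # variance proportional to (x - l)(r - x)/(r - l).
            mean = 0.5 * (yl + yr)
            var = sigma ** 2 * (x - l) * (r - x) / (r - l)
            # Standardized improvement; larger means higher chance of beating y_star.
            crit = (y_star - mean) / math.sqrt(var)
            if best is None or crit > best[0]:
                best = (crit, x, i)
        _, x_new, i = best
        xs.insert(i + 1, x_new)         # refine the most promising interval
        ys.insert(i + 1, f(x_new))
    i_min = min(range(len(ys)), key=ys.__getitem__)
    return xs[i_min], ys[i_min]
```

Because the criterion trades off the interpolated mean against the model variance, refinement concentrates near observed low values while still occasionally probing wide, unexplored intervals.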
Journal: Informatica
Volume 20, Issue 2 (2009), pp. 273–292
Abstract
The paper studies stochastic optimization problems in Reproducing Kernel Hilbert Spaces (RKHS). The objective function of such problems is a mathematical expectation functional depending on decision rules (or strategies), i.e., on functions of observed random parameters. Feasible rules are restricted to belong to an RKHS. Problems of this kind arise in on-line decision making and in statistical learning theory. We solve the problem by sample average approximation combined with Tikhonov regularization and establish sufficient conditions for uniform convergence of approximate solutions with probability one, together with a rule for downward adjustment of the regularization factor with increasing sample size.
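A minimal sketch of sample average approximation with Tikhonov regularization in an RKHS, under illustrative assumptions (the Gaussian kernel, squared loss, and the name saa_tikhonov are our choices, not the paper's): by the representer theorem, the regularized empirical problem over an RKHS reduces to a finite linear system in the kernel expansion coefficients.

```python
import numpy as np

def saa_tikhonov(x_obs, y_obs, alpha, gamma=1.0):
    """Sample-average approximation + Tikhonov regularization in an RKHS
    (kernel ridge sketch): minimize (1/n) sum (f(x_i) - y_i)^2 + alpha ||f||_H^2."""
    n = len(x_obs)
    # Gaussian (RBF) kernel Gram matrix over the sample
    diff = x_obs[:, None] - x_obs[None, :]
    K = np.exp(-gamma * diff ** 2)
    # Representer theorem: the minimizer is f = sum_i c_i k(x_i, .),
    # and the coefficients solve (K + n * alpha * I) c = y.
    c = np.linalg.solve(K + n * alpha * np.eye(n), y_obs)
    def f(x):
        return np.exp(-gamma * (x - x_obs) ** 2) @ c
    return f

# A downward adjustment of the regularization factor with sample size n,
# e.g. alpha_n = 1 / sqrt(n), is one illustrative choice of such a rule.
```

Larger alpha smooths the rule more strongly; the abstract's adjustment rule decreases alpha as the sample grows so that the bias from regularization vanishes in the limit.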
Journal: Informatica
Volume 6, Issue 1 (1995), pp. 93–117
Abstract
This work is our first attempt at establishing connections between evolutionary computation algorithms and stochastic approximation procedures. By treating evolutionary algorithms as recursive stochastic procedures, we study both constant-gain and decreasing-step-size algorithms. We formulate the problem in a rather general form and provide sufficient conditions for convergence (both with probability one and in the weak sense). Among other things, our approach reveals the natural connection between the discrete iterations and the continuous dynamics (ordinary differential equations and/or stochastic differential equations). We hope that this attempt will open up a new horizon for further research and lead to an in-depth understanding of the underlying algorithms.
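The view of an evolutionary recursion as a stochastic approximation procedure can be sketched with a toy example (our own assumptions throughout: Gaussian mutations, a finite-difference slope estimate, and a Robbins-Monro-style decreasing gain; this is not the paper's construction). The iterate then tracks the limiting ODE dx/dt = -f'(x):

```python
import random

def evolutionary_sa(f, x0, n_iter=2000, seed=0):
    """Evolution-style recursion viewed as stochastic approximation
    with decreasing step size (illustrative sketch)."""
    rng = random.Random(seed)
    x = x0
    for k in range(1, n_iter + 1):
        a_k = 1.0 / (k + 10)            # decreasing gain: sum a_k diverges, sum a_k^2 converges
        z = rng.gauss(0.0, 1.0)         # random "mutation" direction
        h = 0.1
        # Finite-difference estimate of the slope along the mutation direction:
        g = (f(x + h * z) - f(x - h * z)) / (2 * h) * z
        x = x - a_k * g                 # discrete iteration tracking dx/dt = -f'(x)
    return x
```

With a constant gain instead of a_k -> 0, the iterate would hover in a neighborhood of the minimizer rather than converge, which is exactly the constant-gain versus decreasing-step-size distinction the abstract draws.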
Journal: Informatica
Volume 4, Issues 3-4 (1993), pp. 335–350
Abstract
This paper investigates the convergence rate of conservative difference schemes (d.s.) with time-adaptive grids in cases where the space grid is irregular and the third boundary-value problem is considered for one-dimensional linear parabolic equations. The unconditional convergence of the constructed d.s. is proved in the C-metric at the rate O(h^2 + τ_0^(1/2)).
Journal: Informatica
Volume 3, Issue 2 (1992), pp. 275–279
Abstract
Several recent papers have started a discussion of global minimization algorithms for a broad class of functions. We present an argument for why this case differs from that of Lipschitzian functions with respect to convergence, and why, for a broad class of functions, an algorithm converges to the global minimum of the objective function if it generates an everywhere dense sequence of trial points.