A tool for modeling the propagation of optical beams is proposed and investigated. Truncated Laguerre–Gauss (LG) polynomial series are used to approximate the field at any point in free space. A posteriori error estimates in various norms are derived from the errors of the input functions. The accumulation of truncation errors during free-space propagation is investigated theoretically. The convergence rate of truncated LG series is obtained numerically for super-Gaussian beams. The computational cost of the algorithm is optimized by choosing its parameters so that the error is minimized. Results of numerical experiments are presented.
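The convergence of a truncated LG expansion for a super-Gaussian beam can be illustrated with a minimal sketch (this is not the paper's implementation; the unit waist, the truncated radial domain, and the trapezoidal quadrature are illustrative assumptions): project a super-Gaussian profile onto radially symmetric Laguerre–Gauss modes and observe the L2 truncation error decrease as terms are added.

```python
import numpy as np
from scipy.special import eval_laguerre
from scipy.integrate import trapezoid

def lg_mode(p, r):
    # Radial Laguerre-Gauss mode (azimuthal index 0, unit waist),
    # orthonormal with respect to the weight r dr on [0, inf).
    return 2.0 * eval_laguerre(p, 2.0 * r**2) * np.exp(-r**2)

# Illustrative choices: truncated radial domain [0, 6] and a
# super-Gaussian input exp(-r^4) (exponent m = 2, unit width).
r = np.linspace(0.0, 6.0, 4001)
f = np.exp(-r**4)

def l2_error(n_terms):
    # Project f onto the first n_terms modes and return the
    # L2 norm (weight r dr) of the truncation residual.
    approx = np.zeros_like(r)
    for p in range(n_terms):
        psi = lg_mode(p, r)
        c = trapezoid(f * psi * r, r)   # expansion coefficient
        approx += c * psi
    return np.sqrt(trapezoid((f - approx)**2 * r, r))

for n in (5, 10, 20):
    print(n, l2_error(n))
```

The error decays monotonically with the number of retained terms; repeating the experiment for different super-Gaussian exponents is one way to reproduce the kind of convergence-rate study the abstract describes.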