Pub. online: 2 Jun 2020  Type: Research Article  Open Access
Journal: Informatica
Volume 31, Issue 2 (2020), pp. 249–275
Abstract
Emotion recognition from facial expressions has gained much interest over the last few decades. In the literature, the common approach to facial emotion recognition (FER) consists of the following steps: image pre-processing, face detection, facial feature extraction, and facial expression classification (recognition). We have developed a method for FER that differs fundamentally from this common approach. Our method is based on the dimensional model of emotions and on the kriging predictor of a fractional Brownian vector field. The classification problem related to the recognition of facial emotions is formulated and solved. The relationships between different emotions are estimated by expert psychologists, who place the emotions as points on a plane. The goal is to estimate, by kriging, the position of a new picture's emotion on the plane and to determine which of the emotions identified by the psychologists is the closest one. Seven basic emotions (Joy, Sadness, Surprise, Disgust, Anger, Fear, and Neutral) have been chosen. The accuracy of classification into seven classes is approximately 50% when the decision is made on the basis of the closest basic emotion. It has been ascertained that the kriging predictor is suitable for facial emotion recognition in the case of small sets of pictures. More sophisticated classification strategies, in which the basic emotions are grouped, may increase the accuracy.
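As an illustration of the final decision step described above, the Python sketch below assigns a point predicted on the emotion plane to the nearest of the seven basic emotions. The anchor coordinates are hypothetical placeholders; in the paper the actual positions are assigned by expert psychologists.

```python
import numpy as np

# Hypothetical coordinates of the seven basic emotions on the emotion plane
# (placeholders; the actual positions are set by expert psychologists).
BASIC_EMOTIONS = {
    "Joy":      ( 0.8,  0.5),
    "Sadness":  (-0.7, -0.4),
    "Surprise": ( 0.3,  0.9),
    "Disgust":  (-0.6,  0.3),
    "Anger":    (-0.5,  0.7),
    "Fear":     (-0.4,  0.8),
    "Neutral":  ( 0.0,  0.0),
}

def classify_emotion(predicted_point):
    """Assign the basic emotion whose plane position is closest (Euclidean)
    to the point predicted for a new face image by the kriging model."""
    point = np.asarray(predicted_point, dtype=float)
    distances = {name: np.linalg.norm(point - np.asarray(anchor))
                 for name, anchor in BASIC_EMOTIONS.items()}
    return min(distances, key=distances.get)

# Example: a kriging prediction lying close to the "Joy" anchor.
print(classify_emotion((0.7, 0.45)))   # -> Joy
```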
Pub. online: 1 Jan 2019  Type: Research Article  Open Access
Journal: Informatica
Volume 30, Issue 4 (2019), pp. 749–780
Abstract
Despite the mass of empirical data in neuroscience and the wealth of interdisciplinary approaches in cognitive science, there are relatively few applicable theories of how the brain functions as a coherent system in terms of energy and entropy processes. Recently, the free energy principle has been portrayed as a possible way towards a unified brain theory. However, its capacity to unify different perspectives on brain function dynamics through free energy and entropy has yet to be established. This multidisciplinary study attempts to make sense of free energy and entropy not only from the perspective of Helmholtz's basic thermodynamic principles but also within the information theory framework. Based on the proposed conceptual framework, we constructed (i) four basic brain states (deep sleep, resting, active wakefulness, and thinking) as dynamic entropy and free energy processes and (ii) a stylized self-organizing mechanism of transitions between these basic brain states over the daily period. Adaptive transitions between brain states represent homeostatic rhythms, which produce complex daily dynamics of brain states. As a result, the proposed simulation model produces different self-organized circadian dynamics of brain states for different chronotypes, which corresponds to empirical observations.
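The toy sketch below is not the paper's model; it only illustrates, under assumed circadian and chronotype parameters, how occupancy of the four basic brain states over a day could be simulated, with a chronotype expressed as a phase shift.

```python
import numpy as np

# Toy illustration (not the paper's model): four brain states sampled over a
# 24-hour period, with state propensities modulated by an assumed circadian drive.
STATES = ["deep sleep", "resting", "active wakefulness", "thinking"]

def simulate_day(chronotype_shift_h=0.0, step_h=0.5, seed=0):
    """Simulate a day of brain-state occupancy; the chronotype shifts the circadian phase."""
    rng = np.random.default_rng(seed)
    hours = np.arange(0.0, 24.0, step_h)
    trajectory = []
    for t in hours:
        # Circadian drive: low at night, high in the afternoon (hypothetical shape).
        drive = 0.5 + 0.5 * np.sin(2 * np.pi * (t - 6.0 - chronotype_shift_h) / 24.0)
        # Propensities: sleep-like states when the drive is low, wakeful when high.
        weights = np.array([(1 - drive) ** 2, (1 - drive) * drive,
                            0.7 * drive, 0.3 * drive])
        weights /= weights.sum()
        state = rng.choice(len(STATES), p=weights)
        trajectory.append((t, STATES[state]))
    return trajectory

# Print the state every 4 hours for a "late" chronotype shifted by 2 hours.
for t, s in simulate_day(chronotype_shift_h=2.0)[::8]:
    print(f"{t:5.1f} h  {s}")
```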
Pub. online: 1 Jan 2019  Type: Research Article  Open Access
Journal: Informatica
Volume 30, Issue 2 (2019), pp. 367–390
Abstract
The multidimensional data model for kriging is developed using fractional Euclidean distance matrices (FEDM). The properties of FEDM are studied by means of the kernel matrix method. It is shown that the factorization of the kernel matrix enables us to create an embedded set that is a nonsingular simplex. Using the properties of FEDM, the Gaussian random field (GRF) is constructed without the positive definite correlation functions usually applied for this purpose. The constructed GRF can be considered a multidimensional analogue of the Wiener process; for instance, its line realizations are Wiener processes. Next, the kriging method is developed based on FEDM. The method is rather simple and depends on parameters that are easily estimated by the maximum likelihood method. Computer simulation of the developed kriging extrapolator has shown that it outperforms the well-known Shepard inverse distance extrapolator. Practical application of the developed approach to surrogate modelling of wastewater treatment is discussed. Theoretical investigation, computer simulation, and a practical example demonstrate that the proposed kriging model, using FEDM, can be efficiently applied to multidimensional data modelling and processing.
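For reference, the baseline against which the FEDM-based kriging is compared, the Shepard inverse distance extrapolator, can be sketched as follows; the power parameter and toy data are illustrative, not taken from the paper.

```python
import numpy as np

def shepard_idw(x_query, X, y, power=2.0, eps=1e-12):
    """Shepard inverse distance weighting extrapolator.

    x_query : (d,) query point
    X       : (n, d) observed points
    y       : (n,) observed values
    """
    d = np.linalg.norm(np.asarray(X, float) - np.asarray(x_query, float), axis=1)
    if np.any(d < eps):                  # exact hit: return the observed value
        return float(y[np.argmin(d)])
    w = 1.0 / d ** power                 # weights decay with distance
    return float(np.dot(w, y) / w.sum())

# Example on a simple 2-D data set sampled from f(x) = x1 + x2.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 2.0])
print(shepard_idw([0.5, 0.5], X, y))     # ~1.0
```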
Journal: Informatica
Volume 26, Issue 4 (2015), pp. 569–591
Abstract
The nonlinear stochastic programming problem involving CVaR in the objective and constraints is considered. Solving this problem in the framework of bi-level stochastic programming, the extended Lagrangian is introduced and the related KKT conditions are derived. Next, a sequential simulation-based approach is developed to solve stochastic problems with CVaR by finite sequences of Monte Carlo samples. The approach is grounded on a rule for iterative regulation of the Monte Carlo sample size and on a stochastic termination procedure that takes the stochastic model risk into account. The rule, which regulates the Monte Carlo sample size inversely proportionally to the square of the stochastic gradient norm, allows us to solve stochastic nonlinear problems in a rational way and ensures convergence. The proposed termination procedure enables us to test the KKT conditions in a statistical way and to evaluate confidence intervals of the objective and constraint functions in a statistical way as well. The results of Monte Carlo simulation with test functions, and the solution of a practical example of trade-offs among gas purchases, storage, and service reliability, illustrate the convergence of the approach as well as its ability to solve, in a rational way and with an admissible accuracy treated in a statistical manner, nonlinear stochastic programming problems handling CVaR in the objective and constraints.
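A minimal sketch of the stated sample-size rule is given below; the constant c and the bounds n_min, n_max are illustrative, not taken from the paper.

```python
import numpy as np

def next_sample_size(grad_estimate, c=1.0, n_min=100, n_max=100_000):
    """Next Monte Carlo sample size, taken inversely proportional to the
    squared norm of the current stochastic gradient estimate."""
    g2 = float(np.dot(grad_estimate, grad_estimate))
    if g2 == 0.0:
        return n_max                      # near a stationary point: sample heavily
    return int(np.clip(c / g2, n_min, n_max))

# Far from the optimum (large gradient) -> small samples; near it -> large samples.
print(next_sample_size(np.array([0.5, -0.3])))    # -> 100 (floor n_min)
print(next_sample_size(np.array([0.01, 0.005])))  # -> 8000
```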
Journal: Informatica
Volume 24, Issue 2 (2013), pp. 253–274
Abstract
The paper deals with the application of the theory of locally homogeneous and isotropic Gaussian fields (LHIGF) to probabilistic modelling of multivariate data structures. An asymptotic model is also studied, in which the correlation function parameter of the Gaussian field tends to infinity. A kriging procedure is developed that yields a simple extrapolator expressed through a matrix of powers of the distances between pairs of measurement points. The resulting model is rather simple and is defined only by the mean and variance parameters, which are efficiently evaluated by the maximum likelihood method. Results of applying the developed extrapolation method to two analytically computed surfaces and to estimating the position of a spacecraft re-entering the atmosphere are given.
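As a sketch of how the mean and variance parameters of such a Gaussian-field model can be evaluated by maximum likelihood, the snippet below uses a generic distance-based correlation matrix as a stand-in; the specific distance-power structure used in the paper is not reproduced here.

```python
import numpy as np

def ml_mean_variance(X, y, length_scale=1.0):
    """Closed-form maximum likelihood estimates of the mean and variance of a
    Gaussian-field model, given a correlation matrix built from pairwise distances.
    A squared-exponential correlation is used here purely as a placeholder."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    n = len(y)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)     # pairwise distances
    K = np.exp(-(d / length_scale) ** 2) + 1e-10 * np.eye(n)       # correlation matrix
    Kinv = np.linalg.inv(K)
    ones = np.ones(n)
    mu = ones @ Kinv @ y / (ones @ Kinv @ ones)                    # ML estimate of the mean
    resid = y - mu
    sigma2 = resid @ Kinv @ resid / n                              # ML estimate of the variance
    return mu, sigma2

X = np.array([[0.0], [0.5], [1.0], [1.5]])
y = np.array([1.0, 1.3, 0.9, 1.1])
print(ml_mean_variance(X, y))
```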
Journal: Informatica
Volume 22, Issue 1 (2011), pp. 1–10
Abstract
Estimation and modelling problems, as they arise in many data analysis areas, often turn out to be unstable and/or intractable by standard numerical methods. Such problems frequently occur when fitting large data sets to a certain model and in predictive learning. Heuristics are general recommendations based on practical statistical evidence, in contrast to a fixed set of rules that cannot vary but is guaranteed to give the correct answer. Although the use of these methods has become more standard in several fields of science, their use for estimation and modelling in statistics appears to be still limited. This paper surveys a set of problem-solving strategies, guided by heuristic information, that are expected to be used more frequently. The use of recent advances in different fields of large-scale data analysis is promoted, focusing on applications in medicine, biology and technology.
Journal: Informatica
Volume 20, Issue 2 (2009), pp. 165–172
Abstract
Recent changes at the intersection of the fields of intelligent systems optimization and statistical learning are surveyed. These changes bring new theoretical and computational challenges to the existing research areas, ranging from web page mining to computer vision, pattern recognition, financial mathematics, bioinformatics and many others.
Journal: Informatica
Volume 18, Issue 4 (2007), pp. 603–614
Abstract
The paper considers the application of stochastic optimization to a system for automatic recognition of the ischemic stroke area in computed tomography (CT) images. The recognition algorithm depends on five inputs that influence the results of automatic detection. The quality of recognition is measured by the size of the intersection of the etalon (reference) image and the image calculated by the automatic detection program. The Simultaneous Perturbation Stochastic Approximation (SPSA) algorithm with the Metropolis rule has been applied to optimize the quality of image recognition. A Monte Carlo simulation experiment was performed in order to evaluate the properties of the developed algorithm.
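A minimal sketch of plain SPSA (without the Metropolis acceptance rule used in the paper) is shown below, applied to a toy five-parameter objective standing in for the noisy recognition-quality criterion; the gain constants are illustrative.

```python
import numpy as np

def spsa_minimize(f, theta0, n_iter=200, a=0.1, c=0.1, seed=0):
    """Plain SPSA: each iteration estimates the gradient from only two noisy
    evaluations, using a random simultaneous perturbation of all parameters."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, n_iter + 1):
        ak = a / k ** 0.602                                  # standard SPSA gain sequences
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=theta.shape)    # Bernoulli +/-1 perturbation
        g_hat = (f(theta + ck * delta) - f(theta - ck * delta)) / (2 * ck * delta)
        theta = theta - ak * g_hat
    return theta

# Toy five-parameter objective standing in for the noisy recognition-quality criterion.
noisy_quadratic = lambda x: np.sum((x - 1.0) ** 2) + 0.01 * np.random.randn()
print(spsa_minimize(noisy_quadratic, np.zeros(5)))           # approaches the optimum at 1
```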
Journal: Informatica
Volume 15, Issue 2 (2004), pp. 271–282
Abstract
We consider a problem of nonlinear stochastic optimization with linear constraints. The method of ε-feasible solutions by series of Monte Carlo estimators has been developed for solving this problem while avoiding "jamming" or "zigzagging". Our approach is distinguished by two peculiarities: the optimality of a solution is tested in a statistical manner, and the Monte Carlo sample size is adjusted so as to decrease the total amount of Monte Carlo trials and, at the same time, to guarantee the estimation of the objective function with an admissible accuracy. Under some general conditions, we prove by the martingale approach that the proposed method converges a.s. to a stationary point of the problem solved. As an example, the maximization of the probability of achieving a desired portfolio return is given.
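One ingredient mentioned above, choosing the Monte Carlo sample size so that the objective function is estimated to an admissible accuracy, can be sketched as follows; the pilot-sample approach and constants are illustrative, not the paper's exact rule.

```python
import numpy as np
from scipy import stats

def required_sample_size(pilot_sample, eps, confidence=0.95, n_max=1_000_000):
    """Monte Carlo sample size needed so that the confidence-interval half-width
    of the objective estimate does not exceed the admissible accuracy eps."""
    sigma = np.std(pilot_sample, ddof=1)          # estimated standard deviation
    z = stats.norm.ppf(0.5 + confidence / 2.0)    # two-sided normal quantile
    n = int(np.ceil((z * sigma / eps) ** 2))      # half-width z*sigma/sqrt(n) <= eps
    return min(max(n, len(pilot_sample)), n_max)

rng = np.random.default_rng(0)
pilot = rng.normal(loc=5.0, scale=2.0, size=200)  # pilot Monte Carlo estimates
print(required_sample_size(pilot, eps=0.05))      # roughly (1.96 * 2 / 0.05)^2 ~ 6150
```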
Journal: Informatica
Volume 11, Issue 4 (2000), pp. 455–468
Abstract
Methods for solving stochastic optimization problems by Monte Carlo simulation are considered. The stopping rule and the accuracy of the solutions are treated in a statistical manner, testing the hypothesis of optimality according to statistical criteria. A rule for adjusting the Monte Carlo sample size is introduced to ensure convergence and to find the solution of the stochastic optimization problem with an acceptable volume of Monte Carlo trials. Examples of application of the developed method to importance sampling and to the Weber location problem are also considered.
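For context, the deterministic Weber location problem mentioned above can be solved by the classical Weiszfeld iteration, sketched below on toy data; the paper treats a stochastic variant via Monte Carlo, which is not reproduced here.

```python
import numpy as np

def weiszfeld(points, weights=None, n_iter=100, eps=1e-9):
    """Weiszfeld iteration: find the point minimizing the weighted sum of
    Euclidean distances to the given demand points (the Weber problem)."""
    P = np.asarray(points, dtype=float)
    w = np.ones(len(P)) if weights is None else np.asarray(weights, dtype=float)
    x = np.average(P, axis=0, weights=w)          # start at the weighted centroid
    for _ in range(n_iter):
        d = np.linalg.norm(P - x, axis=1)
        d = np.maximum(d, eps)                    # avoid division by zero at a data point
        coef = w / d
        x_new = (coef[:, None] * P).sum(axis=0) / coef.sum()
        if np.linalg.norm(x_new - x) < eps:
            break
        x = x_new
    return x

demand_points = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])
print(weiszfeld(demand_points))                   # geometric median of the three points
```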