Journal:Informatica
Volume 20, Issue 2 (2009), pp. 305–320
Abstract
Multi-attribute analysis is a useful tool in many economic, managerial, construction and other problems. The performance measures in the COPRAS (multi-attribute COmplex PRoportional ASsessment of alternatives) method are usually assumed to be accurate. The method assumes a direct and proportional dependence of the significance and utility degree of the investigated alternatives on a system of attributes that adequately describes them, and on the values and weights of those attributes. However, some uncertainty is usually involved in all inputs of a multi-attribute model. The objective of this research is to demonstrate how simulation can be used to reflect fuzzy inputs, which allows a more complete interpretation of model results. A case study of selecting a general contractor on the basis of multiple efficiency attributes with fuzzy inputs, using the COPRAS-G method, demonstrates the concept. The research concludes that the COPRAS-G method is appropriate for this purpose.
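The core COPRAS aggregation can be illustrated in a few lines of code. The sketch below is a minimal illustration, assuming that the interval ("grey") attribute values are first reduced to their midpoints before the standard COPRAS steps; the full COPRAS-G procedure operates on grey numbers directly, and all alternatives, attribute values and weights shown are hypothetical.

# Minimal COPRAS sketch with interval ("grey") inputs reduced to midpoints.
# All alternatives, attributes and weights are hypothetical illustrations.
import numpy as np

# rows: alternatives (candidate contractors), columns: attributes;
# each entry is an interval [lower, upper] describing an uncertain input
X = np.array([
    [[3.0, 4.0], [200, 240], [0.6, 0.8]],
    [[2.5, 3.5], [180, 220], [0.7, 0.9]],
    [[4.0, 5.0], [260, 300], [0.5, 0.7]],
])
weights = np.array([0.5, 0.3, 0.2])          # attribute weights, summing to 1
benefit = np.array([True, False, True])      # True: maximize, False: minimize (cost)

x = X.mean(axis=2)                           # reduce intervals to midpoints (simplification)
d = weights * x / x.sum(axis=0)              # weighted, sum-normalized decision matrix

S_plus = d[:, benefit].sum(axis=1)           # contribution of benefit attributes
S_minus = d[:, ~benefit].sum(axis=1)         # contribution of cost attributes

# relative significance: cost attributes enter inversely, reflecting the
# proportional dependence on attribute values and weights
Q = S_plus + S_minus.min() * S_minus.sum() / (S_minus * (S_minus.min() / S_minus).sum())
utility = 100.0 * Q / Q.max()                # utility degree of each alternative, %

for i, u in enumerate(utility):
    print(f"alternative {i + 1}: utility degree {u:.1f}%")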
Journal:Informatica
Volume 6, Issue 3 (1995), pp. 289–298
Abstract
We compare two alternative ways to use the Bayesian approach in heuristic optimization. The “no-learning” way means that we optimize the randomization parameters for each problem separately. The “learning” way means that we optimize the randomization parameters for some “learning” set of problems. We use those parameters later on for a family of related problems.
We define learning efficiency as the non-uniformity of the optimal parameters over a set of randomly generated problems. We show that for flow-shop problems this non-uniformity is significant, which means that Bayesian learning is efficient in those problems.
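To make the notion of non-uniformity concrete, the following sketch estimates, for a family of randomly generated flow-shop instances, the best value of a single randomization parameter of a simple randomized scheduling heuristic, and then reports how much those per-problem optima vary. The heuristic, the parameter grid and the instance sizes are illustrative assumptions, not the procedure used in the paper.

# Sketch: per-problem optimal randomization parameter for random flow-shop
# instances, and the spread (non-uniformity) of those optima.
import numpy as np

def makespan(p, order):
    """Permutation flow-shop makespan for processing times p[job, machine]."""
    c = np.zeros(p.shape[1])              # completion time on each machine
    for j in order:
        prev = 0.0
        for m in range(p.shape[1]):
            c[m] = max(c[m], prev) + p[j, m]
            prev = c[m]
    return c[-1]

def random_schedule(p, x, rng):
    """Sample a job order with probability ~ (total processing time) ** x."""
    h = p.sum(axis=1).astype(float)       # heuristic: prefer long jobs first
    jobs, order = list(range(p.shape[0])), []
    while jobs:
        w = h[jobs] ** x
        j = int(rng.choice(jobs, p=w / w.sum()))
        order.append(j)
        jobs.remove(j)
    return order

def best_parameter(p, grid, repeats, rng):
    """Grid search for the best randomization parameter on one instance."""
    scores = [min(makespan(p, random_schedule(p, x, rng)) for _ in range(repeats))
              for x in grid]
    return float(grid[int(np.argmin(scores))])

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 4.0, 9)
optima = [best_parameter(rng.integers(1, 100, size=(10, 5)), grid, 30, rng)
          for _ in range(20)]              # 20 randomly generated instances
print("per-problem optimal parameters:", optima)
print("spread (non-uniformity):", float(np.std(optima)))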
Journal:Informatica
Volume 6, Issue 3 (1995), pp. 277–288
Abstract
Two-dimensional signals of physical phenomena may be inadvertently altered before recording by passing through a system whose bandwidth is smaller than that of the signal. It is often desired to restore such data later by removing the effects of the linear system. This restoration may be accomplished by synthesizing two-dimensional (2-D) inverse filters on a computer. Approximations are necessary to ensure the stability of the inverse filter.
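One common way to approximate a stable inverse filter is to invert the degrading system in the frequency domain while damping the frequencies where its response is nearly zero. The sketch below shows such a regularized 2-D inversion on a synthetic example; the Gaussian blur, the image size and the regularization constant are hypothetical, and the paper's synthesis procedure may differ.

# Sketch of a stabilized 2-D inverse filter in the frequency domain.
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered 2-D Gaussian point-spread function of the band-limited system."""
    y, x = np.indices(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    g = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()

def restore(blurred, psf, eps=1e-2):
    """Approximate inverse filter: F_hat = G * conj(H) / (|H|^2 + eps)."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.fft.fft2(blurred)
    F_hat = G * np.conj(H) / (np.abs(H) ** 2 + eps)   # damped where |H| is small
    return np.real(np.fft.ifft2(F_hat))

# Hypothetical smooth 2-D signal, degraded by the band-limited system.
yy, xx = np.indices((64, 64))
truth = np.sin(2 * np.pi * xx / 32.0) + np.cos(2 * np.pi * yy / 16.0)
psf = gaussian_psf(truth.shape, sigma=2.0)
H = np.fft.fft2(np.fft.ifftshift(psf))
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * H))
restored = restore(blurred, psf)
print("mean absolute error, blurred :", float(np.abs(blurred - truth).mean()))
print("mean absolute error, restored:", float(np.abs(restored - truth).mean()))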
Journal:Informatica
Volume 6, Issue 2 (1995), pp. 193–224
Abstract
We apply some concepts of Information-Based Complexity (IBC) to global and discrete optimization. We assume that only partial information on the objective is available. We gather this partial information by observations. We use the traditional IBC definitions and notions while defining formal aspects of the problem. We use the Bayesian framework to consider less formal aspects, such as expert knowledge and heuristics.
We extend the traditional Bayesian Approach (BA) by including heuristics. We call that a Bayesian Heuristic Approach (BHA).
We discuss how to overcome the computational difficulties using parallel computing. We illustrate the theoretical concepts with three examples: the discrete problems of flow-shop scheduling and parameter grouping, and a continuous problem of batch operations scheduling.
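Because the repeated randomized decisions of the BHA are mutually independent, the repetitions parallelize naturally; the sketch below simply distributes them over worker processes and keeps the best outcome. The toy objective and the number of workers are hypothetical, and the paper's actual parallel scheme may differ.

# Sketch: independent randomized repetitions distributed over processes.
import numpy as np
from multiprocessing import Pool

def one_repetition(seed):
    """One independent randomized trial; returns (objective value, decision)."""
    rng = np.random.default_rng(seed)
    decision = rng.permutation(20)                    # e.g. a random job order
    value = float(np.abs(np.diff(decision)).sum())    # toy objective to minimize
    return value, decision.tolist()

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(one_repetition, range(100))   # 100 repetitions in parallel
    best_value, best_decision = min(results)
    print("best objective over all repetitions:", best_value)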
Journal:Informatica
Volume 5, Issues 1-2 (1994), pp. 123–166
Abstract
Here we regard the average deviation as the most important objective when designing numerical techniques and algorithms. We call this a Bayesian approach.
We start by describing the Bayesian approach to continuous global optimization. Then we show how to apply the results to the adaptation of parameters of randomized optimization techniques. We assume that there exists a simple function which roughly predicts the consequences of decisions; we call it a heuristic. We define the probability of a decision by a randomized decision function that depends on the heuristic. We fix this decision function, except for some parameters that we call the decision parameters.
We repeat the randomized decision procedure several times for given decision parameters and regard the best outcome as the result. We optimize the decision parameters to make the search more efficient. Thus we replace the original optimization problem by an auxiliary problem of continuous stochastic optimization. We solve the auxiliary problem by Bayesian methods of global optimization; hence we call the approach Bayesian.
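The following sketch illustrates this loop on a toy knapsack problem: a heuristic roughly predicts the worth of each decision, a randomized decision function picks items with probability depending on the heuristic raised to a decision parameter, the procedure is repeated with the best outcome kept, and the decision parameter is then tuned on the resulting auxiliary continuous problem. The heuristic, the decision rule and the grid search standing in for a Bayesian global optimizer are illustrative assumptions.

# Sketch of the decision-parameter loop on a hypothetical knapsack problem.
import numpy as np

rng = np.random.default_rng(1)
values = rng.integers(1, 100, size=30).astype(float)
weights = rng.integers(1, 100, size=30).astype(float)
capacity = 0.4 * float(weights.sum())
h = values / weights                       # heuristic: predicted worth of picking an item

def randomized_solution(x):
    """Build one solution; items are considered with probability ~ h ** x."""
    remaining, load, total = list(range(len(values))), 0.0, 0.0
    while remaining:
        w = h[remaining] ** x
        i = int(rng.choice(remaining, p=w / w.sum()))
        remaining.remove(i)
        if load + weights[i] <= capacity:
            load += weights[i]
            total += values[i]
    return total

def repeated_best(x, repeats=50):
    """Repeat the randomized procedure and keep the best outcome."""
    return max(randomized_solution(x) for _ in range(repeats))

# Auxiliary continuous problem: tune the decision parameter x so that the
# repeated search is most efficient (a grid search stands in here for the
# Bayesian global optimization method).
grid = np.linspace(0.0, 8.0, 17)
best_x = max(grid, key=repeated_best)
print("best decision parameter:", float(best_x))
print("objective with that parameter:", repeated_best(best_x))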
We discuss the advantages and disadvantages of the Bayesian approach. We describe applications to some discrete programming problems, such as the optimization of mixed Boolean bilinear functions, including the scheduling of batch operations, and the optimization of neural networks.