Journal: Informatica
Volume 24, Issue 3 (2013), pp. 413–433
Abstract
Relational mathematics, as it has been studied for some time in fields such as mathematical economics and social choice theory, provides a rich and general framework and appears to be a natural and direct way to paraphrase optimization goals, to represent user preferences, to justify fairness criteria, to cope with QoS, or to evaluate utility. Here, we focus on the specific application aspects of formal relations in network design and control problems and present the general concept of relational optimization. In relational optimization, we represent the optimization problem by a formal relation, and the solution by the set of maximal (or non-dominated) elements of this relation. This is a natural extension of standard optimization and covers other notions of optimality as well. Along with this, we provide a set of fairness relations that can serve as maximizing relations in relational optimization according to various application needs, and we specify a meta-heuristic approach, derived from evolutionary multi-objective optimization algorithms, to approximate their maximum sets.
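The core idea can be illustrated with a minimal sketch (not from the paper; the Pareto dominance relation and the sample points below are illustrative assumptions): the "solution" of a relational optimization problem is simply the set of elements that no other element strictly dominates.

```python
def maximal_elements(items, strictly_better):
    """Set of maximal (non-dominated) elements: x is kept iff
    no y in the set is strictly better than x under the relation."""
    return [x for x in items
            if not any(strictly_better(y, x) for y in items)]

def pareto_dominates(a, b):
    # a strictly dominates b: at least as good in every objective
    # and strictly better in at least one (here: larger is better)
    return all(ai >= bi for ai, bi in zip(a, b)) and a != b

points = [(3, 1), (2, 2), (1, 3), (1, 1), (2, 1)]
front = maximal_elements(points, pareto_dominates)  # non-dominated set
```

Swapping `pareto_dominates` for a fairness relation changes the notion of optimality without changing the solver, which is precisely the appeal of the relational view.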
Journal: Informatica
Volume 24, Issue 3 (2013), pp. 395–411
Abstract
In this paper, the nonlinear neural network FitzHugh–Nagumo model with an expansion by the excited neuronal kernel function is investigated. The mean-field approximation of neuronal potentials and recovery currents inside neuron ensembles is used. A biologically more realistic nonlinear sodium ionic current–voltage characteristic and kernel functions are applied. The possibility of representing the nonlinear integro-differential equations with kernel functions, under the Fourier transformation, by partial differential equations allows us to overcome analytical and numerical modeling difficulties. The equivalence of the two kinds of solutions was confirmed based on error analysis. The approach of the equivalent partial differential equations was successfully employed to solve the system with heterogeneous synaptic functions, as well as the FitzHugh–Nagumo nonlinear time-delayed differential equations in the case of Hopf bifurcation and stability of stationary states. The analytical studies are corroborated by numerous numerical modeling experiments.
The digital simulation at transient and steady-state conditions was carried out using the finite difference technique. The comparison of the simulation results revealed that some of the calculated parameters, i.e., the response and sensitivity, are the same, while others, i.e., the half-time of the steady state, differ significantly between the models.
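For readers unfamiliar with the underlying model, a minimal sketch of the classical single-neuron FitzHugh–Nagumo equations integrated by an explicit finite-difference (Euler) step is shown below; the parameter values, initial state, and stimulus current `I` are standard textbook choices rather than those of the paper, and the kernel/mean-field terms are omitted.

```python
def simulate_fhn(I=0.5, a=0.7, b=0.8, eps=0.08, dt=0.01, steps=20000):
    """Explicit Euler integration of the classical FitzHugh-Nagumo
    system: dv/dt = v - v^3/3 - w + I, dw/dt = eps*(v + a - b*w)."""
    v, w = -1.0, -0.5          # initial membrane potential and recovery
    trace = []
    for _ in range(steps):
        dv = v - v ** 3 / 3 - w + I
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return trace

vs = simulate_fhn()  # with I=0.5 this regime produces repetitive spiking
```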
Journal: Informatica
Volume 24, Issue 3 (2013), pp. 381–394
Abstract
As users increasingly work with large multimedia data, more people turn to cloud computing technology. It is necessary to manage large data efficiently and to consider transmission efficiency for multimedia data of varying quality. To this end, it is important to ensure the efficient distribution of the key resources (CPU, network, and storage) that constitute cloud computing, and flexible allocation algorithms are required. This study proposes a scheme that applies MapReduce to the FP-Growth algorithm, one of the data mining methods, on the Hadoop platform at the IaaS (Infrastructure as a Service) layer comprising CPU, networking, and storage, and then uses this scheme to allocate resources.
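As a rough illustration of the MapReduce style of computation involved, the sketch below is a toy, single-process stand-in for Hadoop (the transaction data and `min_support` threshold are invented): it implements the first pass of a parallel FP-Growth pipeline, counting item supports via separate map and reduce phases.

```python
from collections import Counter

def map_phase(transactions):
    # mapper: emit an (item, 1) pair for every item occurrence
    return [(item, 1) for t in transactions for item in t]

def reduce_phase(pairs, min_support):
    # reducer: sum the counts per item and keep only frequent items
    counts = Counter()
    for item, n in pairs:
        counts[item] += n
    return {item: c for item, c in counts.items() if c >= min_support}

transactions = [["a", "b", "c"], ["a", "c"], ["a", "d"], ["b", "c"]]
frequent = reduce_phase(map_phase(transactions), min_support=2)
```

In a real Hadoop deployment the mapper output would be partitioned across nodes and the reducers run in parallel; only the per-key grouping discipline is shown here.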
Journal: Informatica
Volume 24, Issue 3 (2013), pp. 357–380
Abstract
This study proposes a model for supporting the decision-making process for the cloud policy governing the deployment of virtual machines in cloud environments. We explore two configurations: the static case, in which virtual machines are generated according to the cloud orchestration, and the dynamic case, in which virtual machines are reactively adapted to job submissions, using migration, to optimize performance time metrics. We integrate both solutions in the same simulator to measure the performance of various combinations of virtual machines, jobs, and hosts in terms of the average execution time and total simulation time. We conclude that the dynamic configuration is advantageous, as it offers optimized job execution performance.
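The contrast between the two configurations can be conveyed with a toy scheduler (purely illustrative; the job durations and the greedy least-loaded policy are assumptions, not the paper's simulator): a static placement fixed in advance versus a dynamic policy that reacts to each job submission.

```python
def static_assign(jobs, n_vms):
    # static policy: placement fixed in advance (round-robin)
    loads = [0.0] * n_vms
    for i, duration in enumerate(jobs):
        loads[i % n_vms] += duration
    return max(loads)  # makespan: finish time of the busiest VM

def dynamic_assign(jobs, n_vms):
    # dynamic policy: react to each submission, pick least-loaded VM
    loads = [0.0] * n_vms
    for duration in jobs:
        loads[loads.index(min(loads))] += duration
    return max(loads)

jobs = [5, 1, 1, 1, 5, 1, 1, 1]             # job durations
static_makespan = static_assign(jobs, 2)    # 12: the two long jobs collide
dynamic_makespan = dynamic_assign(jobs, 2)  # 8: load stays balanced
```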
Journal: Informatica
Volume 24, Issue 3 (2013), pp. 339–356
Abstract
Generating sequences of random numbers or bits is a necessity in many situations (cryptography, modeling, simulations, etc.). These sequences must be random in the sense that their behavior is unpredictable. For example, the security of many cryptographic systems depends on the generation of unpredictable values to be used as keys. Since randomness is related to unpredictability, it can be described in probabilistic terms, and the randomness of a sequence can be studied by means of a hypothesis test. A new statistical test for the randomness of bit sequences is proposed in this paper. The test focuses on determining the number of distinct fixed-length patterns that appear along the binary sequence. When 'few' distinct patterns appear in the sequence, the hypothesis of randomness is rejected; on the contrary, when 'many' distinct patterns appear, the hypothesis of randomness is accepted.
The proposed test can be used to complement other statistical tests included in suites for studying randomness. The exact distribution of the test statistic is derived and, therefore, the test can be applied to both short and long bit sequences. Simulation results showed the efficiency of the test in detecting deviations from randomness that other statistical tests are not able to detect. The test was also applied to binary sequences obtained from several pseudorandom number generators, producing results consistent with randomness. The proposed test is distinguished by fast computation once the critical values have been calculated in advance.
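The test statistic itself is easy to sketch (the window length `m` and the example sequences below are illustrative; the paper's critical values come from the exact distribution of the statistic, which is not reproduced here):

```python
import random

def count_distinct_patterns(bits, m):
    """Number of distinct length-m patterns among the overlapping
    windows of the binary string `bits` (at most 2**m of them)."""
    return len({bits[i:i + m] for i in range(len(bits) - m + 1)})

# a degenerate sequence exhibits a single pattern ...
low = count_distinct_patterns("0" * 64, m=4)
# ... while a pseudorandom one should show nearly all 2**4 = 16
rng = random.Random(0)
bits = "".join(rng.choice("01") for _ in range(1024))
high = count_distinct_patterns(bits, m=4)
```

A 'low' count leads to rejecting the hypothesis of randomness; how low is too low for a given sequence length is exactly what the derived exact distribution decides.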
Journal: Informatica
Volume 24, Issue 2 (2013), pp. 315–337
Abstract
We consider a generalization of heterogeneous meta-programs by (1) introducing an extra level of abstraction within the meta-program structure, and (2) meta-program transformations. We define basic terms, formalize transformation tasks, and consider properties of meta-program transformations and rules to manage complexity through the following transformation processes: (1) reverse transformation, when a correct one-stage meta-program M1 is transformed into the equivalent two-stage meta-meta-program M2; (2) two-stage forward transformation, when M2 is transformed into a set of meta-programs, and each meta-program is transformed into a set of target programs. The results are as follows: (a) formalization of the transformation processes within the heterogeneous meta-programming paradigm; (b) introduction and validation of equivalent transformations of meta-programs into meta-meta-programs and vice versa; (c) introduction of metrics to evaluate the complexity of meta-specifications. The results are supported by examples, theoretical reasoning, and experiments.
Journal: Informatica
Volume 24, Issue 2 (2013), pp. 291–313
Abstract
This study proposes a novel Improved Hybrid PSO-GA (IHPG) algorithm that combines the advantages of the PSO and GA algorithms. The IHPG algorithm uses the velocity and position update rules of the PSO algorithm together with the selection, crossover, and mutation operators of the GA algorithm. The study explores a quality-monitoring experiment using three existing neural network approaches to data fusion in wireless sensor module measurements. Ten sensors are deployed in a sensing area, and digital conversion and weight adjustment of the collected data need to be performed. The experimental results show that adjusting and optimizing the smoothing parameter can improve the accuracy of the estimated data and reduce the randomness of the computation. According to the experimental analysis, IHPG outperforms the single PSO and GA algorithms across the various neural network learning models compared.
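A minimal sketch of such a hybrid is given below (illustrative only: the sphere objective, population size, and coefficient values are assumptions, and the paper's sensor data-fusion setting is not modelled). It combines PSO velocity/position updates with GA-style crossover and mutation applied to the same population.

```python
import random

def ihpg_sketch(f, dim=3, pop=20, iters=200, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    V = [[0.0] * dim for _ in range(pop)]
    pbest = [x[:] for x in X]
    gbest = min(X, key=f)[:]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration weights
    for _ in range(iters):
        for i in range(pop):
            # PSO part: velocity and position update rules
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            if f(X[i]) < f(pbest[i]):
                pbest[i] = X[i][:]
                if f(X[i]) < f(gbest):
                    gbest = X[i][:]
        # GA part: arithmetic crossover with a random mate plus Gaussian
        # mutation; the child replaces the particle only if it is better
        for i in range(pop):
            mate = X[rng.randrange(pop)]
            child = [(a + b) / 2 + rng.gauss(0, 0.1)
                     for a, b in zip(X[i], mate)]
            if f(child) < f(X[i]):
                X[i] = child
    return gbest

sphere = lambda x: sum(v * v for v in x)   # toy minimization objective
best = ihpg_sketch(sphere)
```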
Journal: Informatica
Volume 24, Issue 2 (2013), pp. 275–290
Abstract
Based on an example, we describe how the outcomes of a computational experiment can be employed to study the stability of a numerical algorithm, provided that the related theoretical propositions have not yet been proven. More precisely, we propose a systematic and generalized methodology for investigating the influence of the weight functions α(x) and β(x), present in the integral boundary conditions, on the stability of difference schemes for a class of parabolic equations. The core of the methodology is the investigation of the spectrum of the matrix defining the transition to the upper layer of the difference scheme. The spectral structure of this matrix is analysed both by analytic methods and by computational experiment.
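The general idea, judging stability by the spectrum of the transition matrix, can be sketched for the simplest explicit scheme for the heat equation (with ordinary Dirichlet rather than integral boundary conditions, so the weights α, β play no role here; the matrix size and mesh ratio `r` are illustrative):

```python
def transition_matrix(n, r):
    """Matrix S of the explicit scheme u_new = S u for u_t = u_xx with
    Dirichlet boundary conditions; r = tau/h^2 is the mesh ratio."""
    S = [[0.0] * n for _ in range(n)]
    for i in range(n):
        S[i][i] = 1.0 - 2.0 * r
        if i > 0:
            S[i][i - 1] = r
        if i < n - 1:
            S[i][i + 1] = r
    return S

def spectral_radius(S, iters=2000):
    # power-iteration estimate of the largest |eigenvalue|
    n = len(S)
    x = [1.0 + 0.01 * i for i in range(n)]   # generic starting vector
    lam = 1.0
    for _ in range(iters):
        y = [sum(S[i][j] * x[j] for j in range(n)) for i in range(n)]
        lam = max(abs(v) for v in y)
        x = [v / lam for v in y]
    return lam

rho_ok = spectral_radius(transition_matrix(20, 0.4))   # r <= 1/2: stable
rho_bad = spectral_radius(transition_matrix(20, 0.6))  # r > 1/2: unstable
```

A spectral radius below one means perturbations decay from layer to layer; the computational experiment recovers the classical stability bound r ≤ 1/2 without invoking the theory.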
Journal: Informatica
Volume 24, Issue 2 (2013), pp. 253–274
Abstract
The paper deals with the application of the theory of locally homogeneous and isotropic Gaussian fields (LHIGF) to the probabilistic modelling of multivariate data structures. An asymptotic model is also studied, in which the correlation function parameter of the Gaussian field tends to infinity. A kriging procedure is developed that provides a simple extrapolator by means of a matrix of powers of the distances between pairs of measurement points. The resulting model is rather simple and can be defined by only the mean and variance parameters, efficiently evaluated by the maximum likelihood method. The results of applying the extrapolation method to two analytically computed surfaces and to estimating the position of a spacecraft re-entering the atmosphere are given.
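A minimal sketch of a kriging-type predictor is shown below. It is not the paper's method: a Gaussian covariance with an arbitrary length scale stands in for the distance-power matrix of the LHIGF extrapolator, and the 1-D data are invented; only the general structure (covariance matrix, weight solve, weighted combination of observations) carries over.

```python
import math

def solve(A, b):
    # Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

def kriging_predict(xs, ys, x_new, length=1.0, mean=0.0):
    """Simple-kriging prediction at x_new from 1-D observations."""
    cov = lambda a, b: math.exp(-((a - b) ** 2) / (2 * length ** 2))
    n = len(xs)
    K = [[cov(xs[i], xs[j]) + (1e-9 if i == j else 0.0)  # jitter
          for j in range(n)] for i in range(n)]
    w = solve(K, [cov(x_new, xs[i]) for i in range(n)])  # kriging weights
    return mean + sum(w[i] * (ys[i] - mean) for i in range(n))
```

At an observed point the weights collapse to a unit vector and the predictor reproduces the measurement, which is the defining interpolation property of kriging.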
Journal: Informatica
Volume 24, Issue 2 (2013), pp. 231–251
Abstract
This paper presents a new approach to business and information systems (IS) alignment, consisting of a framework, a metamodel, a process, and tools for implementing it in practice. The purpose of the approach is to fill the gap between existing conceptual business and IS alignment frameworks and empirical business and IS alignment methods. The suggested approach is based on SOA, GRAAL, and enterprise modeling techniques such as TOGAF, DoDAF, and UPDM. The proposed approach is applied to four real-world projects. Both the application results and a small example are provided to validate the suitability of the approach.