Journal:Informatica
Volume 24, Issue 3 (2013), pp. 461–484
Abstract
In this paper we describe the bus routing problem (BRP), in which the goal is to find a route from the start stop to the final stop that minimizes the time of travel, the cost of travel and the length of the route. Additionally, the time at which travel begins at the start stop is given. An analysis of the problem is presented; in particular, we point out properties of routes. The BRP is an example of a multicriteria optimization problem (MOP), whose solution is a set of non-dominated solutions. This paper proposes a label correcting algorithm with storing of partial solutions for solving the BRP. The algorithm makes it possible to find all routes that belong to the set of non-dominated solutions. In addition, the results of experimental tests are presented. These results are compared with results for the variant of the BRP in which the goal is to minimize only the time and the cost of travel, with the length of the route not taken into consideration.
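As a rough illustration of the kind of multicriteria label correcting search described above, the sketch below keeps, for every stop, only labels (arrival time, cost, length) that are not Pareto-dominated. The graph model, the attribute names and the label representation are assumptions made for illustration, not the paper's exact formulation.

from collections import deque

def dominates(a, b):
    """True if label a is at least as good as b in every criterion and differs from b."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def label_correcting(graph, start, start_time):
    # graph[u] = list of (v, travel_time, fare, distance) edges (illustrative model)
    labels = {start: {(start_time, 0.0, 0.0)}}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for (v, t, c, d) in graph.get(u, []):
            candidates = {(lt + t, lc + c, ld + d) for (lt, lc, ld) in labels[u]}
            bucket = labels.setdefault(v, set())
            changed = False
            for lab in candidates:
                if any(dominates(old, lab) for old in bucket):
                    continue  # lab is dominated by an existing label at v
                bucket.difference_update({old for old in bucket if dominates(lab, old)})
                bucket.add(lab)
                changed = True
            if changed:
                queue.append(v)
    return labels  # labels[final_stop] approximates the set of non-dominated routes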
Journal:Informatica
Volume 24, Issue 3 (2013), pp. 447–460
Abstract
The paper is devoted to goodness-of-fit tests based on probability density estimates generated by kernel functions. The test statistic is the maximum of the normalized deviation of the estimate from its expected value or from a hypothesized distribution density function. A comparative Monte Carlo power study of the investigated criterion is provided. Simulation results show that the proposed test is a powerful competitor to the existing classical criteria for testing goodness of fit against a specific type of alternative hypothesis. An analytical way of establishing the asymptotic distribution of the test statistic is proposed, using the theory of high excursions of close-to-Gaussian random processes and fields introduced by Rudzkis (1992, 2012).
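The sketch below shows the general shape of such a statistic: the maximum, over a grid, of the normalized deviation of a Gaussian kernel density estimate from a hypothesized density f0. The bandwidth choice, the normalization and the function names are illustrative assumptions; the paper's exact construction and its critical values are not reproduced here.

import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def kde(sample, grid, h):
    # Gaussian kernel density estimate evaluated at the grid points
    return gaussian_kernel((grid[:, None] - sample[None, :]) / h).mean(axis=1) / h

def max_deviation_statistic(sample, f0_pdf, grid, h):
    sample = np.asarray(sample, dtype=float)
    grid = np.asarray(grid, dtype=float)
    n = len(sample)
    f_hat = kde(sample, grid, h)
    f0 = f0_pdf(grid)
    # pointwise standard deviation of the KDE under H0 is roughly
    # sqrt(f0(x) * R(K) / (n h)), with R(K) = 1 / (2 sqrt(pi)) for the Gaussian kernel
    sigma = np.sqrt(f0 * (1.0 / (2.0 * np.sqrt(np.pi))) / (n * h))
    return np.max(np.abs(f_hat - f0) / np.maximum(sigma, 1e-12))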
Journal:Informatica
Volume 24, Issue 3 (2013), pp. 435–446
Abstract
The performance of an automatic speech recognition system depends heavily on the feature set used. The quality of speech recognition features is usually estimated via classification error, but this requires recognition experiments involving both front-end and back-end implementations. We propose a method for estimating feature quality that does not require recognition experiments and thus accelerates the development of automatic speech recognition systems. The key component of our method is the use of metrics computed directly on the front-end features. The experimental results show that our method is suitable for recognition systems whose back-end is a Euclidean-space classifier.
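One illustrative way to score a feature set directly on front-end output, without running a recognizer, is a scatter-based class separability ratio, which is meaningful when the back-end is a Euclidean-space classifier. This particular metric is an assumption chosen for illustration; the abstract does not state which metrics the paper uses.

import numpy as np

def separability_score(features, labels):
    """features: (N, D) array of front-end features; labels: length-N array of class ids."""
    features = np.asarray(features, dtype=float)
    labels = np.asarray(labels)
    overall_mean = features.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(labels):
        cls = features[labels == c]
        diff = cls.mean(axis=0) - overall_mean
        between += len(cls) * (diff @ diff)          # between-class scatter
        within += ((cls - cls.mean(axis=0)) ** 2).sum()  # within-class scatter
    return between / within   # larger = classes are more separable in Euclidean space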
Journal:Informatica
Volume 24, Issue 3 (2013), pp. 413–433
Abstract
Relational mathematics, as it has been studied for some time in fields like mathematical economics and social choice theory, provides a rich and general framework and appears to be a natural and direct way to paraphrase optimization goals, to represent user preferences, to justify fairness criteria, to cope with QoS, or to evaluate utility. Here, we focus on the specific application aspects of formal relations in network design and control problems and provide the general concept of relational optimization. In relational optimization, the optimization problem is represented by a formal relation, and the solution is the set of maximal (or non-dominated) elements of this relation. This appears to be a natural extension of standard optimization and covers other notions of optimality as well. Along with this, we provide a set of fairness relations that can serve as maximizing relations in relational optimization according to various application needs, and we specify a meta-heuristic approach, derived from evolutionary multi-objective optimization algorithms, to approximate their maximum sets.
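The core notion above can be sketched very compactly: given a finite set of candidates and a binary relation R (x R y meaning "x is at least as good as y"), the relational optimum is the set of maximal elements, i.e. candidates that no other candidate strictly dominates. The example relation used below is plain Pareto dominance on vectors, chosen for illustration; it is not one of the paper's fairness relations.

def maximal_set(candidates, relation):
    """Return the elements x such that no y strictly dominates x (y R x but not x R y)."""
    return [x for x in candidates
            if not any(relation(y, x) and not relation(x, y)
                       for y in candidates if y is not x)]

def pareto_at_least(a, b):
    # illustrative relation: a is component-wise at least as good as b (maximization)
    return all(ai >= bi for ai, bi in zip(a, b))

# maximal_set([(1, 3), (2, 2), (0, 1)], pareto_at_least) -> [(1, 3), (2, 2)]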
Journal:Informatica
Volume 24, Issue 3 (2013), pp. 395–411
Abstract
In this paper, the nonlinear neural network FitzHugh–Nagumo model with an expansion by the excited neuronal kernel function is investigated. The mean field approximation of neuronal potentials and recovery currents inside neuron ensembles was used. A biologically more realistic nonlinear sodium ionic current–voltage characteristic and kernel functions were applied. The possibility of representing the nonlinear integro-differential equations with kernel functions, under the Fourier transformation, by partial differential equations allows us to overcome analytical and numerical modeling difficulties. The equivalence of the two kinds of solutions was confirmed based on an error analysis. The approach of the equivalent partial differential equations was successfully employed to solve the system with heterogeneous synaptic functions, as well as the FitzHugh–Nagumo nonlinear time-delayed differential equations in the case of the Hopf bifurcation and the stability of stationary states. The analytical studies are corroborated by many numerical modeling experiments.
The digital simulation under transient and steady-state conditions was carried out using a finite difference technique. The comparison of the simulation results revealed that some of the calculated parameters, i.e. the response and the sensitivity, are the same, while others, i.e. the half-time of the steady state, differ significantly between the distinct models.
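For orientation, a commonly quoted kernel-coupled form of the FitzHugh–Nagumo system is shown below. The specific kernel functions, the nonlinear sodium current–voltage characteristic and the mean-field closure investigated in the paper refine this basic template, so these equations are illustrative rather than the paper's exact model.

\begin{aligned}
\frac{\partial v(x,t)}{\partial t} &= v - \frac{v^{3}}{3} - w + \int K(x - x')\, v(x',t)\, dx' + I_{\mathrm{ext}},\\
\frac{\partial w(x,t)}{\partial t} &= \varepsilon \bigl( v + a - b\, w \bigr),
\end{aligned}

where $v$ is the neuronal potential, $w$ the recovery current, $K$ the synaptic kernel function, and $\varepsilon$, $a$, $b$, $I_{\mathrm{ext}}$ are model constants.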
Journal:Informatica
Volume 24, Issue 3 (2013), pp. 381–394
Abstract
As users increasingly work with large multimedia data, more people use cloud computing technology. It is necessary to manage large data efficiently and to consider transmission efficiency for multimedia data of different quality. To this end, it is important to ensure efficient distribution of the key resources (CPU, network and storage) that constitute cloud computing, and various allocation algorithms are required for this purpose. This study proposes a method of designing a scheme that applies MapReduce to the FP-Growth algorithm, one of the data mining methods, on the Hadoop platform at the IaaS (Infrastructure as a Service) level, which includes CPU, networking and storage. Resources are then allocated according to this scheme.
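A Hadoop-independent sketch of the first MapReduce pass of a parallel FP-Growth job is shown below: counting item frequencies across transactions before the frequent-pattern trees are built and mined in parallel. The function names and the in-memory "shuffle" are illustrative assumptions; on the Hadoop platform these would be mapper/reducer classes over HDFS input splits, and the paper's resource allocation scheme is not reproduced here.

from collections import defaultdict

def map_phase(transaction):
    # emit (item, 1) for every item in one input record (a transaction)
    return [(item, 1) for item in transaction]

def reduce_phase(item, counts):
    return item, sum(counts)

def frequency_count(transactions, min_support):
    shuffled = defaultdict(list)
    for t in transactions:                      # map + shuffle
        for item, one in map_phase(t):
            shuffled[item].append(one)
    counts = dict(reduce_phase(i, c) for i, c in shuffled.items())
    # keep only frequent items; later passes build group-dependent FP-trees
    return {i: n for i, n in counts.items() if n >= min_support}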
Journal:Informatica
Volume 24, Issue 3 (2013), pp. 357–380
Abstract
This study proposes a model for supporting the decision-making process regarding the cloud policy for the deployment of virtual machines in cloud environments. We explore two configurations: the static case, in which virtual machines are generated according to the cloud orchestration, and the dynamic case, in which virtual machines are reactively adapted to the job submissions, using migration, in order to optimize performance time metrics. We integrate both solutions in the same simulator for measuring the performance of various combinations of virtual machines, jobs and hosts in terms of the average execution time and the total simulation time. We conclude that the dynamic configuration is advantageous, as it offers optimized job execution performance.
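The toy sketch below contrasts the two configurations in the simplest possible terms: a static policy that fixes the job-to-VM mapping up front versus a dynamic policy that always places the next job on the currently least-loaded VM, standing in for reactive adaptation and migration. The workload model and the makespan metric are illustrative assumptions, not the paper's simulator.

def static_policy(jobs, n_vms):
    finish = [0.0] * n_vms
    for i, length in enumerate(jobs):
        vm = i % n_vms                      # round-robin mapping decided in advance
        finish[vm] += length
    return finish

def dynamic_policy(jobs, n_vms):
    finish = [0.0] * n_vms
    for length in jobs:
        vm = finish.index(min(finish))      # react to the current load
        finish[vm] += length
    return finish

jobs = [5, 1, 1, 1, 4, 2, 7, 1]
print("static makespan :", max(static_policy(jobs, 3)))   # 13.0
print("dynamic makespan:", max(dynamic_policy(jobs, 3)))  # 11.0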
Journal:Informatica
Volume 24, Issue 3 (2013), pp. 339–356
Abstract
Generating sequences of random numbers or bits is a necessity in many situations (cryptography, modeling, simulations, etc.). Those sequences must be random in the sense that their behavior should be unpredictable. For example, the security of many cryptographic systems depends on the generation of unpredictable values to be used as keys. Since randomness is related to unpredictability, it can be described in probabilistic terms, studying the randomness of a sequence by means of a hypothesis test. A new statistical test for the randomness of bit sequences is proposed in this paper. The test focuses on determining the number of different fixed-length patterns that appear along the binary sequence. When 'few' distinct patterns appear in the sequence, the hypothesis of randomness is rejected. On the contrary, when 'many' different patterns appear in the sequence, the hypothesis of randomness is accepted.
The proposed test can be used as a complement to other statistical tests included in suites for studying randomness. The exact distribution of the test statistic is derived and, therefore, the test can be applied to both short and long sequences of bits. Simulation results showed the efficiency of the test in detecting deviations from randomness that other statistical tests are not able to detect. The test was also applied to binary sequences obtained from some pseudorandom number generators, providing results consistent with randomness. The proposed test is distinguished by fast computation when the critical values have been calculated in advance.
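The counting step behind such a test is simple to sketch: slide a window of length L over the bit sequence and count how many distinct L-bit patterns occur. The overlapping-window choice and the threshold name c_alpha below are assumptions for illustration; the exact null distribution and critical values come from the paper.

def distinct_patterns(bits, L):
    """bits: string of '0'/'1'; return the number of different L-bit patterns observed."""
    return len({bits[i:i + L] for i in range(len(bits) - L + 1)})

def test_randomness(bits, L, c_alpha):
    # reject randomness when 'few' distinct patterns appear, i.e. the count
    # falls below a critical value c_alpha taken from the exact distribution
    return "reject" if distinct_patterns(bits, L) < c_alpha else "accept"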
Journal:Informatica
Volume 24, Issue 2 (2013), pp. 315–337
Abstract
We consider a generalization of heterogeneous meta-programs by (1) introducing an extra level of abstraction within the meta-program structure, and (2) meta-program transformations. We define basic terms, formalize transformation tasks, and consider properties of meta-program transformations and rules to manage complexity through the following transformation processes: (1) reverse transformation, when a correct one-stage meta-program M1 is transformed into an equivalent two-stage meta-meta-program M2; (2) two-stage forward transformation, when M2 is transformed into a set of meta-programs, and each meta-program is transformed into a set of target programs. The results are as follows: (a) formalization of the transformation processes within the heterogeneous meta-programming paradigm; (b) introduction and validation of equivalent transformations of meta-programs into meta-meta-programs and vice versa; (c) introduction of metrics to evaluate the complexity of meta-specifications. The results are supported by examples, theoretical reasoning and experiments.
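As a toy illustration of the two abstraction levels (in Python rather than the heterogeneous meta-languages the paper formalizes): a one-stage meta-program M1 maps parameters directly to a target program, while a two-stage meta-meta-program M2 first produces a meta-program, which in turn produces the target program. The templates and names below are invented for illustration only.

def m1(op, a, b):
    # one-stage meta-program: parameters -> target program text
    return f"print({a} {op} {b})"

def m2(op):
    # two-stage meta-meta-program: higher-level parameter -> a meta-program
    def meta_program(a, b):
        return f"print({a} {op} {b})"
    return meta_program

# two-stage forward transformation yields the same target program as the one-stage form
assert m2("+")(2, 3) == m1("+", 2, 3)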
Journal:Informatica
Volume 24, Issue 2 (2013), pp. 291–313
Abstract
This study proposes an innovative Improved Hybrid PSO-GA (IHPG) algorithm, which combines the advantages of the PSO and GA algorithms. The IHPG algorithm uses the velocity and position update rules of PSO together with the GA concepts of selection, crossover and mutation. The study explores a quality monitoring experiment with three existing neural network approaches to data fusion of wireless sensor module measurements. Ten sensors are deployed in a sensing area, and digital conversion and weight adjustment of the collected data need to be carried out. The experimental results show that the accuracy of the estimated data can be improved and the randomness of the computation reduced by optimizing the smoothing parameter. According to the experimental analysis, IHPG performs better than PSO or GA alone when compared across the various neural network learning models.
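A minimal sketch of a hybrid PSO-GA loop of the kind described above is given below: particles are moved with the standard PSO velocity and position rules, and in each generation GA-style tournament selection, one-point crossover and Gaussian mutation are applied to the population. The parameter values and the objective function are illustrative assumptions, not the paper's IHPG settings.

import random

def hybrid_pso_ga(objective, dim, n_particles=20, iters=100,
                  w=0.7, c1=1.5, c2=1.5, pc=0.8, pm=0.05):
    X = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    gbest = min(X, key=objective)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):                      # PSO velocity/position update
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (pbest[i][d] - X[i][d])
                           + c2 * random.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            if objective(X[i]) < objective(pbest[i]):
                pbest[i] = X[i][:]
        new_X = []                                    # GA step on the population
        while len(new_X) < n_particles:
            p1 = min(random.sample(X, 2), key=objective)   # tournament selection
            p2 = min(random.sample(X, 2), key=objective)
            if random.random() < pc and dim > 1:
                cut = random.randrange(1, dim)             # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            new_X.append([g + random.gauss(0, 0.1) if random.random() < pm else g
                          for g in child])                 # Gaussian mutation
        X = new_X
        gbest = min(pbest + X, key=objective)[:]
    return gbest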