Journal:Informatica
Volume 17, Issue 3 (2006), pp. 363–380
Abstract
The Augmented Representation of Cultural Objects (ARCO) system provides museum curators with software and interface tools for developing virtual museum exhibitions, as well as a virtual environment for museum visitors over the World Wide Web or in information kiosks. The main purpose of the system is to offer an enhanced educational and entertaining experience to virtual museum visitors. In order to assess the usability of the system, two approaches have been employed: a questionnaire-based survey and a Cognitive Walkthrough session. Both approaches relied on expert evaluators, such as domain experts and usability experts. The results of this study show that the approach performed reasonably well with respect to the time, financial, and other resources consumed, since a great number of usability problems have been uncovered and many aspects of the system have been investigated. The knowledge gathered is intended to form the basis of a conceptual framework for diagnosing usability problems in systems in the area of Virtual Cultural Heritage.
Journal:Informatica
Volume 17, Issue 3 (2006), pp. 347–362
Abstract
This paper introduces a new concept of a convertible user designating confirmer partially blind signature, in which only the designated confirmer (designated by the user) and the user can verify and confirm the validity of given signatures and convert them into publicly verifiable ones. We give a formal definition of this concept and propose a concrete scheme, together with a proof of security and a brief analysis of efficiency. Assuming the intractability of the Discrete Logarithm Problem and the ROS-Problem, the proposed scheme is unforgeable under adaptive chosen-message attack.
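As a hedged illustration of the Discrete Logarithm Problem assumption named above, the toy computation below publishes y = g^x mod p and recovers x only by brute force, which is feasible here solely because the parameters are tiny; the values of g, p, and x are invented for illustration, and the paper's signature scheme itself is not reproduced.

```python
# Toy Discrete Logarithm Problem instance: given g, p and y = g^x mod p, find x.
# Brute force works only because p is tiny; real schemes use primes of 2048 or
# more bits, for which no efficient method is known.
p = 101                # small prime modulus (illustration only)
g = 2                  # a generator of the multiplicative group modulo 101
x = 47                 # the secret exponent
y = pow(g, x, p)       # the public value

recovered = next(e for e in range(p - 1) if pow(g, e, p) == y)
print(y, recovered)    # recovered == 47: discrete logs of a generator modulo a prime are unique
```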
Journal:Informatica
Volume 17, Issue 3 (2006), pp. 325–346
Abstract
The paper proposes a methodology for evaluating the functionality characteristics of specification languages. It describes the background of the proposed methodology, discusses the methodology in detail, and briefly describes experimental results obtained by applying it to evaluate the functionality of the Z and UML languages.
Journal:Informatica
Volume 17, Issue 3 (2006), pp. 309–324
Abstract
Three parallel algorithms for solving a 3D parabolic problem with a nonlocal boundary condition are considered. The forward and backward Euler finite-difference schemes and the LOD scheme are typical representatives of three general classes of parallel algorithms used to solve multidimensional parabolic initial-boundary value problems. All algorithms are modified to take into account the additional nonlocal boundary condition. The algorithms are implemented using the parallel array object tool ParSol, so that a parallel algorithm follows semi-automatically from the serial one. Results of computational experiments are presented, and the accuracy and efficiency of the presented parallel algorithms are tested.
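To make the finite-difference terminology concrete, the following is a minimal sketch of the explicit (forward Euler) scheme for a 1D heat equation; the grid sizes and initial profile are illustrative assumptions, and the paper's 3D problem, nonlocal boundary condition, and ParSol-based parallelization are not reproduced here.

```python
import numpy as np

# Explicit (forward Euler) finite differences for u_t = u_xx on (0, 1)
# with zero Dirichlet boundaries; a 1D illustration of the scheme class only.
nx, nt = 51, 500                  # illustrative numbers of grid points and time steps
h = 1.0 / (nx - 1)                # spatial step
tau = 0.4 * h**2                  # time step within the stability bound tau <= h^2 / 2
x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)             # illustrative initial condition

for _ in range(nt):
    u_next = u.copy()
    # u^{n+1}_i = u^n_i + (tau / h^2) * (u^n_{i-1} - 2 u^n_i + u^n_{i+1})
    u_next[1:-1] = u[1:-1] + tau / h**2 * (u[:-2] - 2.0 * u[1:-1] + u[2:])
    u = u_next                    # boundary values remain zero

print(u.max())                    # the profile decays over time, as expected
```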
Journal:Informatica
Volume 17, Issue 2 (2006), pp. 297–304
Abstract
The paper addresses the problem of discriminating homographs when a lengthy segment of an uttered word is missing. The discrimination is performed by a recognizer that operates on cepstrum coefficients extracted from the speech signal. To restore the missing speech segment, it is proposed to calculate speech signal characteristics, namely the fundamental frequency period and the intensity, rather than to use the known speech signal directly. Experiments show that the polynomial approximation of these speech signal characteristics improves homograph discrimination results. The extra computational burden associated with the proposed method is not high, because it only involves recalculating the already extracted Fourier coefficients.
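As a hedged illustration of the polynomial approximation idea, the sketch below fits a low-order polynomial to a fundamental-frequency contour measured on the frames around a gap and evaluates it over the missing frames; the frame times, contour values, and polynomial order are invented for illustration and do not come from the paper.

```python
import numpy as np

# Fit a quadratic to the pitch contour observed before and after a missing
# segment, then evaluate it over the gap (illustrative data only).
t_known = np.array([0, 1, 2, 3, 8, 9, 10, 11], dtype=float)                 # frame indices with data
f0_known = np.array([120, 122, 125, 127, 138, 139, 141, 142], dtype=float)  # contour values (Hz)

coeffs = np.polyfit(t_known, f0_known, deg=2)   # least-squares polynomial fit
t_missing = np.arange(4, 8, dtype=float)        # frames of the missing segment
f0_restored = np.polyval(coeffs, t_missing)     # approximated contour over the gap

print(f0_restored)
```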
Journal:Informatica
Volume 17, Issue 2 (2006), pp. 279–296
Abstract
Given a set of objects with profits (arbitrary, possibly negative, numbers) assigned not only to individual objects but also to pairs of them, the unconstrained binary quadratic optimization problem consists in finding a subset of objects for which the overall profit is maximized. In this paper, an iterated tabu search algorithm for solving this problem is proposed. Computational results for problem instances with up to 7000 variables (objects) are reported, and comparisons with other state-of-the-art heuristic methods are provided.
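For concreteness, the sketch below states the problem's objective in the usual 0/1 vector form and runs a greedy one-flip local search, the elementary move on which tabu search variants for this problem are built; the random instance and the plain greedy loop are illustrative assumptions, not the paper's iterated tabu search.

```python
import numpy as np

# Unconstrained binary quadratic optimization: maximize x^T Q x over x in {0,1}^n.
# Diagonal entries of the symmetric matrix Q hold object profits; off-diagonal
# entries hold pair profits (split over the two symmetric halves).
rng = np.random.default_rng(0)
n = 20
Q = rng.integers(-10, 11, size=(n, n))
Q = (Q + Q.T) // 2                      # random symmetric illustrative instance

def profit(x: np.ndarray) -> int:
    """Overall profit of the subset encoded by the 0/1 vector x."""
    return int(x @ Q @ x)

x = rng.integers(0, 2, size=n)          # random starting subset
best = profit(x)
improved = True
while improved:                          # greedy one-flip local search
    improved = False
    for i in range(n):
        x[i] ^= 1                        # tentatively flip variable i
        value = profit(x)
        if value > best:
            best, improved = value, True # keep improving flips
        else:
            x[i] ^= 1                    # undo non-improving flips

print(best, x)
```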
Journal:Informatica
Volume 17, Issue 2 (2006), pp. 259–278
Abstract
The objective is to investigate two emerging information technologies in graduate studies and scientific cooperation: the Internet and open source software. The two help each other in many ways, and we investigate their joint influence.
Results of complexity theory show the limitations of exact analysis, which explains the popularity of heuristic algorithms. It is well known that the efficiency of heuristics depends on their parameters, so automatic procedures for tuning heuristics are needed; such procedures help in comparing the results of different heuristics and enhance their efficiency as well (a minimal tuning sketch follows this abstract).
An initial presentation of the basic ideas is in (Mockus, 2000). Preliminary results of distance graduate studies are in (Mockus, 2006a). Examples of optimization of sequential statistical decisions are in (Mockus, 2006b).
In this paper, the theory and applications of the Bayesian Heuristic Approach are discussed. The next paper will consider examples of the Bayesian Approach to the automated tuning of heuristics, and the last paper will investigate examples of traditional optimization methods, including applications of linear and dynamic programming. These papers represent three parts of the same work; however, each part can be read independently.
All the algorithms are implemented as platform-independent Java applets or servlets, so readers can easily verify the results and apply them to studies and to real-life optimization models.
The information is available on the main website http://pilis.if.ktu.lt/~jmockus and four mirrors.
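The sketch below illustrates automatic parameter tuning in its plainest form, a random search over one parameter of a toy heuristic; the Bayesian Heuristic Approach discussed in the paper replaces such blind search with a Bayesian model of expected performance, and neither the toy heuristic nor the tuner is taken from the paper.

```python
import random

def toy_heuristic(p: float, seed: int) -> float:
    """Stand-in heuristic whose average result depends on a parameter p in [0, 1]."""
    rng = random.Random(seed)
    return -(p - 0.3) ** 2 + rng.gauss(0.0, 0.05)   # best expected value near p = 0.3

def tune(trials: int = 50, repeats: int = 20) -> float:
    """Plain random search: try random parameter values, keep the best average result."""
    best_p, best_value = 0.0, float("-inf")
    for _ in range(trials):
        p = random.random()                          # candidate parameter value
        value = sum(toy_heuristic(p, s) for s in range(repeats)) / repeats
        if value > best_value:
            best_p, best_value = p, value
    return best_p

print(tune())   # should land near 0.3
```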
Journal:Informatica
Volume 17, Issue 2 (2006), pp. 237–258
Abstract
Recently, genetic algorithms (GAs) and their hybrids have achieved great success in solving difficult combinatorial optimization problems. In this paper, issues related to the performance of the genetic search in the context of the grey pattern problem (GPP) are discussed. The main attention is paid to the investigation of solution recombination, i.e., crossover operators, which play an important role in developing robust genetic algorithms. We implemented seven crossover operators within the hybrid genetic algorithm (HGA) framework and carried out computational experiments in order to test the influence of the recombination operators on the genetic search process. We examined the one-point crossover, the uniform-like crossover, the cycle crossover, the swap path crossover, and others. A so-called multiple parent crossover, based on a special type of recombination of several solutions, was tried as well. The results obtained from the experiments on the GPP test instances demonstrate the promising efficiency of the swap path and multiple parent crossovers.
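As a hedged reference point, the sketch below shows the classic one-point crossover on binary-string chromosomes, the simplest operator in the family compared in the paper; the GPP uses a more constrained encoding (a fixed number of selected positions), so this generic binary version is illustrative only.

```python
import random

def one_point_crossover(parent_a: list[int], parent_b: list[int],
                        rng: random.Random) -> tuple[list[int], list[int]]:
    """Cut both parents at one random point and swap the tails."""
    assert len(parent_a) == len(parent_b)
    cut = rng.randrange(1, len(parent_a))   # cut point strictly inside the string
    child_1 = parent_a[:cut] + parent_b[cut:]
    child_2 = parent_b[:cut] + parent_a[cut:]
    return child_1, child_2

rng = random.Random(42)
a = [rng.randint(0, 1) for _ in range(12)]  # two illustrative parent chromosomes
b = [rng.randint(0, 1) for _ in range(12)]
print(one_point_crossover(a, b, rng))
```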
Journal:Informatica
Volume 17, Issue 2 (2006), pp. 225–236
Abstract
The two major Markov Random Field (MRF) based algorithms for image segmentation are Simulated Annealing (SA) and Iterated Conditional Modes (ICM). In practice, compared to SA, ICM provides reasonable segmentation and shows robust behavior in most cases. However, ICM strongly depends on the initialization phase.
In this paper, we combine the Bak–Sneppen model and Markov Random Fields to define a new image segmentation approach. We introduce a multiresolution technique in order to speed up the segmentation process and to improve the restoration process. Image pixels are viewed as lattice species of the Bak–Sneppen model, and the a-posteriori probability corresponds to a local fitness. At each cycle, some objectionable species are chosen for a random change in their fitness values; furthermore, the change in the fitness of each species engenders fitness changes for its neighboring species. After a certain number of iterations, the system converges to a Maximum A Posteriori estimate. In this multiresolution approach, we use a wavelet transform to reduce the size of the system.
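The following is a minimal sketch of the basic one-dimensional Bak–Sneppen update rule referenced above: the fitness of the least-fit species and of its neighbours is repeatedly replaced by fresh random values. The lattice size, step count, and uniform random fitness are illustrative assumptions; the paper's mapping of pixels to species, the a-posteriori fitness, and the wavelet-based multiresolution step are not reproduced.

```python
import random

def bak_sneppen(n: int = 100, steps: int = 10_000, seed: int = 1) -> list[float]:
    """Run the 1D Bak–Sneppen model on a ring of n species and return the fitness values."""
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n)]
    for _ in range(steps):
        worst = min(range(n), key=fitness.__getitem__)   # locate the least-fit species
        for j in (worst - 1, worst, worst + 1):          # it and its neighbours mutate
            fitness[j % n] = rng.random()
    return fitness

f = bak_sneppen()
print(min(f))   # after many steps most fitness values lie above a critical threshold
```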
Journal:Informatica
Volume 17, Issue 2 (2006), pp. 207–224
Abstract
The paper describes the development and performance of parallel algorithms for discrete element method (DEM) software. A spatial domain decomposition strategy and message-passing inter-processor communication have been implemented in the DEMMAT code for the simulation of visco-elastic frictional granular media. A novel algorithm combining link-cells for contact detection, static domain decomposition for parallelization, and MPI data transfer for exchanging particles between processors has been developed for distributed-memory PC clusters. The parallel software DEMMAT_PAR has been applied to model the compaction of spherical particles in a rectangular box. Two benchmark problems with different numbers of particles have been solved in order to measure the parallel efficiency of the code. The inter-processor communication has been examined in order to improve the domain decomposition topology and to achieve better load balancing. A speed-up of 11 has been obtained on 16 processors. The parallel performance study has been performed on the PC cluster VILKAS of Vilnius Gediminas Technical University, Lithuania.
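To illustrate the link-cell idea named above, the sketch below bins particles into cells no smaller than the largest contact distance so that contact candidates only have to be gathered from a particle's own cell and its neighbours; the 2D setting, particle data, and cell size are illustrative assumptions, and the DEMMAT code's 3D implementation, domain decomposition, and MPI exchange are not reproduced.

```python
from collections import defaultdict
from itertools import product

def find_contacts(positions, radii, cell_size):
    """Link-cell contact detection in 2D for circular particles."""
    cells = defaultdict(list)                       # (ix, iy) cell -> particle indices
    for idx, (x, y) in enumerate(positions):
        cells[(int(x // cell_size), int(y // cell_size))].append(idx)

    contacts = []
    for (ix, iy), members in cells.items():
        candidates = []                             # particles in this cell and its 8 neighbours
        for dx, dy in product((-1, 0, 1), repeat=2):
            candidates.extend(cells.get((ix + dx, iy + dy), []))
        for i in members:
            for j in candidates:
                if j <= i:
                    continue                        # report each pair only once
                rx = positions[i][0] - positions[j][0]
                ry = positions[i][1] - positions[j][1]
                if rx * rx + ry * ry <= (radii[i] + radii[j]) ** 2:
                    contacts.append((i, j))
    return contacts

positions = [(0.10, 0.10), (0.25, 0.12), (0.90, 0.90)]   # illustrative particle centres
radii = [0.1, 0.1, 0.1]
print(find_contacts(positions, radii, cell_size=0.2))    # expects the single contact (0, 1)
```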