Journal:Informatica
Volume 21, Issue 4 (2010), pp. 521–532
Abstract
This paper analyses the possibilities of integrating different technological and knowledge representation techniques into a framework for the remote control of multiple agents such as wheelchair-type robots. Large-scale recognition of the emotional states of disabled persons generates large amounts of multi-dimensional data and requires complex recognition mechanisms, based on the integration of different knowledge representation techniques and complex inference models. The problem is to reveal the main components of a diagnosis and to construct flexible decision-making models. Sensors can record primary data for monitoring objects; however, the recognition of abnormal situations, the clustering of emotional states and the resolution of certain types of diagnoses remain open issues for bio-robot constructors. The prediction criteria for diagnosing the emotional situations of disabled persons are described using a knowledge-based model of Petri nets. The research results present a multi-layered framework architecture that integrates artificial agents for diagnosis recognition and the control of further actions. An extension of Petri nets is introduced in the reasoning modules of robots that work in real time. The framework provides movement support for disabled individuals. Fuzzy reasoning is described using fuzzy logical Petri nets in order to determine the physiological state of disabled individuals by recognizing their emotions during different activities.
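As a rough illustration of how a fuzzy logical Petri net can propagate certainty values, the sketch below uses a common min/certainty-factor firing rule; the place names, threshold and rule are hypothetical and not taken from the authors' model.

```python
# Minimal fuzzy Petri net sketch (illustrative; not the authors' exact model).
# Each place holds a truth degree in [0, 1]; a transition fires when the
# minimum truth of its input places reaches its threshold, and writes
# min(inputs) * certainty_factor to its output place.

def fire(places, transition):
    inputs, output, threshold, cf = transition
    degree = min(places[p] for p in inputs)
    if degree >= threshold:
        places[output] = max(places.get(output, 0.0), degree * cf)
    return places

# Hypothetical emotion-recognition rule: high heart rate AND rapid breathing
# -> "stressed" with certainty factor 0.9.
places = {"high_heart_rate": 0.8, "rapid_breathing": 0.7}
rule = (("high_heart_rate", "rapid_breathing"), "stressed", 0.5, 0.9)
fire(places, rule)
print(places["stressed"])  # min(0.8, 0.7) * 0.9 ≈ 0.63
```

The max() on the output place lets several rules accumulate evidence for the same emotional state, which is the usual convention in fuzzy Petri net reasoning.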
Journal:Informatica
Volume 21, Issue 4 (2010), pp. 505–519
Abstract
For many businesses and organizations, the achievement of interoperability has proven to be a highly desirable goal. However, without efficient schema mapping mechanisms or models that allow for the storage and management of information from several distinct systems, the goal of interoperability is impossible to attain. Due to the role of XML as a standard in information exchange, considerable research has been undertaken to find effective methods or algorithms for the conversion from XML to database models. This paper reviews leading research in the field – focusing particularly on three novel approaches – and proposes an original schema mapping mechanism using a conceptual model which, due to its higher level of abstraction, maximizes the preservation of semantics.
Journal:Informatica
Volume 21, Issue 4 (2010), pp. 487–504
Abstract
Enterprise systems should be assembled from components and services according to an orchestration schema, taking into account not only functional requirements but also the resulting Quality of Service (QoS). In other words, QoS-aware composition of services and components must be performed. The problem is to determine which components or services should be employed so that the resulting system optimizes some QoS attributes while satisfying other QoS constraints. The paper proposes the Constraint Logic Programming approach to solve this problem; that is, we treat it as a discrete optimization and constraint satisfaction problem.
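To make the problem statement concrete, the toy sketch below encodes a composition as a discrete choice of one candidate service per task, with a QoS constraint (total latency) and an objective (total cost); the tasks, candidates and numbers are invented, and the paper itself solves such instances with Constraint Logic Programming rather than brute-force enumeration.

```python
# Illustrative sketch of QoS-aware composition as a discrete optimization and
# satisfaction problem (hypothetical data; enumeration stands in for CLP).
from itertools import product

# For each abstract task, candidate services as (cost, latency_ms) pairs.
candidates = {
    "auth":    [(3, 120), (5, 80)],
    "payment": [(4, 200), (7, 90)],
    "ship":    [(2, 150), (6, 60)],
}

MAX_LATENCY = 400  # QoS constraint on end-to-end latency

def best_composition():
    tasks = list(candidates)
    best = None
    for choice in product(*(candidates[t] for t in tasks)):
        cost = sum(c for c, _ in choice)          # objective to minimize
        latency = sum(l for _, l in choice)       # constrained attribute
        if latency <= MAX_LATENCY and (best is None or cost < best[0]):
            best = (cost, latency, dict(zip(tasks, choice)))
    return best

print(best_composition())  # cheapest feasible composition
```

Note that the globally cheapest selection (cost 9) violates the latency constraint, so the search must trade cost against latency, which is exactly what makes the composition problem non-trivial.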
Journal:Informatica
Volume 21, Issue 4 (2010), pp. 471–486
Abstract
The problem of automatic classification of scientific texts is considered. Methods based on statistical analysis of the probabilistic distributions of scientific terms in texts are discussed. Procedures for selecting the most informative terms and a method for exploiting auxiliary information about term positions are presented. The results of an experimental evaluation of the proposed algorithms and procedures on real-world data are reported.
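A minimal sketch of the informative-term-selection idea: score each term by how differently it is distributed across classes, using the spread of its class-conditional relative frequencies. The toy corpus and this particular score are illustrative assumptions, not the paper's actual statistics.

```python
# Toy informative-term selection: terms whose relative frequency differs
# most between classes are the most useful for classification.
from collections import Counter

docs = [
    ("physics", "quantum field energy state energy"),
    ("physics", "energy quantum particle"),
    ("biology", "cell gene protein cell"),
    ("biology", "gene protein energy"),
]

def term_scores(docs):
    per_class = {}
    for label, text in docs:
        per_class.setdefault(label, Counter()).update(text.split())
    total = sum(per_class.values(), Counter())
    scores = {}
    for term in total:
        # Spread of the term's class-conditional relative frequencies.
        freqs = [c[term] / sum(c.values()) for c in per_class.values()]
        scores[term] = max(freqs) - min(freqs)
    return scores

scores = term_scores(docs)
top = sorted(scores, key=scores.get, reverse=True)[:3]
print(top)  # the most class-discriminative terms
```

A term like "energy" that occurs in both classes scores lower than class-specific terms such as "cell" or "quantum", which is the effect a real informativeness statistic (e.g. chi-square) also captures.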
Journal:Informatica
Volume 21, Issue 3 (2010), pp. 455–470
Abstract
In this article, a method is proposed for analysing thermovision-based video data that characterize the dynamics of the temperature anisotropy of heart tissue in the spatial domain. Many cardiac rhythm disturbances are currently treated by applying destructive energy sources, the most common of which is the radio-frequency ablation procedure. However, the risk of complications, including arrhythmia recurrence, remains rather high. The drawback of this methodology is that the destruction procedure cannot be monitored in the visible spectrum, which makes it impossible to control the ablation efficiency. To understand the nature of possible complications and to control the treatment process, thermovision could be used. The aim of the study was to analyse possible mechanisms of these complications and to measure and determine optimal radio-frequency ablation parameters based on the analysis of video data acquired using thermovision.
Journal:Informatica
Volume 21, Issue 3 (2010), pp. 425–454
Abstract
Turing machines are among the simplest abstract computational devices that can be used to investigate the limits of computability. In this paper, they are considered from several points of view that emphasize the importance and the relativity of the mathematical languages used to describe them. A deep investigation is performed on the interrelations between mechanical computations and their mathematical descriptions that emerge when a human (the researcher) starts to describe a Turing machine (the object of the study) in different mathematical languages (the instruments of investigation). Together with traditional mathematical languages using such concepts as ‘enumerable sets’ and ‘continuum’, a new computational methodology allowing one to measure the number of elements of different infinite sets is used in this paper. It is shown how the mathematical languages used to describe the machines limit our possibilities to observe them. In particular, notions of observable deterministic and non-deterministic Turing machines are introduced, and conditions ensuring that the latter can be simulated by the former are established.
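For readers unfamiliar with the underlying device, here is a minimal deterministic Turing machine simulator; the encoding (a rule table keyed by state and symbol) and the bit-flipping example are one conventional choice among the many description languages the paper is concerned with.

```python
# A minimal deterministic Turing machine simulator (one possible encoding;
# the paper studies how the chosen description language shapes what can be
# observed about such machines, not this particular implementation).

def run_tm(rules, tape, state="q0", head=0, max_steps=1000):
    tape = dict(enumerate(tape))     # sparse tape; "_" is the blank symbol
    for _ in range(max_steps):
        symbol = tape.get(head, "_")
        if (state, symbol) not in rules:
            return state, tape       # halt: no applicable rule
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("step budget exhausted")

# Example machine: flip every bit until the first blank, then halt.
rules = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
}
state, tape = run_tm(rules, "1011")
print("".join(tape[i] for i in sorted(tape)))  # prints "0100"
```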
Journal:Informatica
Volume 21, Issue 3 (2010), pp. 409–424
Abstract
The paper addresses the problem of detecting and extracting over-saturated protein spots in two-dimensional electrophoresis gel images. An effective technique for the detection and reconstruction of over-saturated protein spots is proposed. The paper presents: an algorithm for adapting the median filter mask for initial filtering of the gel image; the models of over-saturation used for gel image analysis; several models of protein spots used for reconstruction; and a technique for the automatic search and reconstruction of over-saturated protein spots. Experimental investigation confirms that the proposed search technique finds up to 96% of over-saturated protein spots. Moreover, the proposed flexible protein spot shape models for reconstruction are faster and more accurate than the flexible diffusion model.
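The core of over-saturation detection is finding intensity-clipped plateaus: connected regions of pixels stuck at the sensor maximum. The toy flood-fill below illustrates only that first step on an invented 4×4 image; the paper's actual search and reconstruction models are more sophisticated.

```python
# Toy detection of over-saturated (intensity-clipped) regions in a gel image:
# pixels at the sensor maximum forming a connected plateau are candidates
# for spot reconstruction (illustrative only; not the paper's algorithm).

SATURATION = 255

def saturated_plateaus(image):
    h, w = len(image), len(image[0])
    seen, plateaus = set(), []
    for y in range(h):
        for x in range(w):
            if image[y][x] == SATURATION and (y, x) not in seen:
                # flood-fill the 4-connected clipped region
                stack, region = [(y, x)], []
                seen.add((y, x))
                while stack:
                    cy, cx = stack.pop()
                    region.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] == SATURATION
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                plateaus.append(region)
    return plateaus

image = [
    [10,  40,  80,  20],
    [30, 255, 255,  50],
    [25, 255, 255,  60],
    [12,  35,  70,  15],
]
print(len(saturated_plateaus(image)))  # prints 1 (one clipped plateau)
```

Once such a plateau is located, a spot shape model can be fitted to the unclipped surroundings to estimate the intensity lost to saturation.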
Journal:Informatica
Volume 21, Issue 3 (2010), pp. 393–407
Abstract
In a fuzzy identity-based encryption (IBE) scheme, a user with the secret key for an identity ID is able to decrypt a ciphertext encrypted with another identity ID' if and only if ID and ID' are within a certain distance of each other as judged by some metric. Fuzzy IBE also allows one to encrypt a document to all users that have a certain set of attributes. In 2005, Sahai and Waters first proposed the notion of fuzzy IBE and proved the security of their scheme under the selective-ID model. Currently, no fuzzy IBE scheme is available that is fully CCA2 secure in the standard model. In this paper, we propose a new fuzzy IBE scheme which achieves IND-FID-CCA2 security in the standard model with a tight reduction. Moreover, the size of the public parameters is independent of the number of attributes associated with an identity.
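The "fuzzy" matching condition itself is simple to state: with identities viewed as attribute sets, decryption succeeds iff the key identity and ciphertext identity overlap in at least d attributes (the Sahai–Waters error-tolerance rule). The snippet below illustrates only this condition with made-up attributes, not any of the cryptography.

```python
# Fuzzy IBE matching condition: a key for identity ID decrypts a ciphertext
# for identity ID' iff |ID ∩ ID'| >= d (set-overlap distance).
# Attribute names are hypothetical; no cryptographic operations are shown.

def can_decrypt(key_attrs, ct_attrs, d):
    return len(set(key_attrs) & set(ct_attrs)) >= d

key = {"student", "cs-dept", "year-2010", "campus-a"}
ct  = {"student", "cs-dept", "year-2010", "campus-b"}
print(can_decrypt(key, ct, d=3))  # True: 3 attributes overlap
print(can_decrypt(key, ct, d=4))  # False: only 3 of 4 match
```

In the actual scheme this threshold is enforced cryptographically via secret sharing over the key components, so a key holder cannot cheat the overlap test.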
Journal:Informatica
Volume 21, Issue 3 (2010), pp. 375–391
Abstract
In this paper, a new semilogarithmic quantizer for the Laplacian distribution is presented. It is simpler than the classic A-law semilogarithmic quantizer since it has unit gain around zero. It also yields a 2.97 dB higher signal-to-quantization-noise ratio (SQNR) at the reference variance compared to the A-law, and is therefore more suitable for adaptation. Forward adaptation of this quantizer is done on a frame-by-frame basis. In this way, the G.712 standard is satisfied with 7 bits/sample, which is not possible with the classic A-law. Within each frame, subframes are formed and a lossless encoder is applied to them. In that way, double adaptation is performed: adaptation to variance within frames and adaptation to amplitude within subframes. A joint design of the quantizer and the lossless encoder is carried out, which gives better performance. As a result, the G.712 standard is satisfied with only 6.43 bits/sample. Experimental results, obtained by applying this model to a speech signal, are presented. It is shown that the experimental and theoretical results match very well (the difference is less than 1.5%). The models presented in this paper can be applied to speech and to any other signal with a Laplacian distribution.
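For reference, the classic A-law companding the paper improves upon looks as follows: a linear segment near zero (with gain A/(1+ln A), not unit gain) and a logarithmic segment elsewhere, followed by uniform quantization in the compressed domain. The proposed quantizer is not shown; this is only the baseline it is compared against.

```python
# Classic A-law compressor (A = 87.6) plus uniform quantization in the
# compressed domain ("companding"). The paper's new quantizer replaces the
# linear segment's gain A/(1+ln A) with unit gain around zero.
import math

A = 87.6

def a_law_compress(x):  # x normalized to [-1, 1]
    s = math.copysign(1.0, x)
    ax = abs(x)
    if ax < 1.0 / A:
        y = A * ax / (1.0 + math.log(A))          # linear segment
    else:
        y = (1.0 + math.log(A * ax)) / (1.0 + math.log(A))  # log segment
    return s * y

def quantize(x, bits):
    levels = 2 ** bits
    y = a_law_compress(x)
    # midpoint of the uniform cell containing y
    return (math.floor(y * levels / 2) + 0.5) * 2 / levels

print(round(a_law_compress(1.0), 4))  # 1.0 at full scale
```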
Journal:Informatica
Volume 21, Issue 3 (2010), pp. 361–374
Abstract
The paper deals with the use of formant features in dynamic time warping (DTW) based speech recognition. These features can be easily visualized and give new insight into the reasons for speech recognition errors. The formant feature extraction method, based on singular prediction polynomials, has been applied to the recognition of isolated words. However, the speech recognition performance depends on the order of the singular prediction polynomials, on whether symmetric or antisymmetric polynomials are used, and on whether an even or odd order of these polynomials is chosen. It is also important to know how informative the separate formants are and how the recognition results depend on other parameters of the recognition system, such as the analysis frame length, the number of formants used in recognition, the frequency scale used to represent the formant features, and the preemphasis filter parameters. By properly choosing the processing parameters, it is possible to optimize the speech recognition performance.
The aim of our current investigation is to optimize the performance of formant-feature-based isolated word recognition by varying the processing parameters of the recognition system, as well as to identify improvements that could make the system more robust to white noise. The optimization experiments were carried out using speech recordings of 111 Lithuanian words. The speech signals were recorded in a conventional room environment (SNR = 30 dB). White noise was then generated at predefined levels (65 dB, 60 dB and 55 dB) and added to the test utterances, and the recognition performance was evaluated at each noise level.
The optimization experiments allowed us to considerably improve the performance of the formant-feature-based speech recognition system and made it more robust to white noise.
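The matching step such a recognizer relies on can be sketched as a plain DTW distance; the toy 1-D "features" and word templates below are illustrative stand-ins for the formant feature vectors the system actually uses.

```python
# Minimal dynamic time warping (DTW) distance between two feature sequences,
# with nearest-template classification (toy 1-D features; a real recognizer
# compares multi-dimensional formant feature frames).

def dtw(a, b):
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])  # local distance between frames
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# A test utterance is assigned to the reference template with minimal DTW cost.
templates = {"labas": [1, 3, 4, 3, 1], "ne": [2, 2, 1]}
test = [1, 3, 3, 4, 3, 1]
print(min(templates, key=lambda w: dtw(templates[w], test)))  # prints "labas"
```

DTW absorbs the timing differences between utterances of the same word, which is why it was long the standard matcher for isolated word recognition.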