Journal: Informatica
Volume 20, Issue 4 (2009), pp. 555–578
Abstract
ASPECTJ and Composition Filters are well-known and influential approaches among the wide range of aspect-oriented programming languages that have appeared in the last decade. Although both approaches are relatively mature and much research has been devoted to their enhancement and to their use in practical applications, there has so far been no attempt to compare the two approaches in depth. This article is a step towards such a comparison; it proposes a mapping between ASPECTJ and Composition Filters that puts the two approaches to the test by confronting and relating their concepts. Our work shows that the mapping is neither straightforward nor one-to-one, despite the fact that the two approaches belong to the same category and extend the same Java language.
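For readers unfamiliar with the two languages, the following minimal sketch (not taken from the article; the Account class and withdraw method are invented) shows the kind of crosscutting concern whose concepts such a mapping has to relate: an annotation-style ASPECTJ aspect traces calls to a method, whereas in Composition Filters the same tracing concern would be expressed as a declarative filter superimposed on the Account class rather than as a pointcut-and-advice pair.

    import org.aspectj.lang.JoinPoint;
    import org.aspectj.lang.annotation.Aspect;
    import org.aspectj.lang.annotation.Before;

    // Illustrative only: Account and withdraw(..) are hypothetical names.
    @Aspect
    public class TracingAspect {

        // The pointcut picks out every call to Account.withdraw; the advice
        // body runs before each intercepted call proceeds.
        @Before("call(* Account.withdraw(..))")
        public void traceWithdraw(JoinPoint jp) {
            System.out.println("About to execute: " + jp.getSignature());
        }
    }

In ASPECTJ the concern lives in a separate aspect module selected by a pointcut; in Composition Filters it would be attached to the class interface as an input filter, which is exactly the kind of conceptual difference that makes the mapping non-trivial.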
Journal: Informatica
Volume 20, Issue 4 (2009), pp. 539–554
Abstract
Digital signal processing is one of the most powerful technologies, driven by achievements in science and electronics engineering. Its advances have significantly influenced communications, medical technology, radiolocation and other fields. Digital signal processors are typically used for the efficient solution of digital signal processing problems, and today they are applied in practically every field in which real-time information processing is needed. The creation of diagnostic medical systems is one of the promising fields for digital signal processors. The aim of this work was to create a digital mathematical model of a blood circulation analysis system that uses digital signal processing in place of the analogue nodes of the device. In the first stage, the operating algorithm of the blood circulation analysis system and its mathematical model were created in the Matlab–Simulink environment. In the second stage, the mathematical model was tested experimentally: a mathematically simulated Doppler signal was transmitted to tissue and reflected, the signal was processed digitally, the blood flow direction was determined, and the blood velocity was estimated. Experiments were also performed with real signals recorded while examining patients in eye clinics. The results obtained confirmed the adequacy of the created mathematical model with respect to the real analogue blood circulation analysis system (Lizi et al., 2003).
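The abstract does not state the underlying relation, but ultrasound blood-flow estimation of this kind is conventionally based on the textbook Doppler equation (given here for orientation, not quoted from the paper):

$$ f_d = \frac{2 f_0 v \cos\theta}{c}, \qquad\text{so}\qquad v = \frac{f_d\, c}{2 f_0 \cos\theta}, $$

where $f_0$ is the transmitted ultrasound frequency, $f_d$ the measured Doppler shift, $v$ the blood velocity, $\theta$ the angle between the beam and the flow, and $c$ the speed of sound in tissue; the sign of $f_d$ indicates the flow direction.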
Journal: Informatica
Volume 20, Issue 4 (2009), pp. 519–538
Abstract
The article addresses the issues of combinatorial evolution of standards for the transmission of multimedia information, including the following: (a) brief descriptions of basic combinatorial models such as multicriteria ranking, knapsack-like problems, clustering, combinatorial synthesis, and multistage design; (b) a description of the MPEG series of standards for video information processing and a structural (combinatorial) description of the system changes across the standards; (c) a set of system change operations (including a multi-attribute description of the operations and binary relations over the operations); (d) combinatorial models for the system changes; and (e) a multistage combinatorial scheme (heuristic) for the analysis of the system changes. Expert experience is used. Numerical examples illustrate the suggested problems, models, and procedures.
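As a concrete illustration of the "knapsack-like" model mentioned above, the following is a minimal sketch (the costs, profits and budget are invented, not data from the article) of a 0/1 knapsack dynamic program that could select a subset of candidate system change operations under a resource budget:

    // Minimal 0/1 knapsack sketch: choose system-change operations maximising
    // total profit under a budget (illustrative numbers only).
    public class ChangeSelection {

        static int knapsack(int[] cost, int[] profit, int budget) {
            int[] best = new int[budget + 1];              // best[b] = max profit with budget b
            for (int i = 0; i < cost.length; i++) {
                for (int b = budget; b >= cost[i]; b--) {  // backwards: each operation used once
                    best[b] = Math.max(best[b], best[b - cost[i]] + profit[i]);
                }
            }
            return best[budget];
        }

        public static void main(String[] args) {
            int[] cost = {3, 4, 2, 5};     // hypothetical implementation costs of operations
            int[] profit = {5, 6, 3, 8};   // hypothetical improvement in system quality
            System.out.println("Best achievable profit: " + knapsack(cost, profit, 9));
        }
    }

The multicriteria ranking and multistage design models in the article play an analogous role: they score and combine change operations, with expert judgement supplying the attribute values.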
Journal: Informatica
Volume 20, Issue 4 (2009), pp. 499–518
Abstract
The main scientific problem investigated in this paper is the multiple criteria evaluation of the quality of the main components of e-learning systems, i.e., learning objects (LOs) and virtual learning environments (VLEs). The aim of the paper is to analyse the existing LO and VLE quality evaluation methods and to create more comprehensive methods based on a learning individualisation approach. The LO and VLE quality evaluation criteria are further investigated as optimisation parameters, and several optimisation methods are explored for application. The application of the experts' additive utility function, using evaluation criteria ratings and their weights, is explored in more detail. These new elements distinguish the present work from earlier work in the area.
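The experts' additive utility function referred to above has the standard weighted-sum form (a generic formulation; the concrete criteria, ratings and weights are those elicited in the paper):

$$ U(a) = \sum_{i=1}^{n} w_i\, r_i(a), \qquad \sum_{i=1}^{n} w_i = 1, \quad w_i \ge 0, $$

where $r_i(a)$ is the rating of alternative $a$ (an LO or a VLE) on quality criterion $i$ and $w_i$ is the weight assigned by the experts to that criterion; the alternative with the largest $U(a)$ is preferred.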
Journal: Informatica
Volume 20, Issue 4 (2009), pp. 487–498
Abstract
It is well known that voice segments and the corresponding data packets are not equally valuable and significant for the decoding and comprehension of a speech signal. Some lost segments may only slightly worsen the audible quality, while others cause strong distortion of the speech signal. Despite this, the differing importance of voice segments is not fully exploited in the current generation of digital voice transmission systems, and there is a fundamental problem in discriminating the importance and value of different voice frames. In this paper the concept of the "value of a voice frame" is introduced, a metric and means for evaluating and measuring voice frame value are proposed, and the results of voice frame value measurements are presented.
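The abstract does not define the metric itself; the sketch below uses short-term frame energy as a crude stand-in for frame importance (this proxy is an assumption for illustration only and is not the value metric proposed in the paper), simply to show the kind of per-frame scoring involved:

    // Illustrative only: short-term energy as a crude proxy for voice-frame
    // importance. NOT the value metric proposed in the paper.
    public class FrameValue {

        // Returns the average energy of one speech frame of PCM samples.
        static double frameEnergy(short[] frame) {
            double sum = 0.0;
            for (short s : frame) {
                sum += (double) s * s;
            }
            return sum / frame.length;
        }

        public static void main(String[] args) {
            short[] voiced = {2000, -1800, 2200, -2100};   // hypothetical voiced samples
            short[] silence = {10, -12, 8, -9};            // hypothetical background noise
            System.out.println("Voiced frame energy:  " + frameEnergy(voiced));
            System.out.println("Silence frame energy: " + frameEnergy(silence));
        }
    }

A transmission system aware of such per-frame scores could, for example, protect or retransmit high-value frames while dropping low-value ones first under congestion.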
Journal: Informatica
Volume 20, Issue 4 (2009), pp. 477–486
Abstract
In the present paper, a neural network theory based on the assumptions of the Ising model is considered. Indirect couplings, Dirac distributions and a corrected Hebb rule are introduced and analyzed. The embedded patterns memorized in the neural network and the indirect couplings are treated as random. In addition to the complete theory based on Dirac distributions, simplified stationary mean-field equations and their solutions, which take into account the ergodicity of the average overlap and the indirect order parameter, are presented. Modeling results are presented that corroborate the theoretical statements and their applied aspects.
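For orientation, the classical (uncorrected) Hebb rule for the couplings and the simplest stationary mean-field equation for the overlap in Ising-type networks are given below; the corrected Hebb rule, indirect couplings and indirect order parameter studied in the paper extend these standard forms:

$$ J_{ij} = \frac{1}{N} \sum_{\mu=1}^{p} \xi_i^{\mu} \xi_j^{\mu}, \qquad m = \tanh(\beta m), $$

where $\xi_i^{\mu} \in \{-1,+1\}$ are the stored patterns, $J_{ij}$ the pairwise couplings, $m$ the overlap with a retrieved pattern (the order parameter) and $\beta$ the inverse temperature.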
Journal: Informatica
Volume 20, Issue 4 (2009), pp. 461–476
Abstract
In this paper, we propose a new ID-based threshold signature scheme from bilinear pairings, which is provably secure in the random oracle model under the bilinear Diffie–Hellman assumption. Our scheme adopts the approach of sharing the private key associated with an identity rather than the master key of the PKG. Compared with the state-of-the-art work by Baek and Zheng, our scheme has the following advantages. (1) The round complexity of the threshold signing protocol is optimal: during the signing procedure, each party broadcasts only one message. (2) The communication channel is optimal: during the threshold signing procedure, a broadcast channel among the signers suffices, and no private channel between any two signing parties is needed. (3) Our scheme is much more efficient than the Baek–Zheng scheme in terms of computation, since we avoid using bilinear pairings as far as possible. Indeed, the private key of an identity is distributed indirectly by sharing a number $x_{ID} \in \mathbb{Z}^{*}_{q}$, which is much more efficient than directly sharing an element of the bilinear group, and the computationally expensive distributed key generation protocol based on the bilinear map is avoided. (4) Finally, proactive security can easily be added to our scheme.
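The following is a minimal sketch of the kind of scalar secret sharing the abstract alludes to, i.e. Shamir t-out-of-n sharing of a value playing the role of $x_{ID}$ modulo a prime $q$ (toy parameters; this shows only the sharing and reconstruction step, not the paper's threshold signing protocol):

    import java.math.BigInteger;
    import java.security.SecureRandom;

    // Toy sketch of Shamir t-out-of-n secret sharing of a scalar modulo a prime q.
    // It only illustrates distributing a value such as x_ID among signers.
    public class ShamirSketch {

        public static void main(String[] args) {
            SecureRandom rnd = new SecureRandom();
            BigInteger q = BigInteger.probablePrime(160, rnd);               // toy prime modulus
            BigInteger secret = new BigInteger(q.bitLength() - 1, rnd).mod(q); // plays the role of x_ID

            int t = 3, n = 5;                                   // any 3 of 5 shares suffice
            BigInteger[] coeff = new BigInteger[t];             // f(x) = secret + c1*x + c2*x^2
            coeff[0] = secret;
            for (int i = 1; i < t; i++) {
                coeff[i] = new BigInteger(q.bitLength() - 1, rnd).mod(q);
            }

            // Share for party i is f(i) mod q, evaluated with Horner's rule.
            BigInteger[] share = new BigInteger[n + 1];
            for (int i = 1; i <= n; i++) {
                BigInteger x = BigInteger.valueOf(i), y = BigInteger.ZERO;
                for (int k = t - 1; k >= 0; k--) {
                    y = y.multiply(x).add(coeff[k]).mod(q);
                }
                share[i] = y;
            }

            // Reconstruct from shares 1..t with Lagrange interpolation at x = 0.
            BigInteger recovered = BigInteger.ZERO;
            for (int i = 1; i <= t; i++) {
                BigInteger num = BigInteger.ONE, den = BigInteger.ONE;
                for (int j = 1; j <= t; j++) {
                    if (j == i) continue;
                    num = num.multiply(BigInteger.valueOf(-j)).mod(q);
                    den = den.multiply(BigInteger.valueOf(i - j)).mod(q);
                }
                BigInteger lagrange = num.multiply(den.modInverse(q)).mod(q);
                recovered = recovered.add(share[i].multiply(lagrange)).mod(q);
            }
            System.out.println("Recovered equals secret: " + recovered.equals(secret));
        }
    }

Sharing the scalar in this way is cheap; the efficiency gain claimed in the abstract comes from never having to run such a distributed protocol over the bilinear group itself.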
Journal: Informatica
Volume 20, Issue 3 (2009), pp. 439–460
Abstract
Business rules are a relatively new addition to the development of Enterprise Resource Planning (ERP) systems, which are a kind of business information system. Recently, several relevant enhancements of existing business information systems engineering methods have been introduced, although open issues remain regarding how business rules may be used to improve the qualitative and quantitative attributes of such information systems. The paper discusses the business information systems engineering issues that arise from using the business rules approach. It also introduces several ways of involving business rules aimed at ensuring the agility of ERP systems development, based on ongoing research in the field, including research carried out by the authors.
Journal: Informatica
Volume 20, Issue 3 (2009), pp. 417–438
Abstract
The widespread use of the XML format for document representation and message exchange has influenced data integration techniques in recent years. The development of various XML languages, methods, and tools has given rise to so-called XML technology. Enterprise Information Integration (EII) requires an accurate, precise and complete understanding of the disparate data sources, of the needs of the information consumers, and of how these map to the business concepts of the enterprise. Any integration takes place in the context of an Enterprise Information System (EIS). In the paper we explain various approaches to EII and its architectures, as well as its relationship to Enterprise Application Integration (EAI). We introduce the basic features and issues of EII and justify why XML technology provides sufficiently powerful support for tools enabling EII. In particular, a database approach to XML provides a universal solution that enables the construction of tools for achieving EII. In the paper we present some features of XML technology, mainly its database part, and show how it can be used in EII.
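As a small, self-contained illustration of the kind of declarative querying over XML data on which such a database approach builds (using only the standard javax.xml APIs; the tiny inline document and its element names are invented for this example):

    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPathConstants;
    import javax.xml.xpath.XPathFactory;
    import java.io.ByteArrayInputStream;
    import org.w3c.dom.Document;
    import org.w3c.dom.NodeList;

    // Minimal illustration of querying XML data with the standard Java XPath API.
    public class XmlQueryDemo {

        public static void main(String[] args) throws Exception {
            String xml = "<customers>"
                    + "<customer country=\"LT\"><name>Ona</name></customer>"
                    + "<customer country=\"CZ\"><name>Pavel</name></customer>"
                    + "</customers>";

            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));

            // Select the names of customers from a given country, as a query over the data.
            NodeList names = (NodeList) XPathFactory.newInstance().newXPath()
                    .evaluate("/customers/customer[@country='CZ']/name/text()",
                              doc, XPathConstants.NODESET);

            for (int i = 0; i < names.getLength(); i++) {
                System.out.println(names.item(i).getNodeValue());
            }
        }
    }

A full database approach to XML, as discussed in the paper, adds persistent storage, indexing and richer query languages (such as XQuery) on top of this kind of declarative access.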
Journal: Informatica
Volume 20, Issue 3 (2009), pp. 397–416
Abstract
The Semantic Web is envisioned as a semantic description of data and services that enables unambiguous computerized interpretation. Thanks to semantic description, computers can perform demanding tasks such as the automated discovery of, and access to, heterogeneous data sources. Although this is possible with existing technologies, the combination of web service technology, ontologies and generative programming methods makes it simpler and more efficient. This paper presents a model for the dynamic generation of web services for data retrieval from heterogeneous data sources using ontologies. The emphasis is on the dynamic generation of web services customized to a particular user, based on a request defined by an ontology. The paper also describes a prototype implementation of the model. Some advantages of our approach over other approaches are also presented.
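Below is a minimal sketch of the kind of data-retrieval endpoint such a generator might emit, written here by hand with standard JAX-WS annotations (the class, method and parameter names are invented; in the approach above a class like this would be produced on the fly from an ontology-defined request rather than coded manually):

    import javax.jws.WebMethod;
    import javax.jws.WebParam;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;
    import java.util.Arrays;
    import java.util.List;

    // Hypothetical, hand-written stand-in for a generated data-retrieval service.
    @WebService
    public class DataRetrievalService {

        @WebMethod
        public List<String> findRecords(@WebParam(name = "keyword") String keyword) {
            // Placeholder lookup; a generated service would query the mapped data source.
            return Arrays.asList("record-1 matching " + keyword,
                                 "record-2 matching " + keyword);
        }

        public static void main(String[] args) {
            // Publish the service locally so a client can fetch its WSDL.
            Endpoint.publish("http://localhost:8080/dataRetrieval", new DataRetrievalService());
            System.out.println("Service published at http://localhost:8080/dataRetrieval?wsdl");
        }
    }

In the generative setting described in the paper, the operation signature and the underlying query would be derived from the ontology-defined user request, so each published service is tailored to one consumer's information need.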