Journal:Informatica
Volume 15, Issue 1 (2004), pp. 3–22
Abstract
The paper deals with an intelligent functional model for optimizing product design and its manufacturing process in hybrid manufacturing systems consisting of people, machines, and computers. A knowledge-based framework for the intelligent functional model has been developed. It enables a product designer and manufacturer to find an optimal production plan at an early stage of product design. The mathematical formalization of the model is provided. A consecutive optimization scheme has been applied for selecting an optimal alternative of a product design and its production plan. The proposed model is being implemented both in industry and in university education.
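A minimal sketch of one common reading of a consecutive (sequential) optimization scheme: design/plan alternatives are filtered criterion by criterion, keeping only the near-best alternatives at each stage. The alternatives, criteria, and tolerance below are illustrative assumptions, not taken from the paper.

```python
def consecutive_optimization(alternatives, criteria, tolerance=0.05):
    """Sequentially apply ordered criteria: at each stage keep only the
    alternatives within `tolerance` of the best value for that criterion."""
    survivors = list(alternatives)
    for criterion in criteria:
        best = min(criterion(a) for a in survivors)
        survivors = [a for a in survivors
                     if criterion(a) <= best * (1 + tolerance)]
    return survivors[0]

# Illustrative alternatives: (design variant, production cost, lead time in days).
plans = [("A", 120.0, 14), ("B", 118.0, 21), ("C", 140.0, 10)]
best = consecutive_optimization(
    plans,
    criteria=[lambda p: p[1],    # first minimise cost
              lambda p: p[2]],   # then minimise lead time among the near-best
)
print(best)
```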
Journal:Informatica
Volume 15, Issue 1 (2004), pp. 23–38
Abstract
Extensive amounts of knowledge and data stored in medical databases require the development of specialized tools for storing, accessing, analyzing, and effectively using the stored knowledge and data. Intelligent methods such as neural networks, fuzzy sets, decision trees, and expert systems are, slowly but steadily, being applied in the medical field. Recently, rough set theory, a new intelligent technique, has been used for the discovery of data dependencies, data reduction, approximate set classification, and rule induction from databases.
In this paper, we present a rough set method for generating classification rules from a set of 360 observed samples of breast cancer data. The attributes are selected and normalized, and then the rough set dependency rules are generated directly from the real-valued attribute vector. The rough set reduction technique is then applied to find all reducts of the data, each containing a minimal subset of attributes associated with a class label for classification. Experimental results from applying the rough set analysis to the set of data samples are given and evaluated. In addition, the generated rules are compared to the well-known ID3 classifier algorithm. The study shows that the theory of rough sets seems to be a useful tool for inductive learning and a valuable aid for building expert systems.
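As a rough illustration of the reduction step described above, the following sketch computes a (super-)reduct with a greedy, QuickReduct-style search based on the dependency degree. The toy attribute names and rows are invented for illustration; they are not the paper's 360 breast cancer samples, and the paper's exact reduction algorithm may differ.

```python
from collections import defaultdict

def dependency(rows, attrs, decision):
    """Dependency degree gamma(attrs -> decision): fraction of rows whose
    attrs-equivalence class maps to a single decision value."""
    classes = defaultdict(set)
    for row in rows:
        classes[tuple(row[a] for a in attrs)].add(row[decision])
    consistent = sum(
        sum(1 for row in rows if tuple(row[a] for a in attrs) == key)
        for key, decisions in classes.items() if len(decisions) == 1
    )
    return consistent / len(rows)

def quick_reduct(rows, all_attrs, decision):
    """Greedily grow an attribute subset until it reaches the same
    dependency degree as the full attribute set (a super-reduct)."""
    full = dependency(rows, all_attrs, decision)
    reduct = []
    while dependency(rows, reduct, decision) < full:
        best = max((a for a in all_attrs if a not in reduct),
                   key=lambda a: dependency(rows, reduct + [a], decision))
        reduct.append(best)
    return reduct

# Illustrative toy data (not the breast cancer samples from the paper).
rows = [
    {"clump": 1, "size": 1, "shape": 2, "class": "benign"},
    {"clump": 5, "size": 4, "shape": 4, "class": "malignant"},
    {"clump": 1, "size": 1, "shape": 1, "class": "benign"},
    {"clump": 8, "size": 7, "shape": 5, "class": "malignant"},
]
print(quick_reduct(rows, ["clump", "size", "shape"], "class"))
```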
Journal:Informatica
Volume 15, Issue 1 (2004), pp. 39–44
Abstract
Neural networks built of Hodgkin–Huxley neurons were examined. These structures behaved like Liquid State Machines (LSMs). They could effectively process different input signals (e.g., the Morse alphabet) into a precisely defined output. It is also shown that logic gates can be created using Hodgkin–Huxley neurons and simple LSMs.
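For reference, a minimal forward-Euler sketch of a single Hodgkin–Huxley neuron with the classical squid-axon parameters (membrane potential measured relative to rest). This is not the paper's network or LSM setup, only the neuron model such networks are built from; the input current and spike threshold are illustrative.

```python
import math

# Classical Hodgkin-Huxley parameters: V in mV relative to rest, time in ms,
# conductances in mS/cm^2, capacitance in uF/cm^2.
C, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3
E_na, E_k, E_l = 115.0, -12.0, 10.6

# Gating-variable rate functions (note: removable singularities at V = 10
# and V = 25 mV are ignored in this sketch).
def alpha_m(v): return 0.1 * (25.0 - v) / (math.exp((25.0 - v) / 10.0) - 1.0)
def beta_m(v):  return 4.0 * math.exp(-v / 18.0)
def alpha_h(v): return 0.07 * math.exp(-v / 20.0)
def beta_h(v):  return 1.0 / (math.exp((30.0 - v) / 10.0) + 1.0)
def alpha_n(v): return 0.01 * (10.0 - v) / (math.exp((10.0 - v) / 10.0) - 1.0)
def beta_n(v):  return 0.125 * math.exp(-v / 80.0)

def simulate(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of one HH neuron driven by a constant current."""
    v, m, h, n = 0.0, 0.05, 0.6, 0.32
    trace = []
    for _ in range(int(t_max / dt)):
        i_na = g_na * m**3 * h * (v - E_na)
        i_k = g_k * n**4 * (v - E_k)
        i_l = g_l * (v - E_l)
        v += dt * (i_ext - i_na - i_k - i_l) / C
        m += dt * (alpha_m(v) * (1.0 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1.0 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n)
        trace.append(v)
    return trace

trace = simulate()
spikes = sum(1 for a, b in zip(trace, trace[1:]) if a < 50.0 <= b)
print("spike count:", spikes)
```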
Journal:Informatica
Volume 15, Issue 1 (2004), pp. 45–62
Abstract
Petri net variants are widely used as a real-time systems modeling technique. Recently, UML activity diagrams have been used for the same purpose, even though the syntax and semantics of activity diagrams have not yet been fully worked out. Nevertheless, the semantics of activity diagrams seems very similar to that of Petri nets. UML, being the industry standard as a common object-oriented modeling language, needs a well-defined semantic base for its notation. Formalization of the graphical notation enables automated processing and analysis tasks. Petri nets can provide a formal semantic framework for the UML notations, as well as the behavioral modeling and analysis strength needed by system designers. This paper describes a methodology for creating a model of a real-time application that allows the correctness of the algorithm and the fulfillment of the time constraints to be tested at the design stage using UML and Petri nets.
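A minimal sketch of the place/transition semantics that underlies such a formalization: a marking, an enabledness test, and a firing rule. The toy fork/join net loosely mirrors an activity-diagram fragment; all names are illustrative and not taken from the paper.

```python
class PetriNet:
    """Minimal place/transition net: the marking is a dict place -> token
    count, each transition has input and output places with arc weights."""
    def __init__(self, marking, transitions):
        self.marking = dict(marking)
        self.transitions = transitions  # name -> (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w

# Toy net resembling an activity-diagram fork/join (illustrative only).
net = PetriNet(
    marking={"start": 1},
    transitions={
        "fork": ({"start": 1}, {"taskA": 1, "taskB": 1}),
        "join": ({"taskA": 1, "taskB": 1}, {"done": 1}),
    },
)
net.fire("fork")
net.fire("join")
print(net.marking)  # {'start': 0, 'taskA': 0, 'taskB': 0, 'done': 1}
```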
Journal:Informatica
Volume 15, Issue 1 (2004), pp. 63–76
Abstract
Evolutionary Engineering (EE) is defined as “the art of using evolutionary algorithm approaches, such as genetic algorithms, to build complex systems”. This paper deals with a neural net based system. It analyses the ability of genetically trained neural nets to control a simulated robot arm that tries to track a moving object. In contrast to classical approaches, neural network learning is performed online, i.e., in real time. Usually, systems are built/evolved, i.e., genetically trained, separately from their utilization, because the evolution process is time-consuming; that is why a real-time approach is rarely considered. The results presented in this paper show that such an approach (Real-Time EE) is possible. These successful results are essentially due to the “continuity” of the target's trajectory. In EE terms, we express this by the Neighbourhood Hypothesis (NH) concept.
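A compact sketch of the general idea: genetic (mutation-and-selection) training of a tiny neural controller that tracks a moving target under simplified one-joint arm dynamics. The network size, dynamics, fitness definition, and GA settings are assumptions for illustration, not the paper's setup.

```python
import math, random

def net_output(weights, inputs):
    """Tiny one-hidden-layer net: 2 inputs -> 3 hidden (tanh) -> 1 output."""
    w_in, w_out = weights
    hidden = [math.tanh(sum(w * x for w, x in zip(row, inputs))) for row in w_in]
    return math.tanh(sum(w * h for w, h in zip(w_out, hidden)))

def tracking_error(weights, steps=100, dt=0.05):
    """Fitness: how closely the arm angle follows a moving (sinusoidal) target."""
    angle, error = 0.0, 0.0
    for t in range(steps):
        target = math.sin(0.1 * t)
        command = net_output(weights, [target - angle, angle])
        angle += dt * command               # crude one-joint arm dynamics
        error += (target - angle) ** 2
    return error

def random_weights():
    return ([[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)],
            [random.uniform(-1, 1) for _ in range(3)])

def mutate(weights, sigma=0.2):
    w_in, w_out = weights
    return ([[w + random.gauss(0, sigma) for w in row] for row in w_in],
            [w + random.gauss(0, sigma) for w in w_out])

population = [random_weights() for _ in range(20)]
for generation in range(50):
    population.sort(key=tracking_error)
    parents = population[:5]                       # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]
print("best error:", tracking_error(population[0]))
```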
Journal:Informatica
Volume 15, Issue 1 (2004), pp. 77–92
Abstract
The mathematical model and methods of calculation of the layout structure of comparator signal circuits with distributed parameters are presented. An algorithm for the computer formulation and solution of the transfer-function equations of comparator circuits is provided. A theoretical substantiation for optimizing the micro-layout of large-scale integration circuits of parallel subnanosecond analog-to-digital converters (ADCs) is proposed.
The signal modeling and investigation of transitional processes in comparator circuits of subnanosecond-range 6- and 8-bit ADCs with different layouts are presented. It has been determined that the quality of the transitional process at the inputs of the comparator blocks strongly depends on the signal circuit layout architecture, the matching of the wave resistances of the signal microstrip lines, and the number of branches to the comparator blocks. The designed layouts of the 6-bit subnanosecond-range ADC comparator circuit with different layout structures are presented. Modeling of equivalent circuits of the designed layouts was performed, and the modeling results are presented. The comparator circuit topology architecture presented here allows the development of gigahertz 6- and 8-bit analog-to-digital converters.
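As a hedged illustration of how transfer functions of distributed-parameter signal circuits can be computed, the sketch below cascades ABCD matrices of lossless transmission-line segments and evaluates the voltage transfer into a load. The impedances, lengths, frequency, and phase velocity are toy assumptions, not the paper's comparator-circuit model.

```python
import cmath, math

def line_abcd(z0, length, freq, v_p=2.0e8):
    """ABCD matrix of a lossless transmission-line segment (microstrip
    approximated by a characteristic impedance z0 and phase velocity v_p)."""
    bl = 2 * math.pi * freq / v_p * length
    return [[cmath.cos(bl), 1j * z0 * cmath.sin(bl)],
            [1j * cmath.sin(bl) / z0, cmath.cos(bl)]]

def cascade(m1, m2):
    """Matrix product of two ABCD two-ports connected in cascade."""
    return [[m1[0][0]*m2[0][0] + m1[0][1]*m2[1][0], m1[0][0]*m2[0][1] + m1[0][1]*m2[1][1]],
            [m1[1][0]*m2[0][0] + m1[1][1]*m2[1][0], m1[1][0]*m2[0][1] + m1[1][1]*m2[1][1]]]

def transfer_function(abcd, z_source, z_load):
    """Voltage transfer V_load / V_source for an ABCD two-port driven through
    a source impedance and terminated by a load impedance."""
    a, b = abcd[0]
    c, d = abcd[1]
    return z_load / (a * z_load + b + z_source * (c * z_load + d))

# Two cascaded 50-ohm segments feeding a comparator input modelled as 1 kOhm.
abcd = cascade(line_abcd(50.0, 0.01, 1e9), line_abcd(50.0, 0.005, 1e9))
h = transfer_function(abcd, z_source=50.0, z_load=1000.0)
print(abs(h))
```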
Journal:Informatica
Volume 15, Issue 1 (2004), pp. 93–110
Abstract
The specifics of hidden Markov model-based speech recognition are investigated. The influence of modeling simple and context-dependent phones, of using single-Gaussian and two- and three-component Gaussian mixture probability density functions for modeling feature distributions, and of incorporating a language model is discussed. Word recognition rates and model complexity criteria are used for evaluating the suitability of these modifications for practical applications. The development of a large-vocabulary continuous speech recognition system using the HTK toolkit and the WSJCAM0 English speech corpus is described. Results of experimental investigations are presented.
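As a small illustration of the emission densities involved, the sketch below evaluates the log-likelihood of a feature vector under a diagonal-covariance Gaussian mixture, the kind of per-state output density used in HMM acoustic models. The dimensionality and parameter values are toy assumptions, not taken from the described system.

```python
import math

def log_gauss_diag(x, mean, var):
    """Log density of a diagonal-covariance Gaussian."""
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def gmm_log_likelihood(x, weights, means, variances):
    """Log p(x) under a Gaussian mixture; log-sum-exp for numerical stability."""
    logs = [math.log(w) + log_gauss_diag(x, m, v)
            for w, m, v in zip(weights, means, variances)]
    top = max(logs)
    return top + math.log(sum(math.exp(l - top) for l in logs))

# Two-component mixture over a 3-dimensional feature vector (toy numbers).
x = [1.0, -0.5, 0.2]
weights = [0.6, 0.4]
means = [[0.9, -0.4, 0.0], [-1.0, 0.5, 1.0]]
variances = [[0.5, 0.5, 0.5], [1.0, 1.0, 1.0]]
print(gmm_log_likelihood(x, weights, means, variances))
```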
Journal:Informatica
Volume 15, Issue 1 (2004), pp. 111–126
Abstract
We propose a layered Soft IP Customisation (SIPC) model for specifying and implementing system-level soft IP design processes such as wrapping and customisation. The SIPC model has three layers: (1) the Specification Layer for specifying a customisation process using UML class diagrams, (2) the Generalisation Layer for representing a customisation process using metaprogramming techniques, and (3) the Generation Layer for generating the customised soft IP instances from metaspecifications. UML allows us to specify the customisation of soft IPs at a high level of abstraction. Metaprogramming allows us to manage variability in a domain, develop generic domain components, and describe the generation of customised component instances. The use of the SIPC model eases and accelerates the reuse, adaptation, and integration of pre-designed soft IPs into new hardware designs.
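A minimal sketch of the generation idea: a metaspecification (here a hypothetical parameterised Verilog counter written as a Python template) from which customised soft IP instances are generated. The template, module name, and parameters are illustrative assumptions, not the paper's SIPC toolchain.

```python
from string import Template

# Metaspecification of a generic soft IP (a parameterised counter); the
# metaprogram generates a customised HDL instance from parameter values.
COUNTER_TEMPLATE = Template("""\
module ${name} (
    input  wire clk,
    input  wire reset,
    output reg [${msb}:0] count
);
    always @(posedge clk or posedge reset)
        if (reset) count <= 0;
        else       count <= count + 1;
endmodule
""")

def generate_counter(name, width):
    """Generate one customised instance of the counter soft IP."""
    return COUNTER_TEMPLATE.substitute(name=name, msb=width - 1)

print(generate_counter("timer_counter", width=16))
```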
Journal:Informatica
Volume 15, Issue 1 (2004), pp. 127–142
Abstract
JPEG is the most popular file format for digital images. However, up to the present time, there seem to have been very few data hiding techniques that take the JPEG format into account. In this paper, we propose a novel high-capacity data hiding method based on JPEG. The proposed method employs a capacity table to estimate the number of bits that can be hidden in each DCT component so that significant distortions in the stego-image can be avoided. The capacity table is derived from the JPEG default quantization table and the Human Visual System (HVS). An adaptive least-significant-bit (LSB) substitution technique is then employed to process each quantized DCT coefficient. The proposed data hiding method enables us to control the level of embedding capacity by using a capacity factor. According to our experimental results, the new scheme achieves an impressively high embedding capacity of around 20% of the compressed image size with little noticeable degradation of image quality.
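A simplified sketch of adaptive LSB substitution in quantized DCT coefficients driven by a per-position capacity table. The coefficients and capacities below are toy values, not the paper's HVS-derived capacity table, and the sketch skips zero coefficients rather than modelling the full JPEG entropy-coding constraints.

```python
def embed_bits(coefficients, capacities, payload_bits):
    """Adaptive LSB embedding: hide up to capacities[i] payload bits in the
    low bits of each quantized DCT coefficient's magnitude; zero coefficients
    are left untouched to limit the impact on the compressed stream."""
    stego, k = [], 0
    for coeff, cap in zip(coefficients, capacities):
        if coeff == 0 or cap == 0 or k >= len(payload_bits):
            stego.append(coeff)
            continue
        bits = payload_bits[k:k + cap]
        k += len(bits)
        value = int("".join(map(str, bits)), 2)
        sign = -1 if coeff < 0 else 1
        magnitude = abs(coeff)
        magnitude = (magnitude >> cap << cap) | value   # replace the low bits
        stego.append(sign * magnitude)
    return stego, k

# Toy quantized DCT block (zig-zag order) and per-position capacities
# (illustrative values, not the paper's HVS-derived capacity table).
coeffs = [34, -12, 7, 5, 0, -3, 2, 1]
caps = [3, 3, 2, 2, 1, 1, 1, 0]
stego, used = embed_bits(coeffs, caps, [1, 0, 1, 1, 0, 0, 1, 0, 1, 1])
print(stego, "bits embedded:", used)
```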