Journal: Informatica
Volume 22, Issue 2 (2011), pp. 259–278
Abstract
We consider that the outer hair cells of the inner ear, together with the local structures of the basilar membrane, reticular lamina and tectorial membrane, form second-order primary filters (PF). Taking into account the delay in transmission of the excitation signal in the cochlea and the influence of the Reissner membrane, we design a signal filtering system in which each channel consists of its own PF together with the PF shared with neighboring channels. We assess the distribution of the central frequencies of the channels along the cochlea, the optimal number of PF constituting a channel, the natural frequencies of the channels, the damping factors, and the summation weights of the PF outputs. As an example, we present a filter bank comprising 20 Gaussian-type channels, each consisting of five PF. The proposed filtering system can be useful for designing cochlear implants based on the biological principles of signal processing in the cochlea.
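To make the channel construction concrete, the following minimal Python sketch sums the weighted outputs of five second-order resonators into one channel. The sampling rate, centre frequencies, damping factors and summation weights are illustrative assumptions, and the plain bilinear-transformed resonators merely stand in for the cochlear primary filters; none of the values come from the paper.

import numpy as np
from scipy.signal import bilinear, lfilter

def second_order_filter(x, f0, zeta, fs):
    """Second-order resonator H(s) = w0^2 / (s^2 + 2*zeta*w0*s + w0^2),
    discretized with the bilinear transform and applied to x."""
    w0 = 2 * np.pi * f0
    b, a = bilinear([w0 ** 2], [1.0, 2 * zeta * w0, w0 ** 2], fs=fs)
    return lfilter(b, a, x)

def channel_output(x, f0s, zetas, weights, fs):
    """Weighted sum of the outputs of the primary filters forming one channel."""
    return sum(w * second_order_filter(x, f0, z, fs)
               for f0, z, w in zip(f0s, zetas, weights))

# Example: a channel centred near 1 kHz built from five primary filters.
fs = 16000.0
t = np.arange(0, 0.05, 1 / fs)
x = np.sin(2 * np.pi * 1000 * t) + 0.1 * np.random.randn(t.size)
f0s = np.linspace(900, 1100, 5)                   # assumed centre frequencies (Hz)
zetas = np.full(5, 0.05)                          # assumed damping factors
weights = np.array([0.1, 0.2, 0.4, 0.2, 0.1])     # assumed summation weights
y = channel_output(x, f0s, zetas, weights, fs)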
Journal: Informatica
Volume 22, Issue 2 (2011), pp. 241–257
Abstract
The current increase in energy consumption has renewed interest in the development of alternatives to fossil fuels. In this regard, interest in solving the various control problems existing in nuclear fusion reactors such as Tokamaks has intensified. The aim of this manuscript is to show how the ASTRA code, which is used to simulate the performance of Tokamaks, can be integrated into the Matlab-Simulink tool in order to ease the development of suitable controllers for Tokamaks. As a demonstrative case study showing the feasibility and benefits of the proposed integration, a modified anti-windup PID-based controller coupled with an optimization algorithm for the loop voltage has been implemented. This integration represents an original and innovative contribution to the Tokamak control area, and it opens new possibilities for the development and application of advanced control schemes to the standardized ASTRA code.
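The sketch below shows, under simplifying assumptions, what an anti-windup PID controller with back-calculation looks like; the gains, saturation limits, back-calculation coefficient and the first-order toy plant are placeholders and are not related to the ASTRA/Tokamak model or to the paper's optimization algorithm.

class AntiWindupPID:
    """PID controller with back-calculation anti-windup (illustrative gains)."""
    def __init__(self, kp, ki, kd, u_min, u_max, kb, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.u_min, self.u_max, self.kb, self.dt = u_min, u_max, kb, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        u_unsat = self.kp * error + self.ki * self.integral + self.kd * derivative
        u = min(max(u_unsat, self.u_min), self.u_max)          # actuator saturation
        # Back-calculation: bleed the integrator when the output saturates.
        self.integral += (error + self.kb * (u - u_unsat)) * self.dt
        return u

# Toy closed loop with a first-order plant dy/dt = -y + u (purely illustrative).
pid = AntiWindupPID(kp=2.0, ki=1.0, kd=0.1, u_min=-1.0, u_max=1.0, kb=0.5, dt=0.01)
y = 0.0
for _ in range(500):
    u = pid.step(setpoint=1.0, measurement=y)
    y += 0.01 * (-y + u)                                       # Euler integration step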
Journal: Informatica
Volume 22, Issue 2 (2011), pp. 225–240
Abstract
This paper presents a hand-shape biometric system based on a novel feature extraction methodology using the morphological pattern spectrum, or pecstrum. Identification experiments were carried out using the obtained feature vectors as input to recognition systems based on neural networks and support vector machine (SVM) techniques, achieving an average identification rate of 98.5%. The verification case was analyzed with a Euclidean distance classifier, obtaining the false acceptance rate (FAR) and false rejection rate (FRR) of the system in a series of K-fold cross-validation experiments. On average, an equal error rate of 2.85% was obtained. The rotation and position invariance of the pecstrum allows the system to avoid fixing the hand position with pegs, as is the case in other reported systems. The results indicate that the pattern spectrum is a good alternative for feature extraction in biometric applications.
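As a rough illustration of the pecstrum idea, the following sketch computes a pattern spectrum of a binary image as the area removed by morphological openings with structuring elements of increasing radius. The toy square image, the disk-shaped structuring elements and the number of scales are assumptions; the paper's actual segmentation and feature pipeline may differ.

import numpy as np
from scipy import ndimage

def pattern_spectrum(binary_image, max_radius=8):
    """Normalized pattern spectrum: area removed between successive openings."""
    areas = []
    total = binary_image.sum()
    for r in range(max_radius + 1):
        y, x = np.ogrid[-r:r + 1, -r:r + 1]
        selem = x ** 2 + y ** 2 <= r ** 2            # disk-like structuring element
        opened = ndimage.binary_opening(binary_image, structure=selem)
        areas.append(opened.sum())
    spectrum = -np.diff(np.array(areas, dtype=float))
    return spectrum / total if total > 0 else spectrum

# Toy example: a filled square stands in for a segmented hand silhouette.
img = np.zeros((64, 64), dtype=bool)
img[16:48, 16:48] = True
feature_vector = pattern_spectrum(img)               # input to a classifier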
Journal: Informatica
Volume 22, Issue 2 (2011), pp. 203–224
Abstract
In this paper, we describe a model for aligning books and documents from a bilingual corpus with the goal of creating a “perfectly” aligned bilingual corpus at the word-to-word level. The presented algorithms differ from existing ones in that they take into account the presence of a human translator, whose involvement we try to minimize. We treat the human translator as an oracle who knows the exact alignments, and the goal of the system is to optimize (minimize) the use of this oracle. The effectiveness of the oracle is measured by the speed at which a “perfectly” aligned bilingual corpus can be created. By a “perfectly” aligned corpus we mean a zero-entropy corpus, because the oracle makes alignments without any probabilistic interpretation, i.e., with 100% confidence. Sentence-level alignments and word-to-word alignments, although treated separately in this paper, are integrated in a single framework. For sentence-level alignments we provide a dynamic programming algorithm which achieves low precision and recall error rates. For word-to-word alignments, an Expectation Maximization algorithm that integrates linguistic dictionaries is suggested as the main tool for the oracle to build a “perfectly” aligned bilingual corpus. We show empirically that the suggested pre-aligned corpus requires little interaction from the oracle and that a perfectly aligned corpus can be created almost at the speed of human reading. The presented algorithms are language independent, but in this paper we verify them on the English–Lithuanian language pair with two types of text: legal documents and fiction literature.
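A minimal sketch of a length-based dynamic-programming sentence aligner is given below; the character-length cost, the insertion/deletion penalty and the allowed 1-1, 1-0, 0-1, 2-1 and 1-2 moves are common choices assumed here for illustration and do not reproduce the paper's algorithm or its dictionary-augmented EM word aligner.

def cost(src_chars, tgt_chars):
    # Crude length-mismatch cost standing in for a probabilistic score.
    return abs(src_chars - tgt_chars) / (src_chars + tgt_chars + 1)

def align_sentences(src_lens, tgt_lens):
    """Dynamic programming over sentence lengths; returns the chosen moves."""
    INF = float("inf")
    n, m = len(src_lens), len(tgt_lens)
    moves = [(1, 1), (1, 0), (0, 1), (2, 1), (1, 2)]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    back = [[None] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if D[i][j] == INF:
                continue
            for di, dj in moves:
                if i + di <= n and j + dj <= m:
                    c = cost(sum(src_lens[i:i + di]), sum(tgt_lens[j:j + dj]))
                    c += 0.5 if 0 in (di, dj) else 0.0   # discourage omissions
                    if D[i][j] + c < D[i + di][j + dj]:
                        D[i + di][j + dj] = D[i][j] + c
                        back[i + di][j + dj] = (di, dj)
    path, i, j = [], n, m                                # trace back the best path
    while (i, j) != (0, 0):
        di, dj = back[i][j]
        path.append((di, dj))
        i, j = i - di, j - dj
    return list(reversed(path))

# Toy example with sentence lengths in characters.
print(align_sentences([40, 55, 12], [38, 30, 28, 11]))   # e.g. [(1, 1), (1, 2), (1, 1)]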
Journal: Informatica
Volume 22, Issue 2 (2011), pp. 189–201
Abstract
Batch cryptography has developed into two main branches: batch verification and batch identification. Batch verification is a method to determine whether a set of signatures contains invalid signatures, and batch identification is a method to find the bad signatures when a set of signatures contains invalid ones. Recently, some significant developments have appeared in this field, notably by Lee et al., Ferrara et al. and Law et al. In this paper, we address some weaknesses of Lee et al.'s earlier work and propose an identification method for an RSA-type signature scheme. Our method is more efficient than the well-known divide-and-conquer method for this signature scheme. We conclude the paper by providing a method to choose optimal divide-and-conquer verifiers.
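For reference, the well-known divide-and-conquer identification mentioned above can be sketched generically as follows; batch_verify is a hypothetical stand-in predicate, not the RSA-type scheme or the improved identification method discussed in the paper.

def find_bad_signatures(batch, batch_verify):
    """Return the items in `batch` that make batch verification fail."""
    if batch_verify(batch):
        return []                        # the whole sub-batch is valid
    if len(batch) == 1:
        return list(batch)               # an invalid signature has been isolated
    mid = len(batch) // 2
    return (find_bad_signatures(batch[:mid], batch_verify) +
            find_bad_signatures(batch[mid:], batch_verify))

# Toy example: signatures are integers and "valid" means even.
sigs = [2, 4, 7, 8, 10, 13]
print(find_bad_signatures(sigs, lambda b: all(s % 2 == 0 for s in b)))   # [7, 13]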
Journal: Informatica
Volume 22, Issue 2 (2011), pp. 177–188
Abstract
The paper presents a novel method for improving the estimates of closely spaced frequencies of a short-length signal in additive Gaussian noise, based on the Burg algorithm with extrapolation. The proposed method is implemented in two consecutive steps. In the first step, the Burg algorithm is used to estimate the parameters of the predictive filter; in the second step, extrapolation of the signal is used to improve the frequency estimates. The experimental results demonstrate that the frequency estimates of the short-length signal obtained using the Burg algorithm with extrapolation are more accurate than those obtained using the Burg algorithm without extrapolation.
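The sketch below illustrates the general two-step idea with generic tools: Burg's recursion estimates an autoregressive model, the model extrapolates the short record, and a spectrum of the extended signal is computed. The model order, extrapolation length, window and toy two-tone signal are assumptions for illustration and do not reproduce the paper's implementation or results.

import numpy as np

def burg(x, order):
    """Burg AR estimation (Collomb's formulation); returns coefficients a
    such that x[n] is predicted by -sum_k a[k] * x[n-1-k]."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    A = np.zeros(order + 1); A[0] = 1.0
    f, b = x.copy(), x.copy()
    D = 2.0 * np.dot(x, x) - x[0] ** 2 - x[-1] ** 2
    for k in range(order):
        mu = -2.0 * np.dot(f[k + 1:], b[:N - k - 1]) / D
        A_old = A.copy()
        A[:k + 2] = A_old[:k + 2] + mu * A_old[k + 1::-1]
        f_old, b_old = f.copy(), b.copy()
        f[k + 1:] = f_old[k + 1:] + mu * b_old[:N - k - 1]
        b[:N - k - 1] = b_old[:N - k - 1] + mu * f_old[k + 1:]
        D = (1.0 - mu ** 2) * D - f[k + 1] ** 2 - b[N - k - 2] ** 2
    return A[1:]

def ar_extrapolate(x, a, n_extra):
    """Extend the signal by n_extra samples with the estimated AR model."""
    y = list(np.asarray(x, dtype=float))
    p = len(a)
    for _ in range(n_extra):
        y.append(-np.dot(a, y[-1:-p - 1:-1]))
    return np.array(y)

# Toy example: two closely spaced tones in noise, short record of 64 samples.
n = np.arange(64)
x = (np.sin(2 * np.pi * 0.20 * n) + np.sin(2 * np.pi * 0.22 * n)
     + 0.05 * np.random.randn(n.size))
a = burg(x, order=12)                              # assumed model order
x_ext = ar_extrapolate(x, a, n_extra=192)          # record extended fourfold
amp = np.abs(np.fft.rfft(x_ext * np.hanning(x_ext.size)))
freqs = np.fft.rfftfreq(x_ext.size)                # peaks expected near 0.20 and 0.22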
Journal: Informatica
Volume 22, Issue 2 (2011), pp. 165–176
Abstract
The instrumental variable (IV) method is one of the most renowned methods for parameter estimation. Its major advantage is that it is applicable to open-loop as well as closed-loop systems. The main difficulty in closed-loop identification is the correlation between the disturbances and the control signal induced by the loop. In order to overcome this problem, an additional excitation signal is introduced. Non-recursive modifications of the instrumental variable method for closed-loop system identification, based on a generalized IV method, have been developed (Atanasov and Ichtev, 2009; Gilson and Van den Hof, 2001; Gilson and Van den Hof, 2003). In this paper, recursive algorithms for these modifications are proposed and investigated. A simulation is carried out in order to illustrate the obtained results.
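A minimal sketch of a recursive instrumental-variable update (a recursive least-squares-style recursion in which an instrument vector replaces the regressor in the gain) is shown below; the toy first-order system, the feedback law and the instruments built from the external excitation are illustrative assumptions and do not reproduce the paper's generalized closed-loop IV modifications.

import numpy as np

class RecursiveIV:
    """Recursive IV recursion: like recursive least squares, but the gain is
    built from an instrument vector zeta instead of the regressor phi."""
    def __init__(self, n_params, p0=1000.0):
        self.theta = np.zeros(n_params)
        self.P = p0 * np.eye(n_params)

    def update(self, phi, zeta, y):
        Pz = self.P @ zeta
        denom = 1.0 + phi @ Pz
        gain = Pz / denom
        self.theta = self.theta + gain * (y - phi @ self.theta)
        self.P = self.P - np.outer(Pz, phi @ self.P) / denom
        return self.theta

# Toy example: y(t+1) = a*y(t) + b*u(t) + e(t+1) under a simple feedback law,
# with instruments built from the external excitation r.
rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
est = RecursiveIV(n_params=2)
y, r_prev = 0.0, 0.0
for t in range(2000):
    r = rng.standard_normal()              # external excitation signal
    u = r - 0.5 * y                        # feedback plus excitation
    phi = np.array([y, u])                 # regressor
    zeta = np.array([r_prev, r])           # instruments, uncorrelated with the noise
    y_next = a_true * y + b_true * u + 0.05 * rng.standard_normal()
    est.update(phi, zeta, y_next)
    y, r_prev = y_next, r
print(est.theta)                           # estimates should approach [0.8, 0.5]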
Journal: Informatica
Volume 22, Issue 1 (2011), pp. 149–164
Abstract
The contribution focuses on change point detection in a one-dimensional stochastic process by sparse parameter estimation from an overparametrized model. A stochastic process with a change in the mean is estimated using a dictionary consisting of Heaviside functions. The basis pursuit algorithm is used to obtain sparse parameter estimates. This method of change point detection is compared with several standard statistical methods by simulations.
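The following sketch illustrates the general idea with an l1-penalized least-squares solver (Lasso) standing in for basis pursuit: a mean-shift signal is regressed on an overcomplete dictionary of Heaviside step functions, and the nonzero coefficients indicate candidate change points. The simulated signal and the penalty value are assumptions for illustration.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n = 200
signal = np.concatenate([np.zeros(100), 1.5 * np.ones(100)]) + 0.3 * rng.standard_normal(n)

# Dictionary: column k is a Heaviside function that steps up at time k.
H = np.tril(np.ones((n, n)))

model = Lasso(alpha=0.05, fit_intercept=True, max_iter=10000)
model.fit(H, signal)
change_points = np.nonzero(np.abs(model.coef_) > 1e-3)[0]
print(change_points)                       # indices near the true change at t = 100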
Journal: Informatica
Volume 22, Issue 1 (2011), pp. 135–148
Abstract
Detecting communities in real-world networks is an important problem for data analysis in science and engineering. By clustering nodes intelligently, a recursive algorithm is designed to detect communities. Since relabeling the nodes does not alter the topology of the network, the problem of community detection corresponds to finding a good labeling of nodes so that the adjacency matrix forms blocks. By introducing a fictitious interaction between nodes, the relabeling problem becomes one of energy minimization: the total energy of the network is defined through interactions between the labels of nodes, so that clustering nodes belonging to the same community decreases the total energy. A greedy method is used to compute the minimum energy. The method efficiently detects communities in artificial as well as real-world networks. The result is illustrated as a tree showing the hierarchical structure of communities on the basis of sub-matrix density. Applications of the method to weighted and directed networks are discussed.
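A minimal sketch of greedy label-based energy minimization is given below; the Potts-like energy (reward for equal labels on edges, penalty for equal labels on non-edges), the parameter gamma and the toy two-clique network are illustrative assumptions, and the recursive, hierarchical part of the method is not reproduced.

import numpy as np

def greedy_communities(adj, gamma=0.1, n_sweeps=20, seed=0):
    """Greedy energy minimization: each node moves to the label that lowers
    its local energy; adj is a symmetric 0/1 adjacency matrix."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    labels = np.arange(n)                  # start with every node in its own community
    for _ in range(n_sweeps):
        changed = False
        for i in rng.permutation(n):
            best_label, best_energy = labels[i], np.inf
            for L in np.unique(labels):
                same = (labels == L)
                same[i] = False
                # Energy: -(# neighbours with label L) + gamma * (# non-neighbours with label L)
                e = -np.sum(adj[i] * same) + gamma * np.sum((1 - adj[i]) * same)
                if e < best_energy:
                    best_label, best_energy = L, e
            if best_label != labels[i]:
                labels[i] = best_label
                changed = True
        if not changed:
            break
    return labels

# Toy example: two 4-node cliques joined by a single edge.
A = np.zeros((8, 8), dtype=int)
for block in (range(0, 4), range(4, 8)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1
A[3, 4] = A[4, 3] = 1
print(greedy_communities(A))               # one label per clique is expected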
Journal: Informatica
Volume 22, Issue 1 (2011), pp. 115–134
Abstract
In this paper, the quality of quantization and visualization of vectors obtained by vector quantization methods (the self-organizing map and neural gas) is investigated. Multidimensional scaling is used for the visualization of multidimensional vectors. The quality of quantization is measured by the quantization error. Two numerical measures of proximity preservation (König's topology preservation measure and Spearman's correlation coefficient) are applied to estimate the quality of visualization. Results of visualization (mapping images) are also presented.
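The two kinds of quality measures mentioned above can be sketched as follows: a quantization error of a codebook with respect to the data, and Spearman's correlation between pairwise distances before and after projection. The random data, codebook and two-dimensional layout are placeholders; no SOM, neural gas or MDS training is shown, and König's measure is omitted.

import numpy as np
from scipy.spatial.distance import cdist, pdist
from scipy.stats import spearmanr

def quantization_error(data, codebook):
    """Mean distance from each data vector to its nearest codebook vector."""
    return cdist(data, codebook).min(axis=1).mean()

def distance_preservation(codebook, embedding):
    """Spearman correlation between pairwise distances in the original space
    and in the low-dimensional embedding."""
    rho, _ = spearmanr(pdist(codebook), pdist(embedding))
    return rho

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 10))                         # multidimensional vectors
codebook = data[rng.choice(500, size=25, replace=False)]  # stand-in for SOM / neural gas units
embedding = rng.normal(size=(25, 2))                      # stand-in for an MDS mapping
print(quantization_error(data, codebook))
print(distance_preservation(codebook, embedding))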