Journal: Informatica
Volume 20, Issue 2 (2009), pp. 203–216
Abstract
One of the best-known applications of Discrete Optimization is scheduling; likewise, one of the best-known applications of Continuous Nonlinear Optimization is the control of dynamic systems. In this paper, we combine both views, solving scheduling problems as dynamic systems modeled as discrete-time nonlinear optimal control problems with continuous state and control variables subject to upper and lower bounds. Complementarity constraints are used to represent scheduling decisions. One example we discuss in detail is crude oil scheduling in ports, for which numerical results are presented.
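To make the role of the complementarity constraints concrete, a hedged sketch of such a formulation (generic notation, not the paper's crude-oil model) is the following discrete-time optimal control problem, in which two continuous controls, say two transfer flows, may not be active at the same time step:

```latex
% Illustrative sketch only (generic notation, not the paper's model):
% the scheduling decision "operations 1 and 2 may not run simultaneously"
% is encoded as a complementarity constraint on continuous controls.
\begin{aligned}
  \min_{x,\,u}\;\; & \sum_{k=0}^{N-1} \ell(x_k, u_k) \\
  \text{s.t.}\;\;  & x_{k+1} = f(x_k, u_k), \quad k = 0,\dots,N-1,
                     && \text{(discrete-time dynamics)} \\
                   & x^{\min} \le x_k \le x^{\max}, \quad
                     u^{\min} \le u_k \le u^{\max},
                     && \text{(state and control bounds)} \\
                   & 0 \le u^{1}_k \;\perp\; u^{2}_k \ge 0
                     && \text{(i.e. } u^{1}_k\, u^{2}_k = 0\text{).}
\end{aligned}
```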
Journal: Informatica
Volume 20, Issue 2 (2009), pp. 187–202
Abstract
In this paper, a method for the study of cluster stability is proposed. We draw pairs of samples from the data according to two sampling distributions. The first distribution corresponds to the high-density zones of the data distribution and is thus associated with the cluster cores. The second one, associated with the cluster margins, is related to the low-density zones. The samples are clustered and the two obtained partitions are compared. The partitions are considered consistent if the obtained clusters are similar. The resemblance is measured by the total number of edges in the clusters' minimal spanning trees that connect points from different samples, using the Friedman and Rafsky two-sample test statistic. Under the homogeneity hypothesis, this statistic is normally distributed. Thus, it can be expected that the true number of clusters corresponds to the empirical distribution of the statistic that is closest to normal. Numerical experiments demonstrate the ability of the approach to detect the true number of clusters.
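A minimal sketch of the core computation follows (function names and parameters are illustrative, not the authors' implementation; in the paper the count is accumulated over the minimal spanning trees of the individual clusters): pool the two samples, build the minimal spanning tree, and count the edges joining points from different samples, which under homogeneity is asymptotically normal.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def cross_sample_edge_count(sample_a, sample_b):
    """Friedman-Rafsky style statistic: number of MST edges joining
    points drawn from different samples (illustrative sketch)."""
    pooled = np.vstack([sample_a, sample_b])
    labels = np.r_[np.zeros(len(sample_a)), np.ones(len(sample_b))]
    # Minimal spanning tree of the pooled sample (dense distance matrix).
    dist = squareform(pdist(pooled))
    mst = minimum_spanning_tree(dist).tocoo()
    # Count edges whose endpoints come from different samples.
    return int(np.sum(labels[mst.row] != labels[mst.col]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.normal(0.0, 1.0, size=(100, 2))   # "core" sample
    b = rng.normal(0.0, 1.0, size=(100, 2))   # "margin" sample
    print("cross-sample MST edges:", cross_sample_edge_count(a, b))
```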
Journal: Informatica
Volume 20, Issue 2 (2009), pp. 173–186
Abstract
In this paper, we consider the problem of semi-supervised binary classification by Support Vector Machines (SVM). This problem is explored as an unconstrained, non-smooth optimization task when part of the available data is unlabelled. We apply non-smooth optimization techniques to classification, where the objective function considered is non-convex and non-differentiable and therefore difficult to minimize. We explore and compare the properties of Simulated Annealing and of Simultaneous Perturbation Stochastic Approximation (SPSA) algorithms (SPSA with the Lipschitz Perturbation Operator, SPSA with the Uniform Perturbation Operator, and Standard Finite Difference Approximation) for semi-supervised SVM classification. Numerical results are given, obtained by running the proposed methods on several standard test problems drawn from the binary classification literature. The performance of the classifiers was evaluated by analyzing Receiver Operating Characteristic (ROC) curves.
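For context, SPSA needs only two evaluations of the (possibly non-smooth) objective per iteration. The sketch below is a generic SPSA loop with a Bernoulli (±1) perturbation; it is not the specific Lipschitz or uniform perturbation operators studied in the paper, and the gain sequences are illustrative defaults.

```python
import numpy as np

def spsa_minimize(loss, theta0, n_iter=200, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    """Generic SPSA: simultaneous-perturbation gradient estimate from two
    loss evaluations per iteration (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(n_iter):
        ak = a / (k + 1) ** alpha                         # step-size sequence
        ck = c / (k + 1) ** gamma                         # perturbation-size sequence
        delta = rng.choice([-1.0, 1.0], size=theta.shape) # Bernoulli +/-1 perturbation
        g_hat = (loss(theta + ck * delta) - loss(theta - ck * delta)) / (2.0 * ck * delta)
        theta = theta - ak * g_hat
    return theta

if __name__ == "__main__":
    # Toy non-smooth objective (sum of absolute deviations).
    print(spsa_minimize(lambda t: np.sum(np.abs(t - 3.0)), np.zeros(5)))
```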
Journal: Informatica
Volume 20, Issue 2 (2009), pp. 165–172
Abstract
Recent changes in the intersection of the fields of intelligent systems optimization and statistical learning are surveyed. These changes bring new theoretical and computational challenges to existing research areas, ranging from web page mining to computer vision, pattern recognition, financial mathematics, bioinformatics, and many others.
Journal: Informatica
Volume 20, Issue 1 (2009), pp. 151–163
Abstract
To restore underexposed images, an illumination compensation inpainting model that employs joint-diffused partial differential equations (PDEs) is proposed. First, the model compensates for the illumination effect in the multi-scaled underexposed images. Second, the information in the fused compensated image is restored by PDEs that diffuse the geometric properties and gray-level information into the target region simultaneously. Experimental results demonstrate that the model can properly restore scratches while compensating for the illumination effect in underexposed images, and that the joint-diffused PDEs it employs lead to better performance than conventional PDE inpainting models.
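As a much simplified, hedged sketch of PDE-based inpainting (plain isotropic heat diffusion only, not the joint-diffused model of the paper), gray values can be propagated into a masked target region by iterating a discrete Laplacian update on the unknown pixels:

```python
import numpy as np

def diffusion_inpaint(image, mask, n_iter=500, dt=0.2):
    """Fill pixels where mask is True by isotropic diffusion from the
    surrounding known pixels (simplified illustrative sketch)."""
    u = image.astype(float).copy()
    for _ in range(n_iter):
        # 5-point discrete Laplacian.
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u[mask] += dt * lap[mask]          # update only the unknown (target) region
    return u

if __name__ == "__main__":
    img = np.tile(np.linspace(0, 255, 64), (64, 1))   # synthetic gradient image
    scratch = np.zeros(img.shape, dtype=bool)
    scratch[30:34, 10:54] = True                      # simulated scratch
    img[scratch] = 0.0
    restored = diffusion_inpaint(img, scratch)
```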
Journal: Informatica
Volume 20, Issue 1 (2009), pp. 139–150
Abstract
Secure communication between set-top boxes (STBs) and smart cards is directly related to the benefit of service providers and the legal rights of users, and key exchange is an essential part of secure communication. In 2004, Jiang et al. proposed a key exchange protocol for STBs and smart cards based upon Schnorr's digital signature protocol and a one-way hash function. This paper, however, demonstrates that Jiang et al.'s protocol is vulnerable to an impersonation attack and does not provide perfect forward secrecy. In addition, in order to address these problems, we present a new secure key exchange protocol based on a one-way hash function and the Diffie–Hellman key exchange algorithm.
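For reference, the two basic ingredients mentioned above, an ephemeral Diffie–Hellman exchange and a one-way hash applied to the shared secret, can be sketched as follows. This is not the protocol specification from the paper: the group parameters are toy values, and a real deployment would use standardized parameters plus authentication of the exchanged values.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters (illustrative only; far too weak for real use).
P = 2 ** 127 - 1   # a Mersenne prime used here purely for demonstration
G = 3

def dh_keypair():
    """Generate an ephemeral private/public Diffie-Hellman key pair."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def session_key(own_priv, peer_pub):
    """Derive a session key by hashing the shared Diffie-Hellman secret."""
    shared = pow(peer_pub, own_priv, P)
    return hashlib.sha256(shared.to_bytes((shared.bit_length() + 7) // 8, "big")).hexdigest()

if __name__ == "__main__":
    stb_priv, stb_pub = dh_keypair()        # set-top box side
    card_priv, card_pub = dh_keypair()      # smart-card side
    # Both sides derive the same session key from each other's public values.
    assert session_key(stb_priv, card_pub) == session_key(card_priv, stb_pub)
```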
Journal: Informatica
Volume 20, Issue 1 (2009), pp. 115–138
Abstract
The paper presents a novel method for the extraction of facial features based on the Gabor-wavelet representation of face images and the kernel partial-least-squares discrimination (KPLSD) algorithm. The proposed feature-extraction method, called the Gabor-based kernel partial-least-squares discrimination (GKPLSD), is performed in two consecutive steps. In the first step a set of forty Gabor wavelets is used to extract discriminative and robust facial features, while in the second step the kernel partial-least-squares discrimination technique is used to reduce the dimensionality of the Gabor feature vector and to further enhance its discriminatory power. For optimal performance, the KPLSD-based transformation is implemented using the recently proposed fractional-power-polynomial models. The experimental results based on the XM2VTS and ORL databases show that the GKPLSD approach outperforms feature-extraction methods such as principal component analysis (PCA), linear discriminant analysis (LDA), kernel principal component analysis (KPCA) or generalized discriminant analysis (GDA) as well as combinations of these methods with Gabor representations of the face images. Furthermore, as the KPLSD algorithm is derived from the kernel partial-least-squares regression (KPLSR) model it does not suffer from the small-sample-size problem, which is regularly encountered in the field of face recognition.
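A hedged sketch of the first step is given below; the filter-bank parametrization (5 scales, 8 orientations, the usual Gabor wavelet form) is the common choice for forty Gabor wavelets, not necessarily the exact settings used by the authors. The image is convolved with each kernel and the down-sampled magnitude responses are concatenated into the feature vector that would then enter the KPLSD stage.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(scale, orientation, size=31, kmax=np.pi / 2,
                 f=np.sqrt(2), sigma=2 * np.pi):
    """Complex Gabor wavelet at a given scale and orientation (common parametrization)."""
    k = (kmax / f ** scale) * np.exp(1j * orientation * np.pi / 8)
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    gauss = np.exp(-(abs(k) ** 2) * (x ** 2 + y ** 2) / (2 * sigma ** 2))
    wave = np.exp(1j * (k.real * x + k.imag * y)) - np.exp(-sigma ** 2 / 2)
    return (abs(k) ** 2 / sigma ** 2) * gauss * wave

def gabor_features(image, scales=5, orientations=8, step=8):
    """Concatenate down-sampled magnitude responses of a 5x8 Gabor filter bank."""
    feats = []
    for s in range(scales):
        for o in range(orientations):
            resp = fftconvolve(image, gabor_kernel(s, o), mode="same")
            feats.append(np.abs(resp)[::step, ::step].ravel())   # magnitude, down-sampled
    return np.concatenate(feats)

if __name__ == "__main__":
    face = np.random.rand(64, 64)          # stand-in for a normalized face image
    print(gabor_features(face).shape)      # 40 responses concatenated
```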
Journal: Informatica
Volume 20, Issue 1 (2009), pp. 99–114
Abstract
This paper has two main contributions. The first is the optimization of a lossy compression coder realized as a companding quantizer with the optimal compression law. This optimization is achieved by optimizing the maximal amplitude of the optimal companding quantizer for a Laplacian source. An approximate closed-form expression for the optimal maximal amplitude is found. Although this expression is very simple and suitable for practical implementation, it satisfies the optimality criterion of the Lloyd–Max quantizer (for R ≥ 6 bits/sample). In the second part of the paper a novel, simple lossless compression method is presented. This method is much simpler than the Huffman method, yet it gives better results. Finally, we join the optimal companding quantizer and the lossless coding method together into one generalized compression method. This method is applied to a concrete still image, and good results are obtained. Besides still images, this method could also be used for the compression of speech and biomedical signals.
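A hedged sketch of the lossy stage is given below: compress with the standard optimal compressor law for a Laplacian density, quantize uniformly, then expand. The maximal amplitude x_max is fixed here rather than optimized in closed form as in the paper, and the lossless stage is omitted.

```python
import numpy as np

def compand_quantize(x, n_levels=64, x_max=6.0, sigma=1.0):
    """Companding quantizer for a Laplacian source of standard deviation sigma:
    compress with the optimal compressor law, quantize uniformly, expand.
    (Illustrative sketch; the paper additionally optimizes x_max.)"""
    b = np.sqrt(2.0) / (3.0 * sigma)                    # constant in the compressor law
    scale = 1.0 - np.exp(-b * x_max)
    # Optimal compressor c(x), mapping [-x_max, x_max] onto itself.
    c = np.sign(x) * x_max * (1.0 - np.exp(-b * np.minimum(np.abs(x), x_max))) / scale
    # Uniform quantization of the compressed signal.
    step = 2.0 * x_max / n_levels
    c_q = (np.floor(c / step) + 0.5) * step
    c_q = np.clip(c_q, -x_max + step / 2, x_max - step / 2)
    # Expander (inverse compressor) maps the quantized values back.
    return np.sign(c_q) * (-1.0 / b) * np.log(1.0 - np.abs(c_q) / x_max * scale)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.laplace(scale=1.0 / np.sqrt(2.0), size=100_000)   # unit-variance Laplacian
    y = compand_quantize(x)
    print("SQNR [dB]:", 10 * np.log10(np.var(x) / np.mean((x - y) ** 2)))
```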
Journal: Informatica
Volume 20, Issue 1 (2009), pp. 79–98
Abstract
The objective of this paper is the description, justification, and web-based implementation of polynomial-time algorithms for equilibrium search in Quadratic Bimatrix Games (QBG). An algorithm is proposed that combines exact and heuristic parts. The exact part has the Irrelevant Fraud (IF) component for cases when an equilibrium exists with no pure strategies. The Direct Search (DS) component finds a solution if an equilibrium exists in pure strategies. The heuristic Quadratic Strategy Elimination (QSE) part applies IF and DS to reduced matrices obtained by sequential elimination of strategies that lead to non-positive IF solutions. Finally, penalties needed to prevent unauthorized deals are calculated based on the Nash axioms of two-person bargaining theory. In the numerical experiments QSE provided the correct solution in all examples. The novel results include necessary and sufficient conditions under which the QBG problem is solved by the IF algorithm, the development of software, and the experimental testing of large-scale QBG problems up to n = 800. The website http://pilis.if.ktu.lt/~jmockus includes this and accompanying optimization models.
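To illustrate only the simplest, Direct Search-like part of such an approach (this generic check is not the authors' IF or QSE algorithm), a pure-strategy equilibrium of a bimatrix game (A, B) can be found by verifying that neither player gains by deviating unilaterally:

```python
import numpy as np

def pure_strategy_equilibria(A, B):
    """Return all pure-strategy Nash equilibria (i, j) of the bimatrix game
    where player 1 gets A[i, j] and player 2 gets B[i, j] (illustrative sketch)."""
    A, B = np.asarray(A), np.asarray(B)
    equilibria = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            best_row = A[i, j] >= A[:, j].max()   # player 1 cannot improve by changing row
            best_col = B[i, j] >= B[i, :].max()   # player 2 cannot improve by changing column
            if best_row and best_col:
                equilibria.append((i, j))
    return equilibria

if __name__ == "__main__":
    # Prisoner's dilemma payoffs (row player A, column player B).
    A = np.array([[-1, -3], [0, -2]])
    B = np.array([[-1, 0], [-3, -2]])
    print(pure_strategy_equilibria(A, B))   # [(1, 1)]: mutual defection
```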
Journal: Informatica
Volume 20, Issue 1 (2009), pp. 51–78
Abstract
This paper presents a novel data modulation scheme, PCCD-OFDM-ASK: phase-continuous, context-dependent orthogonal frequency division multiplexing amplitude shift keying. The proposed modulation is successfully applied in a mobile payment system. It is used to transmit digital data over the speech channel of the GSM mobile communication system, as well as over CDMA. The key points of the proposed modulation scheme are: precise signal synchronization between the modulator and demodulator, operation independent of the signal energy, on-line adaptation to the frequency characteristics of the transmission channel, and a controlled frequency bandwidth that enables non-overlapping full-duplex communication over the GSM voice channel.
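For orientation only, a bare-bones multi-carrier ASK modulator/demodulator (an IFFT over amplitude-keyed subcarriers) is sketched below with arbitrarily chosen block sizes. The actual PCCD-OFDM-ASK scheme additionally enforces phase continuity, context-dependent coding, synchronization, and channel adaptation, none of which are shown.

```python
import numpy as np

N_SUB = 16          # number of data subcarriers (illustrative choice)
FFT_LEN = 64        # IFFT length; the remaining bins stay empty to limit bandwidth

def ofdm_ask_modulate(bits):
    """Map one block of bits to subcarrier amplitudes (on-off ASK) and IFFT to a time-domain symbol."""
    spectrum = np.zeros(FFT_LEN, dtype=complex)
    spectrum[1:N_SUB + 1] = bits            # bit 1 -> carrier on, bit 0 -> carrier off
    return np.fft.ifft(spectrum).real

def ofdm_ask_demodulate(symbol, threshold=None):
    """Recover bits by comparing subcarrier magnitudes against a threshold."""
    spectrum = np.abs(np.fft.fft(symbol))[1:N_SUB + 1]
    if threshold is None:
        threshold = spectrum.max() / 2.0    # crude signal-energy-independent threshold
    return (spectrum > threshold).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, N_SUB)
    symbol = ofdm_ask_modulate(bits)
    assert np.array_equal(ofdm_ask_demodulate(symbol), bits)
```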