Journal: Informatica
Volume 18, Issue 3 (2007), pp. 375–394
Abstract
The notion of concurrent signatures was introduced by Chen, Kudla and Paterson in their seminal paper at Eurocrypt 2004. In a concurrent signature scheme, two entities can produce two signatures that are not binding until an extra piece of information (namely, the keystone) is released by one of the parties. Upon release of the keystone, both signatures become binding to their true signers concurrently. At ICICS 2005, two identity-based perfect concurrent signature schemes were proposed by Chow and Susilo. In this paper, we show that these two schemes are unfair, in that the initial signer can cheat the matching signer. We present a formal definition of ID-based concurrent signatures that redresses the flaw in Chow et al.'s definition, and then propose two simple but significant improvements that thwart our attacks.
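The keystone mechanism can be illustrated in a few lines. The sketch below is a minimal, hypothetical Python illustration: it shows only how releasing a keystone makes previously ambiguous signatures binding, with `keystone_fix`, `ambiguous_sign`, and `verify_binding` as invented placeholder names; the actual Chen–Kudla–Paterson and Chow–Susilo constructions use ring-signature-style ambiguity and bilinear pairings, which are not reproduced here.

```python
import hashlib
import os

def keystone_fix(keystone: bytes) -> bytes:
    """Derive the public 'keystone fix' from the secret keystone
    via a one-way hash (KGEN, in concurrent-signature terms)."""
    return hashlib.sha256(keystone).digest()

# The initial signer A picks a secret keystone and publishes only its fix.
keystone = os.urandom(32)
fix = keystone_fix(keystone)

# Both parties produce signatures tied to the fix. In a real scheme these
# are 'ambiguous': either party could plausibly have produced either one.
def ambiguous_sign(message: bytes, fix: bytes, signer_id: str) -> dict:
    return {"msg": message, "fix": fix, "by": signer_id}

sig_a = ambiguous_sign(b"A's commitment", fix, "A")
sig_b = ambiguous_sign(b"B's commitment", fix, "B")

# Releasing the keystone makes BOTH signatures binding concurrently:
def verify_binding(sig: dict, keystone: bytes) -> bool:
    return keystone_fix(keystone) == sig["fix"]

assert verify_binding(sig_a, keystone) and verify_binding(sig_b, keystone)
```

The fairness question the paper raises is precisely about who controls the keystone: if the initial signer can choose or withhold it adversarially, the matching signer can be cheated.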
Journal: Informatica
Volume 18, Issue 3 (2007), pp. 363–374
Abstract
Internationalization of compilers and localization of programming languages are not yet common; however, given the rapid progress of software and programming technologies, they are inevitable. New versions of widely used programming systems already allow identifiers written in the native language and partially support the Unicode standard, but they still have many internationalization deficiencies.
The paper analyses the main elements of compiler internationalization and their localization possibilities. Recommendations on how compilers should be internationalized are given on the basis of contemporary standards, existing software internationalization practices, and current tendencies. The paper argues for the importance of localizing the lexical elements of programming languages, and presents solutions to the portability problems of programs developed with a localized compiler, as well as to the problems of such a compiler's compatibility with other compilers.
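As a concrete illustration of the lexical-localization and portability issues discussed above, here is a minimal, hypothetical Python sketch of a tokenizer front end that accepts Unicode identifiers and maps localized keywords to canonical ones. The `LOCALIZED_KEYWORDS` table and its Lithuanian entries are assumed for illustration and are not taken from the paper.

```python
import unicodedata

# Hypothetical table mapping localized (here, Lithuanian) keywords to their
# canonical forms, so that source code written for a localized compiler
# remains portable to compilers that know only the canonical lexicon.
LOCALIZED_KEYWORDS = {
    "jei": "if",        # assumed translations, for illustration only
    "kitaip": "else",
    "kol": "while",
}

def canonical_token(tok: str) -> str:
    """NFC-normalize a token (so visually identical Unicode identifiers
    compare equal) and map any localized keyword to its canonical form."""
    tok = unicodedata.normalize("NFC", tok)
    return LOCALIZED_KEYWORDS.get(tok, tok)

def is_identifier(tok: str) -> bool:
    # Python's own rule follows Unicode UAX #31, the kind of standard such
    # internationalization recommendations build on.
    return tok.isidentifier()

assert canonical_token("jei") == "if"
assert is_identifier("ąžuolas")   # a non-ASCII identifier is accepted
```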
Journal: Informatica
Volume 18, Issue 3 (2007), pp. 343–362
Abstract
One of the tasks of data mining is classification, which provides a mapping from attributes (observations) to pre-specified classes. Classification models are built from the underlying data. In principle, models built with more data yield better results. However, the relationship between the available data and the performance is not well understood, except that the accuracy of a classification model shows diminishing improvements as a function of data size. In this paper, we present an approach for an early assessment of the extracted knowledge (classification models) in terms of performance (accuracy), based on the amount of data used. The assessment is based on observing the performance on smaller sample sizes. The solution is formally defined, and experiments demonstrate the correctness and utility of the approach.
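One common way to realize such an early assessment (a standard technique, not necessarily the authors' exact model) is to fit a saturating learning curve to accuracies measured on small subsamples and extrapolate it. The sketch below assumes an inverse power law acc(n) = a − b·n^(−c) and hypothetical measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(n, a, b, c):
    # Inverse power law: accuracy saturates at 'a' with diminishing
    # improvements as data size n grows, as the abstract observes.
    return a - b * np.power(n, -c)

# Hypothetical accuracies measured on small subsamples of the data set.
sizes = np.array([100, 200, 400, 800, 1600], dtype=float)
accs = np.array([0.71, 0.76, 0.80, 0.83, 0.85])

(a, b, c), _ = curve_fit(learning_curve, sizes, accs,
                         p0=[0.9, 1.0, 0.5], maxfev=10000)
print(f"estimated accuracy ceiling: {a:.3f}")
print(f"predicted accuracy at n=50000: {learning_curve(50000.0, a, b, c):.3f}")
```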
Journal: Informatica
Volume 18, Issue 3 (2007), pp. 325–342
Abstract
This paper is concerned with an employee scheduling problem involving multiple shifts and work centers, where employees belong to a hierarchy of categories having downward substitutability. An employee of a higher category may perform the duties of an employee of a lower category, but not vice versa. However, a higher-category employee receives a higher compensation than a lower-category employee. For a given work center, the demand for each category during a given shift is fixed for the weekdays, and may differ from that on weekends. Two objectives need to be achieved: the first is to find a minimum-cost workforce mix of categories of employees that satisfies the specified demand requirements, and the second is to assign the selected employees to shifts and work centers, taking into consideration their preferences for shifts, work centers, and off-days. A mixed-integer programming model is initially developed for the problem, based on which a specialized scheduling heuristic is subsequently developed. The reported computational results reveal that the proposed heuristic determines solutions proven to lie within 92–99% of optimality for a number of realistic test problems.
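The first-stage workforce-mix decision can be sketched as a small mixed-integer program. The model below is a minimal illustration (using the open-source PuLP library), not the paper's formulation: it ignores shifts, work centers and preferences, uses invented cost and demand numbers, and encodes downward substitutability with aggregate prefix constraints.

```python
# pip install pulp
from pulp import LpMinimize, LpProblem, LpStatus, LpVariable, lpSum

# Hypothetical data: categories ordered from highest (0) to lowest (2);
# a higher category can cover a lower one's demand, and costs more.
cost = [300, 200, 120]
demand = [2, 3, 5]
K = len(cost)

prob = LpProblem("workforce_mix", LpMinimize)
x = [LpVariable(f"x_{c}", lowBound=0, cat="Integer") for c in range(K)]

prob += lpSum(cost[c] * x[c] for c in range(K))   # minimize compensation cost

# Downward substitutability: demand at categories 0..k can only be met by
# employees of categories 0..k, so each prefix must be covered in aggregate.
for k in range(K):
    prob += lpSum(x[c] for c in range(k + 1)) >= sum(demand[: k + 1])

prob.solve()
print(LpStatus[prob.status], [int(v.value()) for v in x])
```

With these numbers the cheapest mix simply staffs each category at its own demand; higher-category substitution becomes attractive only when relative costs or demand patterns change.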
Journal: Informatica
Volume 18, Issue 2 (2007), pp. 305–320
Abstract
This paper considers Lur'e-type descriptor systems (LDS). The concept of strongly absolute stability is defined for LDS; this notion generalizes absolute stability for Lur'e-type standard state-space systems (LSS). A reduced-order LSS is obtained by a standard coordinate transformation, and it is shown that the strongly absolute stability of the LDS is equivalent to the absolute stability of the reduced-order LSS. Using a generalized Lyapunov function, we derive an LMI-based strongly absolute stability criterion. Furthermore, we present a frequency-domain interpretation of the criterion, which shows that it generalizes the classical circle criterion. Finally, numerical examples are given to illustrate the effectiveness of the obtained results.
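For orientation, the setting can be written out as follows; the notation below is the standard Lur'e form (assumed here, not copied from the paper), together with the classical circle criterion that the paper's frequency-domain result generalizes.

```latex
% Lur'e descriptor system with a sector-bounded nonlinearity (standard form):
\begin{align}
  E\dot{x} &= A x + B w, \qquad z = C x, \qquad w = -\varphi(t, z),\\
  &\varphi(t,z)^{\top}\bigl(\varphi(t,z) - K z\bigr) \le 0
  \quad\text{(sector } [0, K]\text{)}.
\end{align}
% A singular E gives the descriptor case (LDS); E = I gives the standard
% state-space case (LSS). For a SISO LSS with transfer function
% G(s) = C(sI - A)^{-1}B and sector [0, K], the classical circle criterion
% requires A Hurwitz and
\begin{equation}
  \operatorname{Re}\bigl[\,1 + K\,G(j\omega)\,\bigr] > 0
  \quad \text{for all } \omega \in \mathbb{R}.
\end{equation}
```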
Journal: Informatica
Volume 18, Issue 2 (2007), pp. 289–304
Abstract
Iterative abstraction refinement has emerged in the last few years as the leading approach to software model checking. We present an approach for automatically verifying C programs against safety specifications expressed as finite state machines. The approach eliminates unneeded variables using program slicing, and then automatically extracts an initial abstract model from the C source code using predicate abstraction and theorem proving. In order to reduce time complexity, we partition the set of candidate predicates into subsets and construct the abstract models independently. Following a counterexample-guided abstraction refinement scheme, the abstraction is refined incrementally until the specification is either satisfied or refuted. Our methods can be extended to verifying concurrent C programs by parallel composition.
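The overall loop is the standard CEGAR scheme. Below is a deliberately tiny, self-contained Python toy (not the paper's tool): the "program" is a saturating counter, the safety property is that it never reaches 3, abstraction is existential predicate abstraction by enumeration, and refinement adds a fixed parity predicate where a real tool would derive predicates from the spurious trace with a theorem prover.

```python
from collections import deque

# Toy CEGAR loop. Concrete 'program': a counter over 0..7 stepping
# x -> min(x+2, 7) from x = 0. Safety property: x never reaches 3.
DOMAIN = range(8)
INIT = 0

def step(x):
    return min(x + 2, 7)

def bad(x):
    return x == 3

def abstract(x, preds):
    # Predicate abstraction: an abstract state is the truth valuation
    # of every predicate on the concrete state.
    return tuple(p(x) for p in preds)

def abstract_edges(preds):
    # Existential abstraction: keep an edge if ANY concrete pair realizes it.
    return {(abstract(x, preds), abstract(step(x), preds)) for x in DOMAIN}

def find_abstract_cex(preds):
    """BFS in the abstract model; return a path to a bad abstract state."""
    edges = abstract_edges(preds)
    bad_abs = {abstract(x, preds) for x in DOMAIN if bad(x)}
    start = abstract(INIT, preds)
    frontier, paths = deque([start]), {start: [start]}
    while frontier:
        s = frontier.popleft()
        if s in bad_abs:
            return paths[s]
        for a, b in edges:
            if a == s and b not in paths:
                paths[b] = paths[s] + [b]
                frontier.append(b)
    return None

def concretizable(path, preds):
    """Replay the abstract path on the concrete run from INIT."""
    x = INIT
    for target in path[1:]:
        x = step(x)
        if abstract(x, preds) != target:
            return False
    return bad(x)

preds = [lambda x: x >= 4]           # deliberately coarse initial abstraction
while True:
    cex = find_abstract_cex(preds)
    if cex is None:
        print("verified: x never reaches 3")
        break
    if concretizable(cex, preds):
        print("real counterexample:", cex)
        break
    # Refinement: a real tool derives new predicates from the spurious trace
    # via theorem proving; for this toy a parity predicate suffices.
    preds = preds + [lambda x: x % 2 == 1]
```

On the first iteration the coarse abstraction yields a spurious counterexample; after one refinement the abstract model proves the property.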
Journal: Informatica
Volume 18, Issue 2 (2007), pp. 279–288
Abstract
In this paper an exact and complete analysis of the Lloyd–Max algorithm and its initialization is carried out. An effective method for initializing the Lloyd–Max algorithm for optimal scalar quantization of a Laplacian source is proposed. The proposed method is a very simple way of making an intelligent guess at the starting points for the iterative Lloyd–Max algorithm: the initial values can be determined from the values of the compandor's parameters. It is demonstrated that, by following this logic, the proposed method provides rapid convergence of the Lloyd–Max algorithm.
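A minimal numerical sketch of the idea (illustrative, with assumed details such as the grid resolution and N = 8 levels): seed the Lloyd–Max iteration with levels obtained from the optimal compressor for the Laplacian density, i.e., a compandor-based intelligent guess, then run the usual nearest-neighbor/centroid updates.

```python
import numpy as np

N = 8
grid = np.linspace(-10.0, 10.0, 200001)
pdf = np.exp(-np.sqrt(2.0) * np.abs(grid)) / np.sqrt(2.0)  # Laplacian, var 1

# Optimal compressor ~ cumulative integral of pdf^(1/3) (Panter-Dite /
# Bennett); its inverse maps uniform levels to a near-optimal start.
g = np.cumsum(pdf ** (1.0 / 3.0))
g /= g[-1]
init_levels = np.interp((np.arange(N) + 0.5) / N, g, grid)

def lloyd_max(levels, iters=50):
    for _ in range(iters):
        t = 0.5 * (levels[:-1] + levels[1:])      # nearest-neighbor thresholds
        edges = np.concatenate(([grid[0]], t, [grid[-1]]))
        new = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            m = (grid >= lo) & (grid < hi)
            new.append(np.sum(grid[m] * pdf[m]) / np.sum(pdf[m]))  # centroids
        levels = np.array(new)
    return levels

print("compandor init :", np.round(init_levels, 3))
print("after Lloyd-Max:", np.round(lloyd_max(init_levels), 3))
```

Because the compandor-derived levels are already close to optimal, only a few iterations are typically needed, which is the rapid convergence the abstract reports.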
Journal: Informatica
Volume 18, Issue 2 (2007), pp. 267–278
Abstract
A technique to improve early detection of eye cataract and quantitative evaluation of its maturity using ultrasound was investigated. A broadband coherent signal, backscattered from the eye lens tissue, was digitized, recorded and processed. A new parameter, lens quality (QL), was proposed for the quantitative evaluation of human eye cataract. Lens quality reflects two phenomena of ultrasound interaction with lens tissue: attenuation and scattering. A digital technique for echo-signal energy and time-frequency analysis was applied, and the ultrasound scattering strength and spectral slope were calculated.
Experimental statistical investigations were performed with signals divided into five groups: mature cataract, immature cataract, incipient cataract, healthy lenses, and a human eye phantom. The investigations showed that the value of lens quality in the test groups varies over a wide range, from 1 to 60. This feature theoretically allows differentiating cataractous eye lenses into classes with defined boundaries. The presented results show that lenses can be differentiated with high reliability into three groups: healthy lenses (QL > 50), lenses with incipient or immature cataract (QL = 2–20), and lenses with mature cataract (QL < 1).
The investigated method can be used for eye lens classification and for early cataract detection. This technique was used at the Department of Ophthalmology, Institute for Biomedical Research, Kaunas University of Medicine.
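The reported group boundaries translate directly into a classifier. The sketch below uses only the QL thresholds quoted above; the computation of QL itself from the attenuation and scattering measures is not reproduced here.

```python
def classify_lens(ql: float) -> str:
    """Classify by the lens-quality parameter QL, using the group
    boundaries reported above; values between the reported ranges
    are left indeterminate."""
    if ql > 50:
        return "healthy lens"
    if 2 <= ql <= 20:
        return "incipient or immature cataract"
    if ql < 1:
        return "mature cataract"
    return "indeterminate (outside reported group boundaries)"

for ql in (60, 12, 0.5, 35):
    print(ql, "->", classify_lens(ql))
```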
Journal: Informatica
Volume 18, Issue 2 (2007), pp. 253–266
Abstract
The method proposed in this paper for calculating the specific conductivity tensor of an anisotropically conductive medium distinguishes itself by the simplicity of the physical measurements: it suffices to make a rectangular sample of uniform thickness with four electrodes fixed on its sides and to take various measurements of current intensity and potential differences. The necessary mathematical calculations can be performed promptly, even without complex computing techniques. The accuracy of the results obtained depends on the dimensions of the sample and on the ratios of the conductivity tensor components.
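For reference, the object being recovered is the conductivity tensor in the anisotropic Ohm's law; the notation below is standard background (assumed here, since the abstract does not reproduce the paper's formulas).

```latex
% Anisotropic Ohm's law: current density J responds to the electric field E
% through the specific conductivity tensor, which in the plane of the
% rectangular sample takes the symmetric form
\begin{equation}
  \begin{pmatrix} J_x \\ J_y \end{pmatrix}
  =
  \begin{pmatrix} \sigma_{xx} & \sigma_{xy} \\ \sigma_{xy} & \sigma_{yy} \end{pmatrix}
  \begin{pmatrix} E_x \\ E_y \end{pmatrix}.
\end{equation}
% The four side electrodes supply the current-intensity and
% potential-difference readings from which the three independent
% components are determined.
```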
Journal: Informatica
Volume 18, Issue 2 (2007), pp. 239–252
Abstract
We propose an Identity-Based Strong Designated Verifier Signature (IBSDVS) scheme using bilinear pairings. Designated verifier signatures find application in e-voting, auctions, and calls for tenders. We prove that the scheme is secure against existential forgery under adaptively chosen message and identity attacks in the random oracle model. We also show that the problem of delegatability does not exist in our scheme.
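For context, the tool such schemes are built on is an admissible bilinear pairing; the properties below are the standard definitions (assumed background, not the paper's construction).

```latex
% Admissible bilinear pairing e : G_1 \times G_1 \to G_2 on groups of prime
% order q. For all P, Q \in G_1 and a, b \in \mathbb{Z}_q:
\begin{equation}
  e(aP,\, bQ) = e(P, Q)^{ab}
  \qquad \text{(bilinearity)},
\end{equation}
% together with non-degeneracy (e(P, P) \ne 1 for a generator P) and
% efficient computability. In the identity-based setting, a user's private
% key is typically sk_{ID} = s\,H(ID), where s is the master secret and
% H hashes identities into G_1.
```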