Pub. online: 9 Nov 2020 | Type: Research Article | Open Access
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 659–680
Abstract
In this paper, we continue the study of efficient algorithms for the computation of zeta functions on the complex plane, extending the work of Coffey, Šleževičienė and Vepštas. We prove a central limit theorem for the coefficients of the series with binomial-like coefficients used for the evaluation of the Riemann zeta function and establish the rate of convergence to the limiting distribution. An asymptotic expression is derived for the coefficients of the series. We discuss the computational complexity and numerical aspects of implementing the algorithm. In the last part of the paper, we present our results on 3D visualizations of zeta functions based on series with binomial-like coefficients. These visualizations illustrate the underlying structures of the surfaces and 3D curves associated with zeta functions.
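As a concrete illustration of a series of this type, the globally convergent Hasse–Knopp expansion $\zeta(s)=\frac{1}{1-2^{1-s}}\sum_{n=0}^{\infty}\frac{1}{2^{n+1}}\sum_{k=0}^{n}(-1)^{k}\binom{n}{k}(k+1)^{-s}$ uses binomial coefficients in the same spirit; the minimal Python sketch below evaluates it by truncation and is an illustrative example, not the authors' series or implementation.

# Minimal sketch: evaluating zeta(s) from a series with binomial coefficients,
# here the globally convergent Hasse-Knopp expansion truncated after n_max terms.
from math import comb

def zeta(s, n_max=60):
    total = 0j
    for n in range(n_max + 1):
        inner = sum((-1) ** k * comb(n, k) * (k + 1) ** (-s) for k in range(n + 1))
        total += inner / 2 ** (n + 1)
    return total / (1 - 2 ** (1 - s))

print(zeta(2))                  # ~ pi^2 / 6 = 1.6449...
print(zeta(0.5 + 14.134725j))   # near the first nontrivial zero, so close to 0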
Pub. online: 15 Oct 2020 | Type: Research Article | Open Access
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 681–706
Abstract
This paper deals with the two-stage transportation problem with fixed charges, denoted by TSTPFC. We propose a fast solution method, designed for parallel environments, that allows real-world applications to be solved efficiently. The proposed constructive heuristic algorithm is iterative, and its primary feature is that the solution search domain is reduced at each iteration. We tested the method on two sets of instances available in the literature and compared our computational results with those of existing solution approaches. The results show that the proposed approach is highly competitive with the methods found in the literature.
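For orientation, a single fixed-charge transportation stage admits the generic mixed-integer formulation below (the notation is illustrative and not taken from the paper); in the two-stage problem this structure appears on both the first-stage and the second-stage arcs.

\[
\min \sum_{i}\sum_{j}\big(c_{ij}x_{ij}+f_{ij}y_{ij}\big)
\quad\text{s.t.}\quad
\sum_{j}x_{ij}\le s_{i},\qquad
\sum_{i}x_{ij}\ge d_{j},\qquad
0\le x_{ij}\le M_{ij}\,y_{ij},\qquad
y_{ij}\in\{0,1\},
\]

where $x_{ij}$ is the quantity shipped on arc $(i,j)$, $c_{ij}$ and $f_{ij}$ are its variable and fixed costs, $s_{i}$ and $d_{j}$ are the supplies and demands, and $M_{ij}$ bounds the flow on the arc.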
Pub. online: 2 Dec 2020 | Type: Research Article | Open Access
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 707–722
Abstract
Spherical fuzzy set theory is useful and advantageous for handling uncertainty and imprecision in multiple attribute decision-making problems by considering membership, non-membership, and indeterminacy degrees. In this paper, by extending the classical linear assignment method, we propose a novel method called the spherical fuzzy linear assignment method (SF-LAM) to solve multiple criteria group decision-making problems in the spherical fuzzy environment. A ranking procedure consisting of aggregation functions, score functions, accuracy functions, a weighted rank frequency, and a binary mathematical model is presented to determine the criterion-wise preferences and the priority order of the alternatives. The proposed method's applicability and validity are shown through a selection problem among wind power farm locations. The proposed method helps managers find the best location for constructing a wind power plant based on the determined criteria. Finally, a comparative analysis is performed between the proposed spherical fuzzy linear assignment (SF-LAM) model and the spherical fuzzy analytic hierarchy process (SF-AHP) and spherical fuzzy WASPAS methods.
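The assignment core inherited from the classical linear assignment method can be sketched as follows; this is a generic Python illustration with hypothetical rankings and weights, and the spherical fuzzy aggregation, score, and accuracy functions of SF-LAM are not reproduced here.

# Sketch of the linear-assignment ranking step: given, for each criterion, a ranking
# of the alternatives, build the weighted rank-frequency matrix lam[i][k]
# (total weight of criteria placing alternative i in rank position k) and find the
# permutation of alternatives that maximizes the summed weights.
import numpy as np
from scipy.optimize import linear_sum_assignment

rankings = [[0, 2, 1, 3], [2, 0, 3, 1], [0, 1, 2, 3]]  # best-to-worst per criterion (illustrative)
weights = [0.5, 0.3, 0.2]                              # criterion weights
m = len(rankings[0])                                   # number of alternatives

lam = np.zeros((m, m))
for rank_list, w in zip(rankings, weights):
    for position, alt in enumerate(rank_list):
        lam[alt, position] += w

rows, cols = linear_sum_assignment(lam, maximize=True)
order = [int(rows[list(cols).index(k)]) for k in range(m)]  # alternative placed at each rank position
print("priority order (best to worst):", order)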
Pub. online: 6 Oct 2020 | Type: Research Article | Open Access
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 723–749
Abstract
Traffic flow forecasting is a well-known time series problem whose solutions have traditionally been grounded in statistical models. Recent work, however, has shown promising results in addressing time series problems with Recurrent Neural Networks (RNNs), such as Long Short-Term Memory networks (LSTMs). The literature is nonetheless vague about several aspects of the conceived models and often exhibits misconceptions that may lead to important pitfalls. This study aims to design and identify the best possible LSTM model for traffic flow forecasting while addressing several important aspects of such models, such as the multitude of input features, the time frames used by the model, and the approach employed for multi-step forecasting. To overcome the spatial limitations of open-source datasets, this study presents and describes a new dataset collected by the authors. After several weeks of model fitting, Recursive Multi-Step Multi-Variate models showed the best performance, strengthening the view that LSTMs can be used to accurately forecast traffic flow over several future timesteps.
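A Recursive Multi-Step Multi-Variate forecaster of the kind referred to above can be sketched as follows; this is a minimal Keras illustration with a hypothetical window length and feature count, not the authors' architecture or dataset.

# Sketch: an LSTM trained to predict the next timestep of all features, then applied
# recursively so its own predictions feed the following steps (multi-step forecasting).
import numpy as np
import tensorflow as tf

window, n_features, horizon = 12, 4, 6   # hypothetical sizes
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(window, n_features)),
    tf.keras.layers.Dense(n_features),   # one-step-ahead prediction of every feature
])
model.compile(optimizer="adam", loss="mse")
# model.fit(X_train, y_train, epochs=..., validation_data=...)  # training data omitted

def recursive_forecast(model, last_window, horizon):
    """Feed each one-step prediction back into the input window."""
    window_data = np.array(last_window, dtype="float32")
    preds = []
    for _ in range(horizon):
        next_step = model.predict(window_data[np.newaxis], verbose=0)[0]
        preds.append(next_step)
        window_data = np.vstack([window_data[1:], next_step])
    return np.array(preds)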
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 751–768
Abstract
In cryptography, key establishment protocols are often the starting point paving the way towards the secure execution of different tasks. Namely, parties seeking to achieve some cryptographic task often start by establishing a common high-entropy secret that will eventually be used to secure their communication. In this paper, we put forward a security model for group key establishment ($\mathsf{GAKE}$) with an adversary that may execute efficient quantum algorithms, yet only once the execution of the protocol has concluded. This captures a situation in which keys are to be established in the present, while security guarantees must still be provided in the future, when quantum resources may be accessible to a potential adversary.
Further, we propose a protocol design that can be proven secure in this model. Our proposal uses password authentication and builds upon efficient and reasonably well-understood primitives: a message authentication code and a post-quantum key encapsulation mechanism. The hybrid structure avoids potential efficiency downsides, such as large signatures, of some “true” post-quantum authentication techniques, making our protocol a potentially interesting fit for current applications with long-term security needs.
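At a very high level, the combination of a password-derived MAC with a key encapsulation mechanism can be illustrated as follows. This is a schematic two-party Python sketch with a placeholder KEM object and illustrative parameters; it is not the protocol proposed in the paper and omits the group setting, session identifiers, and key confirmation.

# Schematic sketch: a KEM-based exchange authenticated with a MAC keyed from a
# password-derived key. The `kem` object is a placeholder for any post-quantum KEM
# offering keygen/encaps/decaps; the resulting shared secret would be fed into a KDF.
import hashlib, hmac

def password_mac_key(password: bytes, salt: bytes) -> bytes:
    # Derive a MAC key from the shared password (illustrative parameters).
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

def initiator_send(kem, mac_key):
    pk, sk = kem.keygen()
    tag = hmac.new(mac_key, pk, "sha256").digest()       # authenticate the public key
    return (pk, tag), sk

def responder_reply(kem, mac_key, pk, tag):
    if not hmac.compare_digest(tag, hmac.new(mac_key, pk, "sha256").digest()):
        raise ValueError("authentication failed")
    ciphertext, shared = kem.encaps(pk)
    reply_tag = hmac.new(mac_key, ciphertext, "sha256").digest()
    return (ciphertext, reply_tag), shared

def initiator_finish(kem, mac_key, sk, ciphertext, reply_tag):
    if not hmac.compare_digest(reply_tag, hmac.new(mac_key, ciphertext, "sha256").digest()):
        raise ValueError("authentication failed")
    return kem.decaps(sk, ciphertext)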
Pub. online: 23 Nov 2020 | Type: Research Article | Open Access
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 769–791
Abstract
In this paper, we consider a non-cooperative N-player differential game affected by deterministic uncertainties. Sufficient conditions for the existence of a robust feedback Nash equilibrium are presented as a set of min-max forms of the Hamilton–Jacobi–Bellman equations. These conditions are then used to find the robust Nash controls for a linear affine quadratic game affected by a square-integrable uncertainty, which is seen as a malicious fictitious player trying to maximize the cost function of each player. The approach allows us to find robust strategies through the solution of a set of coupled Riccati differential equations. Both the finite and the infinite time horizon cases are solved for this last game. As an illustration of the approach, the problem of coordinating a two-echelon supply chain with uncertain seasonal fluctuations in demand is developed.
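Schematically, the equilibrium conditions referred to above are of the min-max Hamilton–Jacobi–Bellman type; the generic form below uses notation chosen here for illustration only.

\[
-\frac{\partial V_{i}(x,t)}{\partial t}
=\min_{u_{i}}\max_{w}\Big\{L_{i}(x,u_{1},\dots,u_{N},w)
+\nabla_{x}V_{i}(x,t)^{\top}f(x,u_{1},\dots,u_{N},w)\Big\},
\qquad V_{i}(x,T)=\Phi_{i}(x),\quad i=1,\dots,N,
\]

where $V_{i}$ is the value function of player $i$, $f$ the uncertain dynamics, $L_{i}$ and $\Phi_{i}$ the running and terminal costs, and $w$ the disturbance acting as a fictitious maximizing player; in the linear affine quadratic case the $V_{i}$ are quadratic in $x$ and these conditions reduce to coupled Riccati differential equations.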
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 793–820
Abstract
This paper proposes a new family of 4-dimensional chaotic cat maps. This family is then used in the design of a novel block-based image encryption scheme. The scheme is composed of two independent phases, a robust light shuffling phase and a masking phase, which operate on image blocks. It utilizes measures of central tendency to mix blocks of the image at hand and enhance security against a number of cryptanalytic attacks. The mixing is designed so that while encryption is highly sensitive to the secret key and the input image, decryption is robust against noise and cropping of the cipher-image. Empirical results show the high performance of the suggested scheme and its robustness against well-known cryptanalytic attacks. Furthermore, comparisons with existing image encryption methods are presented, demonstrating the superiority of the proposed scheme.
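For intuition, the classical 2-dimensional cat map, of which the proposed maps are a 4-dimensional generalization, shuffles the pixel positions of a square image as sketched below in Python; this illustrates only the shuffling idea, not the paper's 4D maps, masking phase, or key schedule.

# Sketch: pixel shuffling with the classical Arnold cat map on an N x N image.
# The map (x, y) -> (x + y mod N, x + 2y mod N) is a bijection, so every pixel
# lands on a unique new position; iterating it scrambles the image.
import numpy as np

def cat_map_shuffle(img, iterations=1):
    n = img.shape[0]                      # assumes a square image
    out = img.copy()
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    for _ in range(iterations):
        shuffled = np.empty_like(out)
        shuffled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = shuffled
    return out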
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 821–839
Abstract
Ligand-Based Virtual Screening (LBVS) methods are widely used in drug discovery as filters for subsequent in vitro and in vivo characterization. Since the databases processed are enormously large, this pre-selection process requires the use of fast and precise methodologies. In this work, the similarity between compounds is measured in terms of electrostatic potential. To do so, we propose a new, alternative methodology called LBVS-Electrostatic. According to the obtained results, we are able to conclude that many of the compounds proposed by our novel approach could not be discovered with the classical one.
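One standard way to quantify electrostatic-potential similarity is the Carbó index computed over potential grids sampled on common points; the Python sketch below illustrates this general idea and is not necessarily the measure used by LBVS-Electrostatic.

# Sketch: Carbo-like similarity between two molecular electrostatic potential (MEP)
# grids sampled on the same points: R_AB = sum(P_A*P_B) / sqrt(sum(P_A^2) * sum(P_B^2)).
import numpy as np

def carbo_similarity(pot_a: np.ndarray, pot_b: np.ndarray) -> float:
    a, b = pot_a.ravel(), pot_b.ravel()
    return float(np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b)))

# Illustrative use with random grids standing in for computed MEPs
rng = np.random.default_rng(0)
grid_a = rng.normal(size=(32, 32, 32))
grid_b = 0.8 * grid_a + 0.2 * rng.normal(size=(32, 32, 32))
print(carbo_similarity(grid_a, grid_b))  # close to 1 for similar potentials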
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 841–856
Abstract
Data users are generally interested in two types of aggregated information: summarization of the selected attribute(s) over all considered entities, and retrieval and evaluation of entities according to requirements posed on the relevant attributes. Less statistically literate users (e.g. domain experts) and business intelligence strategic dashboards can benefit from linguistic summarization, i.e. a summary like “most of the customers are middle-aged” can be understood immediately. Evaluation of mandatory and optional requirements of the structure “${P_{1}}$ and most of the other posed predicates should be satisfied” is beneficial for analytical business intelligence dashboards and search engines in general. This work formalizes the integration of the aforementioned quantified summaries and quantified evaluations into the concept of database queries to empower their flexibility through, e.g. nested quantified query conditions on hierarchical data structures. Next, this approach contributes to the mitigation of the empty-answer problem in data retrieval tasks. Thus, strategic and analytical dashboards as well as query engines might benefit from the proposed approach. Finally, the obtained results are illustrated with examples, the internal and external trustworthiness is elaborated, and future research topics and applicability are discussed.
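The truth degree of a quantified summary such as “most of the customers are middle-aged” is classically obtained by aggregating the membership degrees of the records and passing the resulting proportion through the quantifier's membership function; the Python sketch below uses illustrative membership functions and is not the formalization developed in the paper.

# Sketch: truth degree of "most of the customers are middle-aged",
# T = mu_most( (1/n) * sum_i mu_middle_aged(age_i) )  (Zadeh-style evaluation)
def mu_middle_aged(age):
    # Illustrative trapezoidal membership: fully middle-aged between 40 and 55
    if 40 <= age <= 55:
        return 1.0
    if 30 < age < 40:
        return (age - 30) / 10
    if 55 < age < 65:
        return (65 - age) / 10
    return 0.0

def mu_most(p):
    # Illustrative "most" quantifier: 0 below 0.4, 1 above 0.8, linear in between
    return min(1.0, max(0.0, (p - 0.4) / 0.4))

ages = [25, 38, 42, 47, 51, 58, 63, 70]
proportion = sum(mu_middle_aged(a) for a in ages) / len(ages)
print("truth of the summary:", round(mu_most(proportion), 3))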
Pub. online: 8 Jun 2020 | Type: Research Article | Open Access
Journal: Informatica
Volume 31, Issue 4 (2020), pp. 857–880
Abstract
Normalization and aggregation are two of the most important issues in multi-criteria analysis. Although various multi-criteria decision-making (MCDM) methods have been developed over the past several decades, few of them integrate multiple normalization techniques and mixed aggregation approaches at the same time to reduce the deviations of evaluation values and enhance the reliability of the final decision result. This study introduces a new MCDM method called Mixed Aggregation by COmprehensive Normalization Technique (MACONT) to tackle complicated MCDM problems. The method introduces a comprehensive normalization technique based on criterion types and then uses two mixed aggregation operators to aggregate the distance values between each alternative and the reference alternative on different criteria from the perspectives of compensation and non-compensation. An illustrative example is given to show the applicability of the proposed method, and its advantages are highlighted through sensitivity analyses and comparative analyses.
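The two building blocks named in the abstract can be illustrated generically: type-dependent normalization of the decision matrix followed by a compensatory and a non-compensatory aggregation of the normalized distances to a reference alternative. The Python sketch below uses hypothetical data, weights, and operators and does not reproduce the exact MACONT formulas.

# Sketch: normalize a decision matrix according to criterion type (benefit vs. cost),
# then combine a compensatory aggregation (weighted sum of deviations) with a
# non-compensatory one (worst weighted deviation) into a single appraisal score.
import numpy as np

X = np.array([[7.0, 120.0, 3.5],     # alternatives x criteria (illustrative data)
              [9.0, 150.0, 2.0],
              [6.0,  90.0, 4.0]])
benefit = np.array([True, False, True])   # criterion types: benefit or cost
w = np.array([0.5, 0.3, 0.2])             # criterion weights

col_min, col_max = X.min(axis=0), X.max(axis=0)
span = np.where(col_max > col_min, col_max - col_min, 1.0)
N = np.where(benefit, (X - col_min) / span, (col_max - X) / span)  # now "larger is better"

reference = N.max(axis=0)                         # ideal (reference) alternative
deviation = reference - N                         # distance to the reference per criterion
compensatory = -(deviation * w).sum(axis=1)       # weighted-sum view (higher is better)
non_compensatory = -(deviation * w).max(axis=1)   # worst weighted shortfall (higher is better)
score = 0.5 * compensatory + 0.5 * non_compensatory
print("ranking (best first):", list(np.argsort(-score)))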