Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 3 (2019), pp. 529–552
Abstract
A standard problem in many applications is to reconstruct an analogue signal f from a sequence of its samples $(f(t_k))_k$. The great value of such a reconstruction is that, under additional assumptions, an analogue signal f of a real variable $t\in \mathbb{R}$ can be represented equivalently by a sequence of complex numbers $(f(t_k))_k$, i.e. by a digital signal. This digital signal can then be processed and filtered very efficiently, for example, on digital computers. Sampling theory is one of the theoretical foundations of the conversion from analogue to digital signals. There is a long list of impressive research results in this area, starting with the classical work of Shannon. Note that the well-known Shannon sampling theory deals mainly with signals of one variable. In this paper, we are concerned with bandlimited signals of several variables whose restriction to the Euclidean space ${\mathbb{R}^{n}}$ has finite p-energy. We present sampling series in which signals are sampled at the Nyquist rate. These series involve digital samples of the signals and also samples of their partial derivatives. Importantly, our reconstruction is stable in the sense that the sampling series converge absolutely and uniformly on the whole of ${\mathbb{R}^{n}}$. Having a stable reconstruction process, it is therefore possible to bound the approximation error made by using only a partial sum with finitely many samples.
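As illustrative background (not taken from the paper, which treats the multivariate case with derivative samples), the classical one-variable Shannon reconstruction can be sketched numerically; the test signal, sampling period, and truncation length below are arbitrary choices:

```python
import numpy as np

def sinc_reconstruct(samples, T, t):
    """Truncated Shannon series: f(t) ~ sum_k f(kT) * sinc((t - kT)/T)."""
    k = np.arange(len(samples))
    return np.sum(samples * np.sinc((t - k * T) / T))

T = 0.1                                   # sampling period -> Nyquist frequency 5 Hz
f = lambda t: np.sin(2 * np.pi * 1.3 * t) + 0.5 * np.cos(2 * np.pi * 2.7 * t)
samples = f(np.arange(2001) * T)          # samples covering t in [0, 200]

t0 = 100.037                              # off-grid point far from both ends
approx = sinc_reconstruct(samples, T, t0)
print(abs(approx - f(t0)))                # small truncation error
```

Because the partial sum uses finitely many samples, the reconstruction error here is a truncation error of exactly the kind the abstract says a stable sampling series allows one to bound.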
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 3 (2019), pp. 503–528
Abstract
When the balance problem of an assembly line aims to distribute the workloads among the stations as equally as possible, the concept of an entropy function can be used. In this paper, a typical assembly line balancing problem is formulated with several objective functions: an entropy-based objective function plus two further objectives, equipment purchasing cost and worker time-dependent wage. The non-linear entropy-based objective function is approximated by a linear function using the bounded variable method of linear programming. A new hybrid fuzzy programming approach is proposed to solve the resulting multi-objective formulation efficiently. Extensive computational experiments on a set of test problems prove the efficiency of the proposed solution approach compared to the available approaches in the literature.
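To illustrate the kind of linearization involved (a generic sketch, not the paper's exact bounded-variable formulation), the concave entropy term -x ln x can be approximated by a piecewise-linear function over a grid of breakpoints:

```python
import math

def entropy_term(x):
    """One term of the entropy objective, -x ln x, with the 0 ln 0 = 0 convention."""
    return 0.0 if x == 0 else -x * math.log(x)

def piecewise_linear(x, breakpoints):
    """Evaluate the piecewise-linear interpolant of -x ln x through the breakpoints."""
    for a, b in zip(breakpoints, breakpoints[1:]):
        if a <= x <= b:
            fa, fb = entropy_term(a), entropy_term(b)
            return fa + (fb - fa) * (x - a) / (b - a)
    raise ValueError("x outside breakpoint range")

bps = [i / 20 for i in range(21)]          # 20 segments on [0, 1]
errs = [abs(piecewise_linear(x, bps) - entropy_term(x))
        for x in (0.03, 0.25, 0.61, 0.9)]
print(max(errs))                           # worst error near 0, where curvature peaks
```

In a linear program, each segment becomes a bounded variable with the segment's slope as its cost coefficient, which is what makes the approximated objective solvable by standard LP machinery.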
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 3 (2019), pp. 481–502
Abstract
This paper presents a model which integrates inbound and outbound logistics with a cross-docking system. The model integrates the problem of routing inbound vehicles between suppliers and cross-docks and outbound vehicles between cross-docks and retailers, considering logistics costs and the products' properties. It aims to minimize the total cost by optimizing the assignment of products to suppliers and retailers and the operations of inbound and outbound vehicles. We developed an endosymbiotic evolutionary algorithm, which yields good performance in concurrent searches for the solutions of multiple subproblems, and validate its performance using several numerical examples.
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 3 (2019), pp. 455–480
Abstract
The paper deals with a causality-driven modelling method applied to the elicitation of deep domain knowledge. This method is suitable for discovering causal relationships in domains characterized by internal circular causality, e.g. control and management, regulatory processes, self-regulation, and renewal. Such domains include organizational systems (i.e. enterprises), cyber-social systems, biological systems, ecological systems, and other complex systems. The subject domain may be of different natures: real-world activities or documented content. The causality-driven approach is applied here to learning content analysis and the normalization of knowledge structures. The method was used in the field of education, and a case study of learning content renewal is provided; the domain here is the real-world area that the learning content is about. The paper shows how to align existing learning content and the current (new) knowledge of the domain using the same causality-driven viewpoint and the described models (frameworks). Two levels of causal modelling of the domain are obtained. The first level is the discovery of the causality of the domain using the Management Transaction (MT) framework. Secondly, the deep knowledge structure of an MT is revealed through a more detailed framework called the Elementary Management Cycle (EMC). Algorithms for updating learning object (LO) content in two steps are presented. A traceability matrix indicates the mismatch between the LO content (old knowledge) and the new domain knowledge. A classification of the content discrepancies and an example of study programme content analysis are presented. The main outcome of the causality-driven modelling approach is the effectiveness of discovering deep knowledge when the relevant domain causality frameworks are applicable.
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 3 (2019), pp. 431–454
Abstract
The way forensic examiners compare fingerprints differs greatly from the behaviour of current automatic fingerprint identification algorithms. Experts usually use all the information in the fingerprint, not only minutiae, while automatic algorithms do not. Partial (especially latent) fingerprint matching algorithms still report low accuracy values compared to those achieved by experts. This difference is mainly due to the features used in each case. In this work, a novel approach for matching partial fingerprints is presented. We introduce a new fingerprint feature, named Distinctive Ridge Point (DRP), combined with an improved triangle-based representation which also uses minutiae. The new feature describes the neighbouring ridges of minutiae in a novel way. A modified version of a fingerprint matching algorithm presented in a previous work is used for matching two triangular representations of minutiae and DRPs. Experiments conducted on the NIST27 database, with an added background of 29000 tenprint impressions from the NIST14 and NIST4 databases, show the benefits of this approach. The results show that the proposal achieves a rank-1 accuracy of 70.9%, an 11% improvement over the accuracy obtained using minutiae and the reference point. This result is comparable with the best accuracy reported in the state of the art, while the number of features is reduced.
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 2 (2019), pp. 413–429
Abstract
An extended TODIM is proposed in this paper to comprehensively reflect the psychological characteristics of decision makers (DMs) according to cumulative prospect theory (CPT). We replace the original weight with the weighting function of CPT and modify the perceived value of the dominance based on CPT, because the general psychological phenomena of DMs explained in CPT have been verified by many experiments and are recognized by researchers. Hence, the extended TODIM not only integrates the advantages of CPT in considering the psychological factors of DMs but also retains the superiority of the classical TODIM in relative dominance. Finally, a case study demonstrates that the extended TODIM captures the psychological factors of DMs well.
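For reference, the CPT probability weighting function of Tversky and Kahneman, which the extension substitutes for the original TODIM weight, has a simple closed form; the parameter value below is their published estimate for gains and is used here only as an illustration:

```python
def cpt_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g).

    gamma = 0.61 is their estimated value for gains; other domains use other values.
    """
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Inverse-S shape: small probabilities are overweighted, large ones underweighted.
print(cpt_weight(0.05), cpt_weight(0.95))
```

This inverse-S distortion is precisely the psychological phenomenon the abstract refers to: DMs do not treat objective weights linearly.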
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 2 (2019), pp. 391–412
Abstract
Fermatean fuzzy sets (FFSs), proposed by Senapati and Yager (2019a), can handle uncertain information more easily in the process of decision making. They defined basic operations over Fermatean fuzzy sets. Here we introduce three new operations: subtraction, division, and Fermatean arithmetic mean operations over Fermatean fuzzy sets, and discuss their properties in detail. Later, we develop a Fermatean fuzzy weighted product model to solve the multi-criteria decision-making problem. Finally, an illustrative example of selecting a suitable bridge construction method is given to verify the approach developed by us and to demonstrate its practicability and effectiveness.
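A minimal numerical sketch of Fermatean fuzzy arithmetic, assuming the standard q-rung orthopair operations with q = 3 and the score function mu^3 - nu^3 (the sample values and weights are hypothetical, and the paper's new subtraction/division operations are not reproduced here):

```python
def ff_valid(mu, nu):
    """A Fermatean fuzzy value (mu, nu) must satisfy mu^3 + nu^3 <= 1."""
    return mu**3 + nu**3 <= 1 + 1e-12

def ff_score(mu, nu):
    """Score function mu^3 - nu^3, used to rank aggregated values."""
    return mu**3 - nu**3

def ff_power(mu, nu, lam):
    """lam-th power of a Fermatean fuzzy value (q-rung operations, q = 3)."""
    return mu**lam, (1 - (1 - nu**3) ** lam) ** (1 / 3)

def ff_product(a, b):
    """Product of two Fermatean fuzzy values (q-rung operations, q = 3)."""
    (m1, n1), (m2, n2) = a, b
    return m1 * m2, (n1**3 + n2**3 - n1**3 * n2**3) ** (1 / 3)

def ff_weighted_product(values, weights):
    """Weighted product aggregation: prod_i a_i^(w_i)."""
    acc = (1.0, 0.0)                      # multiplicative identity
    for (mu, nu), w in zip(values, weights):
        acc = ff_product(acc, ff_power(mu, nu, w))
    return acc

vals = [(0.8, 0.5), (0.6, 0.7)]           # hypothetical criterion ratings
mu, nu = ff_weighted_product(vals, [0.6, 0.4])
print(round(ff_score(mu, nu), 4))
```

Note that the second rating (0.6, 0.7) violates the Pythagorean constraint mu^2 + nu^2 <= 1 but is admissible as a Fermatean value, which is the sense in which FFSs handle a wider range of uncertain judgments.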
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 2 (2019), pp. 367–390
Abstract
The multidimensional data model for kriging is developed using fractional Euclidean distance matrices (FEDM). The properties of FEDM are studied by means of the kernel matrix method. It has been shown that the factorization of the kernel matrix enables us to create an embedded set that is a nonsingular simplex. Using the properties of FEDM, a Gaussian random field (GRF) is constructed without the positive definite correlation functions usually applied for this purpose. The created GRF can be considered a multidimensional analogue of the Wiener process; for instance, line realizations of this GRF are themselves Wiener processes. Next, the kriging method is developed based on FEDM. The method is rather simple and depends on parameters that are easily estimated by the maximum likelihood method. Computer simulation of the developed kriging extrapolator has shown that it outperforms the well-known Shepard inverse distance extrapolator. A practical application of the developed approach to surrogate modelling of wastewater treatment is discussed. Theoretical investigation, computer simulation, and a practical example demonstrate that the proposed kriging model, using FEDM, can be efficiently applied to multidimensional data modelling and processing.
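For context, the Shepard inverse distance extrapolator used as the baseline in the comparison can be sketched in a few lines (a generic implementation; the scattered data, test function, and power parameter are illustrative assumptions):

```python
import numpy as np

def shepard(x_query, X, y, p=2.0):
    """Shepard inverse-distance-weighted prediction with power parameter p."""
    d = np.linalg.norm(X - x_query, axis=1)
    if np.any(d == 0):                    # exact hit on a data point
        return y[np.argmin(d)]
    w = 1.0 / d**p
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))     # scattered sample sites in R^3
y = np.sum(X**2, axis=1)                  # smooth test function f(x) = |x|^2
q = np.array([0.1, -0.2, 0.3])
est = shepard(q, X, y)
print(est)
```

Shepard's method interpolates the data exactly and is bounded by the observed values, but, unlike kriging, it carries no spatial correlation model, which is why a GRF-based extrapolator can outperform it.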
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 2 (2019), pp. 349–365
Abstract
The isometric mapping (Isomap) algorithm is often used for analysing hyperspectral images. Isomap reduces such hyperspectral images from a high-dimensional space to a lower-dimensional space while keeping the critical original information. To achieve this objective, Isomap uses the state-of-the-art multidimensional scaling (MDS) method for dimensionality reduction. In this work, we propose to use Isomap with SMACOF, since SMACOF is the most accurate MDS method. A deep comparison, in terms of accuracy, between Isomap based on an eigen-decomposition process and Isomap based on SMACOF has been carried out using three benchmark hyperspectral images. Moreover, for hyperspectral image classification, three classifiers (support vector machine, k-nearest neighbour, and random forest) have been used to compare both Isomap approaches. The experimental investigation has shown that Isomap with SMACOF obtains better classification accuracy.
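A minimal sketch of the Isomap pipeline with a hand-rolled SMACOF step (illustrative only: the k-NN graph size, the toy helix data, and the iteration count are assumptions, and real hyperspectral work would use a tuned library implementation):

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def geodesic_distances(X, k=6):
    """Isomap steps 1-2: k-NN graph of Euclidean distances, then graph shortest paths."""
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    G = np.full((n, n), np.inf)           # inf marks a non-edge for csgraph
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]
        G[i, nbrs] = D[i, nbrs]
    return shortest_path(G, method="D", directed=False)

def smacof(delta, n_components=2, n_iter=300, seed=0):
    """Minimal SMACOF: iterate the Guttman transform to reduce raw stress."""
    n = len(delta)
    X = np.random.default_rng(seed).normal(size=(n, n_components))
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
        with np.errstate(divide="ignore", invalid="ignore"):
            B = np.where(d > 0, -delta / d, 0.0)
        np.fill_diagonal(B, 0.0)
        np.fill_diagonal(B, -B.sum(axis=1))
        X = B @ X / n                     # Guttman transform
    return X

# Toy curve (a helix) in 3D: geodesic MDS should flatten it almost isometrically.
t = np.linspace(0, 3 * np.pi, 80)
data = np.column_stack([np.cos(t), np.sin(t), 0.3 * t])
delta = geodesic_distances(data, k=6)
emb = smacof(delta)
iu = np.triu_indices(len(data), 1)
d_emb = np.linalg.norm(emb[:, None] - emb[None, :], axis=-1)
corr = np.corrcoef(d_emb[iu], delta[iu])[0, 1]   # embedded vs geodesic distances
print(round(corr, 3))
```

Classical Isomap would replace `smacof` with an eigen-decomposition of the doubly centred squared-distance matrix; swapping in the iterative SMACOF step, as here, is the substitution the abstract investigates.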
Pub. online: 1 Jan 2019 | Type: Research Article | Open Access
Journal: Informatica
Volume 30, Issue 2 (2019), pp. 269–292
Abstract
The 3D extensions of ordinary fuzzy sets, such as intuitionistic fuzzy sets (IFS), Pythagorean fuzzy sets (PFS), and neutrosophic sets (NS), aim to describe experts' judgments more informatively and explicitly. In this paper, generalized three-dimensional spherical fuzzy sets are presented with their arithmetic, aggregation, and defuzzification operations. Weighted Aggregated Sum Product ASsessment (WASPAS) is a combination of two well-known multi-criteria decision-making (MCDM) methods: the weighted sum model (WSM) and the weighted product model (WPM). The aim of this paper is to extend the traditional WASPAS method to a spherical fuzzy WASPAS (SF-WASPAS) method and to show its application on an industrial robot selection problem. Additionally, we present comparative and sensitivity analyses to show the validity and robustness of the given decisions.
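On crisp (non-fuzzy) data, the WASPAS combination of WSM and WPM reduces to a short computation; the decision matrix and weights below are hypothetical benefit-type scores, and the paper's SF-WASPAS replaces these crisp operations with spherical fuzzy ones:

```python
import numpy as np

def waspas(X, w, lam=0.5):
    """WASPAS score: lam * WSM + (1 - lam) * WPM on a normalized decision matrix.

    X: (alternatives x criteria) matrix of benefit-type scores; w: weights summing to 1.
    """
    N = X / X.max(axis=0)                 # linear normalization for benefit criteria
    wsm = N @ w                           # weighted sum model
    wpm = np.prod(N ** w, axis=1)         # weighted product model
    return lam * wsm + (1 - lam) * wpm

X = np.array([[7.0, 9.0, 6.0],
              [8.0, 6.0, 8.0],
              [6.0, 8.0, 9.0]])           # hypothetical robot-selection ratings
w = np.array([0.5, 0.3, 0.2])
scores = waspas(X, w)
print(scores, scores.argmax())            # scores and index of the best alternative
```

The parameter lam trades off the additive and multiplicative aggregations; lam = 1 recovers pure WSM and lam = 0 pure WPM, which is what makes WASPAS a genuine combination of the two methods.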