One of the biggest difficulties in the telecommunications industry is retaining customers and preventing churn. In this article, we review recent research on churn detection for telecommunication companies. Selected machine learning methods are applied to publicly available datasets, partially reproducing the results of other authors, and are then applied to the private dataset of the Moremins company. Next, we extend the analysis to cover existing research gaps: the differences between churn definitions are analysed, and it is shown that the accuracy reported in other studies is better due to some false assumptions, i.e. labelling rules derived from the churn definition lead to very good classification accuracy, but this does not imply that such churn detection is useful in the context of subsequent customer retention. The main outcome of the research is a detailed analysis of the impact of differences in churn definitions on the final result; it was shown that the impact of labelling rules derived from the definitions can be large. The data in this study consist of call detail records (CDRs) and other user-level daily aggregates; 11000 user entries over 275 days were analysed. Six different classification methods were applied, all of them giving similar results; one of the best results was achieved using the Gradient Boosting Classifier, with an accuracy of 0.832, F-measure of 0.646 and recall of 0.769.
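The reported accuracy, F-measure and recall are the standard binary-classification metrics; the sketch below shows how they are computed from churn labels (1 = churner, 0 = retained). This is a generic evaluation routine, not the paper's pipeline, and the toy labels are illustrative only.

```python
def churn_metrics(y_true, y_pred):
    """Accuracy, recall and F-measure for a binary churn labelling
    (1 = churner, 0 = retained)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F-measure is the harmonic mean of precision and recall
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, recall, f1
```

Note that, because F-measure is the harmonic mean of precision and recall, the reported values (F = 0.646, recall = 0.769) imply a precision of roughly 0.56, which is consistent with the paper's point that a high-looking accuracy need not translate into actionable churn predictions.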
An extension of the Integrated Simple Weighted Sum Product (WISP) method, customized for the application of single-valued neutrosophic numbers, is presented in this article. The extension is proposed to exploit the advantages that neutrosophic sets provide in solving complex decision-making problems, as well as decision-making problems involving assessments, prediction uncertainty, imprecision, and so on. In addition, an adapted questionnaire and appropriate linguistic variables are proposed to enable a simpler and more precise collection of respondents’ attitudes using single-valued neutrosophic numbers. An approach for deneutrosophication, i.e. the transformation of a single-valued neutrosophic number into a crisp number, is also proposed. The detailed use and characteristics of the presented improvement are shown on an example of the evaluation of rural tourist tours.
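To make deneutrosophication concrete: a single-valued neutrosophic number is a triple (T, I, F) of truth, indeterminacy and falsity degrees, each in [0, 1]. The sketch below uses one commonly cited score function, s = (2 + T − I − F)/3; the article proposes its own deneutrosophication formula, which may differ from this one.

```python
def deneutrosophy(t, i, f):
    """Map a single-valued neutrosophic number (T, I, F), each in [0, 1],
    to a crisp score in [0, 1] via the common score function
    s = (2 + T - I - F) / 3.  This is an illustrative choice; the
    article's own deneutrosophication approach may differ."""
    assert 0.0 <= t <= 1.0 and 0.0 <= i <= 1.0 and 0.0 <= f <= 1.0
    return (2.0 + t - i - f) / 3.0
```

Full truth (1, 0, 0) maps to 1, full falsity with full indeterminacy (0, 1, 1) maps to 0, and a maximally ambiguous (0.5, 0.5, 0.5) maps to 0.5.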
An image or volume of interest in positron emission tomography (PET) is reconstructed from gamma rays emitted by a radioactive tracer, which are captured and used to estimate the tracer’s location. The image or volume of interest is reconstructed by estimating the pixel or voxel values on a grid determined by the scanner. Such an approach is usually associated with limited reconstruction resolution, high computational complexity due to slow convergence, and noisy results.
This paper presents a novel method of PET image reconstruction based on the underlying assumption that the originals of interest can be modelled by Gaussian mixture models. The parameters are estimated from one-dimensional projections using an iterative algorithm resembling the expectation-maximization algorithm. This leads to a complex computational problem, which is resolved by a novel approach that utilizes ${L_{1}}$ minimization.
Aiming at an accurate restoration of Poissonian images that exhibit neat edges and no staircase effect, this article develops a novel hybrid nonconvex double-regularizer model. The proposed scheme combines the advantages of total variation with overlapping group sparsity and a nonconvex high-order total variation prior. The overlapping group sparsity is adopted to globally suppress staircase artifacts, while the nonconvex high-order regularization locally preserves significant image features and edge details. Computationally, an efficient alternating direction method of multipliers, combined with the iteratively reweighted ${\ell _{1}}$ algorithm and the majorization-minimization method, is employed to solve the optimization problem iteratively. Finally, extensive simulation experiments on recovering Poissonian images, including comparisons with several state-of-the-art restoration strategies, indicate the excellent performance of our model in terms of both visual effects and quantitative accuracy.
The software Randentropy is designed to estimate inequality in a random system in which several individuals interact, moving among many communities and producing dependent random quantities of an attribute. The overall inequality is assessed by computing the Random Theil’s Entropy. First, the software estimates a piecewise homogeneous Markov chain by identifying the change-points and the corresponding transition probability matrices. Second, it estimates the multivariate distribution function of the attribute using a copula-function approach and, finally, evaluates the expected value of the Random Theil’s Entropy through a Monte Carlo algorithm. Possible applications in the fields of finance and human mobility are discussed.
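At the core of this pipeline is the classical Theil T index; the Monte Carlo step averages it over simulated trajectories. A minimal sketch of the deterministic index for positive attribute values (not Randentropy's actual implementation) is:

```python
import math

def theil_index(x):
    """Theil's entropy (Theil T index) of positive attribute values x:
    T = (1/n) * sum_i (x_i / mu) * ln(x_i / mu), where mu is the mean.
    T = 0 means perfect equality; larger values mean more inequality."""
    n = len(x)
    mu = sum(x) / n
    return sum((xi / mu) * math.log(xi / mu) for xi in x) / n
```

Equal values give T = 0, while a concentrated distribution such as [1, 9] gives T ≈ 0.37; Randentropy's random Theil entropy is the expectation of this quantity under the estimated Markov/copula model.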
Present worth (PW) analysis is an important technique in engineering economics for investment analysis. The values of PW analysis parameters such as interest rate, first cost, salvage value and annual cash flow are generally estimated with some degree of uncertainty. In order to capture the vagueness in these parameters, fuzzy sets are often used in the literature. In this study, we introduce interval-valued intuitionistic fuzzy PW analysis and circular intuitionistic fuzzy PW analysis in order to handle the impreciseness in the estimation of PW analysis parameters. Circular intuitionistic fuzzy sets are the latest extension of intuitionistic fuzzy sets, defining the uncertainty of membership and non-membership degrees through a circle whose radius is r. Thus, we develop new fuzzy extensions of PW analysis that incorporate the uncertainty of membership functions. The methods are given step by step, and an application to water treatment device purchasing at a local municipality is illustrated to show their applicability. In addition, a multi-parameter sensitivity analysis is given. Finally, discussions and suggestions for future research are given in the conclusion section.
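The fuzzy extensions replace crisp parameters with (circular) intuitionistic fuzzy estimates, but the underlying crisp formula they generalize can be sketched as follows; the parameter names are generic, not taken from the paper.

```python
def present_worth(first_cost, annual_cash_flow, salvage_value,
                  interest_rate, n_years):
    """Crisp present worth of an investment:
    PW = -FC + sum_{t=1..n} A / (1 + i)^t + SV / (1 + i)^n,
    where FC is first cost, A the uniform annual cash flow,
    SV the salvage value, i the interest rate and n the horizon."""
    i = interest_rate
    pw = -first_cost
    for t in range(1, n_years + 1):
        pw += annual_cash_flow / (1 + i) ** t   # discounted annual flows
    pw += salvage_value / (1 + i) ** n_years    # discounted salvage value
    return pw
```

In the fuzzy versions each of FC, A, SV and i becomes a fuzzy number and the arithmetic is carried out with the corresponding fuzzy operations, after which the result can be defuzzified for ranking alternatives.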
The smallest enclosing circle is a well-known problem. In this paper, we propose modifications to speed up the existing Welzl's algorithm. We perform preprocessing to eliminate as many input points as possible. The reduction step has lower computational complexity than Welzl's algorithm and thus speeds up the overall computation. Next, we propose some changes to Welzl's algorithm itself. Finally, the results are summarized, showing a speed-up of up to 100 times for ${10^{6}}$ input points compared to the original Welzl's algorithm. Moreover, the proposed algorithm is capable of processing significantly larger data sets than the standard Welzl's algorithm.
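For context, the baseline being accelerated can be sketched as the standard incremental (move-to-front style) variant of Welzl's algorithm; the paper's preprocessing and modifications are not reproduced here.

```python
import random

def _circle_two(a, b):
    """Smallest circle through two points: diameter circle."""
    cx, cy = (a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0
    return cx, cy, ((a[0] - cx) ** 2 + (a[1] - cy) ** 2) ** 0.5

def _circle_three(a, b, c):
    """Circumcircle of three points via perpendicular bisectors."""
    d = 2.0 * (a[0] * (b[1] - c[1]) + b[0] * (c[1] - a[1])
               + c[0] * (a[1] - b[1]))
    if abs(d) < 1e-12:  # collinear: fall back to the widest pair
        return max((_circle_two(p, q) for p, q in ((a, b), (b, c), (a, c))),
                   key=lambda t: t[2])
    ux = ((a[0]**2 + a[1]**2) * (b[1] - c[1]) + (b[0]**2 + b[1]**2) * (c[1] - a[1])
          + (c[0]**2 + c[1]**2) * (a[1] - b[1])) / d
    uy = ((a[0]**2 + a[1]**2) * (c[0] - b[0]) + (b[0]**2 + b[1]**2) * (a[0] - c[0])
          + (c[0]**2 + c[1]**2) * (b[0] - a[0])) / d
    return ux, uy, ((a[0] - ux) ** 2 + (a[1] - uy) ** 2) ** 0.5

def _inside(c, p, eps=1e-9):
    return (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 <= (c[2] + eps) ** 2

def welzl(points):
    """Smallest enclosing circle (cx, cy, r) of 2D points,
    incremental Welzl-style algorithm with random shuffling."""
    pts = list(points)
    random.shuffle(pts)  # expected linear time relies on random order
    c = (pts[0][0], pts[0][1], 0.0)
    for i, p in enumerate(pts):
        if _inside(c, p):
            continue
        c = (p[0], p[1], 0.0)        # p must lie on the boundary
        for j, q in enumerate(pts[:i]):
            if _inside(c, q):
                continue
            c = _circle_two(p, q)    # p and q on the boundary
            for s in pts[:j]:
                if not _inside(c, s):
                    c = _circle_three(p, q, s)
    return c
```

A preprocessing step in the spirit of the paper would discard points that provably cannot lie on the final circle (e.g. points strictly inside some cheaply computed enclosing region) before running `welzl` on the remainder.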
During the COVID-19 pandemic, masks have become essential items for all people to protect themselves from the virus. Because multiple factors must be considered when selecting an antivirus mask, the decision-making process has become more complicated. This paper proposes an integrated approach that uses the F-BWM-RAFSI methods for the antivirus mask selection process with respect to the COVID-19 pandemic. Finally, a sensitivity analysis was performed by evaluating the effects of changing the weight coefficients of the criteria on the ranking results, simulating changes in the Heronian operator parameters, and comparing the obtained solution with other MCDM approaches to ensure its robustness.
In this paper, we propose modifications of the well-known particle swarm optimization (PSO) algorithm. These changes affect the mapping of particle motion from continuous space to binary space, which is widely used to solve the feature selection problem. The modified binary PSO variants were tested on the SVC2004 dataset, dedicated to the problem of user authentication based on dynamic features of a handwritten signature. Using the k-nearest neighbours (kNN) classifier as an example, experiments were carried out to find the optimal subset of features. The search for the subset was treated as a multicriteria optimization problem, taking into account both the accuracy of the model and the number of features.
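The continuous-to-binary mapping being modified is, in the classical binary PSO, a sigmoid transfer function: each velocity component is turned into a probability of setting the corresponding feature bit. The sketch below shows that baseline mapping (the paper's modifications to it are not reproduced); the `rng` parameter is only there to make the stochastic step controllable.

```python
import math
import random

def binarize_velocity(velocity, rng=random.random):
    """Classical binary-PSO transfer: each continuous velocity component
    v is mapped through the sigmoid s(v) = 1 / (1 + exp(-v)) to a
    probability, and the particle position bit (feature selected or not)
    is sampled from it."""
    bits = []
    for v in velocity:
        s = 1.0 / (1.0 + math.exp(-v))   # sigmoid value in (0, 1)
        bits.append(1 if rng() < s else 0)
    return bits
```

A strongly positive velocity component thus almost surely selects the feature, a strongly negative one almost surely drops it, and the kNN model is then fitted on the selected subset to score each particle.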
This paper models and solves the scheduling problem of cable manufacturing industries, minimizing the total production cost, including processing, setup, and storing costs. Two hybrid meta-heuristics, which combine simulated annealing and variable neighbourhood search algorithms with the tabu search algorithm, are proposed. Applying some case-based theorems and rules, a special initial solution with optimal setup cost is obtained for the algorithms. The computational experiments, including parameter tuning and final experiments on benchmarks obtained from a real cable manufacturing factory, show the superiority of the combination of tabu search and simulated annealing compared to the other proposed hybrid and classical meta-heuristics.