Journal: Informatica
Volume 22, Issue 4 (2011), pp. 507–520
Abstract
Most classical visualization methods, including multidimensional scaling and its particular case, Sammon's mapping, encounter difficulties when analyzing large data sets. One possible way to solve this problem is the application of artificial neural networks. This paper presents the visualization of large data sets using the feed-forward neural network SAMANN. This backpropagation-like learning rule has been developed to allow a feed-forward artificial neural network to learn Sammon's mapping in an unsupervised way. In its original form, SAMANN training is computationally expensive. In this paper, we identify conditions that reduce the computational cost of visualizing even large data sets. It is shown that the original dimensionality of the data can be reduced to a lower one using a small number of iterations. Visualization results for real-world data sets are presented.
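As background for the mapping that the SAMANN network learns, the following is a minimal sketch of Sammon's stress for a candidate low-dimensional projection; the function name, array shapes, and the use of NumPy are illustrative assumptions, not code from the paper.

    import numpy as np

    def sammon_stress(X, Y, eps=1e-12):
        """Sammon's stress between pairwise distances in the original
        space X (n x d) and the projected space Y (n x m)."""
        n = X.shape[0]
        # pairwise Euclidean distances in both spaces
        dX = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
        dY = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1))
        iu = np.triu_indices(n, k=1)          # count each pair once
        dx, dy = dX[iu], dY[iu]
        return ((dx - dy) ** 2 / (dx + eps)).sum() / (dx.sum() + eps)

SAMANN minimizes a stress of this form by adjusting the network weights with a backpropagation-like rule, so that the network outputs Y reproduce the distance structure of X without any target values, i.e., in an unsupervised way.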
Journal: Informatica
Volume 21, Issue 3 (2010), pp. 339–348
Abstract
In this paper, some issues of fundamental classical mechanics theory, in the sense of Ising physics, are introduced into the applied neural network area. The expansion of neural network theory is based primarily on introducing the Hebb postulate into mean field theory as an instrument for the analysis of complex systems. Appropriate propositions and a theorem with proofs are formulated. In addition, some computational background is presented and discussed.
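For reference, the classical Hebb postulate that the paper builds on stores P binary patterns in a symmetric coupling matrix. The sketch below is a generic illustration of that standard rule under the usual Ising convention (entries in {-1, +1}); it is not code from the paper.

    import numpy as np

    def hebb_couplings(patterns):
        """Hebb rule: J_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, zero diagonal.
        patterns: array of shape (P, N) with entries in {-1, +1}."""
        P, N = patterns.shape
        J = patterns.T @ patterns / N
        np.fill_diagonal(J, 0.0)       # no self-couplings
        return J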
Journal: Informatica
Volume 20, Issue 4 (2009), pp. 477–486
Abstract
In this paper, neural network theory based on the assumptions of the Ising model is considered. Indirect couplings, Dirac distributions and the corrected Hebb rule are introduced and analyzed. The patterns memorized in the neural network and the indirect couplings are treated as random. Apart from the full theory based on Dirac distributions, the simplified stationary mean field equations and their solutions, which take into account the ergodicity of the average overlap and the indirect order parameter, are presented. Modeling results are presented that corroborate the theoretical statements and their applied aspects.
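As a point of reference for the stationary mean field description, the simplest overlap equation of Hopfield-type mean field theory, m = tanh(beta * m), can be solved by fixed-point iteration. The sketch below illustrates that generic equation only; the paper's equations additionally involve indirect couplings and the indirect order parameter, which are not modeled here.

    import numpy as np

    def mean_field_overlap(beta, m0=0.5, tol=1e-10, max_iter=10000):
        """Fixed-point iteration for the stationary overlap m = tanh(beta * m)."""
        m = m0
        for _ in range(max_iter):
            m_new = np.tanh(beta * m)
            if abs(m_new - m) < tol:
                break
            m = m_new
        return m

    # Below the critical temperature (beta > 1) a nonzero overlap appears.
    print(mean_field_overlap(beta=1.5))   # roughly 0.86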
Journal: Informatica
Volume 12, Issue 1 (2001), pp. 101–108
Abstract
This paper considers some aspects of using a cascade-correlation network in the investment task, in which the most suitable project for investing money must be determined. This is one of the most frequently encountered economic tasks. Various bibliographical sources on economics describe different methods of choosing investment projects. However, they all use either one or only a few criteria, i.e., only the most valuable criteria are selected from the full set. As a result, much of the information contained in the other criteria is lost. A neural network makes it possible to avoid this loss of information. It accumulates information and helps to achieve better results in choosing an investment project in comparison with classical methods. The cascade-correlation network architecture used in this paper was developed by Scott E. Fahlman and Christian Lebiere at Carnegie Mellon University.
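As a reminder of what distinguishes the architecture, cascade-correlation grows the network one hidden unit at a time: each candidate unit is trained to maximize the magnitude of the correlation between its output and the residual error of the current network, then frozen. The sketch below illustrates that standard correlation score in a generic form; the variable names and shapes are assumptions and it is not taken from the paper.

    import numpy as np

    def candidate_correlation(v, residuals):
        """Cascade-correlation candidate score
        S = sum_o | sum_p (v_p - mean(v)) * (E_{p,o} - mean(E_o)) |,
        where v holds the candidate unit's output per training pattern (shape (P,))
        and residuals are the current network output errors (shape (P, n_outputs))."""
        v_centered = v - v.mean()
        e_centered = residuals - residuals.mean(axis=0)
        return np.abs(v_centered @ e_centered).sum()

The candidate with the highest score is installed as a new hidden unit, which lets the network accumulate information from all input criteria rather than a preselected few.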
Journal: Informatica
Volume 2, Issue 2 (1991), pp. 221–232
Abstract
The principles of a neural network environmental model are proposed. The principles are universal and can be used with different neural network architectures. Such a model is self-organizing; it can operate both with and without a teacher. It encodes information about objects, their features, and the actions operating in the environment, and analyzes concrete situations. Functions for making an action plan and for action control are provided. The goal of the model is supplied externally. The model has more than sixteen active regimes. The neural network environmental model has been implemented in software and hardware.