Impact Factor 2024: 0.4
Fundamenta Informaticae is an international journal publishing original research results in all areas of theoretical computer science. Papers are encouraged that contribute:
- solutions by mathematical methods of problems emerging in computer science
- solutions of mathematical problems inspired by computer science.
Topics of interest include (but are not restricted to): theory of computing, complexity theory, algorithms and data structures, computational aspects of combinatorics and graph theory, programming language theory, theoretical aspects of programming languages, computer-aided verification, computer science logic, database theory, logic programming, automated deduction, formal languages and automata theory, concurrency and distributed computing, cryptography and security, theoretical issues in artificial intelligence, machine learning, pattern recognition, algorithmic game theory, bioinformatics and computational biology, quantum computing, probabilistic methods, and algebraic and categorical methods.
Authors: Czaja, Ludwik | Penczek, Wojciech | Stencel, Krzysztof
Article Type: Other
DOI: 10.3233/FI-2016-1401
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. i-iii, 2016
Authors: Alsolami, Fawaz | Amin, Talha | Chikalov, Igor | Moshkov, Mikhail | Zielosko, Beata
Article Type: Research Article
Abstract: This paper considers an application of a dynamic programming approach to the optimization of association rules from the point of view of knowledge representation. The association rule set is optimized in two stages: first for minimum cardinality and then for minimum length of rules. Experimental results present the cardinality of the set of association rules constructed for an information system and a lower bound on the minimum possible cardinality of the rule set, based on information obtained during the algorithm's work, as well as the results obtained for length.
Keywords: association rules, decision rules, dynamic programming, set cover problem, rough sets
DOI: 10.3233/FI-2016-1402
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 159-171, 2016
Authors: Barbuti, Roberto | Gori, Roberta | Levi, Francesca | Milazzo, Paolo
Article Type: Research Article
Abstract: Reaction systems are a qualitative formalism for modeling systems of biochemical reactions characterized by the non-permanency of the elements: molecules disappear if not produced by any enabled reaction. Reaction systems execute in an environment that provides new molecules at each step. Brijder, Ehrenfeucht and Rozenberg introduced the idea of predictors. A predictor of a molecule s, for a given n, is the set of molecules to be observed in the environment to determine whether s is produced or not at step n by the system. We introduced the notion of formula-based predictor, that is, a propositional logic formula that precisely characterizes environments that lead to the production of s after n steps. In this paper we revise the notion of formula-based predictor by defining a specialized version that assumes the environment to provide molecules according to what is expressed by a temporal logic formula. As an application, we use specialized formula-based predictors to give theoretical grounds to previously obtained results on a model of gene regulation.
DOI: 10.3233/FI-2016-1403
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 173-191, 2016
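The reaction-system semantics summarized in the abstract above can be illustrated with a minimal sketch (the function and reaction names below are ours, not from the paper): a reaction fires when all of its reactants are present and none of its inhibitors is, and the next state consists only of the products of the enabled reactions plus what the environment supplies, reflecting the non-permanency of molecules.

```python
def step(state, reactions, env):
    """One step of a reaction system: state and env are sets of
    molecules; reactions are (reactants, inhibitors, products) triples.
    Molecules are non-permanent: only products of enabled reactions
    survive into the next state."""
    current = state | env
    result = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= current and not (inhibitors & current):
            result |= products
    return result

# Toy example: s is produced from a unless b inhibits the reaction.
reactions = [({"a"}, {"b"}, {"s"})]
print(step(set(), reactions, {"a"}))        # {'s'}
print(step(set(), reactions, {"a", "b"}))   # set()
```

Note how a molecule present in the state but not re-produced (and not supplied by the environment) simply vanishes at the next step.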
Authors: Buregwa-Czuma, Sylwia | Bazan, Jan G. | Zareba, Lech | Bazan-Socha, Stanislawa | Rewerska, Barbara | Pardel, Przemyslaw | Dydo, Lukasz
Article Type: Research Article
Abstract: Decision making depends on the perception of the world and the proper identification of objects. Perception can be modified by various factors that alter the way an object is perceived even though the object itself is not changed (e.g., in the perception of a medical condition, such factors can be drugs or diet). The purpose of this research is to study how such disturbing factors can influence perception, by introducing a description of the rules governing these changes. We propose a method for evaluating the effect of additional therapy in patients with coronary heart disease based on a tree of impact. The leaves of the tree provide cross-decision rules of perception changes, which can be suggested as a solution to the problem of predicting changes in perception. The problems considered in this paper are associated with the design of classifiers which allow the perception of an object in the context of information related to the decision attribute.
Keywords: classification, perception interference, cross-decision rules, tree of impact
DOI: 10.3233/FI-2016-1404
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 193-207, 2016
Authors: Czaja, Ludwik
Article Type: Research Article
Abstract: Two observations are made on the pictorial as well as formal presentation of some consistency notions in distributed shared memory. The first concerns a geometric transformation of line segments and points picturing read/write operations; the second concerns converting a partial order of the operations into a linear order of their initiations and terminations. This allows serialization of the read/write operations as a whole to be reduced to permutations of their beginnings and ends. Some draft proposals are introduced.
DOI: 10.3233/FI-2016-1405
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 209-221, 2016
Authors: Grabowski, Adam
Article Type: Research Article
Abstract: Rough sets offer a well-known approach to incomplete or imprecise data. In the paper I briefly report how this framework was successfully encoded by means of one of the leading computer proof-assistants in the world. The general approach is essentially based on binary relations, and all natural properties of approximation operators can be obtained via adjectives added to the underlying relations. I focus on lattice-theoretical aspects of rough sets to enable the application of external theorem provers like EQP or Prover9, as well as to translate them into the TPTP format widely recognized in the world of automated proof search. I wanted a clearly written paper, formal where possible although informal as a rule, authored by a specialist from a discipline other than lattice theory. It appeared that Lattice theory for rough sets by Jouni Järvinen (LTRS for short) was quite a reasonable choice to be a testbed for the current formalisation both of lattices and of rough sets. The popular computerised proof-assistant Mizar was used as a tool, hence all the efforts are available in one of the largest repositories of computer-checked mathematical knowledge, the Mizar Mathematical Library.
DOI: 10.3233/FI-2016-1406
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 223-240, 2016
Authors: Kopczyński, Maciej | Grześ, Tomasz | Stepaniuk, Jarosław
Article Type: Research Article
Abstract: This paper presents an FPGA and softcore CPU based device for core calculation on large datasets using rough set methods. The presented architectures have been tested on two real datasets by downloading and running the solutions inside the FPGA. The tested datasets had from 1 000 to 10 000 000 objects. The same operations were performed in a software implementation. The obtained results show a substantial acceleration in computation time when using hardware-supported core generation in comparison to the pure software implementation.
DOI: 10.3233/FI-2016-1407
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 241-259, 2016
Authors: Nguyen, Linh Anh
Article Type: Research Article
Abstract: We present the first direct tableau decision procedure for graded PDL, which uses global caching and has ExpTime (optimal) complexity when numbers are encoded in unary. It shows how to combine checking fulfillment of existential star modalities with integer linear feasibility checking for tableaux with global caching. As graded PDL can be used as a description logic for representing and reasoning about terminological knowledge, our procedure is useful for practical applications.
DOI: 10.3233/FI-2016-1408
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 261-288, 2016
Authors: Niewiadomski, Artur | Skaruz, Jaroslaw | Switalski, Piotr | Penczek, Wojciech
Article Type: Research Article
Abstract: The paper deals with the concrete planning problem – a stage of the web service composition in the PlanICS framework. We present several known and new methods of concrete planning including those based on Satisfiability Modulo Theories (SMT), Genetic Algorithm (GA), as well as methods combining SMT with GA and other nature-inspired algorithms such as Simulated Annealing (SA) and Generalised Extremal Optimization (GEO). The discussion of all the approaches is supported by the complexity analysis, extensive experimental results, and illustrated by a running example.
Keywords: Web Service Composition, Concrete Planning, PlanICS, SMT, Genetic Algorithm, Hybrid Algorithm, Simulated Annealing, GEO
DOI: 10.3233/FI-2016-1409
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 289-313, 2016
Authors: Podymov, Vladislav
Article Type: Research Article
Abstract: For many program analysis problems it is useful to have means to efficiently prove that given programs have similar (equivalent) behaviors. Unfortunately, in most cases proving behavioral equivalence is an undecidable problem. A common way to overcome such undecidability is to consider a model of programs with an abstract semantics based on the real one, in which only some simple properties are captured, and to provide an efficient equivalence-checking algorithm for the model. We focus on two kinds of properties of data-modifying statements of imperative programs. Statements a and b are commutative if the executions of the sequences ab and ba lead to the same result. A statement b is (left-)absorptive for a statement a if the executions of the sequences ab and b lead to the same result. We consider propositional program models in which commutativity and absorption properties are captured (CA-models). Formally, data states for a CA-model are elements of a monoid over the set of statement symbols, defined by an arbitrary set of relations of the form ab = ba (for commutativity) and ab = b (for absorption). We propose an equivalence-checking algorithm for CA-models based on (what we call) progressive monoids. The algorithm terminates in time polynomial in the size of programs. As a consequence, we prove polynomial-time decidability of the equivalence problem in such CA-models.
Keywords: program models, equivalence checking, semigroups, commutativity, left absorption
DOI: 10.3233/FI-2016-1410
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 315-336, 2016
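As a rough illustration of the commutativity (ab = ba) and absorption (ab = b) relations described in the abstract above, two short statement sequences can be compared by rewriting both to a common normal form. This is a naive rewriting sketch on toy relation sets, not the progressive-monoid algorithm from the paper, and it makes no claim about confluence for arbitrary relation sets.

```python
def normalize(word, commuting, absorbing):
    """Rewrite a string of statement symbols to a normal form:
    commuting/absorbing are sets of ordered pairs (a, b) meaning
    ab = ba (commutativity) resp. ab = b (left absorption)."""
    w = list(word)
    changed = True
    while changed:
        changed = False
        # absorption: drop a in front of b whenever ab = b
        for i in range(len(w) - 1):
            if (w[i], w[i + 1]) in absorbing:
                del w[i]
                changed = True
                break
        if changed:
            continue
        # commutativity: sort commuting neighbours alphabetically
        for i in range(len(w) - 1):
            a, b = w[i], w[i + 1]
            if a > b and ((a, b) in commuting or (b, a) in commuting):
                w[i], w[i + 1] = b, a
                changed = True
                break
    return "".join(w)

commuting = {("a", "b")}   # ab = ba
absorbing = {("c", "a")}   # ca = a
print(normalize("ba", commuting, absorbing))   # "ab"
print(normalize("cab", commuting, absorbing))  # "ab"
```

Under these toy relations the sequences ba and cab rewrite to the same normal form ab, so a checker of this kind would declare them equivalent.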
Authors: Polkowski, Lech T. | Nowak, Bartosz
Article Type: Research Article
Abstract: In this work, we approach the problem of data analysis from a new angle: we investigate a relational method of separation of data into disjoint sub-data employing a modified betweenness relation, successfully applied by us in the area of behavioral robotics, and we set out a scheme for applications to be studied. The effect of the action of that relation on data is the selection of a sub-data, say, a 'kernel', with the property that each thing in it is a convex combination, in a sense explained below, of some other things in the kernel. One can say that the kernel thus exhibited is 'self-closed'. Algorithmically, this is achieved by means of a new construct, called by us a 'dual indiscernibility matrix'. On the other hand, the complement to the kernel consists of things in the data which have some attribute values not met in any other thing. It is proper to call this complement to the kernel the residuum. We examine both the kernel and the residuum from the point of view of quality of classification into decision classes for a few standard data sets from the UC Irvine Repository, finding the results very satisfactory. Conceptually, our work is set in the framework of rough set theory and rough mereology, and the main tool in inducing the betweenness relation is the Łukasiewicz rough inclusion. Apart from the classification problem, we propose some strategies for conflict resolution based on concepts introduced in this work, and in this way we continue the conflict analysis in the rough set framework initiated by Zdzisław Pawlak.
Keywords: rough inclusion, betweenness, Euclidean representation of a granule, hyper–granules, coalitions, conflict resolutions, classifier synthesis
DOI: 10.3233/FI-2016-1411
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 337-352, 2016
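The residuum criterion stated in the abstract above (a thing belongs to the residuum when it has some attribute value not met in any other thing) is directly computable. The following sketch, with made-up data and function names, implements only that complement-based split; the convex-combination characterization of the kernel and the dual indiscernibility matrix are not modeled here.

```python
from collections import Counter

def split_kernel_residuum(data):
    """data: list of equal-length attribute-value tuples.
    A row goes to the residuum if some attribute value of it
    occurs in no other row; the rest form the kernel."""
    n_attrs = len(data[0])
    # occurrence counts of each value, per attribute column
    counts = [Counter(row[j] for row in data) for j in range(n_attrs)]
    residuum = [row for row in data
                if any(counts[j][row[j]] == 1 for j in range(n_attrs))]
    kernel = [row for row in data if row not in residuum]
    return kernel, residuum

data = [(1, "x"), (1, "y"), (2, "x")]
kernel, residuum = split_kernel_residuum(data)
print(residuum)  # [(1, 'y'), (2, 'x')] -- unique 'y', unique 2
print(kernel)    # [(1, 'x')] -- every value shared with another row
```

Every attribute value of a kernel row recurs somewhere else in the data, which is the precondition for the 'self-closed' kernel structure the paper builds on.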
Authors: Przybyła-Kasperek, Małgorzata
Article Type: Research Article
Abstract: Issues related to decision making based on dispersed knowledge are discussed in this paper. A dispersed decision-making system that was proposed in an earlier paper of the author is used here. In the system, the process of combining classifiers into coalitions is very important, and negotiation is applied in the clustering process. The main aim of the article is to compare the results obtained using five different methods of conflict analysis in the system. All of these methods are used when the individual classifiers generate probability vectors over decision classes. The most popular methods are considered: a sum rule, a product rule, a median rule, a maximum rule and a minimum rule. An additional aim is to compare the results obtained using the dispersed decision-making system with the results obtained when the prediction results are aggregated directly using the conflict analysis methods. Tests performed on data from the UCI repository are presented in the paper. The best methods in particular situations are also indicated. It was found that some methods do not generate satisfactory results when there are dummy agents in a dispersed data set, that is, undecided agents who assign the same probability value to many different decision values. Another conclusion was that the use of a dispersed system improves the efficiency of inference.
Keywords: decision support system, dispersed knowledge, conflict analysis, sum rule, product rule, median rule, maximum rule, minimum rule
DOI: 10.3233/FI-2016-1412
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 353-370, 2016
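The five conflict-analysis (fusion) rules compared in the abstract above can be sketched in a few lines: each classifier emits a probability vector over decision classes, the vectors are combined coordinate-wise, and the class with the highest combined score is chosen. The helper name fuse and the example vectors are ours, for illustration only.

```python
import math
from statistics import median

def fuse(prob_vectors, rule):
    """Combine classifier probability vectors coordinate-wise with
    the chosen rule and return the index of the winning class."""
    rules = {"sum": sum, "product": math.prod,
             "median": median, "max": max, "min": min}
    agg = [rules[rule](col) for col in zip(*prob_vectors)]
    return agg.index(max(agg))

# One confident classifier against two mildly opposed ones:
votes = [[0.9, 0.1], [0.45, 0.55], [0.45, 0.55]]
print(fuse(votes, "sum"))     # 0 -- the confident vote dominates
print(fuse(votes, "median"))  # 1 -- the majority's preference wins
```

The example shows why the rules can disagree: the sum rule rewards one strong opinion, while the median rule follows the majority, which is exactly the kind of behavioral difference the paper's comparison examines.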
Authors: Skowron, Andrzej | Jankowski, Andrzej
Article Type: Research Article
Abstract: In several papers we have discussed a computing model, called the Interactive Granular Computing (IGrC), for interactive computations on complex granules. In this paper, we compare two models of computing, namely the Turing model and the IGrC model.
Keywords: granular computing, rough set, interaction, information granule, physical object, complex granule, interactive granular computing
DOI: 10.3233/FI-2016-1413
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 371-385, 2016