Search results: Found 2

Listing 1 - 2 of 2
Information Decomposition of Target Effects from Multi-Source Interactions

Authors: --- --- ---
ISBN: 9783038970156 9783038970163 Year: 2018 Pages: 336 DOI: 10.3390/books978-3-03897-016-3 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Mathematics --- Physics (General)
Added to DOAB on : 2018-09-04 13:22:10
License: CC BY-NC-ND


Abstract

Using Shannon information theory to analyse the contributions from two source variables to a target, for example, we can measure the information held by one source about the target, the information held by the other source about the target, and the information held by those sources together about the target. Intuitively, however, there is a strong desire to measure further notions of how this directed information interaction may be decomposed, e.g., how much information the two source variables hold redundantly about the target, how much each source variable holds uniquely, and how much information can only be discerned by synergistically examining the two sources together.

The absence of measures for such decompositions into redundant, unique and synergistic information is arguably the most fundamental missing piece in classical information theory. Triggered by the formulation of the Partial Information Decomposition framework by Williams and Beer in 2010, the past few years have witnessed a concentration of work by the community in proposing, contrasting, and investigating new measures to capture these notions of information decomposition.

This Special Issue seeks to bring together these efforts, to capture a snapshot of the current research, to provide impetus for, and focused scrutiny of, newer work, to present progress to the wider community, and to attract further research. Our contributions present: several new approaches for measures of such decompositions; commentary on properties, interpretations and limitations of such approaches; and applications to empirical data (in particular to neural data).
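For reference, the two-source Partial Information Decomposition described in the abstract is usually written as the following set of identities (standard Williams-Beer framework; Red, Unq and Syn are illustrative labels for the redundant, unique and synergistic terms):

I(X_1, X_2; Y) = Red + Unq(X_1) + Unq(X_2) + Syn
I(X_1; Y) = Red + Unq(X_1)
I(X_2; Y) = Red + Unq(X_2)

Only the mutual-information terms on the left-hand sides are defined by classical Shannon theory; the four partial terms become well defined only once an additional redundancy (or synergy) measure is adopted, which is precisely the open problem the Special Issue addresses.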

Transfer Entropy

Author:
ISBN: 9783038429197 9783038429203 Year: 2018 Pages: VIII, 326 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Computer Science
Added to DOAB on : 2018-08-24 17:15:19
License: CC BY-NC-ND


Abstract

Statistical relationships among the variables of a complex system reveal a lot about its physical behavior. Therefore, identification of the relevant variables and characterization of their interactions are crucial for a better understanding of a complex system. Linear methods, such as correlation, are widely used to identify these relationships. However, information-theoretic quantities, such as mutual information and transfer entropy, have proven to be superior in the case of nonlinear dependencies. Mutual information quantifies the amount of information obtained about one random variable through the other random variable, and it is symmetric. As an asymmetrical measure, transfer entropy quantifies the amount of directed (time-asymmetric) transfer of information between random processes and, thus, it is related to concepts such as Granger causality. This Special Issue includes 16 papers elucidating the state of the art of data-based transfer entropy estimation techniques and applications, in areas such as finance, biomedicine, fluid dynamics and cellular automata. Analytical derivations in special cases, improvements on the estimation methods and comparisons between certain techniques are some of the other contributions of this Special Issue. The diversity of approaches and applications makes this book unique as a single source of invaluable contributions from experts in the field.
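For reference, the two quantities contrasted in the abstract are commonly defined (following Schreiber's formulation of transfer entropy for discrete processes X and Y) as

I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}

T_{X \to Y} = \sum p(y_{t+1}, y_t^{(k)}, x_t^{(l)}) \log \frac{p(y_{t+1} \mid y_t^{(k)}, x_t^{(l)})}{p(y_{t+1} \mid y_t^{(k)})}

where y_t^{(k)} and x_t^{(l)} denote the k and l most recent past values of Y and X. The first expression is symmetric in X and Y; the second is not, since in general T_{X \to Y} differs from T_{Y \to X}, which is what makes transfer entropy a directed, Granger-causality-like measure.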
