Search results: Found 4

Information Decomposition of Target Effects from Multi-Source Interactions

Authors: --- --- ---
ISBN: 9783038970156 9783038970163 Year: Pages: 336 DOI: 10.3390/books978-3-03897-016-3 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Mathematics --- Physics (General)
Added to DOAB on : 2018-09-04 13:22:10

Abstract

Using Shannon information theory to analyse, for example, the contributions from two source variables to a target, we can measure the information held by one source about the target, the information held by the other source about the target, and the information held by both sources together about the target. Intuitively, however, there is a strong desire to measure further notions of how this directed information interaction may be decomposed, e.g., how much information the two source variables hold redundantly about the target, how much each source variable holds uniquely, and how much information can only be discerned by synergistically examining the two sources together. The absence of measures for such decompositions into redundant, unique and synergistic information is arguably the most fundamental missing piece in classical information theory. Triggered by the formulation of the Partial Information Decomposition framework by Williams and Beer in 2010, the past few years have witnessed a concentration of work by the community in proposing, contrasting, and investigating new measures to capture these notions of information decomposition. This Special Issue seeks to bring together these efforts, to capture a snapshot of the current research, as well as to provide impetus for and focused scrutiny on newer work, to present progress to the wider community, and to attract further research. Our contributions present: several new approaches for measures of such decompositions; commentary on the properties, interpretations and limitations of such approaches; and applications to empirical data (in particular to neural data).
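As a concrete illustration of synergy (a minimal sketch of ours, not drawn from the book; the helper H and the variable names are our own), consider the XOR relationship T = X1 XOR X2 with uniform inputs: each source alone carries zero Shannon information about the target, yet the two together determine it completely.

import numpy as np
from collections import Counter

def H(*cols):
    # Joint Shannon entropy (in bits) of one or more discrete sequences.
    joint = list(zip(*cols))
    p = np.array([c / len(joint) for c in Counter(joint).values()])
    return -np.sum(p * np.log2(p))

# Exhaustive truth table for T = X1 XOR X2 with uniformly distributed inputs.
x1 = [0, 0, 1, 1]
x2 = [0, 1, 0, 1]
t  = [a ^ b for a, b in zip(x1, x2)]

# Mutual information via the identity I(X;T) = H(X) + H(T) - H(X,T).
print(H(x1) + H(t) - H(x1, t))            # I(X1;T) = 0 bits
print(H(x2) + H(t) - H(x2, t))            # I(X2;T) = 0 bits
print(H(x1, x2) + H(t) - H(x1, x2, t))    # I(X1,X2;T) = 1 bit

In a Williams-and-Beer-style decomposition this 1 bit is attributed entirely to the synergistic term, since redundancy and the unique components cannot be present when the individual mutual informations both vanish.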

Information-based methods for neuroimaging: analyzing structure, function and dynamics

Authors: --- ---
Book Series: Frontiers Research Topics ISSN: 16648714 ISBN: 9782889195022 Year: Pages: 191 DOI: 10.3389/978-2-88919-502-2 Language: English
Publisher: Frontiers Media SA
Subject: Neurology --- Science (General)
Added to DOAB on : 2015-12-03 13:02:24

Abstract

The aim of this Research Topic is to discuss the state of the art in the use of information-based methods for the analysis of neuroimaging data. Information-based methods, typically built as extensions of the Shannon entropy, are at the basis of model-free approaches which, being based on probability distributions rather than on specific expectations, can account for all possible non-linearities present in the data in a model-independent fashion. Mutual-information-like methods can also be applied to interacting dynamical variables described by time series, thus addressing the uncertainty reduction (or information) in one variable obtained by conditioning on another set of variables. In recent years, different information-based methods have been shown to be flexible and powerful tools for analyzing neuroimaging data, with a wide range of different methodologies, including formulations based on bivariate vs. multivariate representations, frequency vs. time domains, etc. Apart from methodological issues, the information bit as a common unit represents a convenient way to open the road to comparison and integration between different measurements of neuroimaging data in three complementary contexts: structural connectivity, dynamical (functional and effective) connectivity, and modelling of brain activity. Applications are ubiquitous, ranging from the resting state in healthy subjects to modulations of consciousness and other aspects of pathophysiology. Mutual-information-based methods have provided new insights about common principles in brain organization, showing the existence of an active default network when the brain is at rest. It is not clear, however, how this default network is generated, how its different modules interact, or how it disappears in the presence of stimulation. Some of these open questions at the functional level might find their mechanisms in their structural correlates. A key question is the link between structure and function, and the use of structural priors for the understanding of functional connectivity measures. As far as effective connectivity is concerned, a common framework has recently been proposed for transfer entropy and Granger causality, a well-established methodology originally based on autoregressive models. This framework can open the way to new theories and applications. This Research Topic brings together contributions from researchers with different backgrounds who are either developing new approaches or applying existing methodologies to new data, and we hope it will set the basis for discussing the development and validation of new information-based methodologies for the understanding of brain structure, function, and dynamics.
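The model-free point made above can be seen in a toy case (a hedged sketch of ours, not taken from the Research Topic; bin count, sample size and the coupling y = x^2 are arbitrary illustrative choices): a plug-in, histogram-based mutual information estimate detects a purely nonlinear coupling that the linear correlation coefficient misses entirely.

import numpy as np

def binned_mi(x, y, bins=16):
    # Plug-in mutual information estimate (in bits) from a 2-D histogram.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) on empty cells
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = x**2 + 0.1 * rng.normal(size=50_000)   # nonlinear dependence, ~zero correlation

print(np.corrcoef(x, y)[0, 1])   # close to 0: the linear method misses the coupling
print(binned_mi(x, y))           # clearly positive: MI detects it

The same plug-in construction extends to conditional variants on time series, which is the route taken toward the transfer entropy and Granger causality framework mentioned in the abstract.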

Transfer Entropy

Author:
ISBN: 9783038429197 9783038429203 Year: Pages: VIII, 326 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Computer Science
Added to DOAB on : 2018-08-24 17:15:19

Abstract

Statistical relationships among the variables of a complex system reveal a lot about its physical behavior. Identification of the relevant variables and characterization of their interactions are therefore crucial for a better understanding of a complex system. Linear methods, such as correlation, are widely used to identify these relationships. However, information-theoretic quantities, such as mutual information and transfer entropy, have proven superior in the case of nonlinear dependencies. Mutual information quantifies the amount of information obtained about one random variable through the other, and it is symmetric. As an asymmetric measure, transfer entropy quantifies the amount of directed (time-asymmetric) transfer of information between random processes and is thus related to concepts such as Granger causality. This Special Issue includes 16 papers elucidating the state of the art of data-based transfer entropy estimation techniques and applications in areas such as finance, biomedicine, fluid dynamics and cellular automata. Analytical derivations in special cases, improvements on the estimation methods and comparisons between certain techniques are some of the other contributions of this Special Issue. The diversity of approaches and applications makes this book unique as a single source of invaluable contributions from experts in the field.
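The asymmetry mentioned above can be made concrete with a small plug-in estimate (our own illustrative sketch with history length 1 and arbitrary coupling parameters, not an implementation from the book). It uses the identity TE_{X->Y} = I(Y_{t+1}; X_t | Y_t) = H(Y_t,X_t) + H(Y_{t+1},Y_t) - H(Y_{t+1},Y_t,X_t) - H(Y_t): a process Y that noisily copies X shows substantial information transfer from X to Y and essentially none in the reverse direction.

import numpy as np
from collections import Counter

def H(*cols):
    # Joint Shannon entropy (in bits) of one or more discrete sequences.
    joint = list(zip(*cols))
    p = np.array([c / len(joint) for c in Counter(joint).values()])
    return -np.sum(p * np.log2(p))

def transfer_entropy(x, y):
    # TE_{X->Y} with history length 1, via the entropy identity above.
    yt1, yt, xt = y[1:], y[:-1], x[:-1]
    return H(yt, xt) + H(yt1, yt) - H(yt1, yt, xt) - H(yt)

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)                 # driver: i.i.d. fair coin flips
y = np.empty_like(x)
y[0] = 0
flip = rng.random(len(x) - 1) < 0.1            # 10% transmission noise
y[1:] = np.where(flip, 1 - x[:-1], x[:-1])     # y copies x with a one-step delay

print(transfer_entropy(x, y))   # ~0.53 bits: strong X -> Y transfer
print(transfer_entropy(y, x))   # ~0 bits: no transfer back, since x is i.i.d.

The ~0.53 bits matches 1 - H_b(0.1), the capacity of the noisy copy; the symmetric mutual information I(X_t; Y_{t+1}) alone could not tell the two directions apart.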

Entropy Measures for Data Analysis: Theory, Algorithms and Applications

Author:
ISBN: 9783039280322 9783039280339 Year: Pages: 260 DOI: 10.3390/books978-3-03928-033-9 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Technology (General) --- General and Civil Engineering --- Environmental Engineering
Added to DOAB on : 2020-01-07 09:21:22

Abstract

Entropies and entropy-like quantities play an increasing role in modern non-linear data analysis. Fields that benefit from their application range from biosignal analysis to econophysics and engineering. This issue is a collection of papers touching on different aspects of entropy measures in data analysis, as well as theoretical and computational analyses. The relevant topics include the difficulty of achieving adequate application of entropy measures and acceptable parameter choices for those measures; entropy-based coupling and similarity analysis; and the use of entropy measures as features in automatic learning and classification. Various real-data applications are given.
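As one concrete instance of the measures discussed (a minimal sketch of ours based on the standard Bandt-Pompe definition, with arbitrary order m and delay tau; not code from the book), permutation entropy summarizes a signal by the distribution of ordinal patterns in short windows and cleanly separates irregular from regular dynamics. The parameter-choice difficulty the abstract mentions is visible here too: the result depends on m and tau.

import numpy as np
from math import factorial
from collections import Counter

def permutation_entropy(x, m=3, tau=1):
    # Bandt-Pompe permutation entropy: Shannon entropy of the ordinal
    # patterns of m-point windows, normalized by log2(m!) to lie in [0, 1].
    n = len(x) - (m - 1) * tau
    patterns = [tuple(np.argsort(x[i : i + m * tau : tau])) for i in range(n)]
    p = np.array(list(Counter(patterns).values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p)) / np.log2(factorial(m))

rng = np.random.default_rng(2)
noise = rng.normal(size=5_000)                      # white noise
wave = np.sin(np.linspace(0, 20 * np.pi, 5_000))    # regular oscillation

print(permutation_entropy(noise))   # close to 1: all ordinal patterns occur equally
print(permutation_entropy(wave))    # much lower: a few patterns dominate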

Keywords

design of experiments --- empirical mode decomposition --- signal analysis --- similarity indices --- synchronization analysis --- auditory attention --- entropy measure --- linear discriminant analysis (LDA) --- support vector machine (SVM) --- auditory attention classifier --- electroencephalography (EEG) --- vague entropy --- distance induced vague entropy --- distance --- complex fuzzy set --- complex vague soft set --- entropy --- entropy visualization --- entropy balance equation --- Shannon-type relations --- multivariate analysis --- machine learning evaluation --- data transformation --- sample entropy --- treadmill walking --- center of pressure displacement --- dual-tasking --- analog circuit --- fault diagnosis --- cross wavelet transform --- Tsallis entropy --- parametric t-distributed stochastic neighbor embedding --- support vector machine --- information transfer --- Chinese stock sectors --- effective transfer entropy --- market crash --- system coupling --- cross-visibility graphs --- image entropy --- geodesic distance --- Dempster-Shafer evidence theory --- uncertainty of basic probability assignment --- belief entropy --- plausibility transformation --- weighted Hartley entropy --- Shannon entropy --- learning --- information --- novelty detection --- non-probabilistic entropy --- learning systems --- permutation entropy --- embedded dimension --- short time records --- signal classification --- relevance analysis --- global optimization --- meta-heuristic --- firefly algorithm --- cross-entropy method --- co-evolution --- symbolic analysis --- ordinal patterns --- Permutation entropy --- conditional entropy of ordinal patterns --- Kolmogorov-Sinai entropy --- algorithmic complexity --- information entropy --- particle size distribution --- self-similar measure --- simulation --- data analysis --- entropy --- entropy measures --- automatic learning
