Search results: Found 3

Listing 1 - 3 of 3
Differential Geometrical Theory of Statistics

Authors: ---
ISBN: 9783038424253 / 9783038424246 Year: 2017 Pages: XIV, 458 DOI: 10.3390/books978-3-03842-425-3 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Physics (General)
Added to DOAB on : 2017-06-12 12:20:37
License: CC BY-NC-ND

Abstract

This Special Issue, "Differential Geometrical Theory of Statistics", collates selected invited and contributed talks presented at the conference GSI'15 on "Geometric Science of Information", held at the Ecole Polytechnique, Paris-Saclay Campus, France, in October 2015 (conference website: http://www.see.asso.fr/gsi2015).

Information Geometry

Author:
ISBN: 9783038976325 Year: 2019 Pages: 356 DOI: 10.3390/books978-3-03897-633-2 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Mathematics --- Science (General)
Added to DOAB on : 2019-04-05 10:34:31
License: CC BY-NC-ND

Abstract

This Special Issue of the journal Entropy, titled “Information Geometry I”, contains a collection of 17 papers concerning the foundations and applications of information geometry. Based on a geometrical interpretation of probability, information geometry has become a rich mathematical field employing the methods of differential geometry. It has numerous applications to data science, physics, and neuroscience. Presenting original research, yet written in an accessible, tutorial style, this collection of papers will be useful for scientists who are new to the field, while providing an excellent reference for the more experienced researcher. Several papers are written by authorities in the field, and topics cover the foundations of information geometry, as well as applications to statistics, Bayesian inference, machine learning, complex systems, physics, and neuroscience.
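
As an illustrative aside (a sketch, not code from the collection): the Fisher information metric at the heart of information geometry arises as the second-order expansion of the Kullback-Leibler divergence. A minimal check in Python, assuming NumPy and a univariate Gaussian model N(mu, sigma^2):

```python
import numpy as np

# Closed-form KL divergence between univariate Gaussians N(mu0, s0^2) and N(mu1, s1^2)
def kl_gauss(mu0, s0, mu1, s1):
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

mu, s = 0.0, 1.0            # base point theta = (mu, sigma)
d = np.array([1e-3, 1e-3])  # small perturbation of (mu, sigma)

# Fisher information matrix of N(mu, sigma^2) in (mu, sigma) coordinates
fisher = np.diag([1 / s**2, 2 / s**2])

kl = kl_gauss(mu, s, mu + d[0], s + d[1])
quad = 0.5 * d @ fisher @ d  # quadratic form (1/2) d^T I(theta) d

# kl and quad agree up to second order in d: the Fisher metric is the
# local, infinitesimal shape of the KL divergence
```

The mismatch between `kl` and `quad` is of third order in the perturbation, which is why the Fisher matrix can serve as a Riemannian metric on the parameter space.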

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Author:
ISBN: 9783038979364 / 9783038979371 Year: 2019 Pages: 344 DOI: 10.3390/books978-3-03897-937-1 Language: English
Publisher: MDPI - Multidisciplinary Digital Publishing Institute
Subject: Social Sciences --- Sociology --- Statistics
Added to DOAB on : 2019-06-26 08:44:06
License: CC BY-NC-ND

Abstract

This book presents new and original research in statistical information theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics, and Rao's score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification and in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test statistic, based on minimum divergence estimators, for testing simple and composite null hypotheses in general parametric models.
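
As an illustration of the robustness theme (a sketch, not code from the book): the density power divergence (DPD) family, which appears in the keywords below, yields estimators that downweight outlying observations. A minimal sketch in Python, assuming NumPy, a normal location model N(theta, 1), tuning constant alpha = 0.5, and a crude grid search; all of these choices are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# 50 clean observations from N(0, 1) plus one gross outlier
x = np.concatenate([rng.normal(0.0, 1.0, 50), [20.0]])

def dpd_loss(theta, x, alpha=0.5):
    # Density power divergence objective for the N(theta, 1) model:
    # (integral of f_theta^(1+alpha)) - (1 + 1/alpha) * mean(f_theta(x)^alpha);
    # the data-free integral has a closed form for the Gaussian density.
    f = np.exp(-0.5 * (x - theta) ** 2) / np.sqrt(2 * np.pi)
    integral = (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

grid = np.linspace(-2.0, 25.0, 5001)
theta_dpd = grid[np.argmin([dpd_loss(t, x) for t in grid])]
theta_mle = x.mean()  # MLE of the mean; dragged toward the outlier
```

Because f_theta(x_i)^alpha is essentially zero for the outlier when theta is near 0, the outlier contributes almost nothing to the DPD objective, so theta_dpd stays near the true mean while the sample mean is shifted by roughly 20/51, about 0.39.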

Keywords

sparse --- robust --- divergence --- MM algorithm --- Bregman divergence --- generalized linear model --- local-polynomial regression --- model check --- nonparametric test --- quasi-likelihood --- semiparametric model --- Wald statistic --- composite likelihood --- maximum composite likelihood estimator --- Wald test statistic --- composite minimum density power divergence estimator --- Wald-type test statistics --- general linear model --- hypothesis testing --- influence function --- Wald-type test --- log-linear models --- ordinal classification variables --- association models --- correlation models --- minimum penalized φ-divergence estimator --- consistency --- asymptotic normality --- goodness-of-fit --- bootstrap distribution estimator --- thematic quality assessment --- relative entropy --- logarithmic super divergence --- robustness --- minimum divergence inference --- generalized Rényi entropy --- minimum divergence methods --- single index model --- model assessment --- statistical distance --- non-quadratic distance --- total variation --- mixture index of fit --- Kullback-Leibler distance --- divergence measure --- φ-divergence --- relative error estimation --- robust estimation --- information geometry --- centroid --- Bregman information --- Hölder divergence --- indoor localization --- efficiency --- Bayesian nonparametric --- Bayesian semi-parametric --- asymptotic property --- minimum disparity methods --- Hellinger distance --- Bernstein-von Mises theorem --- measurement errors --- robust testing --- two-sample test --- misspecified hypothesis and alternative --- 2-alternating capacities --- composite hypotheses --- corrupted data --- least-favorable hypotheses --- Neyman-Pearson test --- divergence based testing --- Chernoff-Stein lemma --- compressed data --- representation formula --- iterated limits --- location-scale family

Publisher: MDPI - Multidisciplinary Digital Publishing Institute (all 3 titles)

License: CC BY-NC-ND (all 3 titles)

Language: English (all 3 titles)

Year: 2019 (2 titles), 2017 (1 title)