TY - BOOK
ID - 33643
TI - New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
AU - Pardo, Leandro
PB - MDPI - Multidisciplinary Digital Publishing Institute
PY - 2019
KW - sparse
KW - robust
KW - divergence
KW - MM algorithm
KW - Bregman divergence
KW - generalized linear model
KW - local-polynomial regression
KW - model check
KW - nonparametric test
KW - quasi-likelihood
KW - semiparametric model
KW - Wald statistic
KW - composite likelihood
KW - maximum composite likelihood estimator
KW - Wald test statistic
KW - composite minimum density power divergence estimator
KW - Wald-type test statistics
KW - general linear model
KW - hypothesis testing
KW - influence function
KW - Wald-type test
KW - log-linear models
KW - ordinal classification variables
KW - association models
KW - correlation models
KW - minimum penalized φ-divergence estimator
KW - consistency
KW - asymptotic normality
KW - goodness-of-fit
KW - bootstrap distribution estimator
KW - thematic quality assessment
KW - relative entropy
KW - logarithmic super divergence
KW - robustness
KW - minimum divergence inference
KW - generalized Rényi entropy
KW - minimum divergence methods
KW - single index model
KW - model assessment
KW - statistical distance
KW - non-quadratic distance
KW - total variation
KW - mixture index of fit
KW - Kullback-Leibler distance
KW - divergence measure
KW - φ-divergence
KW - relative error estimation
KW - robust estimation
KW - information geometry
KW - centroid
KW - Bregman information
KW - Hölder divergence
KW - indoor localization
KW - efficiency
KW - Bayesian nonparametric
KW - Bayesian semi-parametric
KW - asymptotic property
KW - minimum disparity methods
KW - Hellinger distance
KW - Bernstein-von Mises theorem
KW - measurement errors
KW - robust testing
KW - two-sample test
KW - misspecified hypothesis and alternative
KW - 2-alternating capacities
KW - composite hypotheses
KW - corrupted data
KW - least-favorable hypotheses
KW - Neyman-Pearson test
KW - divergence based testing
KW - Chernoff-Stein lemma
KW - compressed data
KW - representation formula
KW - iterated limits
KW - location-scale family
SN - 9783038979364 / 9783038979371
AB - This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems, with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics, and Rao's score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
ER -