Search results: found 50, listing 1 - 10.
This book demonstrates how nonlinear/non-Gaussian Bayesian time series estimation methods were used to produce a probability distribution of potential MH370 flight paths. It details how the probabilistic models of aircraft flight dynamics, satellite communication system measurements, environmental effects and radar data were constructed and calibrated. The resulting probability distribution was used to define the search zone in the southern Indian Ocean. The book describes the particle-filter based numerical calculation of the aircraft flight-path probability distribution and validates the method using data from several of the involved aircraft's previous flights. Finally, it shows how the Reunion Island flaperon debris find affects the search probability distribution.
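To give a flavour of the particle-filter machinery the book describes, here is a minimal bootstrap particle filter for a scalar random-walk state observed in noise. This is a generic sketch; the model and all numbers are illustrative assumptions, not the book's MH370 models.

```python
import random, math

random.seed(1)
true_x = 0.0
particles = [random.gauss(0, 1) for _ in range(500)]  # initial particle cloud

for _ in range(30):
    true_x += random.gauss(0, 0.1)        # true dynamics: random walk
    z = true_x + random.gauss(0, 0.5)     # noisy measurement of the state
    # Predict: propagate each particle through the dynamics model.
    particles = [p + random.gauss(0, 0.1) for p in particles]
    # Weight: Gaussian measurement likelihood of each particle.
    weights = [math.exp(-0.5 * ((z - p) / 0.5) ** 2) for p in particles]
    # Resample: draw a new cloud in proportion to the weights.
    particles = random.choices(particles, weights, k=len(particles))

# The posterior mean of the particle cloud tracks the hidden state.
estimate = sum(particles) / len(particles)
print(abs(estimate - true_x) < 1.0)
```

The same predict-weight-resample loop, with aircraft dynamics and satellite measurement models in place of this toy random walk, is the structure behind the flight-path distribution described above.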
Age-Period-Cohort Analysis: New Models, Methods, and Empirical Applications is based on a decade of the authors' collaborative work in age-period-cohort (APC) analysis. Within a single, consistent HAPC-GLMM statistical modeling framework, the authors synthesize APC models and methods for three research designs: age-by-time-period tables of population rates or proportions, repeated cross-section sample surveys, and accelerated longitudinal panel studies. The authors show how the empirical application of the models to various problems leads to many fascinating findings on how outcome variables develop along the age, period, and cohort dimensions. The book makes two essential contributions to quantitative studies of time-related change. Through the introduction of the GLMM framework, it shows how innovative estimation methods and new model specifications can be used to tackle the "model identification problem" that has hampered the development and empirical application of APC analysis. The book also addresses the major criticism against APC analysis by explaining the use of new models within the GLMM framework to uncover mechanisms underlying age patterns and temporal trends. Encompassing both methodological expositions and empirical studies, the book explores the ways in which statistical models, methods, and research designs can be used to open new possibilities for APC analysis. It compares new and existing models and methods and provides useful guidelines on how to conduct APC analysis. For empirical illustrations, the text incorporates examples from a variety of disciplines, such as sociology, demography, and epidemiology. Along with details on empirical analyses, software and programs to estimate the models are available on the book's web page.
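The "model identification problem" mentioned in the blurb stems from the exact linear dependency age = period - cohort. A minimal sketch (with hypothetical numbers, not taken from the book) shows the resulting rank deficiency of a linear APC design matrix:

```python
import numpy as np

# Hypothetical observations: age is determined exactly by period and cohort.
period = np.array([2000, 2000, 2010, 2010, 2020, 2020])
cohort = np.array([1960, 1980, 1960, 1980, 1960, 1980])
age = period - cohort

# Design matrix: intercept plus linear age, period, and cohort terms.
X = np.column_stack([np.ones_like(age), age, period, cohort])

# Rank is 3, not 4: the age column is a linear combination of the others,
# so a linear APC regression has no unique least-squares solution.
print(np.linalg.matrix_rank(X))  # 3
```

This is the collinearity that the book's GLMM-based estimation strategies are designed to work around.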
Bayesian Methods for Statistical Analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete computer code. It is suitable for self-study or a semester-long course, with three hours of lectures and one tutorial per week for 13 weeks.
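As a taste of the Markov chain Monte Carlo methods the book covers, the following is a minimal Metropolis sampler for the posterior mean of a normal model with known unit variance and a flat prior. This is an illustrative sketch; the data and tuning choices are hypothetical, not from the book.

```python
import random, math

# Hypothetical data: five observations with sample mean 5.06.
data = [4.8, 5.1, 5.3, 4.9, 5.2]

def log_target(mu):
    # Log posterior up to a constant: flat prior x normal likelihood.
    return -0.5 * sum((x - mu) ** 2 for x in data)

random.seed(0)
mu, samples = 0.0, []
for _ in range(20000):
    prop = mu + random.gauss(0, 0.5)   # symmetric random-walk proposal
    # Metropolis acceptance test on the log scale.
    if math.log(random.random()) < log_target(prop) - log_target(mu):
        mu = prop
    samples.append(mu)

# Discard burn-in; the chain average approximates the posterior mean,
# which here is the sample mean of the data (about 5.06).
post_mean = sum(samples[5000:]) / len(samples[5000:])
print(round(post_mean, 2))
```

With a flat prior the posterior mean coincides with the data mean, so the chain average is an easy sanity check on the sampler.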
statistics · mathematics · bayesian inference · probability
Since its introduction by Hans Reichenbach, many philosophers have claimed to refute the idea, known as the common cause principle, that any surprising correlation between two factors that do not directly influence one another is due to some common cause. For example, falsity of the principle is frequently inferred from the violation of Bell's inequalities. The author demonstrates, however, that the situation is not so straightforward. There is more than one version of the principle, formulated with the use of different variants of Reichenbach-inspired notions, and their falsity remains an open question. The book traces different formulations of the principle and provides proofs of a few pertinent theorems, settling the relevant questions in various probability spaces. In exploring the mathematical and philosophical issues surrounding the principle, the book offers both philosophical insight and mathematical rigor.
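For reference, the screening-off conditions in Reichenbach's formulation can be stated as follows (a standard textbook rendering, not quoted from the book): a common cause C of correlated events A and B satisfies

```latex
P(A \land B \mid C)      = P(A \mid C)\,P(B \mid C), \\
P(A \land B \mid \neg C) = P(A \mid \neg C)\,P(B \mid \neg C), \\
P(A \mid C) > P(A \mid \neg C), \qquad
P(B \mid C) > P(B \mid \neg C).
```

These four conditions jointly entail the positive correlation P(A ∧ B) > P(A)P(B); the variant notions mentioned above arise from weakening or generalizing them in different probability spaces.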
This book presents a multidisciplinary perspective on chance, with contributions from distinguished researchers in the areas of biology, cognitive neuroscience, economics, genetics, general history, law, linguistics, logic, mathematical physics, statistics, theology and philosophy. The individual chapters are bound together by a general introduction followed by an opening chapter that surveys 2500 years of linguistic, philosophical, and scientific reflections on chance, coincidence, fortune, randomness, luck and related concepts. A main conclusion that can be drawn is that, even after all this time, we still cannot be sure whether chance is a truly fundamental and irreducible phenomenon, in that certain events are simply uncaused and could have been otherwise, or whether it is always simply a reflection of our ignorance. Other challenges that emerge from this book include a better understanding of the contextuality and perspectival character of chance (including its scale-dependence), and the curious fact that, throughout history (including contemporary science), chance has been used both as an explanation and as a hallmark of the absence of explanation. As such, this book challenges the reader to think about chance in a new way and to come to grips with this endlessly fascinating phenomenon.
Human Genetics · Philosophy of Science · Probability Theory
The range of Bayesian inference algorithms and their applications has greatly expanded since Stanley F. Schmidt's first implementation of a Kalman filter for the Apollo program. Extended Kalman filters and particle filters are just some examples of algorithms in this family, which have been applied extensively to logistics, medical services, search and rescue operations, and automotive safety, among other areas. This book looks at both the theoretical foundations of Bayesian inference and practical implementations in different fields. It is intended as an introductory guide to the application of Bayesian inference in the life sciences, engineering, and economics, as well as a source document on fundamentals for intermediate Bayesian readers.
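As a flavour of the simplest member of this family, here is a minimal one-dimensional Kalman filter for estimating a constant from noisy readings. This is an illustrative sketch with assumed noise levels, not an example from the book.

```python
def kalman_1d(measurements, q=1e-3, r=0.1):
    """Filter a sequence of scalar measurements.

    q: process-noise variance, r: measurement-noise variance
    (both are illustrative assumptions).
    """
    x, p = 0.0, 1.0            # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                 # predict: variance grows by process noise
        k = p / (p + r)        # Kalman gain balances prior vs. measurement
        x += k * (z - x)       # update with the measurement residual
        p *= (1 - k)           # posterior variance shrinks after update
        estimates.append(x)
    return estimates

# Noisy readings of a constant true value 1.0 converge toward it.
est = kalman_1d([0.9, 1.1, 1.05, 0.95, 1.0, 1.02])
print(round(est[-1], 2))
```

The extended Kalman filter and particle filter mentioned above generalize exactly this predict-update cycle to nonlinear and non-Gaussian models.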
Physical Sciences, Engineering and Technology · Mathematics · Statistics · Probability Distribution
This book is intended to help candidates prepare for entrance examinations in mathematics and scientific subjects, including STEP (Sixth Term Examination Paper). STEP is an examination used by Cambridge colleges as the basis for conditional offers; it is also used by Warwick University, and many other mathematics departments recommend that their applicants practise on the past papers even if they do not take the examination. Advanced Problems in Mathematics is recommended as preparation for any undergraduate mathematics course, even for students who do not plan to take STEP. The questions analysed in this book are all based on recent STEP questions, selected to address the syllabus for Papers I and II, which is the A-level core (i.e. C1 to C4) with a few additions. Each question is followed by a comment and a full solution. The comments direct the reader's attention to key points and put the question in its true mathematical context. The solutions point students to the methodology required to address advanced mathematical problems critically and independently. This book is a must-read for any student wishing to apply to scientific subjects at university level, and for anybody interested in advanced mathematics.
geometry · calculus · probability and statistics · undergraduate mathematics course · STEP examinations · advanced mathematical problems
This book deals with applications of quantum mechanical techniques to areas outside of quantum mechanics, so-called quantum-like modeling. Research in this area has grown over the last 15 years, but an early precursor deserves mention: in the 1950s, physics Nobel laureate Wolfgang Pauli and the psychologist Carl Jung sought analogous uses of the quantum-mechanical complementarity principle in psychology. This book does not claim that society is quantum mechanical; the macroscopic world is manifestly not. But that does not rule out using concepts and the mathematical apparatus of quantum physics in a macroscopic setting. A mainstay ingredient of quantum mechanics is 'quantum probability', a tool that has proven useful in the mathematical modelling of decision making. In the most basic experiment of quantum physics, the double-slit experiment, it is known (from the works of A. Khrennikov) that the law of total probability is violated. It is now well documented that several decision-making paradoxes in psychology and economics (such as the Ellsberg paradox) exhibit this violation of the law of total probability. When data are collected in experiments that test 'non-rational' decision-making behaviour, they often exhibit a complex non-commutative structure, which may be even more complex than the structure associated with the basic double-slit experiment. The community exploring quantum-like models has tried to address how quantum probability can help to better explain these paradoxes, and research resolving some of them with the mathematics of quantum physics has now been published in journals of very high standing. The aim of this book is to collect contributions from the world's leading experts in quantum-like modeling in decision making, psychology, cognition, economics, and finance.
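The violation mentioned here is usually stated against the classical law of total probability. In the two-alternative case, the quantum-like generalization used in this literature (a standard formulation following Khrennikov's work, not quoted from the book) adds an interference term:

```latex
% Classical law of total probability (two disjoint alternatives A_1, A_2):
P(B) = P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2)

% Quantum-like generalization with an interference term:
P(B) = P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2)
       + 2\cos\theta\,\sqrt{P(A_1)\,P(B \mid A_1)\,P(A_2)\,P(B \mid A_2)}
```

When the phase angle θ differs from π/2, the interference term is nonzero and the classical formula fails, which is the structure quantum-like models exploit to fit decision-making data.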
Quantum-like models · mathematical formalism of quantum theory · quantum probability · decision making · psychology · cognition · emotions
This open access book presents the key aspects of statistics in Wasserstein spaces, i.e. statistics in the space of probability measures when endowed with the geometry of optimal transportation. As well as reviewing state-of-the-art results, it provides an accessible introduction to the fundamentals of this current topic and an overview that will serve as an invitation and catalyst for further research. Statistics in Wasserstein spaces is an emerging topic in mathematical statistics, situated at the interface between functional data analysis (where the data are functions, thus lying in an infinite-dimensional Hilbert space) and non-Euclidean statistics (where the data satisfy nonlinear constraints, thus lying on non-Euclidean manifolds). The Wasserstein space provides the natural mathematical formalism for describing data collections that are best modeled as random measures on Euclidean space (e.g. images and point processes). Such random measures carry the infinite-dimensional traits of functional data, but are intrinsically nonlinear owing to positivity and integrability restrictions. Indeed, their dominating statistical variation arises through random deformations of an underlying template, a theme pursued in depth in this monograph. The book gives a succinct introduction to the necessary mathematical background, focusing on the results useful for statistics from an otherwise vast mathematical literature; presents an up-to-date overview of the state of the art, including some original results; and discusses open problems. It is suitable for self-study or for use as a graduate-level course text.
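The central object of the book can be made concrete in one dimension. For two equal-size empirical samples, the 1-Wasserstein distance reduces to the mean absolute difference of sorted values, because optimal transport on the line matches order statistics. The following sketch is a generic illustration, not code from the book:

```python
def wasserstein_1d(xs, ys):
    """1-Wasserstein distance between two equal-size empirical samples.

    On the real line the optimal transport plan pairs the i-th smallest
    point of xs with the i-th smallest point of ys.
    """
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

# Shifting a sample by a constant c moves it exactly distance c in W1,
# the kind of "deformation of a template" the blurb refers to.
print(wasserstein_1d([0.0, 1.0, 2.0], [3.0, 4.0, 5.0]))  # 3.0
```

For general dimensions and continuous measures the distance requires solving the Monge-Kantorovich problem, which is where the geometry discussed in the book comes in.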
Probability Theory and Stochastic Processes · Optimal Transportation · Monge-Kantorovich Problem · Barycenter · Multi-marginal Transport · Functional Data Analysis · Point Processes · Random Measures · Manifold Statistics · Open Access · Geometrical Statistics · Wasserstein Metric · Fréchet Mean · Procrustes Analysis · Phase Variation · Gradient Descent · Probability & Statistics · Stochastics
We confess that the first part of our title is somewhat of a misnomer. Bayesian reasoning is a normative approach to probabilistic belief revision and, as such, is in need of no improvement. Rather, it is the typical individual, whose reasoning and judgments often fall short of the Bayesian ideal, who is the focus of improvement. What have we learnt from over a half-century of research and theory on this topic that could explain why people are often non-Bayesian? Can Bayesian reasoning be facilitated, and if so, why? These are the questions that motivate this Frontiers in Psychology Research Topic. Bayes' theorem, named after the English statistician, philosopher, and Presbyterian minister Thomas Bayes, offers a method for updating one's prior probability of a hypothesis H on the basis of new data D such that P(H|D) = P(D|H)P(H)/P(D). The first wave of psychological research, pioneered by Ward Edwards, revealed that people were overly conservative in updating their posterior probabilities (i.e., P(H|D)). A second wave, spearheaded by Daniel Kahneman and Amos Tversky, showed that people often ignored prior probabilities or base rates where the priors had a frequentist interpretation, and hence were not Bayesians at all. In the 1990s, a third wave of research, spurred by Leda Cosmides and John Tooby and by Gerd Gigerenzer and Ulrich Hoffrage, showed that people can reason more like a Bayesian if only the information provided takes the form of (non-relativized) natural frequencies. Although Kahneman and Tversky had already noted the advantages of frequency representations, it was the third-wave scholars who pushed the prescriptive agenda, arguing that there are feasible and effective methods for improving belief revision. Most scholars now agree that natural frequency representations do facilitate Bayesian reasoning. However, they do not agree on why this is so.
The original third-wave scholars favor an evolutionary account that posits human brain adaptation to natural frequency processing. But almost as soon as this view was proposed, other scholars challenged it, arguing that such evolutionary assumptions were not needed. The dominant opposing view has been that the benefit of natural frequencies is mainly due to the fact that such representations make the nested set relations perfectly transparent: people can more easily see what information they need to focus on and how to combine it. This Research Topic aims to take stock of where we are at present. Are we in a proto-fourth wave? If so, does it offer a synthesis of recent theoretical disagreements? The second part of the title orients the reader to the two main subtopics: what works, and why? For the first subtopic, we seek contributions that advance understanding of how to improve people's abilities to revise their beliefs and to integrate probabilistic information effectively. The second subtopic centers on explaining why methods that improve non-Bayesian reasoning work as well as they do. In addressing that issue, we welcome both critical analyses of existing theories and fresh perspectives. For both subtopics, we welcome the full range of manuscript types.
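The contrast between the probability format and the natural-frequency format can be made concrete. The numbers below are hypothetical, chosen only for illustration; both formats yield the same posterior, which is the point of the facilitation research described above.

```python
# Probability format: prior P(H), hit rate P(D|H), false-alarm rate P(D|~H).
p_h, p_d_given_h, p_d_given_not_h = 0.01, 0.8, 0.096

# Bayes' theorem: P(H|D) = P(D|H)P(H) / P(D).
posterior = (p_d_given_h * p_h) / (
    p_d_given_h * p_h + p_d_given_not_h * (1 - p_h)
)

# Natural-frequency format: out of 1000 people, 10 have H, of whom 8 show D;
# of the 990 without H, about 95 also show D. The nested sets are explicit.
positives_with_h = 8.0
positives_without_h = 95.04
freq_posterior = positives_with_h / (positives_with_h + positives_without_h)

print(round(posterior, 3), round(freq_posterior, 3))  # 0.078 0.078
```

The frequency version reduces the computation to "8 out of about 103 positives", which is the transparency that both the evolutionary and the nested-sets accounts try to explain.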
Bayesian reasoning · belief revision · risk communication · subjective probability · human judgment · psychological methods · individual differences · Bayesianism · probabilistic judgment