Search results: found 26. Listing 1–10 of 26.
This book demonstrates how nonlinear/non-Gaussian Bayesian time series estimation methods were used to produce a probability distribution of potential MH370 flight paths. It provides details of how the probabilistic models of aircraft flight dynamics, satellite communication system measurements, environmental effects and radar data were constructed and calibrated. The probability distribution was used to define the search zone in the southern Indian Ocean. The book describes particle-filter based numerical calculation of the aircraft flight-path probability distribution and validates the method using data from several previous flights of the aircraft involved. Finally, it is shown how the Réunion Island flaperon debris find affects the search probability distribution.
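The particle-filter calculation the abstract refers to can be sketched in miniature. The following is a generic bootstrap particle filter for a one-dimensional random-walk state observed in Gaussian noise; the state model, noise levels, and function name are illustrative assumptions, far simpler than the aircraft-dynamics and satellite-measurement models the book actually builds.

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=2000,
                              process_std=1.0, obs_std=1.0, seed=0):
    """Generic bootstrap particle filter: propagate, weight, resample.
    Illustrative 1-D random-walk model, not the book's flight model."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 5.0) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Propagate each particle through the (assumed) motion model.
        particles = [p + rng.gauss(0.0, process_std) for p in particles]
        # Weight each particle by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2)
                   for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Posterior mean estimate, then multinomial resampling.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

At each observation the weighted particle cloud approximates the posterior over the state; in the book's application, the analogous cloud over flight paths is what defined the search zone.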
Bayesian Methods for Statistical Analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and non-ignorable non-response. The book contains many exercises, all with worked solutions, including complete computer code. It is suitable for self-study or a semester-long course with three hours of lectures and one tutorial per week for 13 weeks.
statistics  mathematics  bayesian inference  probability
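The Bayesian estimation covered in such a course starts from conjugate updating. A minimal sketch (not taken from the book) using the beta-binomial conjugate pair:

```python
from fractions import Fraction

def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate Bayesian update: a Beta(alpha, beta) prior on a
    proportion combined with binomial data yields a Beta posterior
    with the counts simply added to the prior parameters."""
    return alpha + successes, beta + failures

def posterior_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return Fraction(alpha, alpha + beta)

# Uniform Beta(1, 1) prior, then observe 7 successes in 10 trials.
a, b = beta_binomial_update(1, 1, 7, 3)
print(posterior_mean(a, b))  # prints 2/3
```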
Since its introduction by Hans Reichenbach, many philosophers have claimed to refute the idea – known as the common cause principle – that any surprising correlation between two factors that do not directly influence one another is due to some common cause. For example, the falsity of the principle is frequently inferred from the violation of Bell's inequalities. The author demonstrates, however, that the situation is not so straightforward. There is more than one version of the principle, formulated using different variants of Reichenbach-inspired notions, and the falsity of these versions remains an open question. The book traces different formulations of the principle and provides proofs of a few pertinent theorems, settling the relevant questions in various probability spaces. In exploring the mathematical and philosophical issues surrounding the principle, the book offers both philosophical insight and mathematical rigor.
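Reichenbach's principle is usually formalized via a screening-off condition: a common cause C of correlated events A and B satisfies P(A and B | C) = P(A | C) P(B | C), so that conditioning on C (or on not-C) removes the correlation. A toy numerical illustration with invented probabilities, not taken from the book:

```python
def common_cause_example():
    """A and B are correlated overall, yet independent given the
    common cause C (and given not-C), as Reichenbach's screening-off
    condition requires.  All numbers are illustrative."""
    p_c = 0.5
    p_a_c, p_b_c = 0.8, 0.8      # P(A|C), P(B|C): independent given C
    p_a_nc, p_b_nc = 0.2, 0.2    # P(A|not C), P(B|not C)
    # Marginals by the law of total probability.
    p_a = p_c * p_a_c + (1 - p_c) * p_a_nc
    p_b = p_c * p_b_c + (1 - p_c) * p_b_nc
    # Joint: conditional independence given C holds by construction.
    p_ab = p_c * p_a_c * p_b_c + (1 - p_c) * p_a_nc * p_b_nc
    return p_ab, p_a * p_b       # 0.34 > 0.25: A and B are correlated
```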
This book presents a multidisciplinary perspective on chance, with contributions from distinguished researchers in the areas of biology, cognitive neuroscience, economics, genetics, general history, law, linguistics, logic, mathematical physics, statistics, theology and philosophy. The individual chapters are bound together by a general introduction followed by an opening chapter that surveys 2500 years of linguistic, philosophical, and scientific reflections on chance, coincidence, fortune, randomness, luck and related concepts. A main conclusion that can be drawn is that, even after all this time, we still cannot be sure whether chance is a truly fundamental and irreducible phenomenon, in that certain events are simply uncaused and could have been otherwise, or whether it is always simply a reflection of our ignorance. Other challenges that emerge from this book include a better understanding of the contextuality and perspectival character of chance (including its scale-dependence), and the curious fact that, throughout history (including contemporary science), chance has been used both as an explanation and as a hallmark of the absence of explanation. As such, this book challenges the reader to think about chance in a new way and to come to grips with this endlessly fascinating phenomenon.
Human Genetics  Philosophy of Science  Probability Theory
The range of Bayesian inference algorithms and their applications has greatly expanded since the first implementation of a Kalman filter by Stanley F. Schmidt for the Apollo program. Extended Kalman filters and particle filters are just some examples of these algorithms, which have been extensively applied to logistics, medical services, search and rescue operations, and automotive safety, among other fields. This book looks at both the theoretical foundations of Bayesian inference and practical implementations in different fields. It is intended as an introductory guide for the application of Bayesian inference in the life sciences, engineering, and economics, as well as a reference on fundamentals for intermediate Bayesian readers.
Physical Sciences, Engineering and Technology  Mathematics  Statistics  Probability Distribution
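The Kalman filter mentioned in the abstract above is the simplest member of this family of Bayesian filters. A minimal scalar sketch, with assumed noise parameters and a random-walk state model that are illustrative only:

```python
def kalman_1d(observations, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter: random-walk state, noisy direct
    observation.  q is process-noise variance, r observation-noise
    variance; all parameters are illustrative assumptions."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        p = p + q               # predict: variance grows by process noise
        k = p / (p + r)         # Kalman gain
        x = x + k * (z - x)     # update the state with the innovation
        p = (1 - k) * p         # update the posterior variance
        estimates.append(x)
    return estimates
```

Fed a stream of noisy measurements of a slowly drifting quantity, the estimate converges toward the true value while smoothing out the observation noise.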
" This book is intended to help candidates prepare for entrance examinations in mathematics and scientific subjects, including STEP (Sixth Term Examination Paper). STEP is an examination used by Cambridge colleges as the basis for conditional offers. They are also used by Warwick University, and many other mathematics departments recommend that their applicants practice on the past papers even if they do not take the examination. Advanced Problems in Mathematics is recommended as preparation for any undergraduate mathematics course, even for students who do not plan to take the Sixth Term Examination Paper. The questions analysed in this book are all based on recent STEP questions selected to address the syllabus for Papers I and II, which is the Alevel core (i.e. C1 to C4) with a few additions. Each question is followed by a comment and a full solution. The comments direct the reader’s attention to key points and put the question in its true mathematical context. The solutions point students to the methodology required to address advanced mathematical problems critically and independently.This book is a must read for any student wishing to apply to scientific subjects at university level and for anybody interested in advanced mathematics."
geometry  calculus  probability and statistics  undergraduate mathematics course  step examinations  advanced mathematical problems
This book deals with applications of quantum mechanical techniques to areas outside of quantum mechanics, so-called quantum-like modelling. Research in this area has grown over the last 15 years. But it is worth noting that already in the 1950s, more than 50 years ago, the physics Nobel laureate Wolfgang Pauli and the psychologist Carl Jung interacted in seeking analogous uses in psychology of the complementarity principle from quantum mechanics. This book does not claim that society is quantum mechanical: the macroscopic world is manifestly not. But this does not rule out that one can use concepts and the mathematical apparatus of quantum physics in a macroscopic environment. A mainstay ingredient of quantum mechanics is 'quantum probability', and this tool has proven useful in the mathematical modelling of decision making. In the most basic experiment of quantum physics, the double-slit experiment, it is known (from the works of A. Khrennikov) that the law of total probability is violated. It is now well documented that several decision-making paradoxes in psychology and economics (such as the Ellsberg paradox) exhibit this violation of the law of total probability. When data are collected in experiments that test 'non-rational' decision-making behaviour, one can observe that such data often exhibit a complex non-commutative structure, which may be even more complex than the structure associated with the basic two-slit experiment. The community exploring quantum-like models has tried to address how quantum probability can help to better explain those paradoxes. Research on resolving some of the paradoxes with the mathematics of quantum physics has now been published in very high-standing journals. The aim of this book is to collect contributions from the world's leading experts in quantum-like modelling in decision making, psychology, cognition, economics, and finance.
Quantum-like models  mathematical formalism of quantum theory  quantum probability  decision making  psychology  cognition  emotions
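The violation of the law of total probability mentioned above can be made concrete with double-slit amplitudes: classically one expects the probability with both slits open to equal the sum of the single-slit contributions, but quantum mechanically the probability is the squared modulus of the summed amplitudes, which adds an interference term. A sketch with illustrative amplitudes, not drawn from the book:

```python
import cmath

def double_slit_probability(amp1, amp2):
    """Compare the quantum probability |amp1 + amp2|^2 with the
    classical total-probability value |amp1|^2 + |amp2|^2.
    The amplitudes are illustrative complex numbers."""
    quantum = abs(amp1 + amp2) ** 2
    classical = abs(amp1) ** 2 + abs(amp2) ** 2
    interference = quantum - classical  # equals 2*Re(conj(amp1)*amp2)
    return quantum, classical, interference

# Equal-magnitude amplitudes with a phase difference of pi/3:
q, c, i = double_slit_probability(0.5, 0.5 * cmath.exp(1j * cmath.pi / 3))
```

The nonzero interference term is exactly the deviation from the classical law of total probability that quantum-like decision models exploit.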
We confess that the first part of our title is somewhat of a misnomer. Bayesian reasoning is a normative approach to probabilistic belief revision and, as such, is in need of no improvement. Rather, it is the typical individual, whose reasoning and judgments often fall short of the Bayesian ideal, who is the focus of improvement. What have we learnt from over a half-century of research and theory on this topic that could explain why people are often non-Bayesian? Can Bayesian reasoning be facilitated, and if so, why? These are the questions that motivate this Frontiers in Psychology Research Topic. Bayes' theorem, named after the English statistician, philosopher, and Presbyterian minister Thomas Bayes, offers a method for updating one's prior probability of a hypothesis H on the basis of new data D such that P(H|D) = P(D|H)P(H)/P(D). The first wave of psychological research, pioneered by Ward Edwards, revealed that people were overly conservative in updating their posterior probabilities (i.e., P(H|D)). A second wave, spearheaded by Daniel Kahneman and Amos Tversky, showed that people often ignored prior probabilities or base rates where the priors had a frequentist interpretation, and hence were not Bayesians at all. In the 1990s, a third wave of research, spurred by Leda Cosmides and John Tooby and by Gerd Gigerenzer and Ulrich Hoffrage, showed that people can reason more like a Bayesian if only the information provided takes the form of (non-relativized) natural frequencies. Although Kahneman and Tversky had already noted the advantages of frequency representations, it was the third-wave scholars who pushed the prescriptive agenda, arguing that there are feasible and effective methods for improving belief revision. Most scholars now agree that natural frequency representations do facilitate Bayesian reasoning. However, they do not agree on why this is so.
The original third-wave scholars favor an evolutionary account that posits human brain adaptation to natural frequency processing. But almost as soon as this view was proposed, other scholars challenged it, arguing that such evolutionary assumptions were not needed. The dominant opposing view has been that the benefit of natural frequencies is mainly due to the fact that such representations make the nested set relations perfectly transparent. Thus, people can more easily see what information they need to focus on and how to combine it simply. This Research Topic aims to take stock of where we are at present. Are we in a proto-fourth wave? If so, does it offer a synthesis of recent theoretical disagreements? The second part of the title orients the reader to the two main subtopics: what works and why? In terms of the first subtopic, we seek contributions that advance understanding of how to improve people's abilities to revise their beliefs and to integrate probabilistic information effectively. The second subtopic centers on explaining why methods that improve non-Bayesian reasoning work as well as they do. In addressing that issue, we welcome both critical analyses of existing theories and fresh perspectives. For both subtopics, we welcome the full range of manuscript types.
Bayesian reasoning  belief revision  Risk Communication  subjective probability  human judgment  psychological methods  individual differences  Bayesianism  probabilistic judgment
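Bayes' theorem as stated in the abstract, and the natural-frequency reformulation the third-wave studies used, can be compared directly. The numbers below are the classic mammography-style screening example from this literature, used purely for illustration:

```python
def bayes_posterior(prior, sensitivity, false_alarm):
    """P(H|D) = P(D|H)P(H) / P(D), with P(D) expanded by the law
    of total probability."""
    p_d = sensitivity * prior + false_alarm * (1 - prior)
    return sensitivity * prior / p_d

def natural_frequency_posterior(n, prior, sensitivity, false_alarm):
    """The same computation framed as counts out of n people -- the
    representation the third-wave studies found easier to reason with."""
    true_pos = n * prior * sensitivity          # sick and test positive
    false_pos = n * (1 - prior) * false_alarm   # healthy, test positive
    return true_pos / (true_pos + false_pos)

# Base rate 1%, sensitivity 80%, false-positive rate 9.6%:
p = bayes_posterior(0.01, 0.8, 0.096)  # roughly 0.078
```

Out of 1000 people, about 8 sick people test positive versus about 95 healthy ones, so a positive test implies only roughly an 8% chance of disease, a result that most respondents get wrong with the probability format but many get right with the frequency format.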
This new and expanded edition is intended to help candidates prepare for entrance examinations in mathematics and scientific subjects, including STEP (Sixth Term Examination Paper). STEP is an examination used by Cambridge Colleges for conditional offers in mathematics. The papers are also used by some other UK universities, and many mathematics departments recommend that their applicants practise on the past papers even if they do not take the examination. Advanced Problems in Mathematics bridges the gap between school and university mathematics, and prepares students for an undergraduate mathematics course. The questions analysed in this book are all based on past STEP questions, and each question is followed by a comment and a full solution. The comments direct the reader's attention to key points and put the question in its true mathematical context. The solutions point students to the methodology required to address advanced mathematical problems critically and independently. This book is a must-read for any student wishing to apply to scientific subjects at university level and for anyone interested in advanced mathematics.
From ABO typing during the first half of the 20th century, to the use of enzymes and proteins contained in blood serum, and finally direct DNA typing, biology has been serving forensic purposes for many decades. Statistics, in turn, has constantly underpinned discussions of the probative value of the results of biological analyses, in particular when defendants could not be excluded as potential sources on the basis of differing genetic traits. The marriage between genetics and statistics has never been an easy one, though, as is illustrated by the fierce arguments that peaked in the so-called "DNA wars" in some American courtrooms in the mid-1990s. This controversy has contributed to a lively production of research and publications on various interpretative topics, such as the collection of relevant data, foundations in population genetics, and theoretical and practical considerations in probability and statistics. Both DNA profiling as a technique and the associated statistical considerations are now widely accepted as robust, but this does not yet guarantee or imply a neat transition to their application in court. Indeed, statistical principles applied to the results of forensic DNA profiling analyses are a necessary, yet not sufficient, preliminary requirement for the contextually meaningful use of DNA in the law. Ultimately, the appropriate use of DNA in the forensic context relies on inference, i.e. reasoning reasonably in the face of uncertainty. This is all the more challenging in that such thought processes need to be adopted by stakeholders from various backgrounds and holding diverse interests. Although several topics of the DNA controversy have been settled over time, some are still debated (such as the question of how to deal with the probability of error), while other purportedly settled topics have seen recent revivals (e.g., the question of how to deal with database searches).
In addition, new challenging topics have emerged over the last decade, such as the analysis and interpretation of traces containing only low quantities of DNA, where artefacts of varying nature may affect results. Both technical and interpretative research involving statistics thus represent areas where ongoing research is necessary, and where scholars from the natural sciences and the law should collaborate. The articles in this Research Topic thus aim to investigate, from an interdisciplinary perspective, the current understanding of the strengths and limitations of DNA profiling results in legal applications. This Research Topic accepts contributions in all Frontiers article-type categories and places an emphasis on topics with a multidisciplinary perspective, exploring (while not being limited to) statistical genetics for forensic scientists, case studies and reports, evaluation and interpretation of forensic findings, communication of expert findings to laypersons, quantitative legal reasoning, and fact-finding using probability.
Forensic DNA profiling  interpretation  Statistics and the law  probability theory  Commercialization  DNA transfer  Low-template DNA analysis  forensic molecular biology  Bacterial DNA
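The evaluative framework the Research Topic describes typically reports a likelihood ratio comparing the evidence under the prosecution and defence propositions. A minimal sketch for a single-source match, with an illustrative random match probability (the real quantity depends on the profile and the population genetics model):

```python
def dna_likelihood_ratio(match_probability):
    """Likelihood ratio for a single-source DNA match under the two
    standard propositions -- Hp (the suspect is the source) versus
    Hd (an unrelated person is the source):
        LR = P(E|Hp) / P(E|Hd) = 1 / p
    where p is the random match probability (illustrative only)."""
    return 1.0 / match_probability

def posterior_odds(prior_odds, lr):
    """Bayes' theorem in odds form: posterior odds = prior odds x LR.
    The prior odds are for the court, not the forensic scientist."""
    return prior_odds * lr
```

For example, a random match probability of one in a million gives LR = 10^6, which multiplied by prior odds of 1:1000 yields posterior odds of about 1000:1; the separation of LR from prior odds is what keeps the scientist's and the fact-finder's roles distinct.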