Probability and Measure Theory, Second Edition, is a text for a graduate-level course in probability that includes essential background topics in analysis. It provides extensive coverage of conditional probability and expectation, strong laws of large numbers, martingale theory, the central limit theorem, ergodic theory, and Brownian motion. Features include a clear, readable style; solutions to many problems presented in the text; a solutions manual for instructors; material new to the second edition on ergodic theory, Brownian motion, and convergence theorems used in statistics; no required knowledge of general topology, just basic analysis and metric spaces; and an efficient organization.
· 1996
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs of a number of important results which were merely stated in the first edition have been added. The author has also included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.
· 2013
A New York Times bestseller "Brilliant, funny…the best math teacher you never had." —San Francisco Chronicle Once considered tedious, the field of statistics is rapidly evolving into a discipline Hal Varian, chief economist at Google, has actually called "sexy." From batting averages and political polls to game shows and medical research, the real-world application of statistics continues to grow by leaps and bounds. How can we catch schools that cheat on standardized tests? How does Netflix know which movies you’ll like? What is causing the rising incidence of autism? As best-selling author Charles Wheelan shows us in Naked Statistics, the right data and a few well-chosen statistical tools can help us answer these questions and more. For those who slept through Stats 101, this book is a lifesaver. Wheelan strips away the arcane and technical details and focuses on the underlying intuition that drives statistical analysis. He clarifies key concepts such as inference, correlation, and regression analysis, reveals how biased or careless parties can manipulate or misrepresent data, and shows us how brilliant and creative researchers are exploiting the valuable data from natural experiments to tackle thorny questions. And in Wheelan’s trademark style, there’s not a dull page in sight. You’ll encounter clever Schlitz Beer marketers leveraging basic probability, an International Sausage Festival illuminating the tenets of the central limit theorem, and a head-scratching choice from the famous game show Let’s Make a Deal—and you’ll come away with insights each time. With the wit, accessibility, and sheer fun that turned Naked Economics into a bestseller, Wheelan defies the odds yet again by bringing another essential, formerly unglamorous discipline to life.
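The Let's Make a Deal puzzle mentioned above is the Monty Hall problem, and its counterintuitive answer is easy to check by simulation. The short Python sketch below is not from the book; the trial count and door numbering are arbitrary choices for illustration.

import random

def monty_hall(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)   # door hiding the car
        pick = random.randrange(3)  # contestant's initial choice
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials

print("stay:  ", round(monty_hall(switch=False), 3))  # about 1/3
print("switch:", round(monty_hall(switch=True), 3))   # about 2/3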
· 2016
This new edition to the classic book by ggplot2 creator Hadley Wickham highlights compatibility with knitr and RStudio. ggplot2 is a data visualization package for R that helps users create data graphics, including those that are multi-layered, with ease. With ggplot2, it's easy to: produce handsome, publication-quality plots with automatic legends created from the plot specification; superimpose multiple layers (points, lines, maps, tiles, box plots) from different data sources with automatically adjusted common scales; add customizable smoothers that use the powerful modeling capabilities of R, such as loess, linear models, generalized additive models, and robust regression; save any ggplot2 plot (or part thereof) for later modification or reuse; create custom themes that capture in-house or journal style requirements and that can easily be applied to multiple plots; and approach a graph from a visual perspective, thinking about how each component of the data is represented on the final plot. This book will be useful to everyone who has struggled with displaying data in an informative and attractive way. Some basic knowledge of R is necessary (e.g., importing data into R). ggplot2 is a mini-language specifically tailored for producing graphics, and you'll learn everything you need in the book. After reading this book you'll be able to produce graphics customized precisely for your problems, and you'll find it easy to get graphics out of your head and on to the screen or page.
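As a taste of the layered approach the blurb describes, here is a minimal sketch written with plotnine, a Python port of ggplot2 (this catalog contains no R code, so Python stands in); the mtcars dataset and the wt, mpg, and cyl columns come bundled with plotnine, and the corresponding ggplot2 calls in R look essentially the same.

from plotnine import ggplot, aes, geom_point, geom_smooth, labs, theme_minimal
from plotnine.data import mtcars  # example dataset shipped with plotnine

plot = (
    ggplot(mtcars, aes(x="wt", y="mpg", color="factor(cyl)"))
    + geom_point()                        # layer 1: the raw observations
    + geom_smooth(method="lm", se=True)   # layer 2: a fitted linear smoother
    + labs(x="weight (1000 lbs)", y="miles per gallon", color="cylinders")
    + theme_minimal()                     # a reusable theme
)
plot.save("mtcars.png", width=6, height=4, dpi=150)  # save for later reuse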
Statistical Foundations of Data Science gives a thorough introduction to commonly used statistical models and contemporary statistical machine learning techniques and algorithms, along with their mathematical insights and statistical theories. It aims to serve as a graduate-level textbook and a research monograph on high-dimensional statistics, sparsity and covariance learning, machine learning, and statistical inference. It includes ample exercises that involve both theoretical studies and empirical applications. The book begins with an introduction to the stylized features of big data and their impacts on statistical analysis. It then introduces multiple linear regression and expands the techniques of model building via nonparametric regression and kernel tricks. It provides a comprehensive account of sparsity exploration and model selection for multiple regression, generalized linear models, quantile regression, robust regression, and hazards regression, among others. High-dimensional inference is also thoroughly addressed, as is feature screening. The book also provides a comprehensive account of high-dimensional covariance estimation and of learning latent factors and hidden structures, as well as their applications to statistical estimation, inference, prediction, and machine learning problems. It also gives a thorough treatment of statistical machine learning theory and methods for classification, clustering, and prediction. These include CART, random forests, boosting, support vector machines, clustering algorithms, sparse PCA, and deep learning.
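For a flavor of the sparsity and model-selection ideas mentioned above, here is a small hedged sketch, not taken from the book, that fits a lasso with a cross-validated penalty using scikit-learn on synthetic data in which only three of fifty candidate features matter.

import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 200, 50                       # n samples, p candidate features
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]          # only three features truly matter (sparsity)
y = X @ beta + 0.5 * rng.standard_normal(n)

model = LassoCV(cv=5).fit(X, y)      # penalty level chosen by cross-validation
selected = np.flatnonzero(model.coef_ != 0)
print("selected features:", selected)  # ideally close to {0, 1, 2}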
· 2008
Quantum Mechanics and its applications are a vibrant, central part of today's research in both experimental and theoretical physics. Designed for the one-semester course, Quantum Mechanics expertly guides students through rigorous course material, providing comprehensive explanations, accessible examples, and intuitive equations. This text's in-depth coverage of essential topics, such as the harmonic oscillator, barrier penetration, and the hydrogen atom, skillfully bridges the gap between sophomore-level introductory texts and lower-level graduate treatments. Students will find that this user-friendly text, with its numerous examples and applications, sets a solid foundation for future courses in quantum mechanics.
· 1952
Hardy's Pure Mathematics has been a classic textbook since its publication in 1908. This reissue will bring it to the attention of a whole new generation of mathematicians.
· 1985
Making Decisions, Second Edition. D.V. Lindley, Formerly Professor of Statistics, University College London. This book looks at the problems involved in decision-making and argues that there is only one logical way to make a decision. By the use of three basic principles—assigning probabilities to the uncertain events; assigning utilities to the possible consequences; and choosing the decision that maximizes expected utility—decisions can be reached more efficiently and with less disagreement. It shows that only maximization of expected utility leads to sensible decision-making. This extensively revised second edition uses only elementary mathematics and will be of interest to all those concerned with decision-making and its consequences. Since his retirement from University College London in 1977, Professor Lindley has held visiting appointments at Berkeley, the University of Florida, George Washington University, the University of Sao Paulo, the University of Wisconsin, Monash University (Australia), and the University of Canterbury (New Zealand). Contents: Decisions and uncertain events; A numerical measure for uncertainty; The laws of probability; A numerical measure for consequences; The utility of money; Bayes' Theorem; Value of information; Decision trees; The assessment of probabilities and utilities; An appreciation; Appendix; Answers to exercises; Glossary of Symbols; Subject Index.
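A tiny Python sketch of those three principles follows. The umbrella decision, the probabilities, and the utilities are invented for illustration and are not taken from Lindley's book.

# Assign probabilities to the uncertain events, utilities to the consequences,
# and choose the decision with the highest expected utility (all numbers invented).
p_rain = 0.3  # hypothetical probability of rain

# Utility of each (decision, event) consequence, on an arbitrary 0-1 scale.
utility = {
    ("carry umbrella", "rain"):    0.8,
    ("carry umbrella", "no rain"): 0.6,  # mild inconvenience
    ("leave it home",  "rain"):    0.0,  # soaked
    ("leave it home",  "no rain"): 1.0,
}

def expected_utility(decision):
    return (p_rain * utility[(decision, "rain")]
            + (1 - p_rain) * utility[(decision, "no rain")])

decisions = ["carry umbrella", "leave it home"]
print({d: round(expected_utility(d), 3) for d in decisions})
print("choose:", max(decisions, key=expected_utility))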
We present here a one-semester course on Probability Theory. We also treat measure theory and Lebesgue integration, concentrating on those aspects which are especially germane to the study of Probability Theory. The book is intended to fill a current need: there are mathematically sophisticated students and researchers (especially in Engineering, Economics, and Statistics) who need a proper grounding in Probability in order to pursue their primary interests. Many Probability texts available today are celebrations of Probability Theory, containing treatments of fascinating topics to be sure, but nevertheless they make it difficult to construct a lean one-semester course that covers (what we believe are) the essential topics. Chapters 1-23 provide such a course. We have indulged ourselves a bit by including Chapters 24-28, which are highly optional but which may prove useful to Economists and Electrical Engineers. This book had its origins in a course the second author gave in Perugia, Italy, in 1997; he used the samizdat "notes" of the first author, long used for courses at the University of Paris VI, augmenting them as needed. The result has been further tested in courses given at Purdue University. We thank the indulgence and patience of the students both in Perugia and in West Lafayette. We also thank our editor Catriona Byrne, as well as Nick Bingham for many superb suggestions, an anonymous referee for the same, and Judy Mitchell for her extraordinary typing skills. Jean Jacod, Paris; Philip Protter, West Lafayette.
· 1991
This is a masterly introduction to the modern, and rigorous, theory of probability. The author emphasises martingales and develops all the necessary measure theory.