My library
  • Mostly Harmless Econometrics

    In addition to econometric essentials, this book covers important new extensions as well as how to get standard errors right. The authors explain why fancier econometric techniques are typically unnecessary and even dangerous.

  • Limited-Dependent and Qualitative Variables in Econometrics
    G. S. Maddala · 1983

    This book presents the econometric analysis of single-equation and simultaneous-equation models in which the jointly dependent variables can be continuous, categorical, or truncated. Despite the traditional emphasis on continuous variables in econometrics, many of the economic variables encountered in practice are categorical (those for which a suitable category can be found but where no actual measurement exists) or truncated (those that can be observed only in certain ranges). Such variables are involved, for example, in models of occupational choice, choice of tenure in housing, and choice of type of schooling. Models with regulated prices and rationing, and models for program evaluation, also represent areas of application for the techniques presented by the author.
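
    The truncation problem the blurb describes can be seen in a few lines of code: fitting ordinary least squares only to observations whose outcome clears a cutoff attenuates the estimated slope. This is a toy sketch with invented numbers, illustrating the bias that tobit-style estimators address, not one of Maddala's own estimators.

```python
# Why truncated samples need special care: OLS fit only to observations
# with y above a cutoff flattens the slope.  Toy data invented for
# illustration.

def ols_slope(xs, ys):
    """Ordinary least squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

xs = [1, 2, 3, 4, 5, 6]
ys = [2, 1, 4, 3, 6, 5]          # roughly y = x, with alternating noise

full_slope = ols_slope(xs, ys)

# Truncation: we only observe pairs whose outcome clears a threshold
cutoff = 3
seen = [(x, y) for x, y in zip(xs, ys) if y >= cutoff]
trunc_slope = ols_slope([x for x, _ in seen], [y for _, y in seen])

print(round(full_slope, 3), round(trunc_slope, 3))  # 0.829 0.6
```

    The truncated-sample slope (0.6) is noticeably flatter than the full-sample slope (about 0.83), even though both samples come from the same process.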

  • Mastering 'Metrics

    From Joshua Angrist, winner of the Nobel Prize in Economics, and Jörn-Steffen Pischke: an accessible and fun guide to the essential tools of econometric research.

    Applied econometrics, known to aficionados as 'metrics, is the original data science. 'Metrics encompasses the statistical methods economists use to untangle cause and effect in human affairs. Through accessible discussion and with a dose of kung fu-themed humor, Mastering 'Metrics presents the essential tools of econometric research and demonstrates why econometrics is exciting and useful. The five most valuable econometric methods, what the authors call the Furious Five, are illustrated through well-crafted real-world examples (vetted for awesomeness by Kung Fu Panda's Jade Palace): random assignment, regression, instrumental variables, regression discontinuity designs, and differences in differences.

    Does health insurance make you healthier? Randomized experiments provide answers. Are expensive private colleges and selective public high schools better than more pedestrian institutions? Regression analysis and a regression discontinuity design reveal the surprising truth. When private banks teeter and depositors take their money and run, should central banks step in to save them? Differences-in-differences analysis of a Depression-era banking crisis offers a response. Could arresting O. J. Simpson have saved his ex-wife's life? Instrumental variables methods instruct law enforcement authorities in how best to respond to domestic abuse.

    Wielding econometric tools with skill and confidence, Mastering 'Metrics uses data and statistics to illuminate the path from cause to effect. The book:

    - Shows why econometrics is important
    - Explains econometric research through humorous and accessible discussion
    - Outlines empirical methods central to modern econometric practice
    - Works through interesting and relevant real-world examples
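
    In its simplest form, the differences-in-differences method mentioned above reduces to comparing two changes over time. A minimal numerical sketch, with invented numbers rather than the book's banking data:

```python
# Differences-in-differences on toy data: compare the change in an
# outcome for a treated group against the change for a control group.
# All numbers here are made up for illustration.

# Mean outcome before and after a policy change
treated_before, treated_after = 120.0, 110.0   # group that got the intervention
control_before, control_after = 130.0, 100.0   # comparison group

# Each group's raw change over time
treated_change = treated_after - treated_before   # -10.0
control_change = control_after - control_before   # -30.0

# The DiD estimate: treated change minus control change.  Under the
# parallel-trends assumption, this is the treatment effect.
did_estimate = treated_change - control_change
print(did_estimate)  # 20.0
```

    The subtraction of the control group's change nets out whatever trend both groups would have followed anyway; that is the whole trick.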

  • Theory of Decision Under Uncertainty
    Itzhak Gilboa · 2009

    This book describes the classical axiomatic theories of decision under uncertainty, as well as critiques thereof and alternative theories. It focuses on the meaning of probability, discussing some definitions and surveying their scope of applicability. The behavioral definition of subjective probability serves as a way to present the classical theories, culminating in Savage's theorem. The limitations of this result as a definition of probability lead in two directions: first, similar behavioral definitions of more general theories, such as non-additive probabilities and multiple priors, and second, cognitive derivations based on case-based techniques.

  • Forecasting with Exponential Smoothing

    Exponential smoothing methods have been around since the 1950s and are still the most popular forecasting methods used in business and industry. However, a modeling framework incorporating stochastic models, likelihood calculation, prediction intervals, and procedures for model selection was not developed until recently. This book brings together all of the important new results on the state space framework for exponential smoothing. It will be of interest to practitioners who want to apply the methods in their own fields, as well as to researchers who want to take the ideas in new directions. Part 1 provides an introduction to exponential smoothing and the underlying models. The essential details are given in Part 2, which also provides links to the most important papers in the literature. More advanced topics are covered in Part 3, including the mathematical properties of the models and extensions of the models for specific problems. Applications to particular domains are discussed in Part 4.
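
    Simple exponential smoothing, the basic method underlying the framework described above, fits in a few lines. This is the classical recursion, not the book's state-space formulation:

```python
# Simple exponential smoothing: the forecast is a weighted average of
# past observations, with weights decaying geometrically in time.

def ses(series, alpha):
    """One-step-ahead forecasts; alpha in (0, 1) is the smoothing
    parameter.  The first forecast is initialized to the first value."""
    level = series[0]
    forecasts = [level]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level  # update the level
        forecasts.append(level)
    return forecasts

data = [10.0, 12.0, 11.0, 13.0, 12.0]
print(ses(data, alpha=0.5))  # [10.0, 11.0, 11.0, 12.0, 12.0]
```

    Larger values of alpha track recent observations more closely; smaller values smooth more aggressively.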

  • Probability Theory and Statistical Inference
    Aris Spanos · 2019

    Doubt over the trustworthiness of published empirical results is not unwarranted and is often a result of statistical mis-specification: invalid probabilistic assumptions imposed on data. Now in its second edition, this bestselling textbook offers a comprehensive course in empirical research methods, teaching the probabilistic and statistical foundations that enable the specification and validation of statistical models and providing the basis for an informed implementation of statistical procedures to secure the trustworthiness of evidence. Each chapter has been thoroughly updated, accounting for developments in the field and the author's own research. The comprehensive scope of the textbook has been expanded by the addition of a new chapter on linear regression and related statistical models. This new edition is more accessible to students of disciplines beyond economics and includes more pedagogical features, with an increased number of examples as well as review questions and exercises at the end of each chapter.

  • Foundations of Complex-system Theories

    Analyzes approaches to the study of complexity in the physical, biological, and social sciences.

  • Introduction to Econometrics

    Dougherty provides a step-by-step introductory guide to the core areas of this demanding subject. The book includes new material on specification tests, binary choice models, tobit analysis, and unit root tests and cointegration.

  • Principles of Economics
    A. Marshall · 2013

    Alfred Marshall's Principles of Economics (1890) is a founding text of modern (neo-classical) economics. It was the dominant economics textbook for decades and is considered his seminal work.

  • Introduction to Time Series and Forecasting

    Some of the key mathematical results are stated without proof in order to make the underlying theory accessible to a wider audience. The book assumes a knowledge only of basic calculus, matrix algebra, and elementary statistics. The emphasis is on methods and the analysis of data sets. The logic and tools of model-building for stationary and nonstationary time series are developed in detail, and numerous exercises, many of which make use of the included computer package, provide the reader with ample opportunity to develop skills in this area.

    The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space models, with an optional chapter on spectral analysis. Additional topics include harmonic regression, the Burg and Hannan-Rissanen algorithms, unit roots, regression with ARMA errors, structural models, the EM algorithm, generalized state-space models with applications to time series of count data, exponential smoothing, the Holt-Winters and ARAR forecasting algorithms, transfer function models, and intervention analysis. Brief introductions are also given to cointegration and to nonlinear, continuous-time, and long-memory models.

    The time series package included in the back of the book is a slightly modified version of the package ITSM, published separately as ITSM for Windows by Springer-Verlag, 1994. It does not handle data sets as large as those ITSM for Windows can, but, like the latter, it runs on IBM-PC compatible computers under either DOS or Windows (version 3.1 or later). The programs are all menu-driven, so the reader can immediately apply the techniques in the book to time series data with a minimal investment of time in the computational and algorithmic aspects of the analysis.
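
    As a taste of the stationary-process modelling the book develops, here is a minimal AR(1) fit and one-step forecasting by ordinary least squares, in plain Python rather than the ITSM package. The series is invented so that the recursion y_t = 1 + 0.5 * y_{t-1} holds exactly and the fit recovers the true coefficients:

```python
# Fit an AR(1) model y_t = c + phi * y_{t-1} + e_t by regressing the
# series on its own lag, then iterate the fitted recursion forward.

def fit_ar1(y):
    """Estimate (c, phi) by OLS of y_t on y_{t-1}."""
    x, z = y[:-1], y[1:]            # lagged values, current values
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    cov = sum((a - mx) * (b - mz) for a, b in zip(x, z))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var
    c = mz - phi * mx
    return c, phi

def forecast(y, c, phi, steps):
    """Iterate y_next = c + phi * y_last forward from the last value."""
    preds, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last
        preds.append(last)
    return preds

# Exact AR(1) path with c = 1, phi = 0.5 (no noise), so OLS recovers it
y = [6.0, 4.0, 3.0, 2.5, 2.25]
c, phi = fit_ar1(y)
preds = forecast(y, c, phi, steps=2)
print(c, phi, preds)  # 1.0 0.5 [2.125, 2.0625]
```

    On real data the error term e_t is nonzero, the estimates are noisy, and model selection (the ARMA and ARIMA machinery the book covers) decides how many lags to keep.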