# Johnston and DiNardo's Econometric Methods: An Introduction to the Theory and Applications of Econometrics

## Johnston and DiNardo's Econometric Methods: A Comprehensive Review

Econometrics is the science of applying statistical methods to economic data in order to test hypotheses, estimate parameters, and forecast outcomes. It is essential for understanding the behavior of economic agents, evaluating the impact of policies, and designing optimal strategies for decision making.

## Johnston and DiNardo's Econometric Methods

One of the most influential textbooks in econometrics is Econometric Methods by Jack Johnston and John DiNardo. The fourth edition of this book, published in 1997, is a rewrite of the venerable third edition by Johnston that sustained several generations of economists. As stated by the authors themselves, the reason for undertaking this major revision is to provide a comprehensive and accessible account of currently available econometric methodology.

The book has 13 chapters and runs to 531 pages. Each chapter ends with a selection of problems, several of which are new to this edition. Answers are not provided, although a solutions manual is available. Two appendices, one on matrix algebra and the other on statistical preliminaries, are intended to make the book as self-contained as possible. Not unexpectedly, the appendices are somewhat tersely worded, and the reader may wish to supplement them with additional reference material. Conforming to current practice, the book is accompanied by a data diskette containing several data sets, allowing the reader to replicate the applications given in the text.

The purpose of this review is to provide a critical evaluation of Johnston and DiNardo's Econometric Methods, highlighting its strengths, weaknesses, and relevance for modern econometric practice. The review follows the structure of the book, covering each chapter in detail, and compares the book with other popular econometrics textbooks, such as Greene (2018), Wooldridge (2019), and Stock and Watson (2020).

## The Bivariate Linear Model

The first chapter of Johnston and DiNardo's Econometric Methods introduces the basic concepts and assumptions of the bivariate linear model, which is the simplest form of regression analysis. The bivariate linear model relates a dependent variable Y to an independent variable X by a linear equation of the form:

$$Y_i = \beta_0 + \beta_1 X_i + u_i$$

where $i = 1, 2, ..., n$ indexes the observations, $\beta_0$ and $\beta_1$ are unknown parameters, and $u_i$ is a random error term. The authors assume that the error term has zero mean and constant variance $\sigma^2$, and is independent of X. Under the additional assumption of normally distributed errors, the conditional distribution of Y given X is normal, with mean $\beta_0 + \beta_1 X$ and variance $\sigma^2$. The authors also assume that X has a non-degenerate distribution, which ensures that the parameters are identifiable.

The authors then derive the method of ordinary least squares (OLS) for estimating the parameters $\beta_0$ and $\beta_1$. OLS minimizes the sum of squared errors (SSE) between the observed values of Y and the fitted values from the regression line. The authors show that the OLS estimators have desirable properties, such as unbiasedness, consistency, and efficiency (OLS is the best linear unbiased estimator under the Gauss-Markov assumptions). They also derive the formulas for the standard errors and confidence intervals of the OLS estimators, as well as the test statistics for testing hypotheses about the parameters.
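The closed-form OLS estimates can be sketched in a few lines of NumPy. This is a minimal illustration with synthetic data; the seed, sample size, and parameter values are assumptions for demonstration, not taken from the book.

```python
import numpy as np

# Synthetic bivariate data: Y = beta0 + beta1 * X + u (parameters are illustrative)
rng = np.random.default_rng(0)
n = 200
x = rng.normal(10.0, 2.0, size=n)
u = rng.normal(0.0, 1.0, size=n)
y = 1.5 + 0.8 * x + u  # true beta0 = 1.5, beta1 = 0.8

# Closed-form OLS: beta1 = sample cov(x, y) / sample var(x), beta0 = ybar - beta1 * xbar
xbar, ybar = x.mean(), y.mean()
beta1_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
beta0_hat = ybar - beta1_hat * xbar

# Residual variance (n - 2 degrees of freedom) and standard error of the slope
resid = y - (beta0_hat + beta1_hat * x)
s2 = resid @ resid / (n - 2)
se_beta1 = np.sqrt(s2 / np.sum((x - xbar) ** 2))
```

With the estimates and their standard errors in hand, a t-statistic for $H_0: \beta_1 = 0$ is simply `beta1_hat / se_beta1`.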

The authors also introduce the concepts of correlation coefficients and the bivariate normal distribution, which are useful for measuring the strength and direction of the linear relationship between Y and X. They show how to calculate and interpret the sample correlation coefficient $r$, as well as the population correlation coefficient $\rho$. They also explain how to use the bivariate normal distribution to obtain joint probabilities and conditional expectations of Y and X.
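The sample correlation coefficient can be computed directly from its definition and checked against NumPy's built-in. The data below are synthetic and the population correlation is an illustrative assumption.

```python
import numpy as np

# Synthetic data with a known linear relationship (population rho is about 0.6)
rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = 0.6 * x + rng.normal(scale=0.8, size=500)

# r = sample covariance / (sd of x * sd of y), computed from the definition
r_manual = np.sum((x - x.mean()) * (y - y.mean())) / np.sqrt(
    np.sum((x - x.mean()) ** 2) * np.sum((y - y.mean()) ** 2)
)

# Cross-check against NumPy's correlation matrix
r_builtin = np.corrcoef(x, y)[0, 1]
```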

The chapter concludes with a discussion of transformations and stochastic convergence. The authors show how to use various transformations of Y and X, such as logarithms, reciprocals, and powers, to achieve linearity, homoskedasticity, or normality. They also introduce some types of stochastic convergence, such as convergence in probability and convergence in distribution, which are important for establishing the asymptotic properties of estimators and test statistics. However, this treatment is too brief and may leave the reader wanting more detail.
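The logarithmic transformation is worth a small sketch: a constant-elasticity relation $Q = A P^{\varepsilon}$ becomes linear in logs, so OLS on the logged variables recovers the elasticity. The data and parameter values below are synthetic assumptions for illustration.

```python
import numpy as np

# Constant-elasticity demand: Q = A * P^elasticity * exp(u) (all values illustrative)
rng = np.random.default_rng(7)
n = 300
price = rng.uniform(1.0, 10.0, size=n)
elasticity = -1.2
quantity = 5.0 * price ** elasticity * np.exp(rng.normal(0.0, 0.1, size=n))

# Taking logs linearizes the model: log Q = log A + elasticity * log P + u
x, y = np.log(price), np.log(quantity)
e_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
```

The OLS slope on the logged data estimates the price elasticity directly, which is why log-log specifications are so common in demand analysis.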

The chapter contains several applications and examples that illustrate the use of the bivariate linear model in economics. For instance, the authors use data on consumption and income to estimate the marginal propensity to consume; data on wages and education to estimate the return to schooling; data on demand and price to estimate the price elasticity of demand; and data on money supply and inflation to test the quantity theory of money. The authors also provide graphical displays of the data and the regression results, such as scatter plots, histograms, and box plots.

## The Multiple Linear Regression Model

The second chapter of Johnston and DiNardo's Econometric Methods generalizes the bivariate linear model to allow for more than one independent variable. The multiple linear regression model relates a dependent variable Y to a set of independent variables $X_1, X_2, ..., X_k$ by a linear equation of the form:

$$Y_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i2} + \cdots + \beta_k X_{ik} + u_i$$

where $i = 1, 2, ..., n$ indexes the observations, $\beta_0, \beta_1, ..., \beta_k$ are unknown parameters, and $u_i$ is a random error term. The authors assume that the error term has zero mean and constant variance $\sigma^2$, and is independent of the regressors. Under the additional assumption of normally distributed errors, the conditional distribution of Y given the regressors is normal, with mean $\beta_0 + \beta_1 X_1 + \cdots + \beta_k X_k$ and variance $\sigma^2$. The authors also assume that X has full column rank, which ensures that the parameters are identifiable.
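In matrix form the OLS estimator is $\hat{\beta} = (X'X)^{-1}X'Y$, which can be sketched directly. The design below is synthetic; the dimensions and coefficient values are illustrative assumptions.

```python
import numpy as np

# Synthetic design matrix: a column of ones (intercept) plus k random regressors
rng = np.random.default_rng(3)
n, k = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Solve the normal equations X'X beta = X'y; solving the linear system is
# numerically preferable to forming the explicit inverse of X'X
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

The full-rank assumption is exactly what makes `X.T @ X` invertible, so the normal equations have a unique solution.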

The authors then introduce matrix notation and properties, which are essential for handling multiple regression models. They define vectors and matrices, such as $Y = (Y_1, \ldots, Y_n)'$, $X = (X_{ij})$, $\beta = (\beta_0, \ldots, \beta_k)'$, and $u = (u_1, \ldots, u_n)'$, and show how to perform operations such as addition, multiplication, transposition, and inversion. They also define some special matrices, such as the identity matrix $I$, diagonal matrices, symmetric matrices, and orthogonal matrices, and state some useful results, such as the trace identity $\operatorname{tr}(AB) = \operatorname{tr}(BA)$ and the determinant identity $\det(AB) = \det(A)\det(B)$.
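The trace and determinant identities are easy to verify numerically for random square matrices. The matrix size and seed below are arbitrary choices for the check.

```python
import numpy as np

# Random square matrices for a numerical check of tr(AB) = tr(BA)
# and det(AB) = det(A)det(B)
rng = np.random.default_rng(5)
A = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))

tr_ab = np.trace(A @ B)
tr_ba = np.trace(B @ A)
det_ab = np.linalg.det(A @ B)
det_prod = np.linalg.det(A) * np.linalg.det(B)
```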