In econometrics, the information matrix test is used to determine whether a regression model is misspecified. The test was developed by Halbert White,[1] who observed that in a correctly specified model and under standard regularity assumptions, the Fisher information matrix can be expressed in either of two ways: as the outer product of the gradient, or as a function of the Hessian matrix of the log-likelihood function.
Consider a linear regression model $y = X\beta + u$, where the errors $u$ are assumed to be distributed $\mathcal{N}(0, \sigma^2 I)$. If the parameters $\beta$ and $\sigma^2$ are stacked in the vector $\theta^{\mathsf{T}} = \begin{bmatrix} \beta & \sigma^2 \end{bmatrix}$, the resulting log-likelihood function is

$$\ell(\theta) = -\frac{n}{2} \log \sigma^2 - \frac{1}{2\sigma^2} \left( y - X\beta \right)^{\mathsf{T}} \left( y - X\beta \right)$$
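This log-likelihood can be coded directly and checked against its maximizer, the OLS estimate. The following is a minimal NumPy sketch; the variable names and simulated data are illustrative assumptions, not from the source.

```python
import numpy as np

def log_likelihood(theta, y, X):
    """Log-likelihood of y = X beta + u with u ~ N(0, sigma^2 I).

    theta stacks the regression coefficients beta and the error
    variance sigma^2 (an illustrative parameterization).
    """
    beta, sigma2 = theta[:-1], theta[-1]
    resid = y - X @ beta
    n = len(y)
    # Includes the 2*pi constant; the text drops it, which does not
    # affect the maximizer.
    return -0.5 * n * np.log(2 * np.pi * sigma2) - (resid @ resid) / (2 * sigma2)

# Simulated data from a correctly specified model.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=500)

# MLE: beta_hat is the OLS estimate, sigma2_hat the mean squared residual.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma2_hat = np.mean((y - X @ beta_hat) ** 2)
theta_hat = np.append(beta_hat, sigma2_hat)

# The likelihood at the MLE should exceed that at any perturbed beta.
ll_at_mle = log_likelihood(theta_hat, y, X)
ll_perturbed = log_likelihood(theta_hat + np.array([0.1, 0.0, 0.0]), y, X)
```

For fixed $\sigma^2$ the log-likelihood is a decreasing function of the residual sum of squares, which is why its maximizer in $\beta$ coincides with OLS.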
The information matrix can then be expressed as

$$\mathbf{I}(\theta) = \operatorname{E} \left[ \frac{\partial \ell(\theta)}{\partial \theta} \frac{\partial \ell(\theta)}{\partial \theta}^{\mathsf{T}} \right]$$

that is, the expected value of the outer product of the gradient, or score. Second, it can be written as the negative of the Hessian matrix of the log-likelihood function:

$$\mathbf{I}(\theta) = -\operatorname{E} \left[ \frac{\partial^2 \ell(\theta)}{\partial \theta \, \partial \theta^{\mathsf{T}}} \right]$$
If the model is correctly specified, both expressions should be equal. Combining the equivalent forms yields

$$\Delta(\theta) = \sum_{i=1}^{n} \left[ \frac{\partial^2 \ell_i(\theta)}{\partial \theta \, \partial \theta^{\mathsf{T}}} + \frac{\partial \ell_i(\theta)}{\partial \theta} \frac{\partial \ell_i(\theta)}{\partial \theta}^{\mathsf{T}} \right]$$
where $\Delta(\theta)$ is an $(r \times r)$ random matrix, where $r$ is the number of parameters. White showed that the elements of $n^{-1/2} \Delta(\hat{\theta})$, where $\hat{\theta}$ is the MLE, are asymptotically normally distributed with zero means when the model is correctly specified.[2] In small samples, however, the test generally performs poorly.[3]
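The behavior of $\Delta(\hat{\theta})$ can be checked numerically. The sketch below uses an i.i.d. $\mathcal{N}(\mu, \sigma^2)$ sample, a simpler model than the regression above, with the per-observation scores and Hessians derived analytically from the normal log-density; all names and simulation settings are illustrative assumptions.

```python
import numpy as np

# Draw a large correctly specified sample.
rng = np.random.default_rng(42)
n = 100_000
x = rng.normal(loc=0.0, scale=1.0, size=n)

# MLE of theta = (mu, sigma^2).
mu, s2 = x.mean(), x.var()
d = x - mu

# Per-observation score vectors s_i (columns: d/d mu, d/d sigma^2).
score = np.column_stack([d / s2, -0.5 / s2 + d**2 / (2 * s2**2)])

# Per-observation Hessian entries of the log-density.
H_mumu = -np.full(n, 1.0 / s2)
H_mus2 = -d / s2**2
H_s2s2 = 0.5 / s2**2 - d**2 / s2**3

# Delta(theta_hat) = sum_i [H_i + s_i s_i^T]: the Hessian sum plus the
# outer-product-of-gradient (OPG) sum.
opg = score.T @ score
hess = np.array([[H_mumu.sum(), H_mus2.sum()],
                 [H_mus2.sum(), H_s2s2.sum()]])
Delta = hess + opg

# Under correct specification the elements of Delta/n converge to zero
# (and n^{-1/2} Delta is asymptotically normal with mean zero).
print(np.abs(Delta / n).max())
```

For this model the $(\mu, \mu)$ element of $\Delta(\hat{\theta})$ is exactly zero by the first-order conditions of the MLE; the remaining elements vanish at the $n^{-1/2}$ rate, so the printed maximum is small for a sample this large.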
References
- ^ White, Halbert (1982). "Maximum Likelihood Estimation of Misspecified Models". Econometrica. 50 (1): 1–25. doi:10.2307/1912526. JSTOR 1912526.
- ^ Godfrey, L. G. (1988). Misspecification Tests in Econometrics. Cambridge University Press. pp. 35–37. ISBN 0-521-26616-5.
- ^ Orme, Chris (1990). "The Small-Sample Performance of the Information-Matrix Test". Journal of Econometrics. 46 (3): 309–331. doi:10.1016/0304-4076(90)90012-I.
Further reading
- Krämer, W.; Sonnberger, H. (1986). The Linear Regression Model Under Test. Heidelberg: Physica-Verlag. pp. 105–110. ISBN 3-7908-0356-1.
- White, Halbert (1994). "Information Matrix Testing". Estimation, Inference and Specification Analysis. New York: Cambridge University Press. pp. 300–344. ISBN 0-521-25280-6.