Normal-inverse-gamma distribution
In probability theory and statistics, the normal-inverse-gamma distribution (or Gaussian-inverse-gamma distribution) is a four-parameter family of multivariate continuous probability distributions. It is the conjugate prior of a normal distribution with unknown mean and variance.
Probability density function (plot omitted)

| Parameters | $\mu$ location (real), $\lambda > 0$ (real), $\alpha > 0$ (real), $\beta > 0$ (real) |
|---|---|
| Support | $x \in (-\infty, \infty),\ \sigma^2 \in (0, \infty)$ |
| Mean | $\operatorname{E}[x] = \mu$; $\operatorname{E}[\sigma^2] = \dfrac{\beta}{\alpha - 1}$, for $\alpha > 1$ |
| Mode | $x = \mu$; $\sigma^2 = \dfrac{\beta}{\alpha + 3/2}$ |
| Variance | $\operatorname{Var}[x] = \dfrac{\beta}{(\alpha-1)\lambda}$, for $\alpha > 1$; $\operatorname{Var}[\sigma^2] = \dfrac{\beta^2}{(\alpha-1)^2(\alpha-2)}$, for $\alpha > 2$ |
Definition
Suppose

$$ x \mid \sigma^2, \mu, \lambda \sim \mathcal{N}(\mu, \sigma^2/\lambda) $$

has a normal distribution with mean $\mu$ and variance $\sigma^2/\lambda$, where

$$ \sigma^2 \mid \alpha, \beta \sim \Gamma^{-1}(\alpha, \beta) $$

has an inverse-gamma distribution. Then $(x, \sigma^2)$ has a normal-inverse-gamma distribution, denoted as

$$ (x, \sigma^2) \sim \text{N-}\Gamma^{-1}(\mu, \lambda, \alpha, \beta). $$

($\text{NIG}(\mu, \lambda, \alpha, \beta)$ is also used instead of $\text{N-}\Gamma^{-1}(\mu, \lambda, \alpha, \beta)$.)
The normal-inverse-Wishart distribution is a generalization of the normal-inverse-gamma distribution that is defined over multivariate random variables.
Characterization
Probability density function

$$ f(x, \sigma^2 \mid \mu, \lambda, \alpha, \beta) = \frac{\sqrt{\lambda}}{\sigma\sqrt{2\pi}}\,\frac{\beta^{\alpha}}{\Gamma(\alpha)}\left(\frac{1}{\sigma^2}\right)^{\alpha+1}\exp\!\left(-\frac{2\beta + \lambda(x-\mu)^2}{2\sigma^2}\right) $$

For the multivariate form where $\mathbf{x}$ is a $k \times 1$ random vector,

$$ f(\mathbf{x}, \sigma^2 \mid \boldsymbol{\mu}, \mathbf{V}^{-1}, \alpha, \beta) = |\mathbf{V}|^{-1/2}(2\pi)^{-k/2}\,\frac{\beta^{\alpha}}{\Gamma(\alpha)}\left(\frac{1}{\sigma^2}\right)^{\alpha+1+k/2}\exp\!\left(-\frac{2\beta + (\mathbf{x}-\boldsymbol{\mu})^{\mathrm{T}}\mathbf{V}^{-1}(\mathbf{x}-\boldsymbol{\mu})}{2\sigma^2}\right) $$

where $|\mathbf{V}|$ is the determinant of the $k \times k$ matrix $\mathbf{V}$. Note how this last equation reduces to the first form if $k = 1$, so that $\mathbf{x}$, $\mathbf{V}$, $\boldsymbol{\mu}$ are scalars.
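The univariate density factors as a normal density in $x$ (with variance $\sigma^2/\lambda$) times an inverse-gamma density in $\sigma^2$, which gives an easy numerical sanity check. Below is a minimal Python sketch of that check, assuming NumPy and SciPy are available; the `nig_pdf` helper and the parameter values are purely illustrative.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

def nig_pdf(x, sigma2, mu, lam, alpha, beta):
    """Joint density f(x, sigma^2 | mu, lambda, alpha, beta), written out directly."""
    sigma = np.sqrt(sigma2)
    return (np.sqrt(lam) / (sigma * np.sqrt(2 * np.pi))
            * beta**alpha / gamma_fn(alpha)
            * (1.0 / sigma2)**(alpha + 1)
            * np.exp(-(2 * beta + lam * (x - mu)**2) / (2 * sigma2)))

# Illustrative parameter values and an arbitrary evaluation point
mu, lam, alpha, beta = 0.5, 2.0, 3.0, 1.5
x, sigma2 = 0.2, 0.8

# The same density, factored as N(x | mu, sigma^2/lambda) * Inv-Gamma(sigma^2 | alpha, beta)
factored = (stats.norm.pdf(x, loc=mu, scale=np.sqrt(sigma2 / lam))
            * stats.invgamma.pdf(sigma2, a=alpha, scale=beta))

print(nig_pdf(x, sigma2, mu, lam, alpha, beta), factored)  # the two values should agree
```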
Alternative parameterization
It is also possible to let $\gamma = 1/\lambda$, in which case the pdf becomes

$$ f(x, \sigma^2 \mid \mu, \gamma, \alpha, \beta) = \frac{1}{\sigma\sqrt{2\pi\gamma}}\,\frac{\beta^{\alpha}}{\Gamma(\alpha)}\left(\frac{1}{\sigma^2}\right)^{\alpha+1}\exp\!\left(-\frac{2\beta + (x-\mu)^2/\gamma}{2\sigma^2}\right) $$

In the multivariate form, the corresponding change would be to regard the covariance matrix $\mathbf{V}$ instead of its inverse $\mathbf{V}^{-1}$ as a parameter.
Cumulative distribution function
Properties
editMarginal distributions
Given $(x, \sigma^2) \sim \text{N-}\Gamma^{-1}(\mu, \lambda, \alpha, \beta)$ as above, $\sigma^2$ by itself follows an inverse gamma distribution:

$$ \sigma^2 \sim \Gamma^{-1}(\alpha, \beta) $$

while $x$ follows a generalized Student's t-distribution with $2\alpha$ degrees of freedom, $x \sim t_{2\alpha}\!\left(\mu, \tfrac{\beta}{\lambda\alpha}\right)$.[1]
For $x$, the marginal probability density function is

$$ f(x \mid \mu, \lambda, \alpha, \beta) = \int_0^\infty f(x, \sigma^2 \mid \mu, \lambda, \alpha, \beta)\,d\sigma^2 . $$

The marginal distribution over $x$ is therefore

$$ f(x \mid \mu, \lambda, \alpha, \beta) = \int_0^\infty \frac{\sqrt{\lambda}}{\sigma\sqrt{2\pi}}\,\frac{\beta^{\alpha}}{\Gamma(\alpha)}\left(\frac{1}{\sigma^2}\right)^{\alpha+1}\exp\!\left(-\frac{2\beta + \lambda(x-\mu)^2}{2\sigma^2}\right)d\sigma^2 . $$

Except for the normalization factor, the expression under the integral coincides with an inverse-gamma density

$$ \Gamma^{-1}(\sigma^2; \alpha', \beta') = \frac{\beta'^{\,\alpha'}}{\Gamma(\alpha')}\left(\frac{1}{\sigma^2}\right)^{\alpha'+1}\exp\!\left(-\frac{\beta'}{\sigma^2}\right) $$

with $\alpha' = \alpha + \tfrac{1}{2}$, $\beta' = \beta + \tfrac{\lambda(x-\mu)^2}{2}$.

Since this inverse-gamma density integrates to one,

$$ \int_0^\infty \left(\frac{1}{\sigma^2}\right)^{\alpha'+1}\exp\!\left(-\frac{\beta'}{\sigma^2}\right)d\sigma^2 = \frac{\Gamma(\alpha')}{\beta'^{\,\alpha'}} . $$

Substituting this expression and factoring the dependence on $x$,

$$ f(x \mid \mu, \lambda, \alpha, \beta) = \frac{\sqrt{\lambda}}{\sqrt{2\pi}}\,\frac{\beta^{\alpha}}{\Gamma(\alpha)}\,\Gamma\!\left(\alpha+\tfrac{1}{2}\right)\left(\beta + \frac{\lambda(x-\mu)^2}{2}\right)^{-(\alpha+\frac{1}{2})} \propto \left(1 + \frac{\lambda(x-\mu)^2}{2\beta}\right)^{-(\alpha+\frac{1}{2})} . $$

The shape of the generalized Student's t-distribution is

$$ f(x \mid \nu, \hat{\mu}, \hat{\sigma}^2) = \frac{\Gamma\!\left(\frac{\nu+1}{2}\right)}{\Gamma\!\left(\frac{\nu}{2}\right)\sqrt{\pi\nu\hat{\sigma}^2}}\left(1 + \frac{(x-\hat{\mu})^2}{\nu\hat{\sigma}^2}\right)^{-\frac{\nu+1}{2}} \propto \left(1 + \frac{(x-\hat{\mu})^2}{\nu\hat{\sigma}^2}\right)^{-\frac{\nu+1}{2}} . $$

Matching exponents and scale factors gives $\nu = 2\alpha$ and $\hat{\sigma}^2 = \tfrac{\beta}{\lambda\alpha}$, so the marginal distribution follows a t-distribution with $2\alpha$ degrees of freedom:

$$ x \sim t_{2\alpha}\!\left(\mu, \tfrac{\beta}{\lambda\alpha}\right) . $$
In the multivariate case, the marginal distribution of $\mathbf{x}$ is a multivariate t-distribution:

$$ \mathbf{x} \sim t_{2\alpha}\!\left(\boldsymbol{\mu}, \tfrac{\beta}{\alpha}\mathbf{V}\right) $$
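The identification of the marginal of $x$ with a Student's t-distribution can be checked numerically. The following is a rough Python sketch, assuming SciPy and purely illustrative parameter values; it integrates the joint density over $\sigma^2$ at one point and compares the result with the corresponding t density.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Illustrative parameter values (any mu, lambda > 0, alpha > 0, beta > 0 would do)
mu, lam, alpha, beta = 0.0, 1.5, 2.5, 2.0

def joint(sigma2, x):
    """Joint normal-inverse-gamma density, written as normal * inverse-gamma."""
    return (stats.norm.pdf(x, loc=mu, scale=np.sqrt(sigma2 / lam))
            * stats.invgamma.pdf(sigma2, a=alpha, scale=beta))

x0 = 0.8  # arbitrary point at which to compare the two marginal densities
marginal_numeric, _ = quad(joint, 0, np.inf, args=(x0,))
marginal_t = stats.t.pdf(x0, df=2 * alpha, loc=mu, scale=np.sqrt(beta / (lam * alpha)))
print(marginal_numeric, marginal_t)  # the two values should agree closely
```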
Summation
Scaling
Suppose

$$ (x, \sigma^2) \sim \text{N-}\Gamma^{-1}(\mu, \lambda, \alpha, \beta). $$

Then for $c > 0$,

$$ (cx,\, c\sigma^2) \sim \text{N-}\Gamma^{-1}(c\mu,\, \lambda/c,\, \alpha,\, c\beta). $$

Proof: To prove this let $(x, \sigma^2) \sim \text{N-}\Gamma^{-1}(\mu, \lambda, \alpha, \beta)$ and fix $c > 0$. Defining $Y = (Y_1, Y_2) = (cx,\, c\sigma^2)$, observe that the PDF of the random variable $Y$ evaluated at $(y_1, y_2)$ is given by $\tfrac{1}{c^2}$ times the PDF of an $\text{N-}\Gamma^{-1}(\mu, \lambda, \alpha, \beta)$ random variable evaluated at $(y_1/c,\, y_2/c)$. Hence the PDF of $Y$ evaluated at $(y_1, y_2)$ is given by:

$$ f_Y(y_1, y_2) = \frac{1}{c^2}\,\frac{\sqrt{\lambda}}{\sqrt{y_2/c}\,\sqrt{2\pi}}\,\frac{\beta^{\alpha}}{\Gamma(\alpha)}\left(\frac{c}{y_2}\right)^{\alpha+1}\exp\!\left(-\frac{2\beta + \lambda\left(\tfrac{y_1}{c}-\mu\right)^2}{2\,y_2/c}\right) = \frac{\sqrt{\lambda/c}}{\sqrt{y_2}\,\sqrt{2\pi}}\,\frac{(c\beta)^{\alpha}}{\Gamma(\alpha)}\left(\frac{1}{y_2}\right)^{\alpha+1}\exp\!\left(-\frac{2c\beta + \tfrac{\lambda}{c}(y_1-c\mu)^2}{2\,y_2}\right). $$

The right-hand expression is the PDF of an $\text{N-}\Gamma^{-1}(c\mu,\, \lambda/c,\, \alpha,\, c\beta)$ random variable evaluated at $(y_1, y_2)$, which completes the proof.
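The displayed identity lends itself to a direct numerical check. The sketch below is a minimal Python check, assuming SciPy and using illustrative parameter values; it compares the $\text{N-}\Gamma^{-1}(c\mu, \lambda/c, \alpha, c\beta)$ density at $(y_1, y_2)$ with $\tfrac{1}{c^2}$ times the $\text{N-}\Gamma^{-1}(\mu, \lambda, \alpha, \beta)$ density at $(y_1/c, y_2/c)$.

```python
import numpy as np
from scipy import stats

def nig_pdf(x, sigma2, mu, lam, alpha, beta):
    """N-Gamma^{-1} density written as a normal density times an inverse-gamma density."""
    return (stats.norm.pdf(x, loc=mu, scale=np.sqrt(sigma2 / lam))
            * stats.invgamma.pdf(sigma2, a=alpha, scale=beta))

# Illustrative parameters, scale factor, and evaluation point
mu, lam, alpha, beta, c = 1.0, 2.0, 4.0, 3.0, 2.5
y1, y2 = 0.7, 1.3

lhs = nig_pdf(y1, y2, c * mu, lam / c, alpha, c * beta)      # density of (cx, c*sigma^2)
rhs = nig_pdf(y1 / c, y2 / c, mu, lam, alpha, beta) / c**2   # change-of-variables expression
print(lhs, rhs)  # should agree up to floating-point error
```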
Exponential family
Normal-inverse-gamma distributions form an exponential family with natural parameters $\theta_1 = -\left(\alpha + \tfrac{3}{2}\right)$, $\theta_2 = -\left(\beta + \tfrac{\lambda\mu^2}{2}\right)$, $\theta_3 = -\tfrac{\lambda}{2}$, and $\theta_4 = \lambda\mu$, and sufficient statistics $T_1 = \ln\sigma^2$, $T_2 = \tfrac{1}{\sigma^2}$, $T_3 = \tfrac{x^2}{\sigma^2}$, and $T_4 = \tfrac{x}{\sigma^2}$.
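Assuming the representation above, this can be verified numerically: the log-density minus the inner product of the natural parameters with the sufficient statistics should be the same constant at every point $(x, \sigma^2)$. Below is a minimal Python sketch, with SciPy assumed and illustrative parameter values.

```python
import numpy as np
from scipy import stats

def log_nig_pdf(x, sigma2, mu, lam, alpha, beta):
    """log f(x, sigma^2) via the normal * inverse-gamma factorization."""
    return (stats.norm.logpdf(x, loc=mu, scale=np.sqrt(sigma2 / lam))
            + stats.invgamma.logpdf(sigma2, a=alpha, scale=beta))

mu, lam, alpha, beta = 0.3, 1.7, 2.2, 0.9  # illustrative values

# Natural parameters and sufficient statistics as listed above
eta = np.array([-(alpha + 1.5), -(beta + lam * mu**2 / 2), -lam / 2, lam * mu])

def suff_stats(x, sigma2):
    return np.array([np.log(sigma2), 1 / sigma2, x**2 / sigma2, x / sigma2])

# log f minus eta . T should print the same constant for every (x, sigma^2)
for x, s2 in [(0.1, 0.5), (1.2, 2.0), (-0.7, 0.3)]:
    print(log_nig_pdf(x, s2, mu, lam, alpha, beta) - eta @ suff_stats(x, s2))
```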
Information entropy
Kullback–Leibler divergence

Measures the difference between two distributions.
Maximum likelihood estimation
Posterior distribution of the parameters
See the articles on normal-gamma distribution and conjugate prior.
Interpretation of the parameters
See the articles on normal-gamma distribution and conjugate prior.
Generating normal-inverse-gamma random variates
Generation of random variates is straightforward (a short code sketch follows the list):

- Sample $\sigma^2$ from an inverse gamma distribution with parameters $\alpha$ and $\beta$
- Sample $x$ from a normal distribution with mean $\mu$ and variance $\sigma^2/\lambda$
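A minimal Python sketch of this two-step procedure, assuming NumPy and SciPy; the `sample_nig` helper name and the parameter values are illustrative.

```python
import numpy as np
from scipy import stats

def sample_nig(mu, lam, alpha, beta, size, rng):
    """Two-step sampler: sigma^2 ~ Inv-Gamma(alpha, beta), then x | sigma^2 ~ N(mu, sigma^2/lambda)."""
    sigma2 = stats.invgamma.rvs(a=alpha, scale=beta, size=size, random_state=rng)
    x = rng.normal(loc=mu, scale=np.sqrt(sigma2 / lam))
    return x, sigma2

rng = np.random.default_rng(42)
x, sigma2 = sample_nig(mu=0.0, lam=1.0, alpha=3.0, beta=2.0, size=5, rng=rng)
print(x)
print(sigma2)
```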
Related distributions
- The normal-gamma distribution is the same distribution parameterized by precision rather than variance
- A generalization of this distribution which allows for a multivariate mean and a completely unknown positive-definite covariance matrix (whereas in the multivariate normal-inverse-gamma distribution the covariance matrix is regarded as known up to the scale factor $\sigma^2$) is the normal-inverse-Wishart distribution
See also
References
- ^ Ramírez-Hassan, Andrés. "4.2 Conjugate prior to exponential family", Introduction to Bayesian Econometrics.
- Denison, David G. T.; Holmes, Christopher C.; Mallick, Bani K.; Smith, Adrian F. M. (2002) Bayesian Methods for Nonlinear Classification and Regression, Wiley. ISBN 0471490369
- Koch, Karl-Rudolf (2007) Introduction to Bayesian Statistics (2nd Edition), Springer. ISBN 354072723X