Normal-gamma distribution

In probability theory and statistics, the normal-gamma distribution (or Gaussian-gamma distribution) is a bivariate four-parameter family of continuous probability distributions. It is the conjugate prior of a normal distribution with unknown mean and precision.[2]

Normal-gamma
Parameters: \(\mu\) location (real), \(\lambda > 0\) (real), \(\alpha > 0\) (real), \(\beta > 0\) (real)
Support: \(x \in (-\infty, \infty),\ \tau \in (0, \infty)\)
PDF: \(f(x,\tau \mid \mu,\lambda,\alpha,\beta) = \frac{\beta^\alpha\sqrt{\lambda}}{\Gamma(\alpha)\sqrt{2\pi}}\,\tau^{\alpha-\frac12}\,e^{-\beta\tau}\,e^{-\frac{\lambda\tau(x-\mu)^2}{2}}\)
Mean: \(\operatorname{E}[X] = \mu\), \(\operatorname{E}[T] = \alpha/\beta\)[1]
Mode: \(\left(\mu,\ \frac{\alpha-\frac12}{\beta}\right)\)
Variance: \(\operatorname{Var}(X) = \frac{\beta}{\lambda(\alpha-1)}\) (for \(\alpha > 1\)), \(\operatorname{Var}(T) = \frac{\alpha}{\beta^2}\)[1]

Definition

For a pair of random variables, (X,T), suppose that the conditional distribution of X given T is given by

\[ X \mid T = \tau \ \sim\ \operatorname{N}\!\left(\mu, (\lambda\tau)^{-1}\right), \]

meaning that the conditional distribution is a normal distribution with mean \(\mu\) and precision \(\lambda\tau\), or equivalently, with variance \(1/(\lambda\tau)\).

Suppose also that the marginal distribution of T is given by

\[ T \mid \alpha, \beta \ \sim\ \operatorname{Gamma}(\alpha, \beta), \]

where this means that T has a gamma distribution with shape \(\alpha\) and rate \(\beta\). Here \(\mu\), \(\lambda\), \(\alpha\) and \(\beta\) are parameters of the joint distribution.

Then (X,T) has a normal-gamma distribution, and this is denoted by

\[ (X,T) \ \sim\ \operatorname{NormalGamma}(\mu, \lambda, \alpha, \beta). \]

Properties

Probability density function

The joint probability density function of (X,T) is

\[ f(x,\tau \mid \mu,\lambda,\alpha,\beta) = \frac{\beta^\alpha\sqrt{\lambda}}{\Gamma(\alpha)\sqrt{2\pi}}\,\tau^{\alpha-\frac12}\,e^{-\beta\tau}\,e^{-\frac{\lambda\tau(x-\mu)^2}{2}}, \]

where the factorization \(f(x,\tau) = f(x \mid \tau)\,f(\tau)\) into the conditional density of \(X\) given \(T = \tau\) and the marginal density of \(T\) was used.
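
This factorization is easy to check numerically. The following minimal Python sketch (with arbitrary parameter and argument values, chosen only for illustration) evaluates the closed-form joint density and the product of a normal and a gamma density from scipy.stats; the two agree up to floating-point error.

```python
import numpy as np
from scipy.stats import norm, gamma
from scipy.special import gammaln

# Hypothetical parameter values, for illustration only.
mu, lam, alpha, beta = 2.0, 3.0, 2.5, 1.5
x, tau = 1.2, 0.8

# Closed-form joint density f(x, tau | mu, lam, alpha, beta), in log space.
log_f = (alpha * np.log(beta) + 0.5 * np.log(lam) - gammaln(alpha)
         - 0.5 * np.log(2 * np.pi)
         + (alpha - 0.5) * np.log(tau)
         - beta * tau
         - 0.5 * lam * tau * (x - mu) ** 2)

# Same density via the factorization f(x | tau) * f(tau):
# X | T = tau is normal with mean mu and variance 1/(lam * tau),
# T is gamma with shape alpha and rate beta (scipy uses scale = 1/beta).
f_factored = (norm.pdf(x, loc=mu, scale=1.0 / np.sqrt(lam * tau))
              * gamma.pdf(tau, a=alpha, scale=1.0 / beta))

print(np.exp(log_f), f_factored)  # the two values agree
```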

Marginal distributions

By construction, the marginal distribution of \(T\) is a gamma distribution, and the conditional distribution of \(X\) given \(T = \tau\) is a Gaussian distribution. The marginal distribution of \(X\) is a three-parameter non-standardized Student's t-distribution with \(2\alpha\) degrees of freedom, location \(\mu\) and scale \(\sqrt{\beta/(\lambda\alpha)}\).[citation needed]
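
The marginal of \(X\) can be verified numerically: integrating \(\tau\) out of the joint density should reproduce the Student's t density just described. A minimal sketch, with illustrative parameter values:

```python
import numpy as np
from scipy.stats import t, norm, gamma
from scipy.integrate import quad

mu, lam, alpha, beta = 0.5, 2.0, 3.0, 1.5  # hypothetical values

def joint(tau, x):
    # f(x | tau) * f(tau), as in the probability density function above.
    return (norm.pdf(x, loc=mu, scale=1.0 / np.sqrt(lam * tau))
            * gamma.pdf(tau, a=alpha, scale=1.0 / beta))

x = 1.0
marginal, _ = quad(joint, 0, np.inf, args=(x,))  # integrate tau out
student = t.pdf(x, df=2 * alpha, loc=mu, scale=np.sqrt(beta / (lam * alpha)))
print(marginal, student)  # the two values agree
```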

Exponential family

The normal-gamma distribution is a four-parameter exponential family with natural parameters \(\alpha - \frac12,\ -\beta - \frac{\lambda\mu^2}{2},\ \lambda\mu,\ -\frac{\lambda}{2}\) and natural statistics \(\ln\tau,\ \tau,\ \tau x,\ \tau x^2\).[citation needed]

Moments of the natural statistics

The following moments can be easily computed using the moment generating function of the sufficient statistic:[3]

\[ \operatorname{E}[\ln T] = \psi(\alpha) - \ln\beta, \]

where \(\psi(\alpha)\) is the digamma function, and

\[ \operatorname{E}[T] = \frac{\alpha}{\beta}, \qquad \operatorname{E}[TX] = \mu\,\frac{\alpha}{\beta}, \qquad \operatorname{E}[TX^2] = \frac{1}{\lambda} + \mu^2\,\frac{\alpha}{\beta}. \]
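
These identities are easy to spot-check by Monte Carlo. A minimal sketch with illustrative parameter values; each printed pair agrees up to sampling error:

```python
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)
mu, lam, alpha, beta = 1.0, 2.0, 3.0, 1.5  # hypothetical values
n = 1_000_000

# Sample (X, T) hierarchically: T ~ Gamma(alpha, rate=beta),
# then X | T = tau ~ Normal(mu, variance 1/(lam * tau)).
tau = rng.gamma(shape=alpha, scale=1.0 / beta, size=n)
x = rng.normal(loc=mu, scale=1.0 / np.sqrt(lam * tau))

print(np.mean(np.log(tau)), digamma(alpha) - np.log(beta))   # E[ln T]
print(np.mean(tau), alpha / beta)                            # E[T]
print(np.mean(tau * x), mu * alpha / beta)                   # E[TX]
print(np.mean(tau * x**2), 1 / lam + mu**2 * alpha / beta)   # E[TX^2]
```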

Scaling

If \((X,T) \sim \operatorname{NormalGamma}(\mu, \lambda, \alpha, \beta)\), then for any \(b > 0\), \((bX, bT)\) is distributed as[citation needed] \(\operatorname{NormalGamma}\!\left(b\mu,\ \lambda/b^3,\ \alpha,\ \beta/b\right)\).
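
A quick Monte Carlo check of this scaling property, assuming it as stated above (the value of \(b\) and the parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, lam, alpha, beta, b = 1.0, 2.0, 3.0, 1.5, 2.0  # hypothetical values
n = 1_000_000

# Sample (X, T) from NormalGamma(mu, lam, alpha, beta), then scale by b.
tau = rng.gamma(shape=alpha, scale=1.0 / beta, size=n)
x = rng.normal(loc=mu, scale=1.0 / np.sqrt(lam * tau))
bx, btau = b * x, b * tau

# Sample directly from the claimed NormalGamma(b*mu, lam/b**3, alpha, beta/b).
tau2 = rng.gamma(shape=alpha, scale=b / beta, size=n)
x2 = rng.normal(loc=b * mu, scale=1.0 / np.sqrt((lam / b**3) * tau2))

# Compare moments of the natural statistics; they agree up to sampling error.
print(np.mean(btau), np.mean(tau2))
print(np.mean(bx * btau), np.mean(x2 * tau2))
```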

Posterior distribution of the parameters

Assume that x is distributed according to a normal distribution with unknown mean \(\mu\) and precision \(\tau\),

\[ x \mid \mu, \tau \ \sim\ \operatorname{N}(\mu, \tau^{-1}), \]

and that the prior distribution on \(\mu\) and \(\tau\), \(\pi(\mu, \tau)\), has a normal-gamma distribution

\[ (\mu, \tau) \ \sim\ \operatorname{NormalGamma}(\mu_0, \lambda_0, \alpha_0, \beta_0), \]

for which the density π satisfies

\[ \pi(\mu, \tau) \ \propto\ \tau^{\alpha_0 - \frac12}\,\exp(-\beta_0\tau)\,\exp\!\left(-\frac{\lambda_0\tau(\mu - \mu_0)^2}{2}\right). \]

Suppose

\[ x_1, \ldots, x_n \mid \mu, \tau \ \sim\ \text{i.i.d.}\ \operatorname{N}(\mu, \tau^{-1}), \]

i.e. the components of \(\mathbf{X} = (x_1, \ldots, x_n)\) are conditionally independent given \(\mu, \tau\), and the conditional distribution of each of them given \(\mu, \tau\) is normal with expected value \(\mu\) and variance \(1/\tau\). The posterior distribution of \(\mu\) and \(\tau\) given this dataset \(\mathbf{X}\) can be analytically determined by Bayes' theorem.[4] Explicitly,

\[ \mathbf{P}(\tau, \mu \mid \mathbf{X}) \ \propto\ \mathbf{L}(\mathbf{X} \mid \tau, \mu)\,\pi(\tau, \mu), \]

where \(\mathbf{L}(\mathbf{X} \mid \tau, \mu)\) is the likelihood of the parameters given the data.

Since the data are i.i.d., the likelihood of the entire dataset is equal to the product of the likelihoods of the individual data samples:

\[ \mathbf{L}(\mathbf{X} \mid \tau, \mu) = \prod_{i=1}^n \mathbf{L}(x_i \mid \tau, \mu) = \left(\frac{\tau}{2\pi}\right)^{n/2} \exp\!\left(-\frac{\tau}{2}\sum_{i=1}^n (x_i - \mu)^2\right). \]

This expression can be simplified as follows:

\[ \sum_{i=1}^n (x_i - \mu)^2 = \sum_{i=1}^n \left((x_i - \bar{x}) + (\bar{x} - \mu)\right)^2 = \sum_{i=1}^n (x_i - \bar{x})^2 + n(\bar{x} - \mu)^2 = n\left(s + (\bar{x} - \mu)^2\right), \]

where \(\bar{x} = \frac{1}{n}\sum_{i=1}^n x_i\) is the mean of the data samples and \(s = \frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2\) is the sample variance; the cross term vanishes because \(\sum_{i=1}^n (x_i - \bar{x}) = 0\).

The posterior distribution of the parameters is proportional to the prior times the likelihood:

\[ \mathbf{P}(\tau, \mu \mid \mathbf{X}) \ \propto\ \tau^{\alpha_0 - \frac12}\,\exp(-\beta_0\tau)\,\exp\!\left(-\frac{\lambda_0\tau(\mu - \mu_0)^2}{2}\right) \left(\frac{\tau}{2\pi}\right)^{n/2} \exp\!\left(-\frac{\tau}{2}\left(ns + n(\bar{x} - \mu)^2\right)\right). \]

The final exponential term is simplified by completing the square:

\[ \lambda_0(\mu - \mu_0)^2 + n(\bar{x} - \mu)^2 = (\lambda_0 + n)\left(\mu - \frac{\lambda_0\mu_0 + n\bar{x}}{\lambda_0 + n}\right)^2 + \frac{\lambda_0 n (\bar{x} - \mu_0)^2}{\lambda_0 + n}. \]
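
This algebraic identity can be confirmed symbolically; a minimal sympy sketch:

```python
import sympy as sp

mu, mu0, lam0, n, xbar = sp.symbols('mu mu_0 lambda_0 n xbar')

lhs = lam0 * (mu - mu0)**2 + n * (xbar - mu)**2
rhs = ((lam0 + n) * (mu - (lam0 * mu0 + n * xbar) / (lam0 + n))**2
       + lam0 * n * (xbar - mu0)**2 / (lam0 + n))

print(sp.simplify(lhs - rhs))  # prints 0: the two sides are identical
```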

On inserting the completed square back into the posterior expression,

\[ \mathbf{P}(\tau, \mu \mid \mathbf{X}) \ \propto\ \tau^{\alpha_0 + \frac{n}{2} - \frac12}\,\exp\!\left(-\tau\left(\beta_0 + \frac12\left(ns + \frac{\lambda_0 n (\bar{x} - \mu_0)^2}{\lambda_0 + n}\right)\right)\right) \exp\!\left(-\frac{(\lambda_0 + n)\,\tau}{2}\left(\mu - \frac{\lambda_0\mu_0 + n\bar{x}}{\lambda_0 + n}\right)^2\right). \]

This final expression is in exactly the same form as a normal-gamma distribution, i.e.,

\[ (\mu, \tau \mid \mathbf{X}) \ \sim\ \operatorname{NormalGamma}\!\left(\frac{\lambda_0\mu_0 + n\bar{x}}{\lambda_0 + n},\ \lambda_0 + n,\ \alpha_0 + \frac{n}{2},\ \beta_0 + \frac12\left(ns + \frac{\lambda_0 n (\bar{x} - \mu_0)^2}{\lambda_0 + n}\right)\right). \]
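
The update rule translates directly into code. Below is a minimal sketch; the function name normal_gamma_update is hypothetical, and s is the biased sample variance used in the derivation above.

```python
import numpy as np

def normal_gamma_update(mu0, lam0, alpha0, beta0, data):
    """Conjugate update: NormalGamma prior + i.i.d. normal likelihood
    with unknown mean and precision -> NormalGamma posterior."""
    data = np.asarray(data, dtype=float)
    n = data.size
    xbar = data.mean()
    s = data.var()  # (1/n) * sum((x_i - xbar)**2), the 's' used above
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    lam_n = lam0 + n
    alpha_n = alpha0 + 0.5 * n
    beta_n = beta0 + 0.5 * (n * s + lam0 * n * (xbar - mu0) ** 2 / (lam0 + n))
    return mu_n, lam_n, alpha_n, beta_n

# Example: a vague prior updated with simulated data.
rng = np.random.default_rng(0)
posterior = normal_gamma_update(0.0, 1.0, 1.0, 1.0,
                                rng.normal(3.0, 0.5, size=100))
```

Because the family is conjugate, updating with successive batches of data gives the same posterior as a single update with all the data pooled.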

Interpretation of parameters

The interpretation of parameters in terms of pseudo-observations is as follows:

  • The new mean takes a weighted average of the old pseudo-mean \(\mu_0\) and the observed mean \(\bar{x}\), weighted by the numbers of associated (pseudo-)observations \(\lambda_0\) and \(n\).
  • The precision was estimated from \(2\alpha_0\) pseudo-observations (i.e. possibly a different number of pseudo-observations than for the mean, to allow the variance of the mean and the precision to be controlled separately) with sample mean \(\mu_0\) and sample variance \(\frac{\beta_0}{\alpha_0}\) (i.e. with sum of squared deviations \(2\beta_0\)).
  • The posterior updates the numbers of pseudo-observations (\(\lambda_0\) and \(2\alpha_0\)) simply by adding the corresponding number of new observations (\(n\)).
  • The new sum of squared deviations is computed by adding the previous respective sums of squared deviations. However, a third "interaction term" is needed because the two sets of squared deviations were computed with respect to different means, and hence the sum of the two underestimates the actual total squared deviation.

As a consequence, if one has a prior mean of \(\mu_0\) from \(n_\mu\) samples and a prior precision of \(\tau_0\) from \(n_\tau\) samples, the prior distribution over \(\mu\) and \(\tau\) is

\[ (\mu, \tau) \ \sim\ \operatorname{NormalGamma}\!\left(\mu_0,\ n_\mu,\ \frac{n_\tau}{2},\ \frac{n_\tau}{2\tau_0}\right), \]

and after observing \(n\) samples with mean \(\bar{x}\) and variance \(s\), the posterior probability is

\[ (\mu, \tau \mid \mathbf{X}) \ \sim\ \operatorname{NormalGamma}\!\left(\frac{n_\mu\mu_0 + n\bar{x}}{n_\mu + n},\ n_\mu + n,\ \frac12(n_\tau + n),\ \frac12\left(\frac{n_\tau}{\tau_0} + ns + \frac{n_\mu n (\bar{x} - \mu_0)^2}{n_\mu + n}\right)\right). \]

Note that in some programming languages, such as Matlab, the gamma distribution is implemented with the inverse definition of \(\beta\) (a scale parameter rather than a rate), so the fourth argument of the normal-gamma distribution there is \(\frac{2\tau_0}{n_\tau}\).
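
For concreteness, this pseudo-observation parameterization can be wrapped in a small helper; the function name prior_from_pseudo_observations is hypothetical and the numeric values are illustrative.

```python
def prior_from_pseudo_observations(mu0, n_mu, tau0, n_tau):
    """Map a prior mean mu0 'seen in' n_mu pseudo-observations and a prior
    precision tau0 'seen in' n_tau pseudo-observations to NormalGamma
    parameters (mu, lambda, alpha, beta), as in the formula above."""
    return mu0, n_mu, 0.5 * n_tau, 0.5 * n_tau / tau0

# Example: prior mean 0.0 from 5 pseudo-samples, precision 1.0 from 4.
mu, lam, alpha, beta = prior_from_pseudo_observations(0.0, 5, 1.0, 4)
# A scale-parameterized gamma (e.g. Matlab's gamrnd) expects 1/beta,
# i.e. 2*tau0/n_tau, as its second argument instead of the rate beta.
```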

Generating normal-gamma random variates

Generation of random variates is straightforward:

  1. Sample \(\tau\) from a gamma distribution with parameters \(\alpha\) and \(\beta\): \(\tau \sim \operatorname{Gamma}(\alpha, \beta)\).
  2. Sample \(x\) from a normal distribution with mean \(\mu\) and variance \(\frac{1}{\lambda\tau}\): \(x \sim \operatorname{N}\!\left(\mu, \frac{1}{\lambda\tau}\right)\).
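
A minimal sketch of this two-step sampler in Python, using NumPy's scale-parameterized gamma (the function name sample_normal_gamma is hypothetical):

```python
import numpy as np

def sample_normal_gamma(mu, lam, alpha, beta, size, rng=None):
    """Draw (x, tau) pairs from NormalGamma(mu, lam, alpha, beta)."""
    rng = np.random.default_rng(rng)
    # Step 1: tau ~ Gamma(alpha, rate=beta); NumPy's gamma takes a scale.
    tau = rng.gamma(shape=alpha, scale=1.0 / beta, size=size)
    # Step 2: x | tau ~ Normal(mu, variance 1/(lam * tau)).
    x = rng.normal(loc=mu, scale=1.0 / np.sqrt(lam * tau))
    return x, tau

# Example with illustrative parameter values.
x, tau = sample_normal_gamma(0.0, 2.0, 3.0, 1.5, size=10_000)
```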

Notes

  1. ^ a b Bernardo & Smith (1993, p. 434)
  2. ^ Bernardo & Smith (1993, pages 136, 268, 434)
  3. ^ Wasserman, Larry (2004), "Parametric Inference", All of Statistics, Springer Texts in Statistics, New York, NY: Springer, pp. 119–148, ISBN 978-1-4419-2322-6, retrieved 2023-12-08
  4. ^ "Bayes' Theorem: Introduction". Archived from the original on 2014-08-07. Retrieved 2014-08-05.

References

  • Bernardo, J.M.; Smith, A.F.M. (1993) Bayesian Theory, Wiley. ISBN 0-471-49464-X
  • Dearden et al. "Bayesian Q-learning", Proceedings of the Fifteenth National Conference on Artificial Intelligence (AAAI-98), July 26–30, 1998, Madison, Wisconsin, USA.