Talk:Inverse probability
This article is rated Stub-class on Wikipedia's content assessment scale.
BDIP
What is meant by block difference inverse probability? Anyone, please let me know... thanks. Mail me at anandhmurali@gmail.com —Preceding unsigned comment added by 117.204.23.175 (talk) 16:31, 18 August 2010 (UTC)
likelihood definition
edit"the "distribution" of an unobserved variable given data is rather the likelihood function (which is not a distribution)" - this is not correct, the likelihood function is if at all the distribution of the probability of obtaining the data as a function of a parameter.
^^^ All of this, and the passage in the original article, is very incorrect.
The likelihood function is a function that gives the likelihood of a parameter given the data. A profile likelihood shows the likelihoods of different values of the parameter, given the data.
For a given set of data, the maximum likelihood estimate is the value of the parameter that is most likely, because the distribution implied by that parameter value most closely approximates the distribution of the data. — Preceding unsigned comment added by 131.156.156.30 (talk) 23:12, 31 January 2018 (UTC)
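A minimal sketch of this idea (my own illustration, not part of the comment above), assuming a binomial coin-toss model with hypothetical data of 7 heads in 10 tosses: the grid value of p that maximizes the likelihood is the empirical frequency, i.e. the parameter value whose implied distribution best matches the observed data.

    from math import comb

    def likelihood(p, k, n):
        # Probability of observing k heads in n tosses if P(heads) = p
        return comb(n, k) * p**k * (1 - p)**(n - k)

    k, n = 7, 10                                   # hypothetical data, chosen for illustration
    grid = [i / 100 for i in range(101)]           # candidate values of p
    mle = max(grid, key=lambda p: likelihood(p, k, n))
    print(mle)                                     # 0.7, i.e. the empirical frequency k/n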
- Nevertheless, it is correct to say that the likelihood function itself is not necessarily a probability density. For example, if the data for 2 tosses of a coin is (heads, heads), and the probability of heads is the variable p, then adding up the likelihoods for the values p = .9 and p = .8 gives (.9)(.9) + (.8)(.8) = 1.45, so the values of the likelihood function cannot form a discrete probability distribution over p. And considering p to be a continuous variable, the likelihood function f(p) = p^2 integrates to 1/3, not 1, over the interval [0,1].
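A quick numeric check of the figures above (my own sketch, using the same two-heads data): the likelihood is L(p) = p^2, and neither summing it over candidate values of p nor integrating it over [0, 1] yields 1.

    def L(p):
        # Likelihood of heads-probability p after observing (heads, heads)
        return p ** 2

    print(L(0.9) + L(0.8))     # 0.81 + 0.64 = 1.45, so not a discrete distribution over p

    # Midpoint Riemann sum: L integrates to 1/3 over [0, 1], not to 1
    n = 100_000
    print(sum(L((i + 0.5) / n) for i in range(n)) / n)   # ~0.33333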
- @Tashiro~enwiki: The point here surely is that the likelihood function is not a probability distribution for the parameter; it is a probability distribution for the data, given some value of the parameter.
- If you sum it over different values of the parameter for a particular vector of data, it will not sum to one. But, contrariwise, if you sum it over the data for a particular value of the parameter, then it will sum to one (see the numeric sketch after this comment).
- Baldly saying "the "distribution" of data given the unobserved variable is rather the likelihood function (which is not a probability distribution)" is deeply confusing. p(x|θ) is a probability distribution. If you want to say that the likelihood function is (generally) not a probability distribution for the parameter you need to be far more specific on that point. Jheald (talk) 19:47, 28 August 2023 (UTC)
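A small sketch of the contrast described above (my own illustration, assuming n = 2 tosses with P(heads) = p and k observed heads): p(x|θ) sums to one over the possible data for a fixed parameter, but the same expression does not sum to one over a grid of parameter values for fixed data.

    from math import comb

    def p_data_given_theta(k, n, p):
        # Binomial probability of k heads in n tosses when P(heads) = p
        return comb(n, k) * p**k * (1 - p)**(n - k)

    n, p_fixed = 2, 0.7
    # Summing over all possible data outcomes for a fixed parameter gives 1
    print(sum(p_data_given_theta(k, n, p_fixed) for k in range(n + 1)))   # 1.0

    k_obs = 2                                  # the (heads, heads) data discussed above
    grid = [i / 10 for i in range(11)]         # candidate values of p
    # Summing over a grid of parameter values for fixed data does not give 1
    print(sum(p_data_given_theta(k_obs, n, p) for p in grid))             # 3.85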