Talk:Chernoff's inequality
This is the talk page of a redirect that has been merged and now targets the page Chernoff bound. Because this page is not frequently watched, present and future discussions, edit requests and requested moves should take place at Talk:Chernoff bound. The merged page's edit history is maintained in order to preserve attributions.
This redirect does not require a rating on Wikipedia's content assessment scale.
Comments
I added an article on Chernoff bound which, though a special case of the inequality here, merits special attention. Perhaps the articles could be combined in some form. CSTAR 23:08, 21 Jun 2004 (UTC)
Do they really have to be discrete random variables? (At this moment I'm too lazy to figure this out for myself, but I'll probably do so soon.) Michael Hardy 22:51, 24 Jun 2004 (UTC)
Am I right in thinking that μ is the mean of X? -- Anon
I always thought that "Chernoff inequality" refers to the inequality for a Gaussian variable g and a differentiable function f such that E[f'(g)^2] is finite, and states that in this case Var f(g) <= E[f'(g)^2]. I promise to write on this subject later. Amir Aliev 11:18, 23 May 2006 (UTC)
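For readers landing here: the Gaussian inequality described above is Chernoff's 1981 result, Var f(g) <= E[f'(g)^2] for g standard normal and f differentiable. A quick Monte Carlo sanity check; the choice f(x) = x^2 is purely my own illustration, for which Var f(g) = 2 and E[f'(g)^2] = 4:

```python
import random

random.seed(0)
N = 200_000
f = lambda x: x * x        # an illustrative differentiable f
fp = lambda x: 2 * x       # its derivative f'

xs = [random.gauss(0.0, 1.0) for _ in range(N)]
fx = [f(x) for x in xs]
mean_f = sum(fx) / N
var_f = sum((v - mean_f) ** 2 for v in fx) / N    # estimates Var f(g) = 2
mean_fp2 = sum(fp(x) ** 2 for x in xs) / N        # estimates E[f'(g)^2] = 4

print(var_f, mean_fp2)     # roughly 2 and 4, so the inequality holds
```

The gap between the two sides depends on f; equality holds for linear f, where both sides equal the squared slope.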
Proof
Can we see a proof of this inequality? Mwilde 19:25, 5 Mar 2005 (UTC)
Mistake?
Is the inequality correct, as stated?
This article looks like a copy of Theorem 3 (page 6) from the paper
http://www.math.ucsd.edu/~fan/wp/concen.pdf
but in that paper the bound carries an additional factor of n. So which one is correct?
~Ruksis~
- The second one, I think. Clearly it has to depend on n. Michael Hardy 20:01, 21 July 2006 (UTC)
- No. The first one is correct. It depends on n through the variance. Daniel Hsu, 25 August 2006
The paper referred to is likely the source of confusion. It talks about σ^2 being the variance of X_i, but that doesn't make sense, since the various variables could have distinct variances. Earlier versions in Wikipedia had the same mistake. If all the X_i had the same variance, say if they were identically distributed, and σ^2 denoted the variance of (all the) X_i, then the Chernoff bound would need the factor of n in the exponent. In its current, more general, form it does not. Peter van Rossum, 27 September 2006.
Temporary solution
The claim as stated was not true. For example, take all to be with probability and with probability . Then the variance of X is 1/n but X does not have exponential distribution: for example, the probability that is rather than .
I've corrected the mistake in a silly way. The guy who put this here originally should find out the correct version of the claim in the generality he had in mind and put it here.
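As a sanity check on a counterexample of this shape (the formulas in the comment above did not survive, so the specific construction here, X_i = ε_i/n with ε_i = ±1 each with probability 1/2, is my guess at what was intended): X = X_1 + ... + X_n then has Var X = 1/n, while P(X = 1) = 2^{-n}. Exact enumeration:

```python
from fractions import Fraction
from math import comb

def dist(n):
    # X = (1/n) * (e_1 + ... + e_n), each e_i = +1 or -1 with probability 1/2
    # (a guessed reading of the construction in the comment above)
    pmf = {}
    for k in range(n + 1):                      # k = number of +1 signs
        x = Fraction(2 * k - n, n)              # resulting value of X
        pmf[x] = Fraction(comb(n, k), 2 ** n)   # binomial probability
    return pmf

def variance(pmf):
    mean = sum(x * p for x, p in pmf.items())
    return sum((x - mean) ** 2 * p for x, p in pmf.items())

n = 10
pmf = dist(n)
print(variance(pmf))        # Var X = 1/n
print(pmf[Fraction(1)])     # P(X = 1) = 2**-n
```

With exact rational arithmetic there is no rounding to worry about; for n = 10 the variance comes out to 1/10 and P(X = 1) to 1/1024.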
Bounds on
Should
- for all i
actually be
- for all i?
Requiring this is rather limiting.
Chernoff bounds
The bound in this article was known long before Chernoff (see e.g. S. Bernstein's book on probability, 192?). In my opinion it is only confusing to call this Chernoff's inequality; also, it is not reasonable to have three articles with more or less the same name. If no one objects, I will unite all three and revise the terminology. Sodin 02:29, 9 August 2007 (UTC)
Merger Proposal
In the current version, the inequality stated in the article is incorrect. In my opinion, all the (correct) content should be transferred to Chernoff bound. Please comment. Sasha 12 September 2008 —Preceding undated comment was added at 19:58, 12 September 2008 (UTC).