Talk:Chernoff's inequality


Comments


I added an article on Chernoff bound which, though a special case of the inequality here, merits special attention. Perhaps the articles could be combined in some form. CSTAR 23:08, 21 Jun 2004 (UTC)

Do they really have to be discrete random variables? (At this moment I'm too lazy to figure this out for myself, but I'll probably do so soon.) Michael Hardy 22:51, 24 Jun 2004 (UTC)

Am I right in thinking that μ is the mean of X? -- Anon

I always thought that "Chernoff inequality" refers to the inequality for a Gaussian variable $X \sim N(0,1)$ and a differentiable function $g$ such that $\mathrm{E}[g'(X)^2] < \infty$, and states that in this case $\operatorname{Var}(g(X)) \le \mathrm{E}[g'(X)^2]$. I promise to write on this subject later. Amir Aliev 11:18, 23 May 2006 (UTC)
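
For concreteness, here is a minimal Monte Carlo sketch of the variance inequality described above. The choice $g(x) = \sin x$ (so $g'(x) = \cos x$) and the sample size are arbitrary; this only illustrates the statement numerically, it is not a proof.

```python
# Numerical sanity check of the variance form of Chernoff's inequality:
# for X ~ N(0, 1) and a smooth function g, Var(g(X)) <= E[g'(X)^2].
# Here g(x) = sin(x), so g'(x) = cos(x); the choice of g is arbitrary.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

lhs = np.var(np.sin(x))        # Var(g(X))  (about 0.432 for g = sin)
rhs = np.mean(np.cos(x) ** 2)  # E[g'(X)^2] (about 0.568 for g = sin)

print(f"Var(sin X) = {lhs:.4f}")
print(f"E[cos^2 X] = {rhs:.4f}")
print("inequality holds:", lhs <= rhs)
```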

Proof


Can we see a proof of this inequality? Mwilde 19:25, 5 Mar 2005 (UTC)

Mistake?


Is the inequality correct, as stated?

$\Pr(|X| \ge k\sigma) \le 2e^{-k^2/4}$

This article looks like a copy of Theorem 3 (page 6) from the paper

http://www.math.ucsd.edu/~fan/wp/concen.pdf

But in that paper we have:

$\Pr(|X| \ge k\sigma) \le 2e^{-k^2/(4n)}$

with an additional $n$. So which one is correct?

~Ruksis~

The second one, I think. Clearly it has to depend on n. Michael Hardy 20:01, 21 July 2006 (UTC)
No. The first one is correct. It depends on n through $\sigma$. Daniel Hsu, 25 August 2006

The paper referred to is likely the source of the confusion. It talks about $\sigma^2$ being the variance of $X_i$, but that doesn't make sense, since the various variables $X_i$ could have distinct variances. Earlier versions in Wikipedia have the same mistake. If all the $X_i$ had the same variance, say if they were identically distributed, and $\sigma^2$ denoted the variance of (all the) $X_i$, then the Chernoff bound would need the $n$ in the exponent, $2e^{-k^2/(4n)}$; in its current, more general form it does not. Peter van Rossum, 27 September 2006.
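
To illustrate the distinction numerically, here is a rough Monte Carlo sketch. The parameters ($n = 100$, Rademacher $X_i$, the values of $k$) are arbitrary choices, not taken from the article or the paper: with $\sigma$ read as the standard deviation of $X$ the bound $2e^{-k^2/4}$ holds, while with $\sigma$ read as the per-variable standard deviation the same right-hand side is exceeded once $k \ge 2$, which is why that reading needs the $n$ in the exponent.

```python
# Compare Pr(|X| >= k*sigma) with 2*exp(-k^2/4) for X = X_1 + ... + X_n,
# where the X_i are independent, E[X_i] = 0, |X_i| <= 1 (Rademacher),
# under the two readings of sigma discussed above.
import numpy as np

rng = np.random.default_rng(0)
n, trials = 100, 100_000
X = rng.choice([-1.0, 1.0], size=(trials, n)).sum(axis=1)

sigma_sum = np.sqrt(n)  # standard deviation of X (each X_i has variance 1)
sigma_one = 1.0         # standard deviation of a single X_i

for k in (1.0, 2.0, 3.0):
    bound = 2 * np.exp(-k ** 2 / 4)
    p_sum = np.mean(np.abs(X) >= k * sigma_sum)  # sigma = std of X: stays below the bound
    p_one = np.mean(np.abs(X) >= k * sigma_one)  # sigma = std of X_i: exceeds it for k >= 2
    print(f"k={k}: bound={bound:.3f}  "
          f"P(|X| >= k*std(X))={p_sum:.3f}  "
          f"P(|X| >= k*std(X_i))={p_one:.3f}")
```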

Temporary solution


The claim as stated was not true. For example, take all $X_i$ to be   with probability   and   with probability  . Then the variance of X is 1/n but X does not have exponential distribution: for example, the probability that   is   rather than  .

I've corrected the mistake in a silly way. The guy who put this here originally should find out the correct version of the claim in the generality he had in mind and put it here.

Uffish

Bounds on $X_i$


Should

$0 \le X_i \le 1$ for all i

actually be

$|X_i| \le 1$ for all i?

Requiring $0 \le X_i \le 1$ with $\mathrm{E}[X_i] = 0$ is rather limiting ($X_i = 0$ almost surely).

Parubin 22:29, 26 November 2006 (UTC)
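
If the condition in question is indeed $0 \le X_i \le 1$, the parenthetical degeneracy follows in one line from Markov's inequality:

```latex
% A nonnegative random variable with mean zero is almost surely zero,
% so requiring 0 <= X_i <= 1 together with E[X_i] = 0 is degenerate.
\[
  X_i \ge 0,\quad \mathrm{E}[X_i] = 0
  \;\Longrightarrow\;
  \Pr(X_i > \varepsilon) \le \frac{\mathrm{E}[X_i]}{\varepsilon} = 0
  \ \text{for every } \varepsilon > 0,
  \quad\text{hence } X_i = 0 \text{ a.s.}
\]
```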

Chernoff bounds


The bound in this article was known long before Chernoff (see e.g. S. Bernstein's book on probability, 192?). In my opinion it is only confusing to call this Chernoff's inequality; also, it is not reasonable to have three articles with more or less the same name. If no one objects, I will unite all three and revise the terminology. Sodin 02:29, 9 August 2007 (UTC)

Merger Proposal


In the current version, the inequality stated in the article is incorrect. In my opinion, all the (correct) content should be transferred to Chernoff bound. Please comment. Sasha 12 September 2008 —Preceding undated comment was added at 19:58, 12 September 2008 (UTC).