Talk:Cochran's theorem

Something appears to be missing in the statement of the theorem:

First, it seems that the terms on the right of the equation should be expressed as U^T Q_i U, with Q_i being a matrix and U being the vector of normals. This way the statement that Q_i is of rank r_i has some meaning - r.v.'s don't have a rank.

Second, for the theorem to hold, there must be some constraint on the relationship between the Q's, otherwise they could, for example, be identical, and they would definitely not be independent.

--143.183.121.2 16:49, 24 Jun 2005 (UTC)
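
To illustrate the comment above numerically: writing the terms as quadratic forms U^T Q_i U makes "rank r_i" refer to the matrix Q_i, and without a constraint relating the Q_i they could indeed be identical and hence not independent. A minimal sketch (my own example in numpy; the matrices and names are chosen purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 20000
U = rng.standard_normal((reps, n))       # rows are draws of U ~ N(0, I_n)

Q1 = np.eye(n) / 2                       # two identical symmetric matrices ...
Q2 = np.eye(n) / 2                       # ... with Q1 + Q2 = I
q1 = np.einsum('ri,ij,rj->r', U, Q1, U)  # U^T Q1 U for each draw
q2 = np.einsum('ri,ij,rj->r', U, Q2, U)  # U^T Q2 U for each draw

print(np.allclose(q1 + q2, (U**2).sum(axis=1)))  # True: the forms sum to U^T U
print(np.corrcoef(q1, q2)[0, 1])                 # exactly 1: clearly not independent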

Who is Cochran?

I assume it's William Gemmell Cochran. Webhat 08:43, 13 May 2006 (UTC)

References?

There are no references included in the article. Cochran's theorem appears in many textbooks, but statements made in the Example — albeit appropriately derived — are new to me. I'm particularly interested in   — does it appear in a book or is it the editor's own contribution? Ml78712 08:11, 28 June 2007 (UTC)

There needs to be a reference to where Cochran first published his theorem. I am not a dog (talk) 15:39, 18 April 2008 (UTC)
The whole article needs to be rewritten. Whoever wrote the article was likely taking $\hat{\sigma}^2 = \tfrac{1}{n}\sum_{i=1}^n (X_i - \bar{X})^2$ as the maximum likelihood estimator of $\sigma^2$, which does use a denominator of n. But that ought to be stated clearly.
The whole theorem is really better stated as a theorem regarding quadratic forms of Normal random variables. In that case, if $U \sim N_n(0, I)$, then $\sum_{i=1}^{k} U^{\mathsf{T}} A_i U \sim \chi^2$ if and only if $A_i^2 = A_i$ and $A_j^2 = A_j$ and $A_i A_j = 0$ for every $i \neq j$. That is, the matrices that define the quadratic forms must be idempotent and mutually orthogonal. Moreover, $U^{\mathsf{T}} A_i U \sim \chi^2_{r_i}$, where $r_i$ is the rank of $A_i$.
The original citation is Cochran, W.G., "The Distribution of Quadratic Forms in a Normal System", Proc. Cam. Phil. Soc. (1934), 178. (That citation was pulled from Graybill's Matrices with Applications in Statistics.)

Dennis Clason —Preceding unsigned comment added by 97.119.159.219 (talk) 05:53, 12 January 2011 (UTC)
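
A small simulation consistent with the restatement above (my own sketch, not from the article; it uses the textbook projection matrices A_1 = J/n and A_2 = I - J/n, which are idempotent, mutually orthogonal, and of ranks 1 and n - 1):

import numpy as np

rng = np.random.default_rng(1)
n, reps = 6, 50000
U = rng.standard_normal((reps, n))           # draws of U ~ N(0, I_n)

A1 = np.full((n, n), 1.0 / n)                # projection onto the constant vector
A2 = np.eye(n) - A1                          # projection onto its orthogonal complement

assert np.allclose(A1 @ A1, A1) and np.allclose(A2 @ A2, A2)   # idempotent
assert np.allclose(A1 @ A2, np.zeros((n, n)))                  # mutually orthogonal
r1, r2 = np.linalg.matrix_rank(A1), np.linalg.matrix_rank(A2)  # 1 and n - 1

Q1 = np.einsum('ri,ij,rj->r', U, A1, U)      # U^T A1 U for each draw
Q2 = np.einsum('ri,ij,rj->r', U, A2, U)      # U^T A2 U for each draw

# A chi^2_r variable has mean r and variance 2r; the simulation matches the ranks.
print(r1, Q1.mean(), Q1.var())    # 1, ~1, ~2
print(r2, Q2.mean(), Q2.var())    # 5, ~5, ~10
print(np.corrcoef(Q1, Q2)[0, 1])  # ~0, consistent with independence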

Unless I'm mistaken, your restatement of the theorem is not quite true. Specifically, I think you've provided a sufficient, but not necessary, condition. For example, take k = 2 with A_1 = 2I and A_2 = -I. Clearly, U'(A_1 + A_2)U = U'U and is chi-squared on n df, but the A_i's don't satisfy the requirements of idempotence or mutual orthogonality. Dakster86 (talk) 02:01, 9 October 2024 (UTC)
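
The counterexample above is easy to check numerically; here is a short sketch of my own that simply restates it in numpy:

import numpy as np

rng = np.random.default_rng(2)
n, reps = 4, 50000
U = rng.standard_normal((reps, n))             # draws of U ~ N(0, I_n)

A1, A2 = 2 * np.eye(n), -np.eye(n)             # neither idempotent nor mutually orthogonal
print(np.allclose(A1 @ A1, A1), np.allclose(A1 @ A2, np.zeros((n, n))))   # False False

S = np.einsum('ri,ij,rj->r', U, A1 + A2, U)    # U'(A1 + A2)U = U'U
# A chi^2_n variable has mean n and variance 2n.
print(S.mean(), S.var())                       # ~4, ~8 for n = 4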

Rank of Qi needs explaining

In the Overview, a more careful definition and explanation are needed. What is the rank of a sum of squares of linear combinations? Are these supposed to be thought of as symmetric bilinear forms? Xenonice (talk) 02:19, 2 October 2008 (UTC)

Definitely they are supposed to be thought of as symmetric bilinear forms. More later........ Michael Hardy (talk) 02:30, 2 October 2008 (UTC)
I agree - something is very wrong in the statement here. Looks like a better statement is [1]. --Zvika (talk) 12:35, 30 July 2009 (UTC)
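
One concrete way to read "rank" here, in line with the bilinear-form reading above (an illustrative sketch of my own, not taken from the article): a sum of squares of linear combinations of the U's equals U^T B U for the symmetric matrix B = L^T L, and the rank in question is rank(B).

import numpy as np

rng = np.random.default_rng(3)
n = 4
L = np.array([[1., -1., 0., 0.],    # two linear combinations of U_1, ..., U_4
              [0., 1., -1., 0.]])

B = L.T @ L                          # symmetric matrix of the quadratic form
print(np.linalg.matrix_rank(B))      # 2 = number of independent linear combinations

U = rng.standard_normal(n)
print(np.allclose(((L @ U) ** 2).sum(), U @ B @ U))   # True: sum of squares = U^T B U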

Expert subject tag

I have added the expert-subject tag for statistics as there has been no action relating to the general "factual accuracy" tag, which presumably relates to the discussion above. Melcombe (talk) 16:36, 26 May 2010 (UTC)

The sign ~ in "alternative formulation"

I presume that the sign ~ in the following extract from the article was not supposed to render as an unreadable superscript. Were there no explanation given in words, the formula would need more guessing to understand; even so, I believe it still deserves a small correction. M-

Alternative formulation

The following version is often seen when considering linear regression.

Suppose   is a standard multivariate Gaussian random variable

Orthogonal Matrix in Proof

Under the proof section, right after showing that all B matrices are simultaneously diagonalizable, an orthogonal matrix S is introduced. Is this always true? I thought only normal matrices can be diagonalized by unitary matrices, and $B_i$ is not necessarily normal. D4nn0v (talk) 06:05, 15 October 2019 (UTC)

I agree that the article could be clearer.
It says that the matrices B_i are symmetric. If we accept that they are also real, then the matrices B_i are symmetric and real, and therefore equal to their conjugate transpose (and to their conjugate and to their transpose). If they are equal to their conjugate transpose, then clearly they commute with their conjugate transpose, and that is the definition of a normal matrix.
That is, if we accept that the matrices B_i are real, which I am not sure the article mentions explicitly.
Part of the claim of the theorem is that the quantities Q_i = U^T B_i U have chi-square distributions and the U's have standard normal distributions. To me that means that both the Q_i's and the U's are real, which clearly points to the fact that the matrices B_i should be real. I do not think we could hope that a matrix B_i is not real and somehow its imaginary parts cancel each other when we compute U^T B_i U, because the entries of U are random.
I could be missing something, but that is how I see this part.
But, as stated above, I feel the article could be clearer about this and several other things. 2601:546:C480:750:0:0:0:4 (talk) 22:09, 12 December 2022 (UTC)
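
For what it's worth, the chain real symmetric => normal => orthogonally diagonalizable discussed above is easy to check numerically; a minimal sketch of my own, with a randomly generated real symmetric B standing in for one of the B_i:

import numpy as np

rng = np.random.default_rng(4)
n = 5
M = rng.standard_normal((n, n))
B = (M + M.T) / 2                    # a real symmetric matrix

# Real symmetric: B equals its conjugate transpose, hence B is normal.
print(np.allclose(B, B.conj().T))
print(np.allclose(B @ B.conj().T, B.conj().T @ B))

# The spectral theorem: eigh returns an orthogonal S with S^T B S diagonal.
w, S = np.linalg.eigh(B)
print(np.allclose(S.T @ S, np.eye(n)))         # S is orthogonal
print(np.allclose(S.T @ B @ S, np.diag(w)))    # and it diagonalizes B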