Talk:Studentized residual


Relation to weighted least squares

It looks like weighted least squares, in which you divide by the variance, is the same as minimizing the norm of the Studentized residuals. Is that right? —Ben FrantzDale (talk) 15:24, 26 November 2008 (UTC)

Almost. It is minimizing the sum (over all data points) of the squares of the Studentized residuals. Physchim62 (talk) 12:58, 3 May 2009 (UTC)
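
For readers comparing the two objectives, here is a rough sketch of what is being equated (the notation below is assumed for illustration, not taken from the article): weighted least squares with per-point variances sigma_i^2 minimizes the first sum, while the quantity described in the reply above is the second; the two expressions match only if the variance attached to each point is taken as sigma-hat^2 * (1 - h_ii) and the residuals come from the fitted model.

% Assumed notation: e_i = y_i - x_i' beta-hat are the OLS residuals, h_{ii} the
% leverages, \hat\sigma^2 the residual mean square, \sigma_i^2 the per-point variances.
\[
  \text{WLS objective:}\quad \sum_i \frac{(y_i - x_i^\top \beta)^2}{\sigma_i^2}
  \qquad\qquad
  \text{sum of squared Studentized residuals:}\quad \sum_i \frac{e_i^2}{\hat\sigma^2\,(1 - h_{ii})}
\]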

Externally Studentized Residuals

The way I read this, it is suggested that the Studentized residuals are defined by:

 

However I believe the correct definition is:

 

(I looked at equation 8.1.17 of Draper and Smith, Applied Regression Analysis, and at how this is implemented in ls.diag in R.)

Erwin.kalvelagen (talk) 12:34, 1 October 2009 (UTC)
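
For anyone comparing the two conventions, here is a minimal R sketch (the data are made up, and the leave-one-out variance formula is the standard one): the internally studentized residual divides by the overall sigma-hat, while the externally studentized residual divides by sigma-hat_(i), estimated with observation i deleted; the latter is what rstudent() (and, as far as I can tell, ls.diag's stud.res) returns.

# Made-up data, purely for illustration
set.seed(1)
x <- 1:20
y <- 2 + 0.5 * x + rnorm(20)
fit <- lm(y ~ x)

e <- residuals(fit)       # ordinary residuals
h <- hatvalues(fit)       # leverages h_ii
s <- summary(fit)$sigma   # overall residual standard error
n <- length(e); p <- length(coef(fit))

# internally studentized: scale by the overall sigma-hat
internal <- e / (s * sqrt(1 - h))

# externally studentized: scale by sigma-hat estimated without observation i,
# using RSS_(i) = RSS - e_i^2 / (1 - h_ii)
s_i <- sqrt((sum(e^2) - e^2 / (1 - h)) / (n - p - 1))
external <- e / (s_i * sqrt(1 - h))

all.equal(internal, rstandard(fit))   # TRUE
all.equal(external, rstudent(fit))    # TRUE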

Correction needed?

It is claimed at User_talk:Michael_Hardy/Archive7#Studentized_residuals that a correction is needed in a formula in this article. I will look at that shortly. In the meantime, I've put a "factual accuracy" tag atop the article. Michael Hardy (talk) 19:09, 18 February 2014 (UTC)

Formula for the variances of the residuals

In Kendall's Advanced Theory of Statistics, 5th Ed., by Alan Stuart and J. Keith Ord, page 1044, the variance of the residuals is given as

var(y[n+1] - y-hat[n+1]) = sigma^2 * (1 + x_0' (X'X)^-1 x_0)

and in Judge et al., Introduction to the Theory and Practice of Econometrics, 2nd Ed., page 210, we have

E[(y-hat_0 - y_0)(y-hat_0 - y_0)'] = sigma^2 [X_0 (X'X)^-1 X_0' + I_T0],

and noting that, for the simple regression case (Kendall, page 1045),

x_0' (X'X)^-1 x_0 = 1/n + (x_0 - x-bar)^2 / Sum(x - x-bar)^2

Thus the formula for the variance of the residuals contains 1 + 1/n + (x_0 - x-bar)^2 / Sum(x - x-bar)^2, i.e., 1 + h_ii rather than 1 - h_ii.

Did I miss anything? — Preceding unsigned comment added by 108.18.33.210 (talk) 19:58, 21 April 2015 (UTC)

Yes, this is the formula for the variance of a new observation; the formula for the variance of the residuals uses 1 - h_ii. — Preceding unsigned comment added by 204.188.186.4 (talk) 00:55, 23 April 2015 (UTC)
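
A short derivation sketch of why the two cases differ (standard OLS notation, assumed here rather than quoted from either book): fitted residuals involve the hat matrix with a minus sign, whereas a new observation y_0 is independent of the estimated coefficients, so its leverage term adds instead of subtracts.

% Assumed notation: H = X(X'X)^{-1}X', e = (I - H)y, and y_0 = x_0'\beta + \varepsilon_0
% with \varepsilon_0 independent of the data used to estimate \beta.
\[
  \operatorname{Var}(e) = \operatorname{Var}\bigl((I - H)y\bigr) = \sigma^2 (I - H)
  \quad\Rightarrow\quad \operatorname{Var}(e_i) = \sigma^2 (1 - h_{ii}),
\]
\[
  \operatorname{Var}\bigl(y_0 - x_0^\top \hat\beta\bigr)
  = \sigma^2 + \sigma^2\, x_0^\top (X^\top X)^{-1} x_0
  = \sigma^2 \bigl(1 + x_0^\top (X^\top X)^{-1} x_0\bigr).
\]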

How can a single residual have its own standard deviation?

"Typically the standard deviations of residuals in a sample vary greatly from one data point to another"

Can someone explain how a single residual can have its own standard deviation? Isn't it a single number? — Preceding unsigned comment added by 2601:641:200:1356:25C7:46B:898F:63B8 (talk) 06:55, 12 November 2015 (UTC)

That would be the a priori/assumed/expected standard deviation; it can be corrected or scaled a posteriori by estimating a global variance factor for all residuals, or variance factors for groups of residuals. fgnievinski (talk) 01:43, 13 November 2015 (UTC)

Wouldn't that be a standard error?
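
To make the quoted sentence concrete: each residual, viewed as a random variable across repeated samples, has a theoretical standard deviation of roughly sigma * sqrt(1 - h_ii) under the usual OLS assumptions. A small hypothetical R simulation (the design, seed, and sigma are mine) checks this empirically; the high-leverage point shows visibly smaller residual spread.

# Hypothetical simulation: refit the same design many times and look at the
# spread of each residual across replications.
set.seed(42)
x <- c(1:10, 30)                 # the last point has high leverage
X <- cbind(1, x)
h <- diag(X %*% solve(t(X) %*% X) %*% t(X))
sigma <- 1

res <- replicate(5000, {
  y <- 2 + 0.5 * x + rnorm(length(x), sd = sigma)
  residuals(lm(y ~ x))
})

# empirical sd of each residual vs. the theoretical sigma * sqrt(1 - h_ii)
round(cbind(empirical = apply(res, 1, sd),
            theoretical = sigma * sqrt(1 - h)), 3)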

The introduction is unclear. If by a Studentized residual we mean the distance from zero to that residual divided by the standard deviation of all the other residuals, excluding that residual, then we would have a more general definition that is not limited to the assumption of ordinary least squares (OLS) in y. That is, such a more general definition would work just as well for a minimized proportional-norm-type error, or for any residual set more general than OLS; it would also be a lot clearer and simpler to define, and likely more generally useful. CarlWesolowski (talk) 00:42, 2 September 2022 (UTC)
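
For concreteness, here is a minimal R sketch of the alternative definition described in the comment above (this is not the article's definition, only an illustration): each residual divided by the standard deviation of all the other residuals, with no OLS-specific leverage correction.

# Not the article's definition: scale each residual by the standard deviation
# of the remaining residuals.
scale_by_others <- function(r) {
  sapply(seq_along(r), function(i) r[i] / sd(r[-i]))
}

r <- residuals(lm(dist ~ speed, data = cars))  # any residual vector would do
scale_by_others(r)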