
Checking for outliers in regression

According to Hoaglin and Welsch (1978), leverage values above 2(p+1)/N, where p is the number of predictors in a regression on N observations (items), indicate influential observations. If the sample size is less than 30, a stricter criterion such as 3(p+1)/N is suggested.
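As an illustration, the sketch below (not part of the original page; the simulated data and the variable names X and y are assumptions) computes the hat values with statsmodels and flags observations above this cut-off:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
N, p = 50, 2                         # N observations, p predictors
X = rng.normal(size=(N, p))
y = X @ np.array([1.0, -0.5]) + rng.normal(size=N)

fit = sm.OLS(y, sm.add_constant(X)).fit()
leverage = fit.get_influence().hat_matrix_diag    # hat values h(i)

# Hoaglin and Welsch cut-off, tightened for small samples
cutoff = 2 * (p + 1) / N if N >= 30 else 3 * (p + 1) / N
print(f"cut-off = {cutoff:.3f}; flagged:", np.where(leverage > cutoff)[0])
```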

Leverage is also related to the i-th observation's Mahalanobis distance, MD(i). For a sample of size N,

Leverage for observation i = MD(i)/(N-1) + 1/N

so

Critical $$\mbox{MD}_i = (2(p+1)/N - 1/N)(N-1)$$

(See Tabachnick and Fidell)
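As a worked example (added here for illustration; the function name critical_mahalanobis is an assumption), the leverage cut-off can be rearranged into a critical Mahalanobis distance:

```python
def critical_mahalanobis(p, N, multiplier=2):
    """Critical MD(i) implied by a leverage cut-off of multiplier*(p+1)/N."""
    leverage_cutoff = multiplier * (p + 1) / N
    return (leverage_cutoff - 1 / N) * (N - 1)

# e.g. p = 2 predictors and N = 50 observations:
print(critical_mahalanobis(2, 50))   # (0.12 - 0.02) * 49 = 4.9
```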

Other outlier detection methods include boxplots, described in the Exploratory Data Analysis graduate talk located here, and z-score based tests such as Grubbs' test - further details and an on-line calculator are located here.
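For illustration, a minimal sketch of a two-sided Grubbs' test for a single outlier, written from the standard formula rather than taken from the on-line calculator; the example data are made up:

```python
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.05):
    """Two-sided Grubbs test for a single outlier; returns (G, G_crit, flag)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    G = np.max(np.abs(x - x.mean())) / x.std(ddof=1)
    t = stats.t.ppf(1 - alpha / (2 * N), N - 2)
    G_crit = (N - 1) / np.sqrt(N) * np.sqrt(t**2 / (N - 2 + t**2))
    return G, G_crit, G > G_crit

print(grubbs_test([2.1, 2.3, 2.2, 2.0, 2.4, 5.8]))   # the 5.8 should be flagged
```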

Hair, Anderson, Tatham and Black (1998) suggest that Cook's distances greater than 1 indicate influential observations.
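A minimal sketch of this rule of thumb, assuming simulated data and the statsmodels influence diagnostics; the planted extreme observation is an assumption for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
N, p = 40, 3
X = rng.normal(size=(N, p))
y = X.sum(axis=1) + rng.normal(size=N)
X[0] = [4.0, 4.0, 4.0]     # make one observation extreme in X ...
y[0] = -20.0               # ... and in y, so it should be highly influential

fit = sm.OLS(y, sm.add_constant(X)).fit()
cooks_d = fit.get_influence().cooks_distance[0]   # first element holds the distances
print("Cook's distance > 1 for observations:", np.where(cooks_d > 1)[0])
```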

References

Hair, J., Anderson, R., Tatham, R. and Black W. (1998). Multivariate Data Analysis (fifth edition). Englewood Cliffs, NJ: Prentice-Hall.

Hoaglin, D. C. and Welsch, R. E. (1978). The hat matrix in regression and ANOVA. The American Statistician 32, 17-22.


These pages are maintained by Ian Nimmo-Smith and Peter Watson