Synopsis of the CBU Graduate Statistics Course 2008
The Anatomy of Statistics: Models, Hypotheses, Significance and Power
- Experiments, Data, Models and Parameters
- Probability vs. Statistics
- Hypotheses and Inference
- The Likelihood Function
- Estimation and Inferences
- Maximum Likelihood Estimate (MLE)
- Schools of Statistical Inference
- Ronald Aylmer FISHER
- Jerzy NEYMAN and Egon PEARSON
- Rev. Thomas BAYES
- R A Fisher: P values and Significance Tests
- Neyman and Pearson: Hypothesis Tests
- Type I & Type II Errors
- Size and Power
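A minimal sketch, purely as an illustration of the likelihood and MLE ideas above (the course itself uses SPSS); the binomial data are invented and Python with numpy/scipy is assumed:
{{{
# Illustration only: the likelihood function for an invented binomial
# experiment, its maximum likelihood estimate, and an exact P value.
import numpy as np
from scipy import stats

n, k = 20, 14                                   # 14 successes in 20 trials (made up)
p_grid = np.linspace(0.01, 0.99, 99)
likelihood = stats.binom.pmf(k, n, p_grid)      # L(p) = P(data | p)

p_mle = p_grid[np.argmax(likelihood)]
print(f"MLE of p: {p_mle:.2f} (analytic k/n = {k/n:.2f})")

# Significance test of H0: p = 0.5 (exact binomial test, two-sided)
print("P value:", stats.binomtest(k, n, 0.5).pvalue)
}}}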
Exploratory Data Analysis (EDA)
- What is it?
- Skew and kurtosis: definitions and magnitude rules of thumb
- Pictorial representations - in particular histograms, boxplots and stem and leaf displays
- Effect of outliers
- Power transformations
- Rank transformations
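Purely as an illustration of the EDA ideas above, a minimal Python sketch (numpy and scipy assumed) showing skew and kurtosis of invented right-skewed data and the effect of a log (power) transformation:
{{{
# Illustration only: skew/kurtosis of simulated right-skewed data, and how a
# log (power) transformation and a rank transformation change the picture.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=0.8, size=200)   # deliberately right-skewed

print("skew:", stats.skew(x), " excess kurtosis:", stats.kurtosis(x))

y = np.log(x)                                      # power/log transformation
print("skew after log:", stats.skew(y))            # much closer to 0

ranks = stats.rankdata(x)                          # rank transformation (1..n)
}}}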
Categorical Data Analysis
- The Naming of Parts
- Categorical Data
- Frequency Tables
- The Chi-Squared Goodness-of-Fit Test
- The Chi-squared Distribution
- The Binomial Test
- The Chi-squared test for association
- Simpson, Cohen and McNemar
- SPSS procedures that help
- Frequencies
- Crosstabs
- Chi-square
- Binomial
- Types of Data
- Quantitative
- Qualitative
- Nominal
- Ordinal
- Frequency Table
- Bar chart
- Cross-classification or Contingency Table
- Simple use of SPSS Crosstabs
- Goodness of Fit Chi-squared Test
- Chance performance and the Binomial Test
- Confidence Intervals for Binomial Proportions
- Pearson’s Chi-squared
- Yates’ Continuity Correction
- Fisher’s Exact Test
- Odds and Odds Ratios
- Log Odds and Log Odds ratios
- Sensitivity and Specificity
- Signal Detection Theory
- Simpson’s Paradox
- Measures of agreement: Cohen's Kappa
- Measures of change: McNemar’s Test
- Association or Independence: Chi-squared test of association
- Comparing two or more classified samples
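The course analyses categorical data with SPSS Crosstabs; purely as an illustrative sketch, the same tests on an invented 2x2 table in Python (numpy and scipy assumed):
{{{
# Illustration only: Pearson's chi-squared with Yates' correction, Fisher's
# exact test and the odds ratio for an invented 2x2 contingency table, plus a
# goodness-of-fit chi-squared against equal expected frequencies.
import numpy as np
from scipy import stats

table = np.array([[30, 10],     # rows: group A / group B
                  [20, 25]])    # columns: outcome present / absent

chi2, p, dof, expected = stats.chi2_contingency(table, correction=True)
print(f"chi-squared = {chi2:.2f}, df = {dof}, p = {p:.3f}")

odds_ratio, p_exact = stats.fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, Fisher's exact p = {p_exact:.3f}")

observed = [18, 22, 30, 10]     # goodness of fit: are 4 categories used equally?
print(stats.chisquare(observed))
}}}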
Regression
- What is it?
- Expressing correlations (simple regression) in vector form
- Scatterplots
- Assumptions in regression
- Restriction of range of a correlation
- Comparing pairs of correlations
- Multiple regression
- Least squares
- Residual plots
- Stepwise methods
- Synergy
- Collinearity
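A minimal least-squares sketch of multiple regression (illustration only, simulated data, numpy assumed); the course itself works through this in SPSS:
{{{
# Illustration only: multiple regression by least squares on simulated data,
# with residuals available for a residual plot.
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])          # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)       # least-squares coefficients
residuals = y - X @ beta

print("coefficients:", np.round(beta, 2))          # close to [2.0, 1.5, -0.7]
print("R-squared:", round(1 - residuals.var() / y.var(), 2))
}}}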
Between subjects analysis of variance
- What is it used for?
- Main effects
- Interactions
- Simple effects
- Plotting effects
- Implementation in SPSS
- Effect size
- Model specification
- Latin squares
- Balance
- Venn diagram depiction of sources of variation
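As an illustration of the between-subjects ideas above (the course implementation is in SPSS), a minimal one-way ANOVA sketch with an eta-squared effect size, using simulated groups and assuming numpy/scipy:
{{{
# Illustration only: one-way between-subjects ANOVA on three simulated groups,
# with eta-squared (SS_between / SS_total) as the effect size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
g1, g2, g3 = (rng.normal(m, 2, 20) for m in (10, 12, 11))

F, p = stats.f_oneway(g1, g2, g3)
print(f"F(2, 57) = {F:.2f}, p = {p:.3f}")

scores = np.concatenate([g1, g2, g3])
ss_total = ((scores - scores.mean()) ** 2).sum()
ss_between = sum(len(g) * (g.mean() - scores.mean()) ** 2 for g in (g1, g2, g3))
print("eta-squared:", round(ss_between / ss_total, 2))
}}}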
The General Linear Model and complex designs including Analysis of Covariance
- GLM and Simple Linear Regression
- The Design Matrix
- Least Squares
- ANOVA and GLM
- Types of Sums of Squares
- Multiple Regression as GLM
- Multiple Regression as a sequence of GLMs in SPSS
- The two Groups t-test as a GLM
- One-way ANOVA as GLM
- Multi-factor Model
- Additive (no interaction)
- Non-additive (interaction)
- Analysis of Covariance
- Simple regression
- 1 intercept
- 1 slope
- Parallel regressions
- multiple intercepts
- 1 slope
- Non-parallel regressions
- multiple intercepts
- multiple slopes
- Sequences of GLMs in ANCOVA
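Purely as a sketch of the "two-groups t-test as a GLM" idea (simulated data, numpy/scipy assumed): fitting a design matrix with an intercept and a dummy-coded group column reproduces the difference in group means:
{{{
# Illustration only: the two-group t-test written as a GLM. The slope on the
# dummy-coded group column equals the difference between the group means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
group = np.repeat([0, 1], 25)                      # dummy coding: 0 = A, 1 = B
y = 10 + 2 * group + rng.normal(size=50)           # group B is ~2 units higher

X = np.column_stack([np.ones(50), group])          # design matrix: intercept + group
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("GLM slope:", round(beta[1], 2))
print("mean difference:", round(y[group == 1].mean() - y[group == 0].mean(), 2))
print(stats.ttest_ind(y[group == 1], y[group == 0]))   # the equivalent t-test
}}}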
Power analysis
- Hypothesis testing
- Boosting power
- Effect sizes: definitions, magnitudes
- Power evaluation methods: description and implementation using examples
- nomogram
- power calculators
- SPSS macros
- spreadsheets
- power curves
- tables
- quick formula
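A minimal power-calculation sketch (illustration only, scipy assumed): the normal-approximation formula for the sample size per group in a two-sample comparison, alongside the quick formula n ~ 16/d^2:
{{{
# Illustration only: approximate n per group for 80% power at alpha = .05
# (two-sided) via the normal approximation, compared with the quick formula.
from scipy import stats

def n_per_group(d, alpha=0.05, power=0.80):
    z_alpha = stats.norm.ppf(1 - alpha / 2)
    z_beta = stats.norm.ppf(power)
    return 2 * ((z_alpha + z_beta) / d) ** 2

for d in (0.2, 0.5, 0.8):                     # Cohen's small / medium / large
    print(f"d = {d}: n ~ {n_per_group(d):.0f} per group  (16/d^2 = {16 / d**2:.0f})")
}}}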
Repeated Measures and Mixed Model ANOVA
- Two sample t-Test vs. Paired t-Test
- Repeated Measures as an extension of paired measures
- Single factor Within-Subject design
- Sphericity
- Two (or more) factors Within-Subject design
- Mixed designs combining Within- and Between-Subject factors
- Mixed Models, e.g. both Subjects & Items as Random Effects factors
- The ‘Language as Fixed Effects’ Controversy
- Testing for Normality
- Single degree of freedom approach
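As a sketch of why paired designs help (illustration only, simulated data, numpy/scipy assumed): the paired t-test removes the stable between-subject variability that an independent-samples t-test leaves in the error term:
{{{
# Illustration only: the same pre/post data analysed with an independent-samples
# t-test and with a paired t-test; pairing removes the subject variance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
subject = rng.normal(0, 3, 30)                     # stable individual differences
pre = 50 + subject + rng.normal(0, 1, 30)
post = 52 + subject + rng.normal(0, 1, 30)         # true within-subject gain of 2

print("independent:", stats.ttest_ind(post, pre))  # subject variance inflates the error
print("paired:     ", stats.ttest_rel(post, pre))  # far more sensitive here
}}}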
Latent variable modelling – factor analysis and all that!
- Path diagrams – a regression example
- Comparing correlations
- Exploratory factor analysis
- Assumptions of factor analysis
- Reliability testing (Cronbach’s alpha)
- Fit criteria in exploratory factor analysis
- Rotations
- Interpreting factor loadings
- Confirmatory factor models
- Fit criteria in confirmatory factor analysis
- Equivalence of correlated and uncorrelated models
- Cross validation as a means of assessing fit for different models
- Parsimony: determining the most important items in a factor analysis
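Purely as an illustration of the reliability topic above, a minimal Cronbach's alpha computation on simulated one-factor item data (numpy assumed; in the course this is obtained from SPSS):
{{{
# Illustration only: Cronbach's alpha = k/(k-1) * (1 - sum of item variances /
# variance of the total score), computed on simulated one-factor item data.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)          # shape (respondents, items)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(size=(200, 1))                  # one underlying factor
items = latent + rng.normal(scale=0.8, size=(200, 5))
print("Cronbach's alpha:", round(cronbach_alpha(items), 2))
}}}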
What to do following an ANOVA
- Why do we use follow-up tests?
- Different ways to follow up an ANOVA
- Planned vs. Post Hoc Tests
- Choosing and Coding Contrasts
- Handling Interactions
- Standard Errors of Differences
- Multiple t-tests
- Post Hoc Tests
- Trend Analysis
- Unpacking interactions
- Multiple Comparisons: Watch your Error Rate!
- Post-Hoc vs A Priori Hypotheses
- Comparisons and Contrasts
- Family-wise (FW) error rate
- Experimentwise error rate
- Orthogonal Contrasts or Comparisons
- Planned Comparisons vs. Post Hoc Comparisons
- Orthogonal Contrasts/Comparisons
- Planned Comparisons or Contrasts
- Contrasts in GLM
- Post Hoc Tests
- Control of False Discovery Rate (FDR)
- Simple Main Effects
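As an illustrative sketch of the error-rate control topics above (invented p values, numpy assumed): Bonferroni control of the family-wise error rate compared with Benjamini-Hochberg control of the false discovery rate:
{{{
# Illustration only: Bonferroni (family-wise error) vs Benjamini-Hochberg
# (false discovery rate) applied to an invented set of follow-up p values.
import numpy as np

pvals = np.array([0.001, 0.008, 0.012, 0.030, 0.048, 0.200])
m, alpha = len(pvals), 0.05

bonferroni_reject = pvals < alpha / m              # each p tested at alpha / m

order = np.argsort(pvals)                          # BH: i-th smallest p vs (i/m)*alpha
below = pvals[order] <= (np.arange(1, m + 1) / m) * alpha
k = np.nonzero(below)[0].max() + 1 if below.any() else 0
bh_reject = np.zeros(m, dtype=bool)
bh_reject[order[:k]] = True

print("Bonferroni rejects:", int(bonferroni_reject.sum()), "of", m)
print("BH (FDR) rejects:  ", int(bh_reject.sum()), "of", m)
}}}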