df | α = 0.05 (two-tailed) | α = 0.01 (two-tailed) |
---|---|---|
1 | 12.71 | 63.66 |
2 | 4.30 | 9.92 |
3 | 3.18 | 5.84 |
4 | 2.78 | 4.60 |
5 | 2.57 | 4.03 |
6 | 2.45 | 3.71 |
7 | 2.36 | 3.50 |
8 | 2.31 | 3.36 |
9 | 2.26 | 3.25 |
10 | 2.23 | 3.17 |
11 | 2.20 | 3.11 |
12 | 2.18 | 3.05 |
13 | 2.16 | 3.01 |
14 | 2.14 | 2.98 |
15 | 2.13 | 2.95 |
16 | 2.12 | 2.92 |
17 | 2.11 | 2.90 |
18 | 2.10 | 2.88 |
19 | 2.09 | 2.86 |
20 | 2.09 | 2.85 |
21 | 2.08 | 2.83 |
22 | 2.07 | 2.82 |
23 | 2.07 | 2.81 |
24 | 2.06 | 2.80 |
25 | 2.06 | 2.79 |
26 | 2.06 | 2.78 |
27 | 2.05 | 2.77 |
28 | 2.05 | 2.76 |
29 | 2.05 | 2.76 |
30 | 2.04 | 2.75 |
35 | 2.03 | 2.72 |
40 | 2.02 | 2.70 |
45 | 2.01 | 2.69 |
50 | 2.01 | 2.68 |
60 | 2.00 | 2.66 |
70 | 1.99 | 2.65 |
80 | 1.99 | 2.64 |
90 | 1.99 | 2.63 |
100 | 1.98 | 2.63 |
∞ | 1.96 | 2.58 |
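These two-tailed critical values can be reproduced in R with qt(); a minimal sketch for a small, arbitrary selection of the df values in the table:

```r
# Two-tailed critical t-values, as in the table above
df <- c(1, 2, 5, 10, 30, 100)       # a selection of degrees of freedom
round(qt(1 - 0.05 / 2, df), 2)      # 12.71  4.30  2.57  2.23  2.04  1.98
round(qt(1 - 0.01 / 2, df), 2)      # 63.66  9.92  4.03  3.17  2.75  2.63
```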
Appendix C — Formulas
To copy a formula to MS Word, right-click on the formula and choose 'Copy to clipboard … MathML Code'. Then use CTRL/CMD+V to paste the formula.
To copy a formula to an RMarkdown document, right-click on the formula and choose 'Copy to clipboard … TeX Commands'. In the RStudio Visual Markdown Editor, choose Insert … LaTeX Math … Display Math, and then use CTRL/CMD+V to paste the formula.
C.1 Covariance and Correlation
(Sample) Covariance
\[cov(x,y) = \frac{\sum (x_{i} - \bar{x})(y_{i} - \bar{y})}{n-1}\]
Pearson Correlation
\[r = \frac{cov(x,y)}{SD(x) \times SD(y)}\]
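As an illustration, both quantities can be computed in R directly from the definitions above, with the built-in cov(), cor(), and sd() as a check; the vectors x and y are made-up example data:

```r
# Made-up example data
x <- c(2, 4, 6, 8, 10)
y <- c(1, 3, 5, 9, 11)

# Sample covariance from the definition, and the built-in equivalent
n <- length(x)
cov_xy <- sum((x - mean(x)) * (y - mean(y))) / (n - 1)
cov_xy                       # same as cov(x, y)

# Pearson correlation: covariance divided by the product of the SDs
cov_xy / (sd(x) * sd(y))     # same as cor(x, y)
```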
C.2 Linear Regression
Linear Regression Equation
\[y_{i} = b_{0} + b_{1}x_{1i} + b_{2}x_{2i} + \ldots + b_{k}x_{ki} + \epsilon_{i}\]
Simple Linear Regression: Slope
\[b_{1} = \frac{\sum(x_{i} - \bar{x})(y_{i} - \bar{y})}{\sum(x_{i} - \bar{x})^2}\]
Simple Linear Regression: Intercept/Constant
\[b_{0} = \bar{y} - b_{1}\bar{x}\]
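A minimal R sketch of the slope and intercept formulas, again with made-up x and y; the result should match coef(lm(y ~ x)):

```r
# Made-up example data
x <- c(2, 4, 6, 8, 10)
y <- c(1, 3, 5, 9, 11)

# Slope and intercept from the formulas above
b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0 <- mean(y) - b1 * mean(x)
c(intercept = b0, slope = b1)

# Same coefficients via the built-in linear model function
coef(lm(y ~ x))
```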
Regression Model with Interaction
\[y = b_{0} + b_{1}x_{1} + b_{2}x_{2} + b_{3}(x_{1}x_{2}) + \epsilon\]
Marginal Effects in Interaction Model
\[b_{1} + (x_{2} \times b_{3})\]
\[b_{2} + (x_{1} \times b_{3})\]
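A sketch of an interaction model in R; the data frame dat and its variables x1, x2, and y are simulated purely for illustration:

```r
# Assumed example data with outcome y and two predictors x1, x2
set.seed(1)
dat <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
dat$y <- 1 + 0.5 * dat$x1 + 0.3 * dat$x2 + 0.2 * dat$x1 * dat$x2 + rnorm(100)

# Model with interaction term: y = b0 + b1*x1 + b2*x2 + b3*(x1*x2) + e
fit <- lm(y ~ x1 * x2, data = dat)
b <- coef(fit)

# Marginal effect of x1 evaluated at the mean of x2: b1 + (x2 * b3)
b["x1"] + mean(dat$x2) * b["x1:x2"]
```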
t-test for regression coefficients
\[t = \frac{b}{SE_{b}}\]
Confidence Interval: Coefficient
\[CI = b \pm (t_{df} \times SE_{b})\]
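A sketch of how the t statistic and the confidence interval relate to R's regression output; the data frame d and its variables x and y are simulated for illustration:

```r
# Hypothetical simple regression to illustrate t and the confidence interval
set.seed(2)
d <- data.frame(x = rnorm(50))
d$y <- 2 + 0.6 * d$x + rnorm(50)
fit <- lm(y ~ x, data = d)

est <- summary(fit)$coefficients      # Estimate, Std. Error, t value, Pr(>|t|)

# t statistic for the slope: b / SE_b (matches the "t value" column)
est["x", "Estimate"] / est["x", "Std. Error"]

# 95% confidence interval: b +/- t_df * SE_b (matches confint(fit, "x"))
t_crit <- qt(0.975, df = df.residual(fit))
est["x", "Estimate"] + c(-1, 1) * t_crit * est["x", "Std. Error"]
```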
Regression Sum of Squares (also called the Model Sum of Squares)
\[SS_{Regression} = \sum(\hat{y} - \bar{y})^2\]
Residual Sum of Squares
\[SS_{Residual} = \sum(y_{i} - \hat{y})^2\]
Total Sum of Squares
\[SS_{Total} = \sum(y_{i} - \bar{y})^2\]
R²
\[R^2 = \frac{SS_{Regression}}{SS_{Total}}\]
\[R^2 = 1 - \frac{SS_{Residual}}{SS_{Total}} \]
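The three sums of squares and both R² expressions can be checked against summary(fit)$r.squared; a sketch with simulated example data (d, x, and y are assumptions for illustration):

```r
# Hypothetical regression for the sums of squares and R^2
set.seed(3)
d <- data.frame(x = rnorm(50))
d$y <- 1 + 0.8 * d$x + rnorm(50)
fit <- lm(y ~ x, data = d)

y_hat <- fitted(fit)
ss_regression <- sum((y_hat - mean(d$y))^2)
ss_residual   <- sum((d$y - y_hat)^2)
ss_total      <- sum((d$y - mean(d$y))^2)

# Both expressions give the same value as summary(fit)$r.squared
ss_regression / ss_total
1 - ss_residual / ss_total
```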
Mean Squares: Residual
\[MS_{Residual} = \frac{SS_{Residual}}{\textrm{df}_{Residual}}\]
\[\textrm{df}_{Residual} = n - k - 1\]
Mean Squares: Regression Model
\[MS_{Model} = \frac{SS_{Regression}}{\textrm{df}_{Model}}\]
\[\textrm{df}_{Model} = k\]
F
\[F = \frac{MS_{Model}}{MS_{Residual}}\]
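Continuing the sums-of-squares sketch above (it reuses d, fit, ss_regression, and ss_residual), the mean squares and the F statistic follow directly:

```r
# Degrees of freedom: k predictors in the model, n - k - 1 residual
n <- nrow(d)
k <- length(coef(fit)) - 1            # number of predictors (excluding b0)

ms_model    <- ss_regression / k
ms_residual <- ss_residual / (n - k - 1)

# F statistic; compare with summary(fit)$fstatistic
ms_model / ms_residual
```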
C.3 Logistic Regression
Logistic Regression Model
\[\textrm{log(Odds)} = b_{0} + b_{1}x_{1i} + b_{2}x_{2i} + \ldots\]
Probability with a Single Explanatory Variable
\[P(Y_{i} = 1) = \frac{1}{1 + e^{-(b_{0} + b_{1}x_{1i})}}\]
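A minimal sketch of fitting such a model in R with glm(); the data frame dat2 with binary outcome y01 and predictor x1 is simulated for illustration:

```r
# Assumed example data: binary outcome y01 and a single predictor x1
set.seed(4)
dat2 <- data.frame(x1 = rnorm(200))
dat2$y01 <- rbinom(200, size = 1, prob = plogis(-0.5 + 0.8 * dat2$x1))

# Logistic regression: log(odds) modelled as a linear function of x1
logit_fit <- glm(y01 ~ x1, data = dat2, family = binomial)
b <- coef(logit_fit)

# Predicted probability for x1 = 1, using the formula above
1 / (1 + exp(-(b[1] + b[2] * 1)))
# same as predict(logit_fit, newdata = data.frame(x1 = 1), type = "response")
```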
Odds and Probability
\[odds = \frac{p}{1 - p}\]
\[p = \frac{odds}{1 + odds}\]
Odds Ratio
\[e^{b}\]
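A short sketch of the conversions between probability, odds, and the odds ratio; the last line reuses the hypothetical logit_fit from the previous sketch:

```r
# Probability to odds and back
p <- 0.75
odds <- p / (1 - p)      # 3
odds / (1 + odds)        # 0.75 again

# Odds ratio: exponentiate a logistic regression coefficient,
# here the slope of x1 from the logit_fit sketch above
exp(coef(logit_fit)["x1"])
```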
z statistic
\[z = \frac{b}{SE_{b}}\]
Likelihood Ratio
\[\chi^2 = (-2LL_{baseline}) - (-2LL_{new})\]
\[\textrm{df} = k_{new} - k_{baseline}\]
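A sketch of the likelihood-ratio test, comparing an intercept-only baseline model against the model with x1 (reusing the simulated dat2 from the earlier sketch):

```r
# Baseline (intercept-only) model versus the model with x1
fit_baseline <- glm(y01 ~ 1,  data = dat2, family = binomial)
fit_new      <- glm(y01 ~ x1, data = dat2, family = binomial)

# Chi-squared = (-2LL_baseline) - (-2LL_new): the drop in deviance
chi2 <- deviance(fit_baseline) - deviance(fit_new)
df   <- df.residual(fit_baseline) - df.residual(fit_new)   # k_new - k_baseline
pchisq(chi2, df = df, lower.tail = FALSE)                  # p-value

# Equivalent built-in comparison
anova(fit_baseline, fit_new, test = "Chisq")
```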