Let's start by defining some notation:
- Treatment effect: $\alpha_i$, where $i \in \{1,\ldots,p\}$;
- Contrast vector: $c$, which commonly is a column vector;
- Response: $Y_{ij}$, where $j \in \{1,\ldots,n_i\}$, so level $i$ of the treatment has $n_i$ observations;
- Random error term: $\epsilon \sim N(0,\Sigma)$, where $\Sigma = \sigma^2 I$.
 
The one-way ANOVA model is
$Y_{ij} = \mu +\alpha_i+\epsilon_{ij};$
in matrix notation, it is
$\bf{Y=X\beta+\epsilon},$
where $\bf{\beta} = (\mu \: \alpha_1\: \ldots\: \alpha_p)'$.
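To make the matrix form concrete, here is a minimal sketch of the design matrix $X$ for this overparameterized model. The group sizes $n_i = (2, 3, 2)$ and $p = 3$ are hypothetical choices for illustration:

```python
import numpy as np

# Hypothetical example: p = 3 treatments with n_i = (2, 3, 2) observations.
p = 3
n = [2, 3, 2]
N = sum(n)

# Design matrix X for Y = X beta + eps, beta = (mu, alpha_1, ..., alpha_p)':
# first column all ones (for mu), then one indicator column per treatment.
X = np.zeros((N, p + 1))
X[:, 0] = 1
row = 0
for i, ni in enumerate(n):
    X[row:row + ni, i + 1] = 1
    row += ni

# The intercept column equals the sum of the indicator columns, so X'X is
# singular: rank(X) = p < p + 1, which is why a generalized inverse appears below.
print(np.linalg.matrix_rank(X))  # 3
```

The rank deficiency here is exactly why estimability and generalized inverses come up in the rest of the derivation.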
Therefore,
$SSE=(Y-X\hat\beta)'(Y-x\hat\beta)=[(I-P)Y]'[(I-P)Y]$
$=\Sigma_{i=1}^p \Sigma_{i=1}^n_i (y_{ij}-\bar{y}_{i.}) $
$=\Sigma_{i=1}^p \Sigma_{i=1}^n_i (y_{ij}-\bar{y}_{i.}) $
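The equality of the projection form and the within-group form of $SSE$ can be checked numerically. This sketch uses hypothetical group sizes and random data, with `numpy`'s Moore-Penrose pseudoinverse standing in for a generalized inverse:

```python
import numpy as np

rng = np.random.default_rng(0)
n = [2, 3, 2]                      # hypothetical group sizes n_i
y = rng.normal(size=sum(n))        # response vector Y
groups = np.repeat(np.arange(len(n)), n)

# Overparameterized design matrix: intercept plus one indicator per group.
X = np.column_stack([np.ones(len(y))] +
                    [(groups == i).astype(float) for i in range(len(n))])

# Projection onto the column space of X; pinv gives one generalized inverse.
P = X @ np.linalg.pinv(X)
resid = (np.eye(len(y)) - P) @ y
sse_proj = float(resid @ resid)    # [(I-P)Y]'[(I-P)Y]

# Same quantity from within-group deviations: sum_i sum_j (y_ij - ybar_i.)^2.
sse_groups = sum(((y[groups == i] - y[groups == i].mean()) ** 2).sum()
                 for i in range(len(n)))

print(np.isclose(sse_proj, sse_groups))  # True
```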
We know $c'\hat\beta$ follows a normal distribution with mean $c'\beta$ and variance
$Var(c'\hat\beta)=\sigma^2\,c'(X'X)^{-}c,$
where $(X'X)^{-}$ is a generalized inverse of $X'X$.
We can check whether $c'(X'X)^{-}(X'X)=c'$ to test $c'\beta$ for estimability.
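The estimability condition is easy to apply in code. In this sketch (same hypothetical one-way layout as above), the contrast $\alpha_1-\alpha_2$ passes the check while $\alpha_1$ alone does not:

```python
import numpy as np

n = [2, 3, 2]                      # hypothetical group sizes
groups = np.repeat(np.arange(3), n)
X = np.column_stack([np.ones(sum(n))] +
                    [(groups == i).astype(float) for i in range(3)])
XtX = X.T @ X
G = np.linalg.pinv(XtX)            # one generalized inverse of X'X

def estimable(c):
    # c'beta is estimable iff c'(X'X)^-(X'X) = c'.
    return bool(np.allclose(c @ G @ XtX, c))

print(estimable(np.array([0., 1., -1., 0.])))  # alpha_1 - alpha_2: True
print(estimable(np.array([0., 1., 0., 0.])))   # alpha_1 alone: False
```

Only linear combinations of the cell means $\mu+\alpha_i$ (in particular, contrasts among the $\alpha_i$) survive this test in the overparameterized model.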
The MLEs of the $p$ estimable functions $\mu+\alpha_i$ are the group means: $(\widehat{\mu+\alpha_1}\: \ldots\: \widehat{\mu+\alpha_p})' = (\bar{Y}_{1.}\: \ldots\: \bar{Y}_{p.})'$.
Also, $\bar{Y} \sim N(\mu, \sigma^2/n)$ if $Y_i \sim N(\mu, \sigma^2),\; i = 1,\ldots,n$, so $\bar{Y}_{i.} \sim N(\mu+\alpha_i, \sigma^2/n_i)$.
Therefore, writing $\bar{Y} = (\bar{Y}_{1.}\: \ldots\: \bar{Y}_{p.})'$,
$Var(c'\bar{Y})=c'Var(\bar{Y})c=\sigma^2\sum\limits_{i=1}^p c_i^2/n_i,$
which we estimate by replacing $\sigma^2$ with $MSE = SSE/(N-p)$, where $N=\sum_i n_i$.
Finally, we can show:
 $\sum\limits_{i=1}^p c_i^2/n_i=c'(X'X)^-c$
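This final identity (which holds for any estimable contrast, since $c'(X'X)^{-}c$ is then invariant to the choice of generalized inverse) can also be verified numerically. A minimal sketch, again with hypothetical group sizes and the contrast $\alpha_1-\alpha_2$:

```python
import numpy as np

n = np.array([2, 3, 2])            # hypothetical group sizes n_i
groups = np.repeat(np.arange(3), n)
X = np.column_stack([np.ones(n.sum())] +
                    [(groups == i).astype(float) for i in range(3)])
G = np.linalg.pinv(X.T @ X)        # generalized inverse of X'X

# Contrast on the treatment effects: alpha_1 - alpha_2 (coefficients sum to 0).
ci = np.array([1., -1., 0.])
c = np.concatenate([[0.], ci])     # pad with 0 for the mu component

lhs = float(c @ G @ c)             # c'(X'X)^- c
rhs = float((ci ** 2 / n).sum())   # sum_i c_i^2 / n_i
print(np.isclose(lhs, rhs))        # True
```

This confirms that the two variance expressions derived above agree: the algebraic form $c'(X'X)^{-}c$ and the explicit form $\sum_i c_i^2/n_i$.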