Wednesday, December 15, 2021

A proof on linear mixed models

In Searle's (1979) notes and in many other places one can find that

$latex (y-X\hat{\beta})'V^{-1} (y-X\hat{\beta}) = y'R^{-1}(y - X\hat{\beta} - Z \hat{u}) $

The typical proof is based on a projection matrix $latex S $, but I wanted something simpler. An easier proof is composed of two parts.


The first part is to show that

$latex (y-X\hat{\beta})'V^{-1} (y-X\hat{\beta}) = y'V^{-1} (y-X\hat{\beta}) $

which is proven as follows:


$latex (y-X\hat{\beta})'V^{-1} (y-X\hat{\beta}) = y'V^{-1} (y-X\hat{\beta}) - \hat{\beta}'X' V^{-1} (y-X\hat{\beta}) = y'V^{-1} (y-X\hat{\beta}) $

This is because 

$latex \hat{\beta}'X' V^{-1} (y-X\hat{\beta}) = \hat{\beta}'(X' V^{-1} y - X' V^{-1} X\hat{\beta}) = \hat{\beta}' 0 = 0 $

because, of course, $latex X' V^{-1} y = X' V^{-1} X\hat{\beta} $ (these are just the generalized least squares normal equations that define $latex \hat{\beta} $). It is kind of obvious, but only after you see it (Searle et al., Variance Components, p. 278).
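
This first part is easy to check numerically. Below is a minimal sketch in Python/NumPy; the random $latex X $, $latex y $, $latex V $ and the seed are arbitrary choices of mine, just for illustration. The GLS normal equations force $latex X'V^{-1}(y - X\hat{\beta}) = 0 $, so the two quadratic forms coincide.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 10, 3
X = rng.normal(size=(n, p))
y = rng.normal(size=n)
A = rng.normal(size=(n, n))
V = A @ A.T + n * np.eye(n)   # any positive-definite V will do here

Vinv = np.linalg.inv(V)
# GLS estimate: solves X'V^{-1}X beta = X'V^{-1}y
beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

e = y - X @ beta_hat
print(np.allclose(X.T @ Vinv @ e, 0))          # normal equations: this is 0
print(np.isclose(e @ Vinv @ e, y @ Vinv @ e))  # part one of the identity
```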

The second part shows that $latex y'V^{-1} (y-X\hat{\beta}) = y'R^{-1}(y - X\hat{\beta} - Z \hat{u}) $. This is done by replacing $latex \hat{u} $ by its estimate, $latex \hat{u} = GZ'V^{-1}(y - X\hat{\beta}) $, and using $latex V = ZGZ'+R $, as follows:

$latex y'R^{-1}(y - X\hat{\beta} - Z \hat{u}) = y'R^{-1}(y - X\hat{\beta} - ZGZ'V^{-1}(y - X\hat{\beta})) $

$latex = y'R^{-1}(y - X\hat{\beta} - (V-R)V^{-1}(y - X\hat{\beta})) $

$latex = y'R^{-1}(y - X\hat{\beta} - (y - X\hat{\beta}) + RV^{-1}(y - X\hat{\beta})) $

$latex = y'R^{-1}RV^{-1}(y - X\hat{\beta}) = y'V^{-1}(y - X\hat{\beta}) $

Putting the two parts together gives the identity at the top.
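
The whole identity can also be verified numerically. Here is a small sketch in Python/NumPy (again, all matrices and the seed are arbitrary and only for illustration): build $latex V = ZGZ'+R $, compute the GLS $latex \hat{\beta} $ and $latex \hat{u} = GZ'V^{-1}(y - X\hat{\beta}) $, and compare the two sides.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, q = 12, 3, 5
X = rng.normal(size=(n, p))
Z = rng.normal(size=(n, q))
y = rng.normal(size=n)

Ag = rng.normal(size=(q, q))
G = Ag @ Ag.T + q * np.eye(q)   # positive-definite G
Ar = rng.normal(size=(n, n))
R = Ar @ Ar.T + n * np.eye(n)   # positive-definite R
V = Z @ G @ Z.T + R             # V = ZGZ' + R

Vinv, Rinv = np.linalg.inv(V), np.linalg.inv(R)

beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)  # GLS estimate
e = y - X @ beta_hat
u_hat = G @ Z.T @ Vinv @ e                                  # estimate of u

lhs = e @ Vinv @ e                # (y - Xb)'V^{-1}(y - Xb)
rhs = y @ Rinv @ (e - Z @ u_hat)  # y'R^{-1}(y - Xb - Zu)
print(np.isclose(lhs, rhs))       # True
```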