
Wednesday, December 15, 2021

A proof on linear mixed models

In Searle's (1979) notes and in many other places one can find that

(y-X\hat{\beta})'V^{-1} (y-X\hat{\beta})  = y'R^{-1}(y - X\hat{\beta} - Z \hat{u})

The typical proof is based on the projection matrix S, but I wanted something simpler. An easier proof is composed of two parts.


The first part is to show that

(y-X\hat{\beta})'V^{-1} (y-X\hat{\beta}) = y'V^{-1} (y-X\hat{\beta})

which is proven as follows:


(y-X\hat{\beta})'V^{-1} (y-X\hat{\beta}) = y'V^{-1} (y-X\hat{\beta}) - \hat{\beta}'X' V^{-1} (y-X\hat{\beta}) = y'V^{-1} (y-X\hat{\beta})

This is because 

\hat{\beta}'X' V^{-1} (y-X\hat{\beta}) = \hat{\beta}'(X' V^{-1} y - X' V^{-1} X\hat{\beta}) = \hat{\beta}' 0 = 0

because, of course, X' V^{-1} y = X' V^{-1} X\hat{\beta} : these are the generalized least squares normal equations that define \hat{\beta} . It is kind of obvious, but only after you see it (Searle et al., Variance Components, p. 278).
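To make the first part concrete, here is a minimal numerical check in Python/numpy with simulated data. The dimensions and the particular G and R below are arbitrary choices for illustration, not anything from Searle:

import numpy as np

rng = np.random.default_rng(0)
n, p, q = 12, 3, 4                       # records, fixed effects, random effects (illustrative)

X = rng.normal(size=(n, p))              # fixed-effects design matrix
Z = rng.normal(size=(n, q))              # random-effects design matrix
G = np.eye(q) * 0.5                      # var(u), chosen arbitrarily
R = np.eye(n) * 2.0                      # var(e), chosen arbitrarily
V = Z @ G @ Z.T + R                      # var(y) = ZGZ' + R
y = rng.normal(size=n)

Vinv = np.linalg.inv(V)
# GLS normal equations: X'V^{-1}X beta_hat = X'V^{-1}y
beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

e = y - X @ beta_hat
lhs = e @ Vinv @ e                       # (y - X beta_hat)' V^{-1} (y - X beta_hat)
rhs = y @ Vinv @ e                       # y' V^{-1} (y - X beta_hat)
print(np.isclose(lhs, rhs))             # True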

The second part shows that y'V^{-1} (y-X\hat{\beta}) = y'R^{-1}(y - X\hat{\beta} - Z \hat{u}) . This is done by replacing \hat{u} with its prediction \hat{u} = GZ'V^{-1}(y - X\hat{\beta}) , and using V = ZGZ'+R as follows:

y'R^{-1}(y - X\hat{\beta} - Z \hat{u}) = y'R^{-1}(y - X\hat{\beta} - ZGZ'V^{-1}(y - X\hat{\beta})) = y'R^{-1}(y - X\hat{\beta} - (V-R)V^{-1}(y - X\hat{\beta})) = y'R^{-1}((y - X\hat{\beta}) - (y - X\hat{\beta}) + RV^{-1}(y - X\hat{\beta})) = y'R^{-1}RV^{-1}(y - X\hat{\beta}) = y'V^{-1}(y - X\hat{\beta})
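And a matching check of the second part, repeating the same simulated setup so the snippet runs on its own; again the dimensions and covariance matrices are made up for illustration:

import numpy as np

rng = np.random.default_rng(0)
n, p, q = 12, 3, 4
X = rng.normal(size=(n, p))
Z = rng.normal(size=(n, q))
G = np.eye(q) * 0.5
R = np.eye(n) * 2.0
V = Z @ G @ Z.T + R                      # V = ZGZ' + R
y = rng.normal(size=n)

Vinv, Rinv = np.linalg.inv(V), np.linalg.inv(R)
beta_hat = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
e = y - X @ beta_hat
u_hat = G @ Z.T @ Vinv @ e               # BLUP: u_hat = GZ'V^{-1}(y - X beta_hat)

lhs = y @ Vinv @ e                       # y' V^{-1} (y - X beta_hat)
rhs = y @ Rinv @ (e - Z @ u_hat)         # y' R^{-1} (y - X beta_hat - Z u_hat)
print(np.isclose(lhs, rhs))             # True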


