bonfire09
Homework Statement
Suppose that object 1 weighs [itex]\theta_1[/itex] and object 2 weighs [itex]\theta_2[/itex]. Each object is weighed once separately, and then the two are weighed together, giving three observations [itex] y_1, y_2, y_3[/itex]. The scale gives unbiased weights with normally distributed errors of constant variance. Find the least squares estimates of [itex]\theta_1[/itex] and [itex]\theta_2[/itex].
Homework Equations
The least squares estimate of [itex](\theta_1, \theta_2)[/itex] is ## \hat{\theta}=(X^TX)^{-1}X^TY ##.
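(Just to recall where this formula comes from: minimizing ##\|Y - X\theta\|^2## and setting the gradient with respect to ##\theta## to zero gives the normal equations
## -2X^T(Y - X\hat{\theta}) = 0 \quad\Longrightarrow\quad X^TX\hat{\theta} = X^TY \quad\Longrightarrow\quad \hat{\theta} = (X^TX)^{-1}X^TY, ##
provided ##X^TX## is invertible.)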
The Attempt at a Solution
I wrote the full model as follows:
[itex] y_1=\theta_1+\epsilon_1[/itex]
[itex] y_2=\theta_2+\epsilon_2[/itex]
[itex] y_3=\theta_1+\theta_2+\epsilon_3[/itex]
Then it follows that in matrix form we get ## \begin{bmatrix} y_{1} \\ y_{2} \\ y_{3} \end{bmatrix} = \begin{bmatrix} \theta_1 \\ \theta_2 \\ \theta_1+\theta_2 \end{bmatrix}+\begin{bmatrix} \epsilon_1 \\ \epsilon_2 \\ \epsilon_3 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} \theta_1 \\ \theta_2 \end{bmatrix}+\begin{bmatrix} \epsilon_1 \\ \epsilon_2 \\ \epsilon_3 \end{bmatrix}##, i.e. ##Y = X\theta + \epsilon##.
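Working out the pieces of the formula for this ##X## gives
## X^TX = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}, \qquad (X^TX)^{-1} = \frac{1}{3}\begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix}, \qquad X^TY = \begin{bmatrix} y_1+y_3 \\ y_2+y_3 \end{bmatrix}. ##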
Then from here I got as my final answer
## \hat{\theta}=(X^TX)^{-1}X^TY=\begin{bmatrix} \hat{\theta}_1 \\ \hat{\theta}_2 \end{bmatrix} =\frac{1}{3}\begin{bmatrix} 2y_1-y_2+y_3 \\ -y_1+2y_2+y_3 \end{bmatrix}.## The thing that I am not sure about is whether I wrote my full model correctly. I was thinking that each observation is just the true weight of whatever is on the scale plus some random error, but I am not sure.
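As a sanity check on the algebra (not part of the original attempt), here is a short SymPy script that computes ##(X^TX)^{-1}X^TY## symbolically for this design matrix; the variable names are my own:

[code]
# Symbolic check of the least squares estimate for the two-weights model.
# Assumes the design matrix X written above: rows (1,0), (0,1), (1,1).
from sympy import symbols, Matrix, simplify

y1, y2, y3 = symbols('y1 y2 y3')

X = Matrix([[1, 0],
            [0, 1],
            [1, 1]])          # design matrix
Y = Matrix([y1, y2, y3])      # observations

theta_hat = (X.T * X).inv() * X.T * Y
print(simplify(theta_hat))
# Expected output (up to term ordering):
# Matrix([[2*y1/3 - y2/3 + y3/3], [-y1/3 + 2*y2/3 + y3/3]])
[/code]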