Regression model

  • Thread starter squenshl
  • #1

Homework Statement


Suppose that we observe ##n## independent observations ##Y_i##, ##i=1,2,\ldots,n##, that can be modeled as follows:
$$Y_i = i(\theta+\epsilon_i) \quad \text{where} \; \; \epsilon_i \sim N(0,\sigma^2).$$

1. Write the above as a regression model, ##E(Y) = X\theta##, ##\text{Cov}(Y) = \sigma^2W## for matrices ##X## and ##W##.

2. Show that ##X^TW^{-1}X = n##.

3. Show that the least squares estimate for ##\theta## is given by
$$\hat{\theta} = \frac{1}{n}\sum_{i=1}^{n} \frac{Y_i}{i}.$$

Consider the following transformation: ##Z_i = \frac{Y_i}{i}.##

4. Show the transformed model can be written as a regression model,
$$E(Z) = 1_n\theta, \quad \text{Cov}(Z) = \sigma^2I_n$$
where ##1_n## is a column vector of ##1##s and ##I_n## is an identity matrix of dimension ##n##.

5. Show that the least squares estimate from this model is exactly the same as the solution from part 3.
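Not part of the original post, but parts 2, 3, and 5 can be sanity-checked numerically. This sketch assumes the natural reading of the model, where ##\text{Var}(Y_i) = i^2\sigma^2## gives ##W = \text{diag}(1^2, 2^2, \ldots, n^2)##; the variable names, seed, and parameter values are my own choices, not from the thread.

```python
import numpy as np

# Model: Y_i = i*(theta + eps_i), so E(Y_i) = i*theta and Var(Y_i) = i^2 * sigma^2.
# Hence X = (1, 2, ..., n)^T and W = diag(1, 4, ..., n^2) (assumed, not from the post).
rng = np.random.default_rng(0)
n, theta, sigma = 10, 2.5, 1.0

i = np.arange(1, n + 1)
X = i.astype(float)          # the single column of the design matrix
W_inv = np.diag(1.0 / i**2)  # W = diag(i^2), so W^{-1} = diag(1/i^2)

# Part 2: X^T W^{-1} X = sum_i i^2 / i^2 = n
assert np.isclose(X @ W_inv @ X, n)

# Simulate one data set from the model
eps = rng.normal(0.0, sigma, size=n)
Y = i * (theta + eps)

# Part 3: generalized least squares estimate
# theta_hat = (X^T W^{-1} X)^{-1} X^T W^{-1} Y = (1/n) * sum_i Y_i / i
theta_gls = (X @ W_inv @ Y) / (X @ W_inv @ X)
assert np.isclose(theta_gls, np.mean(Y / i))

# Parts 4-5: ordinary least squares on the transformed Z_i = Y_i / i,
# where E(Z) = 1_n * theta, so the OLS estimate is just the sample mean.
Z = Y / i
theta_ols = Z.mean()
assert np.isclose(theta_gls, theta_ols)
```

The asserts confirm the two estimators agree on any simulated draw, which is the point of part 5: the transformation ##Z_i = Y_i/i## turns the weighted problem into an ordinary one with the same solution.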

Homework Equations




The Attempt at a Solution



I have no idea how to start this question. I get the matrix
$$X = \begin{bmatrix}
1 \\
2 \\
\vdots \\
n
\end{bmatrix}$$ but I'm not sure about ##W##. Once I have that, I can pretty much do the rest.

Please help!
 

Answers and Replies

  • #2
Never mind, I got it, but as usual your help was appreciated!
 
