Bivariate random variables

In summary: the large-sample variances are ##\text{Var}(\hat{\alpha}) \approx \frac{1}{n}## and ##\text{Var}(\hat{\beta}) \approx \frac{1}{\sum_{i=1}^{n} (z_i - \bar{z})^2}##, and because the information matrix is diagonal, ##\hat{\alpha}## and ##\hat{\beta}## are asymptotically uncorrelated.
  • #1
squenshl

Homework Statement


Suppose that ##(Y_1,Y_2,\ldots,Y_n)## are random variables, where ##Y_i## has an exponential distribution with probability density function ##f_Y(y_i|\theta_i) = \theta_i e^{-\theta_i y_i}##, ##y_i > 0##, ##\theta_i > 0## where ##E(Y_i) = \frac{1}{\theta_i}## and ##\text{Var}(Y_i) = \frac{1}{\theta_i^2}##.
Suppose that ##\log{(\theta_i)} = \alpha+\beta(z_i-\bar{z})##, where the values ##z_i## are known for ##i=1,2,\ldots,n##.
We wish to estimate the vector parameter ##\theta = \begin{bmatrix}
\alpha \\
\beta
\end{bmatrix}##.
1. Show that the log-likelihood function is
$$\ell(\theta) = n\alpha - \sum_{i=1}^{n} y_ie^{\alpha+\beta(z_i-\bar{z})}.$$

2. Find the two elements of the score statistic ##U(\theta;y)##.
Write down the two equations which must be solved to find the maximum likelihood estimates ##\hat{\alpha}## and ##\hat{\beta}##.
Note: Do not attempt to solve these equations.

3. Show that the information matrix is ##I(\theta) = \begin{bmatrix}
n & 0 \\
0 & \sum_{i=1}^{n} (z_i-\bar{z})^2
\end{bmatrix}##.
Hint: Recall ##E(Y_i) = \frac{1}{\theta_i}## and ##\theta_i = e^{\alpha+\beta(z_i-\bar{z})}##.

4. Find the large sample variances of ##\hat{\alpha}## and ##\hat{\beta}## and show that they are asymptotically uncorrelated.

Homework Equations

The Attempt at a Solution


1. I think this result we are trying to prove is wrong. Firstly, we have
$$\begin{split}
\log{(f_{Y_i}(y_i|\theta_i))} &= \log{\left(\theta_i e^{-\theta_i y_i}\right)} \\
&= \log{(\theta_i)}-y_i\theta_i \\
&= \alpha+\beta(z_i-\bar{z})-y_i\theta_i.
\end{split}$$
Thus,
$$\begin{split}
\ell(\theta) &= \alpha+\beta(z_1-\bar{z})-y_1\theta_1+\alpha+\beta(z_2-\bar{z})-y_2\theta_2+\ldots+\alpha+\beta(z_n-\bar{z})-y_n\theta_n \\
&= \alpha+\beta(z_1-\bar{z})-y_1e^{\alpha+\beta(z_1-\bar{z})}+\alpha+\beta(z_2-\bar{z})-y_2e^{\alpha+\beta(z_2-\bar{z})}+\ldots+\alpha+\beta(z_n-\bar{z})-y_ne^{\alpha+\beta(z_n-\bar{z})} \\
&= n\alpha + \beta\sum_{i=1}^{n} (z_i-\bar{z}) - \sum_{i=1}^{n} y_ie^{\alpha+\beta(z_i-\bar{z})}.
\end{split}$$
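As a quick numerical sanity check of this algebra (a minimal sketch with made-up ##z## values, simulated ##y##, and names of my own choosing; none of it is part of the assignment), the closed form above can be compared against the direct sum of log-densities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up covariates and parameter values, purely for illustration.
z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
zbar = z.mean()
alpha, beta = 0.5, -0.3
theta = np.exp(alpha + beta * (z - zbar))
y = rng.exponential(scale=1.0 / theta)          # E(Y_i) = 1/theta_i

# Direct sum of log f(y_i | theta_i) = log(theta_i) - theta_i * y_i.
loglik_direct = np.sum(np.log(theta) - theta * y)

# Closed form derived above: n*alpha + beta*sum(z_i - zbar) - sum(y_i * theta_i).
loglik_closed = len(y) * alpha + beta * np.sum(z - zbar) - np.sum(y * theta)

print(loglik_direct, loglik_closed)             # the two values agree
```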

2. Firstly,
$$\begin{split}
\frac{d\ell}{d\alpha} &= n - \sum_{i=1}^{n} \left(y_ie^{\alpha+\beta(z_i-\bar{z})}\right) \\
&= n - \sum_{i=1}^{n} y_i\theta_i
\end{split}$$
and
$$\begin{split}
\frac{d\ell}{d\beta} &= \sum_{i=1}^{n} (z_i-\bar{z}) - \sum_{i=1}^{n} \left(y_i(z_i-\bar{z})e^{\alpha+\beta(z_i-\bar{z})}\right) \\
&= \sum_{i=1}^{n} (z_i-\bar{z}) - \sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z}).
\end{split}$$
So the score statistic is
$$U(\theta;y) = \begin{bmatrix}
n - \sum_{i=1}^{n} y_i\theta_i \\[1em]
\sum_{i=1}^{n} (z_i-\bar{z}) - \sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z})
\end{bmatrix}.$$
We find ##\hat{\alpha}## and ##\hat{\beta}## by setting ##U(\theta;y) = 0##, i.e. by solving the two equations ##n - \sum_{i=1}^{n} y_i\theta_i = 0## and ##\sum_{i=1}^{n} (z_i-\bar{z}) - \sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z}) = 0## simultaneously for ##\hat{\alpha}## and ##\hat{\beta}##, where ##\theta_i = e^{\alpha+\beta(z_i-\bar{z})}##.
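The note says not to solve these analytically, but as a rough check that the two score equations behave sensibly, here is a minimal sketch (simulated data and parameter values of my own choosing, not part of the assignment) that solves ##U(\theta;y)=0## numerically with `scipy.optimize.root`:

```python
import numpy as np
from scipy.optimize import root

rng = np.random.default_rng(0)

# Simulated data under made-up "true" parameters (illustration only).
z = rng.uniform(0.0, 10.0, size=100)
zc = z - z.mean()
alpha_true, beta_true = 0.5, -0.3
y = rng.exponential(scale=np.exp(-(alpha_true + beta_true * zc)))  # scale = 1/theta_i

def score(params):
    """The two components of U(theta; y) derived above."""
    alpha, beta = params
    theta = np.exp(alpha + beta * zc)
    u_alpha = len(y) - np.sum(y * theta)
    u_beta = np.sum(zc) - np.sum(y * theta * zc)   # the first term is a constant in beta
    return [u_alpha, u_beta]

sol = root(score, x0=[0.0, 0.0])
print(sol.x)   # numerical (alpha_hat, beta_hat), close to the values used to simulate
```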

3. I think this result we are trying to prove is wrong. First, we calculate each second-order partial derivative. Doing so gives
$$\begin{split}
\frac{\partial^2 \ell}{\partial \alpha^2} &= \frac{\partial}{\partial \alpha}\left(n - \sum_{i=1}^{n} y_i\theta_i\right) \\
&= -\sum_{i=1}^{n} y_i\theta_i,
\end{split}$$
$$\begin{split}
\frac{\partial^2 \ell}{\partial \beta \partial \alpha} &= \frac{\partial}{\partial \beta} \left(n - \sum_{i=1}^{n} y_i\theta_i\right) \\
&= -\sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z}) \\
&= \frac{\partial^2 \ell}{\partial \alpha \partial \beta}
\end{split}$$
and
$$\begin{split}
\frac{\partial^2 \ell}{\partial \beta^2} &= \frac{\partial}{\partial \beta} \left(\sum_{i=1}^{n} (z_i-\bar{z}) - \sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z})\right) \\
&= -\sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z})^2.
\end{split}$$
Thus,
$$\begin{split}
I(\theta) &= -E\left(\begin{bmatrix}
\frac{\partial^2 \ell}{\partial \alpha^2} & \frac{\partial^2 \ell}{\partial \alpha \partial \beta} \\[1em]
\frac{\partial^2 \ell}{\partial \beta \partial \alpha} & \frac{\partial^2 \ell}{\partial \beta^2}
\end{bmatrix}\right) \\
&= E\left(\begin{bmatrix}
\sum_{i=1}^{n} y_i\theta_i & \sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z}) \\[1em]
\sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z}) & \sum_{i=1}^{n} y_i\theta_i(z_i-\bar{z})^2
\end{bmatrix}\right) \\
&= \begin{bmatrix}
n & \sum_{i=1}^{n} (z_i-\bar{z}) \\[1em]
\sum_{i=1}^{n} (z_i-\bar{z}) & \sum_{i=1}^{n} (z_i-\bar{z})^2
\end{bmatrix}.
\end{split}$$
I'm not sure if these are right. I reckon that either the questions are wrong, or it's me.
Please help!
 
  • #2


Hi there,

Thank you for your post. Your algebra is essentially correct, and the results stated in the problem are also correct. The key observation is that ##\sum_{i=1}^{n} (z_i - \bar{z}) = 0##, because ##\bar{z}## is the mean of the ##z_i##, so every leftover ##\sum_{i=1}^{n} (z_i - \bar{z})## term in your expressions vanishes. Working through each part:

1. The log-likelihood function is indeed $$\ell(\theta) = n\alpha - \sum_{i=1}^{n} y_i e^{\alpha + \beta(z_i - \bar{z})},$$ because the ##\beta\sum_{i=1}^{n}(z_i-\bar{z})## term in your derivation is identically zero.

2. The score statistic is $$U(\theta;y) = \begin{bmatrix} n - \sum_{i=1}^{n} y_i e^{\alpha + \beta(z_i - \bar{z})} \\[1em] \sum_{i=1}^{n} (z_i - \bar{z}) - \sum_{i=1}^{n} y_i(z_i - \bar{z}) e^{\alpha + \beta(z_i - \bar{z})} \end{bmatrix} = \begin{bmatrix} n - \sum_{i=1}^{n} y_i e^{\alpha + \beta(z_i - \bar{z})} \\[1em] -\sum_{i=1}^{n} y_i(z_i - \bar{z}) e^{\alpha + \beta(z_i - \bar{z})} \end{bmatrix},$$ again using ##\sum_{i=1}^{n}(z_i - \bar{z}) = 0##. The maximum likelihood estimates therefore satisfy $$n - \sum_{i=1}^{n} y_i e^{\hat{\alpha} + \hat{\beta}(z_i - \bar{z})} = 0 \qquad\text{and}\qquad \sum_{i=1}^{n} y_i(z_i - \bar{z}) e^{\hat{\alpha} + \hat{\beta}(z_i - \bar{z})} = 0,$$ which must be solved simultaneously for ##\hat{\alpha}## and ##\hat{\beta}##.

3. The information matrix is $$I(\theta) = \begin{bmatrix} n & 0 \\ 0 & \sum_{i=1}^{n} (z_i - \bar{z})^2 \end{bmatrix},$$ since the off-diagonal entries ##\sum_{i=1}^{n}(z_i - \bar{z})## in your calculation are equal to zero.
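If it helps to see this concretely, the entries of ##I(\theta)## depend only on the ##z_i##, so they can be evaluated directly; here is a tiny sketch with made-up covariates (my own, not from the problem):

```python
import numpy as np

# Made-up covariates; the information matrix depends only on these.
z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
zc = z - z.mean()

# Entries of -E(Hessian), using E(Y_i * theta_i) = 1 since E(Y_i) = 1/theta_i.
info = np.array([
    [len(z),      np.sum(zc)],        # off-diagonal = sum(z_i - zbar) = 0
    [np.sum(zc),  np.sum(zc ** 2)],
])
print(info)
```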

4. The large-sample variance-covariance matrix of ##(\hat{\alpha},\hat{\beta})## is approximately the inverse of the information matrix, $$I(\theta)^{-1} = \begin{bmatrix} \frac{1}{n} & 0 \\[0.5em] 0 & \frac{1}{\sum_{i=1}^{n} (z_i - \bar{z})^2} \end{bmatrix},$$ so $$\text{Var}(\hat{\alpha}) \approx \frac{1}{n} \qquad\text{and}\qquad \text{Var}(\hat{\beta}) \approx \frac{1}{\sum_{i=1}^{n} (z_i - \bar{z})^2}.$$ Because the off-diagonal entries are zero, ##\text{Cov}(\hat{\alpha},\hat{\beta}) \approx 0##, so the estimators are asymptotically uncorrelated.
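As a rough Monte Carlo check of those large-sample results (simulated data and parameter values of my own choosing; a sketch only, not part of the assignment), one can re-fit the model many times and compare the empirical variances and correlation of ##(\hat{\alpha},\hat{\beta})## with ##1/n##, ##1/\sum_{i=1}^{n}(z_i-\bar{z})^2## and ##0##:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
z = rng.uniform(0.0, 10.0, size=n)          # made-up covariates
zc = z - z.mean()
alpha_true, beta_true = 0.5, -0.3           # made-up truth

def negloglik(params, y):
    alpha, beta = params
    theta = np.exp(alpha + beta * zc)
    return -np.sum(np.log(theta) - theta * y)

fits = []
for _ in range(1000):
    y = rng.exponential(scale=np.exp(-(alpha_true + beta_true * zc)))
    fits.append(minimize(negloglik, x0=[0.0, 0.0], args=(y,)).x)
fits = np.array(fits)

print("var(alpha_hat):", fits[:, 0].var(), " vs 1/n:", 1.0 / n)
print("var(beta_hat): ", fits[:, 1].var(), " vs 1/sum(zc^2):", 1.0 / np.sum(zc ** 2))
print("corr(alpha_hat, beta_hat):", np.corrcoef(fits.T)[0, 1])
```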
 

1. What is a bivariate random variable?

A bivariate random variable is a pair of random variables (X, Y) defined on the same probability space. This means that the outcome of an experiment or event is described by two values rather than just one.

2. How is a bivariate random variable different from a univariate random variable?

A univariate random variable records a single value for each outcome, while a bivariate random variable records a pair of values. Correspondingly, a bivariate random variable is described by a joint distribution, which is a function of two arguments, whereas a univariate random variable is described by a distribution of a single argument.

3. What are some examples of bivariate random variables?

Some examples of bivariate random variables include height and weight, age and income, and temperature and humidity. These variables are related to each other and can be measured simultaneously.

4. How do you calculate the expected value of a bivariate random variable?

The mean of a bivariate random variable (X, Y) is the vector of marginal means (E(X), E(Y)). More generally, the expected value of any function of the pair is the probability-weighted sum over all possible outcomes: for discrete variables, E[g(X,Y)] = ∑∑ g(x,y)*P(X=x,Y=y). For example, E(XY) = ∑∑ x*y*P(X=x,Y=y), where P(X=x,Y=y) is the joint probability of the two variables taking on values x and y, respectively.
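For a concrete (entirely made-up) illustration of that double sum, here is a short Python computation from a small joint probability table; the values and names are mine, chosen only for the example:

```python
import numpy as np

# Made-up joint pmf P(X=x, Y=y); rows index x values, columns index y values.
x_vals = np.array([0.0, 1.0, 2.0])
y_vals = np.array([10.0, 20.0])
pmf = np.array([[0.10, 0.20],
                [0.30, 0.15],
                [0.10, 0.15]])          # entries sum to 1

# E(XY) = sum over x and y of x * y * P(X=x, Y=y)
e_xy = sum(x * y * pmf[i, j]
           for i, x in enumerate(x_vals)
           for j, y in enumerate(y_vals))

# Marginal means E(X) and E(Y) from the row and column sums of the table.
e_x = np.sum(x_vals * pmf.sum(axis=1))
e_y = np.sum(y_vals * pmf.sum(axis=0))
print(e_xy, e_x, e_y)
```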

5. What is the covariance of two bivariate random variables?

The covariance of two bivariate random variables measures the degree to which they vary together. A positive covariance indicates that the two variables tend to increase or decrease together, while a negative covariance indicates that they tend to move in opposite directions. The covariance can be calculated using the formula Cov(X,Y) = E[(X-μx)(Y-μy)], where μx and μy are the means of the two variables.
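A quick sample-based illustration of that formula (made-up height/weight pairs, using the population form of the covariance):

```python
import numpy as np

# Made-up paired observations (say, heights in cm and weights in kg).
x = np.array([160.0, 165.0, 170.0, 175.0, 180.0])
y = np.array([ 55.0,  60.0,  63.0,  70.0,  74.0])

cov_manual = np.mean((x - x.mean()) * (y - y.mean()))   # E[(X - mu_x)(Y - mu_y)]
cov_numpy = np.cov(x, y, bias=True)[0, 1]               # same quantity via numpy
print(cov_manual, cov_numpy)
```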
