Confusion about a random process

The discussion revolves around the conditions for a random process X(t) to be wide-sense stationary, specifically focusing on the role of the Bernoulli random variable α. The user expresses confusion regarding the autocorrelation function R_X(t_1, t_2) and why it appears to depend on the same random variable α at both time instants rather than independent variables α_1 and α_2. It is clarified that using independent random variables for different time instants is valid and does not contradict the established formulas for autocorrelation. The user also highlights the need to correctly calculate covariance by subtracting the product of the means. Ultimately, the conversation emphasizes the importance of understanding the independence of random variables in the context of random processes.
ait.abd
Question already asked on http://math.stackexchange.com/quest...m-process?noredirect=1#comment2661260_1310194, but I couldn't get an answer there, so I am reposting it here.
------------------------------------------------------------------------------------------------------------------------------------------
Let X(t) be a random process such that:
$$
X(t) = \begin{cases}
t & \text{with probability } \frac{1}{2} \\
2-at & \text{with probability } \frac{1}{2} \\
\end{cases},
$$
where a is a constant.
I need to find the value of a for which X(t) is a wide-sense stationary process. I have parametrized the random process as follows:
$$
\begin{equation}
X(t) = \alpha t + (1-\alpha)(2-at),
\end{equation}
$$
where \alpha is a Bernoulli random variable with p=q=0.5.

For mean, we have
$$
E[X(t)] = \frac{t + 2-at}{2}.
$$
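As a quick sanity check, here is a minimal Python sketch that estimates E[X(t)] by Monte Carlo under the single-\alpha model above and compares it with the closed form; the helper name sample_X and the values of t and a are arbitrary illustrative choices, not from the thread.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(t, a, n=100_000):
    """Draw n realizations of X(t) = alpha*t + (1 - alpha)*(2 - a*t),
    with alpha ~ Bernoulli(0.5)."""
    alpha = rng.integers(0, 2, size=n)   # 0 or 1, each with probability 1/2
    return alpha * t + (1 - alpha) * (2 - a * t)

t, a = 1.5, 0.7                          # arbitrary illustrative values
x = sample_X(t, a)
print(x.mean())                          # Monte Carlo estimate of E[X(t)]
print((t + 2 - a * t) / 2)               # closed form (t + 2 - at)/2
```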
For the autocorrelation function,
$$
\begin{align*}
R_X(t_1,t_2) &= E[X(t_1)X(t_2)]\\
&=E[(\alpha t_1 + (1-\alpha)(2-at_1))\times(\alpha t_2 + (1-\alpha)(2-at_2))]\\
&=E[\alpha^2 t_1 t_2 + (1-\alpha)^2(2-at_1)(2-at_2)]\\
&=\frac{t_1 t_2}{2} + \frac{4-2a(t_1+t_2)+a^2 t_1 t_2}{2}\\
\end{align*}
$$
Here I used E[\alpha^2] = E[(1-\alpha)^2] = 1/2 and the fact that the cross terms vanish because \alpha(1-\alpha) = 0. As is clear from the above equation, there is no value of a for which the autocorrelation function depends only on the time difference t_2 - t_1.
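A similar Monte Carlo sketch can check this closed form for R_X(t_1, t_2) under the shared-\alpha model; the helper names R_empirical and R_closed and the numeric arguments are illustrative choices, not from the thread.

```python
import numpy as np

rng = np.random.default_rng(1)

def R_empirical(t1, t2, a, n=200_000):
    """Monte Carlo estimate of E[X(t1) X(t2)] when both samples
    share the same Bernoulli(0.5) variable alpha."""
    alpha = rng.integers(0, 2, size=n)
    x1 = alpha * t1 + (1 - alpha) * (2 - a * t1)
    x2 = alpha * t2 + (1 - alpha) * (2 - a * t2)
    return float(np.mean(x1 * x2))

def R_closed(t1, t2, a):
    # t1 t2 / 2 + [4 - 2a(t1 + t2) + a^2 t1 t2] / 2,
    # using E[alpha^2] = E[(1 - alpha)^2] = 1/2 and alpha(1 - alpha) = 0
    return t1 * t2 / 2 + (4 - 2 * a * (t1 + t2) + a**2 * t1 * t2) / 2

print(R_empirical(0.5, 2.0, 0.7), R_closed(0.5, 2.0, 0.7))
```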

My confusion starts with the way the autocorrelation function is defined above in so many textbooks. The definition shows that we sample the ensemble at two time instants t_1 and t_2 to get two random variables. The two random variables X(t_1) and X(t_2) are (possibly different) functions of the same random variable \alpha. My question is: why do we need to take the same random variable \alpha at both time instants? It is like saying that if we know X(t_1), we can figure out X(t_2) right away. Shouldn't we instead take \alpha_1 at t_1 and \alpha_2 at t_2, where both \alpha_1 and \alpha_2 are Bernoulli with p=0.5?

I can describe the confusion like the following as well. When we sample the random process at two time instants, we get two random variables A = X(t_1) and B = X(t_2), where
$$
A = \begin{cases}
t_1 & \text{ with probability 0.5} \\
2-at_1 & \text{ with probability 0.5} \\
\end{cases}
$$
and
$$
B = \begin{cases}
t_2 & \text{ with probability 0.5} \\
2-at_2 & \text{ with probability 0.5} \\
\end{cases}.
$$
To calculate E[AB], we need to take into account the cases where A=t_1 and B=2-at_2, and where A=2-at_1 and B=t_2. Neither of these cases appears in the calculation of the ensemble autocorrelation function R_X(t_1,t_2). Why do these two cases count when I use the formulation in terms of A and B, yet they do not appear when I calculate R_X(t_1, t_2) using the \alpha formulation discussed at the start of the problem?
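To make the discrepancy concrete, here is a small sketch that enumerates E[AB] exactly under both models, assuming the definitions of A and B above: the shared-\alpha model has only two equally likely outcomes, while the independent-\alpha model has four. The helper names EAB_shared and EAB_independent are made up for illustration.

```python
def EAB_shared(t1, t2, a):
    # shared alpha: only the pairs (t1, t2) and (2 - a*t1, 2 - a*t2)
    # occur, each with probability 1/2
    return 0.5 * t1 * t2 + 0.5 * (2 - a * t1) * (2 - a * t2)

def EAB_independent(t1, t2, a):
    # independent alpha_1, alpha_2: all four combinations occur,
    # each with probability 1/4, including the two "mixed" cases
    vals_A = (t1, 2 - a * t1)
    vals_B = (t2, 2 - a * t2)
    return sum(vA * vB for vA in vals_A for vB in vals_B) / 4

t1, t2, a = 0.5, 2.0, 0.7  # arbitrary illustrative values
print(EAB_shared(t1, t2, a))
print(EAB_independent(t1, t2, a))
# in the independent model E[AB] factors as E[A]E[B],
# since A and B are then independent:
print(((t1 + 2 - a * t1) / 2) * ((t2 + 2 - a * t2) / 2))
```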
 
Your covariance is incorrect. You have to subtract the product of the means.

$$
\operatorname{Cov}(X(t_1), X(t_2)) = E[X(t_1)X(t_2)] - E[X(t_1)]\,E[X(t_2)]
$$
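As an illustrative sketch, assuming the shared-\alpha model from the original post, the covariance with the means subtracted can be computed directly; the helper name cov_shared is made up.

```python
def cov_shared(t1, t2, a):
    """Cov(X(t1), X(t2)) = E[X(t1)X(t2)] - E[X(t1)] E[X(t2)]
    for the shared-alpha model."""
    EX1 = (t1 + 2 - a * t1) / 2
    EX2 = (t2 + 2 - a * t2) / 2
    EX1X2 = 0.5 * t1 * t2 + 0.5 * (2 - a * t1) * (2 - a * t2)
    return EX1X2 - EX1 * EX2

print(cov_shared(0.5, 2.0, 0.7))
```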
 
ait.abd said:
Shouldn't we instead take \alpha_1 at t_1 and \alpha_2 at t_2, where both \alpha_1 and \alpha_2 are Bernoulli with p=0.5?

Yes, that is the correct way to look at it.

The formula R(\tau) = \frac{E[(X_t - \mu)(X_{t+\tau} - \mu)]}{\sigma^2} does not assert that the random variables X_t and X_{t+\tau} are functions of the same random variable \alpha. In this problem, using two independent random variables \alpha_1, \alpha_2 is correct and does not contradict that formula.
 
