Sum of Random Variables...

  • #1
Sitingbull
Hello, I have the following question and I am wondering if I am on the right path. Here is the question:

A picture in which each pixel takes the value 1 with probability q and 0 with probability 1 - q, where q is the realized value of a random variable Q which is uniformly distributed on the interval [0,1].

Let Xi be the value of pixel i. For each pixel we observe the value Yi = Xi + N, where N is normal with mean 2 and unit variance (the same noise distribution everywhere). Assume that, conditional on Q, the Xi's are independent, and that the noise N is independent of Q and the Xi's. Calculate E(Yi) and Var(Yi).

For the expectation, I considered it to be uniformly distributed on [0,1], which means its expectation is 1/2 and its variance is 1/12.
So E(Yi) = 1/2 + 2 = 2.5,
and Var(Yi) = 1/12 + 1 = 13/12.

Is that correct, or am I totally wrong on that?

Thank you in advance
 
  • #2
Your expectation is correct, but the variance is not. By the tower rule, E(Xi) = E[E(Xi | Q)] = E(Q) = 1/2, so E(Yi) = E(Xi) + E(N) = 1/2 + 2 = 2.5.

For the variance, note that Xi is not uniform on [0,1]: it only takes the values 0 and 1, so 1/12 is the variance of Q, not of Xi. By the law of total variance,

Var(Xi) = E[Var(Xi | Q)] + Var(E(Xi | Q)) = E[Q(1 - Q)] + Var(Q) = (1/2 - 1/3) + 1/12 = 1/4.

(Equivalently, Xi^2 = Xi, so E(Xi^2) = 1/2 and Var(Xi) = 1/2 - 1/4 = 1/4.) Since N is independent of Q and the Xi's,

Var(Yi) = Var(Xi) + Var(N) = 1/4 + 1 = 5/4.

This takes into account the independence of Q and the Xi's, as well as the independence of N from Q and the Xi's.
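Not from the thread, but a quick way to check these moments is a Monte Carlo sketch (assuming NumPy; the variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 1_000_000

# Marginal distribution of one pixel: draw a fresh Q ~ Uniform[0,1] per trial,
# then Xi ~ Bernoulli(Q), then add independent noise N ~ Normal(2, 1).
q = rng.uniform(0.0, 1.0, size=n_trials)
x = (rng.uniform(0.0, 1.0, size=n_trials) < q).astype(float)
n = rng.normal(loc=2.0, scale=1.0, size=n_trials)
y = x + n

print(f"E[Y]   ~ {y.mean():.4f}   (theory: 2.5)")
print(f"Var[Y] ~ {y.var():.4f}   (theory: 5/4 = 1.25)")
```

With a million trials the sample mean and variance land close to 2.5 and 1.25, matching the law-of-total-variance calculation above.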
 

What is the concept of "Sum of Random Variables" in statistics?

Summing random variables means combining several random variables into a single new one, Z = X1 + X2 + ... + Xn. It is used throughout probability theory to model the total outcome of a random experiment that involves several random quantities.

How do you calculate the sum of two random variables?

If X and Y are random variables defined on the same experiment, the sum Z = X + Y is the random variable whose value on each outcome is the sum of the values X and Y take on that outcome. Expectations always add, E(Z) = E(X) + E(Y), but the distribution of Z does not: for independent X and Y it is the convolution of their individual distributions, as illustrated below.
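As a toy illustration (my example, not from the thread), the distribution of the sum of two independent fair dice can be found by enumerating all 36 equally likely pairs:

```python
from collections import Counter
from itertools import product

# Distribution of Z = X + Y for two independent fair six-sided dice:
# each of the 36 equally likely pairs (x, y) contributes to P(Z = x + y).
counts = Counter(x + y for x, y in product(range(1, 7), repeat=2))
for z in sorted(counts):
    print(f"P(Z = {z:2d}) = {counts[z]}/36")
```

This brute-force enumeration is exactly a discrete convolution of the two dice distributions.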

What is the difference between the sum of independent random variables and the sum of dependent random variables?

When the variables are independent, their joint behavior factorizes, and in particular the variances add: Var(X + Y) = Var(X) + Var(Y). When they are dependent, the sum picks up a covariance term, Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y), so the same marginal distributions can produce very different sums. This distinction is important throughout probability theory.
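A small simulation (a hypothetical example of mine, assuming NumPy) makes the difference visible: the same marginals, with and without dependence, give very different variances for the sum.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1_000_000)

y_indep = rng.normal(size=1_000_000)            # independent of x
y_dep = -x + 0.1 * rng.normal(size=1_000_000)   # strongly anti-correlated with x

print(f"Var(X + Y), independent: {np.var(x + y_indep):.3f}  (Var X + Var Y = 2)")
print(f"Var(X + Y), dependent:   {np.var(x + y_dep):.3f}  (2 Cov(X, Y) nearly cancels it)")
```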

What are some real-world applications of the sum of random variables?

The sum of random variables is used in fields such as finance, economics, and engineering, for example to model stock prices, total insurance claims, and the reliability of systems. In biology it can model genetic traits, and in psychology it can model multiple factors that jointly influence behavior.

How does the central limit theorem relate to the sum of random variables?

The central limit theorem states that the sum of a large number of independent, identically distributed random variables with finite variance is approximately normal once centered and scaled, regardless of the distribution of the individual terms. This is important because many real-world quantities are totals of many small independent contributions, which is why the normal distribution appears so often in statistical analysis.
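To see the theorem in action (my sketch, assuming NumPy), sum 30 uniform variables, standardize, and compare a tail probability with the standard normal value:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30  # number of terms in each sum

# Sum n i.i.d. Uniform[0,1] variables; by the CLT the standardized sum
# should be approximately standard normal.
sums = rng.uniform(size=(1_000_000, n)).sum(axis=1)
z = (sums - n * 0.5) / np.sqrt(n / 12.0)  # Uniform[0,1] has mean 1/2, variance 1/12

# Compare a tail probability against the normal value Phi(-1) ~ 0.1587.
print(f"P(Z < -1) ~ {(z < -1).mean():.4f}   (standard normal: 0.1587)")
```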
