- #1


Suppose x1 and x2 are iid N(0,1) random variables, and I want to find E(x1*x2 | x1 + x2 = x)

Can I simply say: x1 = x - x2 and thus

E(x1*x2 | x1 + x2 = x) =

E[ (x - x2)*x2 ] = E[ x*x2 - x2^2 ] =

x*E[x2] - E[x2^2] =

0 - 1 =

-1?

- Thread starter redflame34


- #2

mathman

Science Advisor


You need to transform the problem to one where x1 + x2 is a random variable.

Let u=(x1 + x2)/√2, v=(x1-x2)/√2. (I am using √2 so that u and v will be iid N(0,1))

Then x1=(u+v)/√2 and x2=(u-v)/√2

Your original problem is now E( (u^2 - v^2)/2 | u = x/√2 ), since x1*x2 = (u^2 - v^2)/2 and the condition x1 + x2 = x becomes u = x/√2.

Since u and v are independent, v is still N(0,1) after conditioning on u, so the result is (x^2/2 - 1)/2 = x^2/4 - 1/2, which depends on x.
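As a sanity check, here is a small Monte Carlo sketch (assuming, as above, x1 and x2 iid N(0,1); the zero-probability conditioning event is approximated by a thin window):

```python
import numpy as np

# Minimal Monte Carlo check of the result above (assumed setup: x1, x2 iid
# N(0,1)).  The event {x1 + x2 = x} has probability zero, so we approximate
# it with a thin window around x and compare the sample mean of x1*x2 with
# the closed form x^2/4 - 1/2.
rng = np.random.default_rng(0)
n = 4_000_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)

x = 2.0                            # the conditioning value
mask = np.abs(x1 + x2 - x) < 0.01  # thin slab around {x1 + x2 = x}
estimate = (x1[mask] * x2[mask]).mean()
closed_form = x**2 / 4 - 0.5       # = 0.5 for x = 2, not -1

print(estimate, closed_form)
```

The window width and sample size are arbitrary illustrative choices; the estimate lands close to x^2/4 - 1/2 rather than -1.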

- #3


I'm confused. Can you please explain more why the first solution (-1) is wrong?

[tex]X_1[/tex] and [tex]X_2[/tex] are 2 RVs, and we're told that [tex]X_1+X_2=y[/tex].

1) I understand that the sum of two RVs is itself a RV, but here, once we are told the sum equals [tex]y[/tex], it acts like a constant? (a realization [tex]Y=y[/tex] of [tex]Y[/tex], I mean.)

2) If I'm right about (1), then being told the sum is [tex]y[/tex] means we can focus on just one RV (more precisely, we can treat the two RVs as [tex]y-X_1[/tex] and [tex]X_1[/tex]). The solution, as redflame said, would then be:

[tex]\int_{x_1} (y-x_1)(x_1)\, p(x_1)\, dx_1 = \ldots = -1[/tex]

Thanks in advance. (;


- #4

mathman


Let x1 be N(0,a) and x2 be N(0,b), where a and b are different. Then we have the following:

Substituting x1 = x - x2 gives E(x1*x2|x1+x2=x) = x*E(x2) - E(x2^2) = -b.

Substituting x2 = x - x1 gives E(x1*x2|x1+x2=x) = x*E(x1) - E(x1^2) = -a.

The same quantity cannot equal both -a and -b, so this approach is obviously incorrect!
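The inconsistency is easy to see numerically. A rough Monte Carlo sketch, taking a = 1 and b = 4 (illustrative values, not from the thread) and approximating the conditioning event with a thin window:

```python
import numpy as np

# Assumed setup: x1 ~ N(0, a), x2 ~ N(0, b) independent, with a = 1, b = 4.
# The two substitutions above predict -b = -4 and -a = -1; the simulated
# conditional mean matches neither, and it depends on the conditioning value x.
rng = np.random.default_rng(1)
a, b, x, n = 1.0, 4.0, 3.0, 4_000_000
x1 = rng.normal(0.0, np.sqrt(a), n)
x2 = rng.normal(0.0, np.sqrt(b), n)

mask = np.abs(x1 + x2 - x) < 0.01        # thin slab around {x1 + x2 = x}
estimate = (x1[mask] * x2[mask]).mean()

# Exact value from the conditional law x1 | x1+x2=x ~ N(ax/(a+b), ab/(a+b)):
exact = a * b * x**2 / (a + b) ** 2 - a * b / (a + b)  # = 0.64 here
print(estimate, exact)
```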

- #5


I'm really confused!

Can you please suggest a solution through the definition of E{x1*x2|x1+x2=x}?

I want to correct my beliefs about this kind of problem (working with sums of RVs, and so on).

I think [tex]E\{x_1 x_2 \mid x_1+x_2=x\} = \iint x_1\, x_2\; p(x_1,x_2 \mid x_1+x_2=x)\, dx_1\, dx_2[/tex]

(BTW, is it true?)

Can you continue it?
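That definition is the standard one, and it can be carried out numerically as a sketch (assuming iid N(0,1) variables): conditioned on x1 + x2 = x, the joint density collapses onto the line x2 = x - x1, and x1 has conditional law N(x/2, 1/2) (a standard bivariate-normal fact), so the double integral reduces to a single one:

```python
import numpy as np

# Sketch: evaluate E{x1*x2 | x1+x2=x} from its definition, assuming x1, x2
# are iid N(0,1).  Conditioned on x1 + x2 = x, the joint density collapses
# onto the line x2 = x - x1, and x1 | x1+x2=x ~ N(x/2, 1/2), so the double
# integral becomes a single integral, evaluated here by a plain Riemann sum.
def cond_mean_product(x, n=200_001, lim=10.0):
    t = np.linspace(-lim, lim, n)                       # grid for x1
    dt = t[1] - t[0]
    dens = np.exp(-(t - x / 2) ** 2) / np.sqrt(np.pi)   # N(x/2, 1/2) pdf
    return float(np.sum(t * (x - t) * dens) * dt)       # E[x1 * (x - x1)]

x = 2.0
print(cond_mean_product(x), x**2 / 4 - 0.5)  # both 0.5, again not -1
```

The helper name and grid parameters are hypothetical choices for illustration; the result agrees with x^2/4 - 1/2 for every x.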

- #6


- #7

mathman


As for texts, I haven't been keeping up (I'm retired). I hope someone else can help.

- #8

mathman


I can give you a very simple example to show what happens. Toss a fair coin twice, and for this analysis, let heads be called +1 and tails -1. Then each toss has mean 0 and variance 1.

Now impose the condition that the sum of the two tosses equals 2. The only outcome consistent with this is heads on both tosses, so the conditional distribution of each toss is the point mass at +1: the conditional means are now 1 and the conditional variances are 0.
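The example is small enough to enumerate exhaustively; a quick sketch:

```python
from itertools import product
from fractions import Fraction

# Brute-force the coin example above: two fair tosses, heads = +1, tails = -1.
# Conditioning on sum == 2 keeps only the outcome (+1, +1), so the
# conditional mean of each toss is 1 and the conditional variance is 0.
outcomes = list(product((1, -1), repeat=2))          # all four equally likely
kept = [(a, b) for a, b in outcomes if a + b == 2]   # condition: sum = 2

mean_first = Fraction(sum(a for a, _ in kept), len(kept))
var_first = sum((a - mean_first) ** 2 for a, _ in kept) / len(kept)
prod_mean = Fraction(sum(a * b for a, b in kept), len(kept))

print(mean_first, var_first, prod_mean)  # 1 0 1, versus unconditional 0, 1, 0
```

This makes the general point concrete: conditioning on the sum changes the distribution, so unconditional moments cannot simply be plugged in.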
