## Main Question or Discussion Point

I've been struggling with this problem for more than 4 days now:

Let A, B, and C be exponentially distributed random variables with parameters lambda_A, lambda_B, and lambda_C, respectively.

Calculate E[B | A < B < C] in terms of the lambdas.

I always seem to get an integral that is impossible to calculate...

Does anyone have a suggestion for how to solve this problem?
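One way to sanity-check any closed-form answer is a quick Monte Carlo estimate: sample the three variables, keep only the draws where A < B < C, and average B over the accepted draws. The sketch below assumes the variables are independent and uses example rates lambda_A = 1, lambda_B = 2, lambda_C = 3 (my own choices, not from the problem statement):

```python
import random

def estimate_conditional_mean(lam_a, lam_b, lam_c, n=200_000, seed=0):
    """Monte Carlo estimate of E[B | A < B < C] for independent
    exponentials with the given rates (a sketch, not a derivation)."""
    rng = random.Random(seed)
    total, count = 0.0, 0
    for _ in range(n):
        a = rng.expovariate(lam_a)
        b = rng.expovariate(lam_b)
        c = rng.expovariate(lam_c)
        if a < b < c:       # keep only samples where the event occurs
            total += b
            count += 1
    return total / count    # average of B over the conditioning event

print(estimate_conditional_mean(1.0, 2.0, 3.0))
```

Comparing this estimate against a candidate closed-form expression for a few rate combinations is a cheap way to tell whether an attempted integral came out right.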
