Sum of Independent Random Variables

In summary, the sum of independent random variables is a basic construction in probability theory for combining several random variables into one. Its realized value is simply the total of the individual values, and the independence assumption is what makes probabilities and moments of the sum tractable. The sum can be negative if one or more of the individual variables can take negative values, and it has many real-world applications, such as modeling stock market fluctuations or aggregating yearly losses as in the thread below.
  • #1
mrkb80

Homework Statement

Three yearly losses, assumed independent:
First: exponential
Second & third: Weibull

Find the 95% VaR of the minimum loss.


Homework Equations

The Attempt at a Solution


My first thought was:
Let L be the total loss, and let A, B, C be the first, second, and third losses.

Pr(L<x) = Pr(A<x) * Pr(B<x) * Pr(C<x)

But logically this seems wrong to me, because if Pr(A<x) = 100% then Pr(L<x) should be 100%.

So then I thought maybe:

Pr(L<x) = a_1 Pr(A<x) + a_2 Pr(B<x) + a_3 Pr(C<x)

but how do I choose a_1, a_2, a_3? It seems like these would change based on the values of A, B, C, which would make them random variables themselves!

The prof hinted that we just need to find the CDF, and that to find the CDF we should first find the survival function, keeping in mind that the random variables are independent.

I'm sort of out of ideas.
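
A sketch of where that hint leads, assuming "min loss" literally means the minimum of the three losses: for independent A, B, C the survival function of M = min(A, B, C) factors,
[tex] P(M > x) = P(A > x)\,P(B > x)\,P(C > x) = S_A(x)\,S_B(x)\,S_C(x), [/tex]
so [itex] F_M(x) = 1 - S_A(x) S_B(x) S_C(x) [/itex], and the 95% VaR is the value of x solving [itex] F_M(x) = 0.95 [/itex].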
 
  • #2
mrkb80 said:
So then I thought maybe:
Pr(L<x) = a_1 Pr(A<x) + a_2 Pr(B<x) + a_3 Pr(C<x)
but how do I choose a_1, a_2, a_3?

You say that you think Pr(L<x) = a_1 Pr(A<x) + a_2 Pr(B<x) + a_3 Pr(C<x).

Why on Earth would you ever think that? There are perfectly standard formulas for the probability distribution of a sum of independent random variables; they appear in every probability textbook and in hundreds of online articles.

BTW: what you have written is not the standard cdf, which would be Pr(L ≤ x), with a non-strict inequality. The strict-inequality version was long ago abandoned as the definition of a cdf.
 
  • #3
Point taken. I'm going down this path now:
[itex] F_{A+B+C}(x) = P(A+B+C \le x) [/itex]
[itex] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \int_{-\infty}^{x-b-c} f_A(a) f_B(b) f_C(c) da db dc [/itex]

[itex] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} F_A(x-b-c) f_B(b) f_C(c) db dc [/itex]
Not sure where to go next...
 
  • #4
mrkb80 said:
[itex] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} F_A(x-b-c) f_B(b) f_C(c) db dc [/itex]
Not sure where to go next...

I doubt that the task is 'doable' analytically. Probably you need to resort to a numerical method.

Probably the easiest way is through successive convolution: if X = B+C, then its density is
[tex] f_X(x) = \int_0^x f_B(y) f_C(x-y) \, dy.[/tex]
Then the density of S = A+B+C = A + X is
[tex] f_S(x) = \int_0^x f_A(y) f_X(x-y) \, dy[/tex]
Note that the integrations only go from 0 to x, not from -∞ to +∞; this is because the random variables are all ≥ 0.

However, at this point I think you are stuck: you probably need to compute and store the values of ##f_X(x)## on a grid of x-values, or maybe come up with a convenient approximate formula for it, because in some cases at least we can prove that there is no finite formula for ##f_X## in terms of elementary functions. We can establish this by example: suppose B and C are iid Weibull with parameters k = 2 and λ = 1, so that the density of B (and C) is
[tex] f_B(x) = f_C(x) = 2x e^{-x^2}.[/tex]
We can actually do the integral to get ##f_X##:
[tex] f_X(x) = x e^{-x^2} + \sqrt{\frac{\pi}{2}}\,(x^2-1)\, e^{ -x^2 /2}\, \text{erf}(x/\sqrt{2}). [/tex]
It is impossible to give a finite, closed-form formula for this in terms of elementary functions, because if we could do it, we would have a finite, closed-form expression for 'erf', and that has been shown to be impossible. Of course, you can give non-finite expressions, such as infinite series and the like.

Anyway, since that special case cannot be done in simple terms, there is no hope of a simple closed form in the general case either.

All I can suggest is some type of numerical approximation once actual numerical inputs are specified. However, quantities like the mean and variance of S = A+B+C are easily obtained using standard results about moments of sums of independent random variables.
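
To make that last remark concrete, for independent A, B, C the moments combine as
[tex] E[S] = E[A] + E[B] + E[C], \qquad \text{Var}(S) = \text{Var}(A) + \text{Var}(B) + \text{Var}(C). [/tex]

And here is a minimal numerical sketch of the successive-convolution idea, in Python with NumPy/SciPy. The parameters (exponential with rate 1; Weibull with k = 2, λ = 1) are illustrative assumptions, not values from the original problem. The densities are tabulated on a grid, convolved by a Riemann sum, and the 95% quantile of S is read off the numerically integrated CDF.

[code]
import numpy as np
from scipy import stats

# Grid for the non-negative losses. The parameters below are
# illustrative assumptions, not values given in the original problem.
dx = 0.001
x = np.arange(0.0, 20.0, dx)

f_A = stats.expon(scale=1.0).pdf(x)                # exponential loss A
f_B = stats.weibull_min(c=2.0, scale=1.0).pdf(x)   # Weibull loss B (k = 2, lambda = 1)
f_C = f_B.copy()                                   # C is iid with B

def conv(f, g, dx):
    """Riemann-sum approximation of (f*g)(x) = int_0^x f(y) g(x-y) dy,
    the density of the sum of two independent non-negative r.v.s."""
    return np.convolve(f, g)[: len(f)] * dx

f_X = conv(f_B, f_C, dx)   # density of X = B + C
f_S = conv(f_A, f_X, dx)   # density of S = A + B + C = A + X

# CDF by cumulative integration, then the 95% quantile of the total loss.
F_S = np.cumsum(f_S) * dx
q95 = x[np.searchsorted(F_S, 0.95)]
print(f"P(S <= {q95:.3f}) is approximately 0.95")
[/code]

On a fine grid the computed ##f_X## agrees with the exact erf formula above to within discretization error, which is an easy sanity check before trusting the quantile.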
 

1. What is the definition of the "sum of independent random variables"?

The sum of independent random variables is a new random variable built by adding several random variables that do not influence one another. It is often used to model real-world situations where multiple independent factors contribute to a single outcome.

2. How is the sum of independent random variables calculated?

A realized value of the sum is obtained simply by adding the realized values of the individual variables. Its distribution, however, is obtained by convolving the individual distributions, as in the thread above, and its mean and variance are the sums of the individual means and variances.
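
A quick numerical illustration of both points, using Monte Carlo in Python with NumPy (the exponential and Weibull parameters are again illustrative assumptions, chosen only to mirror the thread above):

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative parameters, not values from the original problem.
a = rng.exponential(scale=1.0, size=n)   # exponential loss
b = rng.weibull(2.0, size=n)             # Weibull(k = 2, lambda = 1) loss
c = rng.weibull(2.0, size=n)             # iid with b

s = a + b + c                            # realized sums of independent losses

print(np.quantile(s, 0.95))                       # empirical 95% quantile of the sum
print(s.mean(), a.mean() + b.mean() + c.mean())   # means add
print(s.var(), a.var() + b.var() + c.var())       # variances add (independence)
[/code]

The last two lines check the moment identities numerically; up to sampling noise the paired values agree.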

3. What is the significance of independent random variables in probability theory?

Independent random variables are significant in probability theory because independence makes joint probabilities factor into products and makes the variances of sums add. By assuming that certain variables are independent, we can greatly simplify the calculation and analysis of probabilities and outcomes.

4. Can the sum of independent random variables ever be negative?

Yes, the sum of independent random variables can be negative. This occurs when one or more of the individual variables can take a negative value, so it is important to consider the range of possible values of each variable. (The losses in the thread above are all non-negative, which is why the convolution integrals run only from 0 to x.)

5. Are there any real-world applications of the sum of independent random variables?

Yes, there are many real-world applications of the sum of independent random variables. Some examples include modeling stock market fluctuations, predicting weather patterns, and analyzing the success of a marketing campaign. Any situation where multiple factors contribute to an outcome can be modeled using the concept of the sum of independent random variables.
