Is the Proof of Geometric Progression in Probability Common Sense?

  • Context: Undergrad
  • Thread starter: Gunmo
  • Tags: Geometric Proof

Discussion Overview

The discussion revolves around the proof of a geometric progression in the context of probability, particularly focusing on the expected number of trials until the first success in a Bernoulli process. Participants express uncertainty about the proof's validity and its classification as "common sense" as stated in a statistics textbook.

Discussion Character

  • Debate/contested
  • Mathematical reasoning
  • Technical explanation

Main Points Raised

  • Some participants question the proof's validity, noting that key variables are not defined in the provided equations.
  • Others suggest that the proof relates to the Binomial Theorem, indicating that the expected value can be derived from it.
  • One participant proposes a method involving taking a derivative to solve the summation related to the geometric distribution.
  • Several participants discuss the concept of memorylessness in geometric distributions, suggesting it is relevant to calculating expected values.
  • There is disagreement about whether the problem is binomial or not, with some asserting it is a Bernoulli process rather than a binomial distribution.
  • One participant claims to have proven the expected value associated with the geometric distribution, asserting that no textbook has provided this proof.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the classification of the distribution as binomial or Bernoulli, and there is ongoing debate regarding the validity of the proof and the definitions of the variables involved.

Contextual Notes

Some participants express concerns about convergence issues related to the summation, while others suggest that these concerns can be addressed through various mathematical approaches, including the use of Markov chains.

Gunmo
My statistics textbook does not prove this; the author thinks it is common sense. I am not sure about this proof.

Thank you.
 

Attachments

  • Untitled.png
Below is the contents of your attachment.
$$u = \sum_{n = 1}^\infty x p(1 - p)^{n - 1}$$
$$ u = \frac 1 p$$
I would not consider this a proof in any way, in part because neither x nor p is defined.
 
Sorry, that was a typo; I have attached it again.
 

Attachments

  • Untitled.png
Could you have a look at the new attachment, please?
Thank you for finding the error in the attachment.
 
Gunmo said:
Could you have a look at the new attachment, please?
Thank you for finding the error in the attachment.

Do you know the Binomial Theorem?
 
Yes, this is actually about probability.
 
Gunmo said:
Yes, this is actually about probability.

The proof you are looking for involves the Binomial Theorem - which, given you are dealing with a Binomial Distribution, may not be that surprising.

Hint: take out the factor of ##p## and deduce from the answer what the remaining infinite sum must be. Then try to find the trick.
 
Not sure yet, but eventually this:

$$\sum_{x=1}^{\infty} x k^x, \qquad 0 < k < 1$$

As ##x## approaches infinity, ##k^x## approaches 0.

The answer is, 1/k.
 
Gunmo said:
Not sure yet, but eventually this:

$$\sum_{x=1}^{\infty} x k^x, \qquad 0 < k < 1$$

As ##x## approaches infinity, ##k^x## approaches 0.

The answer is, 1/k.

I'm not sure what all that means. Here's some LaTeX. If you reply to my post, you'll get the LaTeX source:

$$\sum_{n=1}^{\infty} np(1-p)^{n-1} = \frac{1}{p}$$
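This identity is easy to sanity-check numerically before proving it; a minimal Python sketch that truncates the infinite sum at a large finite number of terms:

```python
# Partial sums of sum_{n>=1} n*p*(1-p)^(n-1), which should approach 1/p.
def expected_trials(p, terms=10_000):
    """Truncated series for the mean number of trials until first success."""
    return sum(n * p * (1 - p) ** (n - 1) for n in range(1, terms + 1))

for p in (0.1, 0.5, 0.9):
    print(f"p={p}: partial sum = {expected_trials(p):.6f}, 1/p = {1/p:.6f}")
```

The tail of the series decays geometrically, so 10,000 terms is far more than enough for the probabilities shown.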
 
  • #10
Instead of being a binomial distribution (it doesn't have the combinations term), it looks like it is computing the mean number of times you need to flip a (weighted) coin before it comes up heads, with ##p## being the probability that heads comes up on any given throw. @PeroK gives a good hint: you need to look for a "trick" for solving the summation of ## n(1-p)^{n-1} ##. I can add to that: the trick involves taking a derivative.

Edit: The OP @Gunmo mentions in the original post that the author says this is a "common sense" result. It's good to prove it mathematically, but it does make sense: if you have a probability ## p=\frac{1}{100} ## of a success, in general you need approximately ## \bar{n}= \frac{1}{p}=100 ## tries to get a success.
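To spell out that derivative trick (a standard manipulation, writing ##q = 1-p## with ##0 < q < 1##):

$$\sum_{n=1}^{\infty} n q^{n-1} = \frac{d}{dq} \sum_{n=0}^{\infty} q^n = \frac{d}{dq}\left(\frac{1}{1-q}\right) = \frac{1}{(1-q)^2}$$

so that

$$\sum_{n=1}^{\infty} n p (1-p)^{n-1} = p \cdot \frac{1}{(1-(1-p))^2} = \frac{1}{p}$$

Termwise differentiation is justified inside the radius of convergence, ##|q| < 1##.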
 
  • #11
There are a lot of ways to solve this problem...

As mentioned, you are trying to find the expected value associated with a geometric distribution here. The solutions mentioned thus far do not address convergence issues. So long as you are ok with that / understand the radius of convergence of a geometric series, I would suggest using the memoryless property of a geometric distribution.

That is: you are calculating the expected number of coin flips until heads, where heads occurs with probability ##p##. Your expected value calculation is: you have one coin toss no matter what, and then with probability ##1 - p## you get a tails and are starting over with the exact same incremental expectation, again because of memorylessness:

## E[X] = 1 + (1-p)E[X]##

Then solve for ##E[X]##

If you are worried about convergence, you either need to explicitly address the radius of convergence or set this up as an absorbing-state Markov chain (2 states only) and use telescoping on the partitioned matrix. In the latter case, convergence is not an issue that needs to be addressed. Markov chains are overkill here, but I do like the pictures associated with them, and they provide a nice alternative approach when dealing with convergence questions.
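The ##E[X] = 1/p## answer is also easy to check by simulation; a small Monte Carlo sketch in Python (an illustration, not a proof):

```python
import random

def flips_until_heads(p, rng):
    """Simulate flips of a p-weighted coin; return the index of the first head."""
    n = 1
    while rng.random() >= p:  # tails with probability 1 - p
        n += 1
    return n

rng = random.Random(0)  # fixed seed for reproducibility
p = 0.25
trials = 200_000
avg = sum(flips_until_heads(p, rng) for _ in range(trials)) / trials
print(f"empirical mean = {avg:.3f}, 1/p = {1/p}")
```

With 200,000 trials the standard error is well under 0.01, so the empirical mean lands very close to ##1/p = 4##.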
 
  • #12
StoneTemplePython said:
## E[X] = 1 + (1-p)E[X]##

Then solve for ##E[X]##


That's neat.

There are no convergence issues to worry about if you apply the binomial theorem.
 
  • #13
@PeroK : Please read my post #10. I don't think it is binomial. As @StoneTemplePython says, the OP's problem here is to calculate the expected number of coin flips until heads occurs, given that heads occurs with probability ##p##. (The OP's formula is not calculating the probability of ## k ## successes in ## n ## trials, nor the mean number of successes of a binomial distribution.)

Note: the "trick" most readily employed in solving this one is essentially the same one (taking a derivative) used in finding the mean ## k ## of a binomial, but this is not the binomial distribution.
 
  • #14
Charles Link said:
@PeroK : Please read my post #10. I don't think it is binomial. As @StoneTemplePython says, the OP's problem here is to calculate the expected number of coin flips until heads occurs, given that heads occurs with probability ##p##. (The OP's formula is not calculating the probability of ## k ## successes in ## n ## trials, nor the mean number of successes of a binomial distribution.)

Note: the "trick" most readily employed in solving this one is essentially the same one (taking a derivative) used in finding the mean ## k ## of a binomial, but this is not the binomial distribution.

If this isn't a binomial distribution I don't know what is. True, you are calculating the mean time to get the first occurrence, but it's clearly a binomial.
 
  • #15
PeroK said:
If this isn't a binomial distribution I don't know what is. True, you are calculating the mean time to get the first occurrence, but it's clearly a binomial.

Actually, this isn't quite right. It's a Bernoulli process; the binomial distribution counts the successes in a fixed number of Bernoulli trials. I've always used "binomial" for both cases, where there are only two outcomes.

The binomial theorem applies in any case.
 
  • #16
This is the mean-of-the-geometric-distribution problem.
No statistics or math textbook I have seen proves it. I have proved it now.
 

Attachments

  • 1.pdf
