# Could you help me with a proof? (geometric progression)

1. May 14, 2017

### Gunmo

My statistics textbook does not prove this. The author thinks it is common sense. I am not sure how to prove it.

Thank you.

#### Attached Files:

• ###### Untitled.png
2. May 14, 2017

### Staff: Mentor

Below is the contents of your attachment.
$$u = \sum_{n = 1}^\infty x p(1 - p)^{n - 1}$$
$$u = \frac 1 p$$
I would not consider this a proof in any way, in part because neither x nor p is defined.

3. May 14, 2017

### Gunmo

Sorry, a typo. I have attached it again.

#### Attached Files:

• ###### Untitled.png
4. May 14, 2017

### Gunmo

Could you have a look at the new attachment, please?
Thank you for finding the error in the attachment.

5. May 14, 2017

### PeroK

Do you know the Binomial Theorem?

6. May 14, 2017

### Gunmo

Yes, this is actually about probability.

7. May 14, 2017

### PeroK

The proof you are looking for involves the Binomial Theorem - which, given you are dealing with a Binomial Distribution, may not be that surprising.

Hint: take out the factor of $p$ and deduce from the answer what the remaining infinite sum must be. Then try to find the trick.

8. May 14, 2017

### Gunmo

Not sure yet. Eventually this becomes

$$\sum_{x=1}^{\infty} x k^x, \qquad 0 < k < 1,\quad x = 1, 2, 3, \ldots$$

As $x$ approaches infinity, $k^x$ approaches $0$.

9. May 14, 2017

### PeroK

I'm not sure what all that means. Here is some LaTeX; if you reply to my post, you'll get the LaTeX source:

$$\sum_{n=1}^{\infty} np(1-p)^{n-1} = \frac{1}{p}$$
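A quick numerical sketch of this identity (not part of the original post; the values of $p$ and the cutoff $N$ are illustrative): truncate the infinite sum at a large $N$ and compare the partial sum against $1/p$.

```python
# Check sum_{n=1}^inf n * p * (1-p)^(n-1) = 1/p numerically
# by truncating the series at a large N (illustrative values).

def truncated_mean(p, N=10_000):
    """Partial sum of n * p * (1-p)^(n-1) for n = 1..N."""
    return sum(n * p * (1 - p) ** (n - 1) for n in range(1, N + 1))

for p in (0.5, 0.1, 0.01):
    print(f"p={p}: partial sum = {truncated_mean(p):.6f}, 1/p = {1 / p:.6f}")
```

The tail of the series decays like $(1-p)^N$, so for these values of $p$ the truncation error at $N = 10{,}000$ is negligible.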

10. May 14, 2017

Instead of being a binomial distribution (it doesn't have the combinations term), it looks like it is computing the mean number of times you need to flip a (weighted) coin before it comes up heads, with $p$ being the probability that heads comes up on any given throw. @PeroK gives a good hint: you need to look for a "trick" for evaluating the summation of $n(1-p)^{n-1}$. I can add to that: the trick involves taking a derivative.

Edit: The OP @Gunmo mentions in the original post that the author says this is a "common sense" result. It is good to prove it mathematically, but it does make intuitive sense: if you have a probability $p = \frac{1}{100}$ of a success, then in general you need approximately $\bar{n} = \frac{1}{p} = 100$ tries to get a success.
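The derivative trick mentioned above can be sketched as follows (a sketch, assuming $0 < p \le 1$ so that $q = 1 - p$ satisfies $|q| < 1$ and term-by-term differentiation of the geometric series is justified):

```latex
% Geometric series, valid for |q| < 1:
\sum_{n=0}^{\infty} q^n = \frac{1}{1-q}
% Differentiate both sides with respect to q:
\sum_{n=1}^{\infty} n q^{n-1} = \frac{1}{(1-q)^2}
% Substitute q = 1 - p (so 1 - q = p) and multiply by p:
\sum_{n=1}^{\infty} n p (1-p)^{n-1} = \frac{p}{p^2} = \frac{1}{p}
```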

Last edited: May 14, 2017
11. May 15, 2017

### StoneTemplePython

There are a lot of ways to solve this problem...

As mentioned, you are trying to find the expected value associated with a geometric distribution here. The solutions mentioned thus far do not address convergence issues. So long as you are ok with that / understand the radius of convergence of a geometric series, I would suggest using the memoryless property of a geometric distribution.

That is: you are calculating the expected number of coin flips until heads, where heads occurs with probability $p$. The expected value calculation is: you have one coin toss no matter what, and then with probability $1 - p$ you get a tails and are starting over with the exact same incremental expectation -- again because of memorylessness:

$E[X] = 1 + (1-p)E[X]$

Then solve for $E[X]$.

If you are worried about convergence, you either need to address the radius of convergence explicitly or set this up as an absorbing-state Markov chain (two states only) and use telescoping on the partitioned matrix. In the latter case, convergence is not an issue that needs to be addressed. Markov chains are overkill here, but I do like the pictures associated with them, and they provide a nice alternative approach when dealing with convergence questions.
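To illustrate the memoryless argument above, here is a small Monte Carlo simulation (a sketch, not part of the thread; $p = 0.25$, the trial count, and the seed are illustrative choices). It estimates $E[X]$ by repeatedly flipping a weighted coin until heads and compares the estimate with $1/p$, the value obtained by solving $E[X] = 1 + (1-p)E[X]$.

```python
import random

def flips_until_heads(p, rng):
    """Simulate one geometric trial: count flips until the first heads."""
    n = 1
    while rng.random() >= p:  # tails occurs with probability 1 - p
        n += 1
    return n

def estimate_mean(p, trials=200_000, seed=0):
    """Monte Carlo estimate of E[X] for the geometric distribution."""
    rng = random.Random(seed)
    return sum(flips_until_heads(p, rng) for _ in range(trials)) / trials

p = 0.25
# Solving E[X] = 1 + (1 - p) * E[X] gives E[X] = 1/p = 4 for this p.
print(f"simulated E[X] = {estimate_mean(p):.3f}, exact 1/p = {1 / p}")
```

With 200,000 trials the standard error is well under 0.01, so the estimate should land close to 4.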

12. May 15, 2017

### PeroK

That's neat.

There are no convergence issues to worry about if you apply the binomial theorem.

13. May 15, 2017

@PeroK: Please read my post #10. I don't think it is binomial. As @StoneTemplePython says, the OP's problem here is to calculate the expected number of coin flips until heads occurs, given that heads occurs with probability $p$. (The OP's formula is not calculating the probability of $k$ successes in $n$ trials, nor the mean number of successes of a binomial distribution.)

Note: the "trick" most readily employed in solving this one is essentially the same one (taking a derivative w.r.t. $p$) used in finding the mean $k$ for a binomial, but this one is not the binomial distribution.

Last edited: May 15, 2017
14. May 15, 2017

### PeroK

If this isn't a binomial distribution, I don't know what is. True, you are calculating the mean time to the first occurrence, but it's clearly binomial.

15. May 15, 2017

### PeroK

Actually, this isn't quite right. It's a Bernoulli process, the binomial distribution being the number of successes in a fixed number of Bernoulli trials. I've always used "binomial" for both cases, where there are only two outcomes.

The binomial theorem applies in any case.

16. May 15, 2017

### Gunmo

This is the mean of the geometric probability distribution.
None of the statistics and math textbooks I have seen prove it. I have proved it now.
