# Infinity Conundrums

1. Sep 29, 2010

### Oldfart

I hope that this is the right place to post this.

If I walk down an infinite yellow brick road, flipping a fair coin with each step, and taking a step to the left if tails, a step to the right if heads, it would seem possible that I could end up a trillion miles from the road's centerline. (a) But is it certain that this must occur? (b) Is it possible that I could end up an infinite distance from the road's centerline? and (c) Is it possible that after an infinite walk, I would never have exceeded one step to the right or left of the centerline?

I seem to have severe problems with infinity...

OF

2. Sep 29, 2010

### Petr Mugver

You can calculate the probability that, after N steps, you are, say, between a and b steps away from the centerline. If N is large, the central limit theorem says that your random walk's displacement is approximately a Gaussian random variable with mean zero and standard deviation $$\sqrt{N}$$.
For example, after one million steps there's a probability of about 68% that you are less than a thousand steps away from the centerline. To have a comparable probability of being a trillion miles away, you would need to take about (trillion)² steps... good luck, Forrest Gump!
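A quick Monte Carlo sketch of that 68% figure (my own illustration, not from the thread; the script and its parameters are hypothetical, and the walk is shortened for speed):

```python
import math
import random

random.seed(1)

N_STEPS = 2500   # steps per walk (far fewer than a million, for speed)
N_WALKS = 400    # number of simulated walkers
sigma = math.sqrt(N_STEPS)   # CLT: displacement is roughly Gaussian with sd sqrt(N)

within = 0
for _ in range(N_WALKS):
    # Net displacement from the centerline after N_STEPS fair-coin steps.
    pos = sum(random.choice((-1, 1)) for _ in range(N_STEPS))
    if abs(pos) < sigma:
        within += 1

frac = within / N_WALKS
print(f"fraction of walks ending within sqrt(N) of the centerline: {frac:.3f}")
print("CLT prediction: about 0.683")
```

The 68% is just the probability of a Gaussian landing within one standard deviation of its mean.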

3. Sep 30, 2010

### Tac-Tics

First, I have to ask... why are you taking steps right and left instead of down and up the road :P

More seriously.

a) There is no certainty that you will ever get any distance away. Consider the (highly unlikely) event that every flip comes up opposite to the previous one: heads, tails, heads, tails, ... In the limit, you never get more than one step from the centerline.

b) In the (equally unlikely) event that you always flip heads (or always flip tails), your distance from the centerline grows without bound in the limit. So this, too, is a possibility.

And c) is possible too, by the same alternating sequence described in a).

Now, calculating the probabilities of these things happening can be tricky, but save that for the homework :)

If you are having trouble understanding it, you might think of the problem this way instead.

Replace the coin flips with a sequence of -1's and +1's. The net displacement is the infinite sum of this sequence, and the infinite sum is just the limit of the partial sums (with the possibility that the limit diverges).
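The partial-sums picture can be sketched directly; the two sequences below are the hypothetical extreme cases from a) and b):

```python
from itertools import accumulate

# Alternating flips: +1, -1, +1, -1, ...  (the bounded case from a and c)
alternating = [(-1) ** i for i in range(20)]
partial_alt = list(accumulate(alternating))
# The partial sums just bounce between 1 and 0: never more than one step away.
print(partial_alt)   # [1, 0, 1, 0, ...]

# All heads: +1, +1, +1, ...  (the divergent case from b)
all_heads = [1] * 20
partial_heads = list(accumulate(all_heads))
# The partial sums 1, 2, 3, ... grow without bound: the limit diverges.
print(partial_heads)
```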

4. Sep 30, 2010

### mathman

In the simple one-dimensional random walk (integer steps with equal probability in either direction), the probability that any given point will eventually be reached is one. The same is true in two dimensions, but not in three or more (this is Pólya's recurrence theorem).
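"Eventually" can't be checked directly in a finite simulation, but a sketch (my own, hypothetical) shows the probability of returning to the origin creeping toward 1 as the time horizon grows:

```python
import random

random.seed(2)

def returns_within(max_steps):
    """Simulate one walk; report whether it revisits the origin within max_steps."""
    pos = 0
    for _ in range(max_steps):
        pos += random.choice((-1, 1))
        if pos == 0:
            return True
    return False

TRIALS = 2000
results = {}
for horizon in (10, 100, 1000):
    hits = sum(returns_within(horizon) for _ in range(TRIALS))
    results[horizon] = hits / TRIALS
    print(f"P(return to origin within {horizon:4d} steps) ~ {results[horizon]:.2f}")
```

The convergence is slow (the shortfall shrinks only like 1/sqrt(horizon)), which is why the three horizons are spaced by factors of ten.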

5. Sep 30, 2010

### Petr Mugver

This statement is related to the following problem:

$$p(n,k)=\frac{1}{2}\big[p(n-1,k-1)+p(n-1,k+1)\big]\qquad\textrm{for}\quad n>0,\,\forall k$$

with initial conditions

$$p(0,k)=\delta(k)\qquad\forall k$$

with n and k integers. Does anyone know a general method for solving this kind of equation (difference equations in more than one variable)?
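One standard approach for this particular equation is a generating function (equivalently, a Fourier transform in k), which yields the binomial closed form p(n, k) = C(n, (n+k)/2) / 2^n when n + k is even and |k| ≤ n, and 0 otherwise. A quick numerical check of that claim (my own sketch):

```python
from math import comb

def p_closed(n, k):
    """Closed-form solution: the binomial distribution on the lattice."""
    if abs(k) > n or (n + k) % 2:
        return 0.0
    return comb(n, (n + k) // 2) / 2 ** n

# Build p(n, k) from the recursion itself and compare term by term.
N = 12
p = {(0, k): (1.0 if k == 0 else 0.0) for k in range(-N, N + 1)}
for n in range(1, N + 1):
    for k in range(-n, n + 1):
        p[(n, k)] = 0.5 * (p.get((n - 1, k - 1), 0.0) + p.get((n - 1, k + 1), 0.0))

assert all(abs(p[(n, k)] - p_closed(n, k)) < 1e-12
           for n in range(N + 1) for k in range(-n, n + 1))
print("recursion matches the binomial closed form up to n =", N)
```

This checks one known solution rather than giving a general solving method, but the generating-function idea (multiply by x^k and sum over k) does carry over to other constant-coefficient difference equations.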

6. Sep 30, 2010

### Oldfart

7. Sep 30, 2010

### Thecla

Re: Infinity Conundrums - crossing the starting point

According to Alex Bellos, author of *Here's Looking at Euclid* (p. 234), if the coin is tossed a great many times, the most likely number of times the walker will cross the starting point is zero.
The next most likely number is one, then two, then three, and so on.

This goes against what I first thought: fair coin, 50% probability, so you would think you would spend equal time on both sides of the starting point. Not true.
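Bellos's claim can be probed with a finite simulation (a hypothetical script; "crossing" here means a sign change of the walk's position). For 100-step walks the drop from 0 crossings to 1 is slight, but the decreasing trend over larger crossing counts is clear:

```python
import random
from collections import Counter

random.seed(3)

def crossings(n_steps):
    """Count the times the walk's position changes sign (crosses the start)."""
    pos, last_sign, count = 0, 0, 0
    for _ in range(n_steps):
        pos += random.choice((-1, 1))
        sign = (pos > 0) - (pos < 0)
        if sign and last_sign and sign != last_sign:
            count += 1
        if sign:
            last_sign = sign
    return count

tallies = Counter(crossings(100) for _ in range(3000))
for k in range(4):
    print(f"{k} crossings: {tallies[k]} walks")
```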

8. Oct 1, 2010

### mathman

It doesn't conflict. Probability 1 doesn't mean certainty.

9. Oct 1, 2010

### Oldfart

OK, thanks, I was sort of afraid of something like that. Dang mathspeak...

Can someone briefly explain to me what a probability of 1 conveys? (I thought it was like the probability of a flipped coin landing either heads or tails being 1, or 100%.)

Thanks, OF

10. Oct 1, 2010

### G037H3

IMO, it's like how 0.999 is treated as 1 in most applications, but isn't 100% certainty.

Probabilistic calculations never ensure certainty, except in special cases. What they do is describe data according to (usually) a Gaussian or some other distribution, in which even events so rare as to appear impossible (5 sigma or greater) can still occur, sometimes more often than the model predicts, because of a phenomenon called a fat tail.

11. Oct 1, 2010

### Oldfart

OK...

I knew that I shouldn't have come in here...

But as I sneak back out, could anyone tell me what the probability is of a flipped coin landing either heads or tails?

Thanks, OF

12. Oct 1, 2010

### G037H3

Near one, but not perfect.

If you're talking about an actual physical coin, the odds of landing on either side aren't perfectly even, btw, as it is unlikely that the coin is perfectly balanced...

but in a thought experiment the odds are exactly 50/50, because the coin is just a stand-in for a binary outcome, 0 or 1, a model with a wide range of uses even outside of computer science. :)

I'm sure that someone with more reading on probability could give you better answers, I personally haven't studied probability at all, but am simply using my logic as a tool to answer your questions.

13. Oct 1, 2010

### Oldfart

14. Oct 1, 2010

### G037H3

Yeah, the coin could also land on its edge, with probability likely less than 1 in 10^6 (one in a million), but it's always possible.

But like I said, the problems/models with coin flipping are hypothetical, where it is a virtual coin, and the odds of either side are exactly 50/50.

It's a pretty basic example of the contrast between a mathematical ideal and reality. The odds of the coin not landing on a side are so low that it is safe to ignore them and still get accurate results from the binary model the coin represents. But my point was just that you have to keep in mind that you aren't dealing with a real physical coin, so you shouldn't think of it that way. It's just an intuitive crutch to help you understand 50/50 probability.

Last edited: Oct 2, 2010
15. Oct 2, 2010

### The_Duck

I think that the OP might enjoy reading the Wikipedia page on "almost surely": http://en.wikipedia.org/wiki/Almost_surely

Last edited by a moderator: May 5, 2017
16. Oct 2, 2010

### Oldfart

Thanks for the link, Duck! Very informative!

OF

Last edited by a moderator: May 5, 2017
17. Oct 2, 2010

### Hurkyl

Staff Emeritus
A probability measure evaluated on an event gives the value 1.

I assume, however, you are interested not in what it means mathematically, but how it might be interpreted into the real world. Frequentism is common -- if we repeat the same experiment indefinitely, the proportion of times the event occurred converges to 1.

(philosophical issues brushed aside)

In such an interpretation, a particular event can have probability 1 and still fail to happen, so long as it fails sufficiently infrequently, e.g. if it failed the first time but occurred every time after that.

18. Oct 2, 2010

### Petr Mugver

Can you give us a practical example (with coins, dice, etc.) of this? Note that the example Mathman gave doesn't apply here, because he says only that the probability that his particular event will EVENTUALLY occur is one, so you can never say that a particular realization of the experiment failed. In other words: can you give an example of an event that has p = 1 but can also fail to occur? (With discrete random variables, so we can rule out problems with zero-measure sets.)

19. Oct 2, 2010

### mathman

A simple illustration of the difference between probability one and certainty comes from considering an infinite sequence of (fair) coin flips. The probability of getting an approximately equal number of heads and tails is one (law of large numbers), but it is not certain: it is possible that every flip turns up heads, though that event has probability zero.
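The law-of-large-numbers half of this can be watched numerically (a hypothetical sketch; the probability-zero all-heads outcome of course never shows up in a simulation):

```python
import random

random.seed(4)

# One long run of fair coin flips; 1 = heads, 0 = tails.
flips = [random.choice((0, 1)) for _ in range(100_000)]

frac = 0.0
for n in (100, 1_000, 100_000):
    # Fraction of heads among the first n flips: tightens around 1/2 as n grows.
    frac = sum(flips[:n]) / n
    print(f"after {n:6d} flips: fraction of heads = {frac:.4f}")
```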

20. Oct 3, 2010

### Hurkyl

Staff Emeritus
The simplest example I've come up with in the past: flip a particular coin repeatedly; the event that you eventually get a heads has probability 1, yet the all-tails outcome, in which a first heads never arrives, is still possible.