Feynman - Random Walk <D> and coin flipping

AI Thread Summary
The discussion centers on the concept of random walks and their relation to coin flipping, specifically the expected value of the distance from the starting point after a number of moves. It highlights that as more moves are made in a random walk, the expected distance from the origin increases, paralleling how the difference between heads and tails in coin flips grows even as the fraction of tails approaches 1/2. Participants agree that the average difference in coin flips corresponds to the expected value of distance in random walks, emphasizing that both are really the same process with the same distribution. Additionally, it is clarified that the "square root of N" statement in the text refers to the root mean square (RMS) distance rather than the mean absolute deviation (MAD), correcting a misreading. The conversation illustrates the interconnectedness of probability theory concepts through these examples.
QED-Kasper
Hello,

I have read the probability chapter in Feynman's lectures on physics, and got fascinated by the random walk. There is a statement that in a game where a vertical distance of either +1 or -1 is walked each move, the expected value of the absolute distance (let's call it <D>) from the initial position 0 will be equal to the square root of N after N moves have been made.

For those that don't know and are interested: http://en.wikipedia.org/wiki/Random_walk.

What fascinated me for some reason was that this expected distance <D> becomes ever greater the more moves are made. For some reason I had been thinking that the more moves were made, the more likely the person would be at 0.

While I was thinking about this, the ordinary coin-flipping game came to mind, and I perceived an analogy. The more coins you flip, the more likely it is that the fraction of tails you get will be close to 1/2, which is the probability of getting tails. However, as the fraction of tails comes closer to 1/2, the difference between the number of heads and the number of tails on average becomes bigger. Like this: in 10 coin flips with 4/10 tails and 6/10 heads, the difference is only 2, but the fraction of tails is 4/10. Compare that to 496333/1000000 tails and 503667/1000000 heads: the fraction of tails is much closer to 1/2, but the difference between the number of tails and heads is several thousand. So on average you will see a much greater difference between the number of heads and tails the more you throw.
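[Editor's note: the two trends described above, the fraction of tails tightening around 1/2 while the average head-tail gap grows, can be checked with a quick simulation. This sketch is an illustration added to the thread; the helper name flip_stats is made up.]

```python
# Illustrative sketch (not part of the original post): simulate many runs of
# n coin flips and track two averages: the fraction of tails, and the
# absolute difference |heads - tails|.
import random

def flip_stats(n_flips, n_trials=2000, seed=0):
    """Return (mean fraction of tails, mean |heads - tails|) over n_trials runs."""
    rng = random.Random(seed)
    frac_sum = 0.0
    diff_sum = 0.0
    for _ in range(n_trials):
        tails = sum(rng.randint(0, 1) for _ in range(n_flips))
        heads = n_flips - tails
        frac_sum += tails / n_flips
        diff_sum += abs(heads - tails)
    return frac_sum / n_trials, diff_sum / n_trials

for n in (10, 100, 1000):
    frac, diff = flip_stats(n)
    # The fraction hugs 1/2 ever more tightly, while the average gap grows.
    print(f"n={n}: fraction of tails ~ {frac:.3f}, average |heads - tails| ~ {diff:.1f}")
```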
This is my question:
Isn't the average difference the same as the expected value <D> of the random walk?

Thanks for allowing me to share my experience.
 
Hello QED-Kasper! :wink:
QED-Kasper said:
Isn't the average difference the same as the expected value <D> of the random walk?

That's right! :smile:

The coin-difference after n flips and the walk-distance after n steps (in 1D) have the same distribution … each process is a model for the other, and in particular, they have the same expected values.
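[Editor's note: to make this equivalence concrete, count each head as +1 and each tail as -1; the walker's final displacement is then exactly heads minus tails, so the two quantities are the same random variable. The sketch below, an added illustration with hypothetical function names, compares the two empirically.]

```python
# Sketch: the coin-difference and the 1D walk endpoint are the same random
# variable once heads -> +1 and tails -> -1.
import random
from collections import Counter

def coin_difference(n, rng):
    heads = sum(rng.randint(0, 1) for _ in range(n))
    return abs(heads - (n - heads))      # |heads - tails|

def walk_distance(n, rng):
    position = sum(rng.choice((-1, 1)) for _ in range(n))
    return abs(position)                 # distance from the origin

rng = random.Random(42)
n, trials = 20, 5000
coins = Counter(coin_difference(n, rng) for _ in range(trials))
walks = Counter(walk_distance(n, rng) for _ in range(trials))
# The two empirical distributions should agree up to sampling noise.
print(sorted(coins.items()))
print(sorted(walks.items()))
```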
 
Thanks, I appreciate that. And thank you for being extra kind :).
 
The coin-difference after n flips and the walk-distance after n steps (in 1D) have the same distribution … each process is a model for the other, and in particular, they have the same expected values.

I would like to point out two things here:

  1. "Walk-distance" D (generally referred to as "distance from the origin" in the theory of random walks) is defined to be the absolute difference between the number of 'heads' and the number of 'tails' in a (Bernoulli) sequence of 'coin flips,' while the terms "expected value" and "average" have precisely the same meaning. So, "the average difference between heads and tails" and "the expected value of D" are just two ways of saying exactly the same thing. (Stating that "each process is a model for the other" having "the same distribution" and "the same expected values" obscures the fact that they are one and the same process.)
  2. The lecture on Probability in The Feynman Lectures on Physics Volume I, as well as the lecture that precedes it on Time and Distance, were written and delivered by Matthew Sands - Feynman had nothing to do with them (he was called unexpectedly out of town that week).
Mike Gottlieb
Editor, The Feynman Lectures on Physics, Definitive Edition
---
www.feynmanlectures.info
 
Thanks codelieb. I have to add, though, that I misread the text. In it, Sands only mentions the expected distance, which is also known in statistics as the mean absolute deviation (MAD). The "square-root of N rule" applies to the RMS (root mean square) distance, i.e. the standard deviation, and that is what is actually being described in the text.
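[Editor's note: this distinction can be checked numerically. For an N-step walk the RMS distance is exactly sqrt(N), while the mean absolute distance is approximately sqrt(2N/pi) for large N. The sketch below is an added illustration; the helper name simulate_distances is made up.]

```python
# Sketch checking the MAD-vs-RMS correction: for an N-step +/-1 walk,
# RMS distance = sqrt(N) exactly, and MAD ~ sqrt(2N/pi) for large N.
import math
import random

def simulate_distances(n_steps, n_trials=20000, seed=0):
    """Return (empirical RMS distance, empirical mean absolute distance)."""
    rng = random.Random(seed)
    dists = [abs(sum(rng.choice((-1, 1)) for _ in range(n_steps)))
             for _ in range(n_trials)]
    rms = math.sqrt(sum(d * d for d in dists) / n_trials)
    mad = sum(dists) / n_trials
    return rms, mad

n = 100
rms, mad = simulate_distances(n)
print(f"RMS: {rms:.2f}  (theory: sqrt(N) = {math.sqrt(n):.2f})")
print(f"MAD: {mad:.2f}  (theory: ~sqrt(2N/pi) = {math.sqrt(2 * n / math.pi):.2f})")
```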
 