Probability of Finding a Random Walker in D Dimensions After N Steps

Summary
The discussion revolves around calculating the probability of finding a random walker at a specific site in D dimensions after N steps, starting from the origin. The initial attempts misapply the Markov property, tracking only a single prior state instead of summing over all 2D neighbouring sites at the previous step. Participants suggest using combinatorial methods to count the number of ways to reach the target coordinates and emphasize the importance of correctly partitioning the steps across dimensions. A key point is the realization that the number of ways to reach a position depends on the total number of steps taken and on how those steps are distributed among the dimensions. The conversation concludes with a reference to a classic probability text (Feller) for further guidance.
fluidistic

Homework Statement


Hi guys, I'm absolutely desperate on the following problem:
Consider a random walker who can make steps only to neighboring sites in D dimensions, where D is an arbitrary natural number. Assume that the distance between two adjacent sites is the same everywhere and that the probability for the walker to step to any adjacent site is the same. Let \vec r be a particular site and N the number of steps the walker has taken.
What is the probability of finding the random walker at \vec r after N steps if it starts at the origin?
Answer this question using:
1)the assumption that the walk can be described as a Markov chain in D dimensions.
2)the distribution function of the displacement of the walker after N independent steps, \vec r = \vec r _1 + ... + \vec r _N.

Homework Equations


Probably a lot! Not really sure.

The Attempt at a Solution


I tried part 1) so far but did not go far at all.
I know that the probability of choosing any particular adjacent site is \frac{1}{2D}, where D is the dimension of the space the walker walks in.
So they are asking me for P_N(\vec r). I'll use the notation \vec r=r_1 \hat x_1 +... + r_D \hat x_D; that's the position of the walker after N steps.
I have the initial condition that the walker starts at the origin: P_0(\vec r )=\delta_{\vec r, \vec 0}. Since I'm given an initial condition on P_N(\vec r ) and I'm asked to find P_N (\vec r ), I can "smell" that I'll have to solve a differential equation or something like that, but I've no idea how to find it.
Now I know that for a Markovian process, the probability that the walker is at \vec r at step N does not depend on its past but only on its present, namely only on the "state" P_{N-1} (\vec l ), where \vec l is the site of the walker prior to \vec r.
If I'm not wrong, then, I guess P_N (\vec r)=\frac{1}{2D} P_{N-1} (\vec l ).
But then if I take P_{N-1} (\vec l ) it will depend only on the previous state. In the end I get the wrong result that P_N ( \vec r ) = \left ( \frac{1}{2D} \right ) ^{N} P_0 (0)=\left ( \frac{1}{2D} \right ) ^{N}.
I know this result is wrong but I don't know what I'm doing wrong. Any help will be appreciated.
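As a quick sanity check on why it must be wrong: simulating the walk directly (a rough sketch, not part of the assignment; `simulate` is just an illustrative name) shows that the empirical end-site frequencies clearly depend on \vec r, which a constant \left(\frac{1}{2D}\right)^N cannot reproduce.

Code:
import random
from collections import Counter

def simulate(D, N, trials=100_000):
    """Empirical distribution of the end site after N equally likely nearest-neighbour steps."""
    counts = Counter()
    for _ in range(trials):
        pos = [0] * D
        for _ in range(N):
            axis = random.randrange(D)           # pick one of the D axes
            pos[axis] += random.choice((-1, 1))  # step +1 or -1 along it (2D choices overall)
        counts[tuple(pos)] += 1
    return {site: c / trials for site, c in counts.items()}

freq = simulate(D=2, N=4)
# The frequencies differ strongly between sites, so P_N(r) must depend on r.
print(freq.get((0, 0), 0.0), freq.get((2, 2), 0.0))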
 
Your method didn't work because it only considered one possible prior state. There are 2D to sum over.
You could try assessing the number of ways N can be partitioned into the D dimensions, then how many ways each of those partitions can lead to the target coordinates. Or (almost the same?) get a recurrence relation based on number of steps available, number of dimensions, and remaining vector for those dimensions. I.e. knock off one dimension at a time.
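To make the "sum over the 2D prior states" idea concrete, here is a minimal Python sketch (illustrative only; `walk_distribution` is just a name chosen here) that propagates the whole distribution one step at a time. Each site hands \frac{1}{2D} of its probability to each of its 2D neighbours, which is the same as summing over the 2D possible prior states of a given site.

Code:
from collections import defaultdict

def walk_distribution(D, N):
    """Exact P_N(site) for the symmetric nearest-neighbour walk started at the origin."""
    prob = {tuple([0] * D): 1.0}                    # P_0: all probability at the origin
    for _ in range(N):
        nxt = defaultdict(float)
        for site, p in prob.items():
            for axis in range(D):
                for step in (-1, 1):                # the 2D equally likely neighbours
                    neighbour = list(site)
                    neighbour[axis] += step
                    nxt[tuple(neighbour)] += p / (2 * D)
        prob = dict(nxt)
    return prob

dist = walk_distribution(D=2, N=4)
# Unlike (1/2D)^N, the result depends on the site r (and is 0 at sites of the wrong parity).
print(dist[(0, 0)], dist[(2, 0)], dist.get((1, 0), 0.0))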
 
First of all, thanks for the reply.
haruspex said:
Your method didn't work because it only considered one possible prior state. There are 2D to sum over.
Hmm I don't really understand. Since it's a Markov process, the current state only depends on the last one, not on the previous ones.

You could try assessing the number of ways N can be partitioned into the D dimensions, then how many ways each of those partitions can lead to the target coordinates. Or (almost the same?) get a recurrence relation based on number of steps available, number of dimensions, and remaining vector for those dimensions. I.e. knock off one dimension at a time.
Hmm I see, but would that be for part 2)? I don't see what it has to do with Markov chains (I'm explicitly asked to solve the problem by considering the process as a Markov chain in part 1).

I made a mistake in my first post: I reach P(\vec r ,N \mid \vec 0, 0) = \frac{1}{2D} instead of \left ( \frac{1}{2D} \right ) ^{N}. That's still false because there's no dependence on \vec r while there should be.

Edit: apparently my result is wrong because I assumed the Markov chain is homogeneous, while it seems it isn't.
A homogeneous Markov chain has the property that P(X_k=i_k \mid X_{k-1} =i_{k-1} ) does not depend on k. Well, I'm confused: in this problem the Markov chain does seem homogeneous... and yet I reach a nonsensical answer.
 
Attempt for 2):
Let \vec r = \vec x_1 +... + \vec x_D, where x_i is the component of \vec r along the i-th dimension.
For any dimension i, there are {x_i \choose N} ways for the walker to get to \vec x_i in N steps, and the probability of choosing one particular way is \left ( \frac{1}{2D} \right ) ^N.
Since the position of the walker in any dimension has no influence whatsoever on its position along the other dimensions, I can safely write P _N ( \vec r ) = {x_1 \choose N} \cdots {x_D \choose N} \left ( \frac{1}{2D} \right ) ^N = \left ( \frac{1}{2D} \right ) ^N \prod _{i=1}^D \frac{N!}{(N-x_i )!x_i!}. Does this look correct? I tried D=1 and I think it looked right, at least to me. What do you think? This would be my answer for the probability of finding the walker at position \vec r after N steps.
 
fluidistic said:
Attempt for 2):
Let \vec r = \vec x_1 +... + \vec x_D, where x_i is the component of \vec r along the i-th dimension.
For any dimension i, there are {x_i \choose N} ways for the walker to get to \vec x_i in N steps, and the probability of choosing one particular way is \left ( \frac{1}{2D} \right ) ^N.
Since the position of the walker in any dimension has no influence whatsoever on its position along the other dimensions, I can safely write P _N ( \vec r ) = {x_1 \choose N} \cdots {x_D \choose N} \left ( \frac{1}{2D} \right ) ^N = \left ( \frac{1}{2D} \right ) ^N \prod _{i=1}^D \frac{N!}{(N-x_i )!x_i!}. Does this look correct? I tried D=1 and I think it looked right, at least to me. What do you think? This would be my answer for the probability of finding the walker at position \vec r after N steps.

This is incorrect: for N = x_1 there is just 1 way to get from 0 to x_1 in N steps; for N = x_1 + 1, there is no way to get from 0 to x_1 in N steps; for N = x_1 + 2, there are 2 ways to get from 0 to x_1 in N steps, etc.

RGV
 
Ray Vickson said:
This is incorrect: for N = x_1 there is just 1 way to get from 0 to x_1 in N steps; for N = x_1 + 1, there is no way to get from 0 to x_1 in N steps; for N = x_1 + 2, there are 2 ways to get from 0 to x_1 in N steps, etc.

RGV
Whoops, I've got some LaTeX typos: all my {n \choose k} should be {k \choose n}. Does this fix the problem?
 
fluidistic said:
Since it's a Markov process, the current state only depends on the last one, not on the previous ones.
Quite so, but you seem to be confusing "the probability of being at a certain point after N steps given it was at a certain neighbour after N-1" with "the probability of being at a certain point after N steps (given only that it was at the origin at step 0)".
Isn't it clear that P_{N}(\vec{r})=\frac{1}{2D}\sum_{\vec{l} \in A(\vec{r})}P_{N-1}(\vec{l}), where A(\vec{r}) is the set of neighbours of \vec r?
In the OP, you don't say you have to "use" Markov chains, merely that you can assume it behaves like one. I don't know why they tell you that since it clearly does from the preceding description.

Wrt your later post, I don't understand how you got {x_i \choose N}. Maybe you meant {N \choose x_i}, but I still don't see it. To reach x_i in the i-direction there must be s steps in the 'wrong' direction and s+x_i in the 'right' direction for some s < N. But the larger s is for one dimension, the less is left for the other dimensions, so it gets quite messy.

Try this: start by considering the case where N is only just large enough to reach r, i.e. no steps in the 'wrong' direction in any dimension. That's fairly easy. Then figure out the multiplier that arises from having spare steps, i.e. a larger N for the same r. Note that an odd number of spares will give you zero!
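To see the parity point numerically, here is a small brute-force sketch (illustrative only; `brute_force_prob` is just a name chosen here) that enumerates every step sequence for a small N: whenever the number of spare steps N - (|r_1| + ... + |r_D|) is odd, the probability comes out exactly zero.

Code:
from itertools import product

def brute_force_prob(D, N, r):
    """P_N(r) by enumerating all (2D)^N equally likely step sequences (small N only)."""
    moves = []
    for axis in range(D):
        for sign in (-1, 1):
            m = [0] * D
            m[axis] = sign
            moves.append(tuple(m))
    hits = 0
    for seq in product(moves, repeat=N):
        pos = tuple(sum(step[i] for step in seq) for i in range(D))
        hits += (pos == tuple(r))
    return hits / (2 * D) ** N

# Spare steps = N - (|r_1| + ... + |r_D|): an odd number of spares forces probability 0.
print(brute_force_prob(2, 3, (1, 0)))   # 2 spare steps -> nonzero
print(brute_force_prob(2, 4, (1, 0)))   # 3 spare steps -> 0.0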
 
haruspex said:
Quite so, but you seem to be confusing "the probability of being at a certain point after N steps given it was at a certain neighbour after N-1" with "the probability of being at a certain point after N steps (given only that it was at the origin at step 0)".
Isn't it clear that P_{N}(\vec{r})=\frac{1}{2D}\sum_{\vec{l} \in A(\vec{r})}P_{N-1}(\vec{l}), where A(\vec{r}) is the set of neighbours of \vec r?
In the OP, you don't say you have to "use" Markov chains, merely that you can assume it behaves like one. I don't know why they tell you that since it clearly does from the preceding description.

Wrt your later post, I don't understand how you got {x_i \choose N}. Maybe you meant {N \choose x_i}, but I still don't see it. To reach x_i in the i-direction there must be s steps in the 'wrong' direction and s+x_i in the 'right' direction for some s < N. But the larger s is for one dimension, the less is left for the other dimensions, so it gets quite messy.

Try this: start by considering the case where N is only just large enough to reach r, i.e. no steps in the 'wrong' direction in any dimension. That's fairly easy. Then figure out the multiplier that arises from having spare steps, i.e. a larger N for the same r. Note that an odd number of spares will give you zero!
Ok, thanks guys. I'll concentrate for now on part 2).
I totally overlooked that taking more steps along one dimension leaves fewer for the others, since the total has to be N...
 
fluidistic said:
Whoops, I've got some LaTeX typos: all my {n \choose k} should be {k \choose n}. Does this fix the problem?

Sorry: what I said above is incorrect. For n = x+2 there are not just 2 ways to get from 0 to x in n steps; there are C(n,n-1) = C(n,1) = n ways. In general, the number of ways of getting from 0 to x in n steps is
N_{n,x} = C(n, (n+x)/2) = {n \choose (n+x)/2},
where it is understood that this equals 0 unless (n+x)/2 is an integer between 0 and n inclusive.

All this is treated beautifully in the marvelous book by W. Feller "An Introduction to Probability Theory and its Applications", Vol. I (Wiley, 1968), pp. 73-75 (an oldie, but an unbeatable classic).

RGV
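For what it's worth, a short check of that count against direct enumeration (an illustrative sketch; `count_paths` and `count_formula` are names chosen here, and `math.comb` assumes Python 3.8+):

Code:
from itertools import product
from math import comb

def count_paths(n, x):
    """Number of n-step walks with steps +1/-1 that start at 0 and end at x, by enumeration."""
    return sum(1 for seq in product((-1, 1), repeat=n) if sum(seq) == x)

def count_formula(n, x):
    """C(n, (n+x)/2), read as 0 unless (n+x)/2 is an integer between 0 and n inclusive."""
    if (n + x) % 2 != 0 or not 0 <= (n + x) // 2 <= n:
        return 0
    return comb(n, (n + x) // 2)

for n in range(1, 9):
    for x in range(-n, n + 1):
        assert count_paths(n, x) == count_formula(n, x)
print("C(n, (n+x)/2) matches brute-force enumeration for n up to 8")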
 
Ray Vickson said:
Sorry: what I said above is incorrect. For n = x+2 there are not just 2 ways to get from 0 to x in n steps; there are C(n,n-1) = C(n,1) = n ways. In general, the number of ways of getting from 0 to x in n steps is
N_{n,x} = C(n, (n+x)/2) = {n \choose (n+x)/2},
where it is understood that this equals 0 unless (n+x)/2 is an integer between 0 and n inclusive.

All this is treated beautifully in the marvelous book by W. Feller "An Introduction to Probability Theory and its Applications", Vol. I (Wiley, 1968), pp. 73-75 (an oldie, but an unbeatable classic).

RGV
Ok thank you infinitely for the information. I am going to take a look at this reference and post back in 1 or a few days.
 
