Hitting time of two or three parallel random walks

opapa
Consider a one-dimensional random walk on the integer lattice, starting at 0. Each step is +1 or -1 with equal probability 0.5. The expected time for the walk to reach either -α or +α (the exit time from the symmetric interval, also called the first-passage or hitting time) is α², i.e. \Theta(\alpha^2); this follows from optional stopping applied to the martingale S_t^2 - t. More generally, given two boundaries α < 0 < β, the expected time for the walk to reach either α or β is |α|·β.
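
To make the second formula concrete, here is a minimal Monte Carlo sketch (the boundary values -3 and +5 are just example values I picked):

```python
import random

def exit_time(alpha, beta):
    """Run a simple symmetric walk from 0 until it hits alpha (< 0) or beta (> 0)."""
    x, t = 0, 0
    while alpha < x < beta:
        x += random.choice((-1, 1))
        t += 1
    return t

# With alpha = -3 and beta = 5 the predicted expected exit time is |alpha| * beta = 15.
trials = 100_000
print(sum(exit_time(-3, 5) for _ in range(trials)) / trials)  # should come out close to 15
```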

In my case, I have two or more independent random walks r1, r2, …, running in parallel, and I want the expected time until at least one of the walks ri reaches its boundary αi or βi (note that the boundaries αi and βi can differ from walk to walk). My initial intuition was that, since the walks are independent, I could compute the hitting time of each walk separately and take the minimum of those expectations, but a computer simulation showed this to be wrong.
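
For concreteness, here is a sketch of the kind of simulation I mean (the boundaries -3/+3 for the first walk and -4/+4 for the second are just example values, giving individual expectations 9 and 16):

```python
import random

def exit_time(alpha, beta):
    x, t = 0, 0
    while alpha < x < beta:
        x += random.choice((-1, 1))
        t += 1
    return t

trials = 100_000
# Walk 1 exits (-3, 3): expected time 9.  Walk 2 exits (-4, 4): expected time 16.
avg_min = sum(min(exit_time(-3, 3), exit_time(-4, 4)) for _ in range(trials)) / trials
print(avg_min)  # comes out below min(9, 16) = 9
```

The average of min(T1, T2) comes out below min(9, 16) = 9, so taking the minimum of the two expectations overestimates the answer.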

Any pointers?
Thanks
 
opapa said:
My initial intuition was that, since these random walks are independent, I could find the hitting time for each random walk and get the minimum

If we throw two fair dice, is it intuitive that the expected minimum of the two dice would be the minimum of their two expected values? If it were, then the expected minimum from throwing two dice would be the same as the expected minimum from throwing one die, since we'd take the minimum of two equal numbers.
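
For concreteness, the exact expected minimum of two fair dice can be computed by brute force over the 36 equally likely outcomes; it is 91/36 ≈ 2.53, well below 3.5:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))             # all 36 equally likely pairs
print(sum(min(a, b) for a, b in outcomes) / len(outcomes))  # 91/36 ≈ 2.53, vs. min of the expectations = 3.5
```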
 
Thanks for the response. You are right, of course, that I cannot deduce this from the expected values of the individual walks; I need to compute the expected value of the minimum from scratch.

Trying to recompute the hitting time for two random walks RW1 and RW2 (along the lines of the proof of Lemma 0.2 from http://www.cs.cornell.edu/courses/cs4850/2010sp/Scribe%20Notes%5CLecture18.pdf ), I realized that it gets very complex. The main difference is that I am no longer interested in reaching a single point for the first time, but a set of points, e.g., the states where RW1 has reached α1 or β1, or RW2 has reached α2 or β2. Furthermore, one can reach a point (RW1=i, RW2=j) along paths whose last step is not necessarily from (RW1=i-1, RW2=j-1). Therefore, it is not clear to me how the starting equation in Lemma 0.2 should be adapted to two dimensions.
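
One numerical way to make the two-dimensional version concrete (just a sketch with made-up boundaries, not a closed form): let h(i, j) be the expected remaining time when RW1 = i and RW2 = j, with h = 0 as soon as either walk sits on one of its boundaries. Since both walks step simultaneously, every interior state satisfies h(i, j) = 1 + (1/4)[h(i+1, j+1) + h(i+1, j-1) + h(i-1, j+1) + h(i-1, j-1)], which is a finite linear system:

```python
import numpy as np

# Made-up boundaries: RW1 stops at -3 or +3, RW2 stops at -4 or +4.
a1, b1 = -3, 3
a2, b2 = -4, 4

# Interior states (i, j): neither walk has reached a boundary yet.
states = [(i, j) for i in range(a1 + 1, b1) for j in range(a2 + 1, b2)]
idx = {s: k for k, s in enumerate(states)}

A = np.eye(len(states))
rhs = np.ones(len(states))
for (i, j), k in idx.items():
    for di in (-1, 1):
        for dj in (-1, 1):
            nb = (i + di, j + dj)
            if nb in idx:            # neighbour is still interior
                A[k, idx[nb]] -= 0.25
            # neighbours where a boundary has been hit contribute h = 0

h = np.linalg.solve(A, rhs)
print(h[idx[(0, 0)]])  # expected time until the first of the two walks exits
```

This avoids reasoning about the last step of the path altogether, at the cost of solving a system whose size grows with the product of the interval lengths.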

Is there any other direction that does not have these issues?
 
Finding the distribution of the minimum of two independent random variables falls under the heading of "order statistics". The distribution of the minimum depends only on the distributions of the random variables involved. http://en.wikipedia.org/wiki/Bapat–Beg_theorem
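
Concretely, for independent hitting times T1, T2, … the survival functions multiply, P(min_i Ti > t) = Π_i P(Ti > t), and since the Ti are nonnegative integers, E[min_i Ti] = Σ_{t ≥ 0} Π_i P(Ti > t). Each P(Ti > t) can be obtained by propagating that walk's distribution with absorbing boundaries; a small sketch with example boundary values:

```python
import numpy as np

def survival(alpha, beta, t_max):
    """P(T > t), t = 0..t_max-1, for the exit time T of one walk from (alpha, beta), started at 0."""
    size = beta - alpha + 1
    p = np.zeros(size)
    p[-alpha] = 1.0                 # position 0 sits at index -alpha
    surv = []
    for _ in range(t_max):
        surv.append(p[1:-1].sum())  # probability of still being strictly inside the interval
        q = np.zeros(size)
        q[:-2] += 0.5 * p[1:-1]     # interior mass steps left
        q[2:] += 0.5 * p[1:-1]      # interior mass steps right
        p = q                       # absorbed mass is dropped; only interior mass matters here
    return np.array(surv)

t_max = 2000                        # truncation horizon; ample for small intervals
s1 = survival(-3, 3, t_max)         # example boundaries for walk 1 (E[T1] = 9)
s2 = survival(-4, 4, t_max)         # example boundaries for walk 2 (E[T2] = 16)
print(s1.sum(), s2.sum())           # ~9 and ~16, as a sanity check
print((s1 * s2).sum())              # E[min(T1, T2)]
```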

If you have a more complicated question that is particular to graph theory, I think you should post it in a new thread and put something indicating graph theory in the title.
 
Thanks! The theorem seems relevant. I will have a look at it, and, if needed, I will initiate a new thread as suggested.
 