Probability of Mark Ending Up with Ball in Children's Catch Game

  • Thread starter: crazy_craig
  • Tags: Probability
In summary, the probability that Mark will end up with the ball, given that Dick has the ball at the beginning of the game, is 2/5.
  • #1
crazy_craig

Homework Statement


Six children (Dick, Helen, Joni, Mark, Sam, and Tony) play
catch. If Dick has the ball he is equally likely to throw it to Helen,
Mark, Sam, and Tony. If Helen has the ball she is equally likely to
throw it to Dick, Joni, Sam, and Tony. If Sam has the ball he is
equally likely to throw it to Dick, Helen, Mark, and Tony. If either
Joni or Tony gets the ball, they keep throwing it to each other. If
Mark gets the ball he runs away with it. (a) Find the transition
probability and classify the states of the chain. (b) Suppose Dick
has the ball at the beginning of the game. What is the probability
Mark will end up with it?

I only need help with (b). I created the transition matrix as below.

Homework Equations



So, by "ends up", I'm assuming we want lim_{n→∞} P^n(x,y) = π(y), but I don't
think this chain converges (raising the transition matrix to high powers suggests it doesn't), so there must be another way to solve this. It seems easy and I'm probably overlooking something. I've spent way too long on this, so I'm asking for some help and a nudge in the right direction.

Thank you very much!

The Attempt at a Solution

\begin{array}{ccccccc}
&D & H & J & M & S & T \\
D& 0&.25&0&.25&.25&.25 \\
H&.25&0&.25&0&.25&.25\\
J&0&0&0&0&0&1\\
M&0&0&0&1&0&0\\
S&.25&.25&0&.25&0&.25\\
T&0&0&1&0&0&0\\
\end{array}
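As a quick sanity check, the matrix can be entered in Python (an editorial sketch, assuming numpy is available) and each row verified to sum to 1:

```python
import numpy as np

# Transition matrix in the state order D, H, J, M, S, T (as above)
P = np.array([
    [0.00, 0.25, 0.00, 0.25, 0.25, 0.25],  # D throws to H, M, S, T
    [0.25, 0.00, 0.25, 0.00, 0.25, 0.25],  # H throws to D, J, S, T
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],  # J always throws to T
    [0.00, 0.00, 0.00, 1.00, 0.00, 0.00],  # M keeps the ball (absorbing)
    [0.25, 0.25, 0.00, 0.25, 0.00, 0.25],  # S throws to D, H, M, T
    [0.00, 0.00, 1.00, 0.00, 0.00, 0.00],  # T always throws to J
])

# Every row of a valid transition matrix must sum to 1
print(P.sum(axis=1))  # [1. 1. 1. 1. 1. 1.]
```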
 
  • #2

I don't know if you have taken "first-passage probabilities" yet, but that is what is involved in this problem. In case you have not yet seen this material, let me explain briefly.

In the long run, the system either ends up in the set of states {J,T} or the single state {M}. That is, eventually, D, H and S are left forever, never to be re-visited. Furthermore, if state T (or J) is ever visited, states J and T together absorb the process; if state M is ever visited the state remains at M forever after. So, we want the probability that, starting from state D, the process eventually reaches state M. In this case, the limiting probabilities lim_{n}P^n(D,J) and lim_{n} P^n(D,T) do not exist because the states J and T are of period 2 (so for large n, these two probabilities just alternate between two fixed values). However, lim_{n} P^n(D,M) does exist, and that is what you are asked to find.

Here is a hint about how to do it: let f(x) = lim_{n} P^n(x,M) for x = D, H and S. We want f(D), but to find it we also need f(H) and f(S). We get them from three simple linear equations in the three unknowns f(D), f(H) and f(S). Think about starting from D and taking one step: in order to reach M from D, we need to reach M from wherever we end up after one step away from D. Similarly for starting from H and S. (Alternatively, you could just look at the limit of P^n(D,M) if you have some good way of taking matrix powers.)
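That one-step conditioning can be solved mechanically. A minimal Python sketch (an editorial addition, assuming numpy) using the standard absorbing-chain decomposition:

```python
import numpy as np

# Transition matrix in state order D, H, J, M, S, T (from post #1)
P = np.array([
    [0.00, 0.25, 0.00, 0.25, 0.25, 0.25],
    [0.25, 0.00, 0.25, 0.00, 0.25, 0.25],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
    [0.00, 0.00, 0.00, 1.00, 0.00, 0.00],
    [0.25, 0.25, 0.00, 0.25, 0.00, 0.25],
    [0.00, 0.00, 1.00, 0.00, 0.00, 0.00],
])

# f(x) = P(eventually reach M | start at x), with f(M) = 1 and
# f(J) = f(T) = 0 (the J-T pair never releases the ball).
# One-step conditioning gives f = Q f + r on the transient states,
# i.e. (I - Q) f = r.
transient = [0, 1, 4]                    # indices of D, H, S
Q = P[np.ix_(transient, transient)]      # moves among D, H, S
r = P[transient, 3]                      # one-step jump into M
f = np.linalg.solve(np.eye(3) - Q, r)
print(f[0])                              # f(D) = 0.4, i.e. 2/5
```

By the D/S symmetry of the throwing rules, f(D) = f(S) = 2/5 and f(H) = 1/5, which the solver reproduces.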

RGV
 
  • #3
Thanks for the quick reply!

Most of what you said makes sense, and I used a calculator to find the limit of P^n(D,M), which is 2/5, but this is a little confusing to me:

"Think about starting from D and taking one step. In order to reach M from D, we need to reach M from wherever we end up after one step away from D.",

and I'm not sure how to set up the system of linear equations.
 
  • #4
Think of setting up the equations for limiting probabilities, except that now they can depend on the starting state. But still, P^(n+1) = P·P^n, and you can look at what this becomes as n → ∞ (or at least, for those components that do have limits). Here, we just want to look at column M of P^∞.
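For concreteness, here is the matrix-power check (an editorial sketch, assuming numpy). Note that column M converges even though columns J and T oscillate with period 2:

```python
import numpy as np

# Transition matrix in state order D, H, J, M, S, T
P = np.array([
    [0.00, 0.25, 0.00, 0.25, 0.25, 0.25],
    [0.25, 0.00, 0.25, 0.00, 0.25, 0.25],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
    [0.00, 0.00, 0.00, 1.00, 0.00, 0.00],
    [0.25, 0.25, 0.00, 0.25, 0.00, 0.25],
    [0.00, 0.00, 1.00, 0.00, 0.00, 0.00],
])

Pn = np.linalg.matrix_power(P, 200)       # P^200
print(Pn[0, 3])                           # row D, column M: -> 0.4

# Columns J and T never settle: the J entry swings between roughly
# 0.6 (even n) and 0 (odd n), since J is only reachable from D in an
# even number of steps.
print(np.linalg.matrix_power(P, 201)[0, 2], Pn[0, 2])
```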

RGV
 
  • #5
I finally got it. Thanks a ton, Ray!
 

1. What is a Markov chain?

A Markov chain is a mathematical model of a system that moves randomly between a set of states over time, where the probability of each transition depends only on the current state. It is a type of stochastic process, meaning that it involves random or probabilistic elements.

2. How is a Markov chain different from other types of probability models?

Unlike models that treat each observation as independent, such as repeated draws from a binomial or normal distribution, a Markov chain lets the next outcome depend on the current state. Beyond that, however, it is memoryless: the next state depends only on the current state, not on how the system arrived there.

3. What is a stationary distribution in a Markov chain?

A stationary distribution is the long-run probability distribution of a Markov chain: the probabilities of being in each state after a large number of transitions. It is the equilibrium state of the system. For an irreducible, aperiodic chain this limit exists, is unique, and is reached from any starting state; for periodic chains, as in the thread above, some entries of P^n oscillate instead of converging.
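For illustration, the stationary distribution of a hypothetical two-state chain can be found by solving πP = π together with the normalization π₁ + π₂ = 1 (a Python sketch, assuming numpy; the chain itself is made up for the example):

```python
import numpy as np

# Hypothetical two-state weather chain (states: sunny, rainy)
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Solve pi @ P = pi subject to pi.sum() == 1. Stack the balance
# equations (P.T - I) pi = 0 with one extra row enforcing the
# normalization, then solve by least squares.
A = np.vstack([P.T - np.eye(2), np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # [0.8333... 0.1666...], i.e. (5/6, 1/6)
```

In the long run this chain is sunny 5/6 of the time regardless of the weather on day one.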

4. How are Markov chains used in real-world applications?

Markov chains are commonly used in fields such as finance, genetics, and natural language processing. They can be used to model and predict the behavior of complex systems, such as stock prices, genetic mutations, and language patterns.

5. Can a Markov chain be used to predict the future?

Yes, a Markov chain can be used to make predictions about the future state of a system. However, these predictions are only as accurate as the data and assumptions used to create the model. Small errors or changes in the initial conditions can greatly affect the accuracy of the predictions.
