
Winning and losing teams probability distribution

  • #1
Suppose two teams play a series of games, each producing a winner and a loser, until one team has won two more games than the other. Let Y be the total number of games played. Assume each team has a chance of 0.5 to win each game, independent of results of the previous games. Find the probability distribution of Y.



The Attempt at a Solution



At first I thought I'd say that given that team A has won X games and team B has won Z games, then Y = X + Z, and since the winner must finish two games ahead, Z = X + 2, so Y = 2X + 2. The probability function would then be P(Y ≤ y) = P(Y ≤ 2X + 2), but I am not sure where to go from here... I was also thinking that it may be binomial, with (y choose 2x+2)(0.5)^(2x+2)(0.5)^y, where x = 1, 2, 3, ..., y.

Any input is appreciated!
 

Answers and Replies

  • #2
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed
Have you studied Markov chains yet? If so, it is a simple exercise to get the generating function of Y, hence to find the distribution P{Y = n}, n = 2,3,4,... .

Here is a hint: let X = (#won by A) − (#won by B). We start at X = 0; X increases by 1 if A wins and decreases by 1 if B wins, and the game stops when X reaches ±2. So we have a simple 4-state Markov chain with states X ∈ {0, 1, -1, 'stop'}, and Y is the first-passage time from state 0 to state 'stop'.
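This hint can be checked numerically even without the generating-function machinery. A sketch (the state ordering and variable names are my own choices), which iterates the transition matrix of the 4-state chain and reads off the newly absorbed probability at each step:

```python
import numpy as np

# States of the chain: index 0 -> X = 0, 1 -> X = +1, 2 -> X = -1, 3 -> 'stop'.
# Each game moves X by +/-1 with probability 0.5; 'stop' is absorbing.
P = np.array([
    [0.0, 0.5, 0.5, 0.0],   # from X = 0
    [0.5, 0.0, 0.0, 0.5],   # from X = +1 (another A win ends the series)
    [0.5, 0.0, 0.0, 0.5],   # from X = -1 (another B win ends the series)
    [0.0, 0.0, 0.0, 1.0],   # 'stop' stays 'stop'
])

dist = np.array([1.0, 0.0, 0.0, 0.0])   # start at X = 0
absorbed_so_far = 0.0
probs = {}                               # probs[n] = P{Y = n}
for n in range(1, 13):
    dist = dist @ P
    probs[n] = dist[3] - absorbed_so_far  # newly absorbed mass at step n
    absorbed_so_far = dist[3]

for n, p in probs.items():
    if p > 0:
        print(n, p)   # prints 2 0.5, then 4 0.25, 6 0.125, ...
```

The output suggests P{Y = 2k} = (1/2)^k for k = 1, 2, 3, ... (Y is always even): every two games the score either ties again, with probability 1/2, or the series ends.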

RGV
 
  • #3
No, we haven't done Markov Chains yet...
What do you mean by "first passage time"?
 
  • #4
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed
The first-passage time from any state i to state 'stop' is the first time (t = 1, 2, 3, ...) at which state 'stop' is reached. It is exactly the Y that you seek. (Calling it a first-passage time lets you do a meaningful Google search on that term to find out more about it.)

RGV
 
