
Tennis probabilities

  1. Aug 28, 2009 #1

    There are two tennis players. Player A wins a point 50% of the time (against any rival), while Player B wins a point 60% of the time against any rival. That is P(A) = 0.50 and P(B) = 0.60.

    If Player A plays with Player B, what is the probability that Player A beats Player B, i.e. P(A, B) and the probability that Player B beats Player A, P(B, A) ?

    I am not sure how I should simulate this problem.


  3. Aug 28, 2009 #2
    Intuitively, since A is an "average" player and B beats the "average" player 60% of the time, perhaps B should win 60% of the time against A.

    But it doesn't necessarily work like that. For example, perhaps A and B are equally skilled when playing against people at a given experience level, except that B is better at taking advantage of the weaknesses of poor players than A is. That would give B a higher win percentage, without giving him any advantage against A.

    Conclusion: you just don't have enough information.
  4. Aug 30, 2009 #3
    Player B has a P(A)*P(B) / (P(A)*P(B) + (1-P(A))*(1-P(B))) = 60% chance of winning a point against player A.

The probability of B then winning a MATCH against A will be significantly higher, approaching 100% the longer the match is (best of three or five sets).
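To see how a per-point edge grows over longer units of play, here is a rough Monte Carlo sketch (mine, not from the thread or the book), assuming B wins each point against A independently with probability 0.6 and simulating single games with the standard deuce rule (first to 4 points, win by 2). The function names are my own choices:

```python
import random

def play_game(p, rng):
    """Simulate one tennis game: first to 4 points, win by 2 (deuce rule).

    p is the assumed probability that B wins any single point against A.
    Returns True if B wins the game."""
    a = b = 0
    while True:
        if rng.random() < p:
            b += 1
        else:
            a += 1
        if b >= 4 and b - a >= 2:
            return True
        if a >= 4 and a - b >= 2:
            return False

def estimate_game_win(p, trials=100_000, seed=1):
    """Monte Carlo estimate of B's chance of winning a single game."""
    rng = random.Random(seed)
    return sum(play_game(p, rng) for _ in range(trials)) / trials
```

With p = 0.6 per point this gives roughly a 74% chance of winning a game (the exact value is about 0.736); simulating whole sets and best-of-N matches would amplify the edge further, as the post above suggests.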
  5. Aug 30, 2009 #4
    Justify your claim. What assumptions are you making?
    Last edited: Aug 30, 2009
  6. Aug 31, 2009 #5
Can you please explain your answer?

    I am reading a book: Calculated Bets by Steven Skiena. The book explains mathematical modeling to beat the JAI ALAI betting system.
The author used the following function to get the answer to the above problem.

    [tex] P (A , B ) = \frac { 1 + [ P (A) - P(B) ]^ \alpha} {2} [/tex] if P(A) >= P(B)

    [tex] P (A , B ) = \frac { 1 - [ P (B) - P(A) ]^ \alpha} {2} [/tex] if P(A) <= P(B)

    The constant [tex] \alpha \ge 0 [/tex] is used as a fudge factor to tune the results of this function to the observed stats.
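For concreteness, the function above can be transcribed as a short sketch (my transcription, not Skiena's code; the function and argument names are mine):

```python
def p_beats(pa, pb, alpha=1.0):
    """Skiena-style head-to-head estimate from per-point averages pa, pb.

    Returns the estimated probability that the first player beats the
    second; alpha is the fudge factor tuned to observed statistics."""
    if pa >= pb:
        return (1 + (pa - pb) ** alpha) / 2
    return (1 - (pb - pa) ** alpha) / 2
```

With alpha = 1 and the OP's numbers, p_beats(0.5, 0.6) gives P(A, B) = 0.45, i.e. B beats A 55% of the time, which differs from the 60% claimed earlier in the thread; the fudge factor alpha is what reconciles the model with observed data.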
  7. Sep 1, 2009 #6
    I was using a Bayesian approach.

    Let's denote the probability that player [tex]X_1[/tex] wins against player [tex]X_2[/tex] :
    [tex]P(W | X_1, X_2) [/tex]

    We know the following:

    [tex]P(W | A) = \sum\limits_{X}P(W | A, X)P(X) = 0.5 [/tex] (called P(A) in posts above)
    [tex]P(W | B) = \sum\limits_{X}P(W | B, X)P(X) = 0.6 [/tex] (called P(B) in posts above)

    Furthermore, the chance of a random player beating another random player is 50%:
    [tex]P(W) = \sum\limits_{X_1}\sum\limits_{X_2}P(W, X_1, X_2)= 0.5 [/tex]
    due to symmetry.

    We want to calculate [tex]P(W | A, B) [/tex]. This can be done from assuming that
    [tex]P(W, X_1, X_2) = P(X_1 | W)P(X_2 | W)P(W)[/tex].

    [tex]P(W| A, B) = \frac{P(W, A, B)}{P(A, B)}[/tex]
    [tex] = \frac{P(A | W)P(B | W)P(W)}{P(A, B)}[/tex]
    [tex] = \frac{P(W|A)P(A) P(W|B)P(B) /P(W)}{(P(W|A)P(A) P(W|B)P(B) /P(W)) + P(L|A)P(A) P(L|B)P(B) /P(L)} [/tex]
    [tex] = \frac{P(W|A) P(W|B) } {(P(W|A) P(W|B)) + P(L|A)P(L|B) } [/tex]
    [tex] = \frac{0.5 * 0.6 } {0.5 * 0.6 + 0.5 * 0.4 } [/tex]
    [tex] = 0.6 [/tex]

    where [tex]P(L|A)=1-P(W|A)[/tex] and [tex]P(L|B)=1-P(W|B)[/tex] is the probability of losing.

    It all got a bit technical, sorry about that, but I can't find any more simple derivation...
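The arithmetic in the last steps can be checked with a few lines (just a numeric restatement of the final formula, nothing new):

```python
# P(W|A) and P(W|B) as given in the thread.
p_w_a, p_w_b = 0.5, 0.6              # P(W|A), P(W|B)
p_l_a, p_l_b = 1 - p_w_a, 1 - p_w_b  # P(L|A), P(L|B)

# P(W|A,B) = P(W|A) P(W|B) / (P(W|A) P(W|B) + P(L|A) P(L|B))
p = (p_w_a * p_w_b) / (p_w_a * p_w_b + p_l_a * p_l_b)
print(round(p, 6))  # 0.6
```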
    Last edited: Sep 1, 2009
  8. Sep 1, 2009 #7

    Thanks for the detailed explanation. However, I am stuck at the following equation.
    I would appreciate it if you could explain this to me.

    I don't understand how we can sum individual probabilities to get a collective probability of A winning against anybody, as the sum may be bigger than 1.

    Let us say, there are 4 players, A, B, X1 and X2, and P(W | A, X1) = 0.4, P(W | A, X2) = 0.7.

    Then P(W | A, X1) + P(W | A, X2) > 1

    IMHO, we should take an average (or weighted average) of individual winning probabilities to get a collective probability.
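The weighted-average idea can be written out with the hypothetical numbers from the example above (assuming the two opponents are equally common):

```python
# Combining per-opponent win probabilities as an expectation over X,
# not a plain sum. Numbers are the hypothetical ones from the example.
p_win = {"X1": 0.4, "X2": 0.7}  # P(W | A, X) for each opponent X
p_x = {"X1": 0.5, "X2": 0.5}    # P(X): assumed equal abundance

p_w_given_a = sum(p_win[x] * p_x[x] for x in p_win)
print(round(p_w_given_a, 6))  # 0.55
```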
  9. Sep 1, 2009 #8
    You're perfectly right, it doesn't make sense. I meant the expectation over [tex]X[/tex] rather than the plain sum:
    [tex]P(W | A) = \sum\limits_{X}P(W | A, X)P(X) = 0.5 [/tex]

    (I have corrected my original post)
  10. Sep 1, 2009 #9

    Thanks. I am still digesting your explanation. I am a bit confused with your notations. Could you explain what the following notations stand for?

    [tex] P(X) [/tex]

    [tex] P(W, X_1, X_2) [/tex] versus [tex] P(W | X_1, X_2) [/tex]

    [tex] P(X_1 | W) [/tex]

    [tex] P(W) [/tex]
  11. Sep 1, 2009 #10
    [tex] P(X) [/tex] Probability that any random player will have the value [itex]X[/itex]. If for instance [itex]X[/itex] is a number describing player's strength, there might be 10% of players with [itex]X=1[/itex], 40% with [itex]X=2[/itex] and 50% with [itex]X=3[/itex].

    Note that this probability is not actually used to calculate [itex] P(W | A, B) [/itex], all terms [itex] P(X) [/itex] cancel out along the way.

    [tex] P(W | X_1, X_2) [/tex] Probability of player [itex] X_1 [/itex] winning over player [itex] X_2 [/itex] (the vertical bar indicates the conditional probability of W (= winning), given players [itex] X_1 [/itex] and [itex] X_2 [/itex]).

    [tex] P(W, X_1, X_2) [/tex] Probability of player [itex] X_1 [/itex] winning over player [itex] X_2 [/itex] times relative abundances of players with [itex]X=X_1[/itex] and [itex]X=X_2[/itex], respectively.

    [tex] P(W) [/tex] Probability that any random player will win over another random player

    [tex] P(X_1 | W) [/tex] The probability that a random player [itex] X [/itex] has the value [itex] X = X_1 [/itex], given that he has just won a match.
  12. Sep 1, 2009 #11

    Thanks for bearing with me. I get all the notations except [tex] P(W, X_1, X_2) [/tex].
    Could you please explain that concept with an example?
  13. Sep 1, 2009 #12
    You have a notational problem. The events separated by commas after the conditional bar should commute, but in your notation P(W|A,B) is apparently not the same as P(W|B,A). You need to keep track of which player the W refers to. For example, you might say P(W(A) | A, B), where W(A) means "A wins," A means "A plays," and B means "B plays." It looks like the fact that you don't do this leads to some confusion later on.
  14. Sep 2, 2009 #13
    As mXSCNT points out, I have messed up the notation in my previous post (definitions of [itex]P(W |X_1)[/itex] and [itex]P(W |X_2)[/itex] are ambiguous). I have tried to make a more formal derivation below.


    Let's denote the probability that player [itex]A[/itex] wins against player [itex]B[/itex] :
    (1) [tex]P(W | A_1, B_2) [/tex]
    where the indices 1 and 2 only mark positions: W means the first-listed player wins over the second (and not the other way around)

    This implies that [itex]P(W | A_1, B_2)[/itex] is equal to the probability that player [itex]B[/itex] loses to player [itex]A[/itex]
    (2) [tex]P(W | A_1, B_2) = P(L | B_1, A_2) [/tex]

    One can define joint probability distributions over the space of all (pairwise) combinations of players and all outcomes (Win / Loss) by multiplying with [itex]P(A)P(B)[/itex]
    (3) [tex]P(W, A_1, B_2) = P(W | A_1, B_2) P(A) P(B)[/tex]

    One can also define marginal probability distributions by summing over all [itex]X_1[/itex] or [itex]X_2[/itex]
    (4) [tex]P(W , A_1) = \sum\limits_{X_2}P(W, A_1, X_2)[/tex]

    (5) [tex]P(W , B_2) = \sum\limits_{X_1}P(W, X_1, B_2)[/tex]

    (6) [tex]P(W) = \sum\limits_{X_1}\sum\limits_{Y_2}P(W, X_1, Y_2)[/tex]

    From equation (2) we can deduce also that [itex]P(W, A_1) =P(L, A_2)[/itex] and [itex]P(W , B_1) =P(L , B_2)[/itex].

    Finally, we can define the conditional probabilites:
    (7) [tex]P(W | A_1) P(A) = P(A_1|W) P(W) = P(W, A_1)[/tex]

    (8) [tex]P(W | B_2) P(B) = P(B_2|W) P(W) = P(W, B_2)[/tex]


    (9) [tex]P(W | B_1) = 0.6 [/tex]

    (10) [tex]P(W | A_2) = 0.5 [/tex]


    (11) [tex]P(W) = P(L)= 0.5 [/tex]
    The probability of a random player winning (or losing) against another random player is exactly 50%

    (12) [tex]P(W, B_1, A_2) = P(B_1 | W) P(A_2 | W) P(W) [/tex]

    This assumption can be justified by the maximum entropy principle, since it maximizes the Shannon entropy of [itex]P(O, B_1, A_2)[/itex] (O = W or L) given that we know [itex] P(B_1 | W) [/itex], [itex] P(A_2 | W) [/itex] and [itex] P(W) [/itex].


    We want to calculate
    [tex]P(W| B_1, A_2) = \frac{P(W, B_1, A_2)}{P(B) P(A)} [/tex]

    Inserting (12) gives
    [tex]P(W| B_1, A_2) = \frac{P(B_1 | W) P(A_2 | W) P(W)}{P(B) P(A)} [/tex]

    Using (7) and (8) we can finally write
    [tex]P(W| B_1, A_2) = \frac{P(W| B_1) P(W| A_2 ) }{P(W)} [/tex]

    Inserting (9) (10) and (11) gives
    [tex]P(W| B_1, A_2) = \frac{0.6 * 0.5 }{0.5} = 0.6[/tex]

    That is, the probability of player B beating player A is 60%
    Last edited: Sep 2, 2009
  15. Sep 3, 2009 #14

    Thanks a lot. Though your solution is not fully clear to me, I appreciate your efforts.
    I think I need to do more reading on the Bayesian probability theory.

  16. Sep 4, 2009 #15
    Then, over a long run of observation (after a large number of points played), for every 10 points played between A and B, Player A wins 5 points and Player B wins 6 points, i.e. 11 points out of every 10...
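The strange conclusion can be stated as a one-line check (my framing of the point above): if the OP is read as saying A wins 50% of points against *each* rival and B wins 60% against *each* rival, then for points played between A and B the two shares must sum to 1, but they don't:

```python
# Literal reading of the OP: fixed point-win shares against every rival.
share_a = 0.5  # fraction of points A wins against any opponent, including B
share_b = 0.6  # fraction of points B wins against any opponent, including A

# In a match between A and B every point is won by exactly one of them,
# so the two shares should sum to 1.
print(round(share_a + share_b, 6))  # 1.1
```

So the literal reading is inconsistent; as post #2 noted, the 50% and 60% only make sense as averages over a population of opponents.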

    Last edited: Sep 4, 2009
  17. Sep 5, 2009 #16
    It seems the OP, as stated, leads to a strange conclusion!

    Who is wrong: me, or the assumptions in the OP?