# Homework Help: Paper, scissor, rock

1. Aug 6, 2017

### DesertFox

Let's consider a game of paper, scissors, rock. We have two players: player A and player B. Player A always plays paper; player B uses a mixed strategy: he tosses a die, and if it comes up 1 or 2 he plays rock, if it comes up 3 or 4 he plays paper, and if it comes up 5 or 6 he plays scissors.

1) If the game is played just one time, i.e. we are considering one single "shot", who has the better chance of winning?

2) If the game is repeated 100 times, who has the better chance of getting more wins? (Both players stick rigidly to their initial strategies!)

3) If the game is repeated infinitely many times, who is more likely to prevail in the long run? (Both players stick rigidly to their initial strategies!)

Here is my attempt at an answer. I am answering purely intuitively, because I lack the mathematical approach.
Nobody has a better chance; the situation is 50/50.

But as I said, this is pure intuition... I don't know whether I am right. And even if I am right, I can't explain WHY I am right (in mathematical terms).

2. Aug 6, 2017

### person123

There's a one in three chance of Player B playing rock, a one in three chance of playing scissors, and a one in three chance of playing paper. Since Player A always plays paper, that means Player A has a one in three chance of winning (B plays rock), a one in three chance of losing (B plays scissors), and a one in three chance of drawing (B plays paper). So the probability of winning equals the probability of losing; what would that indicate?
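To make that concrete, here is a minimal sketch that enumerates all six die faces and tallies the outcome for Player A (always paper). The face-to-move mapping and the outcome table are taken straight from the problem statement; everything else is just bookkeeping.

```python
from fractions import Fraction

# Player B's die mapping from the problem: 1-2 -> rock, 3-4 -> paper, 5-6 -> scissors.
die_map = {1: "rock", 2: "rock", 3: "paper", 4: "paper", 5: "scissors", 6: "scissors"}

# Outcome for Player A, who always plays paper:
# paper beats rock, draws with paper, loses to scissors.
outcome_for_a = {"rock": "win", "paper": "draw", "scissors": "lose"}

counts = {"win": 0, "draw": 0, "lose": 0}
for face, move in die_map.items():
    counts[outcome_for_a[move]] += 1

# Exact probabilities as fractions over the 6 equally likely faces.
probs = {result: Fraction(n, 6) for result, n in counts.items()}
print(probs)  # each of win/draw/lose comes out to 1/3
```

Each outcome indeed has probability 2/6 = 1/3, so a single shot is symmetric between the two players.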

3. Aug 6, 2017

### DesertFox

So a mixed strategy is not the better strategy when we are considering one single play of paper, scissors, rock?

Actually, if we are considering one single "shot" of paper, scissors, rock... there is no better strategy, right?
A mixed strategy makes sense only when we are considering a series of plays, because it keeps the opponent guessing. Right?

4. Aug 6, 2017

### person123

That gets into the psychology of the game, which I don't think the question is about. If Player B were free to adapt, he would simply start playing scissors, because he knows it beats paper every time; but here both players are stuck with rigid strategies.

As for the question: I don't see why the per-game probabilities would change based on the number of games.
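A quick Monte Carlo check of the 100-game version supports this. The sketch below (my own illustration, not part of the original problem; the trial count and seed are arbitrary choices) simulates many 100-game series and compares how often each player finishes ahead; by symmetry the two fractions should be nearly equal, with the remainder being tied series.

```python
import random

random.seed(0)  # arbitrary seed, just for reproducibility

def play_series(n_games=100):
    """One series: A always plays paper; B plays each move with probability 1/3."""
    a_wins = b_wins = 0
    for _ in range(n_games):
        b_move = random.choice(["rock", "paper", "scissors"])
        if b_move == "rock":
            a_wins += 1       # paper beats rock
        elif b_move == "scissors":
            b_wins += 1       # scissors beats paper
        # b_move == "paper" is a draw: nobody scores
    return a_wins, b_wins

trials = 20000
a_ahead = b_ahead = 0
for _ in range(trials):
    a, b = play_series()
    if a > b:
        a_ahead += 1
    elif b > a:
        b_ahead += 1

print(a_ahead / trials, b_ahead / trials)  # the two fractions come out very close
```

Neither player is favoured over 100 games, and the same symmetry argument extends to any number of repetitions.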