ruby_duby
Hi,
I have just started a game theory module and have been given the following exercise. Can anyone help me with it? I have a feeling I can use a minimax argument here, but I am not sure how to go about it.

Suppose a two-player zero-sum game has the payoff matrix

A =
[ a b ]
[ c d ]

and suppose that (e1, e1) is an equilibrium solution. Let K ∈ ℝ be a constant. Show that for the new game matrix

A′ =
[ a + K b + K ]
[ c + K d + K ]

(e1, e1) is still an equilibrium solution.
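Not an answer to the proof, but here is a quick numerical sanity check of the claim I'm trying to show (the matrix entries and K are made-up illustrative values): the position of a pure-strategy saddle point does not move when a constant is added to every entry.

```python
# Sanity check (not a proof): shifting every entry of a 2x2 zero-sum
# payoff matrix by a constant K leaves the pure-strategy equilibrium
# (saddle point) in the same position. Values below are illustrative.

def saddle_point(A):
    """Return (i, j) of a pure-strategy equilibrium, or None.
    A[i][j] is a saddle point when it is both the minimum of row i
    and the maximum of column j."""
    for i in range(2):
        for j in range(2):
            row_min = min(A[i])
            col_max = max(A[0][j], A[1][j])
            if A[i][j] == row_min == col_max:
                return (i, j)
    return None

A = [[3, 5],
     [1, 2]]   # saddle point at (0, 0): 3 is the min of row 0 and max of column 0
K = 7
A_shifted = [[x + K for x in row] for row in A]

print(saddle_point(A))          # (0, 0)
print(saddle_point(A_shifted))  # (0, 0) -- same equilibrium position
```

The check reflects the intuition behind the exercise: each player compares payoffs within rows/columns, and adding the same K to every entry preserves all those comparisons.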