I'm not sure what you're saying. Do we agree that p and P don't depend on m and M? m and M are the actual amounts of money we have at the time; the probability distributions p and P are fixed. I'm not saying that m = M all the time, but when m = M, the expected winnings should be such that the game is not advantageous to both. There should be no situation where the game is advantageous to both, including the situation m = M. Can that be agreed upon?

If so, then the only way the game is not advantageous to both when m = M is if both players' expected winnings are 0, since their expected winnings happen to be equal when m = M. But my expected winnings are just a function of how much money I have at the time, not of how much the other guy has, so if my expected winnings are 0 when m = M, then they are always 0. In other words, my expected winnings are constant when treated as a function of M (his money). So if I happened to find that my expected winnings were a whenever M = 2m, then my expected winnings should be a for every value of m, since the condition M = 2m doesn't really matter: my expected winnings don't vary with M. Put another way, if I have a function that I know to be constant and I find that f(x) = a, then I know that f(y) = a for any y.

Assuming this reasoning is sound, then since my expected winnings are 0 when m = M, my expected winnings are 0 for any m. Pick any m. Since my expected winnings don't vary with M, if we specify an M value and calculate my expected winnings when I have m dollars and he has M, that gives my expected winnings when I have m dollars and he has any amount. But whenever m = M the expected winnings should be 0, so the expected winnings should always be 0, and that leads to a contradiction: we end up equating two quantities, one of which increases while the other decreases.
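To make the quantity in question concrete, here's a minimal sketch of the wallet game (the exponential distribution, its mean, and the function name are illustrative choices of mine, not anything fixed above). It estimates my expected winnings conditional on my own amount m, with the opponent's amount M drawn from the fixed distribution P:

```python
import random

random.seed(0)

def expected_winnings(m, n_trials=200_000, mean=100.0):
    """Estimate E[my winnings | my wallet holds m] in the wallet game:
    whoever holds less money wins the contents of the other's wallet.
    The opponent's amount M is drawn from a fixed distribution P
    (an exponential with the given mean, chosen purely for illustration)."""
    total = 0.0
    for _ in range(n_trials):
        M = random.expovariate(1.0 / mean)  # opponent's amount, M ~ P
        if m < M:
            total += M   # I hold less, so I win his wallet
        elif m > M:
            total -= m   # I hold more, so I lose mine
        # exact ties contribute 0
    return total / n_trials

for m in (10, 50, 100, 200, 400):
    print(f"m = {m:4d}  ->  estimated expected winnings: {expected_winnings(m):+9.2f}")
```

The estimate is a function of m alone (M has been averaged out), but it is plainly not the constant 0 — positive for small m, negative for large m — which is exactly the tension the argument above runs into.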
Of course, there is some problem in the reasoning here, but what? I guess it's in the assumption that a game cannot be advantageous to both. We just have to look at what it means to be advantageous to both and see that it isn't a contradiction. When Tom Brady is healthy, the Patriots usually win. When Roethlisberger is healthy, the Steelers usually win. If the Patriots and Steelers meet to play football and both have healthy quarterbacks, then probability suggests that both can expect to win; however, this isn't a contradiction. If the Patriots have an 80% chance to win this game, then the Steelers have a 20% chance to win this game. But I think what our statistics really tell us is that the Patriots have an 80% chance of winning games like this one, and the Steelers can very well, at the same time, have an 80% chance of winning games like this one. They can't both have an 80% chance of winning this particular game; we have to interpret our statistics as saying that each team has a good chance of winning games like this, not of winning this particular one.
So going back to our wallet game: when they each have m dollars, they may both have a good chance of winning games like this one (games where they have m dollars), but they can't both have a good chance of winning this game. When you talk about your chances in a specific game, you take into account what the other person has in his wallet (or how healthy his quarterback is), and only when you take both values into account does it become a contradiction to say that the game is advantageous to both. The problem of the game being advantageous to both in general (i.e. regardless of m) was already done away with by rejecting the assumption that p(m < M) = p(m > M) and letting that probability vary with m.
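To see that resolution play out numerically, here's a follow-up sketch under the same illustrative assumptions as before (both wallets drawn independently from an exponential distribution I chose for the example, nothing from the discussion itself). It checks that the game is fair overall while the chance of holding the smaller wallet varies with m rather than sitting at 1/2:

```python
import random

random.seed(1)

def simulate_symmetric_game(n_trials=500_000, mean=100.0):
    """Both wallets drawn i.i.d. from the same fixed distribution p
    (an exponential, chosen purely for illustration)."""
    total = 0.0
    small_wins = large_wins = small_games = large_games = 0
    for _ in range(n_trials):
        m = random.expovariate(1.0 / mean)  # my amount ~ p
        M = random.expovariate(1.0 / mean)  # his amount ~ p (same distribution)
        if m < M:
            total += M   # I hold less, so I win his wallet
        elif m > M:
            total -= m   # I hold more, so I lose mine
        # track how often I hold the smaller wallet, split by my own m
        if m < mean:
            small_games += 1
            small_wins += (m < M)
        else:
            large_games += 1
            large_wins += (m < M)
    return total / n_trials, small_wins / small_games, large_wins / large_games

avg, p_small, p_large = simulate_symmetric_game()
print(f"overall average winnings per game: {avg:+.3f}")          # close to 0 by symmetry
print(f"P(I hold less | my m is below the mean): {p_small:.3f}")  # well above 1/2
print(f"P(I hold less | my m is above the mean): {p_large:.3f}")  # well below 1/2
```

By symmetry the overall average comes out near 0, so the game is not advantageous to both in general, while the conditional probabilities differ sharply from 1/2 — which is just the rejected assumption p(m < M) = p(m > M) failing once m is taken into account.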