How can a degenerate gambler escape his eternal punishment in hell?

  • Thread starter: Office_Shredder
  • Tags: Challenge
  • #1
Office_Shredder
Sorry about missing a week guys.

A degenerate gambler dies and is sentenced to hell. The devil informs him that his punishment is that he must play a slot machine. He starts with one coin, and each time he puts in a coin, he gets countably infinitely many coins out of the machine. He must continue to play until he has no coins remaining, and only then can he move on from his punishment and enjoy the afterlife. Fortunately, time flies when you're having fun, and he can play the slot machine arbitrarily fast.

The challenge:
Construct and prove the correctness of a strategy in which he has no coins left after one second.

(Note that you must prove he can use every single coin he is given back by the machine, not just that he can use infinitely many coins in one second).

EDIT TO ADD: Arbitrarily fast in this context means that there is no upper bound to how quickly he can put the next coin in.
 
Last edited:
  • #2
Divide a second into a countable set of intervals: (0,1/2], (1/2, 3/4], (3/4, 7/8], ...

Putting a countable number of coins into the machine will give us a countable number of coins back.

The idea is to use every timestep to put all coins from the previous timestep into the machine (where the first 1/2 second is used for the first coin). As every timestep has a successor, every coin we get back is put into the machine at some later timestep.

To construct a solution, we just have to construct a way to count all coins gained in the previous interval. This can be done by induction: In timestep 1 we get a countable number of coins, so we can label them a0, a1, ...
Suppose in timestep i we have a countable number of coins labelled a0, a1, ...
Then in this timestep, we get a countable number of countable sets A0, A1, ... with coins a00, a01, ..., a10, a11, ..., ...
We can re-label them with a bijective function ##\mathbb{N} \times \mathbb{N} \to \mathbb{N}##. There are many such functions; Cantor's pairing function is one example.
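
As a concrete sketch of the relabelling step (the function name cantor_pair is illustrative only), Cantor's pairing function gives an explicit bijection ##\mathbb{N} \times \mathbb{N} \to \mathbb{N}##, so the coins a00, a01, ..., a10, a11, ... won in one timestep can be merged back into a single labelling a0, a1, ... for the next:

[code]
def cantor_pair(i, j):
    """Cantor's pairing function: a bijection from N x N to N."""
    return (i + j) * (i + j + 1) // 2 + j

# Relabel coin a_{ij} (the j-th coin won by playing coin a_i of the previous
# timestep) as a_k with k = cantor_pair(i, j).
# Quick finite check that no two pairs get the same new label:
labels = {cantor_pair(i, j) for i in range(100) for j in range(100)}
assert len(labels) == 100 * 100
assert min(labels) == 0          # the relabelling starts again at a_0
[/code]

With this relabelling, the next timestep again starts from a single sequence a0, a1, ..., so the induction goes through.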
 
  • #3
There may be a small issue here about the meaning of arbitrarily fast and infinitely fast. I don't think there is a perfect match between these terms. I think arbitrarily fast still implies there is a limit, just that it is arbitrarily great. I don't think it allows one to say there is no limit to the speed of the gambler. There must be some limit, I think.

I don't have a good suggestion of what to replace it with. His speed can become infinite? Not sure, really.
 
  • #4
If there is a finite limit, I don't see how the gambler is supposed to throw a countably infinite number of coins in.
 
  • #5
By arbitrarily fast I meant to imply that there is no upper bound to how fast he can put coins in - I didn't want to say infinitely fast because that might make it sound like he could put multiple coins in without spending any time doing so.

mfb, that is a very nice solution and the most general one that I know of (modulo things like 'use up all the coins from interval i in interval i+2 instead'). Can anyone find a solution which is not covered under mfb's post?
 
  • #6
Here is a possible variation:

For each ##i > 0##, the gambler plays a single coin in ##1 \over 2^i## seconds, gathering all winnings in one countably infinite pile. After one second, a countable infinity of coins have been played, skipping over none, and there is a one-one correspondence between coins played and coins won, so the pile must be empty.

However, this looks highly illogical: it seems the coins are played too slowly to catch up with the winnings. This is related to ordering. For example, I can list the even numbers followed by the odd numbers; then the even numbers are a contiguous, countably infinite subset of this ordering of the natural numbers. Could the coins that were ever in the winning pile be ordered in such a way that the coins played are analogous to the even numbers in my example?

Now consider this variation (similar to mfb's solution):

Let there be an initial pile of winnings consisting of exactly one coin. For each ##i > 0##, the gambler uses ##1 \over 2^i## seconds to play all coins in the existing pile, producing a new pile of winnings. Choosing any n, the n-th pile was played at some point in time before 1 second. Could ordering play a part here? Could the order of the piles be analogous to listing all natural numbers > 0 followed by 0? If so, we could, for any n, choose pile n and still have one pile left unchosen, without the cardinality being any different.

I don't know why we can ignore the ordering in these examples. (Probably this is irrelevant because what I know doesn't matter, but the question is certainly interesting.)
 
  • #7
verty said:
gathering all winnings in one countably infinite pile. After one second, a countable infinity of coins have been played, skipping over none, and there is a one-one correspondence between coins played and coins won, so the pile must be empty.
I don't see the 1-1 correspondence.
After the first coin, your pile is
1 2 3 ...
After the second coin, your pile is
2 3 ... 1 2 3 ...
If you just keep working with that pile, you'll never get to the second group in this pile. If you rearrange the pile each time, you have to make sure that you don't rearrange too much.
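
To see the issue numerically, here is a toy sketch in which each play returns only a finite batch of B coins instead of infinitely many. If the gambler always plays the oldest coin in the pile, the first coin of the second group only goes in after the whole first group, at a play number that grows with B; with genuinely infinite groups it would never go in.

[code]
from collections import deque

def play_number_of(target, B, max_plays=10**6):
    """FIFO strategy with finite batches: each play appends B new coins to
    the back of the pile.  Return the play on which `target` goes in."""
    pile = deque(("batch1", j) for j in range(B))   # winnings from the initial coin (play 1)
    for play in range(2, max_plays):
        coin = pile.popleft()
        if coin == target:
            return play
        pile.extend((f"batch{play}", j) for j in range(B))
    return None

for B in (10, 100, 1000):
    print(B, play_number_of(("batch2", 0), B))      # prints B + 2 in each case
[/code]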
 
  • #8
mfb said:
I don't see the 1-1 correspondence.
After the first coin, your pile is
1 2 3 ...
After the second coin, your pile is
2 3 ... 1 2 3 ...
If you just keep working with that pile, you'll never get to the second group in this pile. If you rearrange the pile each time, you have to make sure that you don't rearrange too much.

Thank you for spotting this; I was thinking it should go through, but it has this other fault.
 
  • #9
Here is another interesting attempt.

The gambler maintains a list of sequences of coins. He starts with a list containing one sequence of one coin. For each ##i > 0##, he removes one coin from each sequence in the list (finitely many), plays these coins receiving a sequence for each (finitely many), then places these sequences at the end of the list. He does all this in ##1 \over 2^i## seconds. If a sequence ever becomes empty, he deletes that entry from the list.

After 1 second, all the time intervals have come and gone. For any sequence ##s## that populated the list, ##s## was added at some step ##i## and underwent no changes thereafter, except to have coins removed from its head. After 1 second, ##s## must be empty, since its ##n##-th coin is removed at step ##i + n##, which happens before 1 second. Therefore, by induction, every sequence in the list is empty. But this means that no coins remain to be played.
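
A small simulation of this strategy (with the infinite sequences represented lazily, and coins labelled (batch, index) for illustration) shows both features at once: the list of sequences keeps growing, yet any fixed coin is played at a concrete finite step.

[code]
import itertools

def coin_stream(batch_id):
    """The infinite sequence of coins won from a single play."""
    return ((batch_id, j) for j in itertools.count())

def simulate(steps, watch):
    """In each step, play the head coin of every sequence in the list and
    append the sequence won by each of those plays."""
    batch_ids = itertools.count(1)
    sequences = [iter([(0, 0)])]        # one sequence holding the single starting coin
    for step in range(1, steps + 1):
        won, alive = [], []
        for seq in sequences:
            coin = next(seq, None)
            if coin is None:
                continue                # the sequence has emptied: delete it from the list
            if coin in watch:
                print(f"coin {coin} is played during step {step}")
            won.append(coin_stream(next(batch_ids)))
            alive.append(seq)
        sequences = alive + won
    print(f"after {steps} steps the list holds {len(sequences)} sequences")

# watch coin 7 of the batch won by the very first play:
simulate(12, {(1, 7)})   # coin (1, 7) is played during step 9; the list holds 2048 sequences
[/code]

The list grows without bound, but that is harmless: the argument only needs that every sequence, once added, loses one coin at every later step.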

Pursuant to my earlier point, I believe there is a question as to whether the list itself has order type ω. I think this relates to cardinal numbers and ordinal numbers being somehow different in set theory.

-edit-

Actually, I think this proof works. Even if the list is ordered like the naturals > 0 followed by 0, I think the result follows by transfinite induction because the list is well-ordered. But I'm out now, this is all I can add to this topic. Sorry for all the edits and confusion.
 
Last edited:
  • #10
After depositing all the coins obtained in the (n-1)th step, you've received N^n coins, but #N = #N² = ... hence t(1) = t(2) = ... = #N/v where t(n) is the time required to deposit N^n coins. Since v can be uncountably large, (#N)², the total time required is a finite number: #n*t = #n*#N/(#N)² = 1 , where #n = #N.

I'm not familiar with transfinite trickery, but this proof doesn't seem right.
 
Last edited:
  • #11
I want to round off what I was saying in post 9 and before about the unintuitiveness of the solutions to problems of this type, in a way that will shed light on what is a confusing topic, I believe.

Case 1.
Suppose that the slot machine takes one coin and always gives back one coin. Suppose that for all ##i > 0##, the gambler uses ##1 \over 2^i## seconds to play one coin, receiving one coin in its place. After one second, how many coins remain?

I think the accepted answer is that no coin remains. By induction on the coins, every coin goes into the machine before one second has passed. After one second, all coins have been inserted.

Case 2.
Suppose this time that the machine is faulty. A clever engineer realizes that one could simply connect a pipe from the coin chute to the tray. The gambler, who isn't too smart, doesn't realize this and plays the machine, although the coin that he inserts is the very same coin that emerges. Now how many coins remain after one second?

I think the answer is that one coin remains. But we don't know in case 1 that the original machine does not return the original coin. So the answer is determined by how the machine is engineered. Or perhaps in this case the answer is still that no coin remains. But then I defer to case 3.

Case 3.
It's the gambler's anniversary and he needs his one coin to buy a bunch of flowers for his wife. He's learned by now that the machine just returns one coin each time, but he isn't sure that it'll always return the coin; perhaps after some time it won't. Rather than take a chance, he plays the machine but only pretends to insert the coin; he keeps the coin in his hand. After one second, how many coins remain? The answer must be that one coin remains.

----

In terms of spacetime, in case 2 imagine the 4D solid that a coin forms while it is outside the machine. We can speak of the set of these solids in case 2 where each solid corresponds to a coin in case 1. Applying the argument from case 1 to the solids, no coin solid remains after 1 second. But a coin remains. Or apply the reasoning to case 3 with analogous coin solids. The result is the same.

After some thought on this, I believe it is consistent to say that a coin remains in these examples, in the same sense that infinitesimals are consistent given the correct definition. There may be a future branch of mathematics that allows a coin to remain. For this reason, I think it will be difficult to track down in the literature why the answer is that no coin remains. Most likely, there was never a need for a different answer and everyone accepted that according to mathematics, no coin remains.
 
  • #12
verty said:
Here is another interesting attempt.

The gambler maintains a list of sequences of coins. He starts with a list containing one sequence of one coin. For each ##i > 0##, he removes one coin from each sequence in the list (finitely many), plays these coins receiving a sequence for each (finitely many), then places these sequences at the end of the list. He does all this in ##1 \over 2^i## seconds. If a sequence ever becomes empty, he deletes that entry from the list.

After 1 second, all the time intervals have come and gone. For any sequence ##s## that populated the list, ##s## was added at some step ##i## and underwent no changes thereafter, except to have coins removed from its head. After 1 second, ##s## must be empty, since its ##n##-th coin is removed at step ##i + n##, which happens before 1 second. Therefore, by induction, every sequence in the list is empty. But this means that no coins remain to be played.

Pursuant to my earlier point, I believe there is a question as to whether the list itself has order type ω. I think this relates to cardinal numbers and ordinal numbers being somehow different in set theory.

-edit-

Actually, I think this proof works. Even if the list is ordered like the naturals > 0 followed by 0, I think the result follows by transfinite induction because the list is well-ordered. But I'm out now, this is all I can add to this topic. Sorry for all the edits and confusion.

Nice, I agree that this solution works as well. The coins that come out have a fixed cardinality only and I would allow the gambler to give any countable ordinal you want.

verty, I like this one coin setup you've constructed. Imagine the opposite perspective: the machine spits out a coin, but every time it does so the gambler puts it back in. After one second how can the coin be anywhere except in the machine?

Bonus points to anyone who can resolve this apparent paradox.


Jan, I don't think t(n) is really well defined in your post because it depends entirely on how the gambler orders his coins/his strategy of putting them in. After that I don't think I understand the rest of your post because you have things like #n = #N where N is the natural numbers and n appears to be an arbitrary finite number.
 
  • #13
Got an idea.
 
  • #14
Office_Shredder said:
Nice, I agree that this solution works as well. The coins that come out have a fixed cardinality only and I would allow the gambler to give any countable ordinal you want.

verty, I like this one coin setup you've constructed. Imagine the opposite perspective: the machine spits out a coin, but every time it does so the gambler puts it back in. After one second how can the coin be anywhere except in the machine?

Bonus points to anyone who can resolve this apparent paradox.


Jan, I don't think t(n) is really well defined in your post because it depends entirely on how the gambler orders his coins/his strategy of putting them in. After that I don't think I understand the rest of your post because you have things like #n = #N where N is the natural numbers and n appears to be an arbitrary finite number.

My take is that like most supertasks, these scenarios exploit common sense intuitions to mask the fact that infinite limits need not commute.

What is the physical final state of the gambler's pocket? It is not defined by the givens of the problem. There is no transition that leads to an unchanging final state.

What is the limiting state of the gambler's pocket in terms of the coins it contains? That depends on how we assign a topology to the set of its possible states. It need not have a limiting state. The discrete topology is an example where it will typically have no limiting state.

What is the physical final state of each coin -- in the gambler's pocket or in the slot machine's till? That is (or may be) defined by the givens of the problem. There is (or may be) a final transition that leads to an unchanging final position.

What is the limiting state of each coin in terms of where it is -- in the gambler's pocket or in the slot machine's till? That seems clear -- it is the final unchanging position of that coin, if any.

The magic is that the problem is worded as if it were a realistic physical exercise. It urges us to consider that the contents of the pocket and the location of the coins must both have a final state that is determinable from the givens of the problem. Our experience with everyday finite pockets, finite tills and finite sets of coins leads us to expect that a well defined sequence of transitions always leads to a well defined final state. That state can be determined by looking at either the final state of the pocket or the final state of each coin -- finite limits commute.

But in the infinite case, we intuitively see that the limit of the state of the gambler's pocket is infinite/non-existent. We see that the limit of the state of each coin is such that it ends up in the slot machine's till. These seem not to be compatible. But they are different limits. And infinite limits do not commute.

Let ##p(i,t)## be 1 if coin ##i## is in the pocket at time ##t##.
Let ##P(t)## be the set of coins in the pocket at time ##t < 1##.

For ##t < 1##, it is true that:

##P(t) = \{\, i \mid p(i,t) = 1 \,\}##

However, it need not be true that:

##\lim_{t \to 1} P(t) = \{\, i \mid \lim_{t \to 1} p(i,t) = 1 \,\}##
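
A toy computation makes the distinction concrete. Here each play returns just two coins (a simplification, so that the pocket is a finite set at every time) and the plays happen at times ##t_n = 1 - 1/2^n##. The sets ##P(t_n)## grow without bound and never settle down, yet every individual coin leaves the pocket at some finite step, so the set on the right-hand side above is empty.

[code]
from collections import deque

def run(plays):
    """Each play returns exactly two new coins (a toy machine, not the one in
    the problem).  Coins are named by tuples; () is the starting coin."""
    pocket = deque([()])
    played_at = {}                      # coin -> play number at which it left the pocket
    for n in range(1, plays + 1):
        coin = pocket.popleft()         # breadth-first over the tree of coins
        played_at[coin] = n
        pocket.extend([coin + (0,), coin + (1,)])
    return pocket, played_at

pocket, played_at = run(1000)
print(len(pocket))                      # 1001: |P(t_n)| = n + 1 grows without bound
print(played_at[()], played_at[(0,)], played_at[(1, 0)])   # 1 2 6
# Every fixed coin is played at some finite step, so lim p(i, t) = 0 for each i,
# even though the sets P(t) never shrink.  The two limits in the formula above differ.
[/code]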
 
  • #15
You start with one coin: you put it in and you get a countable number of coins out. Label these:

C11, C12, C13 ...

In half a second, you put in C11, and you get back C21, C22, C23 ...

In a further 1/4 second you put in C12, C21 and C22. You get back C31, C32, C33...

In a further 1/8 second, you put in C13, C23 and C31, C32, C33.

In this way every coin that comes out goes back in at some point.
 
  • #16
Office_Shredder said:
Nice, I agree that this solution works as well. The coins that come out have a fixed cardinality only and I would allow the gambler to give any countable ordinal you want.

I think I've derailed your puzzle thread with my silly musings, why oh why did I think this was a good idea? Please let's move on from this conversation and have this thread be again about accepted solutions to the original problem.

So that there is some finality, though: I do think there is unintuitiveness here, and that was my original motivation, first to remove it and later at least to demonstrate it. But having entered a briar patch, the best course of action is to leave it.
 
  • #17
I only have a non-constructive proof:

The number of coins returned by the machine is always countable. Also, the number of times the machine will return coins is countable. Therefore, the total number of coins is countable (countable union of countable sets).

Now, the interval of 1 second is uncountable, so there must be a countably infinite subset of [0,1] with the same cardinality as the set of all coins. (This seems intuitive, although I don't have a proof for it.) Label the elements of this subset of [0,1] by [itex]t_0, t_1, ... [/itex]. Then simply let the gambler put a coin in at each moment [itex]t_i[/itex].

Since the cardinality of the [itex]t_i[/itex] is the same as that of the set of all coins, there is a one-to-one mapping from the set of coins to the moments they are put in the machine. Thus every coin is guaranteed to be placed in the machine eventually.
 
  • #18
PeroK said:
You start with one coin: you put it in and you get a countable number of coins out. Label these:

C11, C12, C13 ...

In half a second, you put in C11, and you get back C21, C22, C23 ...

In a further 1/4 second you put in C12, C21 and C22. You get back C31, C32, C33...

In a further 1/8 second, you put in C13, C23 and C31, C32, C33.

In this way every coin that comes out goes back in at some point.

Hi: I already solved it! That's the 1-1 mapping you need.
 
Last edited:
  • #19
I thought you could put several coins in at once, but I see now that you probably want a solution for one coin at a time, so the sequence needs a slight modification:

You start with one coin and get back C11, C12, C13 ...

In 1/2 second you put in C11, you get back C21, C22, C23 ...

In 1/4 second you put in C12, you get back C31, C32, C33.

In 1/8 second, you put in C21, you get back C41, C42, C43 ...

In 1/16 second, you put in C22, ...

In 1/32 second, you put in C13

In 1/64 second you put in C23, then (doubling the speed each time) C31, then C32, C33, then C14, C24, C34, C41 ...

Same sequence as before, just one coin at a time.

It takes a bit longer(!) but you still get rid of all the coins - eventually!
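
As a mechanical feasibility check (using a plain anti-diagonal order rather than the exact sequence above, but the same dovetailing idea): coin C_{ij} belongs to batch i, and batch i is produced by the (i-1)-th play after the initial coin, so C_{ij} only has to sit at position i or later in the schedule.

[code]
def schedule(limit):
    """One possible one-coin-at-a-time order: enumerate coins C_{ij} along the
    anti-diagonals i + j = 2, 3, 4, ..."""
    order = []
    for s in range(2, limit + 2):
        for i in range(1, s):
            order.append((i, s - i))
    return order

order = schedule(60)

# Feasibility: batch i only exists after scheduled play i - 1 (the initial coin
# counts as play 0 and produces batch 1), so C_{ij} must sit at position >= i.
for pos, (i, j) in enumerate(order, start=1):
    assert pos >= i, f"C{i}{j} scheduled too early, at position {pos}"

# Coverage: every coin C_{ij} with i + j <= 20 appears somewhere in this prefix.
assert {(i, j) for i in range(1, 20) for j in range(1, 20) if i + j <= 20} <= set(order)
print("dovetailed schedule is feasible for the first", len(order), "plays")
[/code]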
 
  • #20
Boorglar, the problem with the non-constructive solution is that your bijection might require putting a coin into the machine before you have actually gotten it out.

PeroK, your first solution was actually correct insofar as it is in fact possible to put countably many coins into the machine in those time intervals; that was essentially mfb's solution up at the top. Your second solution is pretty similar to verty's in post #9, I think.

Can anybody construct a solution where the gambler makes an uncountable number of plays? Or show it's not possible?
 
  • #21
Office_Shredder said:
Can anybody construct a solution where the gambler makes an uncountable number of plays? Or show it's not possible?

If you have any uncountable subset of [0, 1], it cannot be sequenced.

So, you could say that the gambler puts one coin in for every real number in (0,1). I'm not sure you could do better than that. However elaborately you construct your uncountable subset, it still won't be a sequence.
 
  • #22
I tried extending the process to an uncountable solution as follows:

Put a coin in for every x in [0, 1).

The coins you get out are labelled Cxn (n = 1, 2, 3 ...).

At t = 0, you put in the coin you start with.

For 0 < x < 1/2, you put in C{x/2}1.

Then, you partition [1/2, 1) into a set of sequences, one for each x in [0, 1/2):

[itex]\frac{2^n-1}{2^n}+\frac{x}{2^n}[/itex]

So, the first sequence is 0, 1/2, 3/4, 7/8 ...

And, for every x, it's

x, 1/2 + x/2, 3/4 + x/4 ...

Now, for each sequence we put in the coins as for the countable solution. There's a slight tweak, in that all the sequences for x in (0, 1/4) have already had the first coin put in (at time 2x). And the sequences for x = 0 and x in [1/4, 1/2) haven't. So, the sequences are one coin out of step.

Although, of course, this doesn't describe a process that any gambler, however obsessive, could actually follow!

The function described, hopefully, is a bijection from [0, 1) x ℕ to [0, 1) with the property that for all x and n: f(x, n+1) > f(x, n).
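
A quick check with exact rationals (the helper invert below is only for the check) confirms the two properties used here: for x in [0, 1/2) each sequence is strictly increasing in n, and every point of [1/2, 1) lies in exactly one of the sequences.

[code]
from fractions import Fraction as F

def f(x, n):
    # The sequence member from the post: (2^n - 1)/2^n + x/2^n
    return F(2**n - 1, 2**n) + x / 2**n

def invert(y):
    """For y in [1/2, 1): find the slice [1 - 1/2^n, 1 - 1/2^(n+1)) containing
    it, then solve for x.  (Helper for the check only.)"""
    n = 1
    while not (1 - F(1, 2**n) <= y < 1 - F(1, 2**(n + 1))):
        n += 1
    x = (y - F(2**n - 1, 2**n)) * 2**n
    return x, n

x = F(3, 10)                                     # any x in [0, 1/2)
assert all(f(x, n) < f(x, n + 1) for n in range(1, 20))   # each sequence is increasing

for y in (F(1, 2), F(5, 8), F(9, 13), F(99, 100)):
    x0, n0 = invert(y)
    assert 0 <= x0 < F(1, 2) and f(x0, n0) == y  # y comes from exactly one (x, n)
print("the sequences partition [1/2, 1), as described")
[/code]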
 
  • #23
Nice, PeroK! I think that works as you described.
 
