Originally from the statistics forum but am told this is more of a calculus question.

I flip 10 coins. If any of the coins land on tails, each of the coins splits into 10 new coins and I flip them all again. I keep doing this until a round where every single coin lands on heads. Can I expect to ever stop flipping coins as the number of rounds goes to infinity? (And a follow-up question: if so, on average, how many rounds would it take me?)

I think we've managed to at least state the problem mathematically, but I'm unsure how to go about deriving an answer.

I see. The number of coins multiplies by 10 every time you get a tail? And all of them have to end up heads? So the probabilities are ##2^{-10}##, then ##2^{-100}##, then ##2^{-1000}##, etc.?

I flip 10 coins. If any of the 10 are tails, then I go and flip 100 coins. If any of those 100 are tails, then I flip 1000 coins. If during any iteration all of the coins I flip in that iteration land on heads, then I stop and my task is complete.
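For what it's worth, the process described above is easy to simulate. Here is a quick Monte Carlo sketch in Python (the round cap and trial count are arbitrary choices of mine, not part of the problem):

```python
import random

def one_run(max_rounds=5):
    """One play-through: flip n coins; if any is tails, multiply n
    by 10 and flip again. Return True if some round within the cap
    comes up all heads."""
    n = 10
    for _ in range(max_rounds):
        # all() short-circuits at the first tail, so this stays
        # cheap even when n grows large
        if all(random.random() < 0.5 for _ in range(n)):
            return True
        n *= 10
    return False

random.seed(0)  # arbitrary seed for reproducibility
trials = 200_000
rate = sum(one_run() for _ in range(trials)) / trials
print(rate)  # lands near 2**-10 ≈ 0.000977
```

Essentially all of the stops come from the very first round of 10 coins, which already hints at the answer below.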

I suspect that I'm really asking whether the series converges? If it diverges, I'd expect to get "all heads" infinitely many times if I kept going; if it converges, I'd expect to get all heads a finite (and possibly less than 1) expected number of times if I continued on forever?

Or is making the jump from the value of the series to a statement of probability unfounded/completely wrong?

You stop after one round with probability ##2^{-10}##, after two rounds with probability ##(1-2^{-10})\cdot 2^{-100}##, and so on. The total probability that you ever stop converges to a value extremely close to ##2^{-10} = 1/1024##, which is below 1/1000.
The probability that you ever get "all heads" is therefore also small (below 1/1000, and dominated by the first round).
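To make the "below 1/1000" claim concrete, here is a short exact computation using Python's `fractions` module. Three rounds are plenty, since the later terms are unimaginably small:

```python
from fractions import Fraction

surviving = Fraction(1)  # probability we are still flipping
p_stop = Fraction(0)     # probability we have stopped by now
for k in range(1, 4):
    # P(all heads this round) with 10**k coins
    p_round = Fraction(1, 2 ** 10**k)
    p_stop += surviving * p_round
    surviving *= 1 - p_round

print(float(p_stop))               # ≈ 0.0009765625, barely above 2^-10
print(p_stop < Fraction(1, 1000))  # True: total stopping probability < 1/1000
```

The second term, ##(1-2^{-10})\cdot 2^{-100} \approx 8\times 10^{-31}##, is already far too small to show up in the decimal expansion, which is why the first round dominates everything.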