n is still n
n + infinity doesn't just collapse back into plain infinity: if you keep the books (the way ordinal arithmetic does with infinity + n), it's a different, ever so slightly larger infinity.
if you then subtract the plain infinity back off, you still get n. n will always be n: you defined it, and it's not going to change.
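as an aside on why the bookkeeping matters: if you just plug infinity in as a *value* instead of keeping it as a symbol, the n genuinely gets lost. a quick python sketch using IEEE floating-point infinity (this is float semantics, not real math, but it shows the failure mode):

```python
import math

n = 5
inf = math.inf

# once n is absorbed into infinity, the float can't remember it:
print(inf + n == inf)                # True

# so subtracting the infinities does NOT give n back:
print((inf + n) - inf)               # nan, not 5
print(math.isnan((inf + n) - inf))   # True
```

the float throws the n away at the moment of addition, which is exactly the mistake the careful algebra further down avoids.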
not all infinities are the same, though. think about all the numbers between 1 and 2: there's an infinite set of fractions in there. there's also an infinite set of whole numbers 1, 2, 3, 4... you can consider these two infinities the same size, because every fraction between 1 and 2 is built out of a pair of whole numbers (a numerator and a denominator), so the fractions can be written out in one numbered list. you can't write more fractions than you can whole numbers, and you can't write more whole numbers than you can fractions; these kinds of infinities are equal (mathematicians call them countable).
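to make "you can't write more fractions than whole numbers" concrete, here's a python sketch (my own construction, just for illustration) that lists every fraction strictly between 1 and 2 exactly once, numbered 1, 2, 3, ...; that numbering is precisely the matching-up with the whole numbers:

```python
from fractions import Fraction
from itertools import islice
from math import gcd

def fractions_between_1_and_2():
    """yield every rational in (1, 2) exactly once, in lowest terms:
    walk denominators q = 2, 3, 4, ... and for each q take the
    numerators p with q < p < 2q that share no factor with q."""
    for q in range(2, 10**9):  # effectively unbounded; the generator is lazy
        for p in range(q + 1, 2 * q):
            if gcd(p, q) == 1:
                yield Fraction(p, q)

# pair each fraction with a whole number: that's the bijection
for index, frac in enumerate(islice(fractions_between_1_and_2(), 8), start=1):
    print(index, frac)
```

every fraction in (1, 2) shows up at some finite position in this list, so the set of such fractions is no bigger than the set of whole numbers.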
if instead you take the set of ALL real numbers (the whole numbers, every fraction, and the irrational numbers in between them, like pi and the square root of 2), that infinity IS larger than the others. the reason isn't that each whole number generates infinitely many fractions, since a numbered list of numbered lists is still just one big numbered list; it's that the reals can't be put into any numbered list at all, which is what cantor's famous diagonal argument shows.
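cantor's diagonal trick is worth seeing: hand me ANY numbered list of decimal expansions and i can build a number that differs from the k-th entry in its k-th digit, so it can't appear anywhere on your list. a small python sketch (digit strings stand in for real numbers here; the construction, not the data, is the point):

```python
def diagonal_escape(listed):
    """given a purported list of decimal expansions (digit strings),
    build a new expansion that differs from the k-th entry at digit k,
    so it appears nowhere in the list."""
    # pick a digit different from the k-th digit of the k-th number;
    # sticking to 4s and 5s avoids the trailing-nines ambiguity
    return "".join("5" if row[k] == "4" else "4"
                   for k, row in enumerate(listed))

attempt = ["1415926535", "7182818284", "4142135623", "3025850929"]
escaped = diagonal_escape(attempt)
print(escaped)  # differs from entry k at position k, for every k
```

no matter how clever the list, the escaped number was never on it, so no list of whole-number positions can hold all the reals.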
so apply that kind of thinking to your example: you have n, infinity, and n + infinity. for every number n there will always be an n + infinity, and that term is not the same as plain old infinity.
so really you just need to be more careful with your algebra:
we start with:
n + infinity = (infinity + n)
and you can write:
n = (infinity + n) - infinity
infinity = (infinity + n) - n
so:
n = (infinity + n) - ((infinity + n) - n)
n = (infinity + n) - (infinity + n) + n
n = n
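one way to check that bookkeeping is to do exactly what the algebra above does: carry the infinity around as a symbol instead of a value. here's a toy python model (my own construction, not standard math) where a quantity is a pair (how many infinities, finite part) and addition/subtraction work term by term:

```python
from typing import NamedTuple

class Quantity(NamedTuple):
    """toy 'infinity + finite' bookkeeping value:
    infs copies of infinity plus a finite part."""
    infs: int
    finite: int

    def __add__(self, other):
        return Quantity(self.infs + other.infs, self.finite + other.finite)

    def __sub__(self, other):
        return Quantity(self.infs - other.infs, self.finite - other.finite)

n = Quantity(0, 7)          # plain old n (7, say)
infinity = Quantity(1, 0)   # one symbolic infinity

# the derivation above, step for step:
inf_plus_n = infinity + n                 # (infinity + n)
result = inf_plus_n - (inf_plus_n - n)    # (infinity + n) - ((infinity + n) - n)
print(result == n)                        # True: n really is still n
```

because the infinities are tracked as symbols, they cancel cleanly and the n survives, unlike the float version where it vanished.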
-mike