I've disagreed with my professor on this point, because I still can't see how I'm wrong. Consider the following claim:

If x is an integer, then (x^3 + x)/(x^4 - 1) = x/(x^2 - 1).

I'm asked to either prove or disprove this, and so I arrive at the following (man, I wish I knew TeX or whatever it's called... lol):

(x^3 + x)/(x^4 - 1) = [x(x^2 + 1)]/[(x^2 - 1)(x^2 + 1)] = [(x^2 + 1)/(x^2 + 1)] * [x/(x^2 - 1)] = x/(x^2 - 1) for all x in Z.

Okay, now this is what I argued. I said that since (x^2 + 1)/(x^2 + 1) = 1 for every integer x, the two expressions equal each other. I mean, when you take any number and multiply it by 1, you get that number back, right??? That's just the multiplicative identity!

I get really mad over this sometimes. The statement is pure logic to me. My professor argues that since the expressions on the left-hand side and right-hand side are undefined at x = 1, they do not equal each other for every integer. But first of all, when did undefined <i>not</i> equal undefined? I just can't comprehend this. If 1 times a number doesn't equal that number, I just don't know what will go on next.

Thanks so much for your consideration of my thoughts. It's likely that I'm just confused, and if so, a (non-lengthy) explanation would be much appreciated. Thank you!
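Edit: in case a numeric check is useful, here's a quick sketch I tried in plain Python (the helper names `lhs`/`rhs` are just mine, and I'm using exact fractions so there's no rounding). Both sides blow up with a division by zero at x = 1 and x = -1, and agree everywhere else I tested:

```python
from fractions import Fraction

def lhs(x):
    # left-hand side: (x^3 + x) / (x^4 - 1), computed as an exact rational
    return Fraction(x**3 + x, x**4 - 1)

def rhs(x):
    # right-hand side: x / (x^2 - 1)
    return Fraction(x, x**2 - 1)

for x in range(-5, 6):
    if x in (-1, 1):
        continue  # both denominators are 0 here, so both sides are undefined
    assert lhs(x) == rhs(x), f"mismatch at x = {x}"

print("sides agree at every tested integer except x = -1 and x = 1")
```

So numerically the two sides really do match wherever they're both defined, which is exactly why I don't get what the fuss is about.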