Let's say I want to do the following calculation: (x^2 + 2x + 1) / (x + 1).

I've scrolled through some online guides, and they all show how to do it, but not the principle behind it. What I'm specifically having trouble with is that instead of dividing the leading term of the numerator by the entire denominator x + 1 = x^1 + x^0, they only take the leading term of the denominator and divide it into the leading term of the numerator as they work down the steps. How can they just ignore the 1 in this case? I know that once the leading terms have been divided, the result is multiplied by the whole denominator and subtracted, so the 1 is not completely ignored, but it still doesn't make sense to me.

I also don't see the connection to dividing 121 by 11 — that is, dividing 121 = 1*10^2 + 2*10^1 + 1*10^0 by 1*10^1 + 1*10^0, which is equivalent to dividing x^2 + 2x + 1 by x + 1.

Could someone clarify the specifics for me? Why can I ignore the lower-degree terms when doing the divisions?
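To make concrete which step confuses me, here is a rough Python sketch of the procedure as I understand it from the guides (coefficient lists with the highest degree first; `poly_divmod` is just a name I made up):

```python
def poly_divmod(num, den):
    """Polynomial long division on coefficient lists, highest degree first.
    Returns (quotient coefficients, remainder coefficients)."""
    num = list(num)
    quotient = []
    while len(num) >= len(den):
        # This is the step I'm asking about: only the two LEADING terms
        # decide the next quotient coefficient; den's lower terms play no role here.
        coeff = num[0] / den[0]
        quotient.append(coeff)
        # ...but then the FULL denominator is multiplied back and subtracted,
        # so the lower-degree terms are not actually ignored.
        for i in range(len(den)):
            num[i] -= coeff * den[i]
        num.pop(0)  # the leading term is now zero, so drop it
    return quotient, num  # whatever is left is the remainder

# (x^2 + 2x + 1) / (x + 1)  ->  quotient x + 1, remainder 0
print(poly_divmod([1, 2, 1], [1, 1]))  # ([1.0, 1.0], [0.0])
```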