This is from a textbook, but it isn't a homework problem; it's a worked example that follows the introduction of the "Sandwich Theorem". The hypothesis says "for all x ≠ 0", yet the solution then appears to assume x = 0 when it finds the limits of g(x) and h(x). Clearly, if the limit of u(x) is squeezed as 1 ≤ lim u(x) ≤ 1, then that limit is 1 — I don't dispute that. I'm just confused about why they plugged 0 into x after saying that x was not zero. More than likely I'm simply reading it incorrectly, so: what am I missing?
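To show what I mean about the limits of the bounding functions, here is a quick numeric check I tried. The book's exact g(x) and h(x) aren't quoted above, so these are placeholder bounds with the behaviour I'd expect: both approach 1 as x → 0, and x itself is always taken close to 0 but never equal to 0.

```python
# Placeholder bounds for the squeeze -- NOT the book's exact functions,
# just hypothetical ones that also tend to 1 as x -> 0.
def g(x):
    return 1 - x**2 / 4   # hypothetical lower bound

def h(x):
    return 1 + x**2 / 2   # hypothetical upper bound

# Evaluate at small nonzero x; note we never actually set x = 0.
for x in [0.1, 0.01, 0.001]:
    print(f"x={x}: g(x)={g(x):.8f}  h(x)={h(x):.8f}")
```

Both columns of output head toward 1, which matches the book's conclusion — my question is only about the step where 0 seems to get substituted.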