This is very elementary, but I'm not following the logic at all. I'm trying to explain it to myself after reading the book's definition, but still no luck.

OK, so the definition of limit is: given e > 0, there exists a d > 0 such that if x belongs to A and 0 < |x - c| < d, then |f(x) - L| < e. (Here e = epsilon and d = delta.)

So I'm explaining this to myself as: "If I take any e > 0, I can find a d > 0 such that if x is in A and |x - c| < d (with x not equal to c), then |f(x) - L| < e."

When I do a problem, such as "prove that the limit of x as x approaches c is c", I'm having trouble following how the book solves it. The book does:

Let g(x) = x for all x in R. If e > 0, let d = e. Then if 0 < |x - c| < d, we have |g(x) - c| = |x - c| < e. Since e > 0 was arbitrary, this proves it.

===================================

So I try to explain this to myself: first, the book defines g(x) = x, then it goes through the definition. It sets d = e. So now, if 0 < |x - c| < d, then |x - c| < e. So why does that prove it? I don't know why I can't get it... I think I'm missing something simple here.
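To at least see the book's choice d = e in action, I tried a quick numerical sanity check (this is just my own sketch, not a proof; `g` and `check_limit` are names I made up). It samples points x with 0 < |x - c| < d and checks that |g(x) - c| < e every time:

```python
import random

def g(x):
    """The function from the book's example: g(x) = x."""
    return x

def check_limit(c, eps, trials=1000):
    """Check that the book's choice delta = eps forces
    |g(x) - c| < eps whenever 0 < |x - c| < delta."""
    delta = eps  # the book's choice of delta
    for _ in range(trials):
        # sample an x with |x - c| < delta
        x = c + random.uniform(-delta, delta)
        if x == c:
            continue  # the definition requires 0 < |x - c|
        if not abs(g(x) - c) < eps:
            return False  # a counterexample would break the proof
    return True

print(check_limit(c=3.0, eps=0.01))  # True: every sampled x satisfies the bound
```

Of course this only tests finitely many points for one e, while the definition quantifies over every e > 0; the actual proof is that the algebra |g(x) - c| = |x - c| < d = e works for all such x at once.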