The definition of continuity should be right in your textbook (of course, textbooks have a way of turning common sense into a foreign language).
There are three tests for continuity, which might mean a little more to you (those should also be in your textbook; they're restated in symbols right after the list).
To be continuous at some point (we'll call it c):
f(c) has to exist. For example, if f(x) = 1/x and 0 (one possible value for c) is substituted for x, the result is undefined (i.e. for c = 0, f(c) does not exist).
The limit of f(x) must exist as x approaches c.
f(c) must equal the limit of f(x) as x approaches c.
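If it helps, the three tests collapse into one compact statement. The equality below only makes sense when both sides exist, which is exactly what the first two tests check:

```latex
% Continuity of f at a point c, written as a single equality.
% Both f(c) and the limit must exist for the statement to hold.
f \text{ is continuous at } c \iff \lim_{x \to c} f(x) = f(c)
```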
So, if f(x) = 1/x, then it is continuous at c = 1: 1/1 equals 1, so f(c) exists. The limit of f(x) as x approaches 1 is also 1. Since f(c) and the limit of f(x) as x approaches c both equal 1, f(x) is continuous at x = 1.
If c = 0, then f(c) doesn't exist, and the limit of f(x) as x approaches 0 doesn't exist either, rendering the third test moot (and impossible to conduct in this case). In fact, as soon as any one of the tests fails, you can stop.
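If you like seeing the tests run, here is a quick numerical sketch in Python. The helper name is_continuous_at, the step size, and the tolerance are all my own choices, and sampling near c only approximates the limit, so treat this as an illustration of the three tests rather than a proof:

```python
# A rough numerical version of the three tests for f(x) = 1/x.
# Sampling near c only approximates the limit; it does not prove it exists.

def is_continuous_at(f, c, step=1e-6, tol=1e-4):
    # Test 1: f(c) must exist.
    try:
        fc = f(c)
    except ZeroDivisionError:
        return False          # fail fast: no need to run the other tests

    # Test 2: the limit must exist -- approximate it from both sides
    # and require the two one-sided values to agree.
    left, right = f(c - step), f(c + step)
    if abs(left - right) > tol:
        return False

    # Test 3: the limit must equal f(c).
    return abs(fc - (left + right) / 2) <= tol

f = lambda x: 1 / x
print(is_continuous_at(f, 1))   # True:  f(1) = 1 and the limit is 1
print(is_continuous_at(f, 0))   # False: f(0) is undefined, so test 1 fails
```

Note how the check bails out as soon as one test fails, just like you would on paper.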