Even more generally, a tensor is a sort of mathematical machine that takes one or more vectors and produces another vector or number.
A tensor of rank (0,2), often just called rank 2, is a machine that chews up two vectors and spits out a number.
A tensor of rank (0,3) takes three vectors and produces a number.
A tensor of rank (1,2) takes two vectors and produces another vector.
Hopefully you see the pattern.
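(If you like seeing things numerically, here's a quick NumPy sketch of the pattern -- my own illustration, nothing standard about the names. A (1,2) tensor in three dimensions is just a 3x3x3 array of components, and contracting it with two vectors leaves a vector:)

import numpy as np

# A (1,2) tensor in 3 dimensions: a 3x3x3 array of components T^a_{bc}.
rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3, 3))
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Feed it two vectors: contract away the two lower indices,
# w^a = T^a_{bc} u^b v^c, and one free index -- a vector -- remains.
w = np.einsum('abc,b,c->a', T, u, v)
print(w.shape)  # (3,) -- two vectors in, one vector out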
You actually already know what a (1,1) tensor is -- it's nothing more than a good ol' matrix. It accepts one vector and produces another vector.
If you're working in three dimensions, a (1,1) tensor can be represented by its nine components. Here's a simple (1,1) tensor.
T = \left(
\begin{array}{ccc}
1 & 0 & 0\\
0 & 1 & 0\\
0 & 0 & 1
\end{array}
\right)
You already know what this guy does -- it takes a vector and gives you back the exact same vector. It's the identity matrix, of course. You would use it as such:
\vec v = T \vec v
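(If you want to check that claim numerically, here's a tiny NumPy sketch -- nothing fancy, just the identity tensor doing its thing:)

import numpy as np

T = np.eye(3)                   # the (1,1) identity tensor as a 3x3 matrix
v = np.array([2.0, -1.0, 7.0])  # any vector will do
print(np.allclose(T @ v, v))    # True: T takes v and gives back v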
If the world were full of nothing but (1,1) tensors, it'd be pretty easy to remember what T means. However, there are many different kinds of tensors, so we need a notation that will help us remember what kind of tensor T is. We normally use something called "abstract index notation," which sounds more difficult than it is. Here's our (1,1) tensor, our identity matrix, laid out in all its regalia:
T^a_b
The a and b are referred to as indices. The one on the bottom indicates the tensor takes one vector as "input." The one on the top indicates it produces one vector as "output."
Tensors don't have to accept just vectors or produce just vectors -- vectors are themselves just a type of tensor. A vector, written v^a, is a tensor of type (1,0): no lower indices (no vector inputs), one upper index (one vector of "output"). In full generality, tensors can accept other tensors and produce new tensors. Here are some more complicated tensors:
R^a{}_{bcd}\ \ \ \ G_{ab}
The second one, G_{ab}, is a neat one to understand. You should already understand from its indices that it is a type (0,2) tensor, which means it accepts two vectors as input and produces a number as output. It's called the metric tensor, and it represents an operation you already know very well -- the dot product of two vectors!
In normal Euclidean 3-space, G_{ab} is just the identity matrix. You can easily demonstrate the following statement is true by doing the matrix multiplication by hand:
\vec u \cdot \vec v = G_{ab}\, u^a v^b
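(Again, a quick NumPy sanity check of that claim, with made-up numbers:)

import numpy as np

G = np.eye(3)                          # the Euclidean metric
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
print(np.dot(u, v))                    # the familiar dot product: 32.0
print(np.einsum('ab,a,b->', G, u, v))  # G_{ab} u^a v^b: also 32.0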
The metric tensor is more complicated in different spaces. For example, in curved space, it's certainly not the identity matrix anymore -- which means the vector dot product is no longer what you're used to either when you're near a black hole. Tensors are used extensively in a subject called differential geometry, which deals with, among other topics, curved spaces. General relativity, Einstein's very successful theory which explains gravity as the curvature of space, is cast in the language of differential geometry.
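(Here's a modest example of a non-identity metric. It's not a black hole -- just the flat plane in polar coordinates, where the metric in the (r, theta) coordinate basis is diag(1, r^2) -- but it already shows the "multiply the components and add" rule failing:)

import numpy as np

def polar_metric(r):
    # g_{ab} for the flat plane in polar coordinates
    return np.diag([1.0, r**2])

u = np.array([1.0, 0.5])   # components (u^r, u^theta)
v = np.array([2.0, 0.25])  # components (v^r, v^theta)

for r in (1.0, 10.0):
    G = polar_metric(r)
    print(r, np.einsum('ab,a,b->', G, u, v))
# Same component vectors, different inner products at different r:
# 2.125 at r = 1, but 14.5 at r = 10.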
So there you have it: tensors are the generalization of vectors and matrices and even scalars. (Scalars, by the way, are considered to be type (0,0) tensors.)
I should mention that not all mathematical objects with indices are tensors -- a tensor is a specific sort of object that has the transformation properties described by others in this thread. To be called a tensor, an object must transform like a tensor. Don't worry, though; you're not going to run into such objects very often.
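(If you want to see what "transforms like a tensor" means concretely for the (1,1) case: under a change of basis A, vector components change as v' = A^{-1} v and (1,1) tensor components as T' = A^{-1} T A, so the machine's output doesn't depend on the basis you chose. A quick NumPy sketch with a random example:)

import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))  # components of some (1,1) tensor
v = rng.standard_normal(3)
A = rng.standard_normal((3, 3))  # a random basis change (invertible
A_inv = np.linalg.inv(A)         # with probability 1)

T_new = A_inv @ T @ A            # how (1,1) components transform
v_new = A_inv @ v                # how vector components transform
print(np.allclose(T_new @ v_new, A_inv @ (T @ v)))  # True: same output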
- Warren