here is another attempt, only 4 months ago:
vectors are to tensors as linear polynomials are to higher degree polynomials, except that with tensors multiplication is also allowed to be non-commutative.
for example the dot product, a second degree polynomial in the entries of the two vector arguments, is a symmetric (i.e. commutative) tensor.
"short course on tensors:
let elements of R^2 be called vectors, and write them as length 2 column vectors.
then we can define "covectors" as row vectors of length 2. these row vectors may be considered as linear functions on vectors, i.e. as linear functions f:R^2-->R, acting by matrix multiplication: a 1 by 2 row times a 2 by 1 column is a number.
some people call vectors tensors of rank 1 and type (1,0), and call covectors tensors of rank 1 and type (0,1).
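to see this concretely, here is a quick sketch in python with numpy (the particular numbers are made up for illustration):

import numpy as np

v = np.array([[1.0], [2.0]])   # a vector: a length 2 column
f = np.array([[3.0, 4.0]])     # a covector: a length 2 row

# the covector acts on the vector by matrix multiplication,
# producing a 1 by 1 matrix, i.e. a number: 3*1 + 4*2 = 11
print(f @ v)                   # [[11.]]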
now consider two covectors f and g. they yield a bilinear function f(tensor)g of two vector variables, by multiplication, i.e. f(tensor)g (v,w) = f(v)g(w), a number.
this product is not commutative, since (g(tensor)f)(v,w) = g(v)f(w), which is in general a different number.
for the same reason f(tensor)g itself gives a different answer when applied to (w,v) than when applied to (v,w).
some people call f(tensor)g a rank 2 tensor of type (0,2).
if we add up several such products, e.g. f(tensor)g + h(tensor)k, we still have a bilinear function of two vector variables, hence another rank 2 tensor of type (0,2).
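here is a minimal numpy sketch of all this (the function name tensor and the numbers are my own choices, just for illustration):

import numpy as np

f = np.array([3.0, 4.0])       # covectors, written as flat arrays here
g = np.array([5.0, 6.0])
v = np.array([1.0, 2.0])       # vectors
w = np.array([7.0, 8.0])

def tensor(a, b):
    # the tensor product of two covectors: a bilinear function of (v, w)
    return lambda v, w: (a @ v) * (b @ w)

fg = tensor(f, g)
gf = tensor(g, f)

print(fg(v, w))   # f(v)*g(w) = 11 * 83 = 913
print(gf(v, w))   # g(v)*f(w) = 17 * 53 = 901, different: not commutative
print(fg(w, v))   # f(w)*g(v) = 53 * 17 = 901, the same asymmetry

# a sum of such products is still bilinear, hence another (0,2) tensor
h, k = np.array([1.0, 0.0]), np.array([0.0, 1.0])
s = lambda v, w: fg(v, w) + tensor(h, k)(v, w)
print(s(v, w))    # 913 + 1*8 = 921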
now we could also consider a product v(tensor)f of a vector and a covector. if we apply this to a vector w we get a vector: namely v times the scalar f(w).
again a sum of such things is another: v(tensor)f + u(tensor)g, applied to w, gives f(w) times v + g(w) times u.
some people call such a thing a rank 2 tensor of type (1,1).
since as a function on the vector w this object is linear, it can be represented as a 2 by 2 matrix, whose columns are the vector values taken by this function at the standard basis vectors (1,0) and (0,1).
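concretely, the matrix of v(tensor)f is just the column v times the row f, i.e. the outer product; a quick numpy sketch (numbers again made up):

import numpy as np

v = np.array([1.0, 2.0])       # a vector
f = np.array([3.0, 4.0])       # a covector
w = np.array([5.0, 6.0])       # another vector to feed in

# v(tensor)f applied to w: v scaled by the number f(w)
print((f @ w) * v)             # f(w) = 39, so [39., 78.]

# the same (1,1) tensor as a 2 by 2 matrix: column v times row f
M = np.outer(v, f)
print(M @ w)                   # [39., 78.], the same answer

# its columns are the values at the standard basis vectors (1,0) and (0,1)
print(M[:, 0], M[:, 1])        # f((1,0))*v = 3*v and f((0,1))*v = 4*v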
the ordinary dot product is a tensor of rank 2 and type (0,2), since it takes two vectors and gives out a number, and is bilinear.
i.e. if f is the linear function taking the vector (x,y) to x, and g is the linear function taking the vector (x,y) to y, then the dot product equals f(tensor)f + g(tensor)g.
i.e. applied to (v,w), where v = (v1,v2) and w = (w1,w2) are vectors, it gives us v1w1 + v2w2. thus it is symmetric.
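checking that identity numerically (a minimal sketch, with my own choice of v and w):

import numpy as np

f = np.array([1.0, 0.0])       # f(x, y) = x
g = np.array([0.0, 1.0])       # g(x, y) = y

def dot_as_tensor(v, w):
    # f(tensor)f + g(tensor)g, applied to the pair (v, w)
    return (f @ v) * (f @ w) + (g @ v) * (g @ w)

v = np.array([2.0, 3.0])
w = np.array([4.0, 5.0])

print(dot_as_tensor(v, w))     # v1*w1 + v2*w2 = 8 + 15 = 23
print(np.dot(v, w))            # 23.0, the ordinary dot product
print(dot_as_tensor(w, v))     # 23.0 again: this tensor is symmetric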
the reason for distinguishing the vector variables from the covector variables is that under a linear transformation T:R^2-->R^2, the vectors transform by v goes to Tv, and the covectors transform by f goes to fT, i.e. the multiplication occurs on the other side.
or if you insist on writing a row vector, i.e. a covector, as a column vector, then you must multiply it (from the left) by T* = transpose of T.
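a quick numerical check that the multiplication really lands on the other side, and that the column version uses the transpose (T, v, and f below are arbitrary choices):

import numpy as np

T = np.array([[1.0, 2.0],
              [3.0, 4.0]])     # some linear transformation
v = np.array([[5.0], [6.0]])   # a vector (column)
f = np.array([[7.0, 8.0]])     # a covector (row)

# vectors transform on the left, covectors on the right:
print(T @ v)                   # the transformed vector Tv
print(f @ T)                   # the transformed covector fT

# writing the covector as a column instead, we must use the transpose:
print((f @ T).T)               # the same numbers as...
print(T.T @ f.T)               # ...multiplying the column f.T by T* = T.T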
multiplying more vectors and covectors together, and adding up, gives higher rank tensors.
what i have described are tensor spaces "at a point". just as the tangent space to a sphere, say at one point, is a vector space, so also there are tensor spaces at every point of the sphere.
then just as we can consider families of tangent vectors, or tangent fields, we can consider tensor fields, one tensor at each point.
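as a toy illustration of "one tensor at each point", here is a (0,2) tensor field on R^2, written as a function assigning a 2 by 2 matrix to each point (the particular field is made up):

import numpy as np

def field_at(p):
    # a made-up (0,2) tensor field on R^2: at the point p = (x, y)
    # it returns the 2 by 2 matrix of a bilinear form at that point
    x, y = p
    return np.array([[1.0 + x * x, 0.0],
                     [0.0, 1.0 + y * y]])

# the same field evaluated at two different points gives two tensors:
G = field_at((0.0, 0.0))       # the identity matrix here
H = field_at((1.0, 2.0))       # a different tensor at a different point

v = np.array([1.0, 1.0])
w = np.array([2.0, 3.0])
print(v @ G @ w)               # the bilinear form at (0,0): 2 + 3 = 5
print(v @ H @ w)               # at (1,2): 2*2 + 5*3 = 19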
was that simple enough for you? that's about as simple as i can make it, and still be correct."
I might add that some people made it seem even simpler by omitting the meanings of the symbols and giving only the rules for their manipulation.