What I think this comes down to is the dichotomy between calculating a quantity and understanding the meaning of that quantity. I claim that understanding allows calculation, but not vice versa.
For instance, on page 14 of his nice notes on GR, Sean Carroll gives the transformation law, (1.51) in his numbering, for tensors and then says: "Indeed a number of books like to define tensors as collections of numbers transforming according to (1.51). While this is operationally useful, it tends to obscure the deeper meaning of tensors as geometric entities with a life independent of any chosen coordinate system." On page 15 he describes the scalar or dot product as a familiar example of a tensor of type (0,2).
I am going to go out on a limb here and try to make a trivial calculation, beginning from a conceptual definition of a tensor of type (0,2) as a bilinear map from pairs of tangent vectors to numbers. I.e. I will try to derive the transformation law from the conceptual meaning.
A simple example of such a tensor is a scalar product, i.e. a symmetric, bilinear mapping from pairs of tangent vectors to scalars. Such a thing is often denoted by brackets (or a dot) taking the pair of tangent vectors v,w to the number <v,w>. Now if f:M-->N is a differentiable mapping from one manifold M to another manifold N, such as a coordinate change, then one can pull back a scalar product from N to M using the derivative of f.
I.e. if u,z are two tangent vectors at a point p of M, then applying the derivative of f to them takes them to two tangent vectors at the image point f(p) in N, where we can apply <,> to them. I.e. if <,> is the scalar product on N, then the pulled back scalar product f*(<,>) acts on u,z by the obvious, only possible law: f*(<u,z>) = <f'(u),f'(z)>, where f' is the derivative of f, given as a matrix of partials of f with respect to local coordinates in M and N. For example, we could denote this matrix as f' = [dyi/dxj].
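To make this concrete, here is a minimal numerical sketch in Python, with a toy map f and a scalar product matrix A that I made up purely for illustration (none of these names come from Carroll's notes):

```python
import numpy as np

# Toy coordinate change f : M --> N in two dimensions;
# hypothetical example: y1 = x1 + x2^2, y2 = x1*x2.
def f(x):
    return np.array([x[0] + x[1]**2, x[0] * x[1]])

def jacobian(x):
    # The derivative f' = [dyi/dxj], computed by hand for this f.
    return np.array([[1.0,  2*x[1]],
                     [x[1], x[0]]])

# A scalar product <,> on N, expressed in local coordinates
# as a symmetric matrix A.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

def pullback(u, z, x):
    # f*(<u,z>) = <f'(u), f'(z)>: push the tangent vectors u,z at x
    # forward with the derivative, then apply <,> at the image point f(x).
    J = jacobian(x)
    return (J @ u) @ A @ (J @ z)
```

So pullback(u, z, p) is just the only number it could possibly be, and the matrix computation that follows simply unwinds it.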
Now suppose we express the scalar product in N as a matrix, i.e. in local coordinates as A = [akl], sorry about the lack of subscripts. Imagine k and l are subscripts on a.
Then if we want to express the pulled back scalar product as a matrix, we just see what it does to the vectors u,z as follows:
f*(<u,z>) = <f'(u),f'(z)> = [f'(u)]* [A] [f'(z)] = [u]* [f']* [A] [f'] [z], where now everything is thought of as a matrix, and star means transpose of the matrix.
Well, since the matrix of partials f' is just [dyi/dxj], and A is [akl], we just multiply out the matrices to get the matrix of the pulled back scalar product as [f*(<,>)] = [f']* [A] [f'] = the matrix whose i,j entry is akl (dyk/dxi)(dyl/dxj), summed over k,l.
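If you like, you can check this agreement numerically; here is a short sketch, where random matrices stand in for an arbitrary A and an arbitrary matrix of partials:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
J = rng.standard_normal((n, n))   # stand-in for [dyk/dxi]: row k, column i
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                 # symmetric, like a scalar product

# Matrix form of the pulled back scalar product: [f']* [A] [f'].
B_matrix = J.T @ A @ J

# Index form: the i,j entry is akl (dyk/dxi)(dyl/dxj), summed over k,l.
B_index = np.einsum('kl,ki,lj->ij', A, J, J)

assert np.allclose(B_matrix, B_index)  # the two forms agree
```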
Now this is exactly the transformation law Carroll calls (1.51) on page 14 of his notes and everyone else also calls the transformation law for a tensor of type or rank (0,2) in the various web sources given here and above.
Notice too, if you can imagine my subscripts, that this satisfies the summation convention for subscripts. But I am not dependent on that, because I know what it means, so I don't care whether I can see the subscripts or not, whereas someone dependent on seeing where the indices are may not be able to follow this.
Anyone who knows conceptually what a tensor is would immediately realize that a homogeneous polynomial of degree d in the entries of a tangent vector is a (symmetric) tensor of type (0,d), and that the components of the tensor are merely the coefficients of the polynomial (written as a non-commutative polynomial, i.e. with a separate coefficient for xy and for yx). It follows of course that they transform via a d-dimensional array of size n, where n is the dimension of the manifold, i.e. by a collection of n^d numbers (the case d = 2 is sketched a few lines below).
Subscript enthusiasts write this as a symbol like T, with d subscripts.
That is an extremely cumbersome way to discuss tensors in my opinion, and it leaves me at the mercy of the typesetter, whereas knowing what they mean always bails me out eventually.
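For what it's worth, here is the d = 2 case of that polynomial remark in the same sketchy Python style, with a quadratic I made up on the spot:

```python
import numpy as np

# Hypothetical example: the homogeneous degree-2 polynomial
#   q(v) = 2*v1^2 + 4*v1*v2 + 3*v2^2
# on tangent vectors v = (v1, v2).  Splitting the cross term into a
# separate coefficient for v1*v2 and for v2*v1 (2 each), the coefficients
# form the symmetric matrix T, with T[i,j] multiplying vi*vj:
T = np.array([[2.0, 2.0],
              [2.0, 3.0]])

def q(v):
    return v @ T @ v   # sum over i,j of T[i,j]*vi*vj

# Under a coordinate change with matrix of partials J (made up here),
# the n^2 coefficients transform exactly like a (0,2) tensor:
J = np.array([[1.0, 2.0],
              [0.0, 1.0]])
T_new = J.T @ T @ J

# Check: the transformed polynomial on w equals the original on J @ w.
w = np.array([1.0, -2.0])
assert np.isclose(w @ T_new @ w, q(J @ w))
```

For higher d the same thing works with a d-dimensional array of coefficients in place of the matrix T, but the idea is already visible here.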
I actually wrote a graduate algebra book once, including linear and multilinear algebra, and I discovered to my amusement that I could actually write down tensor products as matrices, and so on, just from the definitions, although I had never needed to do so before in my professional life.
peace and love,
roy