So are you talking about a matrix in 7 dimensions or a 7-dimensional array of numbers? That's a little confusing. By a 7D matrix I mean a 7*7 array of numbers. By a 7-dimensional array of numbers I mean an n*n*n*n*n*n*n array of numbers for some positive integer n.
A matrix is a linear object that eats a vector and returns a vector. So M.v = u can be written using indices as M_ij v_j = u_i, where the index j is summed over the dimension of the space, i.e. 1 to n, and the index i is free to take any value in the range 1 to n. See http://en.wikipedia.org/wiki/Einstein_summation_convention
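To make the index picture concrete, here's a small numpy sketch (sizes and values are my own illustrative choices, not from the text) comparing the explicit sum over j with the built-in matrix product:

```python
import numpy as np

# A matrix eating a vector: (M.v)_i = sum over j of M_ij v_j.
# n = 3 here is just an illustrative size.
n = 3
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))
v = rng.standard_normal(n)

# Explicit index sum, mirroring M_ij v_j = u_i:
u = np.array([sum(M[i, j] * v[j] for j in range(n)) for i in range(n)])

# The same contraction via the built-in matrix product:
assert np.allclose(u, M @ v)
```

The nested comprehension is deliberately literal; in practice you would just write `M @ v`.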
Equivalently a matrix is a multilinear object that eats two vectors and returns a scalar: u.M.v = u_i M_ij v_j.
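The two-vectors-in, scalar-out view can be checked the same way, again with illustrative sizes of my own choosing:

```python
import numpy as np

# u.M.v = u_i M_ij v_j: the matrix as a machine that eats two
# vectors and returns a single number.
n = 3
rng = np.random.default_rng(1)
M = rng.standard_normal((n, n))
u = rng.standard_normal(n)
v = rng.standard_normal(n)

# Explicit double sum over both indices:
s = sum(u[i] * M[i, j] * v[j] for i in range(n) for j in range(n))

# Equivalent one-liner:
assert np.isclose(s, u @ M @ v)
```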
Note that linear means that M.(u + x*v) = M.u + x*M.v for x a scalar and u, v vectors. This is extended naturally to multilinearity.
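That linearity property can be verified numerically; this is just a spot check with random values, not a proof:

```python
import numpy as np

# Checking linearity: M.(u + x*v) should equal M.u + x*M.v
# for any scalar x and vectors u, v.
rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
u = rng.standard_normal(3)
v = rng.standard_normal(3)
x = 2.5  # an arbitrary scalar

lhs = M @ (u + x * v)
rhs = M @ u + x * (M @ v)
assert np.allclose(lhs, rhs)
```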
Higher order tensors (http://en.wikipedia.org/wiki/Tensor) are multilinear objects that eat more vectors and return a scalar.
They can be written with more indices Tijk... and are the higher-dimensional arrays that I mentioned above.
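For example, a rank-3 tensor T_ijk is exactly an n*n*n array; eating three vectors means summing over all three indices. A sketch, with n = 3 as an illustrative size:

```python
import numpy as np

# A rank-3 tensor T_ijk eats three vectors and returns a scalar:
# s = T_ijk u_i v_j w_k, summed over i, j and k.
n = 3
rng = np.random.default_rng(3)
T = rng.standard_normal((n, n, n))  # the n*n*n array of numbers
u, v, w = (rng.standard_normal(n) for _ in range(3))

# Explicit triple sum:
s = sum(T[i, j, k] * u[i] * v[j] * w[k]
        for i in range(n) for j in range(n) for k in range(n))

# The same contraction written with einsum:
assert np.isclose(s, np.einsum('ijk,i,j,k->', T, u, v, w))
```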
(One thing I've not mentioned is the difference between contravariant and covariant indices, which is related to the vector space and its dual - but you probably don't need to worry about this.)
When working with explicit realisations of tensors and matrices, sums over the indices become almost unavoidable. These sums can be optimised and given notational conveniences like the dot product in various computer languages and libraries such as Matlab, Mathematica, numpy, etc.
And when working with them by hand, the Einstein (implicit) summation convention is very handy. As are various inner product and dot product notations.
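In numpy the summation convention even has a direct transcription, np.einsum, where repeated indices in the subscript string are summed just as they would be on paper:

```python
import numpy as np

# np.einsum mirrors the Einstein convention: a repeated index
# in the subscript string is summed over.
rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3))
u = rng.standard_normal(3)
v = rng.standard_normal(3)

Mv  = np.einsum('ij,j->i', M, v)      # M_ij v_j = u_i
uMv = np.einsum('i,ij,j->', u, M, v)  # u_i M_ij v_j, a scalar
trM = np.einsum('ii->', M)            # M_ii, the trace

assert np.allclose(Mv, M @ v)
assert np.isclose(uMv, u @ M @ v)
assert np.isclose(trM, np.trace(M))
```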