Speaking of Applicable Geometry
How much do you know about the relationship between Euclidean geometry and mean/variance?
Your question really concerns the values taken by a quadratic form. Many statistical manipulations (and many useful properties of normal distributions) arise in a geometrically natural way from manipulating quadratic forms by orthogonal transformations and orthoprojections, together with some notions from affine geometry such as convexity. For example, taking the mean of n variables x_1, \, x_2, \dots, x_n, where we think of this data as the vector \vec{x} = x_1 \, \vec{e}_1 + \dots + x_n \, \vec{e}_n, corresponds to taking the orthoprojection (defined using the standard Euclidean inner product) onto the one-dimensional subspace spanned by \vec{e}_1 + \vec{e}_2 + \dots + \vec{e}_n. If we adopt a new orthonormal basis including the unit vector \vec{f}_n = \frac{1}{\sqrt{n}} \, \left( \vec{e}_1 + \vec{e}_2 + \dots + \vec{e}_n \right), this orthoprojection amounts to forgetting all but the last component, \sqrt{n} \, \overline{x}, which agrees with the arithmetic mean up to a constant multiple.
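If it helps, here is a quick numerical check of that claim, a sketch using NumPy (the variable names are my own, not from any text):

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0])  # the data vector x = x_1 e_1 + ... + x_n e_n
n = len(x)

# Unit vector f_n = (e_1 + ... + e_n) / sqrt(n)
f = np.ones(n) / np.sqrt(n)

# Orthoprojection of x onto the subspace spanned by f: (x . f) f
proj = (x @ f) * f

# The projection is the constant vector whose entries are the mean of x
assert np.allclose(proj, np.full(n, x.mean()))

# The component of x along f_n is sqrt(n) times the arithmetic mean
assert np.isclose(x @ f, np.sqrt(n) * x.mean())
```

The second assertion is the "forget all but the last component" step: in the rotated basis the mean survives as a single coordinate, rescaled by \sqrt{n}.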
See M. G. Kendall, A Course in the Geometry of n Dimensions, Dover reprint, and then try the same author's book Multivariate Analysis.
I must add a caution: do you see why principal component analysis (PCA) is essentially a method for "lying with statistics"? That is, the geometric (or if you prefer, linear algebraic) manipulations of your data set are mathematically valid, but the statistical interpretation is almost always extremely dubious. Fortunately, my remark about the role of Euclidean geometry in mathematical statistics holds true for many more legitimate statistical methods, some discussed in the first book by Kendall cited above.
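To see the "mathematically valid" half of that remark concretely, here is a minimal sketch of PCA as nothing more than an orthogonal change of basis (my own illustration on random data, not drawn from Kendall):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # 200 observations of 3 variables
Xc = X - X.mean(axis=0)         # center each variable

# Eigendecomposition of the sample covariance matrix gives an
# orthonormal basis (the principal axes, columns of Q)
cov = (Xc.T @ Xc) / (len(Xc) - 1)
eigvals, Q = np.linalg.eigh(cov)

# Rotating the data into this basis is a rigid orthogonal transformation
scores = Xc @ Q

# Total variance is invariant under the rotation (trace invariance);
# PCA merely reapportions it among the new coordinates
assert np.isclose(scores.var(axis=0, ddof=1).sum(),
                  Xc.var(axis=0, ddof=1).sum())
```

Nothing in this computation is statistically suspect: it is pure Euclidean geometry. The dubious step is the interpretive one that follows, when the high-variance directions are declared to be the "meaningful" ones.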