# Homework Help: Vector Space of Matrices

1. Oct 31, 2004

### AKG

If I have a finite dimensional inner product space $V = M_{n \times n}(\mathbb{R})$, then one basis of V is the set of n² (n x n)-matrices, $\beta = \{E_1, \dots , E_{n^2}\}$ where $E_i$ has a 1 in the $i^{th}$ position, and zeroes elsewhere (and by $i^{th}$ position, I mean that the first position is the top-left, the second position is just to the right of the top-left, and the last position is the bottom-right). Since these matrices are linearly independent and span V, they certainly form a basis, and since there are n² of them, dim(V) = n². Therefore, if I have some linear operator T on V, then $A = [T]_{\beta}$ is an (n² x n²)-matrix, right? However, if v is some element of V, then T(v) = Av, but Av is not even possible, since it involves multiplying two square matrices of different dimension. Now, if I had made a mistake earlier, then maybe A is supposed to be an (n x n)-matrix. But that doesn't seem right.
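For concreteness, the setup above can be sketched numerically for n = 2 (a minimal sketch; the operator T(X) = MX for an arbitrary fixed M is my own illustrative choice, and the coordinate map flattens a matrix row-major, matching the basis ordering described above). Building A = [T]_β column by column from the images of the basis matrices shows that A is n² × n² and multiplies the length-n² coordinate vector [v]_β, not the matrix v itself:

```python
import numpy as np

n = 2
rng = np.random.default_rng(0)

# A sample linear operator on V = M_{n x n}(R); here T(X) = M X
# for a fixed matrix M (an arbitrary choice for illustration).
M = rng.standard_normal((n, n))
def T(X):
    return M @ X

# Basis E_1, ..., E_{n^2}: E_i has a 1 in the i-th position (row-major),
# so the coordinate vector [X]_beta is just X flattened row-major.
def coords(X):
    return X.flatten()          # length n^2

# Column i of A = [T]_beta is [T(E_i)]_beta.
A = np.zeros((n*n, n*n))
for i in range(n*n):
    E = np.zeros((n, n))
    E.flat[i] = 1.0
    A[:, i] = coords(T(E))

print(A.shape)  # (4, 4), i.e. n^2 x n^2

# A acts on coordinate vectors of length n^2, not on v directly:
v = rng.standard_normal((n, n))
assert np.allclose(A @ coords(v), coords(T(v)))
```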

My textbook proves:

If V is an N-dimensional vector space with an ordered basis $\beta$, then $[I_V]_{\beta} = I_N$, where $I_V$ is the identity operator on V. Now, in our case, N = n², but if I was wrong before, and in the previous example, A should have been an (n x n)-matrix, then the equality above essentially states that an (n x n)-matrix is equal to an (n² x n²)-matrix. Where have I (or my book) made a mistake?
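The textbook's identity can be checked directly for n = 2 (a minimal sketch under the same row-major coordinate convention as above): applying the column-by-column construction of a representing matrix to the identity operator I_V does produce the N × N identity with N = n², not an n × n matrix.

```python
import numpy as np

n = 2
N = n * n

# [I_V]_beta: column i is the coordinate vector of I_V(E_i) = E_i itself.
A = np.zeros((N, N))
for i in range(N):
    E = np.zeros((n, n))
    E.flat[i] = 1.0
    A[:, i] = E.flatten()      # [E_i]_beta, row-major

# The construction yields I_N with N = n^2.
assert np.allclose(A, np.eye(N))
```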

2. Oct 31, 2004

### Atheist

>> However, if v is some element of V, then T(v) = Av, but Av is not even possible, since
>> it involves multiplying two square matrices of different dimension.

I'm not completely sure I understood your question, but maybe I'm guessing right about where your problem lies:

Consider a linear operator O on R^n. It can be written in the form O(a) = Ma for any vector a, where M is an n x n matrix. Certainly M and a don't have the same shape (they are tensors of different order), yet you wouldn't object that this can't work, because you know how to interpret the equation.
In tensor notation, using the components of the vector a, the above is written like this:
$$T(a)^\mu = \sum_{\nu=1}^{n} M^{\mu}{}_{\nu}\, a^\nu = b^\mu$$
The last "=" was put in to show that the result b is a vector of R^n again.

Rewriting your "T(v) = Av" in tensor terms, it would be:
$$T(v)^{\alpha\beta} = \sum_{\mu=1}^{n} \sum_{\nu=1}^{n} A^{\alpha\beta}{}_{\mu\nu}\, v^{\mu\nu} = b^{\alpha\beta}$$
So the equation is defined and the result is an element of V.

Sidenotes:
- Av = A*v is not defined on its own; you have to say what it is supposed to mean. The tensor notation above does exactly that.
- I didn't understand what your textbook says, because I know neither the notation nor what $$I_N$$ is. So it's quite possible that I completely missed your question.

Last edited: Oct 31, 2004