# Components of the trace operation

1. Dec 13, 2011

### fa2209

I'm currently reading "Introduction to Tensors and Group Theory for Physicists" and I'm stuck on a question it poses about dual spaces.

The author gives the trace as an example of a linear functional on the vector space M_n(ℝ) (the n x n matrices with real entries) and then asks how one would find the components of the element of the dual space that takes an n x n matrix to a real number: the trace.

I have no idea how to solve this problem; any help would be much appreciated.

2. Dec 13, 2011

### Fredrik

Staff Emeritus
Start with an arbitrary member of V* acting on an arbitrary member of V, in whatever notation you prefer, e.g. ρ(x). Then try two different ideas: 1. Expand x in a basis. 2. Expand x in a basis, and ρ in the dual basis of that first basis. Then compare the results. This will give you the formula for components of dual vectors.
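For concreteness, a sketch of how that comparison might go (the component names $x^i$ and $\rho_i$ are just a choice of notation for this example): expanding $x = \sum_i x^i e_i$ in a basis $\{e_i\}$ gives

$$\rho(x) = \sum_i x^i \rho(e_i),$$

while also expanding $\rho = \sum_j \rho_j e^j$ in the dual basis gives

$$\rho(x) = \sum_{i,j} \rho_j x^i\, e^j(e_i) = \sum_i \rho_i x^i.$$

Comparing the two expressions gives $\rho_i = \rho(e_i)$: the components of a dual vector are its values on the basis vectors.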

Then choose a basis for the vector space of matrices, and use the formula with the trace and the dual basis. Note that the components of a linear functional are always components with respect to some basis, so it's impossible to do this without choosing a basis.
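To sketch what that step looks like in general (with $\{E_a\}$ denoting whatever matrix basis you pick and $\{\varepsilon^a\}$ its dual basis, names chosen only for this example), the formula above expands the trace as

$$\mathrm{tr} = \sum_a \mathrm{tr}(E_a)\, \varepsilon^a,$$

so each component is simply the trace evaluated on one basis matrix; evaluating those numbers for a concrete basis is left as the exercise.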

3. Dec 13, 2011

### fa2209

Thanks a lot for your reply. I'm pretty new to this mathematics, so I follow what you're saying (sort of), but I'm not sure I could carry out the steps you've outlined.

A few questions:

1). I would probably choose the basis E_ij, which represents the matrix with a 1 in the ijth position and zeros everywhere else, but I wasn't sure what you meant by "and use the formula with the trace and the dual basis".

2). I know that to find the dual basis I use $[FE]_{ij} = \delta_{ij}$, where the F are the basis elements of the dual space, so I've written $\sum_k F_{ik} E_{kj} = \delta_{ij}$, but I'm not really sure where to go from here to find the dual basis.

thanks

4. Dec 13, 2011

### Fredrik

Staff Emeritus
That looks like a good choice. I meant that you should use the formula for the ith component of ρ in the dual basis of $\{e_i\}$, which you will find if you do what I suggested first. Of course, when you apply it to your problem, you should either replace the index i by the pair ij, or keep i and let it take values from 1 to $n^2$ instead.
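Spelled out, assuming the formula $\rho_i = \rho(e_i)$ from the sketch in post #2, the two bookkeeping options are

$$\rho_{ij} = \rho(E_{ij}) \qquad \text{or} \qquad \rho_a = \rho(E_a) \quad \text{with } a = n(i-1) + j \in \{1,\dots,n^2\},$$

where the relabeling $a = n(i-1)+j$ is just one possible way to flatten the pair ij into a single index.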

Edit: You're new here, so you might not understand why I don't just tell you the answer. It's the forum's policy on textbook-style questions. We're required to treat them all as homework, and only give hints, not complete answers. I think I gave you 95% of it though.

The dual basis of $\{e_i\}$ is defined by $e^i e_j = \delta^i_j$. Once you have written that down, there's nothing left to find. Recall that the dual basis is a basis for V* (assuming that $\{e_i\}$ is a basis for V). The members of V* are linear functions from V into ℝ. Two functions are equal if they have the same domain and the same value at each point of the domain. So if you know the result of $e^i$ acting on an arbitrary $v\in V$, then you know $e^i$. And if you know the result of $e^i$ acting on each basis vector, then you know the result of $e^i$ acting on an arbitrary $v\in V$, because $e^i$ is linear.
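For example, writing an arbitrary $v = \sum_j v^j e_j$ (component notation assumed here) and using linearity:

$$e^i(v) = e^i\Big(\sum_j v^j e_j\Big) = \sum_j v^j\, \delta^i_j = v^i,$$

so the defining relation already tells you everything: $e^i$ is the function that reads off the ith component of its argument.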

Last edited: Dec 13, 2011
5. Dec 13, 2011

### fa2209

Haha, thanks but I'm struggling with the next 5% because I've only been doing this kind of mathematics for about 2 days.

What I'm not sure about is how I can extend this to matrices to answer my problem.

I understand how to use the orthogonality relationship with vectors, but I'm not really sure how to use it with matrices. What I mean is that if I use the fact that $e^i e_j = \delta^i_j$, then I can instantly see that the dual basis vector can be written explicitly as a row vector (assuming I write $e_j$ as a column vector) with a 1 in the column matching the row where the 1 sits in the column vector, and zeros elsewhere. So how do I extend this to matrices?

I think I also confused myself a bit with my choice of notation: normally if I write E_ij I mean the ijth component of a matrix, but here it actually represents an entire matrix, not just one component.

Finally, I'm trying to find something that maps an n x n matrix to a real number, so don't I need a 1 x n row vector times the original n x n matrix, multiplied on the right by an n x 1 column vector?

Sorry for the ramble but writing down my thinking might help you see where I'm going wrong (apart from being a terrible mathematician).

6. Dec 13, 2011

### Fredrik

Staff Emeritus
I think you need to start by doing what I described in the first paragraph of post #2. If you think you can't, you're probably just assuming that it will be hard, when in fact it's very easy. I've done that myself a bunch of times. One time it took me more than an hour to prove that if a Banach algebra has an identity element, it must be unique. Banach algebras sounded scary, so I expected it to be hard. I felt really stupid when I realized that the proof looks like this: $1 = 1\cdot 1' = 1'$.

Last edited: Dec 13, 2011