Components of the trace operation

  • Context: Graduate
  • Thread starter: fa2209
  • Tags: Components, Trace

Discussion Overview

The discussion revolves around understanding the components of the trace operation as a linear functional on the vector space of n x n matrices with real entries. Participants explore how to relate the trace to the dual space and the dual basis, addressing both theoretical and practical aspects of the problem.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Homework-related

Main Points Raised

  • One participant seeks help on finding the components of the trace as a linear functional on the space of n x n matrices.
  • Another suggests starting with an arbitrary member of the dual space and expanding both the matrix and the functional in chosen bases to derive the necessary components.
  • A participant proposes using the basis E_ij, which represents matrices with a 1 in the ijth position, and expresses confusion about the dual basis and its application to the trace.
  • Clarifications are made regarding the definition of the dual basis and its relationship to the original basis, emphasizing the linearity of the functionals.
  • One participant expresses difficulty in extending the concepts from vectors to matrices, particularly in using orthogonality relationships.
  • Another participant encourages the original poster to attempt the suggested steps, noting that initial assumptions about difficulty may be misleading.

Areas of Agreement / Disagreement

Participants generally agree on the approach of using bases to understand the trace operation, but there is no consensus on the specific steps or clarity regarding the extension of these concepts to matrices. The discussion remains unresolved as participants navigate through the complexities of the topic.

Contextual Notes

Participants express uncertainty about the definitions and applications of the dual basis and the trace operation, highlighting the need for clarity in notation and the relationships between different mathematical objects.

fa2209
I'm currently reading "Introduction to Tensors and Group Theory for Physicists". I'm stuck on a question posed on dual spaces.

The author gives the trace as an example of a linear functional on the vector space M_n(ℝ) (n x n matrices with real entries) and then asks how one would find the components of the element of the dual space that takes an n x n matrix to a real number (its trace).

I have no idea how to solve this problem, any help would be much appreciated.
 
Start with an arbitrary member of V* acting on an arbitrary member of V, in whatever notation you prefer, e.g. ρ(x). Then try two different ideas: 1. Expand x in a basis. 2. Expand x in a basis, and ρ in the dual basis of that first basis. Then compare the results. This will give you the formula for components of dual vectors.

Then choose a basis for the vector space of matrices, and use the formula with the trace and the dual basis. Note that the components of a linear functional are always components with respect to some basis, so it's impossible to do this without choosing a basis.
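The recipe in this post can be sketched numerically. Here is a minimal illustration of my own (not from the thread), for n = 2: choose the standard basis E_ij for M_2(ℝ), and evaluate the trace on each basis matrix to read off its components in the dual basis, which come out to δ_ij.

```python
# A minimal sketch (not from the thread) of the recipe above, for n = 2.
n = 2

def E(i, j):
    """Standard basis matrix E_ij: a 1 in the (i, j) slot, 0 elsewhere."""
    return [[1 if (r, c) == (i, j) else 0 for c in range(n)] for r in range(n)]

def trace(A):
    """The linear functional we want components for."""
    return sum(A[i][i] for i in range(n))

# The component of the trace with respect to the dual basis of {E_ij}
# is the trace evaluated on each basis matrix: tr(E_ij) = delta_ij.
components = {(i, j): trace(E(i, j)) for i in range(n) for j in range(n)}
print(components)  # {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 1}
```

So the components of the trace form the identity pattern: 1 when i = j, 0 otherwise.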
 
Thanks a lot for your reply. I'm pretty new to this mathematics, so I follow what you're saying (sort of), but I'm not sure I could carry out the steps you've outlined.

A few questions:

1). I would probably choose the basis E_ij, which represents the matrix with a 1 in the ijth position and zeros everywhere else, but I wasn't sure what you meant by "and use the formula with the trace and the dual basis".

2). I know that to find the dual basis I use [F E]_ij = δ_ij, where the F's are the basis vectors of the dual space, so I've written Σ_k F_ik E_kj = δ_ij, but I'm not really sure where to go from here to find the dual basis.

thanks
 
fa2209 said:
1). I would probably choose the basis E_ij, which represents the matrix with a 1 in the ijth position and zeros everywhere else, but I wasn't sure what you meant by "and use the formula with the trace and the dual basis".
That looks like a good choice. I meant that you should use the formula for the ith component of ρ in the dual basis of {e_i}, which you will find if you do what I suggested first. Of course, when you apply it to your problem, you should either replace i by ij, or keep the i and let it take values from 1 to n² instead.

Edit: You're new here, so you might not understand why I don't just tell you the answer. It's the forum's policy on textbook-style questions. We're required to treat them all as homework, and only give hints, not complete answers. I think I gave you 95% of it though. :smile:

fa2209 said:
2). I know that to find the dual basis I use [F E]_ij = δ_ij, where the F's are the basis vectors of the dual space, so I've written Σ_k F_ik E_kj = δ_ij, but I'm not really sure where to go from here to find the dual basis.
The dual basis of {e_i} is defined by e^i(e_j) = δ^i_j. Once you have written that down, there's nothing left to find. Recall that the dual basis is a basis for V* (assuming that {e_i} is a basis for V). The members of V* are linear functions from V into ℝ. Two functions are equal if they have the same domain and the same value at each point in the domain. So if you know the result of e^i acting on an arbitrary v ∈ V, then you know e^i. And if you know the result of e^i acting on each basis vector, then you know the result of e^i acting on an arbitrary v ∈ V, because e^i is linear.
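The linearity argument in this post can be checked concretely. The following sketch is my own (the functional ρ is an arbitrary choice for demonstration, not from the thread): a linear functional on ℝ³ is fully determined by its values on a basis, since ρ(v) = ρ(Σ_i v_i e_i) = Σ_i v_i ρ(e_i).

```python
# Illustration (mine, not from the post): a linear functional on R^3
# is determined by its values on a basis, by linearity.
def rho(v):
    # An arbitrary linear functional, chosen just for demonstration.
    return 2 * v[0] - v[1] + 3 * v[2]

e = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]   # standard basis of R^3
components = [rho(ei) for ei in e]      # components of rho in the dual basis

v = (4, 5, 6)
reconstructed = sum(c * vi for c, vi in zip(components, v))
print(reconstructed == rho(v))  # True
```

Knowing ρ on the three basis vectors reproduces ρ everywhere, which is exactly why "there's nothing left to find" once the dual basis condition is written down.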
 
Haha, thanks but I'm struggling with the next 5% because I've only been doing this kind of mathematics for about 2 days.

What I'm not sure about is how I can extend this to matrices to answer my problem.

I understand how to use the orthogonality relationship with vectors, but I'm not really sure how to use it with matrices. What I mean is that if I use the fact that e^i(e_j) = δ^i_j, then I can instantly see that the dual basis vector can be written explicitly as a row vector (assuming I write e_i as a column vector) with a 1 in the position matching the 1 in e_i and zeros elsewhere. So how do I extend this to matrices?

I think I also confused myself a bit with my choice of notation, normally if I write Eij I mean the ijth component of a matrix but here it actually represents the entire matrix not just one component.

Finally, I'm trying to find something that maps an n x n matrix to a real number, so don't I need to multiply the original n x n matrix on the left by a 1 x n matrix and on the right by an n x 1 matrix?

Sorry for the ramble but writing down my thinking might help you see where I'm going wrong (apart from being a terrible mathematician).
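The row-times-matrix-times-column idea in the post above does produce a real number, and it connects directly to the trace: each product e_iᵀ A e_i picks out the diagonal entry A_ii, and summing over i gives tr(A). A sketch of my own (not from the thread):

```python
# Sketch (mine, not from the thread): (1 x n) * (n x n) * (n x 1)
# gives a scalar, and summing e_i^T A e_i over i gives the trace.
n = 3
A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

def row_times_matrix_times_col(r, M, c):
    """Compute the scalar r M c for a row r, matrix M, column c."""
    return sum(r[i] * M[i][j] * c[j] for i in range(n) for j in range(n))

def e(i):
    """Standard basis vector of R^n with a 1 in slot i."""
    return [1 if k == i else 0 for k in range(n)]

# Each term e_i^T A e_i picks out the diagonal entry A[i][i].
trace_A = sum(row_times_matrix_times_col(e(i), A, e(i)) for i in range(n))
print(trace_A)  # 15 = 1 + 5 + 9
```

A single e_iᵀ A e_i is itself a linear functional on M_n(ℝ); the trace is the particular linear combination that sums them over i.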
 
I think you need to start by doing what I described in the first paragraph of post #2. If you think you can't, I think you're really just assuming that it will be hard, when in fact it's very easy. I've done that myself a bunch of times. One time it took me more than an hour to prove that if a Banach algebra has an identity element, it must be unique. Banach algebras sounded scary, so I expected it to be hard. I felt really stupid when I realized that the proof looks like this: 1=1·1'=1'.
 
