Linear algebra: trace of a square matrix is a linear functional

Summary
The discussion centers on proving that the trace of a square matrix is a linear functional, defined as the sum of its diagonal elements. Participants clarify that a linear functional must satisfy two properties: the trace of the sum of two matrices equals the sum of their traces, and the trace of a scalar multiple of a matrix equals the scalar multiplied by the trace of that matrix. They also explore the implications of using complex numbers, noting that the definition of trace can be extended to complex matrices. Additionally, they discuss how the trace relates to scalar products in the context of complex matrices and outline the properties that define a scalar product. The conversation concludes with a focus on the necessary properties for defining a scalar product using the trace function.
skrat
Let's define, for each square matrix A, its trace as the sum of its diagonal elements, so tr_{n}(A)=\sum_{j=1}^{n}a_{j,j}. Now prove that the trace is a linear functional on the space of all square matrices.

I would be happy to know: what has to be true for something to be a linear functional?

If I understand correctly, a linear functional acts on a vector but returns a real or complex number. So a linear functional is a scalar product. Now what?
 
skrat said:
If I understand correctly, a linear functional acts on a vector but returns a real or complex number.
That is correct. The trace takes a matrix and returns a number. All n × n matrices form a vector space, which you can show by checking the properties of a vector space. Or if you want to cheat a bit, you could write down all the entries of the matrix in one big vector and consider it an element of \mathbb{R}^{n^2}.
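If it helps to see this concretely, here is a small numpy sketch (purely an illustration with a random matrix; the variable names are mine, not part of the argument): flattening the matrix into a vector of length n^2 turns the trace into an ordinary dot product with the flattened identity matrix, i.e. a linear functional on \mathbb{R}^{n^2}.

Code:
import numpy as np

n = 4
A = np.random.rand(n, n)

# Flatten A into a vector in R^(n^2); the trace is then the dot
# product of that vector with the flattened identity matrix.
a_vec = A.flatten()
e_vec = np.eye(n).flatten()

print(np.trace(A))           # sum of the diagonal entries
print(np.dot(a_vec, e_vec))  # the same number, written as a dot product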

skrat said:
So a linear functional is a scalar product. Now what?
I'm not sure what you mean by "scalar product", are you talking about the inner product \vec a \cdot \vec b = \sum a_i b_i?

skrat said:
I would be happy to know: what has to be true for something to be a linear functional?
It has to satisfy that \operatorname{tr}_n(A + B) = \operatorname{tr}_n(A) + \operatorname{tr}_n(B) and \operatorname{tr}_n(k A) = k \operatorname{tr}_n(A) for all n × n matrices A, B and all real numbers k. Those are the two properties that make an arbitrary function V \to \mathbb{R} into a linear functional, see e.g. Wikipedia.
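For intuition, those two properties are easy to check numerically; here is a minimal numpy sketch with random matrices and an arbitrary scalar (an illustration only, not a proof):

Code:
import numpy as np

n = 3
A = np.random.rand(n, n)
B = np.random.rand(n, n)
k = 2.5

# tr(A + B) = tr(A) + tr(B)
print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))  # True

# tr(k A) = k tr(A)
print(np.isclose(np.trace(k * A), k * np.trace(A)))            # True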
 
CompuChip said:
I'm not sure what you mean by "scalar product", are you talking about the inner product \vec a \cdot \vec b = \sum a_i b_i?

Exactly. That is how we defined a linear functional.

So, you are trying to say that the following two lines are a complete proof that the trace, defined as the sum of the diagonal elements of a square matrix, is a linear functional:
tr_{n}(A+B)=\sum_{j=1}^{n}(a_{j,j}+b_{j,j})=\sum_{j=1}^{n}a_{j,j}+\sum_{j=1}^{n}b_{j,j}=tr_{n}(A)+tr_{n}(B)
and
tr_{n}(\lambda A)=\sum_{j=1}^{n}\lambda a_{j,j}=\lambda \sum_{j=1}^{n}a_{j,j}=\lambda tr_{n}(A)

Does anything change if \lambda \in \mathbb{C}?
 
skrat said:
Exactly. That is how we defined a linear functional.
Hmm, the scalar product is only a linear functional if you fix one of the vectors, e.g. for any fixed real vector \vec a the functions
f_{\vec a}: \mathbb{R}^n \to \mathbb{R}, \vec v \mapsto \vec a \cdot \vec v
and
g_{\vec a}: \mathbb{R}^n \to \mathbb{R}, \vec v \mapsto \vec v \cdot \vec a
are both linear functionals (I'll leave it to you to prove it). The inner product \mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R} is what we call bilinear, where the bi- indicates that it is linear in both arguments.
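If you want a quick numerical illustration of the first claim before proving it, here is a small numpy sketch (the fixed vector a and the test vectors are random; this only demonstrates the two properties, it does not prove them):

Code:
import numpy as np

n = 3
a = np.random.rand(n)        # the fixed vector defining f_a
v = np.random.rand(n)
w = np.random.rand(n)
k = -1.7

f = lambda x: np.dot(a, x)   # f_a(v) = a . v

# additivity and homogeneity of f_a
print(np.isclose(f(v + w), f(v) + f(w)))  # True
print(np.isclose(f(k * v), k * f(v)))     # True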
skrat said:
So, you are trying to say that the following two lines are a complete proof that the trace, defined as the sum of the diagonal elements of a square matrix, is a linear functional:
tr_{n}(A+B)=\sum_{j=1}^{n}(a_{j,j}+b_{j,j})=\sum_{j=1}^{n}a_{j,j}+\sum_{j=1}^{n}b_{j,j}=tr_{n}(A)+tr_{n}(B)
and
tr_{n}(\lambda A)=\sum_{j=1}^{n}\lambda a_{j,j}=\lambda \sum_{j=1}^{n}a_{j,j}=\lambda tr_{n}(A)
Yes, that was what I was saying.

skrat said:
Does anything change if \lambda \in \mathbb{C}?
Not really, except that you have to make everything complex, e.g. since \lambda A will be an n × n matrix with complex entries you have to extend the definition of \operatorname{tr}_n to those matrices. Note that the example of the scalar product becomes a bit more involved, as \vec v \cdot \vec w = \sum_i v_i^* w_i has an additional complex conjugate compared to the real case.
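As a small numpy illustration of the complex case (random complex matrices; note that np.vdot uses the same convention as \sum_i v_i^* w_i, conjugating its first argument):

Code:
import numpy as np

n = 3
A = np.random.rand(n, n) + 1j * np.random.rand(n, n)
lam = 2 - 3j

# linearity of the trace still holds over C
print(np.isclose(np.trace(lam * A), lam * np.trace(A)))  # True

# complex scalar product: the first argument is conjugated
v = np.random.rand(n) + 1j * np.random.rand(n)
w = np.random.rand(n) + 1j * np.random.rand(n)
print(np.isclose(np.vdot(v, w), np.sum(np.conj(v) * w)))  # True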
 
THANK YOU VERY MUCH. One more question:

How do I show that <A,B>=tr(AB^{*}) defines a scalar product if A, B are both square matrices with complex entries?

Assuming * here means complex conjugation, I started like this:

tr(AB^{*})=\sum_{i=1}^{n}(\sum_{j=1}^{n}a_{i,j}\cdot b_{j,i}^{*}) but how do I write <A,B>?
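Numerically this expansion checks out; here is a small numpy sketch (assuming, as above, that * means elementwise complex conjugation rather than the conjugate transpose):

Code:
import numpy as np

n = 3
A = np.random.rand(n, n) + 1j * np.random.rand(n, n)
B = np.random.rand(n, n) + 1j * np.random.rand(n, n)

# tr(A B*) with B* taken as the elementwise conjugate of B
lhs = np.trace(A @ np.conj(B))

# the double sum: sum_i sum_j a_{i,j} * conj(b_{j,i})
rhs = sum(A[i, j] * np.conj(B[j, i]) for i in range(n) for j in range(n))

print(np.isclose(lhs, rhs))  # True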
 
A scalar product (or inner product) should satisfy three properties, e.g. one of them is \langle A, A \rangle \ge 0 with equality if and only if A = 0. Can you give me the other two properties?
 
I hope I can:

<A,B>=(<B,A>)^{*} and <\lambda (A+B),C>=\lambda <A,C>+\lambda <B,C>

So, the same goes for the trace: tr(AA^{*})=0, tr(AB^{*})=tr(BA^{*}) and tr(\lambda (A+B^{*}))=\lambda tr(A)+\lambda tr(B^{*})?
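Here is a small numpy check of the three properties, assuming * denotes the conjugate transpose (an assumption on my part; with that reading <A,B> = tr(AB^{*}) is the usual Frobenius inner product):

Code:
import numpy as np

n = 3
A = np.random.rand(n, n) + 1j * np.random.rand(n, n)
B = np.random.rand(n, n) + 1j * np.random.rand(n, n)
C = np.random.rand(n, n) + 1j * np.random.rand(n, n)
lam = 1.5 - 0.5j

ip = lambda X, Y: np.trace(X @ Y.conj().T)  # <X, Y> = tr(X Y*), * = conjugate transpose

# <A, A> is real and non-negative (zero only for A = 0)
print(np.isclose(ip(A, A).imag, 0.0), ip(A, A).real >= 0)  # True True

# <A, B> = conj(<B, A>)
print(np.isclose(ip(A, B), np.conj(ip(B, A))))  # True

# <lam(A + B), C> = lam<A, C> + lam<B, C>
print(np.isclose(ip(lam * (A + B), C), lam * ip(A, C) + lam * ip(B, C)))  # True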
 
