How can I find a solution valid for all cases?

Thread starter: weetabixharry

I have an equation:

tr\left\{\textbf{AB}\right\} = \sigma

where tr\left\{\right\} denotes the matrix trace. The square matrix \textbf{A} is independent of both the square matrix \textbf{B} and the real scalar \sigma.

I want to determine all possible values of \textbf{B} that allow the above equation to hold for every \textbf{A} satisfying the single constraint:

tr\left\{\textbf{A}\right\} = 1

For example, I can see that \textbf{B}=\sigma \textbf{I} will always be valid (where \textbf{I} is the identity matrix). But can I guarantee that there are no other possible values for \textbf{B}?

I have been pondering this problem for some time and cannot see a way of approaching it. Any advice would be greatly appreciated!
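As a quick numerical sanity check (my own sketch, not a proof; the dimension N = 4 and \sigma = 2.5 are arbitrary choices), one can confirm that \textbf{B}=\sigma\textbf{I} works for random trace-1 matrices, while a generic non-diagonal perturbation of it fails:

```python
import numpy as np

rng = np.random.default_rng(0)
N, sigma = 4, 2.5

def normalize_trace(A):
    """Scale A so that tr(A) = 1 (assumes tr(A) != 0)."""
    return A / np.trace(A)

# B = sigma * I satisfies tr(AB) = sigma for every A with tr(A) = 1.
B = sigma * np.eye(N)
for _ in range(5):
    A = normalize_trace(rng.normal(size=(N, N)))
    assert np.isclose(np.trace(A @ B), sigma)

# A B with one off-diagonal entry fails for a suitable trace-1 matrix A.
B_bad = sigma * np.eye(N)
B_bad[0, 1] = 1.0
A = np.zeros((N, N))
A[0, 0] = 1.0   # tr(A) = 1
A[1, 0] = 1.0   # this off-diagonal entry of A picks up B_bad[0, 1]
assert not np.isclose(np.trace(A @ B_bad), sigma)
```

This only rules candidates in or out numerically, of course; the question of whether \sigma\textbf{I} is the *only* solution needs an argument.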
 


Let A_{i,j} denote a matrix whose (i,j) entry is 1 and the rest of whose entries are zero. If the equation was true for all matrices it would work for each A_{i,j}. Is that going to be possible?
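The key property of these matrices (a small numerical illustration, my own addition; the test matrix B is arbitrary) is that tracing against A_{i,j} picks out a single entry of \textbf{B}:

```python
import numpy as np

N = 3
B = np.arange(1.0, N * N + 1).reshape(N, N)  # arbitrary test matrix

def unit_matrix(i, j, n):
    """A_{i,j}: 1 in entry (i, j), zeros elsewhere (0-based indices)."""
    A = np.zeros((n, n))
    A[i, j] = 1.0
    return A

# tr(A_{i,j} B) equals the single entry b_{j,i}.
for i in range(N):
    for j in range(N):
        assert np.trace(unit_matrix(i, j, N) @ B) == B[j, i]
```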
 


Stephen Tashi said:
Let A_{i,j} denote a matrix whose (i,j) entry is 1 and the rest of whose entries are zero. If the equation was true for all matrices it would work for each A_{i,j}. Is that going to be possible?

I think we can only consider the A_{i,i} matrices (i.e. a diagonal entry is 1), in order to satisfy the trace constraint on A. However, I'll look into an approach like this. I have found a possible solution in a similar way - by considering the (i,j)th element. I'll check it over and post it here if I think it has any chance of being correct...
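One way around the trace constraint (a sketch of my own, not taken from the thread) is to add a single off-diagonal unit entry to a matrix that already has trace 1:

```latex
% Take A = A_{1,1} + A_{i,j} with i \neq j; the off-diagonal entry does not
% change the trace, so tr(A) = 1 is still satisfied, and
\mathrm{tr}\left\{ \left( \boldsymbol{A}_{1,1}+\boldsymbol{A}_{i,j}\right)
\boldsymbol{B}\right\} = b_{11} + b_{ji} = \sigma .
% Comparing with the choice A = A_{1,1} alone, which gives b_{11} = \sigma,
% we conclude b_{ji} = 0 for every j \neq i.
```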
 


I hope someone will have the time to read through the following attempt and point out any errors in my reasoning:

The \left( i,i\right) ^{th} element of \boldsymbol{AB} can be written as:

\left[ \boldsymbol{AB}\right] _{i,i}=\sum_{j=1}^{N}a_{ij}b_{ji}

Therefore the trace is:

{tr}\left\{ \boldsymbol{AB}\right\} =\sum_{i=1}^{N}\sum_{j=1}^{N}a_{ij}b_{ji}=\sigma

Our only constraint is:

{tr}\left\{ \boldsymbol{A}\right\} =\sum_{i=1}^{N}a_{ii}=1

Therefore, we can write:

\begin{eqnarray*}
{tr}\left\{ \boldsymbol{AB}\right\} &=&\sigma \,{tr}\left\{ \boldsymbol{A}\right\} \\
\sum_{i=1}^{N}\sum_{j=1}^{N}a_{ij}b_{ji} &=&\sigma \sum_{i=1}^{N}a_{ii}
\end{eqnarray*}

Clearly, the right hand side depends only on the diagonal entries of \boldsymbol{A}. Since the off-diagonal entries a_{ij} (i\neq j) can be varied freely without affecting the trace constraint, the only way to eliminate them from the left hand side is to require b_{ji}=0 for all j\neq i.

Therefore \boldsymbol{B} must be diagonal and:

\sum_{i=1}^{N}a_{ii}b_{ii}=\sigma \sum_{i=1}^{N}a_{ii}

The only way (I think) \boldsymbol{B} can satisfy this for all valid \boldsymbol{A} is if the b_{ii} are all equal, so that they can be taken outside the summation as a constant. Clearly, that constant must be \sigma:

\boldsymbol{B}=\sigma \boldsymbol{I}
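The probe matrices used above can also be checked numerically (a sketch of my own; the candidate B here is a random matrix, and the dimension is arbitrary): diagonal probes A_{i,i} read off the diagonal entries of \boldsymbol{B}, and trace-1 combinations A_{1,1}+A_{i,j} read off the off-diagonal ones.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 3
B = rng.normal(size=(N, N))  # arbitrary candidate B

def E(i, j, n):
    """Matrix with a 1 in entry (i, j) and zeros elsewhere."""
    A = np.zeros((n, n))
    A[i, j] = 1.0
    return A

# Diagonal probes A = A_{i,i} (trace 1) read off b_{ii}:
diag_reads = [np.trace(E(i, i, N) @ B) for i in range(N)]
assert np.allclose(diag_reads, np.diag(B))

# An off-diagonal probe A = A_{1,1} + A_{2,3} (still trace 1)
# reads off b_{11} + b_{32} (0-based below):
read = np.trace((E(0, 0, N) + E(1, 2, N)) @ B)
assert np.isclose(read, B[0, 0] + B[2, 1])

# Demanding tr(AB) = sigma for all such probes therefore forces
# b_{ii} = sigma and b_{ji} = 0 for j != i, i.e. B = sigma * I.
```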
 


I didn't quite follow your argument about diagonal terms, but I think you have at least started on the right track. I also think you're making it a bit more complicated than it needs to be. I would begin like this: Let A be an arbitrary matrix such that Tr A=1. Let B be an arbitrary matrix such that Tr(AB)=σ. Then Tr(AB)=σTr(A).

Now start making specific choices of A to see how they constrain B. For example, what does the equality Tr(AB)=σTr(A) say when A_{11}=1 and all other components of A are 0?
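To spell out that suggestion numerically (my own sketch; N and σ are arbitrary): each choice A = A_{k,k} has Tr(A) = 1 and Tr(AB) = b_{kk}, so the equality forces every diagonal entry of B to equal σ.

```python
import numpy as np

sigma, N = 1.3, 3

# Candidate B consistent with the diagonal probes:
B = sigma * np.eye(N)
for k in range(N):
    A = np.zeros((N, N))
    A[k, k] = 1.0          # A_{kk} = 1, all other entries zero; tr(A) = 1
    # Tr(AB) = sigma * Tr(A) reduces to b_{kk} = sigma:
    assert np.isclose(np.trace(A @ B), sigma * np.trace(A))
```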
 


Thanks for the advice. I think this idea of choosing various specific examples for \textbf{A} is a very good one and could simplify matters significantly. I'll see what I can come up with...
 
