Solving Matrix Equations with Real Scalars

  • Thread starter: blueberryfive
  • Tags: Matrix, Scalars
blueberryfive
Hello,

Show that A(rB) = r(AB) = (rA)B, where r is a real scalar and A and B are appropriately sized matrices.

How do I even start? A(rb_{ij}) = A(rB), but then you can't reassociate...

Also, how would I write a formal proof that Tr(A^T) = Tr(A)?

It doesn't seem like enough to say the diagonal entries are unaffected by transposition...

Lastly, let A be an m×n matrix with a column consisting entirely of zeros. Show that if B is an n×p matrix, then AB has a row of zeros.

I can't figure out how to make a proof of this. I know how to say what such and such entry of AB is, but I don't know how to designate an entire column. How do you formally say it will be equal to zero, then...just because the dot product of a zero vector with anything is 0?
 
blueberryfive said:
Hello,

Show that A(rB) = r(AB) = (rA)B, where r is a real scalar and A and B are appropriately sized matrices.

How do I even start? A(rb_{ij}) = A(rB), but then you can't reassociate...
You don't have to. Every entry of rB has a factor of r, so every entry of A(rB) has a factor of r, and so A(rB) = r(AB) = (rA)B.
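
If it helps before writing the index argument, here is a quick numerical sanity check (not a proof) in Python with NumPy; the shapes, the seed, and the scalar below are arbitrary choices made up for illustration.

import numpy as np

# Sanity check only, not a proof: compare A(rB), r(AB) and (rA)B numerically.
rng = np.random.default_rng(seed=0)
A = rng.standard_normal((3, 4))   # a 3x4 matrix
B = rng.standard_normal((4, 2))   # a 4x2 matrix, so the product AB is defined
r = 2.5                           # a real scalar

print(np.allclose(A @ (r * B), r * (A @ B)))   # expect True
print(np.allclose(r * (A @ B), (r * A) @ B))   # expect True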

Also, how would I write a formal proof that Tr(A^T) = Tr(A)?

It doesn't seem like enough to say the diagonal entries are unaffected by transposition...
Why not? Would it be better to say "(A^T)_{ij} = A_{ji}", so that, setting j = i, "(A^T)_{ii} = A_{ii}"? That may look more "formal", but it is really just saying that "the diagonal entries are unaffected by transposition".
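
Written out with the trace and transpose definitions that appear later in this thread, the one-line "formal" version is, as a sketch:

\operatorname{Tr}(A^T) = \sum_i (A^T)_{ii} = \sum_i A_{ii} = \operatorname{Tr}(A).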

Lastly, let A be an m×n matrix with a column consisting entirely of zeros. Show that if B is an n×p matrix, then AB has a row of zeros.
(AB)_{ij} = \sum_k A_{ik}B_{kj}. If the jth column of B is all 0s, then the jth column of AB is all 0s; and if the ith row of A is all 0s, then the ith row of AB is all 0s.

blueberryfive said:
I can't figure out how to make a proof of this. I know how to say what such and such entry of AB is, but I don't know how to designate an entire column. How do you formally say it will be equal to zero, then...just because the dot product of a zero vector with anything is 0?
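
Written out in indices, the "dot product of a zero vector with anything is 0" idea is exactly what the sum says; as a sketch, for the version where the ith row of A is zero: if A_{ik} = 0 for every k, then for every j,

(AB)_{ij} = \sum_k A_{ik}B_{kj} = \sum_k 0\cdot B_{kj} = 0,

so the whole ith row of AB is zero. The same computation with B_{kj} = 0 for every k gives a zero jth column of AB.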
 
Thank you.

Also,

Tr(A^TA) \geq 0.

I can't even see how to begin...
 
blueberryfive said:
Thank you.

Also,

Tr(A^TA) \geq 0.

I can't even see how to begin...
That's a sum of squares!
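
Spelling that hint out with the trace and transpose definitions (a sketch, using that A has real entries):

\operatorname{Tr}(A^TA) = \sum_i (A^TA)_{ii} = \sum_i \sum_k (A^T)_{ik}A_{ki} = \sum_{i,k} A_{ki}^2 \geq 0,

since each term A_{ki}^2 is the square of a real number.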
 
blueberryfive said:
Hello,

Show that A(rB) = r(AB) = (rA)B, where r is a real scalar and A and B are appropriately sized matrices.

How do I even start? A(rb_{ij}) = A(rB), but then you can't reassociate...
The definition of rA where r is a real number and A is a matrix is (rA)_{ij}=rA_{ij}. The definition of AB where both A and B are matrices is (AB)_{ij}=\sum_k A_{ik}B_{kj}. It's not hard to use these definitions to show that the equalities you mentioned are true. Start with (A(rB))_{ij}=\sum_k A_{ik}(rB)_{kj}.

All your other questions are also quite easy to answer if you just use these definitions, and the definition of the trace and the transpose: \operatorname{Tr}A=\sum_i A_{ii},\quad (A^T)_{ij}=A_{ji}.
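
Carrying that suggested first step through to the end, as a sketch to compare against your own work:

(A(rB))_{ij} = \sum_k A_{ik}(rB)_{kj} = \sum_k A_{ik}\,rB_{kj} = r\sum_k A_{ik}B_{kj} = r(AB)_{ij} = (r(AB))_{ij},

and pulling r onto A instead,

r\sum_k A_{ik}B_{kj} = \sum_k (rA_{ik})B_{kj} = \sum_k (rA)_{ik}B_{kj} = ((rA)B)_{ij}.

Since this holds for every i and j, A(rB) = r(AB) = (rA)B.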
 