Matrix Multiplication: Solving A*A'=B for Unique A(B)

  • #1
feynman1
A and B are 2*2 matrices. A' is the transpose of A. Will the solution of A*A'=B for A yield a unique A(B)?
 
  • #2
You can easily write 2x2 matrices on paper. Why not write them down and see whether the formulas coming from your condition determine the matrix components uniquely?
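Concretely (a quick sketch of that suggestion): writing ##A=\begin{pmatrix}a&b\\c&d\end{pmatrix}##, the condition becomes
$$AA'=\begin{pmatrix}a^2+b^2 & ac+bd\\ ac+bd & c^2+d^2\end{pmatrix}=\begin{pmatrix}B_{11}&B_{12}\\B_{12}&B_{22}\end{pmatrix},$$
i.e. three independent equations for four unknowns (note that ##AA'## is automatically symmetric, so ##B## must be symmetric too). Generically that leaves a one-parameter family of solutions rather than a unique ##A##.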
 
Last edited:
  • #3
No. For example, if ##B## is the identity matrix, then ##A## could be any rotation or reflection matrix.
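For instance, the ##2\times2## rotation through an angle ##\theta## already gives a one-parameter family:
$$R(\theta)=\begin{pmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{pmatrix},\qquad R(\theta)R(\theta)'=\begin{pmatrix}\cos^2\theta+\sin^2\theta&0\\0&\sin^2\theta+\cos^2\theta\end{pmatrix}=I,$$
so every ##\theta## produces a different ##A## with ##AA'=I##.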
 
  • #4
Infrared said:
No. For example, if ##B## is the identity matrix, then ##A## could be any rotation or reflection matrix.
then should we conclude the derivative dA/dB doesn't in general exist?
 
Last edited:
  • #5
feynman1 said:
then should we conclude the derivative A'(B) doesn't in general exist?
What derivative! ##A'## was your notation for the transpose.
 
  • #6
martinbn said:
What derivative! ##A'## was your notation for the transpose.
changed the notation now
 
  • #7
feynman1 said:
then should we conclude the derivative dA/dB doesn't in general exist?
This cannot be concluded so easily. You have to define what you mean by your notation.

Derivatives are always directional. Thus dA/dB indicates a function A considered in the direction of B. A needs to vary if B does, but how? If it does not, then dA/dB = 0. You have two problems with this notation: how does A depend on B, and is this dependence locally unique, i.e. locally a function? If there is more than one possibility, which one should we take?

Another possibility is to consider AA'-B=0 as an algebraic variety, i.e. a geometric object. All matrix elements of A and B (and possibly A', whatever it means) become variables in this perspective. The variety is embedded in a Euclidean vector space, so you can consider changes in coordinate directions. But this still doesn't give dB a meaning.

Long story short: Try to figure out what dA/dB could mean and you will find the difficulties yourself.
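One way to make this concrete (a sketch under an extra assumption that the original question does not make): restrict ##B## to be symmetric positive definite and require ##A## to be its symmetric positive definite square root, ##A=B^{1/2}##, so that ##A'=A## and ##AA'=A^2=B##. Then ##A## genuinely is a function of ##B##, and differentiating ##A^2=B## in a direction ##dB## gives the Sylvester equation
$$(dA)\,A+A\,(dA)=dB,$$
which determines ##dA## uniquely because ##A## is positive definite. A different extra constraint (e.g. lower triangular ##A##) would produce a different, equally legitimate ##dA/dB##.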
 
  • #8
feynman1 said:
A and B are 2*2 matrices. A' is the transpose of A. Will the solution of A*A'=B for A yield a unique A(B)?

feynman1 said:
then should we conclude the derivative dA/dB doesn't in general exist?

feynman1 said:
changed the notation now
It seems obvious to me that you don't understand this problem. At first you wrote that A' meant the transpose of A, then you decided that it meant the derivative of A. Is that your final answer?

Also, if A' is the derivative, which derivative is it? It doesn't make much sense to me to talk about the derivative of one matrix with respect to another, e.g. dA/dB, but it does make sense to talk about the derivative of a matrix with respect to some variable, say t; e.g., dA/dt. Since you are so uncertain about this problem, it seems reasonable to assume that you aren't certain which derivative is meant.

If we have ##A(t) = \begin{bmatrix} a(t) & b(t) \\ c(t) & d(t) \end{bmatrix}##, then ##A'(t) = \frac{dA(t)}{dt}## would be ##\begin{bmatrix} a'(t) & b'(t) \\ c'(t) & d'(t) \end{bmatrix}##. In this case, the equation AA' = B could mean ##A \frac{dA}{dt} = B##, and this differential equation could be solved, although not for a unique solution.
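A minimal illustration of that non-uniqueness, in the ##1\times1## case with constant ##B=b##: the equation ##a\,a'=b## means ##(a^2)'=2b##, so
$$a(t)=\pm\sqrt{2bt+c},$$
with a free constant ##c## and a free sign, already far from unique.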
 
  • #9
Mark44 said:
It seems obvious to me that you don't understand this problem. At first you wrote that A' meant the transpose of A, then you decided that it meant the derivative of A. Is that your final answer?

Also, if A' is the derivative, which derivative is it? It doesn't make much sense to me to talk about the derivative of one matrix with respect to another, e.g. dA/dB, but it does make sense to talk about the derivative of a matrix with respect to some variable, say t; e.g., dA/dt. Since you are so uncertain about this problem, it seems reasonable to assume that you aren't certain which derivative is meant.

If we have ##A(t) = \begin{bmatrix} a(t) & b(t) \\ c(t) & d(t) \end{bmatrix}##, then ##A'(t) = \frac{dA(t)}{dt}## would be ##\begin{bmatrix} a'(t) & b'(t) \\ c'(t) & d'(t) \end{bmatrix}##. In this case, the equation AA' = B could mean ##A \frac{dA}{dt} = B##, and this differential equation could be solved, although not for a unique solution.
dA/dB means a tensor valued derivative
 
  • #10
feynman1 said:
then should we conclude the derivative dA/dB doesn't in general exist?

I don't quite know what ##dA/dB## means, but if it presupposes that ##A## can be solved for as a function of ##B##, then yes, I don't think it would make sense without extra information.
 
  • #11
After all this, what is the original question? What is the equation? Is it with ##A'## being the transpose, or is it some kind of derivative? I think it would be helpful if you gave us more information. Also, you shouldn't assume that "tensor valued derivative" is something everyone knows. Give a reference or a definition. The notation dA/dB is unclear as well. Is it ##dA\,dB^{-1}## or ##dB^{-1}dA##? After all, matrices do not commute in general, so a fraction is ambiguous.
 
  • #12
martinbn said:
After all this, what is the original question? What is the equation? Is it with ##A'## being the transpose, or is it some kind of derivative? I think it would be helpful if you gave us more information. Also, you shouldn't assume that "tensor valued derivative" is something everyone knows. Give a reference or a definition. The notation dA/dB is unclear as well. Is it ##dA\,dB^{-1}## or ##dB^{-1}dA##? After all, matrices do not commute in general, so a fraction is ambiguous.
dA/dB is the derivative of each component of A w.r.t. each component of B, so it has multiple elements. It is tensor valued because both A and B are second-order tensors, i.e. matrices.
 
  • #13
feynman1 said:
dA/dB is the derivative of each component of A w.r.t. each component of B, so it has multiple elements. It is tensor valued because both A and B are second-order tensors, i.e. matrices.
This doesn't actually answer my question. Anyway, can you at least tell us where the question came from? It is ok to give information about the question. There is no need for us to pry it out of you.
 
  • #14
martinbn said:
This doesn't actually answer my question. Anyway, can you at least tell us where the question came from? It is ok to give information about the question. There is no need for us to pry it out of you.
Just wanted to extend a scalar property to a tensor property, without any deeper motivation; sorry for having no background info.
 
  • #15
Well, you should really define exactly what everything means. Naively I would interpret everything as follows:
##A## and ##B## are rank-2 tensors, with components ##A_{\mu\nu}## and ##B_{\mu\nu}##. The derivative is defined as the object with components ##\frac{dA_{\mu\nu}}{dB_{\alpha\beta}}##.
Furthermore, the ##*## operation would be a 1-index contraction (as usual when multiplying matrices).
Therefore the equation
$$A*A'=B$$
is equating a 4-index object on the left with a 2-index object on the right. This makes no sense to me.

So, if you want some help, either wait for someone who understands your notation, or give us an explicit definition of everything involved here.
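To make the index bookkeeping concrete, here is a small numerical sketch of my own (it uses the unambiguous forward derivative ##\partial B_{ij}/\partial A_{mn}## rather than the problematic ##dA/dB##; the random matrix and tolerances are arbitrary choices):

Code (Python):
import numpy as np

# B_{ij} = A_{ik} A_{jk}  =>  dB_{ij}/dA_{mn} = delta_{im} A_{jn} + delta_{jm} A_{in}
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))

def B_of(A):
    return A @ A.T

I2 = np.eye(2)
# analytic 4-index derivative: dB[i, j, m, n] = d B_{ij} / d A_{mn}
dB = np.einsum('im,jn->ijmn', I2, A) + np.einsum('jm,in->ijmn', I2, A)

# finite-difference check of the slice m = 0, n = 1
eps = 1e-6
E = np.zeros((2, 2)); E[0, 1] = 1.0
fd = (B_of(A + eps * E) - B_of(A)) / eps           # approximates dB[:, :, 0, 1]
print(np.allclose(fd, dB[:, :, 0, 1], atol=1e-5))  # True

The object really does carry four indices, which is exactly the mismatch pointed out above.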
 
  • #16
A*A'=B where * is the normal dot (matrix) product; in components, ##A_{ik}A_{jk}=B_{ij}##. A, B are matrices.
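With that reading, the matrix form and the component form can be checked against each other directly (a minimal sketch; the random matrix is arbitrary):

Code (Python):
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))

B_matrix = A @ A.T                        # matrix form  A A'
B_index  = np.einsum('ik,jk->ij', A, A)   # component form  A_ik A_jk = B_ij
print(np.allclose(B_matrix, B_index))     # True: both readings agree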
 
  • #17
martinbn said:
It is ok to give information about the question. There is no need for us to pry it out of you.
Amen to this!
 
  • #18
feynman1 said:
A*A'=B where * is the normal dot (matrix) product; in components, ##A_{ik}A_{jk}=B_{ij}##. A, B are matrices.

This makes it look like ' is just the transpose again; there are no derivatives in this post.
 
  • #19
Thread closed, since we are no closer to understanding what is being asked than we were in the first post.
 

1. What is matrix multiplication?

Matrix multiplication is a mathematical operation that combines two matrices to create a new matrix. It is different from ordinary multiplication because it follows specific rules, such as the requirement that the number of columns in the first matrix match the number of rows in the second.

2. What does "A*A'=B" mean in matrix multiplication?

In this equation, A represents the original matrix, A' represents the transpose of A, and B represents the resulting matrix after multiplication. The transpose of a matrix is created by flipping the rows and columns of the original matrix.

3. How do you solve for unique A(B) in "A*A'=B"?

In general you cannot: the equation does not determine A uniquely. If ##A## is a solution and ##Q## is any orthogonal matrix (##QQ'=I##), then ##AQ## is also a solution, because ##(AQ)(AQ)'=AQQ'A'=AA'=B##. A unique answer requires an extra constraint; for example, when ##B## is symmetric positive definite, requiring ##A## to be lower triangular with positive diagonal entries singles out the Cholesky factor of ##B##.
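A short sketch of that in code (using NumPy's Cholesky routine; the particular positive definite ##B## below is just an example):

Code (Python):
import numpy as np

B = np.array([[4.0, 2.0],
              [2.0, 3.0]])        # symmetric positive definite

L = np.linalg.cholesky(B)         # lower triangular with positive diagonal, L L' = B
print(np.allclose(L @ L.T, B))    # True

# the factor is unique only because of the triangularity constraint:
theta = 1.2
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # orthogonal, Q Q' = I
M = L @ Q
print(np.allclose(M @ M.T, B))    # also True, and M is not triangular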

4. Can you explain the purpose of solving for unique A(B)?

Finding such an A gives a matrix that, when multiplied by its transpose, reproduces the given matrix B, i.e. a factorization of B. This is useful in applications such as statistics and data analysis (for example, generating correlated random samples from a covariance matrix), optimization problems, and the numerical solution of systems of linear equations.

5. Are there any special cases in matrix multiplication that affect solving for unique A(B)?

Yes. Since AA' is always symmetric and positive semidefinite, a real solution exists only when B has those properties. Further complications arise when B is singular or when A is not required to be square or invertible; the factor is then rank deficient and even less constrained, and traditional methods such as the plain Cholesky factorization no longer apply directly. Other techniques, such as a pivoted factorization, a spectral (eigenvalue) decomposition, or least-squares approximation, may need to be used.
