Linear Algebra / Dot Product / matrices

  • #1
Wildcat

Homework Statement



Let V = Rⁿ and let A ∈ Mnxn(R). Prove that <x,Ay> = <A^T x, y> for all x, y ∈ V.

Homework Equations





The Attempt at a Solution



Can someone tell me if I'm on the right track?

<x,Ay> = x^T A y = (x^T A) y = ((A^T x)^T) y = <A^T x, y>
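The chain of equalities above can be spot-checked numerically. A minimal pure-Python sketch (the `dot`, `matvec`, and `transpose` helpers are defined here just for illustration):

```python
# Spot-check <x, Ay> = <A^T x, y> on a small concrete example.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    return [dot(row, v) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [[1, 2], [3, 4]]
x = [5, -1]
y = [2, 7]

lhs = dot(x, matvec(A, y))             # <x, Ay>
rhs = dot(matvec(transpose(A), x), y)  # <A^T x, y>
assert lhs == rhs  # both equal 46
```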
 
  • #3
Thanks! Can I ask a second part to this or should I open a new thread?
 
  • #4
No, you can ask in this thread if you want...
 
  • #5
Suppose B ∈ Mnxn(R), and <x,Ay> = <Bx,y> for all x, y ∈ V. Prove that B = A^T.

Wouldn't that be the same thing? or do I have to show something else?
 
  • #6
No, this isn't the same thing (and I can't imagine that the proof is the same thing either).

What you actually need to show is that

[tex]<x,Ay>=<Bx,y>~\Leftrightarrow B=A^T[/tex]

One implication is already proven in the OP, now you need to prove the other one.
So to prove this, you'll need to take an arbitrary matrix B such that <x,Ay>=<Bx,y>, and you'll need to show that B is actually the transpose of A. This is something quite different from what you've already done...
 
  • #7
Wildcat said:
Suppose B ∈ Mnxn(R), and <x,Ay> = <Bx,y> for all x, y ∈ V. Prove that B = A^T.

Wouldn't that be the same thing? or do I have to show something else?

Or assume B = A^T; then <A^T x, y> = (A^T x)^T y = x^T A y = <x,Ay> ??
 
  • #8
Well, you have proven that B=A^T implies that <x,Ay>=<Bx,y>. But now you have to do the reverse. You don't know that B=A^T, that's what you need to show. So you can't say "assume B=A^T", since there is no reason why you can assume this...
 
  • #9
micromass said:
Well, you have proven that B=A^T implies that <x,Ay>=<Bx,y>. But now you have to do the reverse. You don't know that B=A^T, that's what you need to show. So you can't say "assume B=A^T", since there is no reason why you can assume this...


Thanks, I need to think a little more
 
  • #10
Wildcat said:
Thanks, I need to think a little more

so I need to show B=A^T

if <x,Ay> = <Bx,y>
x^T A y = x^T B^T y
A = B^T, then can I say A^T = (B^T)^T = B, so B = A^T ??
 
  • #11
Wildcat said:
so I need to show B=A^T

if <x,Ay> = <Bx,y>
x^T A y = x^T B^T y
A = B^T, then can I say A^T = (B^T)^T = B, so B = A^T ??

The trouble is that you can't just 'cancel' x^T and y from both sides of the equation to get A=B^T. It's not like x^T and y have inverses or anything like that. I'd suggest you pick a basis and argue from there.
 
  • #12
Dick said:
The trouble is that you can't just 'cancel' x^T and y from both sides of the equation to get A=B^T. It's not like x^T and y have inverses or anything like that. I'd suggest you pick a basis and argue from there.

I was afraid I couldn't cancel them. I'm not sure what you mean by pick a basis. I looked through the chapters on inner product spaces and adjoint of a linear operator and orthonormal basis is in some of the theorems but I'm not sure how to use that.
 
  • #13
Wildcat said:
I was afraid I couldn't cancel them. I'm not sure what you mean by pick a basis. I looked through the chapters on inner product spaces and adjoint of a linear operator and orthonormal basis is in some of the theorems but I'm not sure how to use that.

Pick a basis {ei}. The A_ij entry of the matrix in that basis is <ei,Aej>, right? I might have the indices backwards, but do you get my point?
 
  • #14
Dick said:
Pick a basis {ei}. The A_ij entry of the matrix in that basis is <ei,Aej>, right? I might have the indices backwards, but do you get my point?

Still not sure I'll study it awhile and see if I can figure it out.
 
  • #15
Dick said:
Pick a basis {ei}. The A_ij entry of the matrix in that basis is <ei,Aej>, right? I might have the indices backwards, but do you get my point?

then ei^T Aej = (ei^T A) ej = <A^Tei, ej> which would be the ijth entry of A^T
?
 
  • #16
Wildcat said:
then ei^T Aej = (ei^T A) ej = <A^Tei, ej> which would be the ijth entry of A^T
?

I'm picking the ei to be the column vector with a 1 in the ith place and zeros elsewhere. So <ei,Aej> would be the ijth entry of A. That makes <A^Tei,ej>=<ej,A^Tei> the jith element of A^T. And, sure, they are equal. That's what the transpose is in terms of matrices, right? The point here is that you know (x^T)Ay=(x^T)(B^T)y. If you put x=ei and y=ej, doesn't that show that the ijth entry of A is equal to the ijth entry of B^T? That's how you can conclude A=B^T.
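The basis-vector trick is easy to verify concretely: sandwiching A between the standard basis vectors picks out individual entries. A minimal pure-Python sketch (`dot` and `matvec` are helpers defined inline for illustration):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    return [dot(row, v) for row in M]

A = [[1, 2], [3, 4]]
n = 2
for i in range(n):
    for j in range(n):
        ei = [1 if k == i else 0 for k in range(n)]
        ej = [1 if k == j else 0 for k in range(n)]
        # e_i^T A e_j picks out exactly the (i, j) entry of A
        assert dot(ei, matvec(A, ej)) == A[i][j]
```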
 
  • #17
Dick said:
I'm picking the ei to be the column vector with a 1 in the ith place and zeros elsewhere. So <ei,Aej> would be the ijth entry of A. That makes <A^Tei,ej>=<ej,A^Tei> the jith element of A^T. And, sure, they are equal. That's what the transpose is in terms of matrices, right? The point here is that you know (x^T)Ay=(x^T)(B^T)y. If you put x=ei and y=ej, doesn't that show that the ijth entry of A is equal to the ijth entry of B^T? That's how you can conclude A=B^T.

I see. Let me ask you something about my first statement


if <x,Ay> = <Bx,y>
x^T A y = x^T B^T y
A = B^T, then can I say A^T = (B^T)^T = B, so B = A^T

I understand I can't cancel the x^T and y but could I multiply both sides by x then wouldn't x*x^T become a number that I could cancel then do the same with right hand multiply by y^T??
 
  • #18
Wildcat said:
I see. Let me ask you something about my first statement


if <x,Ay> = <Bx,y>
x^T A y = x^T B^T y
A = B^T, then can I say A^T = (B^T)^T = B, so B = A^T

I understand I can't cancel the x^T and y but could I multiply both sides by x then wouldn't x*x^T become a number that I could cancel then do the same with right hand multiply by y^T??

x*x^T isn't a number. It's an n×n non-invertible matrix. x^T*x is a number.
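The distinction is easy to see on a concrete vector (a minimal pure-Python sketch):

```python
x = [1, 2, 3]

# x^T * x: a single number (the squared length of x)
scalar = sum(a * a for a in x)           # 1 + 4 + 9 = 14

# x * x^T: a 3x3 outer-product matrix; every row is a scalar
# multiple of x itself, so it has rank 1 and is not invertible
outer = [[a * b for b in x] for a in x]
assert outer[1] == [2 * v for v in outer[0]]
```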
 
  • #19
Dick said:
x*x^T isn't a number. It's an n×n non-invertible matrix. x^T*x is a number.

how about y * y^T? that would be a number that could cancel?
then could you transpose both sides to get
(x^T A y)^T = (x^T B^T y)^T
then A^T x = B x, which means A^T = B ?
 
  • #20
Wildcat said:
how about y * y^T? that would be a number that could cancel?
then could you transpose both sides to get
(x^T A y)^T = (x^T B^T y)^T
then A^T x = B x, which means A^T = B ?

Why are you still looking for a number to cancel? It's possible that x^TAy=x^TBy for some particular values of x and y even if A is not equal to B. You have to use the fact it's true for ALL x and y. Use an orthonormal basis.
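A concrete illustration of why one particular pair (x, y) proves nothing (a pure-Python sketch; the matrices A and B here are arbitrary examples chosen for the demonstration):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    return [dot(row, v) for row in M]

def quad(x, M, y):
    return dot(x, matvec(M, y))  # computes x^T M y

A = [[1, 0], [0, 0]]
B = [[0, 0], [0, 1]]
x = [1, 1]
y = [1, 1]

# x^T A y equals x^T B y for this particular x and y ...
assert quad(x, A, y) == quad(x, B, y) == 1
# ... even though A != B, so a single pair (x, y) is not enough
assert A != B
```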
 
  • #21
Dick said:
Why are you still looking for a number to cancel? It's possible that x^TAy=x^TBy for some particular values of x and y even if A is not equal to B. You have to use the fact it's true for ALL x and y. Use an orthonormal basis.

It's not that I'm looking for a number to cancel, necessarily, I guess I'm trying to see if what I originally started with can be done that way. I do understand using the orthonormal basis just wondering if it could be done the other way.
 

Related to Linear Algebra / Dot Product / matrices

1. What is linear algebra?

Linear algebra is a branch of mathematics that deals with the study of linear equations, vectors, vector spaces, and linear transformations. It is used to solve systems of linear equations, model real-world problems, and analyze data.

2. What is a dot product?

A dot product, also known as a scalar product, is a mathematical operation that takes two vectors and returns a scalar quantity. It is calculated by multiplying the corresponding components of the two vectors and then summing the results. The dot product is often used in physics, engineering, and statistics.
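For example, the component-wise computation looks like this (a minimal Python sketch):

```python
u = [1, 2, 3]
v = [4, -5, 6]

# multiply corresponding components, then sum the results
dp = sum(a * b for a, b in zip(u, v))  # 1*4 + 2*(-5) + 3*6 = 12
assert dp == 12
```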

3. How is the dot product related to matrices?

The dot product is closely related to matrices, as it can be represented as the product of a row vector and a column vector. In other words, the dot product of two vectors can be calculated by multiplying the transpose of one vector by the other vector, resulting in a scalar (a 1×1 matrix). This is the standard inner product on Rⁿ.

4. What are matrices used for in linear algebra?

Matrices are used in linear algebra to represent systems of linear equations, perform operations such as addition, subtraction, and multiplication, and solve problems involving transformations and vector spaces. They are also used in computer graphics, economics, and other fields to model and analyze data.

5. What are the applications of linear algebra?

Linear algebra has a wide range of applications in various fields, including physics, engineering, computer science, statistics, economics, and more. It is used to solve systems of equations, analyze data, create computer graphics, and develop algorithms for machine learning and artificial intelligence. Linear algebra is also essential in understanding and solving problems in quantum mechanics and relativity.
