Wildcat said:
Homework Statement
Let V = Rⁿ and let A ∈ Mnxn(R). Prove that <x,Ay> = <A^T x, y> for all x, y ∈ V.
Homework Equations
The Attempt at a Solution
Can someone tell me if I'm on the right track?
<x,Ay> = x^T Ay = (x^T A)y = (A^T x)^T y = <A^T x, y>
Wildcat said:Suppose B ∈ Mnxn(R), and <x,Ay> = <Bx,y> for all x, y ∈ V. Prove that B = A^T.
Wouldn't that be the same thing? or do I have to show something else?
micromass said:Well, you have proven that B=A^T implies that <x,Ay>=<Bx,y>. But now you have to do the reverse. You don't know that B=A^T, that's what you need to show. So you can't say "assume B=A^T", since there is no reason why you can assume this...
Wildcat said:Thanks, I need to think a little more
Wildcat said:so I need to show B = A^T
if <x,Ay> = <Bx,y>
then x^T Ay = x^T B^T y.
If A = B^T, then can I say A^T = (B^T)^T = B, so B = A^T??
Dick said:The trouble is that you can't just 'cancel' x^T and y from both sides of the equation to get A=B^T. It's not like x^T and y have inverses or anything like that. I'd suggest you pick a basis and argue from there.
Wildcat said:I was afraid I couldn't cancel them. I'm not sure what you mean by pick a basis. I looked through the chapters on inner product spaces and adjoint of a linear operator and orthonormal basis is in some of the theorems but I'm not sure how to use that.
Dick said:Pick a basis {ei}. The A_ij entry of the matrix in that basis is <ei,Aej>, right? I might have the indices backwards, but do you get my point?
Wildcat said:then ei^T A ej = (ei^T A) ej = <A^T ei, ej>, which would be the ijth entry of A^T?
Dick said:I'm picking the ei to be the column vector with a 1 in the ith place and zeros elsewhere. So <ei,Aej> would be the ijth entry of A. That makes <A^Tei,ej>=<ej,A^Tei> the jith element of A^T. And, sure, they are equal. That's what the transpose is in terms of matrices, right? The point here is that you know (x^T)Ay=(x^T)(B^T)y. If you put x=ei and y=ej, doesn't that show that the ijth entry of A is equal to the ijth entry of B^T? That's how you can conclude A=B^T.
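Dick's claim that <ei, A ej> picks out the (i, j) entry of A can be checked numerically. A minimal sketch (the matrix A below is an arbitrary example, not from the thread):

```python
import numpy as np

# An arbitrary example matrix (not from the thread).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
n = A.shape[0]

def e(i, n):
    """Standard basis column vector: 1 in the ith place, zeros elsewhere."""
    v = np.zeros(n)
    v[i] = 1.0
    return v

# <ei, A ej> = ei^T (A ej) picks out the (i, j) entry of A.
for i in range(n):
    for j in range(n):
        assert e(i, n) @ (A @ e(j, n)) == A[i, j]
```

Here `A @ e(j, n)` is the jth column of A, and dotting it with `e(i, n)` selects its ith component, which is exactly A[i, j].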
Wildcat said:I see. Let me ask you something about my first statement
if <x,Ay> = <Bx,y>
then x^T Ay = x^T B^T y.
If A = B^T, then can I say A^T = (B^T)^T = B, so B = A^T?
I understand I can't cancel the x^T and y, but could I multiply both sides by x? Then wouldn't x*x^T become a number that I could cancel? And then do the same on the right-hand side, multiplying by y^T?
Dick said:x*x^T isn't a number. It's an nxn noninvertible matrix. x^T*x is a number.
Wildcat said:how about y * y^T? Would that be a number that could cancel?
then could you transpose both sides to get
(x^T Ay)^T = (x^T B^T y)^T
then A^Tx = Bx which means A^T = B ?
Dick said:Why are you still looking for a number to cancel? It's possible that x^TAy=x^TBy for some particular values of x and y even if A is not equal to B. You have to use the fact it's true for ALL x and y. Use an orthonormal basis.
Linear algebra is a branch of mathematics that deals with the study of linear equations, vectors, vector spaces, and linear transformations. It is used to solve systems of linear equations, model real-world problems, and analyze data.
A dot product, also known as a scalar product, is a mathematical operation that takes two vectors and returns a scalar quantity. It is calculated by multiplying the corresponding components of the two vectors and then summing the results. The dot product is often used in physics, engineering, and statistics.
The dot product is closely related to matrices: it can be written as the product of a row vector and a column vector. In other words, the dot product of two vectors can be computed by multiplying the transpose of one vector by the other, giving a 1x1 matrix. This is the standard inner product <x,y> = x^T y used in the thread above.
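The two descriptions of the dot product above — componentwise multiply-and-sum, and row vector times column vector — can be illustrated directly (a minimal sketch; the vectors are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Componentwise multiply-and-sum definition: 1*4 + 2*5 + 3*6 = 32.
dot = sum(ui * vi for ui, vi in zip(u, v))

# Same value as the matrix product u^T v (row vector times column vector).
assert dot == u @ v == 32.0
```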
Matrices are used in linear algebra to represent systems of linear equations, perform operations such as addition, subtraction, and multiplication, and solve problems involving transformations and vector spaces. They are also used in computer graphics, economics, and other fields to model and analyze data.
Linear algebra has a wide range of applications in various fields, including physics, engineering, computer science, statistics, economics, and more. It is used to solve systems of equations, analyze data, create computer graphics, and develop algorithms for machine learning and artificial intelligence. Linear algebra is also essential in understanding and solving problems in quantum mechanics and relativity.