Proving Orthogonality Between Row and Column Vectors of Invertible Matrices

In summary, to show that row vector i of a matrix A and column vector j of A⁻¹ are orthogonal when i ≠ j, you must prove that their dot product equals 0. The dot product is computed by multiplying each element of the row vector by the corresponding element of the column vector and summing all those products. But that is exactly how entry (i, j) of the product AA⁻¹ is computed, and since AA⁻¹ = I, every entry off the diagonal is 0. It follows that the dot product of row vector i and column vector j is 0 whenever i ≠ j.
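The summary above can be checked numerically. Below is a minimal sketch using NumPy (the 4×4 random matrix is an invented example; a random Gaussian matrix is invertible with probability 1):

```python
import numpy as np

# Hypothetical example: a random 4x4 matrix, almost surely invertible
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A_inv = np.linalg.inv(A)

# The dot product of row i of A with column j of A_inv is entry (i, j)
# of the product A @ A_inv = I, so it vanishes whenever i != j.
i, j = 0, 2
dot = A[i, :] @ A_inv[:, j]
print(abs(dot) < 1e-9)  # True: the two vectors are orthogonal
```

The tolerance is there only because floating-point inversion is inexact; in exact arithmetic the off-diagonal dot products are exactly 0.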
  • #1
Clandry
Let A be an n × n invertible matrix. Show that if
i ≠ j, then row vector i of A and column vector
j of A⁻¹ are orthogonal.



I'm lost as to where to start.

I want to show that row vector i of A is orthogonal to column vector j of A⁻¹.
Orthogonal means the dot product is 0, or equivalently that the angle between the two vectors is 90 degrees.

After stating the obvious I'm stuck. I think I need to start by figuring out what the relationship between a row vector from A and a column vector from A⁻¹ is, but how do I do that?
 
  • #2
How would you show that the dot product of (row vector i of A) and (column vector j of A⁻¹) is equal to 0? And in what kind of operation would you perform several such calculations between row vectors of one matrix and column vectors of another?
 
  • #3
If v represents the row vector and w represents the column vector, then I must show that v · w = 0, where · means the dot product.

The dot product is computed by multiplying each element of v by the corresponding element of w and summing all those products. But I'm not sure how this will show what the problem wants.
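As a concrete illustration of that element-by-element computation (the numbers here are made up), in plain Python:

```python
# Hypothetical example vectors; v plays the role of a row, w of a column
v = [1.0, 2.0, -1.0]
w = [3.0, 0.0, 3.0]

# Multiply corresponding elements and sum the products
dot = sum(vk * wk for vk, wk in zip(v, w))
print(dot)  # 1*3 + 2*0 + (-1)*3 = 0.0, so v and w are orthogonal
```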
 
  • #4
Apart from calculating dot products, when does one multiply the elements of a row with the elements of a column and take the sum of the results?

If that's no help, try this: how would you prove that A⁻¹ really is an inverse of A? What kind of calculation could you perform to show this?
 
  • #5
Michael Redei said:
Apart from calculating dot products, when does one multiply the elements of a row with the elements of a column and take the sum of the results?

If that's no help, try this: how would you prove that A⁻¹ really is an inverse of A? What kind of calculation could you perform to show this?
A⁻¹A = I?
 
  • #6
How does one multiply one matrix by another? Say you wanted to multiply A by A⁻¹. Can you explain how you'd obtain the element in row i and column j of the resulting matrix?
 
  • #7
Michael Redei said:
How does one multiply one matrix by another? Say you wanted to multiply A by A⁻¹. Can you explain how you'd obtain the element in row i and column j of the resulting matrix?

OH! I see what you're saying. You'd multiply the row by the column and sum up the products.
That's exactly how the dot product of the two vectors is computed.
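To make that connection concrete, here is a small sketch (the 2×2 matrix is an invented example with determinant 1) showing that entry (i, j) of A·A⁻¹ is precisely this row-times-column sum:

```python
import numpy as np

# A small invertible matrix (made-up example, determinant 1)
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
B = np.linalg.inv(A)  # B plays the role of A^{-1}

# Entry (i, j) of A @ B is the dot product of row i of A with column j of B
i, j = 0, 1
entry = sum(A[i, k] * B[k, j] for k in range(2))
print(abs(entry) < 1e-12)  # True: an off-diagonal entry of the identity
```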
 
  • #8
All you need now is some criterion to decide which elements in AA⁻¹ can be zero and which can't.
 
  • #9
Michael Redei said:
All you need now is some criterion to decide which elements in AA⁻¹ can be zero and which can't.

Okay, thanks for the help.

I am still having trouble. Or maybe I understand it but just don't realize it.

In AA⁻¹ = I, everything off the diagonal is 0. Everything on the diagonal is nonzero, but that's when i = j. Everything off the diagonal is 0 because i ≠ j.

Is that the idea?
 
  • #10
That's exactly the right idea. You can safely ignore the diagonal, since your initial question was only about i≠j.
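Written out, the argument the thread converges on is a one-line computation using the standard formula for an entry of a matrix product:

```latex
\left(AA^{-1}\right)_{ij}
  = \sum_{k=1}^{n} a_{ik}\,\bigl(A^{-1}\bigr)_{kj}
  = (\text{row } i \text{ of } A)\cdot(\text{column } j \text{ of } A^{-1})
  = I_{ij}
  = \begin{cases} 1 & i = j,\\ 0 & i \neq j. \end{cases}
```

So for i ≠ j the dot product equals the corresponding off-diagonal entry of the identity matrix, which is 0.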
 

Related to Proving Orthogonality Between Row and Column Vectors of Invertible Matrices

1. What is an orthogonal vector?

An orthogonal vector is a vector that is perpendicular to another vector, meaning that the two vectors form a 90-degree angle with each other.

2. How can I prove that two vectors are orthogonal?

To prove that two vectors are orthogonal, you can use the dot product. If the dot product of two vectors is equal to 0, then the vectors are orthogonal.
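For instance (with invented vectors), the check is a single dot product:

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 1.0, -2.0])

# u and v are orthogonal exactly when their dot product is zero
print(np.dot(u, v))  # 1*2 + 2*1 + 2*(-2) = 0.0
```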

3. What is the significance of orthogonal vectors in mathematics?

Orthogonal vectors play an important role in mathematics as they can simplify calculations and provide a geometric understanding of vector operations. They are also used in applications such as physics, engineering, and computer graphics.

4. Can two non-zero vectors ever be orthogonal?

Yes, two non-zero vectors can be orthogonal. For example, (1, 0) and (0, 1) are both non-zero, and their dot product is 1·0 + 0·1 = 0. What cannot happen is a non-zero vector being orthogonal to itself, since the dot product of a vector with itself equals the square of its magnitude, which is non-zero.

5. Is the concept of orthogonal vectors limited to just two dimensions?

No, the concept of orthogonal vectors can be extended to any number of dimensions. In higher dimensions, orthogonal vectors are still defined as vectors that form a 90-degree angle with each other.
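A quick sketch in five dimensions (the vectors are invented for illustration; neither lies along a coordinate axis, yet they are orthogonal):

```python
import numpy as np

# Two orthogonal vectors in R^5
u = np.array([1.0, 1.0, 1.0, 1.0, 0.0])
v = np.array([1.0, -1.0, 1.0, -1.0, 7.0])
print(np.dot(u, v))  # 1 - 1 + 1 - 1 + 0 = 0.0
```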
