Proving Orthogonality Between Row and Column Vectors of Invertible Matrices


Homework Help Overview

The discussion revolves around proving an orthogonality relationship for an invertible matrix: that row vector i of a matrix A is orthogonal to column vector j of its inverse A⁻¹ whenever i ≠ j. Participants explore the implications of orthogonality in the context of linear algebra.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants express confusion about how to establish the orthogonality condition, particularly through the dot product of the specified vectors. Questions arise regarding the operations needed to demonstrate this relationship and the nature of matrix multiplication.

Discussion Status

There is an ongoing exploration of the mathematical principles involved, with some participants suggesting methods for calculating dot products and others questioning the foundational definitions of matrix operations. While some clarity is emerging, particularly regarding the structure of the identity matrix resulting from the multiplication of A and A⁻¹, no consensus has been reached on the specific steps to prove the orthogonality.

Contextual Notes

Participants note the importance of distinguishing between cases where i is equal to j and where i is not equal to j, particularly in relation to the identity matrix's properties. There is also mention of the need for a criterion to identify which elements in the product of A and A⁻¹ can be zero.

Clandry
Let A be an n × n invertible matrix. Show that if i ≠ j, then row vector i of A and column vector j of A⁻¹ are orthogonal.



I'm lost as to where to start.

I want to show that row vector i of A is orthogonal to column vector j of A⁻¹.
Orthogonal means the dot product is 0, or equivalently that the angle between the two vectors is 90 degrees.

After stating the obvious I'm stuck. I think I need to start by figuring out what the relationship between a row vector of A and a column vector of A⁻¹ is, but how do I do that?
 
How would you show that the dot product of (row vector i of A) and (column vector j of A⁻¹) is equal to 0? And in what kind of operation would you perform several such calculations between row vectors of one matrix and column vectors of another?
 
Say v represents the row vector and w represents the column vector. I must show that v · w = 0, where · means the dot product.

The dot product would then be computed by multiplying each element of v by the corresponding element of w and taking the sum of all those products. But I'm not sure how this will show what the problem wants.
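That row-times-column sum can be checked numerically. The sketch below (not from the thread) uses a hypothetical 2×2 matrix whose inverse is easy to write down via the standard 2×2 inverse formula, then dots row i of A with column j of A⁻¹ for a few choices of i and j:

```python
# Illustrative check: dot row i of A with column j of A^-1.
A = [[2.0, 1.0],
     [5.0, 3.0]]

# 2x2 inverse formula: (1/det) * [[d, -b], [-c, a]]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
A_inv = [[ A[1][1] / det, -A[0][1] / det],
         [-A[1][0] / det,  A[0][0] / det]]

def dot_row_col(M, N, i, j):
    """Dot product of row i of M with column j of N."""
    return sum(M[i][k] * N[k][j] for k in range(len(M)))

print(dot_row_col(A, A_inv, 0, 1))  # i != j -> 0.0
print(dot_row_col(A, A_inv, 1, 0))  # i != j -> 0.0
print(dot_row_col(A, A_inv, 0, 0))  # i == j -> 1.0
```

The off-diagonal pairings come out to 0 and the diagonal pairing to 1, which is exactly the pattern the later posts explain.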
 
Apart from calculating dot products, when does one multiply the elements of a row with the elements of a column and take the sum of the results?

If that's no help, try this: how would you prove that A⁻¹ really is an inverse of A? What kind of calculation could you perform to show this?
 
Michael Redei said:
Apart from calculating dot products, when does one multiply the elements of a row with the elements of a column and take the sum of the results?

If that's no help, try this: how would you prove that A⁻¹ really is an inverse of A? What kind of calculation could you perform to show this?

A⁻¹A = I?
 
How does one multiply one matrix by another? Say you wanted to multiply A by A⁻¹. Can you explain how you'd obtain the element in row i and column j of the resulting matrix?
 
Michael Redei said:
How does one multiply one matrix by another? Say you wanted to multiply A by A⁻¹. Can you explain how you'd obtain the element in row i and column j of the resulting matrix?

OH! I see what you're saying. You'd multiply the row by the column and sum up the products.
That would explain how the two vectors get multiplied together.
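To make that connection concrete, here is a minimal sketch (not from the thread) of matrix multiplication in which entry (i, j) of the product is exactly that row-times-column sum; the matrices `A` and `A_inv` are a hypothetical example pair (the inverse worked out by hand):

```python
def matmul(A, B):
    """Multiply A (m x n) by B (n x p). Entry (i, j) of the result is
    the dot product of row i of A with column j of B."""
    m, n, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

A = [[2, 1], [5, 3]]          # det(A) = 1
A_inv = [[3, -1], [-5, 2]]    # inverse of A, by the 2x2 formula
print(matmul(A, A_inv))       # -> [[1, 0], [0, 1]], the identity
```

So computing AA⁻¹ performs every one of the row-dot-column calculations at once, and the result being the identity matrix is what pins down their values.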
 
All you need now is some criterion to decide which elements in AA⁻¹ can be zero and which can't.
 
Michael Redei said:
All you need now is some criterion to decide which elements in AA⁻¹ can be zero and which can't.

Okay, thanks for the help.

I am still having trouble. Or maybe I understand it but just don't realize it.

In AA⁻¹ = I, everything off the diagonal is 0. Everything on the diagonal is nonzero, but that's the case when i = j. Everything off the diagonal is 0 because i ≠ j.

Is that the idea?
 
That's exactly the right idea. You can safely ignore the diagonal, since your initial question was only about i≠j.
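The exchange above can be condensed into a single line. By the definition of matrix multiplication, entry (i, j) of AA⁻¹ is precisely the dot product of row i of A with column j of A⁻¹, and AA⁻¹ = I fixes its value (a sketch, with δ the Kronecker delta):

```latex
\left(AA^{-1}\right)_{ij} \;=\; \sum_{k=1}^{n} a_{ik}\,\bigl(A^{-1}\bigr)_{kj}
\;=\; \delta_{ij} \;=\;
\begin{cases} 1 & i = j, \\ 0 & i \neq j. \end{cases}
```

For i ≠ j the sum is 0, which is exactly the orthogonality the problem asks for.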
 
