Row/Column space in relation to row operations

Summary: Row operations preserve the linear dependence relations among the columns of a matrix but not among its rows, which is crucial for understanding row and column spaces. When finding a basis for the column space, the pivot columns of the row-reduced form identify which columns of the original matrix are linearly independent, precisely because those relations are preserved. The row space of A, by contrast, equals the row space of rref(A) as a subspace, but the dependence relations among the individual rows change under row operations. Understanding this asymmetry is essential for connecting the row and column spaces in linear algebra.
Gridvvk
I'm having trouble wrapping my head around what should be a trivial detail, but it is important, so hopefully someone else putting it in explicit words might help me understand it.

What I am having trouble grasping is why row operations preserve the linear dependence relations for the columns of a matrix but not for the rows.

The context this comes up in is the row and column space of a matrix. Given a matrix A, to find a basis for the column space we would just take the linearly independent columns of A. However, it's usually difficult to tell which columns are independent, so we find rref(A); the pivot positions in rref(A) correspond directly to the pivot positions in A. This is true because row operations preserve the linear dependence relations for the columns.
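To see this concretely, here is a small pure-Python check (the matrix below is my own illustrative example, with its rref worked out by hand): a dependence relation among the columns of A holds with the same coefficients in rref(A), so the pivot columns of the rref point back to a basis chosen from the columns of the original A.

```python
# Illustrative example: A and its (hand-computed) rref R share the same
# dependence relations among columns.

A = [[1, 2, 3],
     [2, 4, 7],
     [1, 2, 4]]

# Row-reducing A by hand gives:
R = [[1, 2, 0],
     [0, 0, 1],
     [0, 0, 0]]   # pivots in columns 0 and 2

col = lambda M, j: [row[j] for row in M]

# The relation col1 = 2 * col0 holds in BOTH matrices:
print(col(A, 1) == [2 * v for v in col(A, 0)])  # True
print(col(R, 1) == [2 * v for v in col(R, 0)])  # True
# So columns 0 and 2 of the ORIGINAL A form a basis for Col(A).
```

The pivot columns of R are 0 and 2, and it is columns 0 and 2 of A itself (not of R) that we keep as the basis.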

For the row space we would take the linearly independent rows of rref(A); this is because the row space of A equals the row space of rref(A). However, the dependence relations for the rows are not the same.

So if I can understand why the dependence relations are the same for columns but different for rows, it would really help me connect everything together.
 
Let ##A## be an ##n\times m## matrix.
- Say you want to switch two rows of ##A## to create a matrix ##B##. Is there an ##n\times n## matrix ##E## you can write down that will carry out the operation for you? i.e. Can you choose ##E## to ensure ##EA=B##?
- Say you want to scale some row of ##A## by a nonzero constant to create a matrix ##C##. Is there an ##n\times n## matrix ##F## you can write down that will carry out the operation for you? i.e. Can you choose ##F## to ensure ##FA=C##?
- Say you want to add a multiple of one row of ##A## to another row, to create a matrix ##D##. Is there an ##n\times n## matrix ##G## you can write down that will carry out the operation for you? i.e. Can you choose ##G## to ensure ##GA=D##?

Now that you've figured out what ##E,F,G## all look like, do you notice any feature they all have? [Hint: Do they have any feature that will ensure, for instance, that ##EA## and ##A## have the same number of linearly independent rows?]
 
economicsnerd said:
Let ##A## be an ##n\times m## matrix.
- Say you want to switch two rows of ##A## to create a matrix ##B##. Is there an ##n\times n## matrix ##E## you can write down that will carry out the operation for you? i.e. Can you choose ##E## to ensure ##EA=B##?

Yes, ##E## would be the elementary matrix corresponding to the same row swap performed on the identity matrix.
economicsnerd said:
Say you want to scale some row of ##A## by a nonzero constant to create a matrix ##C##. Is there an ##n\times n## matrix ##F## you can write down that will carry out the operation for you? i.e. Can you choose ##F## to ensure ##FA=C##?

##F## would be the elementary matrix corresponding to the same scaling performed on the identity matrix.

economicsnerd said:
Say you want to add a multiple of one row of ##A## to another row, to create a matrix ##D##. Is there an ##n\times n## matrix ##G## you can write down that will carry out the operation for you? i.e. Can you choose ##G## to ensure ##GA=D##?

##G## would also be the elementary matrix formed by performing the same operation on the ##n\times n## identity.

economicsnerd said:
Now that you've figured out what ##E,F,G## all look like, do you notice any feature they all have? [Hint: Do they have any feature that will ensure, for instance, that ##EA## and ##A## have the same number of linearly independent rows?]

##E##, ##F##, and ##G## are elementary matrices. ##EA## and ##A## should have the same number of linearly independent rows, but why don't the pivot positions for the rows of ##EA## correspond to the pivot positions in ##A##?
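To make the three elementary matrices concrete, here is a pure-Python sketch (the matrix ##A## and the particular operations chosen are my own illustrative examples): each of ##E##, ##F##, ##G## is the identity with one row operation applied, and left-multiplication performs that same operation on ##A##.

```python
def matmul(X, Y):
    # naive matrix product, enough for a small demo
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2],
     [3, 4],
     [5, 6]]

E = identity(3); E[0], E[1] = E[1], E[0]   # swap rows 0 and 1
F = identity(3); F[2][2] = 10              # scale row 2 by 10
G = identity(3); G[1][0] = -3              # add -3 * (row 0) to row 1

print(matmul(E, A))  # [[3, 4], [1, 2], [5, 6]]
print(matmul(F, A))  # [[1, 2], [3, 4], [50, 60]]
print(matmul(G, A))  # [[1, 2], [0, -2], [5, 6]]
```

The feature the hint points at: each of ##E##, ##F##, ##G## is invertible, since every row operation can be undone by another row operation of the same kind.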
 
If you perform elementary column operations instead of row operations, then the linear relations between the rows are preserved, but not those between the columns. On the other hand, the column space is unchanged, but not the row space.
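This mirror statement can also be checked numerically (toy matrix of my own choosing): a column operation is right-multiplication by an invertible matrix, and it leaves a row relation ##y^T A = 0## intact with the same coefficients ##y##.

```python
def vecmat(y, M):
    # compute the row vector y^T M
    return [sum(y[i] * M[i][j] for i in range(len(y))) for j in range(len(M[0]))]

A = [[1, 2],
     [2, 4],
     [3, 7]]

# Row relation of A: 2 * row0 - row1 = 0, i.e. y^T A = 0 with y = (2, -1, 0).
y = [2, -1, 0]
print(vecmat(y, A))   # [0, 0]

# Column operation: add -2 * (column 0) to column 1, giving A -> AE.
AE = [[1, 0],
      [2, 0],
      [3, 1]]
print(vecmat(y, AE))  # still [0, 0]: the row relation survives
```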
 
Erland said:
If you perform elementary column operations instead of row operations, then the linear relations between the rows are preserved, but not those between the columns. On the other hand, the column space is unchanged, but not the row space.

Yes. That would be logically true, if you buy that row operations preserve linear relations between columns but not rows. You wouldn't even need to invent column operations; you could instead apply the original proposition to the transpose of the matrix.

That still doesn't explain why it is true.
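One way to make the "why" concrete (toy matrices of my own choosing): a relation among the columns of ##A## is a vector ##x## with ##Ax = 0##, and since ##E## is invertible, ##Ax = 0## if and only if ##(EA)x = E(Ax) = 0## — so column relations are exactly preserved. A relation among the rows is a vector ##y## with ##y^T A = 0##, but ##y^T(EA) = (y^T E)A##, which need not be zero for the same ##y##; row relations get transformed by ##E##, not preserved.

```python
def matvec(M, x):
    # compute M x (a combination of the COLUMNS of M)
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def vecmat(y, M):
    # compute y^T M (a combination of the ROWS of M)
    return [sum(y[i] * M[i][j] for i in range(len(y))) for j in range(len(M[0]))]

A = [[1, 2, 3],
     [2, 4, 7],
     [1, 2, 4]]

# EA: the result of adding -2 * (row 0) to row 1 of A
EA = [[1, 2, 3],
      [0, 0, 1],
      [1, 2, 4]]

x = [2, -1, 0]   # column relation of A: 2*col0 - col1 = 0
y = [1, -1, 1]   # row relation of A:    row0 - row1 + row2 = 0

print(matvec(A, x), matvec(EA, x))   # [0, 0, 0] [0, 0, 0] -> preserved
print(vecmat(y, A), vecmat(y, EA))   # [0, 0, 0] [2, 4, 6] -> destroyed
```

The same ##x## annihilates both ##A## and ##EA##, which is why the pivot columns of rref(A) identify independent columns of ##A## itself; the same ##y## no longer annihilates ##EA##, which is why row relations (and row pivot positions) don't carry back.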
 