Linear Algebra: "Dependent Rows?"


Homework Help Overview

The discussion revolves around the concept of linear dependence of rows in matrices, specifically questioning whether the linear dependence of the rows of matrix A implies the linear dependence of the rows of the product AB. The subject area is linear algebra, focusing on matrix properties and relationships.

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants explore the definition of linear dependence and its implications for rows of matrices. There is a question about the relationship between rows and columns, and whether the concepts apply similarly. Some participants discuss the properties of invertible matrices and their relation to linear independence.

Discussion Status

The discussion is ongoing, with participants providing insights and clarifications regarding the definitions and properties of linear dependence. Some guidance has been offered regarding the relationship between rows and columns, but no consensus has been reached on the original question.

Contextual Notes

There is mention of potential constraints regarding the types of matrices being discussed, particularly the distinction between square matrices and more general cases. The original poster expresses uncertainty about the definitions being used in their course materials.

1nvisible
If the rows of A are linearly dependent, are the rows of AB also linearly dependent? Justify answer.

I don't completely understand this question because (so far) my instructor and my textbook have not discussed what it means for rows to be linearly dependent. There is a similar question in my textbook that asked, "if the columns of B are linearly dependent, show that the columns of AB are also."

Does this have something to do with transposes? Can I just think about it the same way I have been with columns, or does "row" dependence mean something different?
 
[itex](AB)^T = B^TA^T[/itex], so yes, they're more or less the same problem.
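The transpose identity above is easy to check numerically; a minimal sketch with made-up example matrices:

```python
import numpy as np

# Hypothetical example matrices, just to illustrate the identity (AB)^T = B^T A^T.
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

lhs = (A @ B).T   # transpose of the product
rhs = B.T @ A.T   # product of the transposes, in reversed order
print(np.array_equal(lhs, rhs))  # True: the two sides agree entry by entry
```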

Vectors [itex]v_1, \dots, v_n[/itex] are linearly independent if [itex]k_1v_1 + \dots + k_nv_n = 0[/itex] (the zero vector) implies all the scalars [itex]k_i[/itex] are 0.

They are linearly dependent if one of the vectors can be written as a linear combination of the others (or is contained in their span). This is equivalent to saying they're not linearly independent.
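One practical way to test this definition numerically is via matrix rank: stacking the vectors as rows, they are linearly dependent exactly when the rank is less than the number of vectors. A sketch with made-up vectors:

```python
import numpy as np

# Example vectors (invented for illustration); v2 = 2*v1, so the set is dependent.
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([2.0, 4.0, 6.0])
v3 = np.array([0.0, 1.0, 0.0])

# Stack as rows; rank < number of rows means some row is a combination of the others.
M = np.vstack([v1, v2, v3])
dependent = np.linalg.matrix_rank(M) < M.shape[0]
print(dependent)  # True, because v2 = 2*v1
```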

(Sorry for the multiple edits, a bit rusty.)
 
In particular, the rows of matrix A are linearly independent if and only if A is invertible. But then [itex]A^T[/itex] is also invertible and its rows are A's columns.
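For the square case this statement can be checked with a rank computation: a square matrix is invertible exactly when its rank equals its size, i.e. when its rows are independent. A sketch with invented matrices:

```python
import numpy as np

# Two made-up 2x2 matrices: one with independent rows, one with dependent rows.
A_indep = np.array([[1.0, 0.0], [1.0, 1.0]])
A_dep   = np.array([[1.0, 2.0], [2.0, 4.0]])  # second row = 2 * first row

for A in (A_indep, A_dep):
    # Full rank <=> rows independent <=> A invertible (square case only).
    full_rank = np.linalg.matrix_rank(A) == A.shape[0]
    print(full_rank)  # True for A_indep, False for A_dep
```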
 
HallsofIvy said:
In particular, the rows of matrix A are linearly independent if and only if A is invertible. But then [itex]A^T[/itex] is also invertible and its rows are A's columns.

This is true only for square matrices. I believe the question is asking about matrices in more generality.

I would write the components of a row vector in AB in terms of the dot products of a row vector in A and column vectors in B. Which row vector in AB can be written as a linear combination of the other row vectors in AB?
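The hint above can be sketched numerically. Any dependence relation [itex]c[/itex] on the rows of A (a nonzero row vector with [itex]cA = 0[/itex]) is automatically a dependence relation on the rows of AB, since [itex]c(AB) = (cA)B = 0[/itex], and this needs no squareness assumption. Example matrices here are made up:

```python
import numpy as np

# A has dependent rows: second row = 2 * first row.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
# B is non-square on purpose: the argument does not require square matrices.
B = np.array([[3.0, 1.0, 0.0],
              [5.0, 2.0, 7.0]])

# A dependence relation for the rows of A: 2*row1 - row2 = 0.
c = np.array([2.0, -1.0])
print(c @ A)        # [0. 0.]
print(c @ (A @ B))  # [0. 0. 0.]: the same relation kills the rows of AB
```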
 