Linear Algebra "Dependent Rows"?

  • Thread starter: 1nvisible
  • Tags: Linear
1nvisible
If the rows of A are linearly dependent, are the rows of AB also linearly dependent? Justify answer.

I don't completely understand this question because (so far) my instructor and my textbook have not discussed what it means for rows to be linearly dependent. There is a similar question in my textbook that asks: "If the columns of B are linearly dependent, show that the columns of AB are also."

Does this have something to do with transposes? Can I just think about it the same way I have been with columns, or does "row" dependence mean something different?
 
##(AB)^T = B^TA^T##, so yes, they're more or less the same problem.
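The transpose identity is easy to spot-check numerically; here's a minimal sketch in Python with numpy (the matrices are just made-up examples, not from the thread):

```python
import numpy as np

# Hypothetical example matrices for checking (AB)^T = B^T A^T.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

lhs = (A @ B).T        # transpose of the product
rhs = B.T @ A.T        # product of transposes, in reversed order
assert np.allclose(lhs, rhs)
```

Note the reversal of order on the right-hand side; that reversal is exactly why a column statement about B turns into a row statement about A under transposition.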

Vectors ##v_1, \dots, v_n## are linearly independent if ##k_1v_1 + \dots + k_nv_n = 0## (the zero vector) implies all the scalars ##k_i## are equal to 0.

They are linearly dependent if one of the vectors can be written as a linear combination of the others (or is contained in their span). This is equivalent to saying they're not linearly independent.
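A quick numerical illustration of that definition, sketched with numpy (the vectors here are hypothetical, chosen so one is a combination of the others):

```python
import numpy as np

# Three vectors in R^3 with v3 deliberately built as v1 + 2*v2,
# so {v1, v2, v3} is linearly dependent.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2 * v2

M = np.vstack([v1, v2, v3])
# Rank below the number of vectors means some nontrivial
# combination k1*v1 + k2*v2 + k3*v3 equals the zero vector.
assert np.linalg.matrix_rank(M) < 3
```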

(Sorry for the multiple edits, a bit rusty.)
 
In particular, the rows of matrix A are linearly independent if and only if A is invertible. But then A^T is also invertible and its rows are A's columns.
 
HallsofIvy said:
In particular, the rows of matrix A are linearly independent if and only if A is invertible. But then A^T is also invertible and its rows are A's columns.

This is true only for square matrices. I believe the question is asking about matrices in more generality.

I would write the components of a row vector in AB in terms of the dot products of a row vector in A and column vectors in B. Which row vector in AB can be written as a linear combination of the other row vectors in AB?
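That hint can also be checked numerically. A sketch with numpy (made-up random matrices; by assumption row 2 of A is built as row 0 plus row 1): since row i of AB is (row i of A)·B, any dependency among the rows of A carries over to the rows of AB with the same coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 4))

# Give A a dependent row: row 2 = row 0 + row 1.
A = rng.standard_normal((3, 3))
A[2] = A[0] + A[1]

AB = A @ B
# The same dependency holds for the rows of AB,
# because row i of AB is (row i of A) @ B.
assert np.allclose(AB[2], AB[0] + AB[1])
assert np.linalg.matrix_rank(AB) < 3
```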
 