Understanding Ax=b: Linear Dependency & Solutions

CalculusSandwich
So it states: The Equation Ax=b has a solution if and only if b is a linear combination of the columns of A.

That means the columns of A are linearly dependent.

So then if I have a matrix A and a vector B, and after row reduction on Ax=B I get the identity matrix.

So does that imply that Ax=B has no solutions?

Or that Ax=B has the trivial solution.
 
Your question is very confused. In particular you say "after row reduction on Ax= B" but you don't apply row reduction to an equation. There are two different ways I could interpret that:
One is that you applied row reduction to the matrix A alone and got the identity matrix. If that were true then there would exist a single solution, obtained by applying the same row reduction to B.

The other is that you applied row reduction to the augmented matrix, A with B added as an additional column. If A is square, it would make no sense to get "the identity matrix", because the identity matrix is square while that augmented matrix is not: it has one more column than rows. If instead A has more rows than columns (say a 3x2 A with a 3x1 B, so that the augmented matrix is 3x3), then reducing the augmented matrix to the identity means the last row corresponds to the equation "0 = 1", which is false. In that case the system, and so the matrix equation, has no solution.
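To see that concretely, here is a minimal sketch using SymPy's rref on a made-up 3x2 example of exactly this situation (all of the specific numbers are invented for illustration):

```python
from sympy import Matrix

# Hypothetical 3x2 coefficient matrix and 3x1 right-hand side.
A = Matrix([[1, 0],
            [0, 1],
            [1, 1]])
b = Matrix([1, 1, 3])      # 3 != 1 + 1, so b is not a combination of A's columns

aug = A.row_join(b)        # augmented matrix [A | b], which is 3x3 here
rref_aug, pivots = aug.rref()
print(rref_aug)            # the 3x3 identity matrix
print(pivots)              # (0, 1, 2): a pivot sits in the augmented column,
                           # i.e. one row reads "0 = 1", so Ax = b has no solution
```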
 
CalculusSandwich said:
So it states: The Equation Ax=b has a solution if and only if b is a linear combination of the columns of A.

That means the columns of A are linearly dependent.
It doesn't mean that at all. It means that b must be in the subspace spanned by the columns of A for Ax=b to have a solution. If the subspace spanned by the columns of A is equal to the space from which b can be drawn, the equation Ax=b has a solution for every b.
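A convenient computational restatement of this is the rank test: Ax=b is solvable exactly when adjoining b to A does not increase the rank. A small sketch (assuming SymPy; the matrices are made up for illustration):

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [2, 4]])            # dependent columns: they span only a line

b_in  = Matrix([3, 6])          # lies on that line
b_out = Matrix([3, 5])          # does not

def is_consistent(A, b):
    # Ax = b has a solution iff b is in the column space of A,
    # i.e. iff rank(A) == rank([A | b]).
    return A.rank() == A.row_join(b).rank()

print(is_consistent(A, b_in))   # True  -> at least one solution
print(is_consistent(A, b_out))  # False -> no solution
```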
 
Maybe you meant that the augmented matrix [A | b] has dependent columns?
 
CalculusSandwich said:
So then if I have a matrix A and a vector B, and after row reduction on Ax=B I get the identity matrix.

So does that imply that Ax=B has no solutions?
Assuming you meant row reduction on the augmented matrix: no. It has no solutions if there is some row whose coefficient entries are all 0 but whose entry in the augmented column (i.e. the entry coming from the vector b) is non-zero.

Or that Ax=B has the trivial solution.
The trivial solution applies only to the homogeneous linear system of equations, i.e. when b is the zero vector. What you probably mean is that the coefficient matrix on the left reduces to the identity matrix; in that case the condition above (which would imply there are no solutions) cannot occur, and you can say the system has exactly 1 solution.
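To illustrate that last distinction, here is a sketch with an invented 2x2 system whose coefficient matrix reduces to the identity, so Ax=b has exactly one solution, while the "trivial solution" belongs to the separate homogeneous system Ax=0:

```python
from sympy import Matrix, linsolve, symbols

x1, x2 = symbols('x1 x2')

A = Matrix([[2, 1],
            [1, 3]])                  # invertible: row-reduces to the identity
b = Matrix([5, 10])

print(A.rref()[0])                    # the 2x2 identity matrix
print(linsolve((A, b), x1, x2))       # {(1, 3)} -- exactly one solution

# The "trivial solution" refers to the homogeneous system Ax = 0:
print(linsolve((A, Matrix([0, 0])), x1, x2))   # {(0, 0)} -- here it is the only solution
```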
 
Defennder said:
Assuming you meant row reduction on the augmented matrix: no. It has no solutions if there is some row whose coefficient entries are all 0 but whose entry in the augmented column (i.e. the entry coming from the vector b) is non-zero.

The trivial solution applies only to the homogeneous linear system of equations, i.e. when b is the zero vector. What you probably mean is that the coefficient matrix on the left reduces to the identity matrix; in that case the condition above (which would imply there are no solutions) cannot occur, and you can say the system has exactly 1 solution.

Ok, I'm confused here. Thanks for the replies, but yes Ivy, I meant that the augmented matrix for Ax=B is written out and row reduced. You say that there will be a row of 0's, but there won't, because B is the solution set, so won't it just be a square matrix if A is 3x2 and b is 3x1? Am I correct in the assumption that I would only get x1 and x2 solutions while x3 is free?

Also, Defennder, you say that when I row reduce and get a row of 0's, there is no solution. But my book states that in that case there is a free variable, in the column with no pivot in it, and therefore Ax=B has infinitely many solutions, depending on what value you assign to the free variable.

Ok, I understand that trivial and nontrivial solutions apply to Ax=0, so then Ax=b can have either 1, infinitely many, or no solutions?
 
CalculusSandwich said:
Also, Defennder, you say that when I row reduce and get a row of 0's, there is no solution.
No: if you get a row of 0's in the coefficient part and the corresponding entry in the augmented column is non-zero, then there are no solutions. Getting a row of 0's throughout (augmented entry included) means there is a free variable, and hence, provided the system is consistent, an infinite number of solutions.

E.g. suppose this is what you get after row-reducing the augmented matrix for the matrix equation Ax=b, with two equations in two variables:

$$\left( \begin{array}{cc|c} 1 & 1 & 3 \\ 0 & 0 & 2 \end{array} \right)$$

Then the system of equations has no solutions, because the last row reads 0 = 2.

Ok, I understand that trivial and nontrivial solutions apply to Ax=0, so then Ax=b can have either 1, infinitely many, or no solutions?
Yes.
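As a quick check on all three cases discussed in this thread, here is a sketch using SymPy with small invented 2x2 systems; the last one is the inconsistent example from the post above, and the exact printed form of the infinite family may vary between versions:

```python
from sympy import Matrix, linsolve, symbols

x, y = symbols('x y')

# Unique solution: independent columns.
print(linsolve((Matrix([[1, 0], [0, 1]]), Matrix([2, 3])), x, y))
# {(2, 3)}

# Infinitely many solutions: dependent columns, b inside their span
# (row reduction produces a row of 0's throughout).
print(linsolve((Matrix([[1, 2], [2, 4]]), Matrix([3, 6])), x, y))
# a one-parameter family, e.g. {(3 - 2*y, y)}

# No solution: the example above, where a row reads "0 = 2".
print(linsolve((Matrix([[1, 1], [0, 0]]), Matrix([3, 2])), x, y))
# EmptySet
```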
 
It may be useful to think about such things geometrically.

If A is a matrix and x is a vector, then Ax is another vector that is a linear combination of the columns of A:
$$Ax = \left[\begin{matrix}\vec{a}_1 & | & \vec{a}_2 & | & \cdots & | & \vec{a}_m\end{matrix}\right] \left[\begin{matrix}x_1 \\ x_2 \\ \vdots \\ x_m\end{matrix}\right] = x_1 \vec{a}_1 + x_2 \vec{a}_2 + \cdots + x_m \vec{a}_m$$

So you can see that if Ax = b is true, then b has to be a linear combination of the columns of A.
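A small numeric check of that identity (assuming SymPy; the 3x2 matrix and the weights are made up):

```python
from sympy import Matrix

A = Matrix([[1, 4],
            [2, 5],
            [3, 6]])
x = Matrix([10, 100])

# The matrix-vector product...
Ax = A * x
# ...equals the combination of A's columns weighted by the entries of x.
combo = 10 * A.col(0) + 100 * A.col(1)

print(Ax)              # Matrix([[410], [520], [630]])
print(Ax == combo)     # True
```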

---

For example, suppose
$$A = \left[\begin{matrix}1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0\end{matrix}\right]$$

Then the columns of A all lie in the xy plane in 3D ([1 0 0]', [0 1 0]', and [1 1 0]'). Since there is no way to combine vectors in the xy plane to get a vector off the plane, the only b's for which Ax=b can have a solution are those lying in the xy plane (b = [bx, by, 0]').
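The same picture in code, using the matrix above (the only assumptions are the two made-up right-hand sides):

```python
from sympy import Matrix, linsolve, symbols

x1, x2, x3 = symbols('x1 x2 x3')

A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [0, 0, 0]])        # every column has third component 0

b_on_plane  = Matrix([2, 3, 0])
b_off_plane = Matrix([2, 3, 1])

print(linsolve((A, b_on_plane), x1, x2, x3))   # infinitely many solutions: b is in the column space
print(linsolve((A, b_off_plane), x1, x2, x3))  # EmptySet: no x can give Ax a non-zero third component
```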
 
CalculusSandwich said:
So it states: The Equation Ax=b has a solution if and only if b is a linear combination of the columns of A.

That means the columns of A are linearly dependent.
No, it doesn't. If A is square and its columns are linearly independent, they span the whole underlying vector space, so Ax=b has a solution for every b. If the columns of A are not linearly independent, then they span only a proper subspace; Ax is in that subspace for every x, and so Ax=b can hold only if b is in that subspace, i.e. if b is a linear combination of the columns of A.

So then if I have a matrix A and a vector B, and after row reduction on Ax=B I get the identity matrix.
Only if the columns of A are linearly independent!

So does that imply that Ax=B has no solutions
Or that Ax=B has the trivial solution.
Neither. It implies that Ax= B has a unique solution for all vectors B. It implies that the equation Ax= 0 has the unique solution x= 0 which is the "trivial solution".

If the columns of A are not linearly independent (so A is not invertible) then Ax= B either has an infinite number of solutions (if B is in the subspace spanned by the columns of A) or there is no solution (if B is not in the subspace spanned by the columns of A).

(If the columns of A are not linearly independent, then the null space or "kernel" of A is non-trivial. If x is a solution to Ax= B, then so is x+ y for any y in the null space of A. That's why Ax= B has an infinite number of solutions in this case.)
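The "particular solution plus null space" picture is easy to see with a small invented example (a sketch, not tied to any particular matrix in the thread):

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [2, 4]])            # dependent columns, so the null space is non-trivial
b = Matrix([3, 6])              # b is in the column space, so solutions exist

x_particular = Matrix([3, 0])   # one solution: A * [3, 0]^T = [3, 6]^T
null_basis = A.nullspace()      # basis of the null space, here [Matrix([-2, 1])]

print(A * x_particular == b)    # True
for t in range(3):
    x = x_particular + t * null_basis[0]
    print(A * x == b)           # True for every t: infinitely many solutions
```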
 