Solving Linear Equations: $Ax=b$ and Rank of A

  • #1
Velo
So, my linear algebra book, if you can call it that, says the following:

$Ax=b$ is a system of linear equations with $m$ equations and $n$ variables. $v_1, v_2, \ldots, v_n$ are the column vectors of $A$. The following are equivalent:

(1) The system $Ax=b$ is possible for every vector $b\in\mathbb{R}^m$.
(2) Every vector $b\in\mathbb{R}^m$ is a linear combination of the columns of $A$.
(3) $b\in\operatorname{span}\left\{v_1, v_2, \ldots, v_n\right\}$ for every $b\in\mathbb{R}^m$.
(4) $\operatorname{span}\left\{v_1, v_2, \ldots, v_n\right\}=\mathbb{R}^m$
(5) $r(A)=m$

I don't get why $r(A)=m$ necessarily holds if the system is possible... Wouldn't, for example, the matrix:

1 1 1 | 0
0 0 1 | 0
0 0 0 | 0

obtained after applying Gaussian elimination, be possible? Because $x=-y$, $z = 0$, and $y$ could take on any arbitrary value?
And $r(A) = 2$, which is less than the initial number of equations...
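The rank of the row-reduced matrix above can be double-checked numerically; a small sketch with NumPy:

```python
import numpy as np

# Row-reduced coefficient matrix from the example above
# (np.linalg.matrix_rank computes the rank numerically).
A = np.array([[1, 1, 1],
              [0, 0, 1],
              [0, 0, 0]])

r = np.linalg.matrix_rank(A)
print(r)  # 2, while the system has m = 3 equations
```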
 
  • #2
If $Ax = b$ is solvable for every $b\in \mathbb{R}^m$, then every vector in $\mathbb{R}^m$ is an element of the column space of $A$. Hence, $\mathbb{R}^m$ equals the column space of $A$. The column space has dimension $r(A)$ and $\mathbb{R}^m$ has dimension $m$, so $r(A) = m$.
 
  • #3
But why is the column space's dimension $r(A)$? Wouldn't that be assuming that the columns are linearly independent? Is that what they meant when they said that $Ax=b$ is possible/solvable? 'Cause the way I interpreted it, I thought that all (1) was saying was that the system was consistent... Did I misinterpret?
 
  • #4
Velo said:
But why is the column space's dimension $r(A)$? Wouldn't that be assuming that the columns are linearly independent? Is that what they meant when they said that $Ax=b$ is possible/solvable? 'Cause the way I interpreted it, I thought that all (1) was saying was that the system was consistent... Did I misinterpret?

I believe that (1) is saying that the system $A\mathbf{x} = \mathbf{b}$ is consistent for every $\mathbf{b} \in \mathbb{R}^m$, not just for one particular $\mathbf{b}$. Then (1) and (5) are indeed equivalent.

While I am not a native speaker of English, I don't think the phrase "the system is possible" is customary. Usually one says that the system is consistent for one or more choices of the right-hand side, or that it is possible to solve $A\mathbf{x} = \mathbf{b}$ for one or more choices of $\mathbf{b}$.

Also, in the particular case where the matrix $A$ is square ($m = n$) and $A\mathbf{x} = \mathbf{b}$ is consistent for every choice of $\mathbf{b}$, the matrix $A$ itself is called "nonsingular".
 
  • #5
Yes, I apologize. English is not my native language either, and most of the mathematical concepts I know are in Portuguese, so I have a rough time translating them sometimes :') I think my confusion comes mostly from when a matrix equation ends up containing something like $0=0$, though... Technically, that equation isn't linear since it doesn't define a line, right? So it's irrelevant whether that equation shows up in the system or not... And so...

1 0 0 | 0
0 0 1 | 0

and

1 0 0 | 0
0 0 1 | 0
0 0 0 | 0

Would both represent the same system of equations (the same line in this case)? Even though the matrix A in the first example is in $\mathbb{R}^2$ and in the second example it's in $\mathbb{R}^3$... Because, since the value of the 3rd position in all of the columns in that matrix is 0, it's almost as if we're not using that third dimension at all... Basically, when we say that $r(A)=m$, we're excluding all non-linear equations from the system, is that it?
 
  • #6
Velo said:
Yes, I apologize.
No need for that.

Velo said:
I think my confusion comes mostly from when a matrix equation ends up having something like $0=0$ though... Technically, that equation isn't linear since is doesn't define a line, right? So it's irrelevant if that equation shows up in the system or not...
What $0 = 0$ really means, is
\[
0\cdot x_1 + 0\cdot x_2 + \ldots + 0\cdot x_n = 0
\]
and of course this equation by itself is satisfied by every point $\mathbf{x} \in \mathbb{R}^n$. In this sense, the above equation is linear, because its solution set is a linear subspace of $\mathbb{R}^n$, namely $\mathbb{R}^n$ itself. The equation does not impose any extra conditions on $\mathbf{x}$.

Velo said:
And so...

1 0 0 | 0
0 0 1 | 0

and

1 0 0 | 0
0 0 1 | 0
0 0 0 | 0

Would both represent the same system of equations (the same line in this case)?

The two augmented matrices that you gave represent the same system, in the sense that their solution sets (i.e. the sets of all $\mathbf{x} \in \mathbb{R}^3$ satisfying the respective system) are the same.

Velo said:
Even though the matrix A in the first example is in $\mathbb{R}^2$ and in the second example it's in $\mathbb{R}^3$... Because, since the value of the 3rd position in all of the columns in that matrix is 0, it's almost as if we're not using that third dimension at all... Basically, when we say that $r(A)=m$, we're excluding all non-linear equations from the system, is that it?

The matrices are not in $\mathbb{R}^2$ or $\mathbb{R}^3$. Rather, the first augmented matrix represents a linear system with a right-hand side $\mathbf{b} \in \mathbb{R}^2$, while the second augmented matrix represents a linear system with a right-hand side $\mathbf{b} \in \mathbb{R}^3$.

So, in the first case, you have $A \in \mathbb{R}^{2 \times 3}$ and $\mathbf{b} \in \mathbb{R}^2$.
In the second case, you have $A \in \mathbb{R}^{3 \times 3}$ and $\mathbf{b} \in \mathbb{R}^3$.
In both cases the augmented system is written $(A\,|\, \mathbf{b})$.

The first matrix $A$ has rank two. It is the maximal rank that any $2 \times 3$ matrix can have.
The second matrix $A$ has rank two as well, while the maximal rank of any $3 \times 3$ matrix is three.
So, according to your theorem of equivalences, there must be at least one $\mathbf{b} \in \mathbb{R}^3$ for which $(A\,|\,\mathbf{b})$ is not consistent.
Can you find an example?
 
  • #7
Something along the lines of $[0, 0, 1]^{T}$? Or really any vector with a third position different from zero...
 
  • #8
Velo said:
Something along the lines of $[0, 0, 1]^{T}$? Or really any vector with a third position different from zero...

Not sure what you're aiming for here.
Anyway, the rank of the matrix is equal to the number of independent rows, which is also equal to the number of independent columns.
And if the system is supposed to be solvable for any $b$, we require at least as many independent rows/columns as there are elements in $b$.
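The fact that row rank equals column rank can be checked numerically; a small sketch with NumPy, using the $3 \times 3$ matrix from earlier in the thread:

```python
import numpy as np

A = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 0, 0]])

# Row rank equals column rank, so transposing never changes the rank.
r_rows = np.linalg.matrix_rank(A)    # rank of the matrix itself
r_cols = np.linalg.matrix_rank(A.T)  # rank of its transpose
print(r_rows == r_cols)  # True
```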
 
  • #9
I like Serena said:
Not sure what you're aiming for here.
Krylov asked me to give an example of one $b \in \mathbb{R}^3$ for which $(A|b)$ is not consistent... With $A$ equal to the second matrix, I assumed... So if the last position of the vector $b$ is nonzero (1 in my example), the last equation would be something like $0x_1 + 0x_2 + 0x_3 = 1$, or $0 = 1$, so the system wouldn't be consistent... Right?

I like Serena said:
Anyway, the rank of the matrix is equal to the number of independent rows, which is also equal to the number of independent columns.
And if the system is supposed to be solvable for any $b$, we require at least as many independent rows/columns as there are elements in $b$.
I see... I didn't know that the number of linearly independent rows was always equal to the number of linearly independent columns... So, the columns of the transpose of a matrix will always be independent whenever the columns of the original matrix are independent as well?

Edit: Spelling mistakes.
 
  • #10
Velo said:
Krylov asked me to give an example of one $b \in \mathbb{R}^3$ for which $(A|b)$ is not consistent... With $A$ equal to the second matrix, I assumed... So if the last position of the vector $b$ is nonzero (1 in my example), the last equation would be something like $0x_1 + 0x_2 + 0x_3 = 1$, or $0 = 1$, so the system wouldn't be consistent... Right?
Exactly.
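The inconsistency can also be verified numerically: with $b = [0, 0, 1]^T$, the augmented matrix picks up an extra independent row, so its rank exceeds $r(A)$. A sketch with NumPy:

```python
import numpy as np

A = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 0, 0]])
b = np.array([[0], [0], [1]])

rank_A = np.linalg.matrix_rank(A)                   # 2
rank_Ab = np.linalg.matrix_rank(np.hstack([A, b]))  # 3
print(rank_A < rank_Ab)  # True: the system is inconsistent
```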
 

1. What are linear equations?

Linear equations are equations in which each variable appears only to the first power, possibly together with a constant term. A system of $m$ linear equations in $n$ variables can be written in the form $Ax = b$, where $A$ is an $m \times n$ coefficient matrix, $x$ is a column vector of variables, and $b$ is a column vector of constants.

2. How do you solve linear equations?

To solve a linear equation $Ax = b$ when $A$ is square and invertible, you can multiply both sides of the equation by $A^{-1}$ to isolate the variable vector $x$, giving $x = A^{-1}b$. In general, Gaussian elimination on the augmented matrix $(A \mid b)$ works for any shape of $A$.
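As a minimal sketch, assuming $A$ is square and nonsingular (the matrix and right-hand side here are made up for illustration); in practice `np.linalg.solve` is preferred over forming the inverse explicitly:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)     # preferred: solve without forming A^-1
x_inv = np.linalg.inv(A) @ b  # the textbook x = A^-1 * b
print(np.allclose(x, x_inv))  # True
```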

3. What is the rank of a matrix?

The rank of a matrix is the maximum number of linearly independent rows or columns in the matrix. It represents the dimension of the vector space spanned by the rows (or, equivalently, by the columns) of the matrix. It is also equal to the number of non-zero rows after the matrix has been put into row-echelon form.

4. How does the rank of a matrix affect the solution of a linear equation?

The rank of a matrix determines whether a linear system has a unique solution, infinitely many solutions, or no solution at all. If $r(A) = r([A \mid b]) = n$ (the number of unknowns), the system has a unique solution; if $r(A) = r([A \mid b]) < n$, it has infinitely many solutions; and if $r(A) < r([A \mid b])$, it has no solution.
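This rank test can be sketched as a small helper function; the function name and the example systems below are illustrative:

```python
import numpy as np

def classify(A, b):
    """Classify A x = b by comparing r(A) with r([A | b])."""
    r_A = np.linalg.matrix_rank(A)
    r_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
    if r_A < r_Ab:
        return "no solution"
    # Consistent: the solution is unique iff the rank equals
    # the number of unknowns.
    return "unique solution" if r_A == A.shape[1] else "infinitely many solutions"

print(classify(np.array([[1, 0], [0, 1]]), np.array([1, 2])))  # unique solution
print(classify(np.array([[1, 1], [1, 1]]), np.array([0, 1])))  # no solution
```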

5. Can a matrix have a rank of 0?

Yes. The zero matrix, in which every entry is 0, has rank 0. Any matrix with at least one non-zero entry has rank at least 1.
