MHB Solving Linear Equations: $Ax=b$ and Rank of A

Velo
So, my linear algebra book, if you can call it that, says the following:

$Ax=b$ is a system of linear equations with $m$ equations and $n$ variables. $v_1, v_2, \ldots, v_n$ are the vectors in the columns of $A$. The following are equivalent:

(1) The system $Ax=b$ is possible for every vector $b\in\mathbb{R}^m$.
(2) Every vector $b\in\mathbb{R}^m$ is a linear combination of $A$'s columns.
(3) $b\in\operatorname{span}\{v_1, v_2, \ldots, v_n\}$ for every $b\in\mathbb{R}^m$.
(4) $\operatorname{span}\{v_1, v_2, \ldots, v_n\}=\mathbb{R}^m$
(5) $r(A)=m$

I don't get why $r(A)=m$ necessarily if the system is possible... Wouldn't, for example, the matrix:

1 1 1 | 0
0 0 1 | 0
0 0 0 | 0

obtained after applying Gaussian elimination, be consistent? Because $x=-y$, $z = 0$, and $y$ could take on any arbitrary value?
And $r(A) = 2$, which is less than the initial number of equations...
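To make the example concrete, here is a quick numerical sanity check (a NumPy sketch, just an illustration of the example above):

```python
import numpy as np

# The row-reduced coefficient matrix from the example above
A = np.array([[1, 1, 1],
              [0, 0, 1],
              [0, 0, 0]])
b = np.zeros(3)

# The rank is 2, even though there are m = 3 rows
print(np.linalg.matrix_rank(A))  # 2

# x = (-1, 1, 0) satisfies Ax = b, so this particular system is consistent
x = np.array([-1.0, 1.0, 0.0])
print(np.allclose(A @ x, b))  # True
```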
 
If $Ax = b$ is solvable for every $b\in \mathbb{R}^m$, then every vector in $\mathbb{R}^m$ is an element of the column space of $A$. Hence, $\mathbb{R}^m$ equals the column space of $A$. The column space has dimension $r(A)$ and $\mathbb{R}^m$ has dimension $m$, so $r(A) = m$.
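This argument can be illustrated numerically with the standard rank test for consistency (a NumPy sketch; the matrix and the random right-hand sides are just for demonstration):

```python
import numpy as np

def consistent(A, b):
    # Ax = b is consistent iff adjoining b to A does not increase the rank
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

# Here r(A) = m = 2: the columns span R^2, so every b gives a consistent system
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])
rng = np.random.default_rng(0)
print(all(consistent(A, rng.standard_normal(2)) for _ in range(50)))  # True
```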
 
But why is the column space's dimension $r(A)$? Wouldn't that be assuming that the system is linearly independent? Is that what they meant when they said that $Ax=b$ is possible/solvable? 'Cause the way I interpreted it, I thought that all (1) was saying is that the system was consistent... Did I misinterpret?
 
Velo said:
But why is the column space's dimension $r(A)$? Wouldn't that be assuming that the system is linearly independent? Is that what they meant when they said that $Ax=b$ is possible/solvable? 'Cause the way I interpreted it, I thought that all (1) was saying is that the system was consistent... Did I misinterpret?

I believe that (1) is saying that the system $A\mathbf{x} = \mathbf{b}$ is consistent for every $\mathbf{b} \in \mathbb{R}^m$, not just for one particular $\mathbf{b}$. Then (1) and (5) are indeed equivalent.

While I am not a native speaker of English, I don't think the phrase "the system is possible" is customary. Usually one says that the system is consistent for one or more choices of the right-hand side, or that it is possible to solve $A\mathbf{x} = \mathbf{b}$ for one or more choices of $\mathbf{b}$.

Also, in the particular case where the matrix $A$ is square ($m = n$) and $A\mathbf{x} = \mathbf{b}$ is consistent for every choice of $\mathbf{b}$, the matrix $A$ itself is called "nonsingular".
 
Yes, I apologize. English is not my native language either, and most of the mathematical concepts I know are in Portuguese, so I have a rough time translating them sometimes :') I think my confusion comes mostly from when a matrix equation ends up having something like $0=0$, though... Technically, that equation isn't linear since it doesn't define a line, right? So it's irrelevant whether that equation shows up in the system or not... And so...

1 0 0 | 0
0 0 1 | 0

and

1 0 0 | 0
0 0 1 | 0
0 0 0 | 0

Would both represent the same system of equations (the same line in this case)? Even though the matrix $A$ in the first example is in $\mathbb{R}^2$ and in the second example it's in $\mathbb{R}^3$... Because, since the value of the 3rd position in all of the columns in that matrix is 0, it's almost as if we're not using that third dimension at all... Basically, when we say that $r(A)=m$, we're excluding all non-linear equations from the system, is that it?
 
Velo said:
Yes, I apologize.
No need for that.

Velo said:
I think my confusion comes mostly from when a matrix equation ends up having something like $0=0$ though... Technically, that equation isn't linear since is doesn't define a line, right? So it's irrelevant if that equation shows up in the system or not...
What $0 = 0$ really means, is
\[
0\cdot x_1 + 0\cdot x_2 + \ldots + 0\cdot x_n = 0
\]
and of course this equation by itself is satisfied by every point $\mathbf{x} \in \mathbb{R}^n$. In this sense, the above equation is linear, because its solution set is a linear subspace of $\mathbb{R}^n$, namely $\mathbb{R}^n$ itself. The equation does not impose any extra conditions on $\mathbf{x}$.

Velo said:
And so...

1 0 0 | 0
0 0 1 | 0

and

1 0 0 | 0
0 0 1 | 0
0 0 0 | 0

Would both represent the same system of equations (the same line in this case)?

The two augmented matrices that you gave represent the same system, in the sense that their solution sets (i.e. the sets of all $\mathbf{x} \in \mathbb{R}^3$ satisfying the respective system) are the same.
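One way to spot-check this numerically: any $\mathbf{x}$ of the form $(0, t, 0)$ solves both systems (a NumPy sketch, just an illustration):

```python
import numpy as np

A2 = np.array([[1.0, 0.0, 0.0],
               [0.0, 0.0, 1.0]])       # 2x3 coefficient matrix
A3 = np.vstack([A2, np.zeros(3)])      # same matrix with a 0 = 0 row appended

# The common solution set is {(0, t, 0) : t real}; spot-check a few values of t
for t in (-2.0, 0.0, 3.5):
    x = np.array([0.0, t, 0.0])
    assert np.allclose(A2 @ x, 0.0) and np.allclose(A3 @ x, 0.0)
print("same solutions on all sampled points")
```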

Velo said:
Even though the matrix $A$ in the first example is in $\mathbb{R}^2$ and in the second example it's in $\mathbb{R}^3$... Because, since the value of the 3rd position in all of the columns in that matrix is 0, it's almost as if we're not using that third dimension at all... Basically, when we say that $r(A)=m$, we're excluding all non-linear equations from the system, is that it?

The matrices are not in $\mathbb{R}^2$ or $\mathbb{R}^3$. Rather, the first augmented matrix represents a linear system with a right-hand side $\mathbf{b} \in \mathbb{R}^2$, while the second augmented matrix represents a linear system with a right-hand side $\mathbf{b} \in \mathbb{R}^3$.

So, in the first case, you have $A \in \mathbb{R}^{2 \times 3}$ and $\mathbf{b} \in \mathbb{R}^2$.
In the second case, you have $A \in \mathbb{R}^{3 \times 3}$ and $\mathbf{b} \in \mathbb{R}^3$.
In both cases the augmented system is written $(A\,|\, \mathbf{b})$.

The first matrix $A$ has rank two. It is the maximal rank that any $2 \times 3$ matrix can have.
The second matrix $A$ has rank two as well, while the maximal rank of any $3 \times 3$ matrix is three.
So, according to your theorem of equivalences, there must be at least one $\mathbf{b} \in \mathbb{R}^3$ for which $(A\,|\,\mathbf{b})$ is not consistent.
Can you find an example?
 
Something along the lines of $[0, 0, 1]^{t}$? Or really any vector with a third position different from zero...
 
Velo said:
Something along the lines of ${[0, 0, 1]}^{t}$? Or really any vector with a third position different from zero...

Not sure what you're aiming for here.
Anyway, the rank of the matrix is equal to the number of independent rows, which is also equal to the number of independent columns.
And if the system is supposed to be solvable for any $b$, we require at least as many independent rows/columns as there are elements in $b$.
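That row rank always equals column rank is easy to check numerically (a NumPy sketch using the second matrix from the thread):

```python
import numpy as np

# Row rank equals column rank, so rank(A) == rank(A^T) for any matrix A
A = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 0, 0]])
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))  # 2 2
```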
 
I like Serena said:
Not sure what you're aiming for here.
Krylov asked me to give an example of one $b\in\mathbb{R}^{3}$ for which $(A|b)$ is not consistent... With $A$ equal to the second matrix, I assumed... So if the last position of the $b$ vector is non-zero (1 in my example), the equation would be something like $0x_{1}+0x_{2}+0x_{3}=1$, or $0 = 1$, so the system wouldn't be consistent... Right?
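This reasoning can be verified with the rank comparison (a NumPy sketch; consistency holds iff adjoining $b$ does not raise the rank):

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
b = np.array([0.0, 0.0, 1.0])

# rank(A) = 2 but rank(A | b) = 3: adjoining b raises the rank,
# so Ax = b has no solution
print(np.linalg.matrix_rank(A))                        # 2
print(np.linalg.matrix_rank(np.column_stack([A, b])))  # 3
```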

I like Serena said:
Anyway, the rank of the matrix is equal to the number of independent rows, which is also equal to the number of independent columns.
And if the system is supposed to be solvable for any $b$, we require at least as many independent rows/columns as there are elements in $b$.
I see... I didn't know that the number of linearly independent rows was always equal to the number of linearly independent columns... So, the vectors of the transpose of a matrix will always be independent if the vectors of the original matrix were independent as well?

Edit: Spelling mistakes.
 
Velo said:
Krylov asked me to give an example of one $b\in\mathbb{R}^{3}$ for which $(A|b)$ is not consistent... With $A$ equal to the second matrix, I assumed... So if the last position of the $b$ vector is non-zero (1 in my example), the equation would be something like $0x_{1}+0x_{2}+0x_{3}=1$, or $0 = 1$, so the system wouldn't be consistent... Right?
Exactly.
 
