This is the proof of an important theorem that I do not manage to understand.
"A matrix has rank k if - and only if - it has k rows - and k columns - linearly independent, whilst each one of the remaining rows - and columns - is a linear combination of the k preceding ones".
Let's suppose that the matrix is
$$\begin{bmatrix}a_{1,1}&a_{1,2}& \ldots&a_{1,n}\\a_{2,1}&a_{2,2}&\ldots&a_{2,n}\\\ldots&\ldots&\ldots&\ldots\\a_{m,1}&a_{m,2}&\ldots&a_{m,n}\end{bmatrix}\tag{1}$$
and let's suppose that the minor
$$\lambda = \begin{vmatrix}a_{1,1}&a_{1,2}&\ldots&a_{1,k}\\a_{2,1}&a_{2,2}&\ldots&a_{2,k}\\\ldots&\ldots&\ldots&\ldots\\a_{k,1}&a_{k,2}&\ldots&a_{k,k}\end{vmatrix}\tag{23}$$
formed by the first k rows and the first k columns (a situation one can always reach by suitable exchanges of rows and columns) is not zero. Note that the first k rows are linearly independent: if they were not, at least one of them would be a linear combination of the others, and minor (23) would vanish (a determinant in which one row is a linear combination of the others is zero), against the hypothesis. Let us now consider the minor (24), of order k+1, $$\begin{vmatrix}a_{1,1}&\ldots&a_{1,k}&a_{1,j}\\\ldots&\ldots&\ldots&\ldots\\a_{k,1}&\ldots&a_{k,k}&a_{k,j}\\a_{i,1}&\ldots&a_{i,k}&a_{i,j}\end{vmatrix}\tag{24}$$
with i = k+1, k+2, …, m and j = 1, 2, …, n. This minor is zero: for j ≤ k because it contains two identical columns, and for j > k as well, because the matrix has rank k and therefore every minor of order k+1 vanishes. Indeed, determinant (24) can be expanded along the elements of its last column, and one gets:
$$\lambda a_{i,j}+\lambda_1 a_{1,j}+\lambda_2 a_{2,j}+\ldots+\lambda_k a_{k,j}=0\tag{25}$$
where $$\lambda_1,\lambda_2,\ldots,\lambda_k\tag{26}$$
are nothing but the algebraic complements, with respect to determinant (24), of $a_{1,j},a_{2,j},\ldots,a_{k,j}$. It should be explicitly noted that the coefficients (26) are constants and do not depend on j (WHY?). Solving (25) for $a_{i,j}$ (recall that λ is not zero) and setting $\mu_1=-\lambda_1/\lambda$, $\mu_2=-\lambda_2/\lambda$, …, $\mu_k=-\lambda_k/\lambda$, one gets $$a_{i,j}=\mu_1 a_{1,j}+\mu_2 a_{2,j}+\ldots+\mu_k a_{k,j},\qquad 1\le j\le n.$$ As a consequence, the i-th row of the matrix (i > k) is nothing but a linear combination of the first k rows.
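To fix the notation (this way of writing it is mine; the text leaves it implicit), expanding (24) along its last column, the h-th algebraic complement should be
$$\lambda_h=(-1)^{h+k+1}\begin{vmatrix}a_{1,1}&\ldots&a_{1,k}\\\ldots&\ldots&\ldots\\a_{h-1,1}&\ldots&a_{h-1,k}\\a_{h+1,1}&\ldots&a_{h+1,k}\\\ldots&\ldots&\ldots\\a_{k,1}&\ldots&a_{k,k}\\a_{i,1}&\ldots&a_{i,k}\end{vmatrix},\qquad h=1,2,\ldots,k,$$
in which only entries taken from columns 1 to k of matrix (1) appear.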
The theorem is thus proved for rows, and it can be demonstrated analogously for columns.
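To see the mechanism at work, here is a toy check with numbers of my own choosing (they are not in the text): take m = n = 3, k = 2 and
$$\begin{bmatrix}1&2&3\\0&1&1\\1&3&4\end{bmatrix},\qquad\lambda=\begin{vmatrix}1&2\\0&1\end{vmatrix}=1\neq 0.$$
Bordering with i = 3 and j = 3 gives $\lambda_1=+\begin{vmatrix}0&1\\1&3\end{vmatrix}=-1$ and $\lambda_2=-\begin{vmatrix}1&2\\1&3\end{vmatrix}=-1$, so (25) reads $1\cdot 4+(-1)\cdot 3+(-1)\cdot 1=0$, whence $\mu_1=\mu_2=1$: the third row is indeed the sum of the first two.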
I think that assuming that the coefficients $\lambda_1,\lambda_2,\ldots,\lambda_k$ (the algebraic complements with respect to determinant (24)) depend not on j (that is, on the column) but only on i (that is, on the row beyond the rank), even though the text of the proof does not state this explicitly, is essentially equivalent to the proof itself. Even more, I wonder why algebraic complements with the same index should always be equal, in the corresponding positions, independently of the column on which the determinant is computed. The whole theorem rests on the fact that the coefficients λ depend only on the row they belong to and that, as a consequence, coefficients of the same row are equal even when the column varies. In fact, if they depended in general on (i, j), that is, if they were $\lambda_{i,j}$ as I would have expected them to be, and not simply $\lambda_i$, the theorem would no longer hold.
If one expands determinant (24): $$\begin{vmatrix}a_{1,1}&\ldots&a_{1,k}&a_{1,j}\\\ldots&\ldots&\ldots&\ldots\\a_{k,1}&\ldots&a_{k,k}&a_{k,j}\\a_{i,1}&\ldots&a_{i,k}&a_{i,j}\end{vmatrix}=\lambda a_{i,j}+(-1)^{k+2}\begin{vmatrix}a_{2,1}&\ldots&a_{2,k}\\\ldots&\ldots&\ldots\\a_{k,1}&\ldots&a_{k,k}\\a_{i,1}&\ldots&a_{i,k}\end{vmatrix}a_{1,j}+\ldots\tag{24-25}$$ one can easily realize that
$$\lambda_1=(-1)^{k+2}\begin{vmatrix}a_{2,1}&\ldots&a_{2,k}\\\ldots&\ldots&\ldots\\a_{k,1}&\ldots&a_{k,k}\\a_{i,1}&\ldots&a_{i,k}\end{vmatrix}$$ does not depend on j (it is, obviously, built from rows 2 to k and i and from columns 1 to k).
But, formally at least, the same does not seem to hold for the other columns. In fact, if one computes the λ's by bordering with another column, I do not manage to show formally that, for the same index, they come out equal, and this, nevertheless, is what a sound proof should guarantee.
In fact, it is not enough to prove that the λ's do not depend on j; one should also prove that they are equal when computed on different columns of the matrix.
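In symbols (again, the notation is mine, not the book's): writing $\lambda_h^{(j)}$ for the h-th algebraic complement obtained when (24) is bordered with column j, what I believe a sound proof must establish is
$$\lambda_h^{(j)}=\lambda_h^{(j')}\qquad\text{for every }h=1,\ldots,k\text{ and all columns }j,\,j',$$
so that writing the single symbol $\lambda_h$ in (25) becomes legitimate.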
Am I right, or what is it that is completely escaping me in this proof?
"A matrix has rank k if - and only if - it has k rows - and k columns - linearly independent, whilst each one of the remaining rows - and columns - is a linear combination of the k preceding ones".
Let's suppose that the matrix is
$$\begin{bmatrix}a_{1,1}&a_{1,2}& \ldots&a_{1,n}\\a_{2,1}&a_{2,2}&\ldots&a_{2,n}\\\ldots&\ldots&\ldots&\ldots\\a_{m,1}&a_{m,2}&\ldots&a_{m,n}\end{bmatrix}\tag{1}$$
and let's suppose that the minor
$$\lambda = \begin{vmatrix}a_{1,1}&a_{1,2}&\ldots&a_{1,k}\\a_{2,1}&a_{2,2}&\ldots&a_{2,k}\\\ldots&\ldots&\ldots&\ldots\\a_{k,1}&a_{k,2}&\ldots&a_{k,n}\end{vmatrix}\tag{23}$$
formed by the first k rows and the first k columns - situation which it is always possible to get to by means of adequate substitutions between rows and columns - is not null. It should be noted that the first k rows are linearly independent. Indeed, should the contrary be true, at least one of the rows would be a linear combination of the remaining ones and minor (23) would be null against the hypothesis. Let's now consider minor (24) - having order k+1 - $$\begin{vmatrix}a_{1,1}&\ldots&a_{1,k}&a_{1,j}\\\ldots&\ldots&\ldots&\ldots\\a_{k,1}&\ldots&a_{k,k}&a_{k,j}\\a_{i,1}&\ldots&a_{i,k}&a_{i,j}\end{vmatrix}\tag{24}$$
with i=k+1,k+2,…,m and j=1,2,...,n. This minor is null: if j=<k because it happens to feature two identical columns, but even if j>k. Actually, determinant (24) can be developed according to the elements of the last column and one gets:
λai,j+λ1a1,j+λ2a2,j+ . . . + λkak,j = 0 (25)
where λ1, λ2, . . .,λk (26)
are nothing else but the algebraic complements - with respect to determinant (24) - of a1,j, a2,j, . . .,ak,j. It should be explicitly noted that (26) are constants and that they do not depend on j - WHY? - . After solving (25) with respect to ai,j - remember that λ is not null - and putting μ1 = −λ1/λ, μ2 = − λ2/λ, . . ., μk = − λk/λ, one gets ai,j = μ1a1,j + μ2a2,j + . . . +μkak,j with 1≤j≤n. As a consequence, the i-th row (i>k) in the matrix is nothing else but a linear combination of the first k ones.
The theorem is thus proved as for raws and can be analogously demonstrated as for columns.
I think that assuming that coefficients λ1, λ2, . . ., λk - algebraic complements with respect to the determinant (24) - do not depend on j - that is on the column, but only on i, that is on the row beyond the rank, even if the text of the demonstration does not explicitly admits it - is totally equivalent to the demonstration itself. But - even more - I wonder why the algebraic complements of the same index should always be equal in the corresponding positions independently on the columns on which one calculates the determinant. All the theorem is based on the fact that coefficients λ's depend only on the row to which they belong and that , as a consequence, if they belong to the same row, even if the column varies, they are equal. In fact, if they - generally - depended on (i,j) - that is if they were λi,j as I would have expected them to be and not simply λi -, the theorem would not be valid any longer.
If one develops the determinant (24 ): $$\begin{vmatrix}a_{1,1}&\ldots&a_{1,k}&a_{1,j}\\\ldots&\ldots&\ldots&\ldots\\a_{k,1}&\ldots&a_{k,k}&a_{k,j}\\a_{i,1}&\ldots&a_{i,k}&a_{i,j}\end{vmatrix} = \lambda a_{i,j} + \begin{vmatrix}a_{2,1}&\ldots&a_{2,k}\\a_{k,1}&\dots&a_{k,k}\\a_{i,1}&\ldots&a_{i,k}\end{vmatrix} a_{1,j} + . . . \tag{24-25} $$, one can easily realize that
$$\lambda_1 = \begin{vmatrix}a_{2,1}&\ldots&a_{2,k}\\a_{k,1}&\dots&a_{k,k}\\a_{i,1}&\ldots&a_{i,k}\end{vmatrix}$$ does not depend on j (it's, obviously, made with the lines 2 to k and i and the columns 1 to k).
But - formally, at least - it's not the same for the other columns. In fact, if one calculates λ's on other columns, it proves impossible to manage to formally show that - for the same index - they are equal and this - nevertheless - is what a sound demonstration should grant.
In fact, it is not sufficient to only prove that λ's are not dependent on j's, but it should be proved that they are equal when calculated on different columns of the matrix.
Am I right or not or what is totally escaping me in this demonstration ?