DumpmeAdrenaline

Let X be the matrix

```
\begin{pmatrix}
2 & 4 & 6 \\
3 & 5 & 8 \\
1 & 2 & 3
\end{pmatrix}
```

After swapping R1 and R3 so that the pivot entry is 1, and then applying the row operations R2 <-- R2 - 3R1 and R3 <-- R3 - 2R1, we find the row echelon form of the matrix.

```
\begin{pmatrix}
1 & 2 & 3 \\
0 & -1 & -1 \\
0 & 0 & 0
\end{pmatrix}
```
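These row operations can be checked numerically. The following is a minimal sketch in NumPy (my choice of tool; the question does not mention any software) that performs the swap and the two eliminations directly:

```python
import numpy as np

# The matrix X from the question.
A = np.array([[2, 4, 6],
              [3, 5, 8],
              [1, 2, 3]], dtype=float)

# Swap R1 and R3 so the pivot row is [1, 2, 3].
A[[0, 2]] = A[[2, 0]]

# R2 <- R2 - 3*R1 and R3 <- R3 - 2*R1.
A[1] = A[1] - 3 * A[0]
A[2] = A[2] - 2 * A[0]

print(A)  # row echelon form
```

Printing A reproduces the row echelon form shown above, with one nonzero pivot in each of the first two rows.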

Based on the definition of row space in the book I am studying from, the row space is the subspace consisting of all linear combinations of the rows of X, which is in general an infinite collection of vectors.

To check whether the row vectors [1,2,3] and [0,-1,-1] are linearly independent, we write

$$ \beta_{1} [1,2,3]+\beta_{2} [0,-1,-1]=[\beta_{1}, 2\beta_{1}-\beta_{2}, 3\beta_{1}-\beta_{2}]=[0,0,0] $$

where β1 and β2 are scalars that belong to the field of real numbers.
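This homogeneous system has only the trivial solution exactly when the matrix whose rows are the two vectors has rank 2, which can be verified numerically. A short NumPy sketch (again, the library choice is mine, not part of the question):

```python
import numpy as np

# Stack the two candidate basis rows. The system
# beta1*[1,2,3] + beta2*[0,-1,-1] = [0,0,0] has only the
# trivial solution exactly when this matrix has rank 2.
V = np.array([[1, 2, 3],
              [0, -1, -1]], dtype=float)

print(np.linalg.matrix_rank(V))  # 2 -> linearly independent
```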

From this equation, the only scalars (solution) that yield the zero row vector are β1 = β2 = 0. Therefore, the row vectors are linearly independent. How do we determine whether these independent row vectors span the subspace, so that they form a basis for it? Is the subspace considered here the infinite collection of 1×3 row vectors? If they are basis vectors for the subspace, does this imply that if we add a new row vector to the given matrix we can write it in terms of the identified linearly independent vectors, or do we have to go through a new LU decomposition?