MHB How can I find out if this matrix A's columns are linearly independent?

Summary
The matrix $A=\begin{bmatrix}1&0\\0&0\end{bmatrix}$ has linearly dependent columns because one column is the zero vector: the equation $\alpha\begin{pmatrix}1\\0\end{pmatrix}+\beta\begin{pmatrix}0\\0\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}$ has nontrivial solutions, since $\beta$ can be anything. Because the matrix projects onto the $x_1$ axis and has a zero column, it is also not one-to-one. Linear independence can likewise be checked by computing the determinant of the matrix formed by the columns.
shamieh
How can I find out if this matrix A's columns are linearly independent?

$\begin{bmatrix}1&0\\0&0\end{bmatrix}$

I see here that $x_1 = 0$ and similarly $x_2 = 0$. Does this mean that this matrix A's columns are therefore linearly dependent?

Also, this is a projection onto the $x_1$ axis, so is it safe to say that it is also one-to-one, since it has only the trivial solution of $0$?
 

Hi shamieh,

The two columns of the matrix are $\begin{pmatrix}1\\0\end{pmatrix}$ and $\begin{pmatrix}0\\0\end{pmatrix}$. To check whether these two vectors are linearly independent, take $\alpha$ and $\beta$ such that
$\alpha\begin{pmatrix}1\\0\end{pmatrix}+\beta\begin{pmatrix}0\\0\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}.$
Then $\alpha=0$, but $\beta\in\mathbb{R}$ is arbitrary, so the equation has nontrivial solutions. Thus these two vectors are linearly dependent.

Another way to test the linear independence of two vectors is to calculate the determinant of the matrix they form: the vectors are independent if and only if the determinant is nonzero. This is illustrated in the following Wikipedia article: https://en.wikipedia.org/wiki/Linear_independence#Alternative_method_using_determinants
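For this particular matrix the determinant is $1\cdot 0 - 0\cdot 0 = 0$, which confirms the dependence. As a purely numerical cross-check (not part of the original exchange), a minimal NumPy sketch comparing the rank with the number of columns and computing the determinant might look like this:

```python
import numpy as np

# Matrix from the question; its columns are (1, 0) and (0, 0).
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Columns are linearly independent iff rank(A) equals the number of columns.
print(np.linalg.matrix_rank(A), "of", A.shape[1], "columns are independent")  # prints 1 of 2

# For a square matrix, a nonzero determinant is an equivalent test.
print(np.linalg.det(A))  # 0.0 -> columns are linearly dependent
```

Here the rank is 1 while the matrix has 2 columns, and the determinant is zero, so both checks agree that the columns are linearly dependent.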
 
