MHB Linear Independence of Vectors: Why Determinant ≠ 0?

Petrus
Hello MHB,
I have a question. If we have the vectors $$V=(3,a,1)$$, $$U=(a,3,2)$$ and $$W=(4,a,2)$$, why are they linearly independent if the determinant is not equal to zero? (I am not interested in solving the problem, I just want to know why it is true.)

Regards,
$$|\pi\rangle$$
 
Petrus said:
Hello MHB,
I have a question. If we have the vectors $$V=(3,a,1)$$, $$U=(a,3,2)$$ and $$W=(4,a,2)$$, why are they linearly independent if the determinant is not equal to zero? (I am not interested in solving the problem, I just want to know why it is true.)

Regards,
$$|\pi\rangle$$

If you put your vectors as the columns of a matrix, you get a linear map defined by that matrix.
If you can "reach" all of $\mathbb R^3$ with this map, your vectors are linearly independent.
The determinant shows whether this is possible: zero means it's not.
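To make that concrete, here is a small sketch in Python with SymPy (my own illustration, not part of the reply above), using the vectors from the first post with $a$ kept as a symbol:

```python
import sympy as sp

a = sp.symbols('a')

# Put V, U and W as the rows of a 3x3 matrix; using columns instead
# gives the transpose, which has the same determinant, so the test is identical.
M = sp.Matrix([
    [3, a, 1],   # V
    [a, 3, 2],   # U
    [4, a, 2],   # W
])

det = sp.expand(M.det())
print(det)  # a polynomial in a; the vectors are independent exactly when this is nonzero
```

The map $x \mapsto Mx$ reaches all of $\mathbb R^3$ precisely when this determinant is nonzero, which is the same condition as the three vectors being linearly independent.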
 
I like Serena said:
If you put your vectors as the columns of a matrix, you get a linear map defined by that matrix.
If you can "reach" all of $\mathbb R^3$ with this map, your vectors are linearly independent.
The determinant shows whether this is possible: zero means it's not.
Thanks! I'm starting to understand now! :)

Regards,
$$|\pi\rangle$$
 
Are these statements true?
1. We can check whether a set of vectors is linearly independent by checking that the determinant is not equal to zero.
2. For a linearly independent matrix there exists an inverse.
3. I found this theorem on the internet, which is not stated in my book:
"If a set of vectors $$v_1,v_2,v_3,\dots,v_p$$ lies in $$\mathbb R^n$$, then it is linearly dependent if $$p>n$$."

Regards,
$$|\pi\rangle$$
 
Petrus said:
Are these statements true?
1. We can check whether a set of vectors is linearly independent by checking that the determinant is not equal to zero.
2. For a linearly independent matrix there exists an inverse.
3. I found this theorem on the internet, which is not stated in my book:
"If a set of vectors $$v_1,v_2,v_3,\dots,v_p$$ lies in $$\mathbb R^n$$, then it is linearly dependent if $$p>n$$."

Regards,
$$|\pi\rangle$$

You can only calculate the determinant of a square matrix.
That means you can only use a determinant to check independence of n vectors, each of dimension n.

An inverse can only exist for a square matrix.
If the matrix is square and the vectors in it are linearly independent, then there exists an inverse.

If you have n linearly independent vectors, they span an n-dimensional space, like $\mathbb R^n$.
If you add one more vector, it already lies in that n-dimensional span, so it cannot be independent of the others.
So a set of n+1 vectors in $\mathbb R^n$ must be dependent.
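As a rough illustration of these three points (my own sketch, not from the reply above): NumPy's rank function works for non-square collections of vectors where the determinant does not, and it shows that four vectors in $\mathbb R^3$ are forced to be dependent.

```python
import numpy as np

# Three vectors in R^3 stacked as the rows of a square matrix:
# nonzero determinant <=> full rank <=> the inverse exists.
A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [4.0, 1.0, 2.0]])
print(np.linalg.det(A))          # nonzero, so the rows are independent
print(np.linalg.inv(A))          # and the matrix is invertible

# Four vectors in R^3 form a 4x3 matrix: no determinant is defined,
# but the rank is at most 3 < 4, so the four rows must be dependent.
B = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [2.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
print(np.linalg.matrix_rank(B))  # at most 3
```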
 
I like Serena said:
You can only calculate the determinant of a square matrix.
That means you can only use a determinant to check independence of n vectors, each of dimension n.

An inverse can only exist for a square matrix.
If the matrix is square and the vectors in it are linearly independent, then there exists an inverse.

If you have n linearly independent vectors, they span an n-dimensional space, like $\mathbb R^n$.
If you add one more vector, it already lies in that n-dimensional span, so it cannot be independent of the others.
So a set of n+1 vectors in $\mathbb R^n$ must be dependent.
Thanks, I meant a square matrix :)

Regards,
$$|\pi\rangle$$
 