MHB Vector Spaces .... Linear Dependence and Independence .... Basic Proof Required

Summary
The discussion revolves around proving linear dependence in vector spaces as outlined in Andrew McInerney's book. A set of vectors is linearly dependent if one vector can be expressed as a linear combination of the others. The proof demonstrates that if a vector is a linear combination of the others, this yields a non-trivial linear combination equal to zero, confirming dependence. It also notes that a single non-zero vector is independent, so this characterization requires at least two vectors. The example provided illustrates a specific case of linear dependence for a set of vectors in $\Bbb R^3$.
Math Amateur
In Andrew McInerney's book: First Steps in Differential Geometry, Theorem 2.4.3 reads as follows: https://www.physicsforums.com/attachments/5252

McInerney leaves the proofs for the Theorem to the reader ...

I am having trouble formulating a proof for Part (3) of the theorem ...

Can someone help ...

Peter
 
Suppose $S$ has $n$ elements, so $S = \{v_1,v_2,\dots,v_n\}$.

If one of these, say, $v_n$ (we can always "re-organize" our set $S$, so that the vector that is a linear combination of the others is the last one), is a linear combination of the others, we have:

$v_n = c_1v_1 + c_2v_2 + \cdots + c_{n-1}v_{n-1}$, for some scalars (field elements) $c_1,\dots,c_{n-1}$.

Hence:

$c_1v_1 + c_2v_2 +\cdots + c_{n-1}v_{n-1} + (-1)v_n = 0$.

These scalars cannot all be $0$: the coefficient of $v_n$ is $-1$, and in any field $1 \neq 0$, hence $-1 \neq -0 = 0$.

So, by the *definition* of linear dependence, $S$ is a linearly dependent set.
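
To see the step concretely (a small illustrative instance, not from the theorem itself): if $n = 3$ and $v_3 = 2v_1 + 3v_2$, the rearrangement above reads

$2v_1 + 3v_2 + (-1)v_3 = 0$,

and the coefficient $-1$ on $v_3$ makes this combination non-trivial.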

One caveat: $n = 1$ doesn't work. Why? Because if $v_1 \neq 0$, the set $\{v_1\}$ is linearly independent, since if:

$c_1v_1 = 0$, then from $v_1 \neq 0$ we must have $c_1 = 0$ (otherwise we could multiply by $c_1^{-1}$ and conclude $v_1 = 0$).

On the other hand, if $S$ is a linearly dependent set of $n$ vectors, then for some $c_1,\dots,c_n$, not ALL $0$, we have:

$c_1v_1 +\cdots + c_nv_n = 0$.

Choose any $c_j \neq 0$ (we have at least one).

Then $v_j = -\left(\dfrac{c_1}{c_j}\right)v_1 - \cdots - \left(\dfrac{c_{j-1}}{c_j}\right)v_{j-1} - \left(\dfrac{c_{j+1}}{c_j}\right)v_{j+1} - \cdots - \left(\dfrac{c_n}{c_j}\right)v_n$

which is a linear combination of the other $n-1$ vectors.
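
A small illustrative instance of this direction (again, not from the original post): from the dependence relation $2v_1 - 4v_2 + 0v_3 = 0$ we may choose $c_1 = 2 \neq 0$, and the formula gives

$v_1 = -\left(\dfrac{-4}{2}\right)v_2 - \left(\dfrac{0}{2}\right)v_3 = 2v_2$.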

For example, the set $S \subseteq \Bbb R^3$ given by:

$S = \{(0,1,0),(0,1,0),(2,5,0)\}$ is linearly dependent, since $1\cdot(0,1,0) + (-1)\cdot(0,1,0) + 0\cdot(2,5,0) = (0,0,0)$ is a non-trivial linear combination equal to the zero vector.
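
Not part of the original thread, but for readers who want a quick numerical sanity check: the sketch below (assuming NumPy is available) verifies the dependence of $S$ via the rank criterion, i.e. $n$ vectors are linearly independent exactly when the matrix having them as rows has rank $n$.

```python
import numpy as np

# The three vectors of S, stacked as the rows of a 3x3 matrix.
S = np.array([
    [0, 1, 0],
    [0, 1, 0],
    [2, 5, 0],
])

# n vectors are linearly independent iff this matrix has rank n.
# Here the rank is 2 < 3, so the vectors are linearly dependent.
print(np.linalg.matrix_rank(S))  # prints 2

# An explicit non-trivial combination: 1*v1 + (-1)*v2 + 0*v3 = 0.
c = np.array([1, -1, 0])
print(c @ S)  # prints [0 0 0]
```

Here `c @ S` computes $c_1v_1 + c_2v_2 + c_3v_3$, matching the non-trivial combination exhibited above.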
 
