MHB Proving Linear Dependence and Span in n-dimensional Space

  • Thread starter: mathmari
  • Tags: Span

Summary
The discussion focuses on proving properties of linear dependence and span in n-dimensional space. Participants explore the implications of linear dependence, specifically how the span of a set of vectors remains unchanged when a linearly dependent vector is removed. They clarify that if vectors are linearly dependent, one can express one vector as a linear combination of others, which supports the assertion that removing it does not affect the span. Additionally, there is confusion regarding the notation used to express the equivalence of sets and whether it implies a trivial case or highlights the commutative property of vector addition. The conversation emphasizes the importance of definitions in linear algebra to establish these relationships clearly.
mathmari
Hey! :o

Let $1\leq n,k\in \mathbb{N}$ and let $v_1, \ldots , v_k\in \mathbb{R}^n$. Show that:
  1. Let $w\in \text{Lin}(v_1, \ldots , v_k)$. Then it holds that $\text{Lin}(v_1, \ldots , v_k)=\text{Lin}(v_1, \ldots , v_k,w)$.
  2. Let $v_1, \ldots , v_k$ be linearly dependent. Then there is a $1\leq i\leq k$ and $\lambda_1, \ldots , \lambda_k$ such that $v_i=\lambda_1v_1+\ldots +\lambda_{i-1}v_{i-1}+\lambda_{i+1}v_{i+1}+\ldots +\lambda_kv_k$.
  3. Let $i_1, \ldots , i_k\in \mathbb{N}$ be such that $\{1, \ldots , k\}=\{i_1, \ldots , i_k\}$. Then it holds that $\text{Lin}(v_1, \ldots , v_k)=\text{Lin}(v_{i_1}, \ldots , v_{i_k})$.
  4. Let $v_1, \ldots , v_k$ be linearly dependent. Then there is a $1\leq i\leq k$ such that $\text{Lin}(v_1, \ldots , v_k)=\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$.

I have already shown the first two points. Could you please give me a hint for point $3$ ? (Wondering) As for point $4$ : Do we use point $2$ here? Suppose $v_i=\lambda_1v_1 +\ldots +\lambda_{i-1}v_{i-1}+\lambda_{i+1}v_{i+1}+\ldots +\lambda_kv_k$. Then it holds that $\text{Lin}(v_1, \ldots , v_k)\subseteq \text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$, or not?
Now it is left to show that $\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)\subseteq \text{Lin}(v_1, \ldots , v_k)$, or not?
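
Concretely, I think the first inclusion follows by substitution (my attempt, using point $2$): if $x=\mu_1v_1+\ldots +\mu_kv_k\in \text{Lin}(v_1, \ldots , v_k)$, then replacing $v_i$ by its expression from point $2$ gives
$$x=\sum_{j\neq i}\mu_jv_j+\mu_i\sum_{j\neq i}\lambda_jv_j=\sum_{j\neq i}(\mu_j+\mu_i\lambda_j)v_j,$$
which is a linear combination of $v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k$ alone.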

Or is there another way to do this proof?

(Wondering)
 
HallsofIvy
For (3), what does $\{1, \ldots , k\}= \{i_1, \ldots , i_k\}$ mean? With standard set notation that would just mean that $v_1= v_{i_1}, \ldots , v_k= v_{i_k}$, but then the problem is trivial. Or is the point that the order doesn't matter? Then the problem is almost trivial: just use the fact that vector addition is commutative.
 
Klaas van Aarsen

mathmari said:
Hey! :o

Let $1\leq n,k\in \mathbb{N}$ and let $v_1, \ldots , v_k\in \mathbb{R}^n$. Show that:
4. Let $v_1, \ldots , v_k$ be linearly dependent. Then there is a $1\leq i\leq k$ such that $\text{Lin}(v_1, \ldots , v_k)=\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$.

As for point $4$ : Do we use point $2$ here? Suppose $v_i=\lambda_1v_1 +\ldots +\lambda_{i-1}v_{i-1}+\lambda_{i+1}v_{i+1}+\ldots +\lambda_kv_k$. Then it holds that $\text{Lin}(v_1, \ldots , v_k)\subseteq \text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$, or not?

Hey mathmari!

Normally we start from the definition.
From wiki:
The vectors in a subset $S=\{\vec v_1,\vec v_2,\dots,\vec v_k\}$ of a vector space $V$ are said to be "linearly dependent" if there exist scalars $a_1,a_2,\dots,a_k$, not all zero, such that
$$a_1\vec v_1+a_2\vec v_2+\cdots+a_k\vec v_k= \vec 0,$$
where $\vec 0$ denotes the zero vector.
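
For instance (a concrete example to make the definition tangible): in $\mathbb{R}^2$ the vectors $\vec v_1=(1,0)$, $\vec v_2=(0,1)$, $\vec v_3=(1,1)$ are linearly dependent, since $1\cdot \vec v_1+1\cdot \vec v_2+(-1)\cdot \vec v_3=\vec 0$.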


Let $a_i$ be one of those scalars that is not zero.
Then:
$$a_1\vec v_1+a_2\vec v_2+\cdots+a_k\vec v_k= \vec 0
\implies \vec v_i = -\frac{1}{a_i}\left(a_1 \vec v_1+\cdots + a_{i-1}\vec v_{i-1}+ a_{i+1}\vec v_{i+1}+\cdots+a_k\vec v_k\right)
$$
So $\vec v_i \in \operatorname{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$, isn't it? (Wondering)
mathmari said:
Now it is left to show that $\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)\subseteq \text{Lin}(v_1, \ldots , v_k)$, or not?

Yes, and that follows from the definition of a linear span, doesn't it?
What is the definition of a linear span? (Wondering)
 
Klaas van Aarsen said:
Let $a_i$ be one of those scalars that is not zero.
Then:
$$a_1\vec v_1+a_2\vec v_2+\cdots+a_k\vec v_k= \vec 0
\implies \vec v_i = -\frac{1}{a_i}\left(a_1 \vec v_1+\cdots + a_{i-1}\vec v_{i-1}+ a_{i+1}\vec v_{i+1}+\cdots+a_k\vec v_k\right)
$$
So $\vec v_i \in \operatorname{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$, isn't it? (Wondering)

So this direction follows from point 2., doesn't it? (Wondering)
Klaas van Aarsen said:
Yes, and that follows from the definition of a linear span, doesn't it?
What is the definition of a linear span? (Wondering)

Let $x\in \text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)$. Then $x$ is a linear combination of the elements $v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k$, i.e. \begin{equation*}x=\lambda_1v_1+ \ldots + \lambda_{i-1}v_{i-1}+\lambda_{i+1} v_{i+1}+ \ldots+ \lambda_kv_k\end{equation*} We can also write this element as \begin{equation*}x=\lambda_1v_1+ \ldots + \lambda_{i-1}v_{i-1}+0\cdot v_i+\lambda_{i+1} v_{i+1}+ \ldots+ \lambda_kv_k\end{equation*} Now it is a linear combination of the elements $v_1, \ldots , v_{i-1}, v_i, v_{i+1}, \ldots, v_k$, which means that $x\in \text{Lin}(v_1, \ldots , v_k)$.

So we get that $\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)\subseteq \text{Lin}(v_1, \ldots , v_k)$. Is everything correct? (Wondering)
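
As a quick numerical sanity check (just a sketch in Python with numpy, not part of the proof): since $\text{Lin}(v_1, \ldots , v_{i-1}, v_{i+1}, \ldots, v_k)\subseteq \text{Lin}(v_1, \ldots , v_k)$ always holds, it suffices to compare the dimensions of the two spans, which we can read off as matrix ranks.

```python
import numpy as np

# Example with a dependent vector: v3 = v1 + v2.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2  # linearly dependent on v1 and v2

A = np.column_stack([v1, v2, v3])  # generators of Lin(v1, v2, v3)
B = np.column_stack([v1, v2])      # v3 removed

# Equal ranks mean the two spans have the same dimension; combined
# with Lin(v1, v2) being a subset of Lin(v1, v2, v3), the spans coincide.
print(np.linalg.matrix_rank(A))  # 2
print(np.linalg.matrix_rank(B))  # 2
```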


HallsofIvy said:
For (3), what does $\{1, \ldots , k\}= \{i_1, \ldots , i_k\}$ mean? With standard set notation that would just mean that $v_1= v_{i_1}, \ldots , v_k= v_{i_k}$, but then the problem is trivial. Or is the point that the order doesn't matter? Then the problem is almost trivial: just use the fact that vector addition is commutative.

I am also a bit confused about the meaning. I think your second interpretation is meant, since the first one would be too easy. (Thinking)

So do we have to show that in a linear combination we can change the order of the vectors? (Wondering)
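
My attempt at a sketch (just reordering the summands, which commutativity of vector addition allows): if $x\in \text{Lin}(v_1, \ldots , v_k)$, say $x=\lambda_1v_1+\ldots +\lambda_kv_k$, then since $\{i_1, \ldots , i_k\}=\{1, \ldots , k\}$, each summand appears exactly once after reindexing, so
$$x=\lambda_{i_1}v_{i_1}+\ldots +\lambda_{i_k}v_{i_k}\in \text{Lin}(v_{i_1}, \ldots , v_{i_k}).$$
The reverse inclusion follows in the same way, so the two spans are equal.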
 