Statements with linearly independent vectors

In summary: For question 3 the idea is correct, and the result from question 3 is what solves question 4. The vectors $w$ that make $v_1, \ldots, v_m, w$ linearly independent are exactly the vectors with $w\notin\text{span}(v_1, \ldots, v_m)$; since the span contains $q^m$ of the $q^n$ vectors of $V$, there are $q^n-q^m$ such $w$. Applying this count repeatedly, one new basis vector at a time, gives the number of ordered bases of $V$.
  • #1
mathmari
Hey! 😊

Let $\mathbb{K}$ be a field and let $V$ be a $\mathbb{K}$-vector space. Let $1\leq m, n\in \mathbb{N}$ with $n=\dim_{\mathbb{K}}V$. Let $v_1, \ldots , v_m\in V$ be linearly independent.
  1. Let $\lambda_1, \ldots , \lambda_m, \mu_1, \ldots , \mu_m\in \mathbb{K}$ such that $\displaystyle{\sum_{i=1}^m\lambda_iv_i=\sum_{i=1}^m\mu_iv_i}$. Then show that $\lambda_i=\mu_i$ for all $1\leq i\leq m$.
  2. Let $w\in V$. Then show that $w\notin \text{span}(v_1, \ldots , v_m) \iff v_1, \ldots , v_m, w$ linearly independent.
  3. Let $2\leq q\in \mathbb{N}$ with $|\mathbb{K}|=q$. Then show that $ \text{span}(v_1, \ldots , v_m) $ has $q^m$ elements and determine the number of vectors $w\in V$ such that $v_1, \ldots, v_m, w$ are linearly independent.
  4. Let $2\leq q\in \mathbb{N}$ with $|\mathbb{K}|=q$. Then show that $$\prod_{i=0}^{n-1}(q^n-q^i)=(q^n-1)(q^n-q)\cdots (q^n-q^{n-2})(q^n-q^{n-1})$$ is the number of (ordered) bases of $V$.

For question 1, I have done the following:
\begin{align*}\sum_{i=1}^m\lambda_iv_i=\sum_{i=1}^m\mu_iv_i &\Rightarrow \sum_{i=1}^m\lambda_iv_i-\sum_{i=1}^m\mu_iv_i=0\\ & \Rightarrow \sum_{i=1}^m\left (\lambda_i-\mu_i\right )v_i=0 \\ & \ \overset{v_1, \ldots , v_m\text{ linearly independent}}{\Longrightarrow } \ \lambda_i-\mu_i=0, \ \forall i\in \{1, \ldots , m\} \\ & \Rightarrow \lambda_i=\mu_i, \ \forall i\in \{1, \ldots , m\}\end{align*}

For question 2, do we have to reformulate the equivalence into:
\begin{equation*}w\in \text{span}(v_1, \ldots , v_m) \iff v_1, \ldots , v_m, w\text{ linearly dependent }\end{equation*}
or can we show the equivalence in the given form? 🤔
But what would $w\notin \text{span}(v_1, \ldots , v_m)$ mean then? That $\displaystyle{w\neq\sum_{i=1}^m\alpha_iv_i }$ for every choice of $\alpha_1, \ldots , \alpha_m$? Could you give me a hint for questions 3 & 4?
 
  • #2
What does "|K|" mean?
 
  • #3
Country Boy said:
What does "|K|" mean?

It's the dimension.
 
  • #4
Hi mathmari,

Very interesting set of questions.

Question 2
Yes, nicely done. Reformulating the statement into that equivalent negated form is a good idea: proving $w\in \text{span}(v_1, \ldots , v_m) \iff v_1, \ldots , v_m, w$ linearly dependent proves the original equivalence as well. And $w\notin \text{span}(v_1, \ldots , v_m)$ means exactly that $w\neq \sum_{i=1}^m\alpha_iv_i$ for every choice of $\alpha_1, \ldots , \alpha_m\in\mathbb{K}$.

Question 3
For the number of elements in the span, note that every element of the span has the form $\sum_{i=1}^{m}a_{i}v_{i}$, and for each $v_{i}$ there are $q$ choices from $\mathbb{K}$ for the coefficient $a_{i}$. By Question 1, different choices of coefficients give different vectors, so by the fundamental counting principle there are exactly $q^{m}$ elements in the span.
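As a quick sanity check (just an illustrative sketch, not part of the proof), one can enumerate the span by brute force; here I assume $q$ prime so that $\mathbb{Z}/q\mathbb{Z}$ is a field with $q$ elements, and the vectors are made-up example data:

```python
# Brute-force check that span(v_1, ..., v_m) has q^m elements.
# Assumption: q is prime, so arithmetic mod q gives the field Z/qZ with q elements.
from itertools import product

q = 3                        # |K| = q
v = [(1, 0, 0), (0, 1, 2)]   # m = 2 linearly independent vectors in K^3 (example data)
m, n = len(v), len(v[0])

span = set()
for coeffs in product(range(q), repeat=m):   # every choice of (a_1, ..., a_m)
    span.add(tuple(sum(a * vec[j] for a, vec in zip(coeffs, v)) % q for j in range(n)))

print(len(span), q**m)   # 9 9 -- distinct coefficient tuples give distinct vectors
```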

For the second part, I will start off with an example that highlights all the important features of the general proof. Take a look and see if you can extend these ideas. If anything is unclear, certainly feel free to let me know.

Suppose $n=5$ and $m=2$. Let $v_{3},v_{4},$ and $v_{5}$ be vectors such that $\{v_{1}, v_{2}, v_{3}, v_{4}, v_{5}\}$ is a basis for $V$. Consider a vector of the form $$w=\sum_{i=1}^{5}a_{i}v_{i}.$$ The key here is to determine what condition must be placed on $a_{3}, a_{4}$ and $a_{5}$ to guarantee that $w$ will be linearly independent from $v_{1}$ and $v_{2}$. Can you think of anything that would do the trick? I will leave it at this for now but am certainly happy to help more if need be.

Question 4
This question relies heavily on having the proof/formula from Question 3, so I will refrain from going into detail until then. Once we have the formula from Question 3, we simply apply it repeatedly to deduce the desired result.
 
  • #5
GJA said:
Question 3
For the number of elements in the span, note that every element of the span has the form $\sum_{i=1}^{m}a_{i}v_{i}$, and for each $v_{i}$ there are $q$ choices from $\mathbb{K}$ for the coefficient $a_{i}$. By Question 1, different choices of coefficients give different vectors, so by the fundamental counting principle there are exactly $q^{m}$ elements in the span.

For the second part, I will start off with an example that highlights all the important features of the general proof. Take a look and see if you can extend these ideas. If anything is unclear, certainly feel free to let me know.

Suppose $n=5$ and $m=2$. Let $v_{3},v_{4},$ and $v_{5}$ be vectors such that $\{v_{1}, v_{2}, v_{3}, v_{4}, v_{5}\}$ is a basis for $V$. Consider a vector of the form $$w=\sum_{i=1}^{5}a_{i}v_{i}.$$ The key here is to determine what condition must be placed on $a_{3}, a_{4}$ and $a_{5}$ to guarantee that $w$ will be linearly independent from $v_{1}$ and $v_{2}$. Can you think of anything that would do the trick? I will leave it at this for now but am certainly happy to help more if need be.

Do we use question 2 here, i.e. that $w$ must not be in the span of the vectors? :unsure:
 
  • #6
Exactly right, great job connecting the ideas. Can you determine a condition/requirement on $a_{3}, a_{4}$ and $a_{5}$ that would ensure $w$ is not in the span of $v_{1}$ and $v_{2}$?
 
  • #7
GJA said:
Exactly right, great job connecting the ideas. Can you determine a condition/requirement on $a_{3}, a_{4}$ and $a_{5}$ that would ensure $w$ is not in the span of $v_{1}$ and $v_{2}$?

The coefficients $a_3,a_4,a_5$ should not all be equal to zero, right?
 
  • #8
Yeah, that's it! So now we know that $a_{1}$ and $a_{2}$ can be any elements from $\mathbb{K}$, and that $a_{3}, a_{4},$ and $a_{5}$ can't all be zero. Can you use these pieces of information to count the total number of elements $w$ of $V$ such that $v_{1}, v_{2}, w$ are linearly independent?
 
  • #9
Do we proceed for question 3 as follows?

We consider the vectors $w\in V$ such that $v_1, \ldots , v_m, w$ are linearly independent. By question 2, this holds exactly when $w\notin \text{span}(v_1, \ldots , v_m)$.
The dimension of $V$ is $n$ and we have that $|\mathbb{K}|=q$.
It holds that \begin{equation*}q^n=\left (\#\text{ vectors linearly independent of }v_1, \ldots , v_m \right )+\left (\#\text{ vectors linearly dependent on }v_1, \ldots , v_m \right )\end{equation*}
The number of vectors linearly dependent on $v_1, \ldots , v_m$ is the number of vectors in $\text{span}(v_1, \ldots , v_m)$. Since the span contains $q^m$ elements, there are $q^m$ vectors $w$ such that $\displaystyle{w=\sum_{i=1}^m\alpha_iv_i}$.
Therefore the number of vectors that are linearly independent of $v_1, \ldots , v_m$ equals $q^n-q^m$.
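A small brute-force check of this count (only a sketch: it assumes $q$ prime so that $\mathbb{Z}/q\mathbb{Z}$ is a field, and uses made-up example vectors):

```python
# Count the vectors w such that v_1, ..., v_m, w are linearly independent:
# by question 2 these are exactly the vectors outside span(v_1, ..., v_m),
# so the count should be q^n - q^m.  (q prime assumed; example data below.)
from itertools import product

q, n = 3, 3
v = [(1, 0, 0), (0, 1, 2)]   # m = 2 linearly independent vectors in K^n
m = len(v)

span = {tuple(sum(a * vec[j] for a, vec in zip(coeffs, v)) % q for j in range(n))
        for coeffs in product(range(q), repeat=m)}

extensions = [w for w in product(range(q), repeat=n) if w not in span]
print(len(extensions), q**n - q**m)   # 18 18
```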

:unsure:

For question 4:

Do we use question 3 for different values of $m$ ?

Or do we do the following:

There are $q^n$ elements in the vector space, so for the first basis vector there are $q^n-1$ choices ($0$ is excluded).
This vector generates a subspace of dimension $q$ and so for the second one there are $q^n-q$ ways.
If we continue we get that the number of ordered bases of $V$ is equal to \begin{equation*}(q^n-1)(q^{n}-q)\cdots (q^n-q^{n-2})(q^n-q^{n-1})=\prod_{i=0}^{n-1}(q^n-q^i)\end{equation*}

:unsure:
 
  • #10
Yes, this is all correct. Nicely done!

Edit: A minor point of correction. Your post says " This vector generates a subspace of dimension $q$..." This is not correct. A single vector generates a 1-dimensional subspace. There are $q$ total elements in this 1-dimensional subspace, but the space itself is still 1-dimensional. This is just like how the $x$-axis in $\mathbb{R}^{2}$ contains infinitely many vectors but, as a whole, is 1-dimensional.

The slightly corrected version of your idea would be to select a single non-zero vector, say $b_{1}$, to be the first basis element for $V$. Because we must exclude zero, there are $q^{n}-1$ ways to do this. The next vector we choose must be linearly independent of $b_{1}$. Since the span of $b_{1}$ is 1-dimensional, from Question 3 we know there are $q^{n}-q$ choices for $b_{2}.$ By induction, if the first $m-1$ basis elements are selected, Question 3 tells us there are $q^{n}-q^{m-1}$ ways to select the next basis element $b_{m}$ so that $b_{m}$ is linearly independent of $b_{1}, b_{2},\ldots, b_{m-1}$. Equivalently, $b_{m}$ is not part of the $m-1$ dimensional subspace spanned by $b_{1}, b_{2},\ldots, b_{m-1}$. As you've correctly stated, multiplying these values together gives the total number of ordered bases for $V$ over the field $\mathbb{K}$.
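If it helps to see the argument numerically, here is a brute-force sketch for a small case (illustrative only; it assumes $q$ prime so that $\mathbb{Z}/q\mathbb{Z}$ is a field): it builds ordered bases one vector at a time, always picking the next vector outside the span of the ones already chosen, and compares the total with the product formula.

```python
# Count ordered bases of K^n, K = Z/qZ (q prime assumed), by brute force and
# compare with prod_{i=0}^{n-1} (q^n - q^i).
from itertools import product

def span(vectors, q, n):
    """All linear combinations over Z/qZ of the given vectors ({0} if the list is empty)."""
    return {tuple(sum(a * v[j] for a, v in zip(coeffs, vectors)) % q for j in range(n))
            for coeffs in product(range(q), repeat=len(vectors))}

def count_ordered_bases(q, n):
    all_vectors = list(product(range(q), repeat=n))
    count = 0

    def extend(chosen):
        nonlocal count
        if len(chosen) == n:          # n independent vectors form an ordered basis
            count += 1
            return
        spanned = span(chosen, q, n)
        for w in all_vectors:
            if w not in spanned:      # w is independent of the chosen vectors (questions 2/3)
                extend(chosen + [w])

    extend([])
    return count

q, n = 2, 3
formula = 1
for i in range(n):
    formula *= q**n - q**i
print(count_ordered_bases(q, n), formula)   # 168 168
```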
 
  • #11
mathmari said:
It's the dimension.
K is a field, not a vector space. What do you mean by "dimension" of a field?
 

1. What does it mean for vectors to be linearly independent?

Linearly independent vectors are a set of vectors in which no vector can be written as a linear combination of the others. In other words, none of the vectors is redundant: each one genuinely contributes to the span of the set.

2. How can I determine if a set of vectors is linearly independent?

To determine if vectors $v_1, \ldots, v_k$ are linearly independent, set up the equation $c_1v_1+\cdots+c_kv_k=0$ and solve for the coefficients. If the only solution is the trivial one (all coefficients equal to $0$), the vectors are linearly independent; if there is any non-trivial solution, they are linearly dependent.
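Equivalently (a standard numerical check, shown here only as a sketch with made-up vectors), you can place the vectors in the columns of a matrix and test whether its rank equals the number of vectors:

```python
# Rank test: the vectors are linearly independent exactly when the matrix
# having them as columns has rank equal to the number of vectors.
import numpy as np

vectors = [(1, 0, 2), (0, 1, 1), (1, 1, 3)]        # example vectors in R^3
A = np.array(vectors, dtype=float).T               # vectors as columns of A
print(np.linalg.matrix_rank(A) == len(vectors))    # False: (1, 1, 3) = (1, 0, 2) + (0, 1, 1)
```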

3. What is the importance of linearly independent vectors in mathematics and science?

Linearly independent vectors play a crucial role in many areas of mathematics and science. In linear algebra, they are used to form a basis for vector spaces, which are essential in understanding and solving systems of linear equations. In physics, they are used to represent physical quantities and describe the relationships between them. In machine learning, they are used to create linearly independent features for data analysis and modeling.

4. Can a set of vectors be linearly independent in one dimension but not in another?

Linear independence of a given set of vectors does not change when the vectors are viewed inside a larger space; what depends on the dimension is how many vectors can be linearly independent. In an $n$-dimensional space, any set of more than $n$ vectors is automatically linearly dependent. For example, three vectors in a two-dimensional space are always linearly dependent, while three suitably chosen vectors in a three-dimensional space can be linearly independent.
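A concrete illustration of this bound, using the rank test from above (a sketch with made-up vectors):

```python
# Three vectors in R^2 can never be linearly independent (rank is at most 2),
# while three suitably chosen vectors in R^3 can be.
import numpy as np

three_in_r2 = np.array([[1.0, 0.0], [0.0, 1.0], [2.0, 3.0]]).T   # columns live in R^2
three_in_r3 = np.eye(3)                                          # standard basis of R^3
print(np.linalg.matrix_rank(three_in_r2))   # 2 -> the three vectors are dependent
print(np.linalg.matrix_rank(three_in_r3))   # 3 -> the three vectors are independent
```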

5. How can I use linearly independent vectors to solve a system of linear equations?

In a system of linear equations, the coefficients of the variables form a matrix $A$, so the system can be written as $Ax=b$. The system is solvable exactly when $b$ lies in the span of the columns of $A$, and when those columns are linearly independent the solution is unique, because each vector in the column space can be written in only one way as a linear combination of the columns.
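As a small illustration (a sketch with made-up numbers): when the coefficient matrix is square and its columns are linearly independent, the system has exactly one solution.

```python
# Solve A x = b when the columns of A are linearly independent (A invertible).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # linearly independent columns
b = np.array([5.0, 10.0])
x = np.linalg.solve(A, b)    # unique solution because A has full rank
print(x)                     # [1. 3.]
```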
