MHB Every set of k+1 elements is dependent.

Let $S = \left\{x_1, \ldots, x_k\right\}$ be an independent set consisting of $k$ elements in a linear space $V$, and let $l(S)$ be the subspace spanned by $S$. Then every set of $k+1$ elements in $l(S)$ is dependent.

I've gotten nowhere with this - but I'm posting my attempt nonetheless, if only for the sake of showing effort.

Since $S$ is independent, for any scalars $d_1, \ldots, d_k$ we have $$\sum_{1 \le i \le k}d_ix_i = 0 \implies d_1 = d_2 = \cdots = d_k = 0 ~~~~~~ (1)$$

Let $y_1, \ldots, y_{k+1} \in l(S)$, and let $c_1, \ldots, c_{k+1}$ be scalars not all zero. Then we can write:

$$c_1y_1 = \sum_{1 \le i \le k}a_{1, i} x_i, \quad c_2y_2 = \sum_{1 \le i \le k}a_{2, i} x_i, \quad \ldots, \quad c_{k+1}y_{k+1} = \sum_{1 \le i \le k}a_{k+1, i}x_i$$

where the $a_{r, i}$ are scalars. Therefore $$\displaystyle \sum_{1 \le i \le k+1}c_iy_i = \sum_{1 \le r \le k+1}\sum_{1 \le i \le k}a_{r,i}x_i ~~~~~~~~~~~ (2)$$

Now if the $a_{r,i}$ are all zero, then $\displaystyle \sum_{1 \le i \le k+1}c_iy_i = 0$ and we're done.

If not, then it's too hard for me. I thought of letting $a_{1,i} = -d_1, ~ a_{2,i} = d_{2}, \ldots, a_{k, i} = (-1)^kd_k$ and finally $a_{k+1, i} = 0$ if $k$ is even, and $d_{k}$ if $k$ is odd. Then from (1) and (2) we have $\displaystyle \sum_{1 \le i \le k+1}c_iy_i = \sum_{1 \le r \le k+1}\sum_{1 \le i \le k} (-1)^i d_i x_i$. But nothing cancels, as $x_1, \ldots, x_k$ are distinct.
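
For what it's worth, the smallest case $k = 1$ can be checked directly (just a sanity check, not part of the exercise): if $S = \{x_1\}$ is independent (so $x_1 \neq 0$) and $y_1, y_2 \in l(S)$, write $y_1 = a x_1$ and $y_2 = b x_1$. If $a = b = 0$, then $1 \cdot y_1 + 0 \cdot y_2 = 0$ is already a non-trivial relation; otherwise one of $a, b$ is non-zero, say $b \neq 0$ (the other case is symmetric), and $$b\,y_1 - a\,y_2 = ba\,x_1 - ab\,x_1 = 0$$ is a relation with the coefficient $b \neq 0$. Either way $\{y_1, y_2\}$ is dependent.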
 
Re: Every set of k+1 elements is dependent.

To use linear independence and linear dependence in proofs, it is crucial to have a firm understanding of what the definitions are, and what they mean.

The definition of linear dependence:

A set of vectors $\{v_1,\dots,v_n\}$ is linearly dependent if there exists a non-zero linear combination of them that equals the 0-vector.

That is, there exist scalars $c_1,\dots,c_n$, NOT ALL 0, such that:

$c_1v_1 + \cdots + c_nv_n = 0$.

We have to make the caveat "not all 0" for the $c_i$, because without it, it would trivially be true that:

$0v_1 + \cdots + 0v_n = 0$, and EVERY set of vectors would be linearly dependent, rendering the concept useless.
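
For a concrete illustration in $\Bbb R^2$ (a toy example, separate from the problem above): take $v_1 = (1,0)$, $v_2 = (0,1)$, $v_3 = (1,1)$. Then $$1\cdot v_1 + 1\cdot v_2 + (-1)\cdot v_3 = (1,0) + (0,1) - (1,1) = (0,0),$$ a linear combination whose coefficients $1, 1, -1$ are not all 0, so $\{v_1, v_2, v_3\}$ is linearly dependent.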

What it means:

We don't need ALL the vectors $v_1,\dots,v_n$ to generate the linear span, i.e. the set of all linear combinations:

$\{a_1v_1 + \cdots + a_nv_n : a_1,\dots,a_n \in F\}$ (here $F$ is the underlying field of our vector space; in many applications this is $\Bbb R$ or $\Bbb C$, but it doesn't HAVE to be).

In other words, we have some redundancy in our generating set. To see this, suppose we have a linear combination that sums to the 0-vector:

$c_1v_1 +\cdots + c_nv_n = 0$.

Suppose that we know $c_k \neq 0$ ($k$ might be 1 or $n$, but I will pretend it lies between them; you can figure out the cases $k = 1$ and $k = n$ on your own).

Then:

$v_k = \dfrac{-c_1}{c_k}v_1 + \cdots + \dfrac{-c_{k-1}}{c_k}v_{k-1} + \dfrac{-c_{k+1}}{c_k}v_{k+1} + \cdots + \dfrac{-c_n}{c_k}v_n$

(again, this formula actually presupposes $2 < k < n-1$, and again, you can puzzle out these special cases on your own).

So $v_k$ can be expressed as a linear combination of the remaining vectors; it is not needed to generate the span.
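
For instance, in the $\Bbb R^2$ toy example above, the relation $1\cdot v_1 + 1\cdot v_2 + (-1)\cdot v_3 = 0$ has $c_3 = -1 \neq 0$, and solving for $v_3$ gives $$v_3 = \dfrac{-1}{-1}v_1 + \dfrac{-1}{-1}v_2 = v_1 + v_2,$$ so $v_3 = (1,1)$ is redundant: anything we can build from $v_1, v_2, v_3$ we can already build from $v_1$ and $v_2$ alone.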

Now for linear independence.

The definition:

the set $\{v_1,\dots,v_n\}$ of vectors is linearly independent if the ONLY linear combination of these vectors that equals the 0-vector is the 0-combination, that is:

$c_1v_1 + \cdots + c_nv_n = 0 \implies c_1 =\cdots = c_n = 0$ (these are "two different zeros", the first is the 0-vector, the second is the "field/scalar zero").

What it means: we need ALL of the $v_i$ to generate the linear span.

To see this, suppose that we could write:

$v_k = a_1v_1 + \cdots + a_{k-1}v_{k-1} + a_{k+1}v_{k+1} + \cdots + a_nv_n$ for some $k$.

(again, the same caveat about the "special cases of $k$" applies as above; the modifications needed are minor).

Then $a_1v_1 + \cdots + a_{k-1}v_{k-1} + (-1)v_k + a_{k+1}v_{k+1} + \cdots + a_nv_n = 0$

and since $-1 \neq 0$, this is a non-zero linear combination that sums to the 0-vector. But, by the definition of linear independence, no such linear combination exists, so we have a contradiction. Hence our supposition that we could write $v_k$ as a linear combination of the others must be false.
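
For instance, $e_1 = (1,0)$ and $e_2 = (0,1)$ in $\Bbb R^2$ form a linearly independent set: if $$c_1 e_1 + c_2 e_2 = (c_1, c_2) = (0,0),$$ then comparing coordinates forces $c_1 = c_2 = 0$; correspondingly, neither vector is a scalar multiple of the other, and dropping either one shrinks the span from all of $\Bbb R^2$ to a line.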

*********************

Will post more later.
 
Re: Every set of k+1 elements is dependent.

Thank you, I'll study your post thoroughly.
 