MHB Proving $\{w_1, \ldots , w_m\}$ is a Basis of $\text{Lin}(v_1, \ldots , v_k)$

mathmari
Hey! :o

Let $1\leq n\in \mathbb{N}$ and $v_1, \ldots , v_k\in \mathbb{R}^n$. Show that there exist $w_1, \ldots , w_m\in \{v_1, \ldots , v_k\}$ such that $(w_1, \ldots , w_m)$ is a basis of $\text{Lin}(v_1, \ldots , v_k)$. I have done the following:

A basis of $\text{Lin}(v_1, \ldots , v_k)$ is a linearly independent set of vectors of $\{v_1, \ldots , v_k\}$.

So let $\{w_1, \ldots , w_m\}\subseteq \{v_1, \ldots , v_k\}$ be a linearly independent set.

$\text{Lin}(v_1, \ldots , v_k)$ is the set of all linear combinations of $v_1, \ldots , v_k$. So it is left to show that we can express every such linear combination using the vectors $\{w_1, \ldots , w_m\}$, or not? (Wondering)
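For example (just to check my understanding): take $v_1=(1,0)$, $v_2=(0,1)$, $v_3=(1,1)$ in $\mathbb{R}^2$. The subset $\{w_1, w_2\}=\{v_1, v_2\}$ is linearly independent, and since $v_3=v_1+v_2$, every linear combination $a v_1+b v_2+c v_3$ can be rewritten as $(a+c)v_1+(b+c)v_2$, so $\{v_1, v_2\}$ would also span $\text{Lin}(v_1, v_2, v_3)$.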
 
mathmari said:
Hey! :o

Let $1\leq n\in \mathbb{N}$ and $v_1, \ldots , v_k\in \mathbb{R}^n$. Show that there exist $w_1, \ldots , w_m\in \{v_1, \ldots , v_k\}$ such that $(w_1, \ldots , w_m)$ is a basis of $\text{Lin}(v_1, \ldots , v_k)$. I have done the following:

A basis of $\text{Lin}(v_1, \ldots , v_k)$ is a linearly independent set of vectors of $\{v_1, \ldots , v_k\}$.

So let $\{w_1, \ldots , w_m\}\subseteq \{v_1, \ldots , v_k\}$ be a linearly independent set.

$\text{Lin}(v_1, \ldots , v_k)$ is the set of all linear combinations of $v_1, \ldots , v_k$. So it is left to show that we can express every such linear combination using the vectors $\{w_1, \ldots , w_m\}$, or not?

Hey mathmari!

Yes, a basis must also span the space. (Thinking)
 
That was implied in the first post: mathmari said that the basis we seek is a linearly independent subset of $\{v_1, v_2, \ldots, v_k\}$, which was already said to span the space.

mathmari, you say "let $\{w_1, w_2, \ldots, w_m\}\subseteq \{v_1, v_2, \ldots, v_k\}$ be a linearly independent subset". You are missing the crucial point: proving that such a linearly independent subset, one that still spans the space, exists! You need to say something like the following. If $\{v_1, v_2, \ldots, v_k\}$, which spans the space, is also linearly independent, then we are done: it is a basis. If not, then there exist numbers $\alpha_1, \alpha_2, \ldots, \alpha_k$, not all $0$, such that $\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_k v_k = 0$. Let $\alpha_j$ be one of the non-zero $\alpha$'s. Then
$$v_j = -\frac{1}{\alpha_j}\left(\alpha_1 v_1 + \cdots + \alpha_{j-1} v_{j-1} + \alpha_{j+1} v_{j+1} + \cdots + \alpha_k v_k\right),$$
so $v_j$ can be replaced by that linear combination of the other vectors. This smaller set of vectors still spans the vector space. If it is linearly independent, we are done: we have a basis. If it is not, repeat the process. Since the initial set of vectors was finite, this process eventually terminates.
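The same extraction can also be run bottom-up: start with no vectors and keep each $v_i$ that is not already a linear combination of the vectors kept so far. Here is a rough Python/numpy sketch of that variant (the name extract_basis and the rank-based independence test are just one illustrative choice, not part of the argument above):

```python
import numpy as np

def extract_basis(vectors, tol=1e-10):
    """Greedily keep each vector that is not a linear combination
    of the vectors already kept; the kept vectors are linearly
    independent and span the same subspace as the input."""
    basis = []
    for v in vectors:
        candidate = basis + [v]
        # v is independent of the current selection iff adding it raises the rank
        if np.linalg.matrix_rank(np.array(candidate), tol=tol) == len(candidate):
            basis.append(v)
    return basis

# Example: v3 = v1 + v2, so only v1 and v2 survive
v1, v2, v3 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]
print(extract_basis([v1, v2, v3]))  # [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
```

Either direction (removing dependent vectors or adding independent ones) terminates because the list is finite, which is exactly the point of the argument above.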
 
I got it! Thank you very much! (Smile)
 